Australian children ‘easily circumvent’ age restrictions on social media

Children can easily circumvent inadequate and poorly enforced minimum age rules on social media services, according to a new report by Australia’s online safety regulator eSafety.

The survey, which explored the social media use of Australian children aged 8-15, found that many sites only ask users to self-declare their age when signing up, meaning a child under 13 who gives a false date of birth can create an account and access the platform.

According to data from the study, YouTube, Snapchat, TikTok and Instagram are popular among Australians aged 13-17.

Of these, only YouTube allows users under 13 to access the service, and only when the account is attached to a family account with parental supervision.

eSafety estimates that some 80 per cent of Australian children aged 8-12 used one or more social media services in 2024.

In November last year, Australia approved legislation banning children under 16 from using social media, setting a global benchmark for regulating digital platforms.

The Social Media Minimum Age bill, passed by the Senate with 34 votes to 19, will force tech giants including Meta, TikTok, Snapchat, and X to prevent minors from accessing their platforms, with those failing to do so risking a fine of up to AU$50 million.

The eSafety report found mixed approaches to how social media platforms enforce age limits, with many implementing technology or tools to assess the age of users once they are on the service.

Some services, such as TikTok, Twitch, Snapchat and YouTube, use tools to proactively detect users under 13. The report said that while other services have some tools and technology available, they are not using them to detect underage users.

eSafety Commissioner Julie Inman Grant said the report shows there is still significant work to be done by social media platforms that rely on truthful self-declaration to determine age, with enforcement of the government’s minimum age legislation on the horizon.

“Social media services not only need to make it harder for underage users to sign up to their services in the first place, but also make sure that users who are old enough to be on the service, but are not yet adults, have strong safety measures in place by default,” she added. “We’ll be consulting with industry and other stakeholders this year about what reasonable steps platforms should be expected to take to give effect to the minimum age requirements, and this report will be one key input to that process.”


