After a social media ban for under 16s got the green light in Australia, the UK’s technology secretary said that a similar policy could be introduced in Britain. Senior reporter Silvia Iacovcich investigates the impact of this world-first legislation and explores some of the alternative approaches to reducing online harms for young people.
Britain’s science, innovation and technology secretary Peter Kyle recently said that banning social media for children under 16 could be “on the table” in the UK as a measure to ensure the safety of young people online. Kyle’s remark came after the passing of Australia’s global benchmark legislation: the Minimum Age for Social Media Act.
The world-first law will force tech giants, including Meta, TikTok, Snapchat and X, to prevent minors from accessing their platforms, with social media companies that fail to do so facing steep fines of up to AUD$50 million.
The move has sparked a global debate about whether the legislation represents a pragmatic or effective approach to addressing the online harms faced by kids, with contrasting views across the tech world, government, and children’s charities about the benefits and dangers of the regulation.
Limitations of age enforcement
Professor James Davenport, fellow at BCS, The Chartered Institute for IT, stresses the limitations of age enforcement in the online world, arguing that it is “inherently flawed”. He says that expectations for a fully functioning ban in Australia might be, at least initially, unrealistic.
“Age enforcement doesn't work perfectly in the real world, and it's even more challenging in the online world,” he tells National Technology News, pointing out that international standards bodies are still struggling to find effective approaches to age verification.
“This is why the Australians have wisely allowed for a year-long testing period to see how the proposed ban works in practice,” continues Davenport. “We should make sure we learn from the Australians’ practical experience, while maintaining a pragmatic outlook that accepts some level of evasion, rather than expecting perfection.”
According to Davenport, a social media ban for under 16s won’t be 100 per cent effective because kids may find other ways to access these platforms, for example through older siblings. Despite this, he says that the ban still represents a step forward in solving the issue of online harms for children, drawing a parallel with age enforcement for alcohol sales.
“Even though underage drinking still occurs, the alcohol laws are still considered helpful overall,” he highlights.
Combining different approaches
When asked what some of the effective alternatives to a social media ban might be, Davenport explains that the problem can be tackled from two angles: by preventing children from consuming inappropriate material or by preventing inappropriate material from being posted.
“You can attack the problem from either or both ends, and clearly the Australians have chosen to address this by focusing on the consumption end,” he says.
Focusing on the production and posting of digital content, rather than on restricting access, is one way to target the root issue of children viewing inappropriate content. For example, Davenport proposes a children's version of a social media platform with stricter controls.
“It would allow social media platforms to produce a version which is age appropriate, for example, adults could not post publicly unless they've done a DBS check,” he explains.
However, social media companies currently have little incentive to invest in creating age-specific versions of their platforms, continues Davenport. Rolling out a platform like this could be expensive, complex, and resource-intensive, with perhaps very limited financial gain for tech giants like Meta.
On top of this, varying global laws around online content publication make it difficult for multinational firms to produce country-specific versions. “These multinational firms will claim that they can't produce an Australian specific version, and even if they did, people may still find a way to circumvent it,” Davenport says.
Olly Parker, head of external affairs at mental health charity YoungMinds, looks beyond social media itself for solutions. According to the charity, education is a crucial tool for preventing online harms and teaching children how to navigate the internet safely, and schools have a central role to play in tackling the issue.
"Social media is here to stay,” he says. “We must do more to educate young people about online harms and how to navigate the internet and social media safely.”
“Schools have a role to play in this, creating space for non-screen-based activities where students can spend time in nature or take part in theatre, sports, art or music lessons - but schools must be given the resources to deliver this.”
Bans involve risks
While an age restriction strategy could be valuable for Australia, and perhaps the UK if eventually implemented, concerns remain about the possible repercussions children may face because of the ban. These include the potential for a “binge effect” when restrictions are lifted, similar to how some teenagers binge drink on their 18th birthday when they can legally access alcohol for the first time.
“We might see a certain amount of the equivalent, with kids suddenly going out and saying, ‘hey, I can get onto this thing, let's see what I can find,’ and because they've been banned from it, they will look for the worst type of material, because now they can,” says Davenport, warning that lifting the ban could push some young adults towards online addiction.
Concerns also remain about the potential impact of the social media ban on vulnerable children. Some youth advocacy groups and academics have warned that the ban could cut off the most vulnerable young people, including LGBTQIA and migrant teenagers, from support networks. The Australian Human Rights Commission said the law may infringe the human rights of young people by interfering with their ability to participate in society.
David Shoebridge, a senator for the Australian Greens, has warned that the ban risks isolating the many children who use social media to find support.
“This policy will hurt vulnerable young people the most, especially in regional communities and especially the LGBTQI community, by cutting them off,” Shoebridge told the Senate at the end of last month.
Davenport echoes Shoebridge’s sentiment, stressing that social media can help children from separated families, whether through divorce or a parent working away, keep in touch with relatives.
“While they could potentially use other messaging apps like WhatsApp instead of Instagram, the effects of removing this connectivity tool could damage the ability of families to stay connected,” he continues.
Christopher Stone, executive director of Suicide Prevention Australia, recently said that cutting off this access risks exacerbating feelings of loneliness and isolation. “Social media provides vital connections for many young Australians, allowing them to access mental health resources, peer support networks, and a sense of community,” warned Stone.
YoungMinds’ Olly Parker says that young people are growing up through a technological revolution, adding that navigating the online world and their relationship with smartphones is often a challenge for them and the adults in their lives.
“Young people tell us social media brings both positives and negatives, with the online world providing a space to socialise, share information, learn and explore, but they also recognise the potential negative impacts,” explains Parker. “They feel bombarded with distressing content they did not search or ask for and they sometimes feel trapped and unable to leave social media sites.”
Davenport says that one way to keep bonds between family members open whilst protecting children on social media could be for these platforms to create family groups.
“You could register family groups and only allow selected members to post in there,” he adds, explaining that cutting off access to Instagram and Snapchat could lead children to seek comfort in less regulated social media platforms.
The Heads Up Alliance, a community of Australian families delaying social media and smartphones for their children, agrees with this sentiment.
"When we make it harder for children to access Snapchat and Instagram, we need to pay even more attention to other platforms such as WhatsApp, where many children will hope to migrate,” they said in a Facebook post. There is more reason now to ensure these services are as child-safe as possible.
“In their current forms, they simply are not - and our children will be jumping from the frying pan into the fire.”
According to Davenport, any social media ban should accept the practical limits of age enforcement, focusing on punishing those who help minors circumvent age restrictions rather than the minors themselves.
“We don't ban Heineken because someone managed to get their way into a shop and buy a six pack of Heineken when they're under 18,” he says. “Conversely, we don't let stores sell Heineken through automatic vending machines, because then kids could trick them.”
A changing landscape
Despite social media giants like Meta recently saying that Australia’s law has been “rushed”, many experts seem to agree that a new kind of legislation was necessary to push these companies to improve their enforcement and moderation efforts.
While regulation is important, there is also agreement that social media companies should take self-regulation more seriously. This need has become particularly acute following moves such as X reducing its trust and safety staff, including content moderators and policy teams, by 80 per cent.
But BCS, The Chartered Institute for IT’s James Davenport says that new legislation, including Australia’s new law and the UK’s Online Safety Act, is ultimately pushing the tech industry away from an ‘all-or-nothing mentality’ and towards a more pragmatic approach of trying to implement the best possible age verification solutions, even if they are not perfect.
“In the past, the standardisation field had a typical tech reaction of saying ‘we can't get this 100 per cent right, so we mustn't do it’,” Davenport explains. “This legislation has changed the attitudes of the companies to enforcement. There's no doubt about that.”