Meta is introducing Teen Accounts on Facebook and Messenger as the company also announces further restrictions on Teen Accounts on Instagram.
Teen Accounts is a service for accounts opened by teens that includes built-in protections restricting who can contact them and the type of content they see, while offering younger users new ways to explore content that interests them.
Teenage Instagram users are currently given these restrictions automatically, and those under 16 need parental permission to change the settings.
On Wednesday, Meta said Teen Accounts on Facebook and Messenger will offer similar automatic protections to limit inappropriate content and unwanted connections.
Meta will begin rolling out Facebook and Messenger Teen Accounts to teens in the US, UK, Australia and Canada, with plans to extend the feature to teens in other regions in the future.
In a statement, the tech giant said it is strengthening Instagram's Teen Accounts with additional protections that will prevent teens under 16 from going live or disabling protections from unwanted images in DMs without a parent's permission.
“We’ll also require teens under 16 to get parental permission to turn off our feature that blurs images containing suspected nudity in DMs,” Meta said, adding the updates will be available in the next couple of months.
According to Meta, since Teen Accounts was introduced on Instagram last year, 97 per cent of teenagers between the ages of 13 and 15 have remained within its built-in restrictions.
But experts shared with the BBC their doubts about the effectiveness of the service and Meta's approach.
Andy Burrows, chief executive of the charity Molly Rose Foundation, told the BBC: "Eight months after Meta rolled out Teen Accounts on Instagram, we've had silence from Mark Zuckerberg about whether this has actually been effective and even what sensitive content it actually tackles.”
He added that parents still do not know whether these settings actually prevent inappropriate content from being shown to their children through the algorithms selected by Meta.
Matthew Sowemimo, associate head of online child safety policy at the NSPCC, emphasised that Meta’s changes must be combined with proactive measures to prevent the proliferation of harmful content on social media.
Drew Benvie, managing director of social media consultancy Battenhall, said that while it is a positive step that technology companies are moving towards more secure measures, teenagers can still find ways around security settings.