Meta calls for law forcing app stores to get parental permission for under-16s

Meta has called for laws which would force app stores to get parental approval each time a child under the age of 16 downloads an app.

The move would shift responsibility for implementing parental controls from individual social media platforms such as Facebook and Snapchat to the app stores themselves.

In a blog post, Antigone Davis, global head of safety at Meta, said there needed to be a simple solution to govern children’s use of social media.

She said that while legislation is needed to hold all the apps teens use to the same standard, teenagers move between so many apps and websites that it can be hard for parents to keep up.

Davis added that a “patchwork” of laws across US states holds platforms to different standards, leaving teens inconsistently protected.

“If laws are passed as written, every time your teen wants to sign up for an app (assuming the app follows the rules) you will need to go through different methods to sign up, provide your and your teen’s potentially sensitive identification information to apps with inconsistent security and privacy practices, and repeat that process over and over again,” Davis wrote. “By verifying a teen’s age on the app store, individual apps would not be required to collect potentially sensitive identifying information.”

Davis continued: “Apps would only need the age from the app store to ensure teens are placed in the right experiences for their age group. Parents and teens won’t need to provide the hundreds of apps their teens use with sensitive information like government IDs.”

In October, 33 US states accused Meta of contributing to a mental health crisis across the country. Their complaint, filed in an Oakland court, alleged that Meta misled the public about the substantial dangers of its platforms and knowingly induced young children and teenagers into addictive and compulsive social media use.

Earlier this week, the European Commission set a 1 December deadline for Meta and Snap to provide more information on how they protect children from illegal and harmful content. It sent a range of social media platform operators urgent orders to detail the measures they have taken to counter the spread of hate speech, violent content and terrorism-related material.


