Under new reforms to the upcoming Online Safety Bill, the government will hold companies that publish or allow pornographic content on their platforms to a “higher standard” for the age verification or estimation tools they use.
Tougher rules will see pornography companies, social media platforms and other services explicitly required to use age verification methods to prevent children from accessing pornography.
The government also said that it would crack down on content that promotes suicide, self-harm, or eating disorders to better protect children, while further amendments will make it easier for coroners and bereaved parents to access data held by social media platforms.
“This Government will not allow the lives of our children to be put at stake whenever they go online; whether that is through facing abuse or viewing harmful content that could go on to have a devastating impact on their lives,” said Paul Scully, minister for technology and the digital economy. “To prevent any further tragedy and build a better future for our children, we are acting robustly and with urgency to make the Online Safety Bill the global standard for protecting our children.”
The amendments will allow Ofcom, where a coroner requests it, to obtain information about a child's social media use, helping families and the police establish whether online activity played a role in the child's death.
The new measures will require the largest companies to have clear policies for disclosing such data, and mechanisms for responding to disclosure requests from parents or guardians.
The regulator will also receive updated powers, including a requirement to conduct research into the harms arising from app stores.