Britain has introduced groundbreaking online safety regulations, requiring technology companies to take decisive action against criminal content on their platforms.
Media regulator Ofcom has published its first codes of practice under the Online Safety Act, targeting illegal online harms including terrorism, hate speech, fraud, child sexual abuse, and content encouraging suicide.
Social media platforms, messaging apps, gaming services, and file-sharing websites must complete comprehensive risk assessments by 16 March 2025, identifying the risks that illegal content poses to children and adults on their services.
Dame Melanie Dawes, Ofcom's chief executive, emphasised the significance of the new regulatory framework. "For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people's safety over profits. That changes from today," she stated.
The regulations introduce several key measures, including mandatory senior accountability for safety, improved content moderation, and enhanced reporting mechanisms. Platforms must ensure their moderation teams are adequately resourced and trained to remove illegal material quickly.
Specific protections for children include preventing strangers from accessing children's profiles and locations, and blocking direct messages from non-connected accounts. The codes also require platforms to use automated tools to detect child sexual abuse material more efficiently.
The Online Safety Act empowers Ofcom to levy significant penalties for non-compliance, with potential fines of up to 18 million pounds or 10 per cent of a company's global turnover, whichever is greater.
Technology secretary Peter Kyle described the new codes as a "material step change in online safety", warning that platforms failing to meet standards would face regulatory action.
Ofcom plans further consultations in 2025, including additional measures on the use of artificial intelligence to tackle illegal content, as well as the development of crisis response protocols.
The regulatory approach follows extensive consultation, with over 200 responses considered from civil society, charities, technology firms, and law enforcement agencies.
As the digital landscape continues to evolve, these regulations represent a landmark moment in protecting online users, particularly children, from harmful and illegal content.