UK Online Safety Act comes into force

Britain has introduced groundbreaking online safety regulations, requiring technology companies to take decisive action against illegal content on their platforms.

Media regulator Ofcom has published its first codes of practice under the Online Safety Act, targeting illegal online harms including terrorism, hate speech, fraud, child sexual abuse, and content encouraging suicide.

Social media platforms, messaging apps, gaming services, and file-sharing websites must complete comprehensive risk assessments by 16 March 2025, identifying how illegal content could appear on their services and harm children and adults.

Dame Melanie Dawes, Ofcom's chief executive, emphasised the significance of the new regulatory framework. "For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people's safety over profits. That changes from today," she stated.

The regulations introduce several key measures, including mandatory senior accountability for safety, improved content moderation, and enhanced reporting mechanisms. Platforms must ensure their moderation teams are adequately resourced and trained to remove illegal material quickly.

Specific protections for children include preventing strangers from accessing children's profiles and locations, and blocking direct messages from non-connected accounts. The codes also require platforms to use automated tools to detect child sexual abuse material more efficiently.

The Online Safety Act empowers Ofcom to levy significant penalties for non-compliance, with fines of up to £18 million or 10 per cent of a company's global turnover, whichever is greater.

Technology secretary Peter Kyle described the new codes as a "material step change in online safety", warning that platforms failing to meet standards would face regulatory action.

Ofcom plans further consultations in 2025, including additional measures on the use of artificial intelligence to tackle illegal content and the development of crisis response protocols.

The regulatory approach follows extensive consultation, with over 200 responses considered from civil society, charities, technology firms, and law enforcement agencies.

As the digital landscape continues to evolve, these regulations represent a landmark moment in protecting online users, particularly children, from harmful and illegal content.


