Online Safety Act: Ofcom encourages social media to take early action amidst UK riots

Following recent riots in the UK, Ofcom has urged social media platforms not to wait for the regulator’s enhanced powers under the Online Safety Act before taking action against incitement of violence or hatred.

In an open letter, the UK media watchdog said that while new safety duties under the legislation will be in place in a few months, there is no need for platforms to wait to make their sites and apps safer for users.

The organisation warned online service providers operating in the UK about the increased risk of their platforms being used to stir up hatred, provoke violence, and commit other offences in the context of recent acts of violence in the UK.

It said that under existing regulations that pre-date the new Online Safety Act, which was enacted under the previous government, UK-based video-sharing platforms must protect their users from videos likely to incite violence or hatred.

“We therefore expect video-sharing platforms to ensure their systems and processes are effective in anticipating and responding to the potential spread of harmful video material stemming from the recent events,” said Gill Whitehead, Ofcom group director for online safety.

The new legislation sets out new responsibilities for online services around how they assess and mitigate the risks of illegal activity, which can include content involving hatred or disorder, content that provokes violence, and certain instances of disinformation.

When Ofcom publishes its final codes of practice and guidance later this year, regulated services will have three months to assess the risk of illegal content on their platforms. They will also be required to take appropriate steps to stop it appearing, and act quickly to remove it when they become aware of it.

The most widely used social media platforms, such as Instagram, Facebook, X, and TikTok, will also need to go further by consistently applying their terms of service, which often prohibit hate speech, incitement to violence, and harmful disinformation.

“We expect continued engagement with companies over this period to understand the specific issues they face, and we welcome the proactive approaches that have been deployed by some services in relation to these acts of violence across the UK,” continued Whitehead. “In a few months, new safety duties under the Online Safety Act will be in place, but you can act now – there is no need to wait to make your sites and apps safer for users.”


