Ofcom to regulate social media platforms

The government is to appoint Ofcom as the regulator responsible for policing social media firms over harmful content.

Culture secretary Nicky Morgan this morning announced that the government is preparing to hand the broadcast regulator powers to ensure that the likes of Facebook, YouTube, Snapchat and Twitter take down content deemed harmful, racist, explicit or abusive as part of a “duty of care”.

The announcement comes as the government published its initial response to the public consultation on the Online Harms White Paper.

Earlier this year, Mark Zuckerberg, chief executive of Facebook, called for greater co-operation between social media firms and governments to draw up wide-ranging internet regulation. However, rivals have resisted intervention, saying they have their own policies to limit unacceptable content generated by users.

Today’s announcement did not detail the penalties that will be available to Ofcom, but the consultation into online harms centred on regulatory powers holding social media bosses liable for the user-generated content on their platforms.

The regulation will only apply to companies that allow the sharing of user-generated content - for example, through comments, forums or video sharing.

Fewer than five per cent of UK businesses will be in scope, the government said.

The statement from the Department for Digital, Culture, Media and Sport said the regulator would play a key role in enforcing a statutory duty of care to protect users from harmful and illegal content, including terrorist and child abuse material.

Morgan said: “With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK.

“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.”

A recent Ofcom report showed that 61 per cent of adults and 79 per cent of 12- to 15-year-old internet users reported having had at least one potentially harmful experience online in the previous 12 months.
