Tech firms forced to ‘tame toxic algorithms’ under new Ofcom rules

Ofcom has called on tech firms to “tame toxic algorithms” to keep children safe online.

The regulator has released a draft code of practice detailing 40 steps which organisations should take to prevent children from seeing harmful content such as suicide and self-harm-related videos and pornography.

In the draft Children’s Safety Codes of Practice, Ofcom said firms must first assess the risk their service poses to children and then implement safety measures to mitigate those risks.

The measures ask firms to carry out “robust” age checks to stop children accessing harmful content; ensure that algorithms which recommend content do not operate in a way that harms children; and introduce better moderation of harmful content.

Ofcom said the guidelines have been drawn up over the past year in consultation with around 7,000 parents as well as professionals who work with children.

Around 15,000 young people have spoken to the regulator about their lives online. Over a four-week period, 62 per cent of children aged 13-17 reported encountering online harm, and many consider it an ‘unavoidable’ part of their online lives.

The regulator says that it expects all services, including search engines, to have content moderation systems and processes in place to ensure action is taken against harmful content.

Ofcom said it will launch an additional consultation later this year on how automated tools, including AI, can be used to proactively detect illegal content and content most harmful to children.

Dame Melanie Dawes, chief executive at Ofcom, said that many children have their experiences “blighted” by harmful content which they have no control over, while many parents are worried about how to keep their children safe online.

“Our measures – which go way beyond current industry standards – will deliver a step-change in online safety for children in the UK,” she continued. “Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account. That’s a promise we make to children and parents today.”

Michelle Donelan, technology secretary, said that the new measures would bring about a “fundamental change” in how children experience the online world and urged technology companies to “step up” and meet their responsibilities rather than wait for enforcement and fines.

“When we passed the Online Safety Act last year we went further than almost any other country in our bid to make the UK the safest place to be a child online,” Donelan said. “That task is a complex journey but one we are committed to, and our groundbreaking laws will hold tech companies to account in a way they have never before experienced.”

Hanna Basha, partner at law firm Payne Hicks Beach, said that Ofcom’s new measures are a welcome development following its earlier announcement of an investigation into OnlyFans for failing to implement age verification measures.

“These measures are a welcome development, but it is difficult to see how Ofcom will be taken seriously by social media platforms without demonstrating its commitment to levying serious penalties on offending platforms,” she warned. “If OnlyFans has failed to meet its obligations, Ofcom will need to send a strong signal that this failure will not be tolerated, otherwise its statement of intention to protect children will not be credible.”

The announcement comes after British charity 5Rights Foundation urged brands selling products or advertising on TikTok to use their “collective power” to fight online harms on the platform.

On Tuesday, Duncan McCann, head of accountability at the organisation, told FStech that while brands and retailers are concerned about online harms and are actively discussing issues such as inappropriate product placements and controversial content, many feel they are prisoners to platforms like TikTok because the platform reaches one of their most valuable and important target segments.

In November last year, Ofcom published detailed plans for the regulation of tech firms following the launch of the government’s long-awaited Online Safety Bill. It said that its first priority as the UK’s new online safety regulator was to protect children, highlighting research which revealed the scale and nature of “scattergun” friend requests used by predators looking to groom children online.


