Calls for clarity on online harms regulation

MPs and industry bodies have warned that the government’s plans to regulate tech giants and hold them liable for harmful content are “too vague” and will need significant work before the public can have confidence that concrete action is being taken to protect internet users.

The Department for Digital, Culture, Media and Sport yesterday published a white paper outlining plans to consult on a new regulator that would enforce internet rules making social media companies legally liable for harmful content posted on their platforms.

The plans also included substantial fines for technology firms and a mandatory duty of care, under which senior executives could be held personally responsible if they fail to comply with tough new rules. Funding for the regulator would come from a new levy on social media firms.

The government said the paper formed part of plans to make the UK one of the safest places in the world to be online. The move comes after mounting calls for tougher action to tackle violent, extremist and harmful online content.

The plans will now be subject to a 12-week consultation with big tech firms and industry bodies.

Prime Minister Theresa May hailed the proposals as evidence that the era of ‘self-regulation’ by social media platforms was coming to an end, adding that “for too long these companies have not done enough to protect users, especially children and young people, from harmful content”.

However, while industry body techUK issued a cautious welcome for the direction of the plans, head of policy Vinous Ali said there was “still a long way to go” before the government achieved its ambition of creating a world-leading framework to combat online harms, with “many key questions open for consultation”.

She added that implementing the proposed framework would “not be easy” and that “some of the key pillars of the government's approach remain too vague”.

Ali warned that not all of the legitimate concerns about online harms could be addressed through regulation. “The new framework must be complemented by renewed efforts to ensure children, young people and adults alike have the skills and awareness to navigate the digital world safely and securely,” she said.

Meanwhile, Damian Collins, chairman of the Digital, Culture, Media and Sport Select Committee, welcomed the move towards social media companies taking “legal liability to take down harmful content”.

He stated: “It is right as well that an independent regulator, established in law, should have the power to oversee the compliance of the social media companies with these requirements.

“This should include a range of sanctions for failing to meet their duty of care to their users, including large fines, and personal liability for senior directors who are proven to have been negligent of their responsibilities.”

However, Collins raised concerns that the white paper does not address points made by the committee during its inquiry into fake news.

“Disinformation is clearly harmful to democracy and society as a whole,” he said. “The social media companies must have a responsibility to act against accounts and groups that are consistently and maliciously sharing known sources of disinformation.”

Lord Gilbert of Panteg, chairman of the House of Lords Communications Committee, also commented: “Major platforms have failed to invest in their moderation systems, leaving moderators overstretched and inadequately trained – a duty of care is therefore needed to implement minimum standards and to give effect to human rights including freedom of expression.”

However, he said the need for further regulation of the digital world went beyond online harms.

“A comprehensive new approach to regulation is needed to address the diverse range of challenges that the internet presents, such as misuse of personal data and the concentration of digital markets.”
