Social media ‘responsible for harmful content’

Written by Hannah McGrath
05/04/2019

New laws could hold bosses at Big Tech firms such as Facebook and Twitter responsible for harmful content on their platforms, according to a leaked report.

The Guardian reported that the Department for Culture, Media and Sport is planning to introduce a statutory duty of care for bosses of tech firms, according to a white paper seen by the publication.

The document also suggested that the scheme could be funded by a levy on media companies and that broadcast regulator Ofcom could step in to oversee the system until a permanent regulator could be established.

Under the plans outlined in the white paper, the new regulator would have powers to impose fines on individuals and firms that breach the terms of the new legislation, which would make tech platforms liable for harmful content hosted on their services.

The action from government to draw up proposals for regulation of social media and technology platforms comes after the death of Molly Russell, who took her own life after viewing self-harm images online.

Responding to the reports of a new legal framework to tackle online harms, Vinous Ali, head of tech policy at industry body techUK, said that it was vital that any new laws "get the detail right so that we have an approach that is clear, effective and proportionate."

She added: "Digital platforms and social media now play a huge part in all our lives and make a significant contribution to the success of the UK’s digital economy. We share the Government’s ambition to reduce harms where they exist, particularly for vulnerable users. To provide a workable blueprint for effective regulation the White Paper will have to tackle some challenging issues head on."

Earlier this week, Facebook chief executive Mark Zuckerberg said he was mulling whether to introduce a verified news section on the platform, following mounting criticism of the spread of fake news and disinformation on the social media site, which has been linked to interference in elections.

Discussing the move, Zuckerberg said the new section could use either human ‘editors’ or algorithms to curate stories from “broadly trusted” outlets. He also said Facebook was considering paying news publishers to include their content in the section, as a way of supporting publishers of “high-quality, trustworthy content”.

“We’re not going to have journalists making news,” he said. “What we want to do is make sure that this is a product that can get people high-quality news.”