Online Safety Bill ‘must have clear guidance’ for tech companies

The government needs to be clear about what makes online material harmful if new laws are to be effective, experts have said.

BCS, The Chartered Institute for IT, praised the new legislation but warned that tighter guidelines are necessary to make sure the Online Safety Bill is a success.

“This is a positive step that will reassure parents and increase online safety and accountability,” said Dr Bill Mitchell, director of policy at BCS. “However, the proposed Bill will be challenging in practice unless there is robust and objective guidance for social media platforms on what legal online content constitutes harm and must be removed.”

Mitchell called for a comprehensive public debate on how to balance the need to limit online harm and at the same time nurture freedom of speech and the freedom to disagree in a civilised manner.

“We fully support the Government’s focus on tackling child abuse and racist and misogynistic abuse online, while restating that there are people from a wide range of backgrounds who need more help with online safety, and indeed access to the benefits of the internet,” the director added. “We believe the UK won’t reach its full digital potential until everyone who is willing and able is equipped with the skills and opportunity to use the internet safely.”

Mitchell concluded: “Lack of access to digital technology is an ‘offline harm’ and further government action to coordinate efforts to close the digital divide is required to understand who is being left behind and how they can be included.”

Under the new rules, Ofcom will be given the power to fine companies up to £18 million or 10 per cent of their annual global turnover, whichever is higher – a figure that could run into billions of pounds for larger companies.

The Bill also includes a deferred power making senior managers at firms criminally liable for failing to follow a new duty of care. Ofcom will also have the power to block access to sites.

“As it stands, the Bill is focused almost entirely on requiring providers to fix and safeguard against potential harms; there is little mention of other stakeholders with influence and social capital in this area,” said Andy Phippen, professor of IT ethics and digital rights at Bournemouth University and BCS law specialist group committee member. “Young people tell me they want better education, skills and knowledgeable adults they can talk to about their concerns, rather than calling on platforms to adopt an intangible duty of care.”

Phippen added: “We can’t make kids safe online, but we can make them informed about risk and help them mitigate it. Industry is responsible for providing them with the tools to help mitigate risk, but can’t solve it on its own.”

