‘It’s time to rein in big tech’: Lords committee
Written by Peter Walker
The House of Lords Communications Committee has called for a new, overarching regulatory framework so that services in the digital world are held accountable to an enforceable set of shared principles.
In a new report, the committee noted that over a dozen UK regulators have a remit covering the digital world, but no single body has complete oversight. As a result, regulation of the digital environment is fragmented, with gaps and overlaps that have allowed big tech companies to avoid adequately tackling online harms.
Responses to growing public concern have been piecemeal and inadequate, the committee argued, recommending a new Digital Authority, guided by 10 principles to inform regulation of the digital world.
Chairman of the committee, Lord Gilbert of Panteg, said: “The government should not just be responding to news headlines, but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.
“Self-regulation by online platforms is clearly failing and the current regulatory framework is out of date,” he continued. “Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people's lives.”
This new Digital Authority would be established to co-ordinate regulators, continually assess regulation and make recommendations on which additional powers are necessary to fill gaps. It would report to a new joint committee of both Houses of Parliament, whose remit would be to consider all matters related to the digital world.
The 10 principles identified in the committee’s report, which include accountability, transparency, respect for privacy and freedom of expression, should guide all regulation of the internet.
“The principles will help the industry, regulators, the government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all,” read the report, adding that if rights are infringed, those responsible should be held accountable in a fair and transparent way.
The committee stated that a duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom’s remit should expand to include responsibility for enforcing the duty of care, the report suggested.
Online platforms should also make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Social media platforms should invest in more effective moderation systems to uphold their community standards, according to the report.
The committee argued that users should have greater control over the collection of personal data, with maximum privacy and safety settings being the default.
“Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred,” the report read.
The government should also empower the Information Commissioner’s Office to conduct impact-based audits where risks associated with using algorithms are greatest.
Finally, the committee stated that the modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms.
The government should therefore consider creating a public-interest test for data-driven mergers and acquisitions, the peers said.