Social media firms must have legal ‘duty of care’

The Science and Technology Committee has concluded that social media companies must be subject to a legal duty of care to help protect young people’s health and wellbeing when accessing their sites.

Figures produced by Ofcom indicate that 70 per cent of 12 to 15-year-olds have a profile on a social media site, while the OECD reports that 94.8 per cent of 15-year-olds in the UK used social media sites before or after school.

The committee’s new report examined whether the growing use of social media, and screens, among children was healthy or harmful, the evidence base for such claims, and whether any new measures or controls were required.

During the inquiry, over 3,000 young people were surveyed, the committee held an evidence session with young people to hear about their experiences, facilitated focus groups in Westminster with students from Welland Park Academy and took part in an outreach session in Reading with parents.

While the committee heard from witnesses who stated that social media can have a positive impact, the evidence received also pointed towards the potential negative effects of social media on the health and emotional wellbeing of young people.

These ranged from damage to sleep patterns and body image, to bullying, grooming and ‘sexting’.

Although these risks existed before social media, its rise has helped to facilitate them – especially child abuse. The National Crime Agency reported that referrals it received from the National Centre for Missing and Exploited Children had “increased by 700 per cent over the last four years”.

The committee noted that despite these statistics, the quality and quantity of academic evidence on the effects of social media remains low.

Social media companies must be willing to share data with researchers, within the boundaries of data protection legislation, especially on those who are at risk from harmful behaviours, the MPs argued.

The committee also urged government to consider what legislation is required to improve researchers’ access to this type of data, to ensure that social media companies help protect their young users, identify those at risk and help improve current online safety measures.

It also stated that there is currently a “loose patchwork” of regulation and legislation in place, resulting in a “standards lottery”.

Key areas that are not currently the subject of specific regulation, identified by Ofcom, include:

• Platforms whose principal focus is video sharing, such as YouTube.
• Platforms centred around social networks, such as Facebook and Twitter.
• Search engines that direct internet users towards different types of information from many internet services, such as Google and Bing.

The report recommended a comprehensive regulatory framework that clearly sets out the responsibilities of social media companies towards their users.

The government’s forthcoming Online Harms White Paper, and subsequent legislation, present a crucial opportunity to put a world-leading regulatory framework in place, the committee stated, though it expressed concern that “the framework may not be as coherent as it ought to be”.

It suggested establishing a regulator to provide guidance on how to spot and minimise the harms social media presents, as well as take enforcement action when warranted.

Norman Lamb, chair of the committee, said: “The government has a vital part to play and must act to put an end to the current ‘standards lottery’ approach to regulation.

“We concluded that self-regulation will no longer suffice; we must see an independent, statutory regulator established as soon as possible – one which has the full support of the government to take strong and effective action against companies which do not comply.”
