Ofcom has introduced new measures to crack down on terrorism, child sexual abuse, and racism-related content on video-sharing platforms (VSPs).
The watchdog claims that one third of users have seen “hateful” content on VSPs such as TikTok, Snapchat, OnlyFans, Vimeo, and Twitch.
If platforms breach the rules, they will face a fine of up to 5 per cent of turnover or £250,000, whichever is greater, similar to GDPR rules.
In extreme cases, Ofcom will have the jurisdiction to shut down the UK operations of non-compliant companies.
The rules will impact 18 VSPs in total. However, YouTube and Facebook will not be affected by the new regulations, as they are domiciled in Ireland, though they will fall under the remit of the wider-reaching Online Safety Bill.
VSPs operating in the UK will now be expected to:
• provide and effectively enforce clear rules for uploading content
• make the reporting and complaints process easier
• restrict access to adult sites with robust age-verification
Ofcom's job will not involve assessing individual videos, unlike its role policing broadcast TV.
There was a 77 per cent increase in the amount of self-generated abuse content in 2020, according to UK non-profit the Internet Watch Foundation.
The news comes after Ofcom appointed Anna-Sophie Harling as its head of online safety in August.
"Online videos play a huge role in our lives now, particularly for children,” said chief executive Dame Melanie Dawes. "But many people see hateful, violent or inappropriate material while using them.”
She added: "The platforms where these videos are shared now have a legal duty to take steps to protect their users."