Meta, Snap given 1 December deadline to inform EU about child protection actions

The European Commission has set a 1 December deadline for Meta and Snap to provide more information on how they protect children from illegal and harmful content.

The EC last month sent urgent orders to a range of social media platform operators, requiring them to detail the measures they have taken to counter the spread of hate speech, violent content and terrorism-related material.

The Commission said that it can open investigations into the companies under the newly introduced Digital Services Act (DSA) if it is not satisfied with their responses. Under the DSA, major platforms are required to tightly regulate illegal and harmful content or risk fines of up to six per cent of their global turnover.

Also on Friday, an official for the Commission warned that X – formerly known as Twitter – has significantly fewer content moderation staff than its rivals. In reports submitted to the EU, X confirmed it had 2,294 EU content moderators compared with 16,974 at Google's YouTube, 7,319 at Google Play and 6,125 at TikTok.

X owner Elon Musk sent shockwaves through the industry last year when he slashed staffing numbers at the social media platform, including a significant portion of its content moderation team.
