Meta board recommends 'overhauling adult nudity standard'

Meta’s oversight board has overturned the social media giant’s original decisions to remove two Instagram posts which featured transgender and non-binary people displaying their bare chests.

After being flagged repeatedly by Meta's automated systems and by user reports, the Instagram posts were reviewed for potential violations of several community standards. Meta ultimately removed both posts, deeming them in violation of its sexual solicitation community standard.

The users appealed to the oversight board, which advises Meta on its content moderation policies, and Meta subsequently restored the deleted posts.

The board’s ruling also recommended the company change its adult nudity and sexual activity community standard “so that it is governed by clear criteria that respect international human rights standards”.

Meta has 60 days to respond to the board’s recommendation and has already said it “welcomes the board’s decision in this case”.

A Meta spokesperson said: “We are constantly evolving our policies to help make our platforms safer for everyone.”

They added: “We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organisations on a range of issues and product improvements.”
