Instagram is introducing new tools to prevent sextortion, a form of blackmail in which scammers trick children and teenagers into sending explicit images online and then extort them.
The social media site said it will introduce a nudity protection feature for Instagram DMs that blurs images containing nudity and encourages people to think twice before sending nude images. Users will also be told they can unsend these images if they change their mind.
The move comes as a new wave of scammers trick young people into sending explicit images and then demand money or gift cards to prevent these images from being sent to the victim’s friends and family.
Speaking about Instagram’s new anti-sextortion tools, John Shehan, senior vice president at the National Center for Missing & Exploited Children said: “Companies have a responsibility to ensure the protection of minors who use their platforms. Meta’s proposed device-side safety measures within its encrypted environment is encouraging. We are hopeful these new measures will increase reporting by minors and curb the circulation of online child exploitation.”
Instagram added that it will make it harder for potential sextortion accounts to message or interact with people, with adults restricted from starting chats with teenagers they are not already connected to.
Ofcom recently published detailed plans for the regulation of tech firms following the recent launch of the UK government’s Online Safety Bill. It said that its first priority will be protecting children, highlighting research which revealed the scale and nature of “scattergun” friend requests used by predators looking to groom children online.
Instagram said that if someone receives a message containing a nude image, it will automatically be blurred and the recipient can choose whether or not to view it. They will also receive a message encouraging them not to feel pressured to respond and an option to block and report the chat.
The feature uses machine learning to work out whether an image contains nudity. Images are analysed on the device itself, which means the feature works even in end-to-end encrypted chats.
Additionally, Instagram said it is testing new pop-up messages for those who may have interacted with accounts removed for sextortion, who will also be directed to resources such as support helplines.
Instagram said the new feature will be turned on by default for users under 18, while adults will be sent a notification encouraging them to turn it on.
The features build on measures already taken by Instagram, including Lantern, a programme founded by Meta and other companies that allows participants to share information about accounts and behaviours which violate child safety policies.