Instagram to blur nudity as part of new features to protect minors from 'sextortion'
Euronews
Instagram will begin testing new features that aim to "protect young people from sextortion and intimate image abuse".
Sexual extortion, or "sextortion", is a form of blackmail in which someone is coerced into sharing sexually explicit photos or videos, which are then used to threaten or exploit them.
One of the new tools will detect and blur images that contain nudity and prompt people to think twice before sending them. The feature will be turned on by default for Instagram users under 18.
When the "nudity protection" tool is on, users will receive a message urging them to "take care when sharing sensitive photos".
They can also withdraw photos they have already sent that contain nudity.
When someone receives a nude photo, it will be blurred with a warning that the picture "may contain nudity".
The tool uses machine learning to analyse whether the image contains nudity.
"Because the images are analysed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images - unless someone chooses to report them to us," Instagram, owned by Mark Zuckerberg's Meta, said in a blog post.
The other features include new notifications sent to people who may have interacted with an account removed for sextortion, as well as hiding the message button on young people's profiles from accounts that may be engaging in sextortion.
Susie Hargreaves, CEO of the UK-based Internet Watch Foundation (IWF), said in a statement provided to Euronews Next that sexual extortion can have "horrendous repercussions" and that the non-profit organisation applauds "any efforts by tech companies to safeguard" children using social media.
"However, while the new tool is a welcome move by Meta, any potential benefits will be undermined by its decision to roll out end-to-end encryption on its messaging channels," Hargreaves added.
"By doing so, it is willfully turning a blind eye to child sexual abuse. More can, and should, be done to protect the millions of children who use Meta’s services".
Recent increases in sextortion
In a report released in January, the Network Contagion Research Institute said financial sextortion was growing at an alarming rate.
The report added that Instagram was the most common platform for targeting victims because blackmailers could get personal information about people quickly.
Snapchat, meanwhile, was the most frequently used platform to coerce people into sending compromising photos.
The European Commission sent requests to Meta and Snap, the parent company of Snapchat, in 2023 to provide more information on measures they have taken to protect minors, which is a requirement under the Digital Services Act.
Social media companies have been repeatedly criticised for failing to protect children online.
At a US Senate hearing earlier this year, Meta CEO Mark Zuckerberg apologised to parents who said social media had harmed their children.
He said no one "should go through what you and your families have suffered".
EU institutions, meanwhile, are considering new laws on child sexual abuse content that would make it mandatory for companies to prevent the dissemination of this type of material. The proposals have been the subject of debate, however, over privacy concerns.
This week, EU lawmakers voted to extend a current exemption to privacy rules that allows platforms to detect child sexual abuse material until 2026.