Instagram Takes Steps to Protect Teens and Combat Sexual Exploitation by Blurring Nudity in Messages

Instagram is rolling out new tools aimed at protecting young users and combating sexual extortion, including a feature that will automatically blur nudity in direct messages.

The social media platform said in a blog post Thursday that it's testing the features as part of its campaign to fight sexual scams and other forms of "image abuse," and to make it harder for criminals to contact teens.

Meta, which is based in Menlo Park, California, also owns Facebook and WhatsApp, but the nudity-blur feature won't be added to messages sent on those platforms.

Sextortion is a crime in which victims are coerced or tricked into sharing explicit images online and then blackmailed with the threat of making them public. Recent cases have drawn attention to the issue, including the conviction of two Nigerian brothers who targeted teen boys and young men in Michigan, a scheme that tragically resulted in the loss of one victim's life, and a Virginia sheriff's deputy who made headlines for the sexual extortion and abduction of a 15-year-old girl.

Instagram and other social media platforms have faced growing criticism for not doing enough to protect young users. Mark Zuckerberg, CEO of Instagram's parent company Meta Platforms, apologized to the parents of victims of such incidents during a Senate hearing earlier this year.

Instagram said scammers often use direct messages to ask for “intimate images.” To counter this, it will soon start testing out a nudity-protection feature for direct messages that blurs any images with nudity “and encourages people to think twice before sending nude images.”

“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.

The feature will be turned on by default globally for teens under 18. Adult users will get a notification encouraging them to activate it.

Images containing nudity will be blurred behind a warning, giving users the option to view them. They'll also get the option to block the sender and report the chat.

People who send direct messages containing nudity will get a message reminding them to be cautious when sending "sensitive photos." They'll also be told that they can unsend the photos if they change their mind, but that others may have already seen them.

As with many of Meta’s tools and policies around child safety, critics saw the move as a positive step, but one that does not go far enough.

“I think the tools announced can protect senders, and that is welcome. But what about recipients?” said Arturo Béjar, former engineering director at the social media giant who is known for his expertise in curbing online harassment. He said 1 in 8 teens receives an unwanted advance on Instagram every seven days, citing internal research he compiled while at Meta that he presented in November testimony before Congress. “What tools do they get? What can they do if they get an unwanted nude?”

Béjar said “things won’t meaningfully change” until there is a way for a teen to say they’ve received an unwanted advance, and there is transparency about it.

White House assistant press secretary Robyn Patterson also noted Thursday that President Joe Biden “has been outspoken about his belief that social media companies can do more to combat sexual exploitation online.”

Instagram said it’s working on technology to help identify accounts that could potentially be engaging in sexual extortion scams, “based on a range of signals that could indicate sextortion behavior.”

To stop criminals from connecting with young people, it’s also taking measures including not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.

In January, the FBI warned of a “huge increase” in sextortion cases targeting children — including financial sextortion, where someone threatens to release compromising images unless the victim pays. The targeted victims are primarily boys between the ages of 14 and 17, but the FBI said any child can become a victim.

In the six-month period from October 2022 to March 2023, the FBI saw a more than 20% increase in reporting of financially motivated sextortion cases involving minor victims compared to the same period in the previous year.