Instagram is taking new measures to protect young people from sexual exploitation by introducing a feature that automatically blurs nudity in direct messages.
In an effort to fight sexual scams and other forms of “image abuse,” the social media platform is testing out new features, according to Instagram’s blog on April 11.
The new features aim to make it harder for sexual predators and online criminals to contact teenagers through direct messaging on the platform.
Sexual extortion, also referred to as sextortion, works similarly to blackmail: a victim is coerced into sending explicit photos of themselves, and once the criminals have the photos, they threaten the victim with public exposure unless they receive money or sexual favors in return.
Notable cases include that of two Nigerian men who pleaded guilty to sexually extorting teen boys and young men in Michigan; one of the boys subsequently committed suicide. In another incident, a law enforcement officer from Virginia sexually extorted and kidnapped a 15-year-old girl.
The new initiative follows growing criticism and pressure on social media platforms to do more to protect young people, which prompted Meta CEO Mark Zuckerberg to apologize to the parents of victims of such abuse during a Senate hearing earlier this year.
Instagram is part of the Meta conglomerate, which operates out of Menlo Park, California. The nudity blur feature, however, will apply only to Instagram and not to other Meta apps such as Facebook and WhatsApp.
Requests for Intimate Images
According to Instagram, requests for intimate images often come from scammers via direct messages in users’ inboxes. The nudity-protection feature will blur out any images with nudity in direct messaging, in addition to “encouraging people to think twice before sending nude images.”
“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.
Users under 18 will have the feature activated by default, while adult users will receive a notification encouraging them to turn it on for their protection. Blurred nude images will arrive behind a warning screen, with the option either to view the image or to block the sender and report the chat.
Those who send direct messages containing nudity will be prompted to exercise caution when sharing sensitive photos and given the option to unsend the images if they change their mind later, although the photos may already have been viewed by that time.
Family Disconnect
According to Rachel Goldberg, a marriage and family therapist who spoke to NTD, a disconnect between parents and their children contributes to the problem, leaving children reluctant to discuss the issue with their parents when it arises.
“These teens are coming to me first if they are my clients, instead of their parents, which is something that needs to be addressed. Parents need to maybe have this conversation with their children about what can happen, and that they will not be in trouble if they come to them,” Ms. Goldberg said.
Meanwhile, critics of the new measure say it’s a step in the right direction, although it does not go far enough.
“I think the tools announced can protect senders, and that is welcome. But what about recipients?” said Arturo Béjar, a former engineering director at the social media giant who is known for his expertise in curbing online harassment.
According to Mr. Béjar, unwanted advances aimed at teenagers are frequent, with roughly 1 in 8 teens receiving them in a given week, based on internal research he gathered while at Meta and presented in testimony before Congress last November.
“What tools do they get? What can they do if they get an unwanted nude?”
Mr. Béjar said that no significant change would occur until teens can easily report unwanted advances and the company is transparent about the issue.
The White House has also weighed in on the issue, with assistant press secretary Robyn Patterson noting on April 11 that President Joe Biden has openly expressed his view that social media companies can do more to combat sexual exploitation online.
Attorneys general in 33 states, including California and New York, sued Instagram's parent company, Meta, in October over allegations that it repeatedly misled the public about the dangers of its platforms.
Instagram said it’s working on technology to help identify accounts that could potentially be engaging in sexual extortion scams “based on a range of signals that could indicate sextortion behavior.”
Additional measures are also being taken to stop criminals from connecting with young people. This includes not showing the “message” button on a teen’s profile to potential sextortion accounts, even if they already follow each other, and testing new ways to hide teens from these accounts.
Increase in Sextortion Cases
The FBI reported in January a significant increase in sextortion cases aimed specifically at children, including financial sextortion, in which the victim is threatened with public exposure unless they pay the perpetrator.
According to the FBI, while the majority of cases target boys between the ages of 14 and 17, children of any age are at risk.
From October 2022 through March 2023, the FBI noted a more than 20 percent increase in reported cases of financial sextortion involving minor victims compared with the same period a year earlier.
The Associated Press contributed to this article.