For years, Meta has had the ability to detect and limit the overwhelming majority of nude and obscene photos exchanged on one of its platforms. The company now appears ready to use that technology to stop the spread of these images.
Meta announced on Thursday that it will test a nudity protection feature on Instagram: images sent in direct messages that are detected as containing nudity will be automatically blurred. The tool also encourages people to think twice before sending nude images.
The warnings and blurring will be on by default for users under 18, Meta said, and the company will show adults how to change their settings.
The company said the changes will cut down on instances of sextortion, a scheme in which scammers obtain nude images and then use them to extort money from victims.
The social media company said the changes have been endorsed by a variety of child-safety advocates.
“As an educator, parent, and researcher on adolescent online behavior, I applaud Meta’s new feature that handles the exchange of personal nude content in a thoughtful, nuanced, and appropriate way,” said Sameer Hinduja, co-director of the Cyberbullying Research Center. “It reduces unwanted exposure to potentially traumatic images, gently introduces cognitive dissonance to those who may be open to sharing nudes, and educates people about the potential downsides involved. Each of these should help decrease the incidence of sextortion and related harms, helping to keep young people safe online.”
The changes come amid lawsuits filed against Meta. In 2023, the state of Vermont filed a lawsuit against the company, claiming its platforms were harming the mental health of children and young adults.
The lawsuit claimed that 11.9% of all users had received unwanted sexual advances, a figure that rose to 13% among teens ages 13 to 15 and 14.1% among teens ages 16 to 17.
The lawsuit also claimed that 16.3% of users had viewed nudity they “did not want to see” in the seven days before taking an internal survey.
According to a 2021 congressional report, artificial intelligence on Facebook, Instagram's sister platform, can flag more than 99% of violent and graphic content, as well as child nudity, before any user reports it.
Meta's content moderation practices have also drawn scrutiny from Congress, which held a hearing earlier this year on the harm social media poses to young people.
"It is unbelievable, indescribable material. And these platforms are absolutely awash with it," said Sen. Josh Hawley, a Missouri Republican.