Facebook announced new tools to combat "revenge porn." In a press release Friday, it said it's introducing photo detection technology to help prevent sharing of intimate images without consent.
The technology uses artificial intelligence to detect "near nude" images or videos on Facebook and Instagram and flags them for review by a moderator. Facebook said the new technology will help it remove maliciously shared images from its platforms faster; previously, the company relied on users to report such images.
Facebook also launched a resource hub that gives users advice and instructions on removing photos from its sites and elsewhere on the internet.
The company has faced criticism about its response time for removing offensive content, as well as its policies on what content it removes. The issue became a priority for the company in 2017 after news broke that 30,000 people were part of a private Facebook photo-sharing ring that targeted female U.S. Marines.
Facebook also announced Friday that it's creating a "victim support toolkit" to provide locally and culturally relevant information to people around the world. Those resources are expected to be available later this year.