Facebook gets about 500,000 reports of revenge porn a month, report says
The social network is using a mix of AI tools and a dedicated team to combat revenge porn.

Facebook reportedly launched a research program in 2018 to explore how it can better support victims of revenge porn.
Facebook has been working for years on tools to prevent and remove revenge porn on its apps, but that apparently hasn't stopped bad actors from trying to share these images. Facebook, which also owns the popular apps Instagram, Messenger and WhatsApp, has to assess about 500,000 reports of revenge porn each month, according to a report Monday from NBC News.
Facebook, the world's largest social network, earlier this year launched artificial intelligence tools that can spot revenge porn, also known as nonconsensual intimate images, before users report it. In 2017, the company also launched a pilot program that let users submit intimate pictures to Facebook in an effort to prevent them from being shared on the social network.
However, Facebook's Radha Plumb told NBC News that the company's initial explanation of the pilot wasn't clear enough, and after negative feedback Facebook launched a research program in 2018 to explore how it can better prevent revenge porn and support victims.
"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," Plumb, head of product policy research at Facebook, told NBC News.
Facebook reportedly now has a team of 25 people, not including content moderators, focused on preventing the nonconsensual sharing of intimate photos and videos.
Facebook didn't immediately respond to a request for comment.