
Facebook tests tools to combat child sexual abuse

Facebook is testing a pop-up notice that'll appear when users try to search for content tied to child exploitation. And an alert will warn users who try to share such content.

Queenie Wong

Facebook has been under pressure to do more to crack down on images of child sexual abuse. 

James Martin/CNET

Facebook is testing new tools aimed at curbing searches for photos and videos that contain child sexual abuse and at preventing the sharing of such content.

"Using our apps to harm children is abhorrent and unacceptable," Antigone Davis, who oversees Facebook's global safety efforts, said in a blog post Tuesday

The move comes as the social network faces more pressure to combat this problem amid its plans to enable default encryption for messages on Facebook Messenger and Facebook-owned photo service Instagram. End-to-end encryption would mean that no one other than the sender and recipient, including Facebook and law enforcement officials, could view the messages. Child safety advocates have raised concerns that Facebook's encryption plans could make it harder to crack down on child predators.

The first tool Facebook is testing is a pop-up notice that appears if users search for a term that's associated with child sexual abuse. The notice will ask users if they want to continue, and it includes a link to offender diversion organizations. The notice also says that child sexual abuse is illegal and that viewing these images can lead to consequences including imprisonment.


Facebook users who try to search for words tied to child sexual abuse content will see this pop-up notice that urges them not to view these images and to get help. 

Facebook

Last year, Facebook said it analyzed the child sexual abuse content reported to the National Center for Missing and Exploited Children. The company found that more than 90% of the content was the same or similar to previously reported content. Copies of six videos made up more than half the child exploitative content reported in October and November 2020. 

"The fact that only a few pieces of content were responsible for many reports suggests that a greater understanding of intent could help us prevent this revictimization," Davis wrote in the blog post. The company also conducted another analysis, which showed that users were sharing these images for other reasons outside of harming the child, including "outrage or in poor humor."

The second tool Facebook said it's testing is an alert that'll inform users if they try to share these harmful images. The safety alert tells users that if they share this type of content again, their account may get disabled. The company said it's using this tool to help identify "behavioral signals" of users who might be at a greater risk of sharing this harmful content. This'll help the company "educate them on why it is harmful and encourage them not to share it" publicly or privately, Davis said.

Facebook also updated its child safety policies and reporting tools. The social media giant said it'll pull down Facebook profiles, Pages, groups and Instagram accounts "that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in the image." Facebook users who report content will also see an option to let the social network know that the photo or video "involves a child," allowing the company to prioritize it for review. 

During the coronavirus pandemic, online child sexual abuse images have increased, according to a January report by Business Insider. From July to September, Facebook detected at least 13 million of these harmful images on the main social network and Instagram.