
Tech firms saw 50% rise in online child sexual abuse content in 2019, report says

Facebook, Snapchat and Twitter were among 164 companies that submitted reports of such imagery, according to the report.

Shelby Brown

The number of photos and videos of child sexual abuse online surged by more than 50% in 2019, according to a Friday report from The New York Times. Nearly 70 million images and videos were reported to the National Center for Missing and Exploited Children, the Times said. 

More of these reports involved videos (41 million) than photos, the Times said; five years ago, fewer than 360,000 videos were reported. Some 85% of the total content reported (nearly 60 million photos and videos) came from Facebook, according to the Times. Instagram, which is owned by Facebook, reported an additional 1.7 million pieces of abusive content. 

Of the 164 companies that submitted reports, Snapchat, Twitter, Microsoft, Apple, Dropbox and Google also reportedly detected abusive images and videos. 

In an emailed statement, the National Center for Missing and Exploited Children said it works closely with electronic service providers on voluntary initiatives to deter and prevent the spread of online child exploitation images. 

Twitter said it takes a zero-tolerance approach to child sexual exploitation. The social media site pointed to its latest transparency report, which says the company suspended more than 244,000 accounts for violations related to child sexual exploitation.

"We continue to invest in our proactive technology, industry and NGO partnerships, and law enforcement engagement to tackle this critical issue," a Twitter spokesperson told CNET.

CNET reached out to the other companies named in the report that detected illegal content, and we'll update when we hear back.

Originally published Feb. 7, 12:11 p.m. PT.
Updates, 1:32 p.m.: Adds comment from the National Center for Missing and Exploited Children; 1:50 p.m.: Includes comment from Twitter.