
YouTube algorithms make it easier for pedophiles to find more videos, study finds

That's how YouTube's algorithms work.

Joan E. Solsman Former Senior Reporter

YouTube says it recognizes that "minors could be at risk of online or offline exploitation."

Seth Rosenblatt/CNET

YouTube's recommendation system enables people who leer at videos of children in bathing suits to more easily find other videos that could offer pedophiliac appeal, according to a study reported Monday by The New York Times.

YouTube has expanded its efforts from earlier this year to limit recommendations of "borderline content" to include videos featuring minors in risky situations, the company said in a blog post Monday.

"While the content itself does not violate our policies, we recognize the minors could be at risk of online or offline exploitation," Google-owned YouTube said. "We've already applied these changes to tens of millions of videos across YouTube."

The company referred CNET to the blog post when asked for comment on the study.

The study comes after YouTube disabled comments in February on videos featuring minors. That move followed reports about a subculture of posting comments on YouTube videos of children that could be construed in a sexualized way; the comments included time-stamp links that would take viewers to similar videos at other sexually suggestive moments. YouTube's algorithm played a role in that scandal as well: Once somebody happened upon these videos and clicked one of the comment links, the sidebar of recommended videos would surface many others like it.

The latest study, from researchers at Harvard's Berkman Klein Center for Internet and Society, was conducted after Google removed comments from videos with minors. The study found that YouTube's automatic video-recommendation system would surface otherwise innocuous home movies after a user watched sexually themed content. The result, the researchers said in the report, is a library of videos that sexualizes children.