YouTube algorithms make it easier for pedophiles to find more videos, study finds
YouTube's recommendation system makes it easier for people who leer at videos of children in bathing suits to find other videos with potential pedophiliac appeal, according to a study reported Monday by The New York Times.
YouTube said in a blog post Monday that it has expanded its efforts from earlier this year to limit recommendations of "borderline content" to include videos featuring minors in risky situations.
"While the content itself does not violate our policies, we recognize the minors could be at risk of online or offline exploitation," Google-owned YouTube said. "We've already applied these changes to tens of millions of videos across YouTube."
The company referred CNET to the blog post when asked for comment on the study.
The study comes after YouTube disabled comments in February on videos featuring minors. That move followed reports about a subculture of posting comments on YouTube videos of children that could be construed in a sexualized way; the comments included time-stamped links that would take viewers to sexually suggestive moments in similar videos. YouTube's algorithm played a role in that scandal as well: once somebody happened upon these videos and clicked one of the comment links, the sidebar of recommended videos would surface many others like them.
The latest study, from researchers at Harvard's Berkman Klein Center for Internet and Society, was conducted after Google removed comments from videos with minors. The study found that YouTube's automatic video-recommendation system would surface otherwise innocuous home movies after a user watched sexually themed content. The result, the researchers said, was a library of videos that sexualizes children.