YouTube's 'whack-a-mole approach' to child safety isn't working, critics say

The latest scandal isn't the first time the Google-owned video site has had a problem with the exploitation of kids.

Richard Nieva
[Photo: Google headquarters in Mountain View. YouTube is facing a scandal involving child exploitation. Getty]

YouTube is under fire again for allowing content that exploits children onto its platform. After reports of a "softcore pedophile ring" on the Google-owned site surfaced last weekend, YouTube on Thursday said it's taking aggressive action to fix the problem. The company banned more than 400 accounts and took down dozens of videos that put children at risk.

But even though YouTube addressed this particular controversy, critics of the company say they're fed up that problems with child safety keep arising in the first place. For example, two years ago, YouTube faced a backlash after disturbing videos got past filters on YouTube Kids, a version of the service designed for children.

"This has been happening for years," Haley Halverson, vice president of advocacy and outreach at the National Center on Sexual Exploitation, said in an email. "Why isn't it YouTube's No. 1 priority to create sustained solutions, instead of carrying on with its current whack-a-mole approach?"

The latest incident began on Sunday, when a video blogger named Matt Watson detailed how pedophiles could enter a "wormhole" of YouTube videos to see footage of children in sexually suggestive positions. In the comments of those videos, users would post time stamps linking to other videos, and YouTube's algorithms would recommend even more of those kinds of videos.  

In response, advertisers including AT&T and Epic Games, maker of Fortnite, pulled ad spending from YouTube.

YouTube declined to make an executive available for an interview, but in a statement a spokeswoman said: "Any content -- including comments -- that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube ... There's more to be done, and we continue to work to improve and catch abuse more quickly."

But child advocacy groups say the company isn't moving fast enough.

Not the first time

YouTube has a history of people abusing the service to exploit kids.

In 2017, parents started noticing troubling videos appearing on YouTube Kids. One video showed Mickey Mouse in a pool of blood while Minnie Mouse looked on in horror. In another, a claymation version of Spider-Man urinated on Elsa, the princess from "Frozen." The videos were knockoffs depicting the beloved Disney and Marvel characters.

Also that year, YouTube faced controversy after sexually explicit comments appeared under videos of kids doing innocuous activities, like performing gymnastics.

Outside the realm of child safety, advertisers also boycotted YouTube that year after its automated ad-placement technology put their ads next to extremist and hate content. Major brands including AT&T and Johnson & Johnson ditched advertising on the platform.

In response to those scandals, CEO Susan Wojcicki overhauled YouTube's safety guidelines. The new rules included removing ads from inappropriate videos targeting families and blocking inappropriate comments on videos featuring minors.

Two years later, critics are upset that these incidents keep popping up. Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, says YouTube treats child safety as a public relations issue, dealing with individual controversies only when they attract media attention instead of addressing the root of the problem.

Golin said it's particularly egregious that YouTube's recommendation algorithm suggested even more videos that put children at risk.

"If you realize that your algorithm is recommending videos that would appeal to pedophiles and you're not stopping to think about that, what will make you stop and think?" he said. "If pedophiles won't make you look in the mirror, what will?"

There are no easy fixes, he said, partly because of YouTube's massive scale. It's the largest video site on the planet, with more than 1 billion visitors a month.

"I don't think you can provide a safe space for children if your business model is volume," he said.