YouTube to disable comments on videos featuring minors after child safety fears

But the company will leave comments on for a "small number" of creators who are minors.

Richard Nieva Former senior reporter
Richard Nieva was a senior reporter for CNET News, focusing on Google and Yahoo. He previously worked for PandoDaily and Fortune Magazine, and his writing has appeared in The New York Times, on CNNMoney.com and on CJR.org.

YouTube is updating its comments policy.

Getty Images

YouTube on Thursday said it won't allow people to leave comments on some videos that feature minors, as the Google-owned video site deals with a scandal involving what one blogger called a "softcore pedophilia ring."

The company will disable comments on all videos that star "young minors" and "older minors that could be at risk of attracting predatory behavior," YouTube said in a blog post.

However, the company said it'll keep comments enabled for a small number of creators who are minors. In those cases, the videos will be actively moderated and YouTube will work with the creators directly. Though the company said it's starting with a small group, it eventually wants to open up comments again to more creators.

YouTube said it's also launching software designed to automatically detect and remove predatory comments.

The controversy over child exploitation began last week, when a video blogger named Matt Watson detailed how pedophiles could enter a "wormhole" of YouTube videos to see footage of children in sexually suggestive positions. In the comments on those videos, users would post timestamps linking to other videos, and YouTube's recommendation algorithms would then surface even more of those kinds of videos.

In response, advertisers including AT&T and Epic Games, maker of Fortnite, pulled ad spending from YouTube. After that, YouTube banned more than 400 accounts and took down dozens of videos that put children at risk.

The new comments policy comes as YouTube faces intensifying criticism over the content on its platform, which sees more than 1 billion visitors a month. Last week, YouTube also said it would remove ads from antivaccination videos because the site considers them harmful. The site faced further blowback earlier this week after a pediatrician from Florida reportedly pointed out children's videos with suicide tips spliced into them.

YouTube's problems with child safety aren't new. In 2017, parents started noticing troubling videos appearing on YouTube Kids. One video showed Mickey Mouse in a pool of blood with Minnie looking on in horror. In another, a claymation version of Spider-Man urinated on Elsa, the princess from "Frozen." The videos were knockoffs depicting the beloved Disney and Marvel characters.

Also that year, YouTube stoked controversy after sexually explicit comments appeared under videos of kids doing innocuous activities, like performing gymnastics.