A pedophilia scandal spurred YouTube to pledge it wouldn't allow comments on videos depicting kids age 13 and younger. Six months later, they're easy to find.
YouTube, Google's massive online video site, is about to embark on the biggest changes yet to kids videos. The company is reengineering how it treats clips directed at children, following this month's record $170 million penalty for violating kids' data privacy. YouTube pledged to disable comments, notifications and personalized ads on all videos directed at children, and its machine learning will police YouTube's sprawling catalog to keep kids videos in line, the company said.
One problem: YouTube's machine learning was already supposed to be suspending comments on videos featuring young minors. It hasn't.
Comment-enabled videos prominently depicting young kids are still easy to find on YouTube. A single YouTube search for one kids-focused subject -- "pretend play" -- returned more than 100 videos with comments enabled, all prominently featuring infants, preschoolers and other children young enough to still have their baby teeth.
After CNET contacted YouTube with a list of these videos, comments were disabled on nearly half of them.
In a statement, YouTube said it invests significantly in the teams and technologies that allow it "to provide minors and families the best protection possible."
"We've suspended comments on hundreds of millions of videos featuring minors in risky situations and implemented a classifier that helps us remove two times the number of violative comments," YouTube spokeswoman Ivy Choi said. "We continue to disable comments on hundreds of thousands of videos a day and improve our classifiers."
But even after this report was published, YouTube continued to tout its commitment to accuracy whenever child safety is at stake.
"We work hard to do this at a very high level of accuracy, and ... different policies have to have different levels," YouTube CEO Susan Wojcicki said Wednesday at the Atlantic Festival in Washington, DC, after she was asked about YouTube's acceptable failure rates in rooting out policy violations. "So child safety, that's one where we're going to say: We need to be right all the time. We need to try to find every single example that we think is problematic."
YouTube is the world's biggest online video source, with 2 billion monthly users -- so big, in fact, it's the world's top source for kids videos too. Kids content is one of its most-watched categories, but YouTube has come under fire for a range of scandals involving children. That $170 million penalty addressed the data YouTube collects on kids without parents' consent. But YouTube has also faced scandals involving videos of child abuse and exploitation and nightmarish content in its YouTube Kids app, pitched as a kid-safe zone.
YouTube's difficulty managing children's content is only one problem in a parade that the Google-owned site has faced in the last few years, including allegations that it allows hate speech to proliferate, spreads conspiracy theories and discriminates against some creators. Google is one of the big tech companies facing increasing questions about the power it wields, with the Justice Department kicking off an antitrust investigation into big tech.
In February, YouTube said it would disable comments on videos with young kids following an outcry over a ring of softcore pedophilia. Some videos featuring young children included comments with predatory links. Clicking on the links would transport viewers to other moments in YouTube videos with a minor in a sexually suggestive position. And once you fell into that rabbit hole, YouTube's recommendation algorithm appeared to feed you more of the same.
So YouTube said it would suspend comments on videos featuring minors who were 13 and younger, as well as on videos featuring older minors who could be at risk of attracting predatory behavior. The changes would take place "over the next few months," YouTube said then. YouTube would make an exception for "a small number of channels that actively moderate their comments and take additional steps to protect children," the company said at the time.
"We announced some really significant changes, one of which is that we are no longer going to allow comments on videos that are featuring young minors anymore and older minors that are engaged in risky behavior," Wojcicki said on stage the day after the policy was announced.
It was a move that she expected would anger innocent young creators and parents who rely on comments for genuine feedback, she said. But "that was a trade-off that we made because we felt like we wanted to make sure that protecting children was our No. 1 priority," she added.
Six months later, CNET's single search found more than 100 videos posted in the last month by more than 100 different channels. They all featured young children -- babies, toddlers and kids clearly no older than elementary school students. All had comments enabled.
The videos ranged from clips with almost no views on channels with zero subscribers to videos that have been viewed nearly 23 million times. One video had 1,750 comments. Several videos showed children in limited clothing, like a young girl in a bathing suit or a baby in a diaper.
Of the more than 100 videos, YouTube suspended comments on 48 of them after CNET provided a list of links -- the video that had 1,750 comments was among them. Generally, these videos with newly disabled comments didn't have any adults on camera; YouTube said that adult supervision is one of several factors it evaluates when disabling comments.
Most comments on these videos appeared innocent: heart-eye emojis, praise about adorableness or feedback about toys.
But occasionally one would raise eyebrows: A video of three girls eating lollipops and playing on fairground rides, for example, had two total comments. One was posted by an account with a swastika avatar going by the name "Kurdish Nazi." The comment, translated from Kurdish, appeared to be a reference to bitcoin.
(YouTube disabled comments on that video after CNET reached out, which erased the Kurdish Nazi comment from public view.)
Other searches found YouTube videos of older children in scant clothing. The pedophilia-ring scandal earlier this year was triggered by a vlogger exposing predatory links after he searched the term "bikini haul." In response, YouTube said it would suspend comments on videos featuring children aged 14 to 17, too, if the subject had potential for abuse, such as videos of dancing or gymnastics.
But one YouTube search for "tumbling" found more than 40 videos with comments enabled that featured teens and preteens performing gymnastics, often in leotards or sports bras. Again, they ranged from videos with only a few views to some that had been viewed millions of times and had hundreds of comments.
One video -- with more than 3 million views and 2,800 comments -- featured three teen girls doing flips in bikinis. They identify their ages as 16 in the video's description. Another video, a compilation of cheerleading fails, showed girls doing the splits to camera in bathing suits and boy shorts and appeared to briefly show a naked infant. It has 123 comments.
Again, most of the comments on these videos were innocent, but stray comments were questionable. Teen girls in sports bras cheerleading were called "jailbait" by one commenter. Another commenter on a different video, of a teen girl doing a series of flips, said "I like ur ass shakin to music better!" One person replied "gross. this girl looks like she's 14 in this video." But another, under the account DudeMan, asked for clarification: "Where was the ass shake?"
A search for "teen bikini haul" videos posted in the last month returned one video by a girl identifying herself as 16 years old, modeling different swimsuits. It has 75 comments. Another was by an influencer who discloses her age -- 17 -- and her birthdate in the video's description. Her video, showing off more than a dozen two-piece suits, has 309 comments, including one comment asking her to "show your uncensored sweet tushy."
"Whats the point of a try-on haul if you dont show your butt??," another wrote. "You're not nude, and you wear these in public right..?? good grief...waste of bandwidth."
One difficulty in relying on algorithms to police videos with minors is that no technology will catch everything, machine learning experts said.
"Whatever they do, it's never going to be perfect," said Christo Wilson, a professor of computer science at Northeastern University. "We just have to accept that an adult could be flagged as a child, or it just doesn't see a child. Regardless of how much machine learning they have, they need to have some sort of human process behind the scenes."
YouTube has more than 10,000 human moderators tasked with addressing videos that violate any of its policies.
The scale of YouTube, where 500 hours of video are uploaded every minute, adds to the challenge. Even if the baseline success rate of YouTube's machine learning is high, the number of videos it fails to catch will still be significant.
"You may still only have one needle in a haystack. But add more and more haystacks, and it'll be easier for someone somewhere to find it," said Christian Shelton, a professor of computer science at the University of California, Riverside. "The technology will never be perfect. No other solution would also be perfect, but you shouldn't let the technology off the hook."
Google and YouTube's scale works in its favor, in some respects. Algorithms need data to learn, and YouTube has more video, and more data about that video, than anyone else. Machine learning for video, which essentially looks at videos as collections of still frames, also requires a level of computational power that's more feasible for a company with Google's resources.
And one of YouTube's policy changes announced this month could help its machine learning improve. As part of its settlement with the FTC, YouTube will require uploaders to identify videos that are "made for kids," it said, effectively introducing more labels on its data.
Algorithms need annotations like these to learn, and the more content gets processed, the more necessary those annotations become, according to Arnav Jhala, a computer science professor at North Carolina State University. Algorithms find patterns and correlations between labels and visible features in the frames.
"The more labels they have, the higher correlation they will have, and on unlabeled video, the algorithms will have a higher accuracy," he said. "But you are dealing with almost an adversary on the other side."
That is, some uploaders have motives to misidentify their videos.
Trolls, for example, could mislabel inappropriate content as kids videos, aiming to sneak sensitive images in front of children's eyes. YouTube has previously seen instances of kids videos with self-harm tips spliced into them. A trend of supposedly "child-friendly" YouTube videos that had familiar kids characters engaged in bizarre, violent or disturbing behavior earned its own moniker: Elsagate. Mislabeled data would pollute the information training a machine-learning algorithm.
Beyond that, video that targets kid audiences is "pretty freaking vague" as a directive to give an algorithm, Wilson said.
And YouTube's track record so far, according to Wilson? "Not great."
Originally published Sept. 9, 5 a.m. PT.
Updates, Sept. 10, Sept. 11 and Sept. 25: Adds quotes from YouTube's CEO and more details of comment-enabled videos.