At YouTube, every time a user uploads something appalling, a cat-and-mouse hunt begins to find and purge the pest.
Now imagine the cat having to sort through 400 critters a minute to find the offending rodent.
With 400 hours of video uploaded to YouTube every 60 seconds, the site has an "absurd" scale, Chief Product Officer Neal Mohan said in an exclusive interview at CES 2018. And rooting out the worst sliver of content is something YouTube admits it must do better.
"It's something that I take extremely seriously, that the teams take extremely seriously," Mohan said, speaking in a makeshift meeting room at the Aria hotel in Las Vegas, where CES' media industry events take place. "We really do care about this community, that it be a place that is safe. We understand the societal impact the platform can have."
YouTube's 2017 was rocked by headlines that pointed out offensive content on the site. The situation endangered the company's relationship with the advertisers who provide its revenue and with the community of video creators who fill YouTube with clips.
Last spring, an outcry about commercials running next to offensive videos sparked an advertiser boycott. When YouTube responded by more aggressively pulling ads off sensitive clips, it ended up outraging the creators who uploaded videos and lost moneymaking power -- an event dubbed "Adpocalypse."
YouTube has grappled with reports that it surfaces nightmarish clips inside its kids-safe app and that some YouTube channels appear to be making money off child exploitation and abuse. Most recently, a video by star personality Logan Paul featured images of a suicide victim, triggering a backlash that spurred YouTube to act, more than a week after it was posted.
Without addressing those specific examples, Mohan said that what gets lost in the headlines is how much gray area there is between what needs to be removed and what's fine. Navigating the nuances around what's objectionable requires a sophisticated understanding of context. And then there's the pure scale of YouTube.
Given those considerations, it's not enough to rely on human users flagging bad material for review, Mohan said. Neither computers nor humans are enough by themselves, so YouTube counts on a combination of machine learning and human reviewers to address objectionable videos.
Machine learning involves software that recognizes patterns to guide its tasks, rather than relying on programmed instructions. Mohan's team establishes a "ground truth training set": a set of examples that helps the system understand and learn what's good and what's bad in videos. YouTube uses the set to narrow the breadth of its content, since humans -- like the cat dealing with a stampede of 400 mice -- can't review it all.
Mohan said his strategy with the machine-learning part of the equation is for machines to be good at covering the entire corpus of video on YouTube, even if it means they sometimes suggest that humans review harmless videos.
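That division of labor -- machines tuned to cast a wide net over the whole corpus, humans making the final call -- can be sketched in a few lines. This is an illustrative toy, not YouTube's actual system: the scoring function, threshold, and video names are all hypothetical, and the only real point is that the machine's flagging threshold is set deliberately low, so borderline or harmless clips get routed to human reviewers rather than missed.

```python
# Illustrative sketch only (not YouTube's system): machine-learning
# triage tuned for high recall. The score function and threshold are
# hypothetical stand-ins for a trained classifier.

def triage(videos, score_fn, threshold=0.2):
    """Send any video whose model score clears a deliberately low
    threshold to the human review queue; auto-clear the rest."""
    review_queue, cleared = [], []
    for video in videos:
        if score_fn(video) >= threshold:
            review_queue.append(video)  # humans make the final call
        else:
            cleared.append(video)
    return review_queue, cleared

# Toy scores standing in for a model trained on a "ground truth" set.
scores = {"cat_clip": 0.05, "borderline_clip": 0.3, "bad_clip": 0.9}
queue, ok = triage(scores, scores.get)
# "borderline_clip" is probably fine, but the low threshold flags it
# anyway -- the machine over-refers rather than under-refers.
```

The low threshold trades precision for recall: reviewers see some harmless videos, but the machine is far less likely to let a bad one slip through the full corpus unexamined.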
On the human front, YouTube said last month that it'll bring on a flood of new reviewers. But Mohan said the human element isn't only about having an army of trained people, it's also about turning to experts in particular kinds of dangerous content, like terrorism or pedophilia. For that, the company is finding partners in those fields of study.
Presumably, a successful setup would mean a YouTube with fewer misbehaving mice and more cats. And everyone, in this case, can agree it'd be great if YouTube had more cats...