Mozilla is sharing YouTube horror stories to prod Google for more transparency

Researchers should have more access to YouTube's algorithms to figure out how the video site makes its recommendations, Mozilla says.

Joan E. Solsman, Former Senior Reporter
How you get from one YouTube video to the next is a tangle that researchers would like to figure out. (Angela Lang/CNET)

Mozilla is publishing anecdotes of YouTube viewing gone awry -- anonymous stories from people who say they innocently searched for one thing but eventually ended up in a dark rabbit hole of videos. It's a campaign aimed at pressuring Google's massive video site to make itself more accessible to independent researchers trying to study its algorithms. 

"The big problem is we have no idea what is happening on YouTube," said Guillaume Chaslot, who is a fellow at Mozilla, a nonprofit best known for its unit that makes and operates the Firefox web browser. 

Chaslot is an ex-Google engineer who has investigated YouTube's recommendations from the outside since leaving the company in 2013. (YouTube says he was fired for performance issues.) "We can see that there are problems, but we have no idea if the problem is from people being people or from algorithms," he said. 

YouTube is the world's biggest online video source, with 2 billion monthly users, and 70% of the time people spend watching videos there is the result of one of its recommendation engines. But YouTube has come under fire for how its recommendation technology can inflame problems on the service, such as pedophilia, extremist content or misinformation. 

In a statement, YouTube said that it has been working on improving recommendations since the beginning of the year and aims to elevate authoritative material.

"While we welcome more research on this front, we have not seen the videos, screenshots or data in question and can't properly review Mozilla's claims," a YouTube spokesman said. "Generally, we've designed our systems to help ensure that content from more authoritative sources is surfaced prominently in recommendations. We've also introduced over 30 changes to recommendations since the beginning of the year, resulting in a 50% drop in watchtime of borderline content and harmful misinformation coming from recommendations in the U.S."

Ashley Boyd, Mozilla's vice president of advocacy, said that YouTube's claim about the 50% drop in time spent watching borderline or harmful misinformation is exactly the kind of data YouTube should allow independent researchers to verify.

"That's the old era of 'Trust us, we've got this,'" she said. "Show us your work." 

Borderline content is how YouTube describes videos that aren't glaring misinformation but are still questionable. YouTube originally announced its 50% reduction statistic in a June blog post about its progress tackling hate on the platform. 

"What we're trying to do is provide a window into the consumer concern about this," Boyd said. "There's a question about whether people care about this problem because they haven't left the platform. We don't think that's a good measure of whether people care."

Mozilla is publishing 28 stories it's terming #YouTubeRegrets; they include, for example, an anecdote from someone who said a search for German folk songs ended up returning neo-Nazi clips, and a testimonial from a mother who said her 10-year-old daughter searched for tap-dancing videos and ended up watching extreme contortionist clips that affected her body image. CNET hasn't independently verified any of the stories Mozilla published.

Originally published Oct. 14 at 9 p.m. PT. 
Update, Oct. 15: Adds YouTube statement and more details from Mozilla.
Correction, Oct. 16: Chaslot was previously a Google engineer.
