Exclusive

YouTube says 'authoritative' news viewership has jumped amid COVID-19 pandemic

Another sign of the times: Queries for the term "home school" have doubled.

Richard Nieva
YouTube has seen a surge in usage during the coronavirus pandemic. (Getty)

YouTube, the most popular video platform in the world, has only become more popular during the coronavirus pandemic. As people around the world shelter in place, the Google-owned site has attracted parents on the hunt for children's content, consumers looking for news, and people just trying to find a distraction during stressful times.

The surge in usage, though, could prove thorny for a platform that has for years been plagued with misinformation, extremism and child exploitation. The latest blight on the platform has been conspiracy theories tying COVID-19 to 5G wireless towers.

Still, YouTube says it has a handle on the situation when it comes to misinformation. During the first three months of the year, the company says it has seen a 75% increase in people watching videos from "authoritative" sources, such as legitimate news outlets, government agencies and health authorities like the World Health Organization. YouTube declined to share specific viewership numbers. 

To steer people toward credible information, the company says it has been proactive. YouTube reached out to the team of Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, to set up videos with prominent creators on the platform, including Phil DeFranco, Doctor Mike and Lilly Singh. The videos have tallied around 30 million views collectively. YouTube has also tried to highlight educational content for kids stuck in quarantine. Last month, the company launched a hub called Learn@Home for parents to find educational videos. The company says queries for "home school" on YouTube have doubled since March 13.

Scott Silver, YouTube's top engineer, says he feels a sense of duty to make sure the company acts responsibly during the pandemic. "I feel a personal, moral and ethical impetus to put all of my energy into this so that we get the right information out there," he said in an interview. "It's very easy to feel motivated about this."

The pandemic is the latest test for YouTube to prove it can keep its more than 2 billion monthly users safe when they visit the site. In the past, YouTube has fared particularly poorly during major events or in times of crisis. During the 2016 US presidential election, for example, Russian agents exploited the platform to spread propaganda. After a school shooting in Parkland, Florida, two years later, YouTube recommended videos claiming one of the survivors was a paid actor.

Silver, a 14-year Google veteran who reports to YouTube CEO Susan Wojcicki, says the company has learned from past difficulties and honed its systems to be prepared for this moment. Since January 2018, YouTube says it has reduced viewership by 80% on videos that would later be taken down for policy violations. That means curbing the spread of would-be viral misinformation even before the videos are formally deemed to violate community guidelines.

YouTube's engineers are constantly under pressure to remove offending material. Those takedowns run the gamut: They can be as innocuous as a copyright violation or as horrendous as a terrorist attack. After a shooter in Christchurch, New Zealand, livestreamed himself killing worshippers at two mosques last year, tens of thousands of videos of the incident began to flow onto YouTube. The company's engineers worked feverishly through the night to remove the content.

While the Christchurch tragedy played out online over several horrific hours, the COVID-19 situation, from an engineering standpoint, will be a drawn-out process of policing the site for objectionable content over at least the next few months. Earlier in April, YouTube made the call to ban coronavirus 5G conspiracy videos.

"What this feels like, in some ways, is a very, very long Christchurch," Silver said, though he adds he wants to be careful about comparing the situations.

'Every day is Saturday'

Not surprisingly, people have flocked to YouTube in recent months. Peak usage normally falls on the weekends, but as people settle into stay-at-home routines, that viewing time has seeped into the work week. "Every day is Saturday," Silver said. The company said last month that it's reducing the default playback quality on videos to alleviate the added strain on internet bandwidth. (YouTube, however, would not provide numbers to illustrate the jump in viewership.)

The platform has also leaned on its engineering chops to keep people away from misinformation. In 2018, YouTube announced a product called "information panels," short blurbs that appear under false or misleading videos and aim to debunk the misinformation by linking to accurate sources. YouTube says the information panel for coronavirus videos has been the most successful one the company has created, with more than 14 billion impressions.

The panels are helpful, but they haven't always worked as planned in the past. When the Notre Dame cathedral in Paris went up in flames last April, YouTube's algorithm mistakenly displayed an information panel about the 9/11 terrorist attacks because the software erred in analyzing the images in the video. After the fire, YouTube said its systems made the "wrong call."

When it comes to the coronavirus situation, YouTube knows the tech won't always be perfect. But the company says it's in a better position to deal with the crisis -- and the influx of people on the platform -- because it's used to working at a big scale. "We essentially built for that growth," Silver said. "In many ways, a lot of what we prepared for has come true."