YouTube bans misinformation about COVID-19 vaccines

The platform will remove content that contradicts information from the World Health Organization and other health authorities.

Richard Nieva, former senior reporter
Richard Nieva was a senior reporter for CNET News, focusing on Google and Yahoo. He previously worked for PandoDaily and Fortune Magazine, and his writing has appeared in The New York Times, on CNNMoney.com and on CJR.org.
YouTube updates its COVID-19 misinformation policies.

Angela Lang/CNET

YouTube on Wednesday updated its policy on COVID-19 misinformation, banning false claims about vaccinations for the respiratory disease caused by the coronavirus. 

The world's largest video platform, with more than 2 billion visitors a month, will remove content that contradicts "expert consensus" from the World Health Organization or local health authorities. That includes baseless claims that a vaccine will kill people or cause infertility.

Previously, YouTube's rules covered misinformation about coronavirus treatments and prevention, but Wednesday's update specifically calls out content on vaccines. The company said it's removed more than 200,000 videos containing misleading COVID-19 information since February.

The policy change comes as big tech companies face intense scrutiny over misinformation spreading on their platforms. YouTube, owned by Google, has tried to curb false claims about everything from mail-in voting to medical misinformation, a task that's become more critical during a pandemic. The stakes will continue to grow as researchers race to develop a vaccine for the virus. Earlier this year, YouTube struggled to contain the spread of Plandemic, a video that contains false information about COVID-19.

Facebook on Tuesday announced it's banning advertisements that discourage people from getting vaccines, though the social network will still allow ads that advocate for or against legislation or government policies about vaccines.

YouTube has a long history of dealing with anti-vax content. Last year, the company removed advertisements from video channels that discourage vaccinations. YouTube considers anti-vaccination videos harmful content, meaning the channels shouldn't have been monetized in the first place, but the videos slipped through YouTube's filters.

Lawmakers have also put pressure on tech giants to stop the spread of misinformation about vaccinations. Last year, Rep. Adam Schiff, a Democrat from California, wrote an open letter to Google CEO Sundar Pichai urging him to fix the problem of anti-vax content on the search giant's platforms.