Facebook to remove anti-vaccination recommendations after backlash

The social network outlines steps it's taking to stop the spread of misinformation about vaccines.

Queenie Wong Former Senior Writer

[Image: Facebook CEO and co-founder Mark Zuckerberg. Credit: James Martin]

Facebook is stepping up its efforts to curb the spread of misinformation about vaccines, a move that follows mounting criticism that the world's largest social network hasn't done enough to combat medical myths.

On Thursday, Facebook said it would demote the ranking of groups or pages that spread misinformation about vaccines on its News Feed and in search results. The company also won't recommend these groups or pages to users when they search for vaccination information on the social network.

Facebook will rely on the World Health Organization and the US Centers for Disease Control and Prevention to help identify vaccine conspiracies, the company said in a blog post. Facebook will also reject ads that include misinformation about vaccines and won't allow advertisers to target users interested in topics such as "vaccine controversies."

The effort to combat anti-vaccination misinformation will include other sites Facebook owns. Recommendations for vaccine misinformation won't be displayed in Instagram's "Explore" section, which surfaces posts based on the content a user likes. Instagram also won't recommend the information on pages that display hashtags, which are designed to help users find posts about a particular topic.  

Facebook's move comes as it faces mounting pressure from lawmakers, activists and health experts to prevent anti-vaccination misinformation from going viral. The misinformation may have contributed to a measles outbreak in the US. Users opposed to vaccines have spread misinformation in groups where members have to be approved, making it difficult for Facebook to police this content, according to The Guardian.

Ethan Lindenberger, an Ohio teenager who got vaccinated despite his mom's opposition, told US lawmakers this week in a committee hearing that his mother got misinformation about vaccines from social networks such as Facebook.

The World Health Organization listed "vaccine hesitancy" -- the reluctance or refusal to get vaccinated despite the availability of vaccines -- as one of the top 10 threats to global health in 2019. Washington state declared a public health emergency in January after 25 cases of measles appeared in Clark County. Since Jan. 1, confirmed measles cases in that county have climbed to 70, and 61 of those patients -- mostly children -- were unvaccinated, according to Clark County Public Health.

Facebook is also following in the steps of other tech companies that have beefed up their efforts to combat misinformation about vaccines. Pinterest blocked anti-vaccination searches and has tried to pull down anti-vax content. In February, YouTube said it would remove ads from videos that feature anti-vaccination content. 

Facebook, which has 2.3 billion users worldwide, said it plans to do more to combat misinformation about vaccines. 

"We are exploring ways to give people more accurate information from expert organizations about vaccines at the top of results for related searches, on Pages discussing the topic, and on invitations to join groups about the topic," said Monika Bickert, Facebook's vice president of global policy management, in the blog post. "We will have an update on this soon."