Facebook pulls down fake accounts that spread COVID-19 vaccine disinformation

The social network says the operation was based in Russia and posted about the AstraZeneca and Pfizer COVID-19 vaccines.

Queenie Wong

Facebook is filled with false claims about COVID-19 vaccines. (Sarah Tew/CNET)

Facebook said Tuesday that it pulled down 308 fake accounts across Facebook and Instagram that pushed disinformation about the AstraZeneca and Pfizer COVID-19 vaccines.

The social network has been under pressure from US politicians and regulators to do more to combat false claims about COVID-19 vaccines. Though Facebook partners with fact-checkers, labels content and directs people to a hub with coronavirus information, advocacy groups and other critics have pointed out that vaccine misinformation continues to spread on the social network and its photo app, Instagram.

Facebook said it removed the fake accounts in July for violating its rules against foreign interference and for misleading others about the purpose of the accounts and the identity of the people behind them. The social network linked the accounts to Fazze, a subsidiary of a UK-registered marketing firm that operated mainly from Russia, and banned Fazze from its platform. CNET couldn't immediately reach Fazze for comment. The accounts targeted people in Latin America, India and the US but didn't attract a large audience, Facebook said.

In November and December 2020, the accounts posted false claims that the AstraZeneca vaccine would turn people into chimpanzees. The accounts also created misleading petitions and posts on other sites, such as Change.org, Medium and Reddit, Facebook said. Posting in English and Hindi, the blog posts and petitions claimed that AstraZeneca had manipulated its COVID-19 vaccine trial data and used an untried technology to create the vaccine. The accounts then paired the blogs and petitions with memes repeating the chimpanzee claim and with hashtags suggesting the vaccine was dangerous. Facebook said the operation likely asked health and wellness influencers to share the content to spread the campaign against the AstraZeneca vaccine.

The fake accounts then went dormant until May 2021, Facebook said, when they pushed claims that Pfizer's COVID-19 vaccine caused a much higher "casualty rate" than other vaccines, citing a 12-page document the operation said had been hacked and leaked from AstraZeneca. The BBC reported in July that Fazze was recruiting influencers, including on Google-owned YouTube, to spread disinformation about COVID-19 vaccines, noting that the "data the influencers were asked to share had actually been cobbled together from different sources and taken out of context." Just because someone dies after receiving a vaccine doesn't mean the vaccine caused the death, the BBC noted.

Both phases of the campaign coincided with periods when governments were discussing emergency authorization of the AstraZeneca and Pfizer vaccines. Facebook noted, though, that there's still information it doesn't know, including who commissioned Fazze to run the disinformation campaign.