Facebook post raising concerns about COVID vaccine was most viewed in first quarter

The data follows news that the company shelved the report out of image concerns.

Queenie Wong, Former Senior Writer

Facebook has been sharing more data, but the company is still under scrutiny from politicians and other critics. (Sarah Tew/CNET)

Facebook said over the weekend that its most-viewed content during the first quarter was an article about a doctor who died after a COVID vaccination, data that Facebook reportedly avoided releasing earlier because of concerns it would make the company look bad.

In a shelved transparency report, Facebook said a link to an article bearing the headline "A 'healthy' doctor died two weeks after getting a COVID-19 vaccine; CDC is investigating why" was its most viewed content between January and March. The link was to a story republished by The Chicago Tribune. Far fewer people saw a subsequent update reporting that the medical examiner hadn't found sufficient evidence to determine whether the vaccine was a factor in the doctor's death, according to The New York Times.

The Epoch Times, a far-right media outlet, was the 19th most-popular page on the platform, the report showed.

Facebook's commitment to transparency will almost certainly be questioned, because the data was released only after The New York Times reported that the company had initially shelved the information out of concern for its image. The revelation raises questions about whether the social network is selectively publishing data that helps it counter concerns that polarizing content spreads widely on the platform.

The social network has been a target of politicians and activists who say it doesn't do enough to stop the spread of misinformation, hate speech and other socially questionable material. The Biden administration has urged Facebook to do more to combat COVID-19 misinformation that could make people hesitant to get vaccinated.

Facebook didn't respond to a request for comment. 

Andy Stone, a spokesman for the company, tweeted that "we ultimately held [the information] because there were fixes we needed to make." 

The Times reported on Friday that Alex Schultz, Facebook's chief marketing officer and vice president of analytics, and other executives debated whether the results of the first-quarter report would harm Facebook's image. 

Facebook executives have reportedly raised concerns before about information from CrowdTangle, a company-owned data analytics tool, that shows high engagement for right-wing sites. 

On Wednesday, Facebook published for the first time a report detailing which domains, links, pages and posts were the most widely viewed in the US on Facebook during the second quarter. The most viewed domain in the April-June period was YouTube, and the most viewed link was for Player Alumni Resources. The top page was from Unicef. The most viewed post was an image from a motivational speaker asking people about the first words they see in a block of letters.

Company executives said during a press call that Facebook released the data as part of its broader commitment to transparency. But some critics, including Brian Boland, Facebook's former vice president of product marketing, said the report "fails to deliver on the transparency it promises" because of limitations in the data, calling it "useless."

"After reading through the press release and the report itself I came away believing that this entire effort is a PR stunt," Boland said in a Medium post.

Facebook has also drawn criticism for disabling accounts tied to a New York University research project on political ads on the platform and to German researchers looking into the algorithm used by Instagram, the company's photo-sharing site.