Misinformation posts on Facebook reportedly saw more engagement than factual news during 2020 election

Reputable sites like CNN and the World Health Organization reportedly received a sixth as much engagement.

Shelby Brown, Editor II
Shelby Brown (she/her/hers) is an editor for CNET's services team. She covers tips and tricks for apps, operating systems and devices, as well as mobile gaming and Apple Arcade news. Shelby also oversees Tech Tips coverage. Before joining CNET, she covered app news for Download.com and served as a freelancer for Louisville.com.
  • She received the Renau Writing Scholarship in 2016 from the University of Louisville's communication department.

An upcoming study reports that misinformation on Facebook drew higher user engagement than factual news sites.

Sarah Tew/CNET

Misinformation on Facebook got six times more clicks than reputable news sites during the 2020 election, according to an upcoming peer-reviewed study from researchers at New York University and the Université Grenoble Alpes in France. The study, reported earlier by The Washington Post, sought to measure and isolate the misinformation effect across a wide group of publishers on the social media site. 

The study offered evidence to support criticism that Facebook rewards publishers that post misleading information, The Post reported. The study reportedly found that misinformation from both the far right and far left generated more engagement from Facebook users than factual news pages, though far right publishers reportedly had a higher propensity for sharing misleading information. 

"This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook," Facebook spokesman Joe Osborne told The Post. "When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests."

Osborne also told The Post that Facebook has 80 fact-checking partners working in over 60 languages to combat false information. CNET reached out to Facebook for comment. 

The study's authors reportedly relied on categorizations from NewsGuard and Media Bias/Fact Check, two organizations that study misinformation, to examine 2,551 Facebook pages, including Occupy Democrats, Dan Bongino and Breitbart, according to The Post.