Facebook Adds Tools To Combat Misinformation in Groups

Facebook users who run Groups will be able to automatically decline incoming posts that have been rated false by fact-checkers.

Queenie Wong
Facebook has tens of millions of active groups on the social network.

James Martin/CNET

Facebook said Wednesday it's adding new tools that could make it easier to combat the spread of misinformation in Groups. 

Facebook Groups, which can be public or private, are online spaces where people can chat about topics such as hiking, parenting and cooking. But people have also used Groups to spread misinformation about subjects including the coronavirus, elections and vaccines. False claims and propaganda remain a big problem on Facebook, especially since Russia's invasion of Ukraine. In some cases, people have used old footage or photoshopped images to misrepresent what's happening in Russia and Ukraine.

Facebook will let administrators who manage Groups automatically decline any posts that have been rated false by fact-checkers.

Facebook

One new feature will let administrators who run Facebook Groups automatically decline any incoming posts that have been rated false by the company's third-party fact-checkers. The social network said the feature will help reduce the number of people who see misinformation.

Facebook didn't say whether posts typically get fact-checked before they're shared in a Group. A company spokeswoman said the social network is also working on a way for administrators to remove posts that are flagged for containing false claims after they've been posted to a Group.

Facebook partners with more than 80 fact-checking organizations, including PolitiFact, Reuters and The Associated Press, to help identify false claims. Users who try to share a fact-checked post see a warning that the post contains false information, but they can still share the content if they want. Facebook doesn't share data about how much content gets fact-checked on its platform.

The release of the new tools shows how Facebook is trying to ramp up its efforts to combat misinformation. There have been questions, though, about how well labeling misinformation on social media works. In 2020, a study by MIT found that labeling false news could lead users to believe unlabeled stories even when those stories contained misinformation. The MIT researchers called this consequence the "implied truth effect." Facebook said that more than 95% of the time when people see a fact-checking label, they don't end up viewing the original content.

The social network also announced other features meant to make it easier for administrators to manage and share Groups. Administrators, for example, will be able to send invites via email and share QR codes that direct people to a Group's About page, where they can learn about the community and join. More than 1.8 billion people use Facebook Groups every month.

Social media sites have also been used to spread scams, so users should be wary of clicking on links or scanning QR codes. Facebook said the QR codes for Groups include the social network's logo.