
Here's how social media companies are fighting election misinformation

The tech companies say they're better prepared to tackle interference than they were four years ago.

Queenie Wong, Andrew Morse and Richard Nieva
The 2020 presidential election is putting social media to the test. (Getty Images)

Social networks came under fire after Russian trolls used them to sow discord among Americans during the 2016 US presidential election. Now Facebook, Twitter and Google say they're better prepared to tackle misinformation during this year's presidential election.

Facebook and Google, which owns the YouTube video-sharing service, have created databases that let anyone check the source of a political ad, who financed the message and how much was spent. Twitter banned political ads last year. Facebook works with third-party fact-checkers, though it exempts posts by politicians from this program. All of the social networks label problem posts, directing users to online hubs with election information from authoritative sources. (Short-form video app TikTok, another popular social media network, also launched a US elections guide and says it will reduce the spread of misleading videos.)
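Facebook's ad database, for instance, is exposed through its public Ad Library API, which can be queried over HTTP. Below is a minimal Python sketch of such a lookup; the access token is a placeholder, and the API version, parameters and field names are drawn from Facebook's published documentation but should be treated as assumptions rather than guarantees.

```python
# Minimal sketch: querying Facebook's public Ad Library API over HTTP.
# The token is a hypothetical placeholder; version, parameters and field
# names follow Facebook's published docs but are assumptions, not verified.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # hypothetical placeholder

def search_political_ads(search_terms, limit=10):
    """Return basic transparency data for US political ads matching search_terms."""
    response = requests.get(
        "https://graph.facebook.com/v8.0/ads_archive",
        params={
            "access_token": ACCESS_TOKEN,
            "search_terms": search_terms,
            "ad_type": "POLITICAL_AND_ISSUE_ADS",
            "ad_reached_countries": "['US']",  # the API expects a bracketed list
            "fields": "page_name,funding_entity,spend,ad_delivery_start_time",
            "limit": limit,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("data", [])

for ad in search_political_ads("election"):
    # 'spend' comes back as a range, e.g. {'lower_bound': '100', 'upper_bound': '199'}
    print(ad.get("page_name"), ad.get("funding_entity"), ad.get("spend"))
```

The range-style spend figures are what make the "how much was spent" checks described above possible without exposing exact contract values.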

The efforts to combat misinformation come as social media companies weather a storm of criticism from all quarters about the Nov. 3 election, which major news organizations called for Democrat Joe Biden on Saturday morning. Conservatives supporting Republican President Donald Trump say social networks suppressed their speech in an effort to sway the election. The companies deny the allegations. Liberals say the companies haven't done enough to stamp out fake news. 

Read more: It's Election Day: How to avoid getting fooled by misinformation

All of these companies faced an onslaught of misinformation while votes were being counted, and they will likely see new waves now that the contest has been called. Here are the ways the big three social media companies are trying to limit the spread of misinformation.

Facebook

  • Facebook let users turn off all political ads on both the social network's main site and its Instagram photo-sharing service ahead of the election. 
  • Facebook and Instagram launched an online hub for US users for information about voting, including registration, mail-in voting and election-related deadlines. Facebook CEO Mark Zuckerberg says the company helped an estimated 4.4 million Americans register to vote. 
  • The social network stopped accepting new political or issue advertising during the final week of the campaign and will expand policies addressing voter suppression. Facebook will also temporarily halt all election and issue ads after Nov. 3 for an indefinite period of time. 
  • The company will also label posts from politicians who declare premature victory. 
  • Facebook is removing fake accounts designed to mislead others about their identity and purpose, including some with ties to Iran and Russia. The social network has also cracked down on accounts related to QAnon, a far-right conspiracy theory falsely alleging there's a deep state plot against Trump. 
  • Facebook's Messenger and WhatsApp, both messaging apps, are limiting message forwarding (see the sketch after this list).
  • Facebook temporarily suspended recommendations for new and political groups, online spaces where users gather to chat about shared interests. 
  • Facebook is demoting content on the social network and Instagram that may contain misinformation. The company is also limiting the distribution of live videos about the election. On Facebook and Instagram, users who try to share a post with an election label will see a message urging them to visit the voting information center.
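The forwarding limits work roughly like a per-message cap that tightens once a message has already been passed along many times. The sketch below is an illustration of that idea, not Meta's implementation; the thresholds are assumptions based on the limits WhatsApp has described publicly (several chats for ordinary forwards, one chat for "highly forwarded" messages).

```python
# Illustrative sketch of a forwarding cap, not Meta's actual implementation.
# Thresholds are assumptions based on WhatsApp's publicly described limits.
FORWARD_CAP_NORMAL = 5   # assumed cap for ordinary messages
FORWARD_CAP_VIRAL = 1    # assumed cap once a message is highly forwarded
VIRAL_THRESHOLD = 5      # assumed hop count that marks a message as viral

def allowed_forward_targets(forward_hops, requested_chats):
    """Return how many of the requested chats this forward may reach."""
    cap = FORWARD_CAP_VIRAL if forward_hops >= VIRAL_THRESHOLD else FORWARD_CAP_NORMAL
    return min(requested_chats, cap)

print(allowed_forward_targets(forward_hops=1, requested_chats=20))  # -> 5
print(allowed_forward_targets(forward_hops=6, requested_chats=20))  # -> 1
```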

Twitter

  • Twitter banned political ads, one of the strictest moves taken by a social media company.
  • Twitter may delete tweets that violate its policies, temporarily lock the accounts of offending users and suspend repeat offenders. Tweets that could violate the company's policies include messages that provide misleading information about voting, attempt to suppress or intimidate voters, provide false or misleading information about results, or fail to fully or accurately disclose the identity of the tweeter.
  • The short-message social network may also add labels to tweets with misleading information, including those from politicians who declare victory prematurely. In addition, Twitter is labeling tweets that include manipulated media, state-affiliated media or content from politicians and government leaders that violates its rules, but leaves the tweets up because of public interest. 
  • Twitter is making users think twice before they share a tweet that contains disputed information by showing them a warning. The company is also encouraging users to add a comment to retweets. Like other companies, Twitter is trying to direct people to a page with trustworthy election information. 

Google

  • Google made changes to its popular search engine, blocking some autocomplete suggestions for election-related queries. For example, if someone types the phrase "donate to," Google will block autocomplete suggestions that include the names of candidates or political parties (see the sketch after this list).
  • Google will temporarily ban political advertisements after the polls close to try to prevent ads falsely claiming victory.
  • YouTube will label election videos and search results with an information panel that warns, "Results may not be final." The panel will link to a feature on Google with real-time results from the Associated Press.
  • YouTube will show people information panels on mail-in voting when they watch videos that discuss the subject. (The ballot-casting method has become fraught with misinformation as Trump has tried to discredit the process, while providing no evidence of security flaws in the time-tested system.)
  • YouTube banned some videos pushing false conspiracies such as QAnon, pledging to remove content that "targets an individual or group with conspiracy theories that have been used to justify real-world violence." 
  • YouTube banned videos containing information that was obtained through hacking and could interfere with elections or censuses.
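As a rough illustration of the autocomplete change described above, a suggestion filter of this kind can be sketched in a few lines. This is not Google's actual system; the trigger phrases and blocked names are hypothetical examples.

```python
# Illustrative sketch of an autocomplete guardrail, not Google's actual code.
# The trigger phrases and blocked names below are hypothetical examples.
DONATION_PREFIXES = ("donate to", "donate money to")               # hypothetical
BLOCKED_ENTITIES = {"biden", "trump", "democrats", "republicans"}  # hypothetical

def filter_suggestions(query, suggestions):
    """Drop completions naming a candidate or party after a donation-style query."""
    q = query.lower().strip()
    if not any(q.startswith(prefix) for prefix in DONATION_PREFIXES):
        return suggestions
    return [s for s in suggestions
            if not any(entity in s.lower() for entity in BLOCKED_ENTITIES)]

# The candidate-naming completion is suppressed; the neutral one survives.
print(filter_suggestions("donate to", ["donate to trump", "donate to charity"]))
# -> ['donate to charity']
```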

CNET's Alfred Ng contributed to this report.