
The misinformation pandemic is out of control, but there's a fix

Here's why the war on facts will get worse, but there's also a way out of the madness.

Oscar Gonzalez, Former staff reporter
Oscar Gonzalez is a Texas native who covered video games, conspiracy theories, misinformation and cryptocurrency.

Conspiracy theories and misinformation about QAnon, COVID-19 and 2020 election fraud took a deadly turn in 2021. As bad as things were last year, experts worry it'll get worse in 2022.

"I think we're going to see an acceleration and expansion of the conspiracy theories," said Mike Caulfield, research scientist at the University of Washington Center for an Informed Public. "They're going to go bigger, they are going to play even more loosely with the truth."

This expected ramp-up could mean a widening divide among Americans, more outlandish ideas being shared and, as shown in this past year, potentially more lost lives. We'll see a real-world test of how bad this could get with the approach of the 2022 midterm elections, around which misinformation peddlers are expected to continue their onslaught on the truth. 

One reason it could get worse is that the federal government and tech companies aren't getting ahead of the problem. 

"We are still very much in a reactive mode, and until we get ahead of some of this, we can expect each cycle to be worse than the last," Caulfield said. 

There is good news. This can be fixed, but it'll take some effort.

In his study, The Perfect Storm: A Subcultural Analysis of the QAnon Movement, Chris Conner, visiting assistant professor of sociology at the University of Missouri, Columbia, argues that people support the QAnon conspiracy theory because they mistrust the government and public officials after social systems in the US have failed them, whether through economic hardship or a lack of adequate coverage for mental illness. That failure leaves believers feeling alienated and dissatisfied with how their lives have turned out. 

If these hardships aren't addressed, many will continue to go further down the rabbit hole. Others could end up martyrs for what they believe to be a just cause. 

"What's going to be productive is listening to these people and taking them seriously about what it is they were responding to," Conner said.

Social media's response

For the most part, social media companies are sticking with what they did this past year to tackle misinformation, including removing false content, banning popular influencers who spread it and improving their systems for reporting and flagging content. That, however, might not be enough.

Experts have repeatedly pointed to social media as being a driving force in the spread of misinformation. 

"It is absolutely 100% on the backs of the social media companies to continue to crack down on these movements and maybe risk one tiny little sliver of their profits in the service of doing the right thing for the greater good," said Mike Rothschild, conspiracy researcher and author of The Storm Is Upon Us, which provides a history of the QAnon conspiracy theory. "I don't know that QAnon would have spread if Facebook , Twitter and YouTube had cracked down on it in 2018."

QAnon -- a pro-Donald Trump fringe conspiracy theory that claims the former president was fighting a war against Satanist pedophiles in Hollywood and the Democratic Party -- continues to fester even though President Joe Biden has been in office for nearly a year. Q supporters are likely to continue spreading misinformation across social media platforms, and dozens are planning to run for office in 2022, according to Media Matters.

Social media companies say they're prepping for the 2022 elections and have learned from how things went in 2020. 

That includes Facebook (now rebranded as Meta), which was lambasted after multiple whistleblowers came forward in October. Former employees accused the social media platform of allowing hate speech and misinformation to escalate in its pursuit of higher profits. 

As for voter fraud conspiracies surrounding the 2020 elections, Facebook did take down some groups -- one with more than 300,000 members -- and accounts that spread false information, and the company says it's ready for 2022. 

"We're enforcing our policies against removing voter interference content, and we'll continue to refine our strategy to combat content that discusses the legitimacy of voting methods, like voter fraud claims," Monika Bickert, vice president of content policy at Facebook, said in a November press call that a Facebook spokesman referred CNET to. "While each election will bring its own unique set of challenges, we're working diligently to apply the lessons we've learned from previous years to elections in the US and other countries in 2022 and beyond." 

Twitter cracked down on misinformation accounts throughout 2021 and took action against politicians such as Rep. Marjorie Taylor Greene for spreading vaccine misinformation; she has since been banned from the platform. The social media company also instituted a new option for reporting tweets that spread misinformation about health and politics. Twitter says it's "committed to improving the health and integrity of the public conversation."

Twitter and Facebook also banned Trump days after the deadly Jan. 6 Capitol riot, in which his supporters stormed Congress as the 2020 election was being certified. 

YouTube stepped up its misinformation policies in 2020 and 2021 by banning COVID-19 vaccine misinformation and broader anti-vaccine content. YouTube, which is owned by search giant Google, also removed more than 1 million videos related to COVID misinformation in 2021. The company says it's looking to keep improving its systems to weed out misinformation.

TikTok began dealing with misinformation about the elections and COVID vaccines in 2020, which resulted in hundreds of thousands of videos being deleted from the platform in 2021. The company also instituted a warning prompt to help stop the spread of flagged videos.

"While TikTok isn't the first place people look to for political content, we're committed to doing our part to help stop the spread of disinformation and connect our community to authoritative information on elections," a TikTok spokesperson said.

Social media companies have also implemented tools to help individuals deal with misinformation, and experts agree these have worked. They include labeling misinformation and fact-checking posts. Another useful step is simply slowing the spread of certain content soon after it's posted. 

Even though these larger social media platforms are trying to stomp out misinformation, people spreading false information and conspiracy theories have found other platforms. Telegram, an encrypted messaging app, has become a haven for QAnon influencers who talk to their hundreds of thousands of followers. Rumble and Odysee are two video platforms filled with misinformation and conspiracy theories that would be quickly removed from YouTube. 

Along with social media companies, Rothschild also points to payment platforms as having a role in spreading misinformation. 

The larger conspiracy theory influencers profit from the misinformation they peddle. Patreon, PayPal and other services that let people pay money to creators have instituted policies that attempt to prevent funds from going to people producing misinformation, but people continue to find ways around those policies or look to more lenient platforms such as SubscribeStar.

What's in store for 2022? 

A big factor in 2022 could be the person who was at the center of much misinformation in 2021: Trump. In October, the former president said he would launch his own social media network, called Truth Social, that would "stand up to the tyranny of Big Tech." 

Since the announcement, however, there has been little talk from Trump about the network. The site lets people sign up for accounts and join a waiting list, but no official launch date has been given. A date did show up on its App Store page, though, indicating a Feb. 21 launch, which is also Presidents Day. Those who wish to reserve a username will have to make a donation to the National Republican Senatorial Committee.

Other pro-Trump social media platforms exist, including Gab, Parler, Gettr and Frank, but they don't have active user bases that compare with those of the more popular platforms.

These platforms, along with Telegram, will likely go into overdrive in 2022 because of the midterm elections. There will be 34 Senate seats up for grabs, as well as every seat in the House of Representatives, 36 gubernatorial elections and races for countless state and local offices. For pro-Trump misinformation peddlers, there's a lot at stake, though it might not be so clear at first glance. 

"The bigger thing that people in that [misinformation] universe are trying to do with 2022 is to win the narrative battle so that they are sitting in a much better place going into 2024," Caulfield said, referring to the next presidential election year. "If they are able to convince large swaths of the public that the 2022 elections are illegitimate, then they are more likely to get the sorts of legislative changes that they want." 

While all of this misinformation can be overwhelming, there are things you can do to ensure you're not getting taken in, from running Google reverse image searches to verify photos to just generally being aware of the problem. It also helps to take stock of your emotions when a social media post shows up in your feed. Some content is designed to outrage people, especially when it comes from dubious sources. 

Even though academic experts, government entities and tech companies are all aware of how bad the misinformation problem is, and will be, it will still be an uphill battle to stop it from spreading.