Facebook's InfoWars, fake news, Alex Jones problems aren't going away

It seems Facebook would rather shush misinformation than shut it off completely.

Alfred Ng, Senior Reporter, CNET News
Joan E. Solsman, Former Senior Reporter
It may sometimes seem like Facebook has a split personality when it comes to handling fake news. (Photo: Jaap Arriens/NurPhoto via Getty Images)

Facebook can't get its story straight on fake news.

Over the past two years, the company has ramped up its efforts to fight disinformation after realizing bad actors had been using the social network to push propaganda to millions of people. Despite those efforts, hoaxes continue to pop up, and pages pushing conspiracy theories thrive.

This week, Facebook tried to offer an explanation for why it allows pages that post false news, hoaxes and propaganda to stay in business, arguing that it's defending free speech. Much of the ensuing debate centered on Alex Jones' notorious conspiracy theory site InfoWars and whether its Facebook page should be shut down. The fact that it stirred up so much controversy underscores the tricky role Facebook plays as the master of such a massive platform for expression.

The social network said it'd rather demote the posts spreading misinformation than ban the source outright. The approach means that these pages don't get as many views as they used to, but they still live to spread misinformation. Facebook said pages lose about 80 percent of views when they're demoted.

"We just don't think banning Pages for sharing conspiracy theories or false news is the right way to go," Facebook said in a tweet on Thursday. In a follow-up tweet about InfoWars, the company added: "They seem to have YouTube and Twitter accounts too -- we imagine for the same reason."

A Twitter spokesman said the company doesn't comment on individual accounts, but did say Twitter "should not be the arbiter of truth."

YouTube didn't respond to a request for comment about whether it allows such posts on similar grounds. The site has a "three strikes" rule: channels that violate its official content policies three times within three months are terminated. YouTube has issued strikes against InfoWars in the past, but the channel remains active.

Media and journalism experts took a critical view of Facebook's stance.

"This looks like another instance of Facebook wanting to have its cake and eat it too. It's trying to appear like it's solving a problem while not upsetting conservative users," said Brett Johnson, an assistant professor of journalism studies in the Missouri School of Journalism. 

Sometimes, not much separates Fox News, InfoWars, or Diamond and Silk from the latest fake-news site from Serbia, he added. 

"Facebook either cannot or will not devise a policy to properly handle such nuance," he said. "So this is what we get: fighting fake news with fake nuance."

Jay Rosen, a widely followed associate professor of journalism at New York University, called Facebook's reaction a "confession of weakness."

"A pitiful and helpless giant unable to grasp problems and make decisions. It's Goliath with a learning disability. You want to fight misinformation on your platform, but you also think Infowars should be on it...," he tweeted Thursday. 

The challenge of addressing fake news on Facebook aggravates the company's core discomfort with having to make editorial decisions, according to Karen North, a professor at the University of Southern California's Annenberg journalism school and the director of its digital social-media program. 

"The problem is if they create a policy, they want to create one they can implement without making editorial decisions," she said. "And that's very hard to do."

And not very many people would want Facebook to literally ban lies, she said. 

"People lie all the time on Facebook, that's what people do," she said. "Nobody wants Facebook to say you can't embellish your personal life, your job, your vacation or dating status."

Jason Kint, the CEO of a trade association for digital-content companies called Digital Content Next, noted that while it's "abhorrent" for companies to profit off disturbing misinformation, "we need to be very careful in advocating for the deletion of accounts and any form of censorship by these powerful gatekeepers without clear terms-of-use violations."

Politicians weighed in too. US Sen. Chris Murphy of Connecticut said it was concerning that Facebook equated InfoWars with "normal political dialogue."

"I refuse to live in a world where nothing is untrue or morally wrong, just left or right," he tweeted. Connecticut is home to Sandy Hook, where a gunman shot and killed 20 elementary school students and six school staff. InfoWars has persistently publicized a theory that the shooting was a hoax. 

Facebook's stance contradicts the efforts the company has outlined for fighting fake news.

During an event at Facebook's offices in New York on Wednesday, the social network played a nearly 12-minute video showcasing its fight against false news to a roomful of journalists.

At one point, Eduardo Ariño de la Rubia, a data science manager for News Feed Integrity at Facebook, called out content like Pizzagate as a hoax and false news, mentioning, "we have to get this right if we're going to regain people's trust." Pizzagate, a conspiracy theory that InfoWars' Jones pushed and later apologized for, claimed Democrats were operating a child sex ring out of a DC pizzeria.

Other hoaxes mentioned in the video include the false claims that an undocumented immigrant started the Napa Valley wine country fires, and an image of a Seattle Seahawks player edited to make it look like he was holding a burning American flag.

Facebook pointed to all those examples of misinformation, telling reporters that it's dedicated to regaining public trust and eradicating hoaxes on what is, with 2 billion people visiting at least once a month, the largest social network in the world.

But when CNN's Oliver Darcy asked why InfoWars was still allowed on the social network with over 900,000 followers, Facebook's head of News Feed, John Hegeman, explained that InfoWars didn't violate any rules.

"Just for being false doesn't violate the community standards," Hegeman said. "They haven't violated something that would result in something being taken down."

Michelle A. Amazeen, a Boston University assistant professor of mass communication, said Facebook's admission that fake news hurt Facebook users' experience and the company's pledges to change have the ring of "sorry not sorry" to them.

"If Facebook retains a fake news site like InfoWars, even if they reduce its distribution, what does that even mean?" she asked. "Reduce it to what?"

Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism at Columbia, said it's more than just about the pages themselves. 

"It's more about Facebook allowing a growing collective of InfoWars-branded pages, live channels and profiles using conspiracies and fear-drama to sell a huge range of alternative health products and supplements through an official Facebook store," he said. "Always look for the money."

Despite Facebook's video showing those three examples of fake news on its network, the pages behind those posts are still up.

Vets for Trump, a page with more than 120,000 followers, was behind the Seattle Seahawks post. While the image has been removed, the page is still around and gaining followers.

The misinformation about the Napa Valley wildfires came from Breitbart, which is also still on Facebook with more than 3.8 million followers.

At the event, Sara Su, Facebook's product specialist for the News Feed, said the social network doesn't classify posts as hoaxes itself -- instead relying on outside fact-checkers to label the content.

"For the hoax classifiers, third-party fact-checking is the best source of ground truth that we have," she said.

Facebook didn't respond to a request for comment Friday. Facebook's head of global policy management, Monika Bickert, will answer questions at a Tuesday hearing of the US House Judiciary Committee focused on social media filtering practices.

First published July 13 at 10:10 a.m. PT.
Update at 1:58 p.m. PT.: Adds comments from experts, context on YouTube's policies and a mention of Tuesday's House committee hearing.
Update on July 14 at 5 a.m. PT: Adds comment from another expert and additional background. 
Update on July 18 at 8:49 am PT: Adds comment from additional expert.
