Facebook makes policy changes after Trump ban: What you need to know

The social network said the former president will remain suspended from its platform until at least 2023.

Queenie Wong, Former Senior Writer

Facebook took a closer look at how it handles political speech after suspending Trump from its platform. (Image by Pixabay/Illustration by CNET)

Facebook took away former President Donald Trump's digital megaphone when the social network suspended him following the deadly riot at the US Capitol in January. On Friday, the company said the politician might get it back.

The social media giant said Trump would remain suspended until at least January 2023, two years after the company booted him from its platform. Facebook made the decision following recommendations from an independent oversight board tasked with reviewing some of its toughest content moderation decisions.

The board upheld Facebook's decision to suspend Trump because of the risks his comments posed. But it chastised the company for locking Trump out indefinitely, a punishment Facebook hadn't told users was on the books.

Facebook wasn't alone in kicking Trump off its platform out of concern his remarks could incite violence. Twitter and Snapchat banned Trump permanently. Google-owned YouTube said the former president will remain suspended until the risk of real-world violence has decreased. 

Social networks have typically stayed away from pulling down political speech because they consider the comments of politicians newsworthy. But the platforms also have rules against incitement, hate speech, harassment and other offensive content. Trump's controversial remarks put those rules to the test, prompting changes to how these companies handle speech that is often left untouched.

Facebook addressed both Trump specifically and political speech broadly on Friday. Here's what you need to know:

How did Facebook decide to handle Trump's suspension?

Facebook said Trump will remain suspended from its platform for a total of at least two years. The social network suspended Trump on Jan. 7, 2021, meaning he could return to the platform in January 2023, ahead of the next US presidential election.

The former president's return to Facebook isn't guaranteed, however. When the two years are up, Facebook will consult with experts on public safety and examine factors such as "instances of violence, restrictions on peaceful assembly and other markers of civil unrest" before deciding whether to allow Trump back on Facebook and Instagram, its photo-sharing site. Trump has 35 million followers on Facebook and 24 million followers on Instagram.

"When the suspension is eventually lifted, there will be a strict set of rapidly escalating sanctions that will be triggered if Mr. Trump commits further violations in the future, up to and including permanent removal of his pages and accounts," Nick Clegg, Facebook's vice president of global affairs, said in a statement. The company didn't say what those sanctions are.

What was Trump's reaction to Facebook's decision?

Trump has pushed back against allegations that his January remarks were meant to incite violence. Before the Capitol Hill riot, Trump told his supporters they needed to "fight like hell" and said "we're going to the Capitol." Facebook removed two posts in which Trump repeated baseless claims that the 2020 election results were fraudulent, even as he told his supporters to go home.

In a statement Friday, Trump said Facebook's decision was an "insult" to the people who voted for him in the 2020 presidential election. "They shouldn't be allowed to get away with this censoring and silencing, and ultimately, we will win. Our Country can't take this abuse anymore!" Trump wrote.

What other actions did Facebook take that could affect public figures?

Facebook outlined how it could penalize public figures during times of civil unrest and violence. Public figures who violate Facebook's rules could be restricted from posting on the social network for anywhere from one month to two years.

How did Facebook handle political speech before Trump's suspension?

Facebook has historically taken a largely hands-off approach to political speech, which has prompted criticism that it doesn't do enough to combat misinformation, hate speech and harassment from high-profile figures. The social network exempts politicians, for example, from being fact-checked because it says political speech is already highly scrutinized.

In 2019, Clegg said Facebook "will treat speech from politicians as newsworthy content that should, as a general rule, be seen and heard."

The company has also had a newsworthiness exemption since 2016, meaning that Facebook will sometimes leave up content that violates its rules if the public interest outweighs the risk of harm.

Was the newsworthy exemption ever applied to Trump's posts?

Facebook said it applied the newsworthy exemption to one of Trump's posts. In 2019, Trump posted a New Hampshire rally video in which he said of a person in the crowd: "That guy's got a serious weight problem. Go home. Start exercising." Facebook said those remarks violated its rules against bullying and harassment, but the company left the video up because it determined the risk of harm was low and the public interest was high, given that Trump was running for re-election.

What other changes is Facebook making to how it handles political speech?

Facebook said it will no longer presume any content is newsworthy, including posts from politicians.

"We will simply apply our newsworthiness balancing test in the same way to all content, measuring whether the public interest value of the content outweighs the potential risk of harm by leaving it up," Clegg said.

The company said it will provide more information in its online transparency center about how this newsworthiness exemption is applied. Facebook said it's also trying to be more open about how it moderates content.

The company provided more details, for example, about how it applies "strikes" to a user's Facebook and Instagram account for violating its rules. "Whether we apply a strike depends on the severity of the content, the context in which it was shared and when it was posted," Facebook said. In extreme cases -- child exploitation material, for example -- a user's account could get pulled after a single incident.