Facebook will "indefinitely" block President Donald Trump from its platforms, saying his posts pose an unacceptable risk in the wake of a harrowing attack by his supporters on the US Capitol. Facebook CEO Mark Zuckerberg announced the unprecedented move on Thursday, a day after rioters stormed the legislative heart of America's democracy as Congress met inside to certify Joe Biden as the next president.
"We believe the risks of allowing the president to continue to use our service during this period are simply too great," Zuckerberg said in a Facebook post. "Therefore, we are extending the block we have placed on his Facebook and Instagram accounts indefinitely and for at least the next two weeks until the peaceful transition of power is complete."
Biden's inauguration will take place on Jan. 20.
Facebook isn't the only social network taking action on accounts or posts belonging to or relating to Trump. To varying degrees, Twitter, YouTube, Snapchat, Reddit and others have blocked, labeled or deleted posts or accounts over the last several days. The actions by those social networks highlight some of the differences in how they handle political content.
Facebook's ban, which followed an earlier 24-hour block, represents the company's strongest action against Trump's use of social media to spread misinformation, stir grievances and incite violence. The social media giant, which owns photo-sharing app Instagram, has had a mostly hands-off approach to political speech, exempting politicians from fact-checking. Instead, Facebook allowed some of Trump's controversial posts to remain visible or added labels to his baseless claims of election fraud.
"We did this because we believe that the public has a right to the broadest possible access to political speech, even controversial speech," Zuckerberg said. "The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government."
Read more: 25th Amendment, if invoked, could remove Trump from power
The violence that broke out on Capitol Hill on Wednesday marked a turning point for Facebook and other social networks that have been reluctant to silence Trump on social media because of public interest. The Metropolitan Police Department said Wednesday night that four people died when a mob stormed the US Capitol. A fifth person, a Capitol Police officer, died Thursday.
Meanwhile, the companies are also facing more pressure from civil rights activists, politicians and others to do more. Sen. Mark Warner, a Virginia Democrat, said Thursday the steps taken by Facebook and Twitter -- as well as by YouTube -- were "too late and not nearly enough" to curb the problem. Former first lady Michelle Obama called on Silicon Valley companies to "stop enabling this monstrous behavior," permanently ban Trump, and create policies to prevent technology from "being used by the nation's leader to fuel insurrection."
As social media companies clamped down on Trump, some analysts praised the moves but said they were overdue. "Anyone who was following the disinformation on these platforms knew this was probably, unfortunately, inevitable," said Bob O'Donnell, chief analyst at Technalysis Research. "And what happened on the platforms unquestionably led to these events."
The White House didn't immediately respond to a request for comment. On Thursday, Trump shared a statement via White House social media director Dan Scavino that said an "orderly transition" of power would occur on Jan. 20. However, Trump also used the statement to note that he "totally" disagrees with the election's outcome. He has yet to concede that he lost the election two months ago.
Stronger action by social networks could also push users to alternatives such as Parler and Gab. Here's how other social networks are handling Trump and content that could incite violence:
Trump tweeted a two-and-a-half-minute video on Thursday, his first post on Twitter since the company temporarily locked his account for the first time after three of his tweets violated its rules against interfering in elections or other civic processes. The 12-hour lock took effect once Trump removed the offending tweets.
In the video, which marked the closest Trump has come to conceding he lost the 2020 election, the president acknowledged that Congress had certified the results. "A new administration will be inaugurated on January 20th," Trump said, appearing to read from a teleprompter. "My focus now turns to ensuring a smooth, orderly and seamless transition of power."
The locking of Trump's account came after University of Virginia law professor Danielle Citron, journalist Kara Swisher, Obama Foundation CTO Leslie Miley, Anti-Defamation League CEO Jonathan Greenblatt and other high-profile figures urged Twitter to boot the president from the platform.
Twitter in the past has placed a public interest notice over Trump's tweets for glorifying violence, which limited how widely those tweets could spread. Trump has more than 88 million followers on Twitter, allowing him to reach a massive audience online.
"Our public interest policy — which has guided our enforcement action in this area for years — ends where we believe the risk of harm is higher and/or more severe," Twitter said in a tweet.
On Thursday, Google-owned YouTube tightened enforcement of a policy that Trump's channel had violated a day earlier, a change that could hasten the channel's termination if it continues to run afoul of the rule.
Last month, YouTube instituted a policy to remove any new videos alleging that fraud altered the outcome of the 2020 US presidential election. On Wednesday, Trump's channel posted a video doing just that: his message urged supporters to "go home now" but also repeated false claims about election fraud. YouTube removed the video under the policy, which originally included a grace period lasting until Inauguration Day. During the grace period, channels breaking the rule would have the offending video removed but face no other penalties.
YouTube said it has now ended the grace period, rather than waiting until Inauguration Day. Now, videos that violate that policy will be issued a "strike." Channels are temporarily suspended from posting or livestreaming when they get strikes, and YouTube's "three strike" system permanently bans channels with three violations in a 90-day period.
"We apply our policies and penalties consistently, regardless of who uploads it," YouTube tweeted.
A YouTube spokesman said the company didn't feel it needed to specifically address Trump because it's already laid out its three-strikes policy on barring creators from posting content. The spokesman also said false claims may not only come from Trump himself, but others within the president's orbit, and the policy would apply to them as well.
Snapchat also locked Trump's account for the first time on Wednesday, though it wasn't the first time the disappearing-messaging app had taken action against the president's content.
In June, Snapchat said it would no longer promote Trump's account on a page of curated content called Discover because it doesn't want to "amplify voices who incite racial violence and injustice." The move came after racial justice protests broke out in the aftermath of the police killing of George Floyd.
The company told The New York Times it had made the decision after the president tweeted that if protesters outside the White House breached the fence, they'd be "greeted with the most vicious dogs, and most ominous weapons."
Twitch, owned by Amazon, unplugged Trump's account as well.
"In light of yesterday's shocking attack on the Capitol, we have disabled President Trump's Twitch channel," a spokeswoman said in a statement. "Given the current extraordinary circumstances and the president's incendiary rhetoric, we believe this is a necessary step to protect our community and prevent Twitch from being used to incite further violence."
Reddit on Friday said it banned the popular subreddit r/Donaldtrump. While not an official page hosted by the president or his campaign, the group was reportedly one of Reddit's largest political communities.
"Reddit's site-wide policies prohibit content that promotes hate, or encourages, glorifies, incites, or calls for violence against groups of people or individuals," said a Reddit spokesperson in a statement. "We have also taken action to ban the community r/donaldtrump given repeated policy violations in recent days regarding the violence at the US Capitol."
On Thursday, the company said it had been reaching out to moderators to remind them of the platform rules in the wake of the attack on the Capitol. The site has dedicated teams to enforce its policies, and the company has built an internal tool to "detect and remove policy-breaking content," a spokesperson said Thursday.
Gab said in a blog post it's in the process of connecting with Trump's team about joining the platform. The company set up an account for Trump that has more than 448,100 followers. Gab CEO Andrew Torba said in an email that the Trump video other social networks removed "explicitly called for peace."
Gab, which says it champions free speech, has been used by extremists, such as neo-Nazis and white supremacists, who have been booted from other social networks. On Wednesday, some Gab users documented themselves going into the offices of Congress members and called for people inside the building to hunt down Vice President Mike Pence, who Trump had criticized earlier in the day, The New York Times reported.
In a blog post, Gab said it works with law enforcement to promote public safety. "We proactively report when our moderation team discovers content which we believe poses an imminent threat to life and respond rapidly when law enforcement identifies any such threat."
Parler, which has a similar feel to Twitter, is another social network that conservatives have flocked to after mainstream platforms intensified their crackdown on far-right groups such as the Proud Boys.
The company, though, has fewer rules than Facebook, Twitter and other major social networks. "We prefer that removing community members or member-provided content be kept to the absolute minimum," Parler's rules state. The platform, for example, can't knowingly be used "for crime, civil torts, or other unlawful acts."
In an interview with The New York Times published Thursday, Parler CEO John Matze said the company would get involved if its users were breaking the law but that it's up to a community jury to decide what is illegal or against the company's rules.
"Look, if it was illegally organized and against the law and what they were doing, they would have gotten it taken down," he said in the interview. "But I don't feel responsible for any of this and neither should the platform, considering we're a neutral town square that just adheres to the law."
TikTok said that, as far as it's aware, Trump doesn't have an account on the short-form video platform.
"We expect everyone on our platform to follow our community guidelines, and content and accounts that violate our policies are removed," the company said in a statement.
Content that seeks to glorify or promote violence would violate those rules and be removed, the company said.
Discord also said it isn't aware of an official Trump account on the platform.
"We have a zero-tolerance policy against hate and violence of any kind on the platform, or the use of Discord to support or organize around violent extremism," a spokeswoman said in a statement. "We are always vigilant and proactively monitor our service for activity that violates our terms of service and community guidelines, particularly at times of heightened tension. When we become aware of such activity, we take immediate action, including banning users and shutting down servers."
CNET's Carrie Mihalcik contributed to this report.