
YouTube suspends Trump's account for at least one week

The Google-owned platform cites "ongoing potential for violence" for the action.

Richard Nieva and Steven Musil

YouTube has suspended President Donald Trump. (Angela Lang/CNET)

YouTube on Tuesday evening said it's suspending President Donald Trump from uploading new videos for at least one week, joining a chorus of social media companies curbing the president's presence on their platforms in the wake of the deadly riot that engulfed the Capitol last week.

"After careful review, and in light of concerns about the ongoing potential for violence, we removed new content uploaded to the Donald J. Trump channel and issued a strike for violating our policies for inciting violence," a YouTube spokeswoman said in a statement. "As a result, in accordance with our long-standing strikes system, the channel is now prevented from uploading new videos or livestreams for a minimum of seven days -- which may be extended."

YouTube said it would also indefinitely disable comments on the president's channel, citing "ongoing concerns about violence." YouTube parent company Google says on a support page that "content encouraging others to commit violent acts are not allowed on YouTube." It's unclear which video on Trump's channel triggered the penalties.

The White House didn't immediately respond to a request for comment.

Social media companies have been trying to avoid a repeat of the violence that erupted last week when a mob of Trump supporters stormed the US Capitol during the vote to confirm President-elect Joe Biden's election victory. Twitter has permanently banned Trump, while Facebook indefinitely blocked the president's account. Before Tuesday's suspension, YouTube had been the last major platform not to restrict the president.

While Facebook and Twitter were quick to take action against Trump in the wake of the attack, YouTube took a tamer approach. Instead of addressing Trump specifically, the company said it would issue a strike against any account that posts videos making false claims about election fraud. Under YouTube's rules, three strikes within a 90-day period result in permanent removal from the platform. The first strike comes with a one-week ban from posting content; the second comes with a two-week ban.

YouTube first announced the policy update against election fraud claims last month but allowed a grace period before offenders were penalized with strikes. The grace period was set to expire on Inauguration Day, Jan. 20, but was moved up after the Capitol riot.

Employees at Google and its parent company Alphabet have demanded the company boot Trump from the platform permanently. In an open letter last week, Alphabet's employee union said YouTube's response to the Capitol riot was "lackluster" and urged YouTube executives to take stronger action. 

"We know that social media has emboldened the fascist movement growing in the United States and we are particularly cognizant that YouTube, an Alphabet product, has played a key role in this growing threat, which has received an insufficient response by YouTube executives," the union said.

Beyond restricting Trump, Silicon Valley companies have clamped down on others inciting violence and spreading misinformation. Last week YouTube permanently took down the channel of Steve Bannon's popular War Room podcast after it violated the platform's three-strikes policy. Twitter has purged more than 70,000 accounts devoted to the QAnon conspiracy theory, which baselessly contends the government is being run by a cabal of Satan-worshipping sex traffickers.

The tech giants have also taken action against Parler, a social network popular with far-right and extremist users that rioters used to help plan the attack. Google and Apple suspended the app from their stores, and Amazon cut off its hosting on Amazon Web Services, which rents server space to other companies. In response, Parler has sued Amazon, alleging breach of contract.