Twitter under pressure to ban white supremacists after El Paso shooting

Twitter isn't doing enough, civil rights activists say.

Queenie Wong, Former Senior Writer


Civil rights activists on Wednesday urged Twitter to ban white supremacists in the wake of a shooting that left 22 people dead in El Paso, Texas.

For years, social media companies have been under mounting pressure to combat hate speech on their platforms, but the calls have intensified after a series of mass shootings. In March, Facebook said it would bar white nationalist and white separatist content from its platform. The Change the Terms coalition, made up of more than 50 advocacy groups, is calling on Twitter to do the same. 

"While others platforms like 8chan and 4chan might be the cesspool of white supremacist ideas, it is Twitter where these ideas become mainstream," said Steven Renderos, co-director of the Oakland nonprofit MediaJustice.

The gunman in the El Paso shooting appears to have posted a hate-filled, anti-immigrant manifesto on the online message board 8chan, but the screed also spread to larger social media platforms such as Facebook and Twitter. President Donald Trump this week urged social media companies to "develop tools to detect mass shooters before they strike." 

Twitter already has rules against violent threats and hateful conduct, including promoting violence against or directly attacking people based on race, religion, sexual orientation and other characteristics. But advocacy groups say that isn't enough. Extremists who helped organize the 2017 white nationalist rally in Charlottesville, Virginia, remain on the platform, they pointed out. That includes white supremacist Jason Kessler and Andrew Anglin, publisher of the neo-Nazi website The Daily Stormer, according to an article by The Huffington Post cited by the coalition's leaders.

A Twitter spokeswoman said the company suspended 166,513 unique accounts for promoting terrorism from July to December 2018. Nick Pickles, Twitter's senior policy strategist, also said during a recent congressional hearing that the company took action against 184 groups that violated the company's policy on violent extremism and that 93 of those groups "advocate violence against civilians alongside some form of extremist white supremacist ideology."

Civil rights activists say Twitter doesn't enforce its rules consistently and that they've seen a mixed response when users report these accounts to the company. 

If Twitter moves forward with a ban, one of the challenges will be how the company defines who counts as a white supremacist. Social media platforms are also facing allegations that they're suppressing conservative speech, which they deny. And despite Facebook's ban, white supremacists remain on that platform as well.

"I'm not suggesting that Facebook is knocking it out of the park. They're a lot further along," said Jessica González, co-founder of Change the Terms. "I think explicitly stating that they're going to ban white supremacists is an important move."