
Facebook ramps up efforts to remove terrorist content

In the third quarter, Facebook removed 3 million posts tied to terrorism, the social network says.

Queenie Wong, Former Senior Writer
Facebook logo displayed on a smartphone. (Getty Images)

Facebook says it's been trying to do a better job of finding and pulling down terrorist content, and on Thursday the world's largest social network said it's seen signs of success. 

Facebook said that in the third quarter, it pulled down 3 million posts related to terrorism, a drop from the 9.4 million posts Facebook removed in the second quarter. The median amount of time terrorist content stayed on the platform after users reported it also dropped, from 43 hours in the first quarter to 18 hours in the third quarter, the company said.

Social networks are under pressure to remove terrorist content before violence spills into the real world. As they increase their efforts, though, bad actors are constantly changing strategy to evade detection, the companies say. Some terrorists try to create new accounts or break up their messages, Facebook said.

"We can reduce the presence of terrorism on mainstream social platforms, but eliminating it completely requires addressing the people and organizations that generate this material in the real world," Monika Bickert, Facebook's global head of policy management, and Brian Fishman, the company's head of counterterrorism policy, wrote in a blog post

Facebook relies on machine learning to detect terrorist content and prioritize it for its reviewers. If the system determines with "high confidence" that a post contains support for terrorism, the company will sometimes pull it down automatically. Facebook has also been expanding some of these tools to more languages.
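As a rough illustration of the kind of triage the company describes, here is a minimal Python sketch: a classifier's score drives an automatic takedown above an assumed "high confidence" threshold, and lower-scoring posts are queued for human reviewers in priority order. The threshold, the names and the scoring are hypothetical illustrations, not Facebook's actual system.

```python
# Hypothetical sketch of confidence-threshold content triage.
# The cutoff value and all names below are illustrative assumptions.
from dataclasses import dataclass, field
import heapq

AUTO_REMOVE_THRESHOLD = 0.98  # assumed "high confidence" cutoff


@dataclass(order=True)
class ReviewItem:
    priority: float                      # lower value = reviewed sooner
    post_id: str = field(compare=False)


def triage(post_id: str, terrorism_score: float, review_queue: list) -> str:
    """Route a post based on a classifier's terrorism-support score (0-1)."""
    if terrorism_score >= AUTO_REMOVE_THRESHOLD:
        return "removed"                 # pulled down automatically
    # heapq is a min-heap, so negate the score to pop the riskiest posts first
    heapq.heappush(review_queue, ReviewItem(-terrorism_score, post_id))
    return "queued_for_review"


queue: list = []
print(triage("post_123", 0.99, queue))   # removed
print(triage("post_456", 0.72, queue))   # queued_for_review
```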

Facebook said that in the third quarter, it pulled down about 99 percent of ISIS- and al-Qaeda-related content before a user reported it.
