Google says it's reviewed over 1M suspected terrorist videos on YouTube this year

The company says it spends "hundreds of millions of dollars" yearly on content review.

Shelby Brown, Editor, CNET

Congress is cracking down on how the internet handles violent content. (Getty Images)

Google has reviewed more than 1 million suspected terrorist videos on YouTube in the first three months of 2019, according to a letter the tech giant sent to US lawmakers.

In the April 24 letter, made public Thursday as part of a press release from the House Committee on Homeland Security, Google said 90,000 of those videos violated its terrorism policy. Google, which owns YouTube, said it spends "hundreds of millions of dollars annually" on content review.

The House committee urged Facebook, Twitter, YouTube and Microsoft to do a better job of removing violent content following posts about the deadly New Zealand mosque shooting in March. In April, Rep. Max Rose and other Democrats asked for the companies' content-moderation budgets to see how the platforms were fighting terrorism.

"The fact that some of the largest corporations in the world are unable to tell us what they are specifically doing to stop terrorist and extremist content is not acceptable," Rose, along with committee Chairman Bennie Thompson, said in the Thursday press release, which also included a link to the Google letter, as well as a link to a letter from Twitter.

In its letter, Twitter said putting a dollar amount on efforts to combat terrorism is a "complex request." Instead, the company detailed its efforts to suspend accounts in violation of its policies.

"We have now suspended more than 1.4 million accounts for violations related to the promotion of terrorism between Aug. 1, 2015, and June 30, 2018," Twitter's director of public policy and philanthropy, Carlos Monje Jr., said in the company's letter. "During the most recent reporting period of Jan. 1, 2018, through June 30, 2018, we suspended a total of 205,156 accounts."

The release from the House Committee on Homeland Security documented responses from Twitter and Google's YouTube but said no company answered the request from Congress "properly or fully." Microsoft's response wasn't listed. The release said Facebook hadn't responded yet. The social media site did, however, bar far-right figures like Alex Jones and Milo Yiannopoulos on Thursday.

The companies didn't immediately respond to requests for comment. 
