YouTube steps up fight against extremist, terrorist videos

The video giant will try new ways to identify and remove these videos, prevent people from making money off them and redirect potential terrorist recruits.

Natalie Weinstein Former Senior Editor / News

YouTube will take new steps to combat extremist- and terrorist-related videos, parent company Google said Sunday.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," Kent Walker, Google's general counsel, said in an op-ed column in the UK-based Financial Times that was later posted on the Google blog.

Google and YouTube will:

  • Use "more engineering resources to apply our most advanced machine learning research to train new 'content classifiers' to help us more quickly identify and remove such content."
  • Expand YouTube's Trusted Flagger program by adding 50 independent, "expert" non-governmental organizations to the 63 groups already part of it. Google will offer grants to fund the groups.
  • Take a "tougher stance on videos that do not clearly violate our policies -- for example, videos that contain inflammatory religious or supremacist content." Such videos will "appear behind a warning" and will not be "monetized, recommended or eligible for comments or user endorsements."
  • Expand YouTube's efforts in counter-radicalization. "We are working with Jigsaw to implement the 'redirect method' more broadly across Europe. This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining." A Google spokeswoman said Jigsaw's "redirect method" is already in use in the US.

Google's announcement comes four weeks after a suicide bomber killed 22 people and wounded nearly 120 others after an Ariana Grande concert in Manchester, UK, and two weeks after three terrorists used a van and knives to kill eight people and injure nearly 50 more on London Bridge and in the nearby Borough Market area of London.

The internet has been a critical tool for terrorists and extremists to recruit, communicate and share information. Internet giants including Google, Facebook and Twitter have worked for years to contain extreme content on their sites, though many have criticized them for not doing enough.

"Collectively, these changes will make a difference. And we will keep working on the problem until we get the balance right. Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free," Walker said in the column. "We must not let them."

First published, June 18 at 10:46 a.m. PT.
Update, 12:58 p.m. PT: Adds comment from Google.
