
YouTube automation removes 11M videos in 3 months

Due to COVID-19, YouTube has fewer human reviewers.

Corinne Reichert Senior Editor

YouTube says it took down more than 11 million videos in the second quarter.

Angela Lang/CNET

YouTube removed 11.4 million videos between April and June, with the vast majority -- 10.85 million -- flagged by automated systems alone, the company said Tuesday. Due to the coronavirus pandemic, the video-sharing site said, it had "greatly reduced human review capacity" to double-check whether videos breached its user policies. As a result, it decided to "over-enforce" using automated systems, removing more than double the number of videos it took down from January to March.

"The decision to over-enforce in these policy areas -- out of an abundance of caution -- led to a more than 3x increase in removals of content our systems suspected was tied to violent extremism or was potentially harmful to children," YouTube explained. "This includes dares, challenges, or other innocently posted content that might endanger minors."

Usually, YouTube relies on automation to flag videos, which are then assessed by human reviewers. Because it was relying on automation alone this time, the company added staffers to the appeals process so appeals could be reviewed quickly. Fewer than 3% of removals were appealed, but YouTube's reinstatement rate doubled compared with the previous quarter.

Of the removed videos, 3.8 million were taken down for child safety reasons, 3.2 million for spam or scams, 1.7 million for nudity or sexual content, 1.2 million for violence and 900,000 for the promotion of violence.

Read more: The best live TV streaming services

Just 382,000 videos were flagged for removal by users, 167,000 by individual trusted flaggers, 2,220 by NGOs and 25 by government agencies. Three-quarters of the removed videos were taken down before they received more than 10 views.