
YouTube removed 120,000 sexually explicit videos with children in first half of year

For context, YouTube removed 15.9 million videos total during that period for breaking any of its rules.

Joan E. Solsman

YouTube is the world's biggest online video platform. 

Angela Lang/CNET

YouTube told Congress on Tuesday that it removed more than 120,000 videos during the first half of this year that sexually exploited children or otherwise featured minors in sexually explicit content, and that it reported them to the National Center for Missing and Exploited Children. By comparison, YouTube removed 15.9 million videos total during that period for violating any of its community guidelines.

The stat was part of YouTube's written testimony for a Senate hearing on online child protections Tuesday morning, delivered by Leslie Miller, YouTube's vice president of government affairs and public policy. The hearing -- in which lawmakers are grilling representatives from YouTube, Snapchat and TikTok before the Senate Subcommittee on Consumer Protection, Product Safety and Data Security -- is similar to the one that Facebook whistleblower Frances Haugen addressed earlier this month, when she called out Facebook for "moral bankruptcy" over products that "harm children, stoke division and weaken our democracy."

The latest hearing comes as the Google-owned platform and other Big Tech companies face unprecedented heat from lawmakers and regulators over the real-world effects of their products and policies. Some of the most intense scrutiny has homed in on how technology hurts or endangers children. So far, YouTube has largely sidestepped lawmakers' most ferocious criticism of how social platforms can hurt kids, even though the service is wildly popular with young people: One study suggests kids' content may be the single most watched video category on YouTube overall.

YouTube also noted Tuesday that it removed 7 million accounts in the first nine months of this year that it believed belonged to children and preteens using the service on the sly, with 3 million of those removals coming in the third quarter as the company "ramped up our automated removal efforts." (For context, more than 2 billion accounts actively visit YouTube each month.)

YouTube's terms of service require account holders to be at least 13 years old. Children under 13 can still access YouTube -- and abide by its rules -- through what's known as Supervised Experiences, which limit some content and aspects of the platform that may be risky for younger viewers, or through YouTube Kids, a separate app designed for small children.

But many online platforms with age limits, including YouTube, have been criticized for limp enforcement of their age restrictions. 

Recently, YouTube has been stepping up automated age-violation enforcement in other parts of its service too. A year ago, it said artificial intelligence would automatically apply age restrictions to videos; essentially, machine learning would decide whether a video should be categorized as appropriate only for viewers 18 and older.

In its testimony Tuesday, YouTube also noted that 85% of the videos it removed for violating its child safety rules in the second quarter were taken down before they hit 10 views. That's a moderately more aggressive takedown rate than for videos removed under any of its rules: Of all the videos YouTube removed during the same period, about 75% were seen by 10 people or fewer.

YouTube has come under fire for a range of scandals involving children in the past. In 2019, Google agreed to a record $170 million penalty to settle a federal investigation into children's data privacy on the giant video site. YouTube has also faced scandals involving videos of child abuse and exploitation, nightmarish content in its YouTube Kids app and predatory comments that sexualized clips of young children. 
