YouTube told Congress on Tuesday that it removed more than 120,000 videos during the first half of this year for sexually exploiting children or otherwise featuring minors in sexually explicit content. By comparison, YouTube removed 15.9 million videos total during that period for violating any of its community guidelines. YouTube said it also reported the offending videos to the National Center for Missing and Exploited Children.
The stat was part of YouTube's written testimony for a Senate hearing, delivered by Leslie Miller, YouTube's vice president of government affairs and public policy. The hearing -- in which lawmakers are grilling representatives from YouTube, Snapchat and TikTok at the Senate subcommittee on consumer protection, product safety and data security -- is similar to the one that Facebook whistleblower Frances Haugen addressed earlier this month, when she criticized the company over products that "harm children, stoke division and weaken our democracy."
The latest hearing comes as the Google-owned platform and other Big Tech companies are facing unprecedented heat from lawmakers and regulators about the real-world effects of their products and policies. Some of the most intense scrutiny has homed in on how technology hurts or endangers children. Recently, YouTube has largely sidestepped lawmakers' most ferocious criticism about how social platforms can hurt kids. YouTube, however, is wildly popular with young people. One analysis suggests kids content may be the single most watched video category on YouTube overall.
YouTube also noted Tuesday that it removed 7 million accounts in the first nine months of this year that it believed belonged to young children and preteens using the platform against its rules, with 3 million of those removals coming in the third quarter as the company "ramped up our automated removal efforts." (For context, more than 2 billion users actively visit YouTube each month.)
YouTube's terms of service require that account holders be at least 13 years old, so children under that age technically aren't allowed to have YouTube accounts. They can still access YouTube -- while abiding by its rules -- through what's known as Supervised Experiences, which limit some content and aspects of the platform that may be risky for younger viewers, or with YouTube Kids, a specialized app for small children.
But many online platforms with age limits, including YouTube, have been criticized for lax enforcement of their age restrictions.
Recently, YouTube has been stepping up automated age-violation enforcement in other aspects of its service too. A year ago, it said artificial intelligence would automatically apply age restrictions on videos. Essentially, machine learning would decide if a video should be categorized as appropriate only for people over 18.
In its testimony Tuesday, YouTube also noted that 85% of the videos it removed for violating its child safety rules in the second quarter were taken down before they hit 10 views. That's a moderately more aggressive rate of removing videos before they spread widely than the rate for all videos YouTube removed for breaking any of its rules during the same period: among all of YouTube's removed videos, about 75% were seen by 10 people or fewer.
YouTube has come under fire for a range of scandals involving children in the past. In 2019, Google agreed to a record $170 million penalty to settle a federal investigation into children's data privacy on the giant video site. YouTube has also faced scandals involving videos of child abuse and exploitation, nightmarish content in its YouTube Kids app and predatory comments that sexualized clips of young children.