
Lawmakers pressure Google to share how YouTube collects, uses kids' data

Representatives worry the company's data collection practices may violate children's online privacy laws.

Abrar Al-Heeti Technology Reporter
YouTube (Getty Images)

Two lawmakers sent a letter to Google CEO Sundar Pichai on Monday, asking the company to provide information on how its subsidiary YouTube collects data on child users. 

Rep. David Cicilline, a Democrat from Rhode Island, and Rep. Jeff Fortenberry, a Republican from Nebraska, said in their letter that YouTube's data collection practices "may not be in compliance with the Children's Online Privacy Protection Act of 1998," or COPPA, a federal law governing how online services collect personal data from children under 13.

The representatives sent the letter after a coalition of 23 child advocacy, privacy and consumer groups filed a complaint in April asking the US Federal Trade Commission to investigate YouTube for allegedly violating children's online privacy laws. The complaint said YouTube violates COPPA by collecting data on viewers younger than 13, such as their location and what device they're using, without first obtaining parental consent. This information is then sold to advertising services to create targeted ads, the complaint alleged.

YouTube's terms of service technically forbid anyone under 13 from using the platform, but kids can bypass the age restriction by claiming to be older when they make an account or by using an older person's account, the complaint noted. The FTC complaint also alleged that Google knows children are using YouTube and that the platform hosts plenty of content geared toward children.

In their letter to Pichai, Cicilline and Fortenberry ask that Google respond to questions such as whether children's programs on YouTube are marked to prevent data collection, and whether any data that may be collected is used for targeted ads. They also ask why there isn't an age gate to prevent underage users from accessing videos on the platform, and how the company determines whether a user is a child. The representatives ask for a response by Oct. 17.

"Protecting kids and families has always been a top priority for us," a YouTube representative said in a statement. "Because YouTube is not for children, we've invested significantly in the creation of the YouTube Kids app to offer an alternative specifically designed for children. We appreciate all efforts to protect families and children online and look forward to working with members of Congress to answer their questions."

The FTC declined to comment.

The letter comes as tech companies face mounting scrutiny over data collection and user privacy. Facebook is still reeling from the Cambridge Analytica scandal, in which data from as many as 87 million Facebook users was improperly shared with the political consultancy. Facebook CEO Mark Zuckerberg testified before Congress over user privacy in April.

YouTube has also faced heat from lawmakers after Russian trolls used the site to meddle in the 2016 US presidential election. It's also been criticized for inadvertently highlighting fake news and misinformation.

In addition, YouTube has been criticized for failing to quickly take down videos featuring disturbing content aimed at children, such as one video depicting Mickey Mouse in a pool of blood. The company has since cut off advertising revenue from inappropriate videos and added 10,000 content moderators to review objectionable content. It also introduced stronger parental controls, allowing adults to handpick the videos and channels their children have access to.


First published Sept. 18 at 3:27 p.m. PT.
Update, 4:09 p.m.: Adds comment from YouTube, and adds that the FTC declined to comment. 
