
YouTube faces complaint demanding FTC probe over kids' data

Consumer and privacy groups say the Google-owned video site is violating online child privacy laws.

Advocacy groups want the FTC to investigate YouTube. (Getty)

YouTube is in hot water again over how it deals with kids who use the video site.  

On Sunday evening, a coalition of 20 child advocacy, privacy and consumer groups filed a complaint asking the US Federal Trade Commission to investigate the Google-owned video site for alleged violations of children's online privacy laws.

The groups -- which include Common Sense Media, the Center for Digital Democracy and Parents Across America -- allege that YouTube violates COPPA, a federal law that regulates how sites collect data from users under 13 years old. If the FTC agrees, Google could face fines that add up to billions of dollars.


The crux of the complaint, the groups said, is that kids younger than 13 watch YouTube videos, even though the company's terms of service technically forbid them to do so. When anybody watches a YouTube video, the company collects certain types of personal information, such as location or what kind of device is being used, to help with ad targeting. The complaint says YouTube is violating COPPA because it doesn't get parental consent before collecting the data.  

"Google's violations are particularly egregious," the complaint reads. "Google had actual knowledge of both the large number of child-directed channels on YouTube and the large numbers of children using YouTube."

YouTube said it hasn't received the complaint yet. "Protecting kids and families has always been a top priority for us," a spokesman for the video site said in a statement. "We will read the complaint thoroughly and evaluate if there are things we can do to improve. Because YouTube is not for children, we've invested significantly in the creation of the YouTube Kids app to offer an alternative specifically designed for children." 

But the complaint says YouTube Kids, launched in 2015, doesn't make a difference when it comes to data collection because all of the content that's available on YouTube Kids is also available on the normal version of the site. The complaint also cites a Common Sense Media survey that says 71 percent of parents said their kids watch YouTube on its main site or app. Only 24 percent said their children use YouTube Kids.

The FTC said it hadn't received the complaint yet either, but is looking forward to reviewing it. "We take enforcement of COPPA very seriously and have brought more than two dozen COPPA cases since the COPPA Rule was enacted," an FTC spokeswoman said in an email.

The complaint comes at a trying time for YouTube. Last week, a shooter opened fire at the company's campus in San Bruno, California. Three people were hospitalized and the shooter killed herself, police said.

More broadly, the advocacy groups' call for an FTC investigation illustrates the heightened scrutiny of user privacy and data collection. Silicon Valley companies have long been criticized for the troves of personal data they keep on users in order to serve them ads. That practice has come under a microscope as Facebook deals with a data scandal affecting up to 87 million accounts: Cambridge Analytica, a digital consultancy with ties to the Trump presidential campaign, used the data without user permission.

Facebook CEO Mark Zuckerberg is set to testify before Congress over user privacy this week.

Meanwhile, YouTube has been in the doghouse with lawmakers, too. Russian trolls tried to abuse the site to meddle in the US presidential election. YouTube has also been criticized for unwittingly highlighting fake news and misinformation.

This isn't the first time YouTube has drawn ire over child-related issues. Last year, YouTube Kids' filters failed to pull down some videos featuring disturbing imagery aimed at children -- such as Mickey Mouse lying in a pool of blood, or a claymation version of Spider-Man urinating on Elsa, the Disney princess from Frozen. And videos of children doing innocuous activities like exercising were riddled with predatory or sexual comments from viewers.

After those scandals, YouTube tightened its policies to try to keep kids safe. The company cut off advertising revenue from inappropriate videos and added 10,000 content moderators to review objectionable content.
