Lawmakers urge FTC to probe Google for pushing apps that allegedly violate child privacy law

The search giant's app store engages in practices that "mislead parents and harm kids," the lawmakers say.

Richard Nieva, former senior reporter
Richard Nieva was a senior reporter for CNET News, focusing on Google and Yahoo. He previously worked for PandoDaily and Fortune Magazine, and his writing has appeared in The New York Times, on CNNMoney.com and on CJR.org.

Lawmakers want the FTC to investigate Google. (Angela Lang/CNET)

Democratic lawmakers on Wednesday urged the US Federal Trade Commission to investigate Google for marketing apps on its Play Store that allegedly violate a federal child privacy law.

In a letter sent by Sen. Ed Markey of Massachusetts and Rep. Kathy Castor of Florida, the lawmakers take aim at a Google program called Designed for Families, which promotes apps the company says comply with the Children's Online Privacy Protection Act. The law, known as COPPA, regulates the collection of personal data from users under 13 years old.

But the lawmakers cite research published last month by child advocacy nonprofits that examined more than 150 apps in the program and found that nearly half of them share user data with outside parties.

"The FTC must use its full authority to protect the interests of children, many of whom are increasingly online during the coronavirus pandemic," the letter says. "Therefore, we urge you to investigate whether the Google Play Store has engaged in unfair and deceptive practices that mislead parents and harm kids."

Google didn't immediately respond to a request for comment. 

The letter comes as big tech companies face increasing scrutiny over child safety on their platforms. At a hearing last month with the CEOs of Facebook, Google and Twitter, the tech leaders were slammed by both Democrats and Republicans, who accused the companies of exploiting kids to make money.

Google has received blowback in the past for its treatment of children on its services. Two years ago, the FTC slapped the company with a record $170 million fine, along with new requirements, for Google-owned YouTube's violation of COPPA. In response, the video site made major changes to how it handles videos for kids, including limiting the data it collects from viewers of those videos.

YouTube Kids, a version of the video service made specifically for children, faced controversy in 2017 when its filters failed to catch videos that were aimed at children but featured disturbing imagery, like Mickey Mouse lying in a pool of blood, or PAW Patrol characters bursting into flames after a car crash.

The letter follows another push by Castor for child safety measures from Silicon Valley companies. In September, she introduced the Kids Internet Design and Safety, or KIDS, Act in the House. The bill would ban "auto-play" sessions on websites and apps aimed at children and young teens. The legislation would also ban push alerts targeting children and prohibit platforms from recommending or amplifying certain content involving sexual, violent or other adult material, including gambling or "other dangerous, abusive, exploitative, or wholly commercial content."