
Google will let minors request to have their pictures removed from image search

The search giant announces a number of protections for people under 18, including making YouTube video uploads private by default.

Richard Nieva, former senior reporter

Google announced policy updates to protect minors.

Angela Lang/CNET

Google on Tuesday unveiled a handful of policy changes aimed at protecting people under 18 from abuse on the search giant's platforms. 

The company will allow minors or their parents to request to have their pictures removed from Google's Image Search feature, a notable change because Google has historically taken a hands-off approach to managing its search engine. Google also said it will block targeted advertising based on the age, gender or interests of people under 18. 

YouTube, which is owned by Google, said it will change the default video upload settings for minors, automatically choosing the most private option available. The platform will also turn off autoplay by default for minors and turn on digital well-being tools, like alerts that remind people to take a break when they've been bingeing videos for a long time. 

The changes come as Silicon Valley companies have been in the hot seat over child safety. Apple stirred controversy last week when it announced it would scan iPhones for child exploitation imagery when those photos are uploaded to the company's iCloud storage service. The move has worried some privacy advocates concerned about the potential for surveillance and abuse. Google didn't respond to a request for comment asking if the company has similar plans for its Android mobile operating system. 

Google doesn't allow standard accounts for children under 13, though it has some products, like YouTube Kids, that are meant to be used by children with parental supervision. The company on Tuesday also said it will automatically turn on a feature called SafeSearch, which filters out explicit search results, for users under 18 who are signed in to their Google accounts. Minors also won't be able to turn on Google's Location History setting, which tracks where someone has been for maps and other products.

This isn't the first time Google has tweaked its search engine policies in an attempt to curb abuse. In June, the company said it would update its search algorithms to crack down on websites that publish unverified and slanderous posts about people. Google also updated its policies after the European Union ruled in 2014 that Google must alter search results as part of the "right to be forgotten." The standard lets residents demand that Google delete personal data about them from search results if the information is considered outdated, irrelevant or not in the public interest. 

YouTube has faced blowback in the past over its treatment of children's content. YouTube Kids drew controversy in 2017 when the service's filters failed to catch videos that were aimed at children but featured disturbing imagery -- like Mickey Mouse lying in a pool of blood or PAW Patrol characters bursting into flames after a car crash. 

Critics have also accused Google of skirting the Children's Online Privacy Protection Act, or COPPA, a federal law that regulates data collection from sites with users who are under 13 years old. In 2019, the US Federal Trade Commission slapped the company with a record $170 million fine, as well as new requirements, for YouTube's violation of COPPA. In response, the video site made major changes to how it treats kids' videos, including limiting the data it collects from views of those videos.