TikTok, YouTube, Snap say child safety is a top priority. Lawmakers aren't buying it
The three social media platforms face senators already riled up about safety thanks to the leaked Facebook documents.
Queenie Wong, Former Senior Writer
Executives from TikTok, YouTube and Snap showed up to a congressional hearing armed with talking points about the importance of child safety, the initiatives they're putting in place to better engage parents, and the sophisticated artificial intelligence systems they employ to combat harmful content.
US lawmakers, however, weren't buying it.
"I just want folks watching to know that we're not taking at face value what you've told us," Sen. Richard Blumenthal, a Connecticut Democrat, told the companies at the end of a nearly four-hour hearing on Tuesday.
Blumenthal struck that tone throughout the hearing as he lambasted the three companies, bringing up content related to bullying and self-harm.
"More eyeballs means more dollars. Everything that you do is to add users, especially kids, and keep them on your apps for longer," Blumenthal said.
Blumenthal said he's heard from parents about the "rabbit hole" teenagers go down when they log on to TikTok, YouTube and Snapchat. His office, which created accounts on TikTok and YouTube as part of its own research, also found that content on extreme dieting and eating disorders is easy to find on these platforms.
"Like Big Tobacco, Big Tech has lured teens despite knowing its products can be harmful," he said.
The hearing also marked the first time Snap, the parent company of Snapchat, and TikTok testified before Congress. Snap was represented by Jennifer Stout, vice president of global public policy; TikTok by Michael Beckerman, vice president and head of public policy, Americas; and YouTube by Leslie Miller, vice president of government affairs and public policy. (Google, a subsidiary of Alphabet, owns YouTube.)
Lawmakers are considering legislation to update child privacy laws and Section 230, a law that shields social media sites from liability for content posted by their users. All three platforms signaled they might support changes to these laws, but they raised concerns that there could be unintended consequences.
"We see 230 as the backbone of the internet, and it is what allows us to moderate content," Miller said.
Snapchat, TikTok and YouTube tried to distinguish themselves from Facebook during the hearing, highlighting differences in how their products work. That wasn't enough to appease lawmakers who brought up problems they found on these platforms.
Stout said Snapchat was built as "an antidote to social media." Unlike Facebook, Snapchat doesn't have a News Feed or a like button. People use the disappearing-message app to communicate privately with their friends.
"We have a moral responsibility to take into account the best interest of our users and everything we do. And we understand that there is more work to be done," she said.
Lawmakers countered that drug dealers have used Snapchat and that the app surfaces sexualized content. In October, Snap rolled out new tools and educational resources on its platform to crack down on the sale of counterfeit pills and illegal drugs.
Beckerman said TikTok has built features to protect younger users. People under 16 have their TikTok accounts set to private automatically. TikTok also collects less data than some of its competitors such as Facebook and Instagram, he said.
"I appreciate your trying with gotcha questions, but I'm trying to be truthful and accurate," Beckerman said during an exchange with Sen. Ted Cruz, a Texas Republican. Cruz then accused Beckerman of dodging questions more than any other witness he's seen in his nine years in the Senate.
YouTube told Congress in prepared remarks that it removed 7 million accounts believed to belong to young children and preteens in the first three quarters of the year. Roughly 3 million of those removals came in the third quarter as the company "ramped up our automated removal efforts." (For context, more than 2 billion users actively visit YouTube each month.)
The company said autoplay is off by default on YouTube Kids and, on YouTube, for users under 18. YouTube also plans to launch more parental controls in the YouTube Kids app, including the ability for a parent to choose a locked default autoplay setting.
"There is no issue more important than the safety and well-being of our kids online," Miller said.
Sen. Marsha Blackburn, a Tennessee Republican, said her staff was able to find content that encourages self-harm and suicide on YouTube. "It's imperative that we take the steps that are necessary to prevent children and teens from seeing this content," she said. Miller said that YouTube prohibits the content Blackburn cited.