Near the end of a more than two-hour congressional hearing, Sen. Marsha Blackburn gave Instagram head Adam Mosseri a chance to speak directly to parents whose children have been harmed by the platform.
"We're not talking to people that have ever had any kind of response from Instagram and you have broken these children's lives and you have broken these parents' hearts," Blackburn, a Tennessee Republican, told Mosseri on Wednesday.
"To any parent who's lost a child or even had a child hurt themselves, I can't begin to imagine what that would be like for one of my three boys. As the head of Instagram, it's my responsibility to do all I can to keep people safe. I've been committed to that for years. I'm going to continue to do so," Mosseri responded.
US lawmakers weren't satisfied with Mosseri's reply. The executive was testifying during a Senate hearing, titled "Protecting Kids Online: Instagram and Reforms for Young Users," that focused on what Meta-owned Instagram knows about the impact of its service on young people. Mosseri's testimony comes at an uncomfortable moment for Instagram and Facebook, which rebranded itself as Meta. Frances Haugen, a former Facebook product manager turned whistleblower, leaked a trove of internal research to Congress and the US Securities and Exchange Commission after leaving the company in May.
Lawmakers still don't trust Instagram to self-regulate
Lawmakers kicked off the hearing by expressing their frustration that not much has changed to safeguard children online. In September, Antigone Davis, who runs Facebook's global safety operations, appeared before the same subcommittee. The Senate panel also held a hearing in October about online child safety with executives from Snapchat, TikTok and Google-owned YouTube.
Sen. Richard Blumenthal, a Connecticut Democrat, said his office on Monday created a fake Instagram account for a teenager and was soon shown recommendations for eating disorder content. The example was one of several anecdotes lawmakers brought up to illustrate how enforcement of Instagram's rules falls short.
"The resounding bipartisan message from this committee is legislation is coming. We can't rely on trust anymore. We can't rely on self-policing. It's what parents and our children are demanding," he said.
Ahead of the hearing, Instagram also announced new tools, including a feature that reminds people to take a break from the platform, to demonstrate that the company is serious about the mental health of its users.
Blumenthal said the new safety tools Instagram released "fall way short of what we need" -- and should have been released earlier.
Instagram pushes for the creation of an industry body
Mosseri told US lawmakers that keeping young people safe online is "not just about one company." One idea he pushed during the hearing is the creation of an industry body to determine best practices for protecting young people online, such as how to verify a user's age and how to build parental controls.
Citing a survey from Forrester, Mosseri also noted that teens appear to be using short-form video app TikTok and YouTube more than Instagram.
Companies like Instagram "should have to adhere to these standards" to earn protections under Section 230, a federal law that shields online platforms from liability for user-generated content, he said.
Sen. Ed Markey, a Massachusetts Democrat, and other lawmakers didn't appear to support that idea.
"Your idea of regulation is an industry group creating standards that your company follows. That's self-regulation, that's status quo, and that just won't cut it," Markey said.
Instagram Kids isn't permanently off the table
In September, Instagram said it was pausing development of Instagram Kids, a version of the photo-sharing app for children under 13. The company says the project is meant to give parents more control over the social media use of kids between the ages of 10 and 12 who may already be on the app.
But the project raised concerns among child advocacy groups, which say kids aren't developmentally equipped to deal with the social comparison and mental health risks that come with being on Instagram.
During the hearing, Blumenthal asked Mosseri if he would commit to permanently pausing Instagram Kids. Mosseri said that what he could commit to is that no child between the ages of 10 and 12 would have access to the app, if the company ever managed to build it, "without their explicit parental consent."
Teen accounts created on the web don't default to private
Instagram said in July that users under the age of 16, or 18 in some countries, would have their accounts set to private by default.
Blackburn, though, pointed out that her staff had created a fake Instagram account for a 15-year-old girl, but it defaulted to public, not private.
"While Instagram is touting all these safety measures, they aren't even making sure the safety measures are in effect," she said.
Mosseri said accounts for teenagers created on a mobile device do default to private, but that's not the case when accounts are created on the web.
"We will correct that quickly," he said.
Instagram could bring back the chronological feed next year
Mosseri said during the hearing he thinks users should have more control over their experience on Instagram, including the ability to view their feed chronologically. The company got rid of the chronological feed in 2016 and instead displays content that users are more likely to be interested in based on activity such as what they "liked."
Instagram is also working on a feed that displays the people users most want to see at the top, as well as a chronological version of Instagram.
"I wish I had a specific month to tell you right now, but right now we're targeting the first quarter of next year," Mosseri said.