The CEOs of Facebook, Google and Twitter on Thursday went head to head with US lawmakers, who hammered the three executives about misinformation, tech addiction and other problems plaguing some of the world's largest online platforms.
The executives have testified before Congress in the past, but Thursday's marathon hearing was the first time that Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey have appeared before lawmakers since the deadly Capitol Hill riot in January.
The spread of misinformation about the coronavirus, the 2020 US elections and other topics has heightened tensions between tech companies and lawmakers exploring new regulation. At various points during the more than five-hour hearing, members of Congress interrupted the tech executives and pressed them to provide straight answers to their questions with a yes or no response.
"You have the means, but time after time you are picking engagement and profit over the health and safety of your users, our nation and our democracy," Rep. Mike Doyle, a Pennsylvania Democrat who chairs the House Subcommittee on Communications and Technology, said during the hearing.
Here are some highlights from Thursday's hearing:
Children and screen time
Democrats and Republicans raised concerns about the negative impact social media could have on children. "Big tech is essentially handing our children a lit cigarette and hoping they stay addicted for life," said Rep. Bill Johnson, an Ohio Republican. Facebook-owned Instagram requires its users to be at least 13 years old, but the company is building a version of the photo-sharing app for children 12 and under. The company already offers a children's version of Facebook Messenger, and Google-owned YouTube also has an app for kids.
Zuckerberg: The Facebook CEO said his 3-year-old and 5-year-old daughters don't use most of the company's products. He does allow his older daughter, Max, to use Messenger Kids to message her cousins, and he watches educational YouTube videos with both kids. Zuckerberg pushed back against the idea that Facebook's products harm children but acknowledged that there are still issues that need to be worked out including "how people can control the experience of kids."
Pichai: Asked if Google has researched the effects of its products on the mental health of children, Pichai said the company consults widely with experts including mental health organizations. He added that YouTube works with partners to curate content for kids, surfacing videos about science, cartoons and Sesame Street. Pichai added later in the hearing that he agrees this is an important issue, noting he has children too and worries about their screen time.
Section 230
Lawmakers say they're exploring new regulation, including around Section 230 of the 1996 Communications Decency Act, a law that shields online companies from liability for content posted by their users. "Today our laws give these companies a blank check to do nothing rather than limit the spread of disinformation," said Energy and Commerce Committee Chairman Frank Pallone Jr., a New Jersey Democrat.
Zuckerberg: Facebook's chief has expressed support for changing Section 230. He said in prepared remarks that companies should "be required to demonstrate that they have systems in place for identifying unlawful content and removing it" but that they shouldn't be held liable if a piece of content evades their detection. He added, though, that lawmakers need to be wary about the impact on smaller platforms.
Pichai: Google is worried that changing or repealing Section 230 could make content moderation tougher or harm free expression. When asked if he supported Zuckerberg's proposed changes, Pichai said there are "definitely good proposals around transparency and accountability" that the company would welcome.
Dorsey: Echoing Pichai's remarks, Dorsey said Zuckerberg's ideas around transparency are "good" but added "it's going to be very hard to determine what's a large platform and a small platform."
The Jan. 6 attack on the US Capitol
The tech CEOs were pressed on the roles their platforms played in connection with the January attack on the US Capitol, in which a mob of Donald Trump supporters sought to stop the certification of the election. In the wake of the attack, the platforms all either suspended or banned Trump for his role in inciting the riot.
Zuckerberg: The Facebook CEO said the company tried to remove posts that could lead to violence and worked closely with law enforcement to identify the insurrectionists. However, he downplayed Facebook's role in the event. "I believe that the former president should be responsible for his words and that the people who broke the law should be responsible for their actions," he said.
Pichai: Google's CEO said YouTube had taken down thousands of videos that violated its rules in the lead-up to the riot. "We had clear policies and we were vigorously enforcing this area."
Dorsey: The Twitter leader said his company worked hard to remove posts and tried not to amplify misinformation. "We didn't have any upfront indication this would happen, so we had to react to it quite quickly." Asked to answer yes or no on whether the platforms "bear some responsibility" for disseminating misinformation that led to the storming of the Capitol, each of the CEOs waffled. Dorsey noted that lawmakers have to consider the "broader ecosystem" and "not just technology platforms we use."
Alleged anti-conservative bias
Many Republican lawmakers grilled the CEOs on the oft-repeated claim that the platforms censor conservative voices. The tech leaders denied the accusations, saying that they apply their policies regardless of politics.
Dorsey: The Twitter CEO was asked about the company's decision to block sharing of a New York Post article about Hunter Biden, son of President Joe Biden, that was posted three weeks before Election Day. Dorsey said Twitter's handling of the article was a "total mistake." "We don't write policy according to any particular political leaning," Dorsey said. "If we find any of it, we root it out."
Zuckerberg: Asked about removing content that could silence conservative and other voices, Zuckerberg said the company's artificial intelligence software doesn't always get it right. "We need to build systems and content in 150 languages around the world, and we need to do it quickly. And unfortunately there are some mistakes in trying to do this quickly and effectively."