Jan. 6 Capitol Hill riot forced social networks to look at their ugly side

On the insurrection's first anniversary, Facebook, Twitter and other social networks still face scrutiny for how they police political misinformation.

Queenie Wong, Former Senior Writer
Pro-Trump protesters gather in front of the US Capitol on Jan. 6, 2021.

Brent Stirton/Getty Images

Days before a pro-Trump mob stormed the US Capitol, social media posts foreshadowed the deadly Jan. 6 riot.

"You better be ready chaos is coming and I will be in DC on 1/6/2021 fighting for my freedom!," Maryland resident Andrew Ryan Bennett wrote in a Facebook post shared on Jan. 4, 2021, with #STOPTHESTEAL. Two days later, Bennett would livestream videos on Facebook from inside the Capitol. The videos included images of Bennet, who wore a baseball hat with a Proud Boys motto, chanting "Break it down!" outside the door of the speaker's lobby, where a woman was fatally shot, according to the FBI

Bennett was one of more than 700 people charged by federal prosecutors with crimes connected to the Jan. 6 Capitol Hill attack. Some of those people recorded their participation in the melee on platforms like Facebook and Google-owned YouTube.

A year after the insurrection, US lawmakers, researchers and journalists are still examining the role social networks played in the attack that left five people dead. Members of Congress have criticized the companies for downplaying their roles in the riot. Prompted by the attack, social networks have looked more closely at how they tackle misinformation spread by politicians and public figures. 

An investigation by ProPublica and The Washington Post published on Tuesday found evidence that Facebook played a "critical role in spreading lies that fomented the violence of Jan. 6." At least 650,000 posts in Facebook groups attacked the legitimacy of Joe Biden's presidential victory over Donald Trump and many called for political violence, the news outlets reported.

Trump and his supporters continue to peddle unfounded claims on social networks that the election was stolen from him. Concerned about the risk of violence, Facebook, Twitter and other social media sites last year took the rare step of booting Trump from their platforms.

On Thursday, Biden and Vice President Kamala Harris both spoke about the attack on the Capitol, calling on Americans to face the truth of the "brutal attack" that took place one year ago.

"We must be absolutely clear about what is true and what is a lie," said Biden, speaking from Statuary Hall in the US Capitol Building. "Here's the truth. The former president of the United States of America has created and spread a web of lies about the 2020 election."

During his remarks, Biden also criticized Trump for his lack of action on Jan. 6 as well as his role in instigating and inciting the mob. "They came in here in rage," Biden said. "Not in the service of America, but rather in service of one man."

Trump canceled a scheduled press conference at his Mar-a-Lago resort on Thursday but has a rally scheduled on Jan. 15 in Arizona. Here's a look at the impact the attack had on social networks:

Social media sites tweak policies, roll out new tools


Trump had a large following on both Twitter and Facebook.

Angela Lang/CNET

Trump's indefinite suspension from Facebook on Jan. 7, 2021, forced the social network, which renamed itself Meta in October, to examine how it moderates speech posted by public figures. Trump had 35 million followers on Facebook and 24 million on Instagram, a photo service owned by the social media giant. 

In June, Facebook introduced new enforcement protocols for content posted by public figures during times of civil unrest and violence. Facebook made the changes after a semi-independent oversight board upheld Trump's suspension but noted in its decision that the company doesn't describe indefinite suspensions in its content policies. Facebook then clarified that Trump will be suspended from Facebook for two years, and the social network said it would assess the risk of violence near the end of his suspension period, which runs until at least January 2023.

Despite Trump's suspension, Media Matters for America said Thursday that the former president's posts from Jan. 6 continue to receive interactions and that his fundraising committee has run hundreds of ads.

The social network also said it would provide regular updates in 2022 about when it leaves up content that violates its rules because of newsworthiness and would no longer presume that speech from politicians is inherently of public interest. Ahead of Biden's inauguration, Facebook said it removed content that included the phrase "stop the steal," banned US ads that promoted weapon accessories and protective equipment, and blocked the creation of new events happening around the Capitol, among other steps.

Trump had a much bigger presence on Twitter, with nearly 89 million followers. Twitter permanently suspended the former president for violating its rules against the glorification of violence. Also in response to the attacks, Twitter no longer allowed people to reply to, like or retweet tweets that violated its updated civic integrity policy and permanently suspended thousands of accounts that mainly shared QAnon content. 

The company then started asking the public for feedback about whether they believe world leaders should be subjected to the same rules as other users. It launched a pilot program called Birdwatch in which users can identify tweets they think are misleading and add more context and teamed up with Associated Press and Reuters to elevate credible information on the platform. 

"Our approach both before and after Jan. 6 has been to take strong enforcement action against accounts and tweets that incite violence or have the potential to lead to offline harm. Engagement and focus across government, civil society, and the private sector are also critical. We recognize that Twitter has an important role to play, and we're committed to doing our part," a Twitter spokesperson said in a statement.

US lawmakers still want answers from social networks


Google CEO Sundar Pichai, Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey testified before lawmakers in March.

James Martin/CNET

In March, US lawmakers grilled Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Jack Dorsey, who was Twitter's CEO at the time, about a variety of topics, including the insurrection.

"I think that the responsibility here lies with the people who took the actions to break the law and do the insurrection," Mark Zuckerberg said. 

While Zuckerberg downplayed Facebook's role, Dorsey acknowledged Twitter did play a part. Rep. Mike Doyle, a Pennsylvania Democrat, asked the executives to answer "yes" or "no" to a question about whether their platforms had fueled the spread of misinformation and the planning of the Capitol Hill riot.

"Yes," Dorsey replied. "But you also have to take into consideration the broader ecosystem. It's not just about the technological systems that we use."

In August, a House of Representatives select committee tasked with investigating the Jan. 6 attack requested records from 15 social media companies, including Facebook, Twitter, Reddit, TikTok, YouTube, Gab and Parler. As part of the request, the committee also asked the companies about any policy changes made to address misinformation, posts condoning violent extremism and other offensive content. 

The committee didn't respond to a request for comment about the investigation. 

A Facebook whistleblower comes forward

Frances Haugen testifies in the US House of Representatives

Former Facebook employee Frances Haugen testifies during a hearing before a subcommittee of the House Energy and Commerce Committee on Dec. 1. 

Alex Wong/Getty Images

Criticism of Facebook's role also came from inside the company. Internal documents gathered by former Facebook product manager Frances Haugen, acting as a whistleblower, shed more light on the social network's response to the Jan. 6 attack. Facebook's employees felt like the company didn't do enough to crack down on misinformation ahead of the 2020 US presidential election, the documents showed.

A complaint filed on behalf of Haugen to the US Securities and Exchange Commission accuses Facebook of misleading the public and investors about its role in perpetuating misinformation and violent extremism related to the 2020 election and the Jan. 6 insurrection.

Haugen's legal team disclosed redacted documents to Congress and the SEC. A consortium of news organizations, including CNET, also viewed the redacted versions. 

In the complaint, Haugen accuses Facebook of failing to adopt or continue measures to combat misinformation and violent groups, including content related to the Jan. 6 insurrection, to "promote virality and growth on its platforms." The social network, for example, could have done more to limit the resharing of posts with misinformation, the complaint said.

Since the release of the documents, US lawmakers and advocacy groups have been pushing for more regulation, including a federal data privacy law.

"It's time Congress ensures Facebook puts our safety over their profits," said José Alonso Muñoz, deputy communications director of United We Dream, a nonprofit focused on the immigrant community.

CNET's Carrie Mihalcik contributed to this report.