Section 230: How it shields social media, and why Congress wants changes

The law has served as a legal shield for Facebook, Twitter and YouTube. Now lawmakers say it should be changed to hold companies accountable.

Congress is looking to strip protections for social media companies in an effort to rein in disinformation online. 

Caroline Brehman/CQ-Roll Call, Inc via Getty Images

A 25-year-old law that shields social media companies from lawsuits over content their users post is under attack, as leaders in Congress look to hold those companies accountable for disseminating health-related misinformation. 

Last week, Sens. Amy Klobuchar of Minnesota and Ben Ray Luján of New Mexico introduced the Health Misinformation Act, a bill that would create an exception to Section 230, a provision in the Communications Decency Act that gives legal protections to social media companies over user-generated content. This exception would make companies like Facebook, Twitter and YouTube liable for inaccurate statements about health information, including misinformation about COVID-19 vaccines, in addition to other false health-related claims.

The bill is the latest in a slew of legislation aimed at reforming the liability shield. Democrats and Republicans on Capitol Hill generally agree that changes need to be made to Section 230. Calls for reform have taken on new urgency as social media sites battle a flood of troubling content, including disinformation about the coronavirus vaccines, the outcome of the US presidential election and the deadly attack on the US Capitol.

It's not just theoretical: Extremist content and conspiracy theories posted on social media platforms have led to real-world violence. In spite of the companies' efforts to address these concerns, lawmakers have expressed frustration that it's not enough, and they've urged the companies to do more. 

"Earlier this year, I called on Facebook and Twitter to remove accounts that are responsible for producing the majority of misinformation about the coronavirus," Klobuchar said in a statement. "But we need a long-term solution."

Republicans have widely called for reforming or repealing the law, arguing that the Silicon Valley powerhouses are biased against conservative views and censor conservatives, like former President Donald Trump, while giving liberal politicians a pass. 

Democrats agree that reforms are needed, but they see the problem differently, arguing that Section 230 prevents social media companies from doing more to moderate their platforms, such as taking down or limiting hate speech and misinformation about COVID-19.

"Throughout the COVID-19 pandemic, social media companies like Facebook, Twitter, and YouTube did little while COVID-19 related misinformation spread on their platforms -- fueling distrust in public health officials, promoting conspiracy theories, and putting lives at risk," Luján said in a statement. 

Tech companies say Section 230 protections, which shield them from liability for their users' posts and also let them moderate harmful content without facing repercussions, allowed online platforms to flourish in the early days of the internet. And they see even well-intentioned reforms as potential threats to the internet. 

"While nobody is denying the prevalence of harmful misinformation around the pandemic and vaccines, attacking Section 230 will actually make it harder for platforms to remove harmful-but-not-illegal content," said Evan Greer, director of the grassroots advocacy organization Fight for the Future. "And in the process it will also threaten human rights and free expression for marginalized people."

Facebook said it plans to work "with Congress and the industry as we consider options for reform." Twitter and Google have yet to comment on the latest legislation. 

As the influence and size of companies like Google (which owns YouTube), Twitter and Facebook have grown, lawmakers say they're determined to rein in their power. 

Here's what you need to know about the government's potential role in regulating social media:

What is Section 230? 

Section 230 is a provision of the 1996 Communications Decency Act that protects companies that host user-created content from lawsuits over posts on their services. The law shields both internet service providers, like AT&T, Comcast and Verizon, and social media and internet platforms, like Facebook, Twitter and Google. 

Section 230 isn't blanket protection. There are exceptions for federal crimes or intellectual property claims. A company could still be held accountable if it knowingly allowed users to post illegal content.

The law provides social media companies with sweeping protections that let them choose what content they restrict, and how. This means social media platforms can't be sued for taking down content or leaving it up. 

Why did lawmakers think this was a good idea?

By eliminating liability risk, Section 230 has allowed companies to experiment. Without it, Twitter and Facebook almost assuredly wouldn't exist, at least not as they do now. And it isn't just big companies that gain from the law. Nonprofits have benefited too. Many experts say the law has enabled the internet to develop into a medium that allows ideas and political discourse to flow freely. 

OK. So what are the problems with Section 230?

Most of the problems around Section 230 involve which posts social networks allow to stand and which ones they remove. 

Democrats are most concerned about getting big social media companies to take down hate speech, harassment, disinformation and terrorism-related content. They have even accused the companies of using the liability protections as a means to profit from the lies spread on their platforms.

Republicans allege that social media companies censor conservative viewpoints. This is a narrative Trump used in the lead-up to the 2020 presidential election when Twitter and Facebook began slapping warning labels on his posts for containing inaccurate information. 

After the election, Trump used social media to falsely claim victory. Following the deadly attack on the Capitol, he was banned by Twitter, Facebook and other social media platforms. 

How is Congress proposing to fix these issues?

As the rhetoric around Section 230 has heated up, lawmakers on both sides of the political aisle have introduced a flurry of legislation over the past year. Some bills would eliminate the liability protections entirely, while others would alter or refine them. Still others would strip the protections and require companies to earn them back by showing they moderate content in a politically neutral way. 

But the fact that so many bills have been introduced -- some of which overlap in scope, and some of which are vastly different in approach -- is perhaps a good indication that there's no easy fix for this issue. 

Where do the executives of the big companies -- Facebook, Google and Twitter -- stand on regulation?

Facebook CEO Mark Zuckerberg has expressed support for changing Section 230. Google CEO Sundar Pichai has said he has concerns about changing or repealing the law, noting during a congressional hearing earlier this year that there could be unintended consequences that make content moderation tougher or that harm free expression. Twitter CEO Jack Dorsey echoed Pichai's concerns that restrictions could be difficult to enforce and could have unintended consequences, especially for smaller platforms. 

What do tech advocates think of these reforms?

The Electronic Frontier Foundation, Fight for the Future and other groups have argued it'll do more harm than good to change liability protections to address the many concerns people have with social media companies. They fear these regulations could lead to more censorship as social media companies try to minimize their legal risk. 

They say there are better ways to address the issues that concern lawmakers than gutting Section 230. For example, they say stronger federal data privacy legislation could limit the ability of companies like Facebook to target misinformation directly to the people most susceptible to it. They also say increased antitrust enforcement and the restoration of net neutrality protections would help limit the monopoly power of these companies. And they want to see greater transparency in regard to how algorithms manipulate content and newsfeeds. 

"We don't agree that more aggressive content moderation on its own will address the harms of Big Tech," said Fight for the Future's Greer. "And we fear that without structural changes, more aggressive platform moderation and content removal will disproportionately harm marginalized people and social movements."

Didn't then-President Trump issue an executive order directing the FCC to write regulations on Section 230?

Yes. Trump heightened the debate over Section 230 in May 2020 when he issued an executive order directing the Federal Communications Commission to establish regulations that clarify the parameters of the good-faith effort Section 230 requires online companies to make when deciding whether to delete or modify content. At the heart of Trump's executive order was the claim that social media sites censor conservative viewpoints they disagree with. 

Does the FCC have any authority to make rules limiting Section 230?

That was the big question. The FCC's top lawyer said it did. But Democrats and watchdog groups, such as Public Knowledge, said the FCC doesn't have the authority to impose these regulations.  

Then the election happened. Trump lost, Biden won, and the chairman of Trump's FCC, Ajit Pai, didn't pursue writing new regulations. 

What's President Joe Biden's stance on Section 230?

Biden has also weighed in on the flood of inaccurate health information. Last week, he said that vaccine misinformation on platforms like Facebook is "killing people," and he pressed the social network to do more to combat it. In response, Facebook said it's taken down more than 18 million pieces of COVID-19 misinformation.

The White House said last week that it's reviewing whether social media platforms should be held legally accountable under Section 230 for publishing misinformation. 

This isn't Biden's first attack on Section 230. When he was a candidate for president, he argued that social media companies don't deserve protection because they knowingly allow false information on their platforms. 

In an interview with The New York Times editorial board in January 2020, Biden called for Section 230 to be "immediately" revoked. But unlike Trump, Biden is likely to recognize that any changes to the law must come from Congress. 

So, what's next?

It's clear lawmakers are ready to do something. But exactly what reform will look like is unclear. Stay tuned.