
The Supreme Court's NetChoice Cases Could Change Online Speech Forever

The court is debating whether Florida and Texas laws governing social media companies violate the First Amendment.

Ian Sherr Contributor and Former Editor at Large / News


A pair of Supreme Court cases due to be ruled on later this year may change the future of online speech. They're also raising questions about legislative power over big tech companies.

The two cases, NetChoice v. Paxton and Moody v. NetChoice, are a direct result of the Jan. 6, 2021, attack on the US Capitol, which left five people dead, including a Capitol Police officer. Then-President Donald Trump had rallied supporters at the White House, some of whom marched up to the Capitol in an effort to disrupt Congress from counting Electoral College votes to formalize his defeat by then-President-elect Joe Biden.

In response, social media companies, including Twitter, Facebook and YouTube, banned Trump's account, citing concerns he might incite more violence in his efforts to overturn the election. Investigations, including from ProPublica and The Washington Post, found that those social networks had played "a critical role in spreading lies that fomented the violence of Jan 6."

Lawmakers and governors in Texas and Florida responded with laws that included must-carry provisions, effectively requiring platforms like Twitter, Facebook and YouTube to host controversial speech whether they want to or not. NetChoice, a tech industry group, sued to block both laws, defending the rights of social networks to moderate content and make editorial decisions.

Now, the Supreme Court will decide whether social media companies must carry speech, including Nazi rhetoric and medical disinformation.

After nearly four hours of court debate on Feb. 26, according to reports from The Washington Post, The New York Times and The Wall Street Journal, the Supreme Court justices appeared skeptical of the arguments on behalf of Texas and Florida.

The Supreme Court hasn't said when it will rule, but it typically announces decisions for high-profile cases at the end of its term in mid- to late June.

The First Amendment

Many proponents of online speech point to the First Amendment to the US Constitution as a guide for how companies should moderate expression online.

The First Amendment says, "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."

The first five words, "Congress shall make no law," are central to this issue. Until social media became a daily part of many of our lives, there was little debate that the First Amendment had been written to keep the government from deciding what Americans can and can't say. But as social networks have grown to include billions of people, in some cases with online populations larger than any country on Earth, some politicians have begun arguing that it should apply to big tech platforms too.

Section 230

Some lawmakers have focused their attention on a 1996 law called the Communications Decency Act, which included a provision, called Section 230, protecting tech companies from legal liability for what's posted on their platforms. But the law also allows services to remove posts that are illegal, obscene or violate their platform rules, so long as they're acting in "good faith."

Lawmakers say that Section 230 has been twisted. Republican lawmakers argue that tech companies use the law to justify censoring posts they don't like. Many Democrats, victims rights organizations and anti-hate groups say the law has allowed big tech to profit off widespread harassment, disinformation and violence.

As a result, lawmakers who don't usually agree on much of anything have found themselves on the same side, opposing tech companies in this debate.

"Nobody elected Big Tech executives to govern anything, let alone the entire digital world," Sens. Lindsey Graham, a Republican from South Carolina, and Elizabeth Warren, a Democrat from Massachusetts, wrote in The New York Times last year.

The Electronic Frontier Foundation, Fight for the Future and other groups warned that changing liability protections in Section 230 to address issues around online expression could lead to more censorship. Social media companies, for example, could start clamping down on broad swaths of online speech in order to minimize their legal risk.

Social media companies have already shown they're willing to take drastic action. Even though Congress hasn't passed any substantive laws around the issue, Facebook parent Meta said it would demote and slow the spread of news and political discussions on its platforms in an effort to avoid repeats of broad disinformation campaigns, including those that helped spark the Jan. 6 attack on the US Capitol.

The Supreme Court's NetChoice arguments

Many of the Supreme Court's nine justices revealed skepticism about the Texas and Florida laws during the marathon session to hear the case on Feb. 26.

Chief Justice John Roberts said the First Amendment plays an important role in the debate over whether social media companies or the government have power to decide which voices are heard online.

Other justices, including Sonia Sotomayor and Ketanji Brown Jackson, expressed concern about the nature of the laws being overly broad. Justice Elena Kagan suggested the laws may be unconstitutional when applied to tools for expression, like Facebook and YouTube.

But Justice Kagan also said the laws could legitimately stop a ride-hailing company like Uber from kicking people off its service because it doesn't like their political views. Right-wing activist and failed Florida congressional candidate Laura Loomer was banned from Uber and Lyft in 2017 after she accused the services of "hiring Islamic terrorists."

What's next

The justices will meet over the next few months to debate and draft their opinions, whether concurring with or dissenting from whatever the majority decides. It's unlikely the court will say anything until the end of its term, typically around late June or early July.

We may get a sense of the court's decision before its official announcement, though. Two years ago, a draft decision in Dobbs v. Jackson Women's Health Organization leaked to Politico, alerting the public to the Supreme Court's eventual decision to overturn the landmark Roe v. Wade ruling that legalized abortion across the US. The court said it conducted an investigation to find the leaker, but nothing appears to have come of it.

Meanwhile, the Supreme Court has become more central to the political world, amid numerous stories of scandals concerning ethical lapses and financial malfeasance by some of the court's highest-profile justices.