Elon Musk has acknowledged he doesn't have the answers to all of Twitter's problems. Among its biggest: how to moderate divisive content shared on the service.
For years, Twitter and other social networks have struggled to police offensive content such as hate speech, harassment and misinformation that can lead to real-world harm. Progressives and civil rights activists say the platforms aren't doing enough to crack down on offensive posts. Conservatives say Twitter censors their speech, an allegation the company denies.
No one seems happy with how Twitter decides what gets taken down and what is left up.
Now Musk, the world's richest person, has thrust himself into the middle of the debate. On Monday, the Tesla and SpaceX CEO struck a $44 billion deal to buy Twitter, which he plans to take private. He's indicated he wants looser content moderation for the platform, a change that would have outsized influence on politics and society.
The deal has sparked concerns among employees and advocacy groups about whether the serial entrepreneur might undo efforts Twitter has taken to combat harmful content, such as COVID misinformation. It has also raised questions about whether Twitter could welcome back users the company has banned. Last year, the social network famously banned former President Donald Trump after the Jan. 6 insurrection because of fears his comments could incite violence. Trump has said he won't return to Twitter.
"Free speech is the bedrock of a functioning democracy," Musk said in a Monday press release, dubbing Twitter a "digital town square" for debating issues. On Tuesday, he tweeted, "By 'free speech', I simply mean that which matches the law."
(Free speech is protected under the First Amendment, which safeguards citizens from government interference, but that amendment to the US Constitution doesn't apply to companies, such as Twitter, which are allowed to set rules for moderating content.)
Content moderation is a nuanced practice, something Musk knows firsthand. In 2020, Musk falsely tweeted that "kids are essentially immune" from COVID-19. Children do catch the virus and can suffer the same effects as adults, though at a lower rate. Twitter told Axios the tweet didn't violate its rules against harmful coronavirus misinformation because it wasn't "definitive."
Katie Harbath, a former public policy director at Facebook who now leads consultancy Anchor Change, said in a Twitter direct message that she doesn't expect major changes right away at Twitter. Over time, however, Musk could allocate fewer resources to content moderation, Harbath said.
Musk will likely be more involved in decision making than Twitter co-founder Jack Dorsey was, she added.
A Twitter spokeswoman declined to comment beyond earlier statements the company has made. Before the deal was struck, Twitter said that it had no plans to reverse any past policy decisions and that its employees and managers make the day-to-day decisions.
Still, advocacy groups are sounding alarms about the deal.
"We should be worried about any powerful central actor, whether it's a government or any wealthy individual -- even if it's an ACLU member -- having so much control over the boundaries of our political speech online," the American Civil Liberties Union said in a tweet. Musk is an ACLU member and supporter of the nonprofit.
Jonathan Greenblatt, CEO of the Anti-Defamation League, said in a statement that Musk hasn't focused on tackling hate speech and extremism.
"We worry that he could take things in a very different direction," Greenblatt said. "Moreover, as a private company, Twitter will lack the transparency and accountability of a public firm."
Mariana Ruiz Firmat, executive director of technology-focused racial justice group Kairos, said Musk's ownership of Twitter will result in a rollback of content moderation under the guise of "free speech."
The deal, she said, "is alarming to the employees, advocates and users who have fought for years to push the company to adopt appropriate safety and disinformation guidelines."
Musk's views on content moderation
Musk, who once described himself as a "free speech absolutist," has publicly shared his thoughts about content moderation.
On April 14, Musk said in a TED Talk in Canada that Twitter should let users know when a piece of content gets promoted or demoted on the site. Free speech, in his opinion, means someone is allowed to say something on Twitter that another person doesn't like.
"I'm not saying I have all the answers here but I do think that we want to be very reluctant to delete things and be very cautious with permanent bans," Musk said. "Time-outs I think are better."
Last week, Musk tweeted that a "social media platform's policies are good if the most extreme 10% on left and right are equally unhappy."
But taking down speech is only a part of content moderation. Leaving up harassment or hate speech could have a "chilling effect" in which users, especially minorities and women, don't feel comfortable speaking on Twitter, said Emma Llansó, director of the free expression project at the Center for Democracy and Technology.
While opening up Twitter's algorithm could make the social network more transparent, spammers and bad actors could also use that information to try to game the system so their tweets get promoted higher in the timeline.
"Every decision like this ends up having those kinds of tradeoffs in content moderation," Llansó said. "It's where actually taking the time to think it through really carefully and understand both the intended and unintended consequences of a big move is going to be really important."