
Facebook Could Stop Removing COVID-19 Misinformation

The social network requested advice from its oversight board on policy adjustments, like labeling or demoting misinformation rather than removing it.

Brian Rosenzweig, Editorial Intern
Facebook is considering loosening its policies on COVID-19 misinformation. (Getty)

Facebook may stop removing false or misleading COVID-19 posts, choosing instead to label or demote them, the social network said Tuesday. Parent company Meta is weighing the change and has asked Facebook's independent oversight board for advice on whether to modify its COVID-19 misinformation policy.

Facebook expanded its policy on harmful misinformation in early 2020, as the virus spread across the globe, allowing posts that could lead to an "imminent risk of physical harm" to be removed worldwide rather than only at the advice of local partners and experts. The change was designed to combat misinformation about the pandemic, such as false claims about the effectiveness of masks and social distancing, and about the transmissibility of the virus.

In late 2020, as the first doses of vaccines began rolling out, Facebook updated its policy to also remove vaccine misinformation. More than 25 million pieces of content have been removed since the beginning of the pandemic, according to Meta.

But now, amid rapidly shifting pandemic trends and a slumping stock price for the company this year, Meta is looking into revising the policy, beginning with input from its oversight board.

"The policies in our Community Standards seek to protect free expression while preventing this dangerous content. But resolving the inherent tensions between free expression and safety isn't easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic," the company said.

Facebook's content moderation guidelines have long been contentious: in recent years, the company has been accused both of enabling hate speech to optimize profit and of limiting free speech.

Meta's independent oversight board comprises legal advisers from various think tanks, professors from universities across the globe, journalists and human rights advocates. In this case, Meta requested an advisory opinion from the board, meaning any recommendations it provides to the company are nonbinding.