Facebook said Monday it's taking a tougher stance against misinformation about COVID-19 and vaccines as part of an effort to prevent the online lies from causing harm.
The social media giant is expanding a list of debunked claims about COVID-19 and vaccines it will take down. Facebook said it works with public health authorities, such as the World Health Organization, to compile this list.
The policy, which covers posts on its photo-sharing service Instagram, includes claims that COVID-19, the respiratory illness caused by the novel coronavirus, is human-made or manufactured. Facebook and Instagram users will also not be allowed to post false statements about vaccines being ineffective at preventing the disease, vaccines being more dangerous than getting the disease, or vaccines being toxic, dangerous or causing autism. Facebook said it will remove ads that contain those claims.
"We will begin enforcing this policy immediately, with a particular focus on pages, groups and accounts that violate these rules, and we'll continue to expand our enforcement over the coming weeks," Guy Rosen, Facebook's vice president of integrity, said in a blog post on Monday. Facebook said groups, pages and accounts on the main social network and Instagram that share these false claims repeatedly "may be removed altogether."
The social network works with third-party fact-checkers and typically labels misinformation, but draws a line when false claims could lead to physical harm. The company has been reluctant in the past to pull down anti-vaccination misinformation. "If someone is pointing out a case where a vaccine caused harm or that they're worried about it — you know, that's a difficult thing to say from my perspective that you shouldn't be allowed to express at all," Facebook CEO Mark Zuckerberg told Axios in September. Facebook says it generally allows posts that include a personal anecdote or experience or satire. The company, though, will remove posts about vaccines and diseases if they could lead to "reduced vaccinations and harm public health and safety."
Facebook and other social networks have faced an onslaught of misinformation about the coronavirus since the pandemic started last year. The company has come under scrutiny, including from lawmakers and politicians who say Facebook isn't doing enough to combat this problem.
Despite these increased efforts, misinformation about the coronavirus vaccine is still spreading on the social network. CNN found that four of the top 10 search results for "vaccine" on Instagram were anti-vaccination accounts. In the coming weeks, Instagram plans to make accounts that discourage people from getting vaccinated harder to find in its search results.
Facebook's takedowns of COVID-19 misinformation have also been challenged by an independent oversight board tasked with reviewing some of the social network's toughest content moderation decisions. In January, the board overturned Facebook's decision to pull down a COVID-19 post for its potential to cause harm. The board found the social network's rules addressing health misinformation "to be inappropriately vague" and urged the company to create a new standard.
Facebook also has an online hub with authoritative information about the coronavirus and said this week it will feature links with information about whether you're eligible to get vaccinated and how to do so. Facebook also said it's giving $120 million in ad credits to health ministries, NGOs and UN agencies to spread information about the COVID-19 vaccine and preventive health.
More than 2 billion people from 189 countries have viewed Facebook's COVID-19 Information Center and informational messages, according to the company. The social network has also pulled down more than 12 million pieces of content on Facebook and Instagram with false claims that could lead to imminent physical harm.