Facebook to de-emphasize more political posts because of user feedback

Earlier this year, the social network tested reducing political content for some people, and users were happy with the change.

Sean Keane Former Senior Writer
Sean knows far too much about Marvel, DC and Star Wars, and poured this knowledge into recaps and explainers on CNET. He also worked on breaking news, with a passion for tech, video games and culture.
Expertise Culture | Video Games | Breaking News

Facebook users say they want to see less political content on their News Feed.

Sarah Tew/CNET

Facebook said Tuesday it's expanding an experiment the company started earlier this year to reduce the amount of political content in people's News Feeds.

In February, the social network said it would test reducing political content for users in the US, Canada, Brazil and Indonesia. After receiving a positive response, Facebook is now testing the reductions in other countries, including Costa Rica, Sweden, Spain and Ireland, the company said in an updated blog post.

The company said it's going to put less weight on signals such as how likely someone is to comment or share political content and more emphasis on how likely people are to provide Facebook with negative feedback on ranked posts about politics and current events.

Publishers could see their traffic affected, the social network said. "Knowing this, we are planning a gradual and methodical rollout for these tests, but remain encouraged, and expect to announce further expansions in the coming months," Facebook said in the blog post.

This isn't the first time the social network has signaled an attempt to move away from politics and news. In early 2018, CEO Mark Zuckerberg said his company was overhauling the news feed to prioritize posts from family and friends, rather than those from publishers and brands.

Axios, which reported earlier about Facebook's new moves, noted that breaking news poses challenges for fact checkers and "is most likely to be exploited by bad actors for misinformation."

CNET's Queenie Wong contributed to this report.