How has Facebook decided what's trending? Leaked docs shed light

Internal guidelines obtained by The Guardian show that the social media network has relied on a small editorial team to decide what news topics to show. Facebook says the guidelines are old.

Shara Tibken
Facebook has been accused of bias in the news topics it says are trending.

Screenshot by Shara Tibken/CNET

Turns out Facebook's trending topics have involved human influence, not just machines.

Documents leaked to The Guardian show that the social-media giant employed a small editorial team, in addition to algorithms, to decide what news should appear in its box of trending topics. The box sits at the top right of Facebook's desktop page and is one of the most high-profile pieces of real estate on the Internet. The team was given that power after Facebook was criticized for not including enough coverage of the 2014 unrest in Ferguson, Missouri, The Guardian said.

The team looked at 10 news sources, including The Guardian and BBC News, to decide if a topic had "editorial authority." The employees had the power to add a newsworthy topic or blacklist a topic for removal if it didn't "represent a real-world event."

Facebook told CNET that the guidelines posted by The Guardian aren't its current parameters. It plans to post an update Thursday.

Justin Osofsky, Facebook vice president of global operations, said in a statement that Facebook relies on more than 1,000 sources of news "to help verify and characterize world events and what people are talking about." He adamantly denied claims that Facebook has been biased.

"We have at no time sought to weight any one viewpoint over another, and in fact our guidelines are designed with the intent to make sure we do not do so," Osofsky said.

The rest of his statement read:

"The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum. Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period. What these guidelines show is that we've approached this responsibly and with the goal of creating a high-quality product -- in the hopes of delivering a meaningful experience for the people who use our service."

Earlier this week, allegations emerged that Facebook's trending topics list may have become a tool for employees to wield their political opinions. Facebook denied the claims, but Congress has said it may look into the issue.

The outcry over Facebook's editorial policies comes as the company takes more control over the news we all consume. A study released by Nielsen earlier this week showed that 89 percent of mobile-device owners use their handsets to access news. More than half of those who use social networks on a mobile device use them to find news. And social media now trails only television as a source of news, according to the study.

Update, 11:30 a.m. PT: Adds comment from Facebook that the guidelines cited by The Guardian aren't current.