We already know Facebook manipulates its algorithms to test our emotions and encourage us to vote. Now it seems CEO Mark Zuckerberg and his team may have tried to affect our politics as well.
Two reports from the tech blog Gizmodo allege that Facebook's trending topics list, which sits at the top right of its website and is one of the most high-profile pieces of real estate on the Internet, may have become a tool for employees to wield their political opinions.
Gizmodo claims that Facebook's "news curators," as they're called internally, were told to suppress news stories from politically conservative news outlets. That meant stories from outlets like Breitbart News and The Washington Examiner were excluded or demoted in the trending topics list in favor of more traditional publications like The New York Times and The Wall Street Journal. Other stories, including pieces about key conservative commentators and politicians such as Mitt Romney, the former Republican candidate for US president, weren't allowed to appear in the list at all, Gizmodo alleges, citing unnamed former news curators.
Facebook said it takes the allegations "very seriously," adding that it has "rigorous guidelines in place for the review team to ensure consistency and neutrality." The company initially declined to comment on the specifics of Gizmodo's claims, but in a statement issued late Monday, Facebook vice president of search Tom Stocky said the company had "found no evidence that the anonymous allegations are true."
The questions surrounding Facebook's alleged behavior strike at a growing concern over the company's influence around the globe and how its employees wield it. In this case, Facebook says employees have a hand in writing descriptions for trending topics, but how it selects "trending" news was assumed to be based on computer algorithms, not the company's political whims.
Facebook is used by more than 1.65 billion people each month, making its population larger than any country on Earth. It's also become a political and media powerhouse, serving as a go-to place where governments and politicians can connect with citizens. And it's a key outlet for news organizations to popularize their stories.
When employees manipulate users' news feeds to see how they'll react emotionally, or to encourage them to vote, it raises doubts about whether we can trust Facebook as a neutral place for connecting people, "to share and make the world more open and connected."
Now, if Gizmodo is right, it appears some employees allowed their own biases to influence not only what news Facebook's users saw, but also how important each item was.
Journalism experts and media commentators are crying foul. Facebook's trending topics list was assumed to be a list of the most popular things on Facebook, they say, not a curated list of what Facebook's employees think is relevant.
"With Facebook's trending stories, we had every right to think that this represented the wisdom of the crowd," said Dan Kennedy, an associate professor of journalism at Northeastern University.
Facebook's site even said as much, describing its trending topics list as "a list of topics and hashtags that have recently spiked in popularity on Facebook." The list is supposedly influenced by Pages you've liked, your location and what's trending across Facebook, but it doesn't say employees have a hand in selecting topics.
Until he heard the allegations of misconduct, Kennedy assumed the list was controlled by an algorithm, like the "most e-mailed" list of stories showcased on many news sites. The mistake the Internet giant may have made, he said, is that it wasn't forthright about how stories and topics make the list. "If they just called it 'editor's picks,' there'd be no problem."
All this drama likely won't stop people from using Facebook's service, said Rory O'Connor, an adjunct professor at Stony Brook University's School of Journalism. But Zuckerberg & Co. should fess up to their mistake anyway, just as they did two years ago after word got out that they were toying with our emotions.
But getting an admission, or apology, doesn't really change much, added O'Connor. "It's a big mistake and it's a huge problem, but what can we do about it? Not much."
Updated at 10:50 p.m. PT with Facebook denial.