Commentary: First it was the Disney Princess quizzes. Then it was Russian propaganda. Now Facebook is going to ask who you trust when it comes to news. That’s dangerous.
One day soon, Facebook may ask you two seemingly straightforward questions that could decide the future of news on your feed.
1. "Do you recognize the following websites?" (Yes/No)
2. "How much do you trust each of these domains?" (Entirely/A lot/Somewhat/Barely/Not at all).
These are, in fact, actual survey questions written by teams at Facebook.
The questions stem from a decision by Mark Zuckerberg, Facebook's CEO, who said last week that he's going to seek the wisdom of the crowd -- that is, the 2 billion monthly users of his service -- to determine which media organizations are writing honest and trustworthy stories worthy of appearing in your feed.
The world's largest social network, with a population greater than that of any country on Earth, by default won't consider facts, honesty or professionalism when judging news organizations.
Instead, Zuckerberg and his team are going to survey random people -- maybe some of your friends, maybe not -- who'll decide which publications are most trustworthy. Whatever Facebook learns from us -- and a Facebook spokesman told me it won't make any of those details public -- will filter down into how often you see my stories in your feed.
Yes, your ranting Uncle Ed may help determine whether you see the next big scoop from The New York Times or Wall Street Journal or CNN or Fox News.
"People who use Facebook have made clear that they want to see accurate, informative and relevant news on Facebook, as well as news from sources they trust," a Facebook spokesman told me. "The question was how to measure that. We could try to make that decision ourselves, but that's not something we were comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask the community, and have their feedback determine the ranking."
So, he added, "We decided that having the community determine which sources are broadly trusted would be most objective."
Welcome to Facebook's vision of journalism in the 21st century. No wonder many people are calling out Zuckerberg and asking, with a heavy note of irony, "What could go wrong?"
Why does all this matter? More people than ever get their news from social media sites, with Facebook taking the top ranking in a Pew survey.
So it's probably no surprise that reaction to Zuckerberg's decision by media experts and those who follow tech closely has been largely negative, given that Facebook seems to be abdicating its responsibility as a news distribution service by not vetting the pieces people share.
This isn't that different from how Facebook has acted before. The social network has been criticized for allowing Russian agents, white supremacists and other propagandists to fool readers with genuinely "fake" news stories, and for creating filter bubbles, in which Facebook's mysterious algorithms show you only stories that reinforce a point of view.
This latest move -- to crowdsource credibility -- seems like a logical extension of that, said Michael Kearney, an assistant professor at the University of Missouri School of Journalism. "It feels like Facebook is taking the easy route to please people now," he said. That stands in stark contrast to the work that fact-checker websites often have to do.
What's odd is that Facebook employs some of the smartest engineers on the planet, all working to "bring the world closer together." Why isn't it smart enough to figure out how to clean up its propaganda and fake news problems?
Some people believe Zuckerberg instead may just be playing us. Emily Bell, a professor at Columbia's Graduate School of Journalism, is going with the "playing us" theory, given that Facebook makes its money (more than $10 billion in 2016) from letting advertisers target users.
"If you really wanted to rank news outlets by credibility and reliability there are many better ways to do it than ask 2 billion people," Emily Bell, founding director of the Tow Center for Digital Journalism at Columbia's Graduate School of Journalism, told her Twitter followers after Zuckerberg posted his plan last week. "HOWEVER, if you want to collect personal data from 2 bn people about media preferences to sell to advertisers…"
By the way, Rupert Murdoch, the chairman of News Corp., put out a statement Monday arguing that some of Facebook's ad money should be paid back to news organizations.
"If Facebook wants to recognize 'trusted' publishers then it should pay those publishers a carriage fee similar to the model adopted by cable companies," he wrote.
There's one more theory to consider. Andrew Keen, a tech critic and author, thinks maybe Zuckerberg is responding to souring attitudes. People, he told the tech news site Recode, are realizing that the way social networks operate is "not in their best interests."
"Mark Zuckerberg has been rearranging the deck chairs on the Titanic with these latest reforms," Keen said. "I'd like to see him really acknowledge the problem and deal with it directly and come up with radical solutions."