Facebook shirks responsibility, says experts can't be trusted

Commentary: In asking Facebook's so-called community to decide which news sources are trustworthy, Mark Zuckerberg offers a truly disturbing rationale.

Chris Matyszczyk

Technically Incorrect offers a slightly twisted take on the tech that's taken over our lives.

[Photo: Facebook CEO Mark Zuckerberg. An expert in responsibility avoidance? Paul Marotta/Getty Images]

The man whose mission it is, this year, to fix Facebook would prefer you to do it for him.

All of you.

All of you who can be bothered to answer surveys on Facebook, that is.

This is the remedy CEO Mark Zuckerberg offered on Friday for deciding which media outlets are trustworthy sources of information.

Zuckerberg said that in order to combat "sensationalism, misinformation and polarization," Facebook would be slipping some questions about media sources into its quality surveys.

Facebook will ask some questions about whether you think a certain publication is trustworthy, and then it will look at all the data it's collected from respondents.

That's going to work, isn't it?

Some will say that the Daily Anarchist is a fine, trusted source. Others will insist the truly trustworthy one is the Hammer and Sickle Express.

Zuckerberg hopes, though, that everyone who responds will find at least one or two publications from the "other side" trustworthy. Quite a hope.

Might it be that extremists will answer these surveys far more readily than slightly more reasonable people? You know, those who are very tired of the world's current nonsense and would prefer sanity to return from its overly long vacation.

Zuckerberg prefers to cede responsibility to Facebook's so-called community because it means the company itself doesn't have to take a meaningful stance. Which is exactly what so many who consider it a media company believe it should do.

"We decided that having the community determine which sources are broadly trusted would be most objective," Zuckerberg said in his Facebook post.

This would be the same community that apparently got fooled in quite some numbers by Russian-written fake news posts around election time.

That aside, there's one sentence in Zuckerberg's explanation that mightn't pass some people's trustworthiness test.

"We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem," said Zuckerberg.

How odd, some might think, that he chooses to apparently denigrate experts in an era when expertise is being shunned by many. The government, for example.

Experts who insist that climate change is real are dismissed as scaremongers.

Why, last week most of the members of the National Park Service advisory panel quit because, they said, Interior Secretary Ryan Zinke isn't interested in meeting with them.

And here is Zuckerberg saying that, well, we could have appointed a panel of experts who might have actual knowledge of this misinformation stuff but, nah, we'll ask anyone who'll answer our surveys.

Facebook didn't immediately respond to a request for comment.

Facebook insists it has a "community." I put the word in inverted commas, as I'm not convinced there's any such thing. Just because you have an enormous number of users, it doesn't mean they're bonded into anything other than the small groups in which they spend their Facebook time.

Still, Facebook has often preferred to throw decision-making over to that "community." Because it allows the company to avoid responsibility for larger decisions. And, oh, it's cheaper.

In 2016, for example, Facebook decided that the "community" should determine which events warranted mobilizing its Safety Check system. The company also said it relies on users to help it decide what should and shouldn't be considered hate speech.

In that latter case, though, it claims it learns from experts, as well as the community. 

How odd, then, that it doesn't believe experts can be objective when it comes to determining which news sources may be bunkum-based.

At a time when people seem ever more polarized -- partly because of the misinformation they're being fed from all sides -- Facebook believes those same people are the best repositories of objectivity.

Well, it's far easier than, say, figuring out your criteria for what is and isn't misinformation, publicizing those criteria and then hiring a panel of, oh, wise expert types to help you ensure they're enforced.

That's unlikely to happen, though. Facebook would actually have to stand for something.