This is the second story in a two-part series about the complications of Facebook's unique media impact. The first part examines Facebook's struggles with challenges typically faced by media and news organizations.
Let's give Facebook some props: The first step is admitting you have a problem.
Facebook has always preferred to be a tech company first, a media power never. Facebook's role is to "build the tools...not produce any content," founder and CEO Mark Zuckerberg said in August. But with 1.8 billion people signing on monthly to share and revel in videos, pictures and news stories, Facebook has grown into a publisher of unparalleled scale.
Now, Facebook is tiptoeing toward taking that mantle of media giant. On Thursday, it took its boldest strides yet to check the spread of so-called "fake news," one of its most hot-button issues this year because of possible effects on the US presidential election. (The company rejects accusations that fake news on Facebook swayed the vote.) A day earlier, Facebook said it was eyeing the idea of producing original video content.
Facebook has been instrumental in redefining what it means to be a media company. Its instinct to deny that part of its identity may have fueled snafus and a perception that it's tone-deaf to its own influence. The fake-news and content moves hint Facebook is starting to heed the broader debate over its obligations as a media source. Facebook is showing that it can act like a media company but one that serves its own best interest by keeping users loyal.
"People just get sick of a particular platform, especially if a platform doesn't adapt," said Corynne McSherry, legal director of the Electronic Frontier Foundation, a nonprofit organization dedicated to digital civil liberties.
Facebook declined to comment for this story, but Zuckerberg on Thursday indicated he's beginning to change his tune.
"I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through," he said.
A new definition
One challenge Facebook faces is how rapidly the internet has changed the way people learn about and judge the world.
Society has shifted from an era of mass media to one of "individuated" media, according to Vin Crosbie, an adjunct professor at Syracuse University's Newhouse journalism school. In Facebook's case, those 1.8 billion users all go to the same place for news and information, but each one of them sees something different.
Social media, in particular, has caused a second sea change. Instead of people seeking out news and information themselves, social networks teach us to expect the news to come to us.
On platforms like Facebook, "we receive news already editorialized, already curated, and with a known bias" to keep us engaged, said Karen North, a professor of digital social media at the USC Annenberg School for Communication and Journalism. In other words, Facebook's objective is to feed us media it believes will make us hungry for more.
And people may not consciously realize that their personal bubble is missing other voices. Susan Robinson, a University of Wisconsin professor who analyzes how people talk to one another on social networks, said her research shows clear silos of conversation.
While Facebook may offer microphones to a wide variety of voices, "we're not getting the benefit in the mainstream of all those diversity of opinions," she said.
Facebook isn't the puppetmaster of this new media era, and nothing obligates it to stop fake news -- or even to protect the posting of a nude photo, no matter how culturally significant. Zuckerberg has warned, both last month and again this week, against Facebook acting as "arbiters of truth."
But experts note that for Facebook to accept the mantle of a media heavyweight, it needn't become judge and jury. It could simply adjust how it carries out its own stated mission: "to give people the power to share and make the world more open and connected."
Squelching fake news, for example, isn't the only way to manage it. "In the United States, our default presumption to counter speech you don't like is more speech," the EFF's McSherry said.
Facebook's latest measures against fake news seem to do just that. They also nudge the company away from the pure tech identity it has clung to.
The measures tap the company's algorithmic prowess to identify stories likely to be the worst "fake news" offenders. Significantly, Facebook is now relying more on human oversight, passing those algorithmic warning signs on to third-party fact-checkers who review and flag suspect items.
Those flagged stories will now include links that explain they're labeled as suspect. And new "warning labels" pop up before you share a disputed story, alerting you that the validity of the article is in question. Both provide more context before misinformation can spread -- Facebook's way of answering misleading speech with more speech.
Facebook might also have a special capacity to help people better appreciate differing points of view.
Rachel Davis Mersey, an associate professor at Northwestern University, specializes in audience reception of news -- or, as she puts it, teaching journalists how to tell stories people don't want to hear.
Facebook, by virtue of knowing so much about what you like, is in a unique position to present users with counterpoint stories.
"When you really know someone, you can talk to them about something you know is controversial in a way that they'll listen to it," she said. That would require Facebook to break away from its bread-and-butter -- social recommendations -- to put more emphasis on social responsibility. Such a change might risk reducing user time and activity on the site, which are key to Facebook's ad-based business model.
"I don't know if Facebook has a responsibility, but [it] has been pretty open about its desire about being a force for good," she said.
Robinson, the Wisconsin professor, noted that Facebook isn't the root of the fake news problem. If such articles did influence the US election, the deeper issues may be a lack of general media literacy and an education system that fails to teach citizens to think critically.
"We're asking Facebook to solve a societal problem," she said. "Can they help? I think they can...They have to recognize how powerful they are as a media company."