Facebook says it's a tech firm, not a media company. But as it continues to grow, so too does its media might, bringing along conflicts.
Joan E. Solsman, Former Senior Reporter
Facebook's media headaches don't start or end with fake news: With 1.8 billion monthly visitors, Facebook is a media-sharing powerhouse. But unlike the search giant Google or other big networks such as Twitter, Facebook exerts more control over what you see. Those News Feed algorithms are really, really good at getting you to click your way down a comfy rabbit hole. The election exposed how your own personal Facebook burrow -- with its echo chambers, fake news and entry points for abuse -- may not be the safest place to live.
The bigger it has grown, the more Facebook has struggled with challenges typically faced by media and news organizations. Even as Facebook has become the primary vehicle for media critical to public debate, like live videos in July that further ignited the Black Lives Matter movement, the company has continued to insist it is a tech platform, not a media company.
The wrinkle is that Facebook users may not draw so clear a distinction.
"Most people don't differentiate," said Rachel Davis Mersey, an associate professor at Northwestern University who specializes in audience reception to news. "Consumers of media see Facebook as a media company."
All those stories are false -- "fake news." They also sank into the brains of Facebook users better than real news did. When people complain that Facebook is failing to honor its obligations as a news outlet, though, they're only getting it half right.
Facebook is a major news source, and a growing one. Nearly half of US adults get news from Facebook, according to a Pew Research Center study this year. In 2013, 47 percent of all US Facebook users said they got news on Facebook; this year, it was 66 percent.
But criticizing Facebook's news judgment typically holds the company to editorial standards from a different age, like trying to nail horseshoes to the tires of a car.
"Your friends have the right to embellish and lie about their own lives," said Karen North, a professor of digital social media at the USC Annenberg School for Communication and Journalism. "What if Facebook said everything on there had to be true? There would be nothing left."
Facebook is stuck in a seemingly lose-lose scenario with news. The hands-off approach lets misinformation thrive, especially since false reports tend to play well with News Feed algorithms. Facebook sets up the News Feed to prioritize postings by your own friends and stories you're most likely to engage with: "liking," sharing, clicking a link and so on. Those items are also most likely to be ones that affirm your current point of view.
"We love to find evidence that supports the opinions we already have," North said.
Criticism of Facebook's media shortcomings doesn't stop with news. As a publishing forum for all kinds of media, Facebook has drawn criticism for fumbling sensitive subjects by removing material that deserves to be seen.
When Illma Gore painted a nude portrait of Trump with a tiny penis, her posting of the work went viral on Facebook. Someone, presumably a critic, filed what's known as a DMCA takedown request, accusing her of illegally posting media that belongs to somebody else.
As the creator and owner of the painting, she holds the copyright. Yet the request led Facebook to remove her posting and temporarily ban her from the site. That shut down her access to the network where the greatest amount of discussion about her was taking place.
"It felt like I couldn't speak," she said.
As a company, Facebook isn't legally obligated to protect anyone's speech, but its community standards say the site should "give people the power to share and make the world more open and connected." Copyright complaints offer a tool with which postings can be removed almost immediately with little review, according to Corynne McSherry, legal director of the Electronic Frontier Foundation, a nonprofit dedicated to digital civil liberties.
Facebook doesn't provide numbers about takedowns, but it boasts a monthly audience quadruple the size of WordPress'.
Takedowns aren't limited to copyright violations. Facebook also removes media and bans users over violations of its community standards, sometimes provoking outcries of censorship.
That was Tom Egeland's experience, when the Norwegian writer in August posted an iconic photo from the Vietnam War. Facebook removed the picture depicting a Vietnamese girl fleeing naked from her napalmed village, citing child nudity.
Others posted the photo in solidarity, including the prime minister of Norway, only to see the image deleted again and, sometimes, their accounts banned as well. Facebook eventually recognized that a historically pivotal photograph isn't the same as child pornography, and it restored some of the posts (but not Egeland's).
"Facebook's primary problem and challenge is to separate valid breaches of their community standards -- like porn -- from a post that should not be removed," Egeland said in an email interview. "Facebook's second problem is that nobody listens to you."
McSherry at the EFF said Egeland isn't alone in his frustration. Her organization has noticed two recurrent shortcomings in Facebook's community standards. One is a lack of clarity about how the standards are applied. The other is a lack of recourse.
"We hear over and over again, people don't know how to appeal...It can be days or weeks to get any kind of person to go back and say, 'Wait a minute; did we make a mistake?'" she said. "To have one company have so much power to decide what is and is not indecent is troubling."
Part two in this series addresses solutions Facebook could explore.