
Can Facebook overcome its darker side?

The social network faces controversies over livestreamed violence, murder and now revenge porn. The challenge lies in balancing safety and free speech.

Richard Nieva Former senior reporter

Facebook CEO Mark Zuckerberg is dealing with controversies over revenge porn and livestreamed killing.

James Martin/CNET

Facebook CEO Mark Zuckerberg has said for years, with a nearly religious zeal, that his mission is "making the world more open and connected."

But in the last few weeks, that more open and connected world has led to headlines about violence, murder and, most recently, revenge porn shared on his social network. People have used Facebook Live, the company's video livestreaming service, to broadcast murders over the internet. Meanwhile, Facebook is fighting the problem of revenge porn, defined as nude or near-nude photos shared in order to "shame or embarrass" someone.

In April, Zuckerberg even had to take up stage time during his most important speech of the year, at the company's annual F8 developer conference, to offer condolences to the family and friends of a Cleveland man whose killing was shown in a video posted to Facebook.

Over the weekend and on Monday, a pair of stories from The Guardian delved into how Facebook handles illicit content on its site. One story detailed Facebook's manual for moderators who review posts flagged as objectionable. The manual gives specific examples of what's OK and what's not. For instance, photos of animal abuse or bullying of children can reportedly be shared, while calling for an assassination cannot. Another story on Monday said Facebook saw 54,000 cases of potential revenge porn in just a single month.

Taken together, the controversies give us a glimpse into the sheer power and challenge Facebook has: deciding what its nearly 2 billion users can and can't see, and figuring out how to make those decisions. Facebook doesn't get specific about how many flagged items it reviews a week, but the company has said it's "millions."

Facebook has come a long way from the tiny website Zuckerberg created in his Harvard dorm room 13 years ago. In the last year, the social network has been criticized for the rise of fake news, which detractors argue tipped the scales in the US election toward President Donald Trump. The site has also been blamed for "filter bubbles," which some argue distort people's view of the world because nearly everything they're fed on Facebook aligns with what they already think.

"The hard part is balancing the goal of being a social media platform -- letting people communicate with each other in a wide variety of ways -- without poisoning the well," said Larry Downes, project director at the Georgetown Center for Business and Public Policy.

Facebook says the biggest "hurdle" is understanding the context of different posts.

"It's hard to judge the intent behind one post, or the risk implied in another," Monika Bickert, head of global policy management at Facebook, wrote in a blog post Tuesday. "Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?"

'Nowhere near prime time'

To curb the problem, Zuckerberg said earlier this month, Facebook will add 3,000 content moderators by the end of the year -- on top of the 4,500 it already has on the job around the world.

Experts say it's a good start, but only a Band-Aid for the problem.

Eventually, Facebook wants software to do the work, with artificial intelligence to decide what stays and what goes before posts spread across the internet. But even Zuckerberg admits it's going to take time, saying earlier this month we shouldn't hold our breath. "That will take a period of years, though, to really reach the quality level that we want," he said.

"It would not surprise me that that we're really nowhere near prime-time level," said Jennifer King, a Ph.D candidate at the UC Berkeley School of Information. King co-taught a class on the effects of communicating on computerized platforms. "It's a really tough problem."

She should know. Before Berkeley, she worked at Yahoo from 2002 to 2004, helping moderate Yahoo Groups, which she describes as working "at the front lines of what I call the toxic waste of the internet." The software had a hard enough time identifying content in photos, King said. Videos make the problem even trickier.

Having worked on the ground to battle objectionable content, she's sympathetic to Facebook's burden in having to sort through the millions of posts. But she also said Facebook and Zuckerberg should have thought more about how people would abuse services like livestreaming.

"I don't think there's any excuse in 2017 to say 'Let's just throw it open and see what happens,'" King said. "We know what will happen. It will be bad."

Originally published May 23 at 5:00 a.m. PT.
Updated 9:51 a.m. PT: Added a quote from Facebook's blog post on community standards.
