
In matters of life and death, how should Facebook decide what you see?

Live streams of shootings in Minnesota and Dallas raise questions about the social network's responsibilities.

Joan E. Solsman, Former Senior Reporter

This week brought horrifying realities to your newsfeed.

For 10 minutes on Wednesday, Facebook Live brought millions of users next to Philando Castile after he was fatally shot by a Minnesota police officer during a traffic stop. His girlfriend, Diamond Reynolds, livestreamed the aftermath of the shooting, allowing viewers to watch the bloodied Castile slump over in his seat. They saw his breathing slow. They heard her 4-year-old daughter, who witnessed the shooting, try to comfort her mother after Reynolds screamed and started to cry.

Facebook Live captured another deadly scene the next day in Dallas. Michael Kevin Bautista streamed live video of cops crouching while one or more snipers fired on them amid a peaceful demonstration against police violence.

Diamond Reynolds live-streamed Philando Castile moments after he was shot by a Minnesota police officer.

STF/AFP/Getty Images

Both the Castile and Dallas videos were initially streamed unedited and uncensored. The Castile video temporarily disappeared from the social network because of a "technical glitch," according to Facebook. It was restored later with a warning about its graphic nature. Bautista's video was viewed more than 3 million times over about 10 hours before Facebook added the same warning. Such a warning prevents clips from automatically playing and requires viewers to opt in to watch.

Facebook's treatment of the broadcasts highlights its dilemma as it increasingly distributes the news -- often generated by users like Reynolds and Bautista -- to its 1.1 billion users. Facebook prioritizes live video in your newsfeed but isn't required to be transparent about its editorial decisions the way a traditional news outlet is.

Whatever was behind the removal and reappearance of Reynolds' video, "those were editorial choices," said Emily Bell, a professor at Columbia's Graduate School of Journalism and the founding director of the Tow Center for Digital Journalism. "Whether it was done automatically or manually, it doesn't matter."

"This responsibility was going to be handed to Facebook whether they wanted it or invited it, or not," she said.

Nearly half of US adults get news from Facebook, according to a Pew Research Center study conducted this year. In 2013, 47 percent of the social network's US users said they got news on Facebook; this year, it was 66 percent.

Despite that growing impact, people are "deeply confused about whether algorithmic ordering or human systems are bringing up what they see," Bell said.

Openness

That confusion erupted into controversy in May, when workers on Facebook's influential "Trending Topics" section were reported to have suppressed news from conservative publications. The reports shattered the presumption that the listing was a purely data-driven ranking. Facebook's opaque policies around Trending Topics -- holding workers there under nondisclosure agreements, for example -- made some suspect the company had something to hide.

"If they're really going to start having more news, they're going to be faced with the question about whether they need to have a more editorial role," said Karen North, a professor at the University of Southern California's Annenberg journalism school and the director of its digital social-media program.

Snapchat, for example, has a news division headed by a former CNN political reporter who curates users' posts into "Live Story" collections. Each focuses on a news event, like the presidential campaign or the Orlando, Florida, mass shooting. Facebook doesn't have anything comparable, North said.

Facebook declined to elaborate on the glitch that removed Reynolds' video. When asked how the company applies editorial judgment, a spokeswoman pointed to its community standards on graphic and violent content. The company doesn't proactively monitor for violating posts, she said. Instead, it relies on users to flag content, which is then reviewed by staff trained in Facebook's policies. Staff will decide whether to add the warning screen to graphic content that otherwise meets Facebook's standards.

Minneapolis NAACP President Nekima Levy-Pounds leads a chant of "Hands up, don't shoot" outside the Minnesota governor's mansion one day after Philando Castile's death.

Stephen Maturen/Getty Images

Facebook's guidelines indicate it takes responsibility for judging content. "We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence," the rules state.

The guidelines don't explicitly protect content that serves the public interest, but they define Facebook as a place to share experiences and raise awareness. They note that depictions of violence, such as human rights abuses or acts of terrorism, can be shared as a means of condemning them.

On Friday, Facebook explained how its Community Standards apply to Facebook Live. The company said it has a team on call 24-7 to review flagged videos. It also gave an example of how the standards work.

"One of the most sensitive situations involves people sharing violent or graphic images of events taking place in the real world," the company wrote. "In those situations, context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video."

The arrival of live video on social media echoes the change in TV news in the 1960s, when Americans first began to see vivid footage of soldiers fighting in Vietnam and riots on the streets at home. Just as those images provided insight into the horrors of war and racial conflict then, this week's videos depict violent racial tension in the US. Facebook now livestreams it to a global audience of unprecedented size.

Facebook co-founder and CEO Mark Zuckerberg has been outspoken about the need to provide a place for stories like Castile's.

"While I hope we never have to see another video like Diamond's, it reminds us why coming together to build a more open and connected world is so important -- and how far we still have to go," he said Thursday evening in a Facebook post.

The company will eventually have to decide if its dedication to openness extends to its own editorial standards.

Update, 6:31 p.m. PT: Late Friday afternoon Facebook posted a note on how its Community Standards apply to Facebook Live. This story has been amended to include that information.