Facebook lifts ban on decapitation videos

The social network will no longer remove graphic images or videos if members are sharing to condemn them.

Jennifer Van Grove Former Senior Writer / News
CNET

Facebook has lifted a ban, implemented in May, that prevented images and videos depicting graphic content such as beheadings and other acts of violence from being published to the social network. The company is returning to a prior practice of not policing violent content that members share in condemnation of the depicted acts.

The social network officially made the about-face earlier this year, though the BBC first caught wind of the controversial reversal on Monday.

"Facebook has long been a place where people turn to share their experiences, particularly when they're connected to controversial events on the ground, such as human rights abuses, acts of terrorism, and other violent events," a company spokesperson told CNET. "People share videos of these events on Facebook to condemn them. If they were being celebrated, or the actions in them encouraged, our approach would be different."

In May, Facebook took a different tack and said it would delete violent videos reported by users. At the time, the company was responding to a backlash that surfaced after two videos depicting executions were passed around the social network. Facebook said then that it was still evaluating its policy around the contentious content, essentially pressing pause on the heated matter for the duration of the summer.

Facebook declined to say exactly when it lifted the ban on graphic content. Meanwhile, Facebook's Statement of Rights and Responsibilities, one of the two primary policy documents governing conduct on the social network, continues to state: "You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence."

The social network now contends (once again) that graphic content such as beheadings is a violation of its policy only if it celebrates violence or is shared for sadistic pleasure. The Facebook spokesperson said, however, that the company is attempting to figure out the best way to give people control over the types of content they find on Facebook.

"Since some people object to graphic video of this nature, we are working to give people additional control over the content they see," the spokesperson said. "This may include warning them in advance that the image they are about to see contains graphic content."

Though Facebook had previously allowed such content, the lifting of the temporary ban will surely stir strong emotions among many members, especially parents who would prefer to shield their underage children from such graphic material.

Update, October 22 at 4:34 p.m. PT: Facebook told CNET Tuesday that it has removed a video that depicted a beheading in Mexico, but that it has not reversed the policy of allowing graphic content. The company said it would look at content on a case-by-case basis to determine whether the context surrounding graphic images or videos makes the material appropriate for the social network. An updated company statement, which includes some of the same text as Monday's statement, is included here:

People turn to Facebook to share their experiences and to raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it.

As part of our effort to combat the glorification of violence on Facebook, we are strengthening the enforcement of our policies. First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence.

Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience.

Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.

Going forward, we ask that people who share graphic content for the purpose of condemning it do so in a responsible manner, carefully selecting their audience and warning them about the nature of the content so they can make an informed choice about it.