
Facebook updates standards to explain what it will remove

The social network updated its content guidelines to more clearly explain what is and isn't acceptable on its service, as it balances free speech with the removal of offensive content.

Luke Lancaster Associate Editor / Australia

(Screenshot: Facebook)

Facebook has updated its community standards to clarify the content that people are and aren't allowed to share.

In an update Sunday, Facebook said it has rewritten much of the language of its community rules, offering "more detail and clarity on what is and is not allowed." The company said the revised community standards will offer more guidance for users about what is acceptable to post to the social network.

"These standards are designed to create an environment where people feel motivated and empowered to treat each other with empathy and respect," Monika Bickert, Facebook's head of global policy management, and Chris Sonderby, Facebook's deputy general counsel, said in the post.

Facebook's updated policies follow efforts by social networks across the Internet to respond to a rising tide of high-profile posts, tweets and photos that have upset many users. Though such content represents a small sliver of overall activity, these incidents have come to typify the dramatic struggles some users face.

In the past year, feminist bloggers have faced torrents of harassment over a variety of issues. Zelda Williams temporarily stopped using Twitter after users sent her unsettling messages following the announcement that her father, comedian Robin Williams, had died by suicide. There was also the wide circulation of photos depicting the beheading of photojournalist James Foley. Each incident drew attention to how social networks police themselves.

The updated policy reiterates Facebook's stance against harassment and provides "more guidance on policies related to self-injury, dangerous organizations, bullying and harassment, criminal activity, sexual violence and exploitation, nudity, hate speech, and violence and graphic content."

Also included are sections on protecting intellectual property and account security, as well as "encouraging respectful behavior."

Facebook relies on user reports to deal with offensive or prohibited content, and based on the revised guidelines, the social network has no plans to change that system. "If people believe Pages, profiles or individual pieces of content violate our Community Standards, they can report it to us by clicking the 'Report' link at the top, right-hand corner," Bickert and Sonderby said.

Facebook has also said that it may restrict content in specific countries in accordance with those countries' laws, even if the content doesn't violate Facebook's own standards. Citing the example of blasphemy, Facebook said "if a country requests that we remove content because it is illegal in that country, we will not necessarily remove it from Facebook entirely, but may restrict access to it in the country where it is illegal."

In addition to the updated guidelines, the post also noted government requests for both account data and content removal made to Facebook during the second half of 2014. There was an 11 percent increase from the previous six months in the amount of content removed for violating local laws, totaling "9,707 pieces of content restricted" from July through December 2014. Meanwhile, the number of government requests for account data remained fairly flat, rising to 35,051 from 34,946. Facebook details these figures in its Government Requests Report.