After being petitioned by women's rights groups, Facebook has agreed to revise its policy on hate speech.
Although hate speech is prohibited in Facebook's Community Standards, the social network is often criticised for allowing racist, sexist and homophobic content to remain on its site, as well as offensive pages about the deceased.
Last year, for example, Facebook drew criticism for refusing to take down a racist "Aboriginal Memes" page that surfaced on the site, despite an ACMA investigation into whether such pages constituted a federal offence. The owner of the page ended up removing it himself, with no intervention from Facebook.
However, the social network has now agreed to revise its current practices in response to a petition signed by a large number of women's rights groups, led by Women, Action and The Media and The Everyday Sexism Project.
In an open letter to Facebook, the groups called upon Facebook users to ask advertisers whose ads appear next to pages promoting violence against women to withdraw their advertising until Facebook takes action:
Specifically, we are referring to groups, pages and images that explicitly condone or encourage rape or domestic violence, or suggest that they are something to laugh or boast about. Pages currently appearing on Facebook include Fly Kicking Sluts in the Uterus, Kicking your Girlfriend in the Fanny because she won't make you a Sandwich, Violently Raping Your Friend Just for Laughs, Raping your Girlfriend and many, many more. Images appearing on Facebook include photographs of women beaten, bruised, tied up, drugged and bleeding, with captions such as "This b***** didn't know when to shut up" and "Next time don't get pregnant".
In response, the website issued a statement saying that it would make a more concerted effort to monitor hateful content — although it would not remove everything. "We work hard to remove hate speech quickly, however, there are instances of offensive content, including distasteful humour, that are not hate speech according to our definition," said Marne Levine, Facebook VP of Global Public Policy.
Going forward, the website plans to institute a number of new practices to minimise hate speech, including: updating the guidelines by which Facebook evaluates reports of hate speech; better training for its review teams, including working with legal experts; requiring the creators of cruel and insensitive humour to attach their real identity to these pages; and working with human rights groups, including the Anti-Defamation League's Anti-Cyberhate group, both to research the effects of hate speech and to ensure faster responses to reported content.
In short, Facebook will allow "humorous" hate speech to remain on the site, but intends to approach reported offensive content with more sensitivity.
"Facebook is strongest when we are engaging with the Facebook community over how best to advance our mission," Levine said. "As we've grown to become a global service with more than one billion people, we're constantly re-evaluating our processes and policies. We'll also continue to expand our outreach to responsible groups and experts who can help and support us in our efforts to give people the power to share and make the world more open and connected."