Hate speech on Facebook: How much is too much?
The release of a Jewish human rights center's annual report about "hate 2.0" coincided with a sticky situation for Facebook over Holocaust denial groups on the site.
NEW YORK--One of the most troubling things about the proliferation of hate speech on social media sites is the potential exposure to young people, Rabbi Abraham Cooper of the Simon Wiesenthal Center said here on Wednesday.
The Los Angeles-based Wiesenthal Center, a Jewish human rights advocacy group, had just released its annual report, which this year focuses on the proliferation of hate and intolerance on social networks. The audience consisted primarily of students from Manhattan's Independence High School who were enrolled in a class about genocide and ethnic violence and who had been invited to listen to the presentation and provide their reaction to its conclusions.
"As more and more people are going to MySpace, YouTube, and especially Facebook, the extremists...they're going to exactly the same neighborhood," said Cooper, who met with Facebook representatives in Palo Alto, Calif., earlier this year to voice concerns about the amount of content promoted by extremists on the social network.
The timing was especially apt considering the recent prominence in the news of Holocaust denial groups on Facebook, and the social network's insistence that such groups would be removed only if they directly advocated violence or made threats. That, company representatives said, is what constitutes a violation of the site's terms of service.
It's obviously an extremely contentious issue. TechCrunch blogger Michael Arrington posted a long entry last weekend accusing Facebook of hypocrisy for allowing Holocaust denial groups to remain intact while all forms of nudity are 100 percent banned on the social network. Some commenters applauded his stance against Facebook, whereas others accused Arrington of "page view trolling" or argued that "allowing these groups to post in public places like Facebook makes it easier to create tabs on when merely speech (though appalling) turns into a push for violence against the hated group."
Facebook employee Ezra Callahan joined the debate, posting a long "note" on Facebook about why he supports the company's decision to leave some of the Holocaust denial groups intact. Callahan, who is Jewish (as is, he pointed out, company founder Mark Zuckerberg), wrote, "I find the mounting pressure on us to remove Holocaust-denying groups incredibly frustrating. I feel no shame at all working at a company that holds free speech as its core ideal in setting content guidelines, even if the end result is the occasional presence of content that I find personally outrageous and offensive."
"Silencing stupid people is not how you make stupid people go away. It's by pointing out how stupid they are and bringing those people into the light of day so everyone with a shred of common sense can see who they are and remember never to give them an ounce of respect in any aspect of life," Callahan wrote. "You do not combat ignorance by trying to cover up that ignorance exists. You confront it head on. Facebook will do the world no good by trying to become its thought police."
Callahan, as many commenters on his original post pointed out, may not be justified in saying that Holocaust denial groups that don't directly incite violence shouldn't be removed from Facebook. At the event Wednesday, Cooper suggested that when it comes to an atrocity on the scale of the Holocaust, anything promoting denial of its existence amounts to advocating violence.
But Callahan does have a point. If Facebook aspires to a culture of free speech, where should the line be drawn? There are a lot of fringe ideas and beliefs in religion, culture, and even academia that the Internet has allowed to bubble to the surface, from theories about 9/11 having been carried out by the Bush administration to environmental extremists who believe it's a moral and just act to vandalize Hummer dealerships. Many gay rights activists would say that some very mainstream religious denominations' views of homosexuality are tantamount to hate speech, and some animal rights activists would undoubtedly argue that a Facebook group for hunting fans serves to incite violence.
The issue also stands when it comes to comedic and satirical content on the Web. Should YouTube pull a clip from the movie "Borat" in which star Sacha Baron Cohen performs a "folk song" promoting the marginalization of Jewish people, because the three-minute clip doesn't explain that what appears to be a vicious anti-Semitic tirade is actually a satirical routine performed by an edgy Jewish comedian?
Cooper made it clear in his talk on Wednesday that there is no way to eradicate hate speech on the Web, bringing up a screenshot of a prominent white supremacist on YouTube who has been banned by the Google-owned video-sharing site over 60 times and keeps coming back.
"If they're spending all their creative time on hate, they will more often than not find ways to come back," he explained.
Cooper said that what's more important--and why the centerpiece of the announcement was the presence of a class of high schoolers--is education and awareness. The Wiesenthal Center distributed an "action plan" for parents that advocated tips like "make sure your child understands the difference between legitimate criticism or analysis and hate that seeks to rewrite history," and "communicate and challenge your kids: just because it's posted doesn't make it true or real."
Facebook might not be right in refusing to take down all Holocaust-denying content, and indeed, the social network is in a tight spot here. But here's where it's right on: if Facebook doesn't handle the situation carefully, it could set a precedent that's dangerous at worst and annoying at best (as when LiveJournal started purging its ranks of accounts that housed tawdry "Harry Potter" fan fiction). When it comes to the young and impressionable, deciding where to draw the line should be up to parents and educators, not the technology company that built the site that lets you "poke" your friends.