
Wiesenthal study details online hate, terror groups

New study from the Simon Wiesenthal Center finds that social networks are more popular than ever among hate and terror groups.

Lance Whitney, Contributing Writer
Lance Whitney is a freelance technology writer and trainer and a former IT professional. He's written for Time, CNET, PCMag, and several other publications. He's the author of two tech books--one on Windows and another on LinkedIn.

An online game that lets you bomb victims of the Haiti earthquake. A Facebook group that compares gays to rats and vermin. An eBay auction selling rings supposedly taken from concentration camp prisoners during the Holocaust.

These are just a few of the Web sites and pages uncovered in the latest study of hate and terrorism on the Internet by the Simon Wiesenthal Center.

The report "Digital Terrorism and Hate 2010," released last week, highlights a growing number of Web sites and social networks used by people propagating hateful, racist, or terrorist ideas and activities. In combing through the Web, the center uncovered about 11,500 different sites, networks, and forums that it categorized as hateful or terrorist--20 percent more than found in last year's study.

Those 11,500 may just be a fraction of the actual number of sites and pages, acknowledges the Wiesenthal Center. And it's not just the quantity of sites that alarms the center--it's the trends found among them.

Just as social networking and online video sites have boomed with the general online community, they've become popular as tools for those spreading hate and fear. Facebook is home to a variety of people and groups that urge violence against minorities and certain religions. Sites like YouTube and LiveLeak display videos that purportedly show you how to create a binary explosive, such as the type used by "shoe bomber" Richard Reid in 2001 and "underwear bomber" Umar Farouk Abdulmutallab in December 2009.

The Los Angeles-based center also found that the Internet is a growing factor behind the new "lone wolf" terrorist. Colleen LaRose, born and bred in America and better known as Jihad Jane, was reportedly sucked into the world of terrorism by watching YouTube videos and immersing herself in online chats devoted to jihadism. She also set up her own MySpace page to advance and spread her beliefs.

The Wiesenthal Center has produced a report on Internet hate for the past 12 years, though after the attacks of September 11, 2001, the focus changed to include terrorism as well. The full report is distributed on CD-ROM to law enforcement officials and government agencies. It is not released to the general public because of the sensitive material it contains, such as the instructional videos on how to build a bomb.

The center conducts its own research into the online world but encourages people to report any hateful or terrorist Web sites they find by using the e-mail address ireport@wiesenthal.com.

I recently spoke with Rabbi Abraham Cooper, associate dean at the Wiesenthal Center, to learn more about the latest study and get his views on hate and terrorism on the Net.

Q: Can you tell me about the content you cover in your latest study?
Rabbi Abraham Cooper: In our report, we look at two universes. With the hate groups, of course, we try to work with companies and communities to get this stuff thrown off the Net if possible. With the terrorism stuff, our focus is not to get it off the Internet but to make sure the right people in positions of responsibility and power are seeing the same things so they can track the bad guys and interdict them. It's the same sort of research tool, but it's two different universes.

If we're talking about hate speech, we're less interested in getting government intervention. I don't want a U.N. protocol written about what can and cannot get on the Internet, because that's just likely to make the people who run China and Cuba feel more comfortable. That's not our goal at all. When it's speech, we're looking for voluntary activism on the part of the Internet community here in the U.S., along with the rules that other democracies have, to try to marginalize the online hatred.

In the real world, you can't eliminate hate with laws. You can't eliminate hate with laws online either. So the idea is to try to get good citizenship online and get positive messages on the Internet. But as you well know, it's not the same as in the non-Internet world where the answer to hate speech is more speech. You can put up 20 Web sites with positive games for kids. But if they think it's cool to use a jet fighter to bomb victims in Haiti, they're going to play the game. So we do have that kind of limitation.

But when you have the behavior that sometimes finds its way onto Facebook or YouTube or LiveLeak, in which the issue is not speech at all but is really a matter of promoting a culture of death and backing and sometimes facilitating terrorism, that needs a completely different, aggressive, zero-tolerance approach on the part of anybody involved with the issue.

What trends have you seen in the types of content online?
Cooper: I think there are a couple of important lessons that did emerge. One is quite shocking. And it seems that every other day we hear about another...Jihad Jane or the Holocaust Memorial Museum shooter. And looking for common denominators, we can say that the Internet definitely plays a role in incubating and validating and promoting and sometimes actually instructing the individuals who want to go the extra mile to fulfill their beliefs and actually act them out.

Before September 11, this project was called "Digital Hate." On September 12, 2001, it became "Digital Terrorism and Hate." In the early years, the traditional hate groups--the KKK, neo-Nazis, and the like--were obviously thrilled, and we were very concerned, because they had access to this brand-new spectacular marketing tool and the states couldn't find any laws to stop them.

But what the groups found out fairly quickly was that [the Internet] was not their ticket to mass movements. It did give them more exposure. They were able to sell more of their products, maybe make some money. They were able to lash out against their targets and maybe spawn some hate crimes, which in fact did happen in the U.S. against Jews and gays and others. But the mass movements failed.

And many of these groups, including some of the more radical militias that would disappear after 2001, started saying: "Don't join the group." With law enforcement and human rights groups infiltrating these groups, they found it doesn't make sense for them to get together. Instead, [they would say:] if you're a true believer like we are, go to this Web site where you can teach yourself how to deploy like a terrorist. And that was actually this lone wolf concept that was fostered and nurtured by our domestic bigots and racists initially. And what we see more recently is that it's being embraced and co-opted not by the lunatic fringe in this country...but by the Islamists and the people committed to presenting an existential threat to our cities and our very lives.

A couple of hours after the underpants bomber was captured and his name came out, you saw his picture, with the plane in the background, being posted by the al-Qaeda types all over the world to say he was a great hero. And we have no idea if the person or group who posted the LiveLeak video of the binary explosive has a political agenda. But this notion of sharing that kind of information--how you, too, can help vaporize something or someone--is extraordinarily distressing. And for law enforcement and the people trying to contain the terrorist threat, it must give them terrible headaches and challenges. How do you combat the lone wolf stuff?


Your report mentioned social networks as a new outlet for hate groups. What are you seeing on sites like Facebook and other networks?
Cooper: Our last year's study actually mentioned Facebook in the report, but there's plenty of Facebook material in 2010. We give high marks to Facebook. They're 400 million-plus people and counting. We've sent our senior researcher to lecture to their people at their headquarters. I've been up there two or three times. I think they actually have the right rules in place. The numbers, and the languages, and the scope of what's available are so stunning that it is a huge challenge to them. So I give Facebook good grades for trying to do the right thing and trying to upgrade its own monitoring.

On the other hand, when I talk about a company like eBay, that's not a matter of speech either--they're just in it to make a buck. They're very honest about that. In the case of eBay, there's...an ad for an SS Death's Head ring that's being auctioned, probably in violation of its own rules. Then we had a guy who's clearly a neo-Nazi by virtue of his own posting, which has the number 88 for HH, which is Heil Hitler, in which he claims that his uncle was posted at Dachau, took rings from victims in the camp, and here's one he's selling. And we e-mailed eBay and went through the usual way of communicating with them. We never heard back and only got results because the Jerusalem Post captured the page and put it in its newspaper. And a reporter from the Pittsburgh Post-Gazette found a human being to speak to. So they dropped that auction.

eBay auction selling a ring supposedly taken from a concentration camp prisoner. Simon Wiesenthal Center

I think some of these companies can do a lot better just to do their share. If the Internet is one big shopping mall, they can do a lot more than they are currently doing. And that sometimes means investing a little bit more in human resources at their end. We're ready at our end to be a facilitator for the community when people send us stuff that they think is problematic. We have the address ireport@wiesenthal.com. But it would be very helpful if companies actually had human beings who were not merely e-mail addresses but could be spoken to in order to help facilitate corporate decisions without us having to run to the media every time to get their attention.

How can the social networks and Web sites distinguish between content that's legally protected and content that should be stopped?
Cooper: You can do it one of two ways. You can do it like they do in Canada, the U.K., Germany, and Australia, where they have traditions on where they draw the line on speech and apply those pre-Internet rules to any Web site in their domain. The approach that we try to take with the U.S. stuff is that we come to you and show you what the problem is.

For example, with Facebook, we don't agree with all of their decisions when it comes to speech. They still allow presentations of Holocaust denial, which they consider to be speech. And we say in this era, we think you're wrong. We think this stuff could lead to violence and hate crimes and denigrations. They said, "We have the rules. We hear what you're saying. But in this particular case, we don't agree with you." We understand the rules of engagement. We're not always going to be in agreement. But at least in the case of Facebook, there are rules. It's their rules. And you try to deal as best you can to get a lot of the bad stuff off.

On terrorism-related stuff, it should take any company a nanosecond to say: "I'm getting this to the local law enforcement or FBI or Homeland Security. This doesn't have any place in our solar system." So I've tried to recast your question from a speech issue to one of community responsibility of a for-profit company. And when you're dealing with terrorism, it's the post-9/11 world, and there we have a whole different approach. The unseen 24/7 battle that's going on out there in terms of terrorism is on the Internet and all the technologies thereof.

I was [recently] at UCLA for the eighth memorial for Daniel Pearl, which his parents sponsor each year. When Danny Pearl's horrible execution was shown, it didn't come to CNN. It was released in the first 24 hours on four different Web sites. Later on, we saw that each of those Web sites was based in the United States. That ain't speech. That's promotion of terrorism. And on that kind of discussion, I would say I hope there's no one out there who would say this is protected speech that should be able to be posted. No, that is terrorism-related.

Also, during the height of the insurgency battles against the U.S. and coalition forces in Iraq, al-Qaeda in Iraq and every single one of these groups, in addition to carrying out their roadside bombings and suicide attacks on hotels, would capture all of their activities and post them online, almost in real time. So the Internet is now part and parcel of the terrorist operation. It's as important to them to plan how they're going to broadcast the beheading of Daniel Pearl as it is to pick out the people who are actually going to kill him.

So for us, the two universes couldn't be more distinct. Debates about who's a terrorist? Yes. Promotion, fund-raising, recruitment, validation of terrorism? No. That belongs in the hands of the people who are there to protect us.

How was this study actually done? How were all these sites and content uncovered?
Cooper: Most of our work is done the old-fashioned way. We deploy about 70 percent of our research hours during the year to being online. That is where the action is on both the hate and the terrorism. Having said that, we have a universe of about 400,000 constituent families. So in theory, that could be a million computers. And then at Wiesenthal.com, we have about 300,000 folks who get our e-mail blasts. When you combine those figures, it means a lot of our leads come from the public.

The hate game on Haiti came from somebody--we did not see it. Some of the overseas stuff comes from the community at large. And then we have offices in South America and in Europe. And we have the capability, even though it only scratches the surface, to see what goes on in Arabic in terms of terrorism. That's also a two-way street. It's not just looking at what's in Arabic about terrorism. But it also gives us an insight into Islamophobia--how another community feels that itself is under attack. And that also comes under the rubric. In the online world, there's plenty of Islamophobia out there.

The other point is that we made a decision from day one that we do not look exclusively at anti-Semitism. It doesn't make any sense because the racists and the bigots are increasingly using the same sort of language, the same sort of symbolism, sometimes the same games. So for us, it's always been that if we're going to do this, we have to try our best to do this across the board. And it's not easy because we're only one private organization. But we're very proud of our team.

Looking at the content for this latest study, was there anything you saw that surprised and shocked you?
Cooper: We're already worried about the seepage of the poison from the Islamists to our domestic bigots. But to come to the conclusion that the reverse has happened with the lone wolf is to me stunning and shocking because it proves yet again how much attention and effort the terrorists are giving to analyzing what's going on over the Internet. They're on the cutting edge of this technology. They're looking at everything. And whatever the Internet is prepared to give them, they're going to take and co-opt. And they're always sort of half a step ahead of the good guys. So that was the most unexpected result of what we saw.

I think the other part is how much of this stuff is really oriented toward kids, or spreading the hate toward kids. The Haiti hate game is the only thing that gave me a visceral reaction in the game area. The other one that's still around is a game where you have to blow up as many people as possible as a suicide bomber. That's been around for four or five years. I have a lot of interaction with young people, and inevitably, it's always the same. The adults in the audience are shocked out of their minds. And then you ask the kids how many of them have seen this stuff. [And the answer is] everyone.

Online game that encourages people to bomb victims of the Haiti earthquake. Simon Wiesenthal Center

We have to remember it's still a youth-driven and -oriented culture. And the bad news is that the bad guys know that and focus their creative attention on trying to win over young people to their world view. And I think that's a very corrosive thing in our world where you don't need mass movements to change history. That is a tremendous challenge to try to make sure the hate doesn't play out in dramatic ways. It means that parents have to actually talk to their kids.

Forget about blocking mechanisms, because kids can disable them in two seconds. You have to get the young people involved. You have to explain to them what the Net is and isn't. The most important lesson for adults: never use the Internet as a babysitter. If we want to protect the young people, we have to inform and empower them.

We also need to develop a consortium approach. We're going to need the collective attention of the geniuses giving us the Internet technology. They have to have a small piece of their operation looking at these issues. They can't make believe the problem doesn't exist. We have to be able to work with governments overseas. The U.N. and collective protocols written about the Internet are absolutely not the answer to this. They're not the answer to the hate. We don't want to make dictatorships that are deep into censorship more comfortable with the Internet. We want to make them feel less comfortable as a result of the Internet.

Our only hope is to get the companies more directly involved, in terms of both their business model and their community. They have to take these issues into account.