YouTube removed Auvinen's videos and profile page from the site within hours of the murders. While many believe that's a proper and respectful response to the violence, some observers wish YouTube had allowed the videos to stay on the site so users could gain a better understanding of what happened.
"Yes, many people would find those videos disturbing," said Sonja Baumer, a postdoctoral researcher at the University of California at Berkeley. But, she cautions, "how is the public going to learn what led him to do this? The public is curious and it should have the right to respond. What are we supposed to do, just forget about it until the next time it happens?"
Baumer's point is that the videos posted to YouTube by Auvinen, who went by the username Sturmgeist89, may offer clues as to why he allegedly gunned down six of his classmates, the school nurse, and the school's headmistress.
Baumer had been studying political participation by young people on YouTube prior to the shootings. While researching anti-immigration groups, she stumbled upon some radical right-wing YouTube users with loose connections to Auvinen.
Hours before the shooting, authorities say Auvinen posted a video to YouTube that revealed his plans. Some of the other 88 clips he uploaded shed light on what Finnish police called Auvinen's "radical beliefs." A YouTube spokeswoman declined to comment.
In fairness, YouTube is in a no-win situation. Leave the videos up, and the company is accused of insensitivity. Take them down, and it's accused of censorship, regardless of the unpleasant nature of the material.
The world's most popular video-sharing site now finds itself in the middle of a heated debate over free speech, censorship, and whether the site is responsible for spotting criminals. Noel McNamara, president of the Crime Victims Support Association, was quoted Thursday in The Age, an Australian publication, saying he believes YouTube should filter for criminal or potentially criminal material.
This, according to John Palfrey, executive director of Harvard Law School's Berkman Center for Internet & Society, is too much to ask of any community site.
"The squeeze is definitely on YouTube," Palfrey said. "But it's nearly impossible to filter information as it goes online--whether you're filtering for copyright violations or hate speech. The only hope (YouTube) has, once they are notified by technology or human beings, is to take a look at something and then to take it down. I think you can require them to act responsibly after they know something is on the site."
Internet video allows anyone to broadcast to millions, but there is no regulatory body to supervise it. According to Palfrey, YouTube must judge for itself what content is acceptable, and is therefore thrust into the role of its own Federal Communications Commission (FCC).
"TV is now on the Web, and it conveys the same sort of power and impact," Palfrey said. "Yet there is no FCC or someone saying you can't show that. YouTube finds itself in the same position as a government regulator. It must make judgment calls in the same way the FCC draws the line for broadcast media."
According to reports, some of Auvinen's videos espoused violent, white supremacist, and neo-Nazi ideas. Many people are asking why this material was allowed on YouTube. Auvinen had previously held a YouTube account under the username NaturalSelector89, but was booted from the site for violating its user agreement, according to reports. He simply created a new account under the name Sturmgeist89, which means "storm spirit" in German.
Auvinen had more than 300 YouTube subscribers--meaning each time he uploaded a video they would automatically be alerted--and a large number of them shared his views, according to Baumer. "They would subscribe to each other's videos," she said.
CNET News.com reviewed clips from some of those connected to this group. The material contained montages of Nazi historical footage, endorsements of Timothy McVeigh, known as the "Oklahoma City bomber," and anti-Semitic slogans. Some featured still photographs of dead bodies and people firing guns.
One member of the group wrote this entry on his or her profile page: "Just another white prophet spreading white pride. Formerly (deleted) that was apparently banned for spreading racial realism, and the truth of white pride. I guess some people just 'can't handle the truth.' Back with a vengeance, hope to spread the message even further."
The user uploaded videos to YouTube with such titles as "Multicultural Nightmare," "Secure the Existence of the White Race," and "White Pride." None of these videos appears to violate YouTube's terms-of-use agreement, which bans "pornography, obscene or defamatory material."
To many, such materials are hate speech and should be removed. To others, removing them is censorship. Palfrey said he fully supports free speech, but as someone who helps run a blog network at Harvard, he frequently strikes material that he considers inappropriate.
He thinks the YouTube community, like any other, must adopt social norms that govern how people act.
Baumer concurs, but believes it should be left up to the YouTube community to decide what's appropriate. She said there are ample signs the community will make smart decisions.
Shortly after the shootings, dozens of people began logging on to the profile page of someone calling themselves NietzscheanSpirit, who had voiced support for Sturmgeist89's views, Baumer said. Most of the people who commented condemned both.
A search for NietzscheanSpirit's profile on Thursday showed that it had been closed by the account holder.
"I think there should be more of a dialogue between YouTubers," Baumer said. "Who at YouTube is going to make the decisions about what is appropriate? YouTube is not just a company but also a community of users. And they should be allowed to negotiate criteria for censorship."