Sen. Joe Lieberman wants YouTube and its rivals to delete any videos produced by al-Qaida, other Islamic terrorist groups, and any suspected sympathizers. But because there's no U.S. law requiring deletion--at least not yet--there's not much the onetime veep candidate can do except complain.
On Monday, the chairman of the U.S. Senate's Homeland Security and Governmental Affairs Committee suggested in a letter to Google CEO Eric Schmidt that the company wasn't doing enough to remove videos that are violent or could be used by terrorist groups to enlist followers. "By taking action to curtail the use of YouTube to disseminate the goals and methods of those who wish to kill innocent civilians, Google will make a singularly important contribution to this important national effort," Lieberman wrote.
On the other hand, there's no national consensus on censoring YouTube, and there's actually evidence that U.S. spy agencies like being able to monitor what their adversaries are doing online. In addition, scholars routinely evaluate al-Qaida videos as part of their research, in much the same way as a previous generation researched Nazi propaganda (which has become part of an online exhibit by the U.S. Holocaust Museum).
Google, for its part, said its YouTube administrators had reviewed videos flagged by Lieberman's staff last week and pulled down an unspecified number, but only if they contained violence or hate speech. YouTube's community guidelines do prohibit a number of categories of content, including "graphic or gratuitous violence" and depictions of "bad stuff" like "bomb making."
"While we respect and understand his views, YouTube encourages free speech and defends everyone's right to express unpopular points of view," the company said on its official blog. "We believe that YouTube is a richer and more relevant platform for users precisely because it hosts a diverse range of views, and rather than stifle debate we allow our users to view all acceptable content and make up their own minds."
At the moment, legally speaking, Google and other Web hosts aren't required to censor what their users post. That's because of a piece of federal law known as Section 230 of the Communications Decency Act. Under that law, Web hosts are free to restrict access to content that they deem "objectionable"--if they choose--but in general they can't be sued for choosing not to do so.
"Under section 230, YouTube has no obligation to review this kind of content," said John Morris, an attorney for the Center for Democracy and Technology, an advocacy group that has received funding from Google. "The policy judgment that underlies Section 230 is that speech on the Internet--and indeed commerce and everything else on the Internet--would be radically harmed if sites had the responsibility to review every single bit of posting and content that their users put up there."
Of course, as a senator, Lieberman could try to rewrite Section 230, and there are some hints (and in tech-policy circles, much speculation) that he'll do just that. Without mentioning that law, Lieberman has hinted that he may want to create some sort of new exception to those existing rules, saying in his letter to Google that removal of such material "should be a straightforward task since so many of the Islamist terrorist organizations brand their material with logos or icons identifying their provenance."
Leslie Phillips, a spokeswoman for the Senate committee that Lieberman leads, told News.com on Monday afternoon that her boss found Google's response to be unsatisfactory and was troubled that the company "does not appear willing to change its guidelines to prevent foreign terrorist organizations (as designated by the State Department) from posting videos used to radicalize followers and incite them to violence." She declined to comment on the status of any pending legislation.
Concern about terrorists mobilizing through online venues is hardly new for Lieberman and other senators, who held a hearing last year on the topic and recently released a report calling for the government to coordinate strategies for counteracting terrorist messages on the Internet.
If Lieberman were to seek to prohibit the distribution of certain videos through federal law, the effort would most likely run into First Amendment difficulties.
"Certainly that's an appropriate thing for Congress to do, to restrict financial support to terrorist organizations (through the State Department watch list)," CDT's Morris said. "It's quite a different thing to say that the ideas...that, for example, some people in this world do not like America, that those ideas are ideas that need to be censored in this country. It is anathema to what this country stands for."
Beyond that, such a policy could also be impractical on a few levels, Morris said. First, it may be difficult to determine whether terrorist organizations themselves are posting the content. Second, intelligence agencies reportedly monitor terrorist activity online as a way of helping to track those organizations and potentially prevent attacks.
Eric Goldman, an assistant professor of law at Santa Clara University, noted that YouTube censorship isn't exactly a new idea: a non-binding resolution introduced in the House of Representatives last year (that got stuck in committee) called on user-posted video sites to do precisely that. "So what if Google/YouTube suppressed these videos?" Goldman said. "They would still be available online somewhere, so why do politicians care if they are hosted on Google/YouTube vs. somewhere else? What a silly PR stunt by Lieberman."
News.com's Declan McCullagh contributed to this report.