
Google's double standard on user-generated content

Prescreening content uploaded to YouTube for copyright violations is impossible, Google argues constantly. However, it requires its AdSense partners to do just that.

Tom Krazit Former Staff writer, CNET News

Updated 5:05 p.m. PST with comment from Google.

When it comes to user-generated content, Google has adopted a "do as I say, not as I do" policy.

Google's AdSense team attempted to sort out fact and fiction today but instead exposed a double standard.

The company's AdSense team sent out a reminder to its partners today that contained a few jaw-dropping statements about Google's policies on the content produced by AdSense partners. "You are responsible for ensuring that all of your content, including user-generated content such as forum posts, blog comments or outside feeds, is in compliance with AdSense policies on any page or site for which you've enabled AdSense ads," Google declared, using a bold font just in case anybody missed the sentence.

Those who have followed Google's legal battles with the media industry likely spit out their coffee when they read that statement. Google has long declared that it is not responsible for content uploaded by users to YouTube and other Google properties, even if that content violates the law.

That was the heart of its successful argument against a lawsuit filed by Viacom, citing "safe harbor" provisions of the Digital Millennium Copyright Act that protected Google against claims over copyrighted content uploaded by YouTube users. That's also what it told courts in Italy, arguing there was simply no way it could monitor every video uploaded to YouTube, where 35 hours of video arrive every minute.

In a presentation for the press (PDF) created specifically for the Viacom-Google case, Google wrote the following:

When YouTube says that copyright owners, not Internet video hosting services, should take primary responsibility for policing any misuse of their copyrights, it is saying exactly what Congress expected when it passed the DMCA. And that approach makes perfect sense: YouTube doesn't know which videos are among the thousands that Viacom and its stealth-marketers uploaded to YouTube, just like YouTube doesn't know which videos are among the countless videos that Viacom deliberately chose not to take down for promotional reasons.

When its AdSense partners were the audience, it wrote this:

Making sure content complies with our policies can be complex when factoring in user-generated content. Keeping tabs on the hundreds (or even thousands!) of videos, blog posts, photos, tweets, and comments that can come in every day is a massive undertaking. However, you are ultimately responsible for all sites on which you have placed your ad code, regardless of whether you own or have produced the content.

Of course, Google doesn't completely shirk responsibility for user-generated content uploaded to YouTube: it will remove content when it is flagged by the owner or other users. But later in the post, it uncorked this gem:

We understand that it can take time to find the best solution to prevent problematic content from appearing on your site and we want to work with you to give you the time needed to find a fix. You should ensure, though, that you are able to effectively screen all content proactively before choosing to monetize user-generated content. If it becomes evident that a publisher is unable to do so, or if the violation is continuing or egregious, then we will disable an account.

Now, Google is perfectly within its rights to adopt this policy. The AdSense network (in which Web publishers get a cut from clicks on ad units on their sites containing text ads sold by Google) is a private network, and if Google decides that subsidizing porn, piracy, or racism is not in its best interests, there's no reason it has to do so.

But Google is holding its AdSense partners to a standard that it is incapable of meeting itself. Google has said time and time again that it can't proactively screen every user-generated video uploaded to YouTube because of the sheer volume of content that arrives every day. It has developed systems for identifying illegal or offensive content, but that processing takes place after the initial submission, not before.

So what's this really about? Google has not-so-subtly started to enforce stronger policies against copyright infringement, deleting references to torrent sites from Google Autocomplete and Google Instant and enforcing standards against piracy sites funded in part by AdSense. That's believed to be in anticipation of deals with content companies on music services, and possibly a reversal of those companies' stance on Google TV.

As part of that shift in December, Google vowed to "improve our AdSense antipiracy review," which likely prompted today's reminder on AdSense policies. Yet in that same blog post on its tougher antipiracy practices, Google stopped short of declaring that its own sites would "effectively screen all content proactively," as it requires AdSense partners to do, vowing instead to "reduce our average response time to 24 hours or less."

You can't help but roll your eyes. Would AdSense ads be allowed on YouTube?

Google did not respond to a request for comment.

Updated 5:05 p.m. PST: Google finally responded, saying that it does not require sites to prescreen content. Its statement follows.

"All publishers on whose sites we serve "Ads by Google" -- including Google sites -- are required to comply with AdSense program policies. Our policies do not require any site to pre-screen all content; rather, the policy requires publishers to take steps to keep the pages on which ads appear free from problematic content (eg pornography) that violates any of our polices. There are a number of ways in which a publisher could achieve this goal without pre-screening content - as suggested here. As a responsible publisher, YouTube employs these sorts of tools."