
Google algorithm change tackles content copying

Company says that with the tweak, search results are more likely to show sites that produced original content rather than sites that scraped or copied it.

Sam Diaz, Senior Editor, ZDNet
Sam Diaz is a senior editor at ZDNet. He has been a technology and business blogger, reporter, and editor at the Washington Post, San Jose Mercury News, and Fresno Bee for more than 18 years.

In a blog post last week, Matt Cutts, head of Google's Webspam team, wrote about the progress the team has made in reducing the amount of spam in search engine results. In that post, he hinted at some changes in the works to push spam levels lower, including one that affects sites that copy content from other sites, as well as those that have low levels of original content.

Clearly, there's a blurry line there--or a "slippery slope," as Larry Dignan called it in his own post, which raised red flags about how the quality of a site would be judged.

On Friday, Cutts posted a follow-up to last week's post on his own blog, announcing that one specific change to the algorithm had been approved at the team's weekly meeting and launched earlier in the week. In the post, Cutts explains:

This was a pretty targeted launch: slightly over 2 percent of queries change in some way, but less than half a percent of search results change enough that someone might really notice. The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content.

When you're a search engine that processes billions of searches, small percentages equal big numbers--so, for Google, this is still a pretty significant change.

Read more of "Google algorithm change tackles content copying?" at ZDNet's Googling Google.