Google Weeds Out "Content Farms"
Content Farms Under Fire
February 25, 2011
This morning Google announced that it has updated the algorithms used to rank websites in an attempt to remove spurious entries from content farms.
Algorithm: In computer science, an algorithm is an effective method, expressed as a list of well-defined steps, for performing a particular task or procedure.
In this case the algorithm is a set of instructions designed to weed out of search results those sites with copied or low-quality content (a toy sketch of one such check appears just below these definitions).
Content Farm: A company that employs large numbers of writers to generate textual content. The aim is to satisfy search engine procedures (algorithms) in order to appear in the maximum number of search results and thereby generate advertising revenue.
By Google's use of the term, content farms are specifically defined as sites that generate low-quality content for the sole purpose of generating search engine hits. As such, HubPages falls well outside the bounds of the targeted sites.
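To make that "set of instructions" idea concrete, here is a minimal sketch of one way copied content can be detected, using overlapping word sequences ("shingles"). To be clear, this is my own toy illustration; Google has never published its actual method, and every function name and sample string below is invented.

```python
# Illustrative only: a naive duplicate-content check based on word
# "shingles" (overlapping runs of words). Google's real algorithm is
# unpublished; every name and value here is invented.

def shingles(text, size=4):
    """Break text into overlapping word n-grams (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def copied_fraction(candidate, original):
    """Fraction of the candidate's shingles that also appear in the original."""
    cand = shingles(candidate)
    if not cand:
        return 0.0
    return len(cand & shingles(original)) / len(cand)

original = "google updated its ranking algorithm to demote low quality pages"
scraped = "google updated its ranking algorithm to demote low quality pages today"

# A high overlap suggests the candidate page copied the original.
print(f"{copied_fraction(scraped, original):.0%} of shingles match")
```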
Google Begins Filtering Results
In January Google expressed its desire to hide from search results those sites that generate "shallow" and "low-quality" content for the sole purpose of driving ad revenue traffic.
Today Google claims to have done just that with a new algorithm that will affect nearly twelve percent (12%) of all search results in the U.S., pushing to the bottom of search results those sites specifically designed to gain high ranking scores with questionable content.
“Many of the changes we make are so subtle that very few people notice them,” writes Google’s webspam team on the Official Google Blog. “But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on.”
Google claims that the update does not simply push low-quality sites to the bottom of the list; it also promotes "high-quality" sites with "original content and information such as research, in-depth reports, thoughtful analysis."
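Conceptually, then, the update works like a re-ranking pass rather than outright removal: every page still appears, but a quality signal decides where. The sketch below is only my own toy model of that behavior; the example URLs and numeric quality scores are invented, and Google's real signals remain secret.

```python
# Illustrative re-ranking: low-quality pages sink, high-quality pages rise.
# The URLs and quality scores below are invented for demonstration.

results = [
    {"url": "contentfarm.example/how-to-anything", "quality": 0.11},
    {"url": "research-journal.example/in-depth-report", "quality": 0.92},
    {"url": "news-site.example/thoughtful-analysis", "quality": 0.78},
]

# Sort by the (hypothetical) quality signal instead of dropping anything.
reranked = sorted(results, key=lambda page: page["quality"], reverse=True)

for rank, page in enumerate(reranked, start=1):
    print(rank, page["url"])
```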
Personal Ad Blocker
If you are a regular reader you know that I covered Google's latest add-on to Chrome, Personal Ad Blocker. Of course, this blocking requires the participation of the sites that actually generate these ads, so the effectiveness of the extension has yet to be proven.
Validation of Effort
What Personal Ad Blocker also does is report back to Google which sites most users would rather not have show up in search results. Though this extension is somewhat new, Google claims that it has already helped the company zero in on eighty-four percent (84%) of the top 12 sites blocked by users of the Chrome extension.
To clarify, Google did not use the Chrome extension to help filter out content-farm sites, but it did use the information to verify that the new algorithm is doing what it was designed to do: filter "garbage" content out of search results.
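A back-of-the-envelope version of that validation is easy to picture: take the set of domains users blocked with the extension, take the set the algorithm demotes, and measure the overlap. Google's 84% figure is a measurement of exactly this kind. The domain lists below are, of course, made up.

```python
# Illustrative validation: what share of user-blocked sites does the new
# algorithm also demote? (Both domain lists are hypothetical.)

user_blocked = {
    "farm-a.example", "farm-b.example", "farm-c.example",
    "farm-d.example", "farm-e.example",
}
algorithm_demoted = {
    "farm-a.example", "farm-b.example", "farm-c.example",
    "farm-d.example", "unrelated-site.example",
}

agreement = len(user_blocked & algorithm_demoted) / len(user_blocked)
print(f"The algorithm catches {agreement:.0%} of user-blocked sites")  # 80%
```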
Careful Wording
Google has been careful not to use the term "content farm" in describing what this new procedure does. Rather, the search giant prefers phrases such as "low-quality" and "questionable content" to describe the targets of the new procedures.
Though Google is careful in its use of terms, Matt Cutts, Google's head spam-fighter, says:
“I think people will get the idea of the types of sites we’re talking about.”
Examples of such sites include Associated Content and Demand Media. One of Demand Media's best-known sites, eHow.com, routinely copies others' content and presents it as its own.
Considering the number of returns on most searches, a twelve percent change (downward, of course) is a huge number. For example, searching for the term "MacBook Pro" returns sixty-four million hits on Google. A twelve percent reduction would remove roughly seven and a half million of those hits, leaving about fifty-six million.
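For the curious, the corrected arithmetic is a two-liner (the sixty-four-million figure is simply the sample search above):

```python
# Checking the "MacBook Pro" example: a 12% cut of 64 million hits.
hits = 64_000_000
removed = 0.12 * hits        # 7,680,000 hits demoted
remaining = hits - removed   # 56,320,000 hits remaining
print(f"removed ≈ {removed / 1e6:.1f} million, remaining ≈ {remaining / 1e6:.1f} million")
```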
Coda
To my way of thinking this is long (long) overdue. The sheer number of hits from any search term indicates that there needs to be a real and concerted "house-cleaning" of sites that represent themselves as one thing yet present another.
Google, for its part, has not shared, nor will it likely share, the actual code behind the content filter. This makes perfect sense: if sites that provide "low-quality" content knew how the new algorithm works, they could work to circumvent it, and that would put Google right back at "square one" in filtering spurious content.
This is such a serious problem, in fact, that if another search engine came along with a better handle on it, it could quickly bump Google from its top spot as the de facto standard search engine.
Will Bing, Yahoo, and the other search engine providers follow? It is likely, but it will take time. Once again Google is one step ahead of the competition in this effort.
Disclaimer
The author was not compensated in any way, monetarily, with discounts, or freebies by any of the companies mentioned.
Though the author does make a small profit based on the word count of this article, none of it comes directly from the companies mentioned. The author also stands to make a small profit from advertising attached to this article.
The author has no control over either the advertising or the contents of those ads.