
What is Webmaster Tools and the Manual Action for Webspam Update?

Updated on September 9, 2013
Google Webmaster Tools - Manual Actions for Webspam

How to Get a Webmaster Tools Account

What's Needed
A Gmail account
That Gmail account has to be associated with the Webmaster Tools product (the link is located in the links section of this hub)

What is Google Webmaster Tools?

SEO can be confusing for a webmaster who has just been introduced to it. An explanation of what Webmaster Tools is, and where it can be found, will better prepare a new webmaster for its first use.

Webmaster tools can be defined as a service used by a website administrator (webmaster) to better understand how data has been interpreted by a search engine crawler.

The reports consist of several main categories, some with sub-categories. All data is used to assist with the analysis of a website.

With this data, a professional is able to analyze many aspects of a website. Back-links, search queries, crawl data, index status and HTML issues along with many other reports can all be accessed from this service.

Google has always tried to be upfront about the way it analyzes and processes content found on the web. But, as the web is essentially its own evolving ecosystem, the way data is analyzed today will most likely change tomorrow.

This is the benefit of a Google webmaster tools account.

Google is never going to give away its algorithms, so no one will ever fully understand the way content is analyzed. But Google is more than happy to show how a website is performing and the issues that may be holding it back.

The webmaster tools being discussed in this hub are the Google-based Webmaster Tools.

To access and use Google webmaster tools, all that is needed is a Gmail account. Once a Gmail account is created, visit the webmaster tools link in the links section of this hub.

To keep it brief, so a new webmaster doesn't become overwhelmed, we will review the main features of webmaster tools.

Webmaster Tools Features - Overview

Site Dashboard - A broad view of core statistics is shown. Site messages, crawl errors, search queries and sitemap data are displayed.

Site Messages - Although a quick view of messages is displayed in the site dashboard, that quick view only shows unread messages. In Site Messages, every message ever received can be saved for future reference.

Structured Data - Markup added to a website's HTML so that rich snippets can be displayed. This helps a crawler better understand a page's content. It is best suited to informing search engines of reviews, upcoming events and any other important information that can be displayed underneath a search result. The schema markup vocabulary and other markup information can be found in the links section of this hub.
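As a rough illustration, a review marked up with the schema.org microdata vocabulary might look like the sketch below. The product, rating and reviewer names are invented placeholders, not values from any real page.

  <div itemscope itemtype="http://schema.org/Review">
    <span itemprop="itemReviewed">Acme Dog Leash</span>
    <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
      Rated <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span>
    </div>
    <span itemprop="author">Jane Doe</span>
  </div>

With markup like this in a page's HTML, a crawler can pick out the review and its rating, and Google may then display them as a rich snippet underneath the search result.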

Data Highlighter - An easy-to-use, point-and-click method of structured data markup. An alternative for someone unfamiliar with hand-coded markup.

HTML Improvements - Google will notify a webmaster of HTML issues that, if attended to, may help improve a website's performance and user-friendliness.

Site-links - Sitelinks are links to other pages on a website, displayed underneath the homepage in a search result.

Search Queries - The phrases a website is being found for in search, along with its average search result position, are displayed.

Links to Your Site - The total number of links a crawler has found to a website and where they're coming from on the web will be presented.

Internal Links - Links on a page that link to another page on that website.

Manual Actions - A manual action taken by Google in response to webspam that was found. This will be covered in detail in the next section.

Index Status - The number of pages found by a crawler and how many of those pages have been processed. Information on the number of indexed pages, pages removed, pages blocked by robots.txt, as well as the total URLs ever crawled, is accessible by clicking the Advanced tab.

Content Keywords - As Google crawls a website's content, words that are mentioned more frequently are found and tallied together. These are called 'keywords'.

Remove URLs - Old webpages that are due to be removed, or that are no longer of use to a website, can be removed from the index using this tool.

Crawl Errors - If Google encounters errors while trying to crawl a page, they will be displayed here.

Crawl Stats - Important aspects of a website's crawling statistics are reported using line graphs. Pages crawled, kilobytes downloaded and time spent downloading a page are each shown in their own graph.

Fetch as Google - A good feature that allows a user to see the way a search engine views their website. After a URL has been fetched, it can then be submitted for indexing. This can only be done a maximum of 500 times.

Blocked URLs - A view of a website's current robots.txt, the number of URLs being blocked, when the file was last downloaded and whether that download was successful is displayed.
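For reference, the robots.txt file being reported on is a plain text file at the root of a website. A minimal sketch, with made-up paths for illustration:

  User-agent: *
  Disallow: /private/
  Disallow: /checkout/

The User-agent line says which crawlers the rules apply to (* means all of them), and each Disallow line blocks a path. The Blocked URLs report counts how many URLs rules like these are keeping out of the crawl.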

URL Parameters - Use this tool to tell Google how to handle specific URL parameters. If a website uses multiple URLs that lead to the same page, duplicate content may become an issue. For example, a website's page is

awebsite.com/dogs.htm. But when a sessionid= parameter is added, it becomes a different address.

awebsite.com/shop/index.php?&highlight=big+dog&husky_id=1&sessionid=123&affid=431

This is where you can set which parameters of the URL Google should use. Only use this feature if you are familiar with how parameters work.
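A related, widely used remedy for duplicate addresses, separate from the URL Parameters tool, is a rel=canonical tag in the page's HTML. A sketch using the example page above:

  <link rel="canonical" href="http://awebsite.com/dogs.htm" />

Placed in the <head> of the parameter-laden version of the page, this tells Google that awebsite.com/dogs.htm is the preferred address to index, no matter which session or affiliate parameters a visitor arrived with.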

Malware - If malware is detected while crawling a website's content, a notification will be displayed in this console.

Webspam is now Feeling the Weight of Google's Algorithms

Doof, doof, doof! Normally the sound of a 16-year-old's stereo. Now, it's the customary welcome from Google's webspam update.

What Triggers a Manual Webspam Action from Google

The main causes of a manual webspam action from Google:
Unnatural links to a website
A website that's been hacked
User-generated comment spam
Having a domain hosted with a spammy host
Thin content with little or no substance
Cloaking or sneaky redirects
Hidden text (an example follows this list)
Keyword stuffing
General spam
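To make the hidden text item concrete, the sketch below shows the kind of markup that counts as webspam. The keywords are invented for illustration, and this is an example of what not to do:

  <!-- Text visitors never see, stuffed with keywords for the crawler -->
  <div style="display:none">
    cheap dog leashes best dog leashes buy dog leashes online
  </div>

Text hidden with display:none, white-on-white colouring or tiny font sizes, while left visible to crawlers, is precisely what a manual action for hidden text targets.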

Webspam Update - Manual Actions

As stated earlier, Google Webmaster Tools has been around for quite some time now.

Features have been added, and some have been removed. At times there's even beta testing of products intended for launch that aren't yet ready for primetime.

In the end, every feature and aspect of the service has been built to benefit a webmaster's understanding of how their website is doing in search and of the issues that need resolving.

The newest addition to the webmaster platform is the manual actions section.

This is the section where webspam associated with a webmaster's content will be displayed.

It's split into two sections.

Site-wide matches - If spam is associated with every page on a website, a site-wide match is what will be displayed.

The term site-wide references the point that every page on that website has an issue with spam.

Partial match - When spam has been found pointing to only a few of a website's URLs, not the entire website, a partial match will be reported. The pages that are affected will be displayed.

If a website ends up with a manual spam action, whether it's a site-wide or partial match, other information regarding the moderation is also shown.

Reason - Why Google placed the manual webspam action in the first place.

Affects - Depending on what action has been taken on a website, the effects will vary. Nonetheless, how the domain is affected will be listed.

Some effects may include:

The inability to rank for certain phrases that the website used to rank well for.

If a partial match is found, then the affected URLs may be blacklisted.

If a site-wide match is found, the entire domain may suffer and be blacklisted.

If a website does get penalized, its rank in search results may drop even after recovery. This may or may not be because of the manual action that was imposed.

The web is always changing. New content is uploaded every day. Maybe the website that was penalized naturally lost its search result position due to competition. Either way, once recovered, it is best to stick to white-hat SEO from then on, as next time may be that website's last.

© 2013 Martin Heeremans
