
What is Webmaster Tools and the Manual Action for Webspam Update?

Updated on September 9, 2013
Google Webmaster Tools - Manual Actions for Webspam

How to Get a Webmaster Tools Account

What's Needed
A Gmail account
That Gmail account has to be associated with the webmaster tools product (the link is located in the links section of this hub)

What is Google Webmaster Tools?

SEO can be confusing for a webmaster who has just been introduced to it. An explanation of what webmaster tools is and where to find it will better prepare a new user for their first visit.

Webmaster tools can be defined as a service used by a website administrator (webmaster) to better understand how data has been interpreted by a search engine crawler.

The reports consist of several main categories, some with sub-categories. All data is used to assist with the analysis of a website.

With this data, a professional is able to analyze many aspects of a website. Back-links, search queries, crawl data, index status and HTML issues along with many other reports can all be accessed from this service.

Google has always tried to be upfront about the way it analyzes and processes content found on the web. But, as the web is essentially its own evolving ecosystem, the way data is analyzed today will most likely change tomorrow.

This is the benefit of a Google webmaster tools account.

Google is never going to give away its algorithms, so no one will ever fully understand the way content is analyzed. But Google is more than happy to show how a website is performing and the issues that may be holding it back.

The webmaster tools being discussed in this hub are the Google-based webmaster tools.

To access and use Google webmaster tools, all that is needed is a Gmail account. Once a Gmail account is created, visit the webmaster tools link in the links section of this hub.

To keep it brief, so a new webmaster doesn't become overwhelmed, we will review the main features of webmaster tools.

Webmaster Tools Features - Overview

Site Dashboard - A broad view of core statistics is shown. Site messages, crawl errors, search queries and sitemap data are displayed.

Site Messages - Although a quick view of messages is displayed in the site dashboard, the quick view only informs of unread messages. With site messages, all messages ever received can be saved for future reference.

Structured Data - Rich snippet markup inserted into a website's HTML. This helps a crawler better understand a page's content. It is best suited to informing search engines about reviews, upcoming events and any other important information that can be displayed underneath a search result. The schema markup vocabulary and other markup information can be found in the links section of this hub.
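
As a rough illustration only, the Python sketch below prints a small schema.org Review as a JSON-LD script tag that could be pasted into a page. JSON-LD is just one way of writing schema.org markup (microdata embedded directly in the HTML works as well), and every value here is made up for the example.

    import json

    # Hypothetical review data; real values would come from the page itself.
    review = {
        "@context": "https://schema.org",
        "@type": "Review",
        "itemReviewed": {"@type": "Product", "name": "Example Dog Kennel"},
        "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
        "author": {"@type": "Person", "name": "Jane Doe"},
    }

    # The markup that would be placed inside the page's HTML.
    print('<script type="application/ld+json">')
    print(json.dumps(review, indent=2))
    print("</script>")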

Data Highlighter - An easy-to-use, point-and-click method of structured data markup. An alternative for someone unfamiliar with the HTML markup approach above.

HTML improvements - Google will notify a webmaster of HTML issues that, if attended to, may help improve a website's performance and user-friendliness.

Site-links - Site-links are links to other pages on a website. These are displayed underneath the homepage in a search result.

Search Queries - The phrases a website is being found for, and its average search result position, are displayed.

Links to Your Site - The total number of links a crawler has found to a website and where they're coming from on the web will be presented.

Internal Links - Links on a page that link to another page on that website.
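
To see how the distinction works, here is a minimal Python sketch that classifies the links found on a page as internal or external by comparing hostnames; the page URL and the hrefs are invented for the example.

    from urllib.parse import urljoin, urlparse

    page_url = "http://awebsite.com/dogs.htm"      # hypothetical page
    hrefs = ["/shop/index.php",                    # relative, so internal
             "http://awebsite.com/contact.htm",
             "http://othersite.com/dogs"]

    site_host = urlparse(page_url).netloc
    for href in hrefs:
        absolute = urljoin(page_url, href)         # resolve relative links
        kind = "internal" if urlparse(absolute).netloc == site_host else "external"
        print(kind, absolute)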

Manual Actions - A manual action from Google in response to webspam that has been found. This will be covered in detail in the next section.

Index Status - The number of pages found by a crawler and how many of those pages have been processed. Information regarding the number of indexed pages, pages removed, pages blocked by robots.txt, as well as the total URLs ever crawled, is accessible by clicking the advanced tab.

Content Keywords - As Google crawls a website's content, words that are mentioned more frequently are found and tallied together. These are called 'keywords'.
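
The idea can be sketched in a few lines of Python: strip the text down to words and tally how often each one appears. This is only a rough approximation, not Google's actual method, and the sample text is made up.

    import re
    from collections import Counter

    text = "Big dogs need big kennels. A husky is a big dog, and a husky sheds."

    words = re.findall(r"[a-z']+", text.lower())
    stop_words = {"a", "an", "and", "the", "is", "of", "to"}   # crude stop-word list
    keywords = Counter(w for w in words if w not in stop_words)

    print(keywords.most_common(5))    # e.g. [('big', 3), ('husky', 2), ...]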

Remove URLs - Old webpages that are due to be removed, or that are no longer of use to a website, can be removed from the index using this tool.

Crawl Errors - If Google encounters errors while trying to crawl a page, they will be displayed here.
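
A webmaster can catch many of the same problems before Google does by requesting the pages directly. A small sketch using only Python's standard library; the URLs are hypothetical and a real run would read them from a sitemap.

    from urllib.request import urlopen
    from urllib.error import HTTPError, URLError

    # Hypothetical URLs to spot-check.
    for url in ("http://awebsite.com/dogs.htm", "http://awebsite.com/old-page.htm"):
        try:
            print(url, urlopen(url, timeout=10).getcode())
        except HTTPError as err:       # 404 Not Found, 500 Server Error, etc.
            print(url, "error", err.code)
        except URLError as err:        # DNS failure, refused connection, timeout
            print(url, "unreachable", err.reason)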

Crawl Stats - Important aspects of a website's crawling statistics are reported using line graphs. Pages crawled, kilobytes downloaded and time spent downloading a page are each shown in their own graph.

Fetch as Google - This is a good feature that allows a user to see the way a search engine views their website. After a URL has been fetched, it can then be submitted for indexing. This can only be done a maximum of 500 times.

Blocked URLs - A view of a website's current robots.txt, the number of URLs being blocked, the last time the file was downloaded and whether that download was successful is displayed.
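
The same question can be answered locally with Python's standard library; this sketch checks whether a couple of made-up URLs would be blocked for Googlebot by a site's robots.txt. The domain is hypothetical, so point it at your own site.

    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser("http://awebsite.com/robots.txt")   # hypothetical domain
    robots.read()    # downloads and parses the file

    for url in ("http://awebsite.com/dogs.htm", "http://awebsite.com/shop/checkout"):
        allowed = robots.can_fetch("Googlebot", url)
        print(url, "allowed" if allowed else "blocked by robots.txt")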

URL Parameters - Use this tool to tell Google how to handle specific URL parameters. If a website uses multiple URLs that lead to the same page, duplicate content may become an issue. For example, a website's page is

awebsite.com/dogs.htm. But when session and affiliate tags such as sessionid= are used, a page ends up with a different address:

awebsite.com/shop/index.php?&highlight=big+dog&husky_id=1&sessionid=123&affid=431

This is where you can set which parameters of the URL should be used and which should be ignored. Only use this feature if you are familiar with how parameters work.
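
The same idea can be shown as a rough Python sketch that collapses URLs differing only in parameters assumed not to change the page; the parameter names and URL come from the example above.

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    IGNORED = {"sessionid", "affid"}   # assumed not to change the page content

    def canonical(url):
        parts = urlparse(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
        return urlunparse(parts._replace(query=urlencode(kept)))

    url = "awebsite.com/shop/index.php?&highlight=big+dog&husky_id=1&sessionid=123&affid=431"
    print(canonical(url))
    # awebsite.com/shop/index.php?highlight=big+dog&husky_id=1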

Malware - If malware is detected while crawling a website's content, a notification will be displayed in this console.

Webspam is Now Feeling the Weight of Google's Algorithms

Doof, doof, doof! Normally the sound of a 16-year-old's stereo. Now it's the customary welcome from Google's webspam update.

What Triggers a Manual Webspam Action from Google

The main causes of a manual webspam action from Google
Unnatural links to a website
A website that's been hacked
User generated comment spam
Having a domain hosted with a spammy host
Content that is minimal at best, with no substance
Cloaking or sneaky redirects
Hidden text
Keyword stuffing
General spam

Webspam Update - Manual Actions

As stated earlier, Google webmaster tools has been around for quite some time now.

Features have been added, and some have been removed. At times there's even beta testing of products intended for launch that aren't yet ready for primetime.

In the end, every feature and aspect of the service has been designed to benefit a webmaster's understanding of how their website is doing in search and the issues that need resolving.

The newest addition to the webmaster platform is the manual actions section.

This is the section where any webspam that has been associated with a webmaster's content will be displayed.

It's split into two sections.

Site-wide matches - If a website has spam that is associated with every page on that website, then a site-wide match will be displayed.

The term site-wide refers to the fact that every page on that website has issues with spam.

Partial match - When spam has been found and it only affects a few pages' URLs, not the entire website, a partial match will be reported. The pages that are affected will be displayed.

If a website ends up with a manual spam action, whether it's a site-wide or partial match, other information regarding the action is also shown.

Reason - The reasoning behind Google's decision to place the manual webspam action in the first place.

Affects - Depending on what action has been taken on a website, the effects will vary. Nonetheless, how it's affecting the domain will be listed.

Some effects may include:

The inability to rank for certain phrases that the website used to rank well for.

If a partial match is found, then the affected URLs may be blacklisted.

If a site-wide match is found, the entire domain may suffer and be blacklisted.

If a website does get penalized, then after recovery the website's rank in search results may drop. This may or may not be because of the manual action that was imposed.

The web is always changing. New content is uploaded every day. Maybe the website that was penalized naturally lost its search position due to competition. Either way, once recovered, it is best to stick to white-hat SEO from then on, as next time may be that website's last.

© 2013 Martin Heeremans
