Starting in May, Google WMT has been reporting large numbers of crawl errors (>150) of the type
Is this related to "Related Search"?
Is this damaging my sub?
I have numerous errors like these as well (xml/stats/relatedhubevents.php?aid=). Thanks for posting in the forums; I will follow along. I hope these aren't hurting us in any way.
By the way, I also tend to get lots of WMT errors from the Topics pages on HP. These are on active (featured) hubs. Anyone else get these?
I see a lot of errors like that on my Webmaster Tools account too, and going to the links just brings me to a "page not found" error. Where did these errors and links come from?
I just click "Mark as fixed," but I don't think that really does anything, because sometimes I see the same errors again when I check my Webmaster Tools account.
I'm noticing crawl errors being reported in my WMT as well (when there never used to be any). I wonder if it is related to the subdomain issue?
Please see Paul Edmondson's response below.
I checked my sitemap, and it includes an entry for every page in my sub.
After every sitemap entry there is one of these
which is exactly what is appearing in the crawl errors.
The number of indexed pages missing from the Webmaster Tools list for my sub (194) is very close to the number of crawl errors of this type (56 now, but 150 a few days ago).
Hope this helps!
We added these and some other pages to sitemaps to get Google to crawl them and drop them, per advice from Google. The particular page you mentioned now returns a 403 (Forbidden), while other pages in the /xml file have been noindexed with an X-Robots-Tag header.
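Those two behaviours can be sketched in a few lines. This is a minimal, illustrative WSGI app, assuming hypothetical paths; it is not HubPages' actual server code:

```python
# Sketch of the two behaviours described above (paths are illustrative):
# stats-style pages return 403 Forbidden, while other /xml pages are
# served normally but carry an X-Robots-Tag: noindex header so crawlers
# drop them from the index.
def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith("/xml/stats/"):
        # e.g. /xml/stats/relatedhubevents.php -> 403 Forbidden
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"Forbidden"]
    if path.startswith("/xml/"):
        # Served normally, but flagged so search engines de-index it.
        start_response("200 OK", [("Content-Type", "text/html"),
                                  ("X-Robots-Tag", "noindex")])
        return [b"<html>noindexed page</html>"]
    # Regular hub pages get no special header.
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>a normal hub page</html>"]
```

Either way, the page stops appearing in search results once Google recrawls it, which is why these URLs show up as crawl errors first.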
They will show up as crawl errors in Webmaster Tools, but they shouldn't hurt your site. We are working on reducing the total number of HubPages pages in Google's index. It takes many months for them to fall out, but in five or six months the total number of pages indexed should be reduced by a significant percentage, including the pages that are indexed URL only. For those not familiar, URL-only pages happen when Google picks up a URL but the robots.txt file prevents Google from crawling the page, so the page is added to the index as URL only.
We are watching this closely, and it isn't anything to worry about if you see 403, 404, or 410 crawl errors. Here is an example of URLs we would like Google to drop:
https://www.google.com/search?q=site%3A … mp;bih=906
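The "URL only" mechanism Paul describes can be reproduced with Python's standard-library robots.txt parser. The rules and hostname below are hypothetical, purely to illustrate the effect:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules that block utility pages from crawling.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /xml/",
])

# Googlebot may not fetch a blocked page; if it still discovers the URL
# (via a link or a sitemap entry), it can only index it "URL only" --
# no content is read, so the page carries no on-page ranking signals.
print(rp.can_fetch("*", "https://example.hubpages.com/xml/stats/relatedhubevents.php"))  # False
print(rp.can_fetch("*", "https://example.hubpages.com/some-hub"))  # True
```

Returning 403/410 instead (or adding a noindex header) lets Google drop the URL entirely rather than keep it as a URL-only entry.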
I can see what you are talking about here. Thanks for the explanation, Paul.
Your reply is very helpful and clarifies things.
It is an old, old story, but the transition to subs will remain incomplete while the topic sitemaps and all the other site-wide listings remain (latest, hot, etc.).
Just curious to know why the June 2011 threshold applies for hub SERP listings with the HP (topic) URL rather than the sub URL, and whether this affects rankings and ratings. Several people (e.g. Marisa W) have reported that older hubs were less affected by the recent Panda squat.
Cheers and thanks again.
PS: does "so the page is added to the index as URL only" mean that the page is not ranked or rated by Google, since the bot does not crawl it? What are the consequences?
by Brian Dooling, 22 months ago
I know there's been an update on Google that has cut Hubpages traffic by 20% across the board but I noticed on all of my popular hubs that Google and Bing haven't crawled them in 10 days! Is this normal? Or has Google...
by Paul Maplesden, 4 years ago
Hi there, I've been hearing discussions that noindexing hubs (because they are idled) impacts their backlinks in some way, and I'd like to find out if this is true. I've been researching this a bit online and have found...
by CMHypno, 5 years ago
I'm having a wade through Google Webmaster Tools, and under crawl errors there were about 20 URLs not found and about 60 restricted by robots.txt. Is this very damaging, and would I be able to make any necessary changes?
by Dr. John Anderson, 4 years ago
Elsewhere on the forums people are reporting very long delays in getting indexed by Google - the delays seem to be increasing. Obviously this has been caused by the NOINDEX tag added to new hubs when under review and to...
by rachel.A, 5 years ago
I submitted a new Hub last Tuesday; it's a unique article, and I didn't submit it anywhere else. But Google has not crawled my post yet. Search engines do crawl HubPages regularly, so why hasn't it come to my post?
by Dr. John Anderson, 3 years ago
A simple question. About 100 pages or so are added to HP every day. Recently HP changed its mind, and allowed thousands of un-featured pages to be indexed. You can see the spikes in the stats in the image, but the stats...
Copyright © 2017 HubPages Inc. and respective owners.
Other product and company names shown may be trademarks of their respective owners.
HubPages® is a registered Service Mark of HubPages, Inc.
HubPages and Hubbers (authors) may earn revenue on this page based on affiliate relationships and advertisements with partners including Amazon, Google, and others.