There has been a steady increase in the number of 410 and 404 errors reported in Webmaster Tools since the end of October.
I have a ton of crawl errors, but when I open the URL, it's not the URL of a hub, just a messed-up version of it.
Crawl error for:
This is unrelated to my deletion of several of my questions; those haven't shown up in Webmaster Tools yet.
I'm glad you're asking, Melanie, because I have noticed the same thing. Is it a site-wide problem then?
Same thing's happening to me - have no idea if it's important or not, but it has been escalating over the last few weeks.
I'm doing some follow-up on our end to see if this is related to something we did or is an issue on Google's end.
Thanks for taking time to report it.
I found some more info related to the issue you described.
The URLs of this form were never "real" URLs, but were used to track traffic by category for Google Analytics.
We stopped using URLs in this format and deliberately set them up to return 410 errors so that Google would drop them.
OK, but the number of errors is increasing greatly - I've had more than 200 this month! G is not dropping them. It can't be good for each author's reputation!
I'm guessing that maybe Google is taking its time checking those URLs for content, so it's just lazily adding 404 and 410 pages as it gradually comes across them (which is why they appear to be increasing). Just my stab in the dark.
In the past it's taken quite a while for Google to drop the 410 and 404 notifications from webmaster tools... unfortunately not one of those "in a day's work" type of things in my experience.
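For anyone who wants to verify what a flagged URL actually returns right now (rather than waiting for Webmaster Tools to update), here's a minimal Python sketch. The example URL in the comment is hypothetical; nothing here is specific to HubPages:

```python
import urllib.request
import urllib.error

# Status codes that tell Google a page is gone. 410 ("Gone") is the
# explicit "drop this URL" signal; 404 ("Not Found") is treated
# similarly, just typically a bit more slowly.
GONE_CODES = {404, 410}

def check_status(url):
    """Fetch a URL and return its HTTP status code."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses arrive as exceptions; the code is on the error.
        return err.code

def is_gone(status):
    """True if the status code signals the page was removed on purpose."""
    return status in GONE_CODES

# Example usage (hypothetical URL copied from a crawl-error report):
# status = check_status("https://example.com/some-old-tracking-url")
# print(status, is_gone(status))
```

If a URL from the crawl-error report really does return 410, that matches what HubPages describes above, and it should eventually drop out of the report on its own.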
Melbel: I just looked your URL up in Google search. Look at what shows up as a crawl error. Think about this: when we create hubs, we select the category, the sub-category, and maybe even another one. It looks like the dang categories we have to select from are showing up as part of the URL. HP needs to address these issues with writers and stop ignoring us. The stupid part is that writers keep having issues, yet there are writers who just keep on publishing hubs. I think this is not so much Google as it is HP.
My guess is that perhaps HubPages is either:
-blocking Google from indexing categories
-changing the site architecture in terms of categories
with Google formerly indexing hubs as a part of their categories:
when they actually belong on their very own url:
A 404 ain't gonna hurt ya; Google eventually works them out. I just wanted to alert HP to the fact that we're getting a bunch of these status codes in case it's something unintended.
This happened once before and was fixed. Looks like they've changed something and it's happening again.
Melbel, yes, I have the exact same thing going on. LindaSmith1, I too have numerous crawl errors with URLs that look like categories of my hubs. How do we alert HubPages to this issue? Or Marisa, has it already been brought to their attention?
I started noticing this in September. Every day, different Hubs showed the errors. That's when I started moving my content to other sites. (And no, none of my Hubs were in zZz land.)
by Blake Flannery 4 years ago
This is just one example of many errors showing up in Google webmaster tools that appear to have category information added into the url of my hub creating a link to a non-existent hub. In the link below,...
by Tony 3 years ago
Having just started to look at Webmaster Tools, this may be in the wrong section, and it may be just my complete lack of understanding, but: Google Webmaster Tools and the "index status" tab should return a graph...
by Marisa Wright 4 years ago
I just checked my Webmaster Tools and can see a sudden, dramatic rise in strange 404 messages - and that rise coincides with my fall in traffic. Can others check to see if they have the same? ...
by Suzanne Day 3 years ago
I can see some of my new hubs in Google when I search for "hub name". But in Webmaster Tools, I submitted a sitemap and it says 38 URLs submitted and 10 indexed. I used Fetch as Google for a few new hubs and...
by CMHypno 5 years ago
I'm having a wade through Google Webmaster Tools, and under crawl errors there were about 20 URLs not found and about 60 restricted by robots.txt. Is this very damaging, and would I be able to make any necessary changes?
by Dorian Bodnariuc 12 months ago
"Paranoia," some of you must think, and honestly, I used to think the same. I didn't even believe that this was possible, even though Google mentioned that they have ways to detect link spamming. But the...
Copyright © 2017 HubPages Inc. and respective owners.