Here's a post from googlewebmastercentral posted yesterday:
http://googlewebmastercentral.blogspot. … ernal.html
Look at the table where they show the differences. Previously scholar.google.com/ and sketchup.google.com/ were external links, now they are internal links.
It's probably a response to the subdomain switch. It's not HP per se, it's the fact that everyone is stampeding to copy HP - articlesbase, suite101, and numerous others read Paul's post about what he was going to do and then copied him! Next time perhaps the HP team should just mention things in this forum - no need for press releases shouting out strategy to the world! Rule #1: don't make G look stupid!
This change in how they are dealing with sub-domains probably accounts for the wild swings in traffic hubbers are experiencing.
What does this mean - I personally think people have to start making an effort to get external links for their hubs. What does everyone else think?
Well, it's also important how this part:
"if you own a site that’s on a subdomain (such as googlewebmastercentral.blogspot.com) or in a subfolder (www.google.com/support/webmasters/) and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links. We’ve made a few backend changes so that these numbers should be even more accurate for you."
works for our subdomains on Hubpages....
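Taken literally, the quoted rule is just an exact-host check: a link counts as internal only if it comes from your own subdomain. Here's a minimal sketch of that reading (illustrative only — the subdomain names are made up and this is my guess at the logic, not Google's actual code):

```python
from urllib.parse import urlparse

def classify_link(source_url, my_subdomain):
    """Illustrative sketch only -- not Google's actual code. Per the
    quoted rule, a link is 'internal' only when its source host is
    your own subdomain; links from other subdomains of the same root
    (or from the root itself) show up as 'external', because we don't
    own the root domain hubpages.com."""
    host = urlparse(source_url).netloc.lower()
    return "internal" if host == my_subdomain.lower() else "external"

# A link from your own subdomain:
print(classify_link("http://myname.hubpages.com/hub/some-hub",
                    "myname.hubpages.com"))           # internal
# A link from another hubber's subdomain, or from the root domain:
print(classify_link("http://otherhubber.hubpages.com/hub/their-hub",
                    "myname.hubpages.com"))           # external
print(classify_link("http://hubpages.com/topics/",
                    "myname.hubpages.com"))           # external
```

If that reading is right, every link HP places on another hubber's page pointing at your hubs would land in the external bucket.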
Hmmm. This would seem to indicate that using the link tool and interlinking to other hubbers' hubs could really help now? And those links that HP provides to other hubs that are on all our hubs?
Most of my hubs show hundreds of links from within HP; many of those are from my subdomain, interlinking my own hubs, but many are not. Would these all now be considered external links?
Out of the whole article, this is the only bit that applies to us, because we don't own the root domain.
So for us, nothing has changed (you'll notice they say "you will still only see links....")
I'm wondering how this will impact us in the end and if there is something else we should be doing.
There is no indication that Google is changing the value of the links. It is just re-categorizing them.
The categories have different values in the Google algorithm. So I suspect this is the end of the subdomain "solution" for content pages.
So you're saying we are screwed? Noooooooooo!!
I don't care if I maintained my super mega high traffic last week, but I want to at least maintain my current high traffic levels. It would still be about a 400% increase for me.
Why? As Simey says above, links from other subdomains are now external when they used to be internal.
That could in fact be the reason for the huge traffic increase last week that many reported - those links from other hubbers' hubs are now external links. I would have gone from a handful to thousands overnight when the spider came through and found 'em.
Other way around. Subdomains were seen as separate, now seen as not.
From Simey's quote, if we own the subdomain (I assume we do) and don't own the domain (we certainly don't) then any links from the domain that are not from our own subdomain are external.
Or do we not own the subdomain? At least for google's purposes.
I don't think that is a correct interpretation. Hubpages owns the whole shebang as far as Google counts it. This move seems to me to be quite obviously designed to neutralize what Hubpages et al have done.
And you may well be entirely correct. It would explain why so many have had big traffic increases (more external links) only to see it go away as those links go away.
And I have seen the reverse, yesterday all my traffic came back.
OMG - I give up already! Or maybe you got a manual slap for all those links and they've decided that it's all right.
I REALLY need a good crystal ball!
I know that it is presented simply as a way they "display" internal and external links in webmaster tools - but the wild swings in traffic that started two weeks ago and which are outside the panda updates, seem to be about another algo change, which may be about how they deal with the relationship of subdomains to the main domain.
G is quite tolerant of tactics as long as white hatters do it and it doesn't impact the SERPs as a whole. Hubpages by itself is small in the grand scheme of things, and if this had been implemented quietly, with just announcements in the forum, I think it would have proceeded smoothly.
The problem is that after the interview in the WSJ, every single spammy article directory (and there are literally thousands and thousands out there) has decided to do the exact same thing and rushed to set up sub-domains as well. Those other article directories wouldn't have picked up on the strategy if it had only been announced in the forum, because they don't hang out here.
Kudos to hubpages for hitting on the subdomain solution - but telling everyone about it was akin to Coke publishing their secret formula so everyone could copy them.
You have got a point there. It is the stupid spammers who are at fault here ultimately! Why must they spam article directories! Why must they destroy the goals of writers who produce unique content! Why!?
I hope spamming plagiarizers get their just desserts! They can go to ____!!
I would have to disagree that HP should not have announced it. It HAD to be announced to the members here, and as soon as that happened everyone knew about it. Dollars to doughnuts it was on a hundred different sites within an hour of it being put on the forums here.
In addition, as soon as the results hit quantcast (a day or two) the effects were known everywhere as well.
HP probably did the best thing it could; announce it to the world as soon as the site as a whole went to it.
Well, the original conversation with Google, where Paul discussed the idea of creating sub-domains, wasn't conducted in private. HubPages doesn't have that kind of access. Paul had to ask the question on the public Google forum. So webmasters would've been watching how things went from that point on. I doubt the announcement made that much difference.
yes, that is how I interpret it, along with Paul's post.
We own our subdomain. Not HP.
Have you claimed ownership of your subdomain?
Claimed ownership? Do you mean the rel-auth thingie or something else?
I mean verifying authorship with Google.
This is what I mean: http://hubpages.com/learningcenter/How- … ster-Tools
I totally missed that one. Thanks!
*edit* A quick and easy process. Thanks again - can't imagine how I missed seeing something about that.
rebekahELLE, this is an awesome link, thank you for sharing it. Very easy to manage, even for a non-technical person like me! Like wilderness, I am shocked that I missed this step!
I do not think that is the case with me though. I had new published hubs skyrocket in traffic and they were not linked to other relevant hubs.
However, I believe I have old hubs that were linked, but they did not rise as much as the ones that were not.
Man, I have a lot of broken link notifications on my hubs. I deleted a lot of hubs before and there are links to nowhere on some of my hubs.
Is this going to be a good thing or a bad thing? I used to interlink my hubs, but have not done so for a long time. If it helps, then I guess I will do it, but not until I know for sure.
It's the other way around. Links between HP subdomains and the main HP domain will all be treated as internal links.
"We’ve also extended this idea to include other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com."
The question will be for links between cats.example.com and pets.example.com. Or from your subdomain to mine, when you graciously provide a link to my work.
I can see the reasonableness either way, but how Google will see it is the question. I also wonder if they will differentiate a link from one author to another with the new author tag.
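For contrast, the root-domain owner's view described in the quoted announcement groups every subdomain under the root. A hedged sketch of that grouping, using the announcement's own example hosts (again, my guess at the logic, not Google's published code):

```python
from urllib.parse import urlparse

def classify_for_root_owner(source_url, root_domain):
    """Illustrative sketch only. Per the quoted announcement, links
    from any subdomain of a root you own (cats.example.com,
    pets.example.com, ...) are now bucketed as internal links for
    www.example.com; unrelated hosts remain external."""
    host = urlparse(source_url).netloc.lower()
    root = root_domain.lower()
    return "internal" if host == root or host.endswith("." + root) else "external"

# From the root owner's perspective, subdomain-to-subdomain links are internal...
print(classify_for_root_owner("http://cats.example.com/page", "example.com"))  # internal
print(classify_for_root_owner("http://pets.example.com/page", "example.com"))  # internal
# ...while a link from an unrelated domain stays external.
print(classify_for_root_owner("http://other-site.com/page", "example.com"))    # external
```

So the same link can be "internal" in HubPages' report while showing as "external" in an individual hubber's report - which would explain the confusion in this thread.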
SimeyC pulled out the relevant part of the update for what you'll see in Google webmaster tools if you have claimed your subdomain.
As far as the impact on search results, that remains to be seen. We haven't seen any major macro changes in traffic this week, although, traffic usually dips a bit over Holiday weekends. Hopefully many of you will get a bit of time to relax over the long weekend. I'm heading to Tahoe with my family and a few friends.
I'm not sure I have linked my subdomain properly in webmaster tools. One of my accounts is listed there correctly. The other one (my primary one) is listed the old way (pre-subdomain). Do I need to delete the old URL and add the correct subdomain URL?
I'm bumping this for you since others may have the same question.
Thanks, rebekahELLE! I'm still hoping someone will see this and answer it. I would like to get some good use out of Webmaster Tools!
Since I still was not seeing any information on this account in Webmaster Tools (listed under the old, pre-subdomain name), but was seeing info on my small, correctly named second account, I just went ahead and deleted the old account name and added it back as the subdomain name. - Just in case anyone else was wondering about this. - I have only done this today, so I don't know whether that will solve my problem, but I assume that it will.
So, basically up until a few days ago, all subdomains were seen as separate websites so any links coming from them to my subdomain were seen as a bunch of external inbound links?
Today all the subdomains are seen as being all part of the same site so all those external links became internal links over night?
Isn't that just the same as not having subdomains then? And doesn't that put us right back where we were when Panda hit with all the good stuff and the bad stuff all in the same bucket again?
I'm not seeing how this isn't going to be a problem which negates any 'fix' these subdomains were supposed to have achieved.
I don't know about that ^ The sub-domain solution was obvious. Dozens of online bloggers and members of this site suggested that tactic long before HP ever implemented it.
It's likely most of the spammier type sites would have done it anyway; the spam sites tend to be on the cutting edge of SEO tactics, and they really don't need to learn from Hubpages. I implemented the same design pre-Panda at xobba and it was no stroke of genius then either.
HP can certainly get kudos for the implementation of a huge administrative/technical solution across their site architecture though!
I'd bet HP HAD to try and get some positive PR and the illusion of forward momentum in order to keep investors and advertisers happy. I don't think they made "G look stupid", as they did attempt to up the "quality" of the site in addition to changing site architecture.
It was being discussed very quietly in knowledgeable circles - but it was clear that the other large article directories were going in a different direction (changing layouts, increasing length of articles, increased moderation). I'm pretty sure the sub-domain thing wouldn't have occurred to them at all, judging by the fact that most of them only implemented it weeks after HP did.
There's a reason offline companies are strict about business secrets, and it is a habit online companies should learn too.
Regarding the need for publicity to draw advertisers in - HP could have just finessed this; pointed to the increased traffic and murmured sweet nothings about how the stricter implementation of standards was having an effect, etc.
I think, though, it's one thing for businesses to keep secrets before something is put out there. It's another to even try to keep it a secret once something has been released/implemented. The Panda troubles (here or elsewhere) weren't a big secret by any means, and all it would have taken was for subdomains to be out for a day-and-a-half (less) before someone from somewhere else would have started calling big attention to it.
There's a point where businesses/individuals have to just put whatever it is out there, let whoever's going to try to copy it do so, and deal with whatever results from there. Sometimes what works for one business/person doesn't work for whoever copied the idea under a different set of circumstances anyway. On here, the aim of the subdomain was supposed to be to stop good stuff from being penalized. If someone else copies the subdomain thing with different aims, Google's going to figure that out eventually and is likely to further adjust what it does, maybe in a way that would separate the "good aims use of subdomains" from the "unwanted aims use" of them.
I don't know.. Business or individual, sometimes it's just better to put it all out there, let whoever tries to copy or otherwise muck with things do whatever they're going to do; see exactly the kind of stuff those other business/people do, and then eliminate "holes" in the original thing (such as the subdomain thing on here). Forewarned is forearmed. To me, the put-it-out-there approach can sometimes lead to less vulnerability to weaknesses in the end.
Much of the time, copying what anyone or anything else does that is based on individual sets of circumstances/conditions doesn't work out equally well for someone who copies essentially without regard for their own circumstances/challenges. It may work for a while, but before long things start to unravel.
Either way, subdomains weren't going to remain a secret
There were many things we did to improve the site where some are public and some private. As a whole the internal quality ratings have increased significantly - I'm not sure people understand the scale and shift in quality that happened across over 1 million Hubs in six months. (I'm doing an interview next week where I'll give some pretty startling numbers).
As the plan went forward, subdomains were the last major step beyond the ongoing efforts, and we thought that when we scaled it out, the public swing in traffic data was too great for people to not latch onto the most visible aspect of the change, so we went public.
Subdomains should help Google evaluate content clearly, and I think that what they're doing with rel=me and rel=author aligns with how they are thinking about author reputation as opposed to site profiling.
I can see how it would be beneficial because it makes your sub domain look even more like a website.
My suggestion is that it downgrades the links between subdomains from 'external' to 'internal'. If your articles only have internal links within HP and this change affects rankings you may see a decline in traffic. Why has Google announced the changes if it does not affect rankings? Perhaps links between subs were always regarded as internal links and this was designed to clarify it. But I suspect that this is designed to counter the subdomain tidal wave.
I don't think so. It downgrades the links if the same author owns both the root domain AND the sub-domain. However:
"if you own a site that’s on a subdomain...and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links."
That's the way I read it, too, particularly as google accepts our individual ownership of the subdomains.
If (IF) the SE thinks the same way it means that we all have a large number of new, external, links that are now more valuable. It could be one of the reasons for the sudden huge increase many saw. It could also be a cause for the abrupt decline and sandboxing that some saw as the spiders found hundreds of new links all at the same time. At least one person has indicated that simply asking google to re-evaluate immediately ended the sandbox. A "mistake" that a human promptly corrected perhaps.
It just makes sense, at least to me. Google doesn't like content farms and prefers individual domains. In all but name our new subdomains are just that - individual domains all by the same author - just collected together under one main domain. Sounds like what they want to see.
Time will tell, perhaps, but as I said above it is possible that linking between our subdomains just became more valuable.
No, because it says "still", which means nothing has changed in that regard.
True, but until a few weeks ago we didn't have subdomains and links between them to be "still" about. Only after that change did we collect massive numbers of links between subdomains that, as they are found by google, are counted as external.
You may well have a hub that HP has put a link next to that points to my subdomain. Or even an in-text link or link capsule, because you liked my hub and it was pertinent to your own. It is those links that are now considered external, whereas they were all internal before we got subdomains - and there are hundreds of them.
Or am I still barking up the wrong tree, as I usually am?
That's right, and when we didn't have sub-domains, all the links between Hubs were internal.
But if you had a Blogger (.blogspot) sub-domain a few weeks ago, links from other blogspot blogs were treated as external links. And they still are.
So as soon as we switched over to sub-domains, links from other Hubbers' sub-domains became external. That was the whole point of moving to sub-domains - so we're no longer part of the same website as the spammers.
If you are correct it means that links from a second HP account will count as external links. What a lovely idea!
Well, it's a pretty big "if" - I'm just tossing out thoughts and ideas. But if it is so, then you are right; that could be very nice.
Do you or anyone else have links from fellow hubbers in your webmaster external links? All my links from HP are from my own subdomain, which occur in both the internal and external lists. That would prove the point.
I don't have any data there yet. But are you sure about your own links? Yahoo tells me that I will have around 130 links from the domain for any one hub but only 5 or 10 are from the subdomain. That would mean 120 from HP outside of my subdomain and those should be external.
I know from looking in the past that many of these links come from the "related hubs" section that HP puts on every hub. Those should all be external if we're reading it right.
All my known external links (built and organic) are listed on the 'external links' page, as well as links from my own HP articles (listed as both internal and external at the moment). On the External Links page, hubpages.com links are shown as 28,185 links (???) for 242 pages (the number in my subdomain)! Webmaster Tools can be very weird!
I don't have any links listed from any other subdomains on HP.
I would love to hear whether anyone has a link from a fellow subdomain listed as an external link.
Interesting. Yahoo says your hub about geothermal prospects has a link from Jake kelly's hub on the best Farmville crops to plant. Looking at that hub, there is a link to yours in the "related hubs" section that I would have thought would be from Jake's subdomain.
Yahoo lists 7 such links, along with about 25 from hubpages itself.
Is it possible that you simply aren't seeing those relatively few among the 25,000 links from HubPages itself?
In any case, you are showing 28,000 external links that before the subdomain shift would have been counted as internal. THAT certainly should be worth something!
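One way to settle the "does anyone have a link from a fellow subdomain listed as external?" question empirically would be to get the link URLs out of Webmaster Tools (or Yahoo Site Explorer) into a list and tally them by source host. A rough sketch, with entirely made-up sample URLs:

```python
from collections import Counter
from urllib.parse import urlparse

def tally_link_hosts(link_urls):
    """Count backlinks by source host, so you can see at a glance how
    many come from your own subdomain, from other hubbers' subdomains,
    and from the root hubpages.com domain."""
    return Counter(urlparse(u).netloc.lower() for u in link_urls)

# Hypothetical exported link list (these URLs are illustrative only):
links = [
    "http://myname.hubpages.com/hub/a",
    "http://myname.hubpages.com/hub/b",
    "http://otherhubber.hubpages.com/hub/c",
    "http://hubpages.com/topics/",
]
counts = tally_link_hosts(links)
for host, n in counts.most_common():
    print(host, n)
```

A handful of fellow-hubber hosts buried in tens of thousands of hubpages.com entries would be easy to miss by eye, which is why a tally like this helps.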
Just highlighting this, as some are interpreting links from all HP subdomains as internal, which they're not.
Is it possible for HP to allow for some customization of the site? I mean, still have elements of HP, but make individual author's profiles/pages look more unique? That might be a better way. Google seems to love unique designs rather than a whole bunch of writers whose websites look exactly the same, regardless of domain name.
Internal links do have a different value to external links. Numerous tests in varying forms have confirmed this.
This move currently looks like it 'may' just be reporting; however, it is significant in that Google is no longer viewing a subdomain as 'completely' a separate site.
I look forward to seeing test results on how/if this has any large impact, and whether there really has been any change in the algorithm to join subdomains to a main domain, be it a dampening factor or their links being considered internal.
Sub-domains are not new. I imagine Google has always had some kind of weighting factor for links between sub-domains. I would be surprised if they changed this without being upfront about it.
This re-categorising might even be a helpful response to the sudden upsurge of interest in subdomains as a way of separating content and allowing the Googlebot to pick out the good stuff.
Matt Cutts previously indicated that sites where there is a mix of poor content and high quality content have long been a problem for Google.
Also, Paul Edmondson consulted Matt Cutts prior to going the sub-domains route and there is certainly some level of approval for the sub-domain option.
http://searchenginewatch.com/article/20 … covery-But
Sub-domains are not new, but the way google is thinking about them is. There has always been a natural connection between subdomains and main domains, but it is rather low and testing has shown it to be a minimal factor (A main domain strength passes on very little domain trust to a sub-domain).
And if you read everything about the contact between Matt Cutts and Paul, it seems pretty obvious that Hubpages received the same Google response as the rest of us - simply put, they will give us website guidelines, but they will not tell us how to game the system. This is no surprise, and it makes sense, but Google is never going to tell any website exactly what needs fixing to get to the number one spot.
I think what would be interesting is for Hubpages to put their site up for review by an SEO panel, these often bring up interesting discussions about how a site could be improved in pure SEO terms.
I think the quality of pages is the ultimate issue.
Google uses human assessors to rate the performance of its search. If hubs that ride high in the SERPs are judged to be worthy of their positions overall, there is no reason for Google to tweak its algorithms, hand out penalties or mess about with the value of links in a way that harms us.
It is not just engaging content that matters now. I reckon the details will become more and more important.
Something I notice when I read hubs (even from writers that I respect) are the many small grammatical errors and typos that a human editor would mop up easily on the more professional sites. I ain't too hot in this area myself. This is the kind of thing that could count against Hubpages.
There was a panel of human reviewers in the Panda analysis, which was used to create a new algorithm that provided a value to the main algorithm. There is not a human element that judges websites, but an analysis of human responses to websites was used to create a fresh take on 'website quality'.
I agree that large amounts of basic grammar and spelling mistakes may have an effect - some studies have shown it may well be a factor, and it is definitely something that Google has the capacity to judge.
Unfortunately, I do not think this is the reason Hubpages is suffering (there are too many other factors in Panda that Google has mentioned which talk about sites such as Hubpages).
The biggest problem is of course that Google has talked about content farms, and judging from the effect of Panda on Hubpages it is clear that the line has been drawn to include us. No matter how much I believe that Hubpages does contain plenty of high quality content, it does not change Google's perception of us.
They use real readers in a sandbox environment for their normal search algorithm testing.
http://www.stateofsearch.com/how-does-t … last-year/
The real critics of the kind of output you find on HubPages over the last year have been journalists in the 'quality' press, from the New York Times to the Wall Street Journal. They have been scathing about the poor quality of Google's search results, especially in those areas where search has been dominated by content farms.
Even Google is obliged to take note of the level of criticism it was getting at the beginning of the year.
Essentially, we will never be safe unless we deliver the goods in terms of quality.
Google can always knock us back if they decide it will suit their users. They will just tweak and tweak until we are gone.
Here is something from Forbes: http://www.forbes.com/sites/jeffbercovi … etty-dumb/
Again though, it has been analysing people's reactions to changes to the algorithm, not changing a site's individual rankings based on their assumption of that site.
People looked at results which included sites such as HubPages, and they were obviously not happy with them, which is why Google changed the algorithm to align with what their test subjects want to see.
I take your point that the assessors' data is not used to rank a site. At the same time, it influences search. If assessors favor searches that eliminate certain sites like Hubpages, it won't help us.
Also, that assessor data alerts Google to sites that are not delivering what people want.
I reckon this is why web strategy for a site like Hubpages has to be very different to the strategy you would use for the average site.
Small and medium sized websites are struggling for success. To get success, they might only need to satisfy the Googlebot (without completely alienating users).
Big web presences like Hubpages will get human eyes run over them - both from Google and the press, which might then pressure Google. For Hubpages, which is still very muscular, SEO needs to be used wisely or it will backfire.
Hopefully, the sub-domain option represents a wise use.
If sub-domains are just seen as a dodge to get around Panda, our present progress will soon be reversed.
But Will, the situation we have now, with each of us in our own sub-domain, means that the quality of the site as a whole should be irrelevant, shouldn't it?
Or are you saying any sub-domains which are performing poorly are full of low quality Hubs?
I have 19,000 links from Hubpages. I reckon my fate is pretty much tied up with the rest of the site.
Of those 19,000, how many are actually from hubber.hubpages.com?
Yes, our fate is still tied to HP, but to 19,000 individual hubbers as well. The string to HP is there, but weakened and supplemented by those other subdomains.
If a poor quality site links to you, does it hurt you? We used to get both good and bad "juice" from HP - now it will mostly come from other hubbers' subdomains, and isn't that basically a good thing?
Sorry wilderness, I really don't want to count them!
The point is, all those links are supporting my sub domain and if they are hit I will be hit. I am also probably linking to thousands of hubs. So that also affects my sub-domain.
As I said, there is no reason for Google to change the rather fine status quo that is benefiting so many of us (with massive traffic), unless the feedback they are getting about their searches suggests numerous Hubs have risen higher than the quality of the content deserves.
This is why I think cranking up the SEO is not helpful unless you can crank up the quality.
Right now, I don't know how well Google can really rate the quality of a sub-domain. If it can rate very accurately there is no reason for the site as a whole to suffer for the bad stuff- Google can just send lousy sub-domains straight to hell.
I'm a bit dubious that the Google bot has such fine powers of discrimination.
You get to counting (cracks whip)!
Whether G can detect quality accurately is almost immaterial; they try their best and use the results to assign position. If they are wrong then good hubbers go down the drain (we see that) but right or wrong they are going to do it.
It would then seem that G will determine the quality of each individual hub and then, using those figures, give a rating to the entire subdomain. This lets exactly what you suggest happen: lousy subdomains go bye bye.
Part of G's idea of quality seems to be external links, and those have recently increased a hundredfold for us. Thus we are writing much better quality than we were and will be ranked higher.
In that respect we are very heavily tied to HP, but then as we are all part of a content farm that will always be the case. I suppose that HP could design subdomains without that, but we would then lose a lot of good "juice" and might as well have our own domains.
If people are concerned about being part of a content farm, why not set up a load of Web 2.0 sites which also offer Ad sharing, such as Tripod, Typepad, Jimdo and Posterous..these are sub domains as well and are High PR
Let's not forget that the whole idea of this site is that being open to everyone is how the site gets bigger and more successful.
I have been investigating other content sites that involve revenue sharing, writing my own e-books, and making affiliate sites. I think jumping from one site to another is a waste of time. They will all be slapped down by the mighty G.
I have a few affiliate websites that I make some money on, and with a little more effort I may be more successful.
It's a good start for those of us starting out to write on these platforms, but the hand-holding has to stop sometime.
I want to write e-books... anyone have some experience and success tips they want to share?
I wrote this hub recently because I think many hubbers could do very well with ebooks <snipped - no self-promotional links in the forums, please> - the publishing industry has changed hugely in the last 18 months or so - and hardly anyone has noticed. Anyone who can write a decent hub or 10 - can write an ebook - and the potential is that you can make an awful lot more money off it than a few cents from Adsense clicks