The recent Panda update produced a typical change in traffic for HP shown below.
I have often wondered why the change is so immediate and consistent. As the image below shows, the shift is too uniform to be the sum of individual quality-rating changes and/or penalties applied page by page; otherwise it would not be so consistent.
It appears to be a universal rating change, or reset, applied to all hubs and to the site overall.
For example, a change in the HP site rating from 76 to 72 out of 100.
This would imply that Google's Panda is effectively a way of changing the quota of page views allocated to HP. This pattern is quite consistent for other changes, for example the increase in traffic from March 25 to May 6.
Anyone have any thoughts or explanation for this?
It sounds like you're talking about something interesting, but I'd need to see more months and to understand exactly what is being measured (one subdomain, or HP overall, or a single page?) before I could venture a guess. I'd also like to see the dates of Google changes marked.
I've been trying to figure some of this stuff out myself. The period you've got as the highest peaks saw my daily traffic from Google go from 120 a day to 280 or so consistently. This last update dropped it to 95! These numbers appear to be consistent with the ratios you've posted.
The data are from Quantcast (all of HP's views per day).
You can check it out yourself over different time scales
The sudden shifts and the amount of rise/drop at the shift match my own experience of the data.
The simplest test would be to track Google's ranking of the HP domain and see if we can correlate date, direction, and perhaps magnitude of shift in the Google rating of the HP domain with the date, direction, and magnitude of drops in Google traffic to HP and to our subdomains.
The application of business statistics to web issues is a specialty of mine. But the specific details of tracking Google rankings, using Quantcast, etc., are outside my territory. Anyone want to put together a team to gather & analyze data?
The big problem is that Google doesn't tell people how it ranks different sites. It may also rank them differently for every keyword.
I know one company puts out "search visibility" data for different keywords that it tracks. But it doesn't track all keywords. Mostly we just see a drop in traffic on Quantcast and conclude that Google is now ranking us worse.
Of course, individually, people can see how they rank for different keywords that they target. That has its own problems though, with Google personalized results etc.
The traditional concept of Google ranking is based on ranking of pages, not sites. The overall ranking is derived from the sum of the parts. Panda appears to be a site-wide and subdomain rank (penalty or boost weighting index) that modifies the ranking of the pages across the board for all keywords.
In very simple terms it appears that the Panda re-rank weighting is applied in this way.
# Hubber subdomain rank: say 93; after Panda => 87
# HP site rank: 87; after Panda => 84
The weighting can of course go up.
So how does this affect the traffic?
# Article 1 keyword rank: say 57; after applying the revised HP and subdomain weightings => 53 - drop in traffic
# Article 2: 75 => 73 - drop in traffic, etc.
The weightings after Panda are applied to all search result outcomes.
Something like this is the only way to explain the consistent 'Steps' in the traffic data for HP and subdomains.
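For what it's worth, the weighting idea described above can be sketched in a few lines of code. This is purely a toy model: the multiplicative combination rule and all the numbers are my own guesses, not anything Google has published.

```python
# Hypothetical sketch of a site-wide Panda weighting applied on top of a
# page's own keyword rank. Nothing here reflects Google's real algorithm.

def effective_rank(page_rank, site_weight, subdomain_weight):
    """Scale a page's keyword rank by site- and subdomain-level weights (0-1)."""
    return page_rank * site_weight * subdomain_weight

# Before Panda: site rated 87/100, subdomain rated 93/100
before = effective_rank(57, 87 / 100, 93 / 100)
# After Panda: site drops to 84/100, subdomain to 87/100
after = effective_rank(57, 84 / 100, 87 / 100)

print(round(before, 1))  # 46.1
print(round(after, 1))   # 41.7 - every page drops by the same proportion
```

The point of the toy model is that a single change to the site-level weight shifts every page on the domain by the same proportion at once, which would produce exactly the kind of uniform step in traffic the graph shows.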
Yep, I agree. That's what I was trying to say but it didn't come out too well.
The thing that nobody understands very well is how this works with subdomains. I think initially the hope was that each subdomain would get its own panda rating, so good writers would not be affected by spammers. But it seems to me that it is not working like that.
The subdomains were never given true independence. There are all sorts of site maps, topic pages, and hot/best/latest listings that link the pages back to the mother ship, HP. This means that any Panda changes will affect all subdomains, no matter how well they rank themselves - some more than others. If HP delivered the topic pages, etc. as 'hot' on-demand pages, rather than hard-wired ones, the subdomains would be more independent. I think there is clear evidence that the subdomains worked brilliantly at first, but G wound down their success. (Paul E has stated that they won't do what is required to completely separate the subdomains, in order to preserve the community - we're all in this together.) Time for more independence, I say!
Perhaps HP should have looked at which subdomains did well at that initial swap over and dumped all the rest.
If the QAP is accurate, then good. Unfortunately I suspect it can still be scammed by false 'engagement' factors - i.e. internal traffic. NOT what Google is interested in.
The 'engagement' factor is weird in my opinion. HP initially used low traffic as a crude tool to identify stuff that could be dumped. The idea was: if the page gets little traffic from Google, then a large percentage of these are likely to be of poor quality - 'no Google love' - so let's dump the lot and deindex them. Idling for low traffic was a crude way of getting rid of a lot of poor quality stuff quickly. HP still persists with this approach and still claims that good quality, low traffic pages hurt the site. There is no evidence of this, and traffic is not part of Google's ranking system. I think that HP still works on the principle "If the article gets low traffic there must be something wrong with it - Google hates it, so it won't feature in the SERPs - so dump the low traffic ones." There is no income from these pages anyway.
As I've written below, I am of two minds about the subdomain interlinking. If we didn't have it, then the good writers would probably not have been panda'd, but it's possible that being linked to the mothership helps with ranking.
If there was no interlinking, then it would be pretty much like writing on Blogger. That is subdomains too, but independent ones, and I bet you they don't all have the same penalty. I've heard people with blogs are starting to see good traffic.
The only other advantage HP offers is very good monetization. I reckon the HPads CPM is better than what most people can get from Adsense. So we might still be better off giving HP 40% of impressions.
All very interesting, and it has given me a lot to think about. HP is easy and nice and fun. And the CPM is great. But it comes down to judging what works for you, I guess - what you want, how to achieve it... etc.
I have often thought, Mark, that you might be better off with blogspot. Probably not better off financially, though. But I bet you could develop a real following. Have you seen hyperboleandahalf?
I don't imagine stickmen are conducive to adsense. But once you became famous, I bet you your Zazzle and cafepress stores would make a lot of money.
Or not. But you would probably find it great to have total control over your stuff and to totally own it. Of course there is the community here, but I bet you a lot of people would subscribe to your blog.
Thanks aa - appreciated and I kind of know it, but haven't got there yet. I am too slapdash and lazy on my other places - no QAP to pass. I have seen hyperboleandahalf - and wow. It gave me a lot to think about - weaving a tale, lots of pics - comic for adults - excellent. I'm still learning I think and perhaps not working hard enough. Anyhow. Thanks!
My traffic has dropped in a similar way to HubPages'. However, my best hubs are still on page 1 of Google. I have cleared my search history, so the place in search is real. How can that be?
I would say that if they've been on page 1 for a considerable amount of time, they have gained backlinks. These kinds of links add value to your page in the eyes of Google. Quality backlinks tell search engines how other sites 'rank' our pages, kind of like a vote of trust.
You've got to factor in that during the summer months certain areas of the country spend their free time outdoors. In the winter, their computers are entertainment.
One of the things I noticed when Panda first struck back in 2011 was the loss of long tail keyword rankings, so you might want to take a look at that in GA and do a comparison. The other thing to be aware of is that you can lose a lot of traffic just by being pushed down one or two places in the SERPs for your main keyword/s.
The last Panda update was on March 15th. Now, Panda runs continuously. I wrote a Hub about Google Panda and what it does if you're interested.
You know, I think the fact that Panda runs continuously only matters in terms of data refreshes, not updates.
Something definitely happened on May 9th. It wasn't just us, lots of sites got hit on the same day.
Just because Panda runs continuously doesn't mean Google won't make changes to its algorithm. The day they release this algorithmic change or update, SERP rankings are going to change, and sites are going to lose traffic or gain it. Just as we saw on May 9th.
What is different with Panda running continuously is that it no longer makes data refreshes on specific days (like once a month as it used to do before). What this means is that when you make a change to your site, say remove excess advertising or thin content, Panda should notice this as soon as your site is crawled. You don't have to wait for the next data refresh for changes to be recognised by Google. Theoretically.
To answer the original question, I'm not sure why you think the quantcast traffic is inconsistent with what Google says Panda does. Panda and penguin give a score to the whole domain, not to individual pages. The only inconsistency is if you expect the scores to be given to subdomains, rather than the whole domain.
It seems to me that Google isn't paying that much attention to the fact that HP has subdomains, and treats the whole site as one entity. Probably because of the extensive linking between the subdomains. I suspect this is why the subdomain solution was so successful when it was first introduced, but then stopped working. It seems to me that Google has changed the way it treats subdomains here.
I'm not convinced. My traffic doesn't follow the pattern on that graph and never has. I seem to go in the opposite direction, more often than not - or don't move at all!
You have to remember that when there's a Panda update and Hubbers lose traffic, they'll flock to the forums to vent - but you won't hear from those whose traffic isn't affected, or has improved. So it's easy to get the impression that when HubPages is hit, we're all hit - but in fact, that's not the case at all.
Hmmm your case does seem to disprove my theory. However, if the subdomains were truly independent, you would expect some of them to go down, and some to go up with a Panda change, so overall traffic would stay more or less constant.
So either your subdomain is behaving strangely, panda is not the only game in town, or there are a few subdomains that provide most of HP's traffic, and when they go down, quantcast traffic goes down, and vice versa.
I have to say my traffic closely follows the overall HP curve.
Another thing to consider is that Panda doesn't just affect us, but also our competitors. Our Panda rating might remain the same, but if a competitor improves, our traffic might go down. Perhaps you have very specific competitors, since you write in a very specific niche?
Actually not, aalite. Your theory could be sound. Marisa may just have some quality in all or some of her hubs, or a quality of her sub-domain, that is favoured by the very Panda change that penalises HubPages in general. I'm going to look at some of her hubs and see if I can detect any qualities that I would like to add to some of my hubs. If we find that ingredient and incorporate it, we can balance the ups and downs of Panda-bouncing. (Picture a panda on a trampoline - only you're the trampoline!)
Actually if you look at my profile, I don't write in a specific niche on HubPages - I put most of my dance articles on my blog.
You're also assuming that mine is the only sub-domain that bucks the trend. It's not. There are many others - so I prefer the theory that some go up, and some go down. It certainly reflects my experience.
There's so much in this comment that just isn't factual. I'll just respond to one thing. Google can and does penalize just one page on a site or one subdomain:
http://www.seroundtable.com/bbc-granula … 16514.html
Lol, there is so much in your comment that is irrelevant
The link you posted talks about a manual penalty and a notification about unnatural links to the BBC. What does that have to do with Panda?
Anything else that isn't "factual" that you would like to discuss?
A person just starting out on Hubpages begins with zero traffic when they publish their first article. Say a year later they get a thousand views a day. If there is a G quota for traffic, that would imply others have lost traffic just because this new person gained traffic. It would also imply that there is some sort of cutoff point where you're no longer "new" and your traffic will drop simply to make room for new writers in the quota.
Granted, the HubPages community is very fluid. There are new writers who succeed, new writers who fail, old writers who give up and old writers who keep going strong. There's also the QAP factor now, where content is de-indexed and some authors are packing up and shipping out.
It's plausible that there is some kind of traffic balancing act going on, but it could very well be a product of all the changes going on here at HP and in the online content world in general rather than a conspiracy by G to punish the site. I think its counterproductive to speculate too much, since we'll never know. I'd rather think positive than assume there is some kind of conspiracy against my work.
As for the anomalies in the data: with my own traffic I almost always see a chip taken out of it every time some new algorithm rolls out or there is a major update, but it recovers within a week or so. My theory is that HP doesn't always fare so well when it comes to satisfying G's idealist notions, but in the reality of everyday user engagement and satisfaction a Hub is as good as any other page, and will rise accordingly. It takes a little time for things to reshuffle to where they ought to be after an update.
I have thought for some time that HP writers should not link to each other's articles. For a while, HP pushed this idea, but I don't see the team doing that any more. The problem as I see it is that if you link to an article that is not considered "authoritative", you have created a "bad" link, so to speak. Also, as people delete articles, leave HP or whatever, those links become broken, and this creates another set of problems. I really think we should all dump our links to each other's articles for these and other reasons, and stick to linking to well known, credible sources, or link within our own articles where possible. This way, we are not damaging each other's subdomains but are strengthening our own.
That is quite true. But.....even if we don't link, there is a hell of a lot of interlinking done by the site itself, like the "related hubs" underneath each hub.
On the other hand, if there was no interlinking, then we wouldn't get the page rank flow to our articles, which might in fact make them rank worse. It would really be like writing on blogger, where each subdomain is pretty independent.
But, at least the links the site makes are dynamic, so if a hub is removed you don't end up with a bad link. Also I would hope that the QAP plays a role in choosing the hubs that are displayed underneath your hub.
I thought Google had largely discounted internal links. One reason why content sites plummeted.
I dislike the links to other authors content. It makes it less likely a reader will stay with me - resulting in a higher bounce rate and loss of potential income. Sure I get reads across from others but on balance I'd prefer to not have 101 reasons to leave me.
Hmmm, I've not really come across any writing on how Google treats internal links. So all thoughts about this are the product of my (not very bright) mind.
I would be surprised if there was ever a time when Google treated internal links the same way it did external links. But... if you have an authoritative domain that ranks well, and some pages have a lot more internal links than other pages, then I would imagine Google thinks that those must be the important ones and ranks them better.
If you imagine an "organic" website, written to convey information by somebody who doesn't care about what Google thinks, it will have tiers of pages, depending on how detailed and important the information is. How that website is interlinked will tell Google which are the "main" pages, and it will try to send more traffic, for more general queries to it. So logical interlinking is still important to a site.
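To make the "which pages are main" idea concrete, here's a toy sketch. The link graph is invented, and real search engines obviously do far more than count links, but it shows the basic signal that interlinking sends.

```python
# Toy illustration: pages that collect more internal links look like the
# site's "main" pages. The site graph below is entirely made up.
from collections import Counter

# (from_page, to_page) internal links on a hypothetical site
links = [
    ("home", "topic-a"), ("home", "topic-b"),
    ("topic-a", "article-1"), ("topic-a", "article-2"),
    ("topic-b", "article-1"), ("article-2", "article-1"),
]

# Count how many internal links point at each page
inbound = Counter(to for _, to in links)
most_linked, count = inbound.most_common(1)[0]
print(most_linked, count)  # article-1 3 - the most heavily interlinked page
```

Under this (very crude) reading, "article-1" would be the page the site itself is telling a crawler is important, which is the sense in which logical interlinking still matters.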
How much power those links have to help or hurt depend on how search engines view the subdomains, which really isn't clear.
If they can indeed have a major impact, it stands to reason that not only linking between Hubs could have a positive/negative influence, but also any time we leave a comment on a Hub, answer a question or make a forum post. Those all create links to our subdomains. Should we be more judicious about whose Hubs we comment on, or whose questions we answer?
Either way it's almost a moot point, because HP has so many links built into the system that we're all forever entwined anyway.
FWIW, it seems to me the links are a good thing, at least for now, and perhaps it's a big factor in why HP Hubs rank so much better than articles from other content sites. (?)
I'm responding to the whole topic of linking to hubs written by other hubbers.
I am not writing in terms of SEO, but from a place of believing in offering my readers the best work we can offer and as a believer in win-win success.
I will continue to read articles by other hubbers, and link to excellent ones that are relevant to the topic of my own articles. That comes from who I am as a writer and what I believe about business success.
To the extent that Google succeeds in its expressed purpose for Panda, this will be beneficial.
I do like to keep people close to my own article, so I try to remember to edit the HTML and add target="_blank" to those links, so readers see the new item in a new tab but don't lose my hub.
I hope other writers will do the same. I still believe that the long-term solution is to write articles people want to read, write lots of them, tie them together in sensible ways, and share the link juice!
If you are linking to external sites that have strong credibility, you are always much better off than linking internally to another author's hub, except where that author does indeed have strong credibility, such as a doctor who writes about disease, etc. You should also remember that when you link to another author's hub, you are also, in effect, linking to that author and ALL of his hubs. So, if some are great and others not so good, the linking can damage you. The same is true if you link externally to a site that does not have strong credibility.
I respond to this in two ways:
1) Well, then, you've shown us the solution. Let's all be strong, credible professionals with a lot to offer, and share links.
2) To be clear on a technical point (remember I'm out of date): Does Google check a page's *outbound* links and increase or reduce page rank based on quality of the destination domain? If so, is it measuring the destination sub-domain (a particular hubber) or the destination domain (all of HubPages) in doing this assessment?
A couple of things. First, we don't think a high quality page that doesn't get traffic is bad. However, identifying high quality pages to the level of accuracy we need is cost prohibitive, although we are able to identify some, and those are featured for the foreseeable future. As we get better at identifying good pages, more will be long term featured.
One thing we all know is Google is changing and continues to change. Each of the steps in the graph are Panda influenced in our opinion. We think the May 9th update was a Panda change as well. Penguin updates are usually a small ripple across the site.
I've questioned (behind the scenes, to Google) whether subdomains are the optimal way for HubPages to be organized, especially since we see such a huge percentage of URLs show in the SERPs as hubpages.com only (Google is ignoring the 301, at least for display in the SERPs). Google's response has been not to switch back - leave it as is. They have also advised other sites to split their sites up by subdomains or subdirectories so that they can better distinguish the low quality sections from the high (instead of viewing them as one large site).
We are confident that not all subdomains are treated the same. It's pretty clear that some go up and others go down. We also think Google's algorithms work at the page level, subdomain level, and at the domain level to a degree.
Google's been very consistent with us. Focus on quality. When I've asked about interlinking (related Hubs), and topic pages, they've suggested that authors focus on content quality not links.
So, why do we see these macro steps up and down with Panda updates? One idea is that there is a high degree of correlation in the type of content Hubbers create, and we know that the vast majority of featured Hubs are very close in quality. Perhaps we are all dancing on the line. From our perspective we would love to see more delineation between what content Google likes.
Others think that it's all about content thresholds. For example, if we were able to make HubPages 30% excellent, 40% good, 25% ok, and 5% poor, we would see traffic gains. This is somewhat similar to the first idea. If we could move a large portion of the OK Hubs to Excellent, that may get some panda relief.
Love to hear others' thoughts on this.
Since Google has said there will be a softening of Panda this summer, our hope is the continued quality work will pay dividends.
Our view is there will be short term fluctuations, but to set the course for the future, our strategy is focused on original, long format, media rich Hubs. The antithesis of twitter. More robust than YouTube. I personally feel that HubPages succeeds when authors get the rewards (traffic, comments, earnings etc) for their information. When we deliver this, the community grows for the benefit of all.
Paul - thanks for your thoughts and questions. It's nice to have simple guidance from an expert.
On the item you wrote above, I still see a vast range in quality within featured hubs. Using just my own hubs as an example - over 100 featured - at the low end, we're talking 5 text capsules and 2 pictures. At the high end, we're talking at least Stellar (1150+ words, often a few thousand, 2 special capsules, and lots of photos). That is quite a range in both quality and also time-cost of production.
Figuring that's the range from ok up to good, it would be very useful to know if I'm better off producing a lot of good hubs, or maybe 1/3 as many hubs, but making all my new hubs excellent. (Rough guess: it takes 3 times longer to produce an excellent hub vs. a good hub.)
Lastly, when you say "we" and "us," who's the "we"? I know from your profile page that you're one of the founders of HubPages. But I don't know if you're still involved at a corporate level, or whether you were writing as a hubber or offering a statement from HubPages.
I just posted this http://hubpages.com/forum/topic/113293? … ost2412309
Update the best hubs and make new hubs to create new winners. Clearly, there is a significant variation in how much traffic all excellent Hubs get.
In the official HP forums I'm in this role http://hubpages.com/about/team
In the sports forum, I'm a Hubber:)
The debate about what Panda does amounts to how the 'quality' threshold, and the ability of sites to meet it, is adjusted.
|------\/ individual site weighting lowered
Paul E has argued that the step changes in traffic are caused by Google's Panda raising the bar. This means that more of the lower ranking pages don't make it, and traffic drops. This would be a universal bar-raising applying to all sites.
I think that this probably occurs, BUT there is something else going on. There is evidence that Google applies changes in ranking for individual sites, perhaps based on an algo. What this means is that it reassesses HP and says, "Overall their quality is still not as good as their competitors', so we will lower their ranking a bit, from 85 to 79. Squidoo has improved and they don't have so much junk, so their weighting will go up from 65 to 71." These weightings are applied to all SUBS and all articles on HP, so that they all, irrespective of quality score, have less chance of appearing in the SERPs, and their rank on the SERP pages is lowered.
If Paul E’s theory was correct then the loss of traffic would only be for the poorer quality pages. This is not my experience. When HP’s traffic drops 20%, my traffic also drops say 15%. But the loss applies to most of my Hubs, not just the poor quality ones or the ones that got little traffic before the change.
This is evidence that Google adjusts the weighting it gives to HP and to the subs based on its overall assessment of site ‘quality’ in its eyes. The weightings are applied on a site by site basis using some site algo. There is definite evidence of individual site targeting by Google, for penalties etc. So Google is not just changing the threshold it is changing the ability of all pages in a domain to compete. This change in overall weighting would better explain the consistent step changes in traffic.
The other outcome of this is that Google is effectively assigning a quota to various sites, and that it periodically makes adjustments to deliver a consistent quota or share of the traffic. [But that is much more controversial.]
I think it's a lot to do with content quality thresholds, both across the site as a whole and for individual subdomains. I also wonder if you've seen any step changes in traffic for specific topic areas? You may not want to answer that in case people are discouraged from writing on particular topics here (!), but it seems to me that the breadth of topics/keyword phrases that have the opportunity to do well on a site like HP reduces with each Panda update. As other niche specific sites are given more content authority by Google and in turn get better SERP placements, it's only natural that other sites will go down accordingly.
Related to whether interlinking among other subdomains hurts us, I posted the same question recently & got the response in this thread from Admin:
However, I still wonder if the automatic links to other hubs can hurt us, especially if any accidental reciprocal links were to happen. I know for a fact that we see unrelated hubs linked - I guess something triggers a hub as a potential link & the filter isn't always on target.
@janerson99 I don't think you're interpreting what I'm trying to say quite right.
I don't think it's "raising the bar" as opposed to changing the algo in some way.
I put out a few possibilities as to why we see steps up and down.
- The content is so similar that Google assesses it fairly uniformly across the site. So, it looks like a domain wide issue, but correlation doesn't mean causation.
- The distribution of quality is too close to the line where Panda sits for the subs or domain. That's why we see step ups and down.
In the May 9th update these sites also saw losses. Their profile is pretty different than ours.
- https://www.quantcast.com/howstuffworks.com (wasn't hit by other Panda updates but dropped in early May)
No worries! I can't say I really understand what you mean.
I think you are saying that lifting the average quality of the pages would make a happy Panda
"If we could move a large portion of the OK Hubs to Excellent, that may get some panda relief."
The problem is that HP does not have enough excellent ones; the quality is very even across the site, and so small Panda changes have a big impact.
Also, that Panda imposes a site specific penalty or rating that is not a universal one, and that sites with very different profiles can be penalised for different reasons at different times.
Anyway, fun chat. Cheers!
See, I interpreted this to say that the QAP rating of a vast number of hubs is, for example, 6 (just because that might, or might not, be the threshold). Paul thinks that a lot of the site sits on the "panda threshold".
So Google might make a small change, and suddenly the penalty is lifted. On the other hand, they make another small change, and we are under the water and penalised again. At least this is how I understood it.
One possibility could be to raise the bar in the QAP even higher, say to 7, but I think that would cause too many hubs to become de-indexed, and the site would lose too much content, and money. I remember, when there was this idea that the QAP threshold would be 8, Melissa saying that very few of the hubs she rated made it to 8.
Another solution is for the good writers to write more "8" or higher hubs, thus bringing the overall quality higher.
This would be really helped if we could be given the QAP rating of our hubs (pretty please).
All this makes perfect sense........except that what Google says about wanting quality really doesn't correspond to what I see in the SERPs. My long, in depth, multi imaged hub about something I am "passionate" about is outranked by a 20 word wiki answers thing, and a youtube thing which is a slide of pixellated photos!
No matter what, YouTube videos are going to be rated higher than ours, just because YouTube is owned by Google. I've been running into Pinterest boards showing up in the searches before websites. That is a little crazy. I don't want to click on photos on a Pinterest board. I want info.
'The problem is that HP does not have enough excellent ones, the quality is very even across the site and so small Panda changes have a big impact.'
I don't want to be mean, but you simply cannot say this kind of thing without any evidence.
Paul E said it - I was just summarizing. HP has the data!
"For example, if we were able to make HubPages 30% excellent, 40% good, 25% ok, and 5% poor, we would see traffic gains. This is somewhat similar to the first idea. If we could move a large portion of the OK Hubs to Excellent, that may get some panda relief. "
Scanning quickly... I read Excellent as Excrement.
Paul is reporting other people's speculations. I've no idea if they have any evidence. You clearly have none.
What I can offer as (admittedly feeble) evidence is that whenever people complain about sudden traffic loss, I go and look at their subs, and there is almost always some very obvious problem.
This kind of thing:
Terrible writing e.g. long winded, ill-informed, illogical, confused, confusing, dull
OK writing but poor spelling/punctuation/grammar (most common)
Too many pages in spammy areas (fairly common)
Too many pages on subjects already covered a trillion times like 'how to clean a keyboard' (very common)
Weird, SEO-freak titles (declining in popularity, mercifully).
Affiliate ads that don't help the reader (still plenty of these)
In terms of internal links, aren't we always led to believe that Wikipedia is the model?
When it comes to external links, however, the most reliable sources rarely rank (reliable as in studies and data). I've linked to sites that are completely ignored by Google but, IMHO and for what it's worth, are far more trustworthy than many of the nonsense sites that Google ranks.
So which way should we go with this?
This will be very unpopular, and I'm sure it will not be implemented, but it seems to me that the only way to raise the average quality of the site, is to raise the bar in the QAP for new content.
Let's say the threshold for featuring right now is 6 (that's what Simone said it was). If I understand what Paul says correctly, the average QAP grade of HP overall is not much higher than that, say 6.4 (I'm making these numbers up; I have no special insight).
Paul also seems to think that an overall quality of just above 6 is Panda's threshold. If we could raise the average quality to 7, we would be insulated from Panda changes and would (most probably) not be penalised.
The easiest way to raise average grade would be to raise the featuring threshold to 7, but I suspect that would mean de-indexing so much content that the pruning would kill the tree. But if you require new stuff to be higher than 7 to be featured, you would raise the average quality. If you leave the threshold at 6, then you will keep getting a hell of a lot of hubs that are 6, and the average will stay the same.
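The arithmetic behind this argument can be sketched with a toy model. Every number below is invented for illustration (the starting pool size, the score distribution of new hubs, the 6.4 average): the point is only that, if the pool grows by featuring new content, moving the featuring threshold from 6 to 7 pulls the site-wide average up faster, while leaving it at 6 keeps the average anchored near 6.

```python
# Illustrative only: all figures are invented, not real HP/QAP data.
# Model: the site starts with a pool of featured hubs averaging ~6.4.
# Each "month" new hubs arrive with QAP scores from 4 to 10; only hubs
# at or above the featuring threshold join the featured pool.

def average_after_growth(threshold, months=24):
    # Starting pool: 10,000 hubs averaging 6.4 (made-up numbers).
    pool_count = 10_000
    pool_sum = pool_count * 6.4
    # Made-up monthly intake: how many new hubs arrive at each score.
    monthly_intake = {4: 100, 5: 150, 6: 200, 7: 150, 8: 80, 9: 15, 10: 5}
    for _ in range(months):
        for score, count in monthly_intake.items():
            if score >= threshold:  # only featured hubs join the pool
                pool_count += count
                pool_sum += count * score
    return pool_sum / pool_count

# With these invented numbers, after two years the site average is
# noticeably higher under a threshold of 7 than under a threshold of 6.
print(f"threshold 6: {average_after_growth(6):.2f}")
print(f"threshold 7: {average_after_growth(7):.2f}")
```

Under these assumptions the threshold-7 pool also grows more slowly, which is the trade-off mentioned above: de-indexing (or never featuring) more content in exchange for a higher average.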
It's all well and good telling people that they should aim for an 8, but I bet most hubbers don't read the blog or the forum and are only vaguely aware of this stuff.
You could always give people some sort of incentive to write particularly high quality hubs. I guess this is what the 'stellar' contest was supposed to do.
Maybe a new page to display hubs that are 8 or above (or 9 or above depending on numbers). Sort of like hub of the day? If getting on that page ensured higher traffic from hubbers, and a nice link from a high PR page, people might try to get on it.
I agree. I checked the QAP levels. 6 means barely able to form a coherent thought - should not be allowed outside without a carer. We can do better than that.
I agree - if HP wants to truly be a site known for high quality content, the standards should be high, not mediocre. I have still seen very amateurish or flawed writing in published hubs in the past few weeks (including in new hubs that have passed the QAP). I am not sure what the "Lovely Indian Auntie" hubs do to us, in terms of Google. I know they get hits (money is money, I guess), but is that at the expense of higher rankings and more robust traffic for the site as a whole?
You also have to look at the time of year. Most of these Panda changes seem to happen in late spring, when fewer people are online. When the weather warms up, people get outside to garden, do yard work, and enjoy outdoor activities. Over the summer, traffic always declines whether or not Google has made changes.
When you write you have to realize it's feast or famine, some months are good and some bad. Just like squirrels, you have to store your nuts away when there is plenty and do other things or live sparingly when writing income is lean. Keep writing, editing and weeding out the bad articles/hubs and carry on.
Don't put all your eggs in one basket or in other words don't rely on one source of income.
@Marisa Wright - Yep, with each Panda update some go up, some down, some stay the same. We have had very little conclusive data, other than that Hubs by authors in non-English-speaking countries were more likely to lose traffic.
@aa lite - I believe you asked about category traffic changes. They almost always follow the same pattern as the overall site.
It is interesting that most of Marisa's hubs don't appear in the SERPs via subdomain. Several others with older hubs have suggested that hubs tied to the HP URL may be more stable with traffic, and vaccinated against Panda. Is there any evidence of this?
@jandrson99 I've asked Google about this many times. They say it's intentional. Lots of older hubs show in the SERPs as hubpages.com. However, they are not immune to Panda.
Will Aspe's comments in several forum topics have gotten me thinking about what is truly necessary to pull HubPages out of the doldrums. I think we have to come up with unique viewpoints and original slants to the content in our Hubs. Rehashing existing web content isn't enough any more. I've been trying to think of ways to do that with some of the ones that I've been working on for future publication.
I'm trying to act more like a journalist and less like a student writing a paper. I've figured out some interesting topics that aren't overly saturated, and I've been interviewing people to move the narrative along. That way I can add a professional edge to my work that might not be found elsewhere on the Internet on these topics. I'm sure I'll come up with other ways to provide truly original work.
The HubPages Ad program is the perfect way to monetize this because it doesn't require that I put products on the page to make money. Although I might add a few if they are directly related.
Those would be the first pages scraped and copied. Better to work on improving the site in general.
I used to have consistent traffic that was higher and now I have consistent traffic that is lower. I didn't sign on nearly three years ago to make a lot of money, although that would be a sweet perk. This is probably one of the most interesting forums that helps inform about the behaviors of the hubs and traffic, etc. And since I have a daughter getting married this summer, I haven't had time to focus on all the reasons why this, that or the other right now, so thanks for this forum.
Aw, congratulations! My son is getting married in Sept. I would imagine your experience is a lot more involved than mine.
So I'm going completely out of the box here, and I hope Paul's still checking in, because I want to play devil's advocate and challenge a few basic assumptions (not for the first time, either!). I'm making these points as a person who holds a Bachelor of Science in Public Relations and Marketing and who has been on HP for a little over a year, with a total of around 130 hubs under my belt, four of which have been selected as HotD at one point or another. I also completed the second AP before the many changes that were made.
Challenge 1: Google's rankings are based on quality of material.
What the heck does this even mean? Proper English usage, grammar, and composition? Or does it mean popularity, regardless of writing quality? Maybe (and I suspect most likely) Google measures quality by its ability to gain advertisers, the PPC it can charge those advertisers, and indirectly, the number of click-throughs on their advertised links. Despite Google's outstanding public relations, which has the public duped into thinking Google is all about providing the public with what the public wants, it did not become a multi-billion-dollar (with a B!) company by simply doing good works. Its sole purpose is to generate sales. Why would Google have a need to update algorithms if not to induce more revenue by increasing opportunities for advertisers and greater sales results than those advertisers can get elsewhere?
HP's spending on QAP programs and such is only as worthy as the sales it produces and the opportunities it presents to Google (anyway, this is true as long as it's so reliant on Google for income... if other streams of revenue are introduced the QAP could potentially affect those.) We all take pride in our writing, but did you know that the average American reads at a sixth grade level and has a 3-second attention span? How exactly do you measure quality now? Or do you opt to go for popularity? (Hint: collegiate-level sites aren't the ones raking in the dough!)
Challenge 2: Algorithm changes are done to protect users / produce more efficient search results.
Google doesn't want to spend on unproductive ads. Actual search results are more cluttered than ever these days, with maps, videos, etc. crowding the top rankings, making it harder to find the particular text a user is looking for - something that used to be done with Boolean searches.
Continuing to think as if I were a Google exec (ha!), it would be really dumb of me to spend $100 to get $20 worth of revenue, especially if that same $100 could net me $1,000 elsewhere. I'm coming to believe this is the real reason behind algorithm shifts - to channel the ads where they produce the best outcomes. This month, maybe "swimsuits" are producing more ad revenue than "jackets," so Google adjusts their algorithms to capitalize on this. But that's not the only type of adjustment that might need to be made. Perhaps there is a new flood of advertisers coming into the marketplace. The real estate market's picked up, and suddenly there are thousands of new real estate agents across the country clamoring to make a name for themselves. That requires a slightly different adjustment. Costs of channels themselves - video vs. articles vs. direct website sales of products - would also be a factor.
So many factors make up a marketplace. There are seasonal adjustments, permanent adjustments as some things become obsolete, and preference adjustments as tastes change and fads rise and fall. Then there are economic factors that influence how much is spent by consumers and advertisers, and how they allocate their spending. (Why did Google acquire YouTube? Because video readily hits those consumers who don't have the attention span to read, and it represented an advertising channel Google could capitalize on.) Ok, I'll stop here, except to say that what Google's concerned about has nothing to do with subdomains and everything to do with revenue, which is why they're not giving straight answers and say only, "Don't bother changing."
Challenge 3: High quality, evergreen hubs are best.
I am sure we all agree to this, yet many people will attest that seasonal topics sell best for them. But the key here is less about evergreen than it is about micro-targeting the consumer. How many readers do you actually convert into paying customers? Related question: Can you keep 'em coming back and spending more?
There are two routes to making money on the web - self-promotion or the more passive way, relying on search engine traffic. Companies like Coca-Cola have huge budgets to ensure that they capture both the web traffic AND direct-to-consumer dollars. Most of us rely on search engine traffic, which produces a pretty low amount of sales as we compete against other, similar sites for a buyer who clicks the top few results and then goes on about their way. To succeed this way, it's not just necessary to rank high, it's also necessary to convert visitors into buyers. For those of us who write just because we enjoy it, great, but if we want to convert, we have to have an understanding of how to do it, and we have to have the tools to get it done.
There are some topics that lend themselves naturally toward conversion and others that require much more persuasion before someone will open their wallet and grab a credit card. Overall, the number of topics on HP that do NOT lend themselves naturally toward conversion far outnumbers the ones that do. A book review with a testimonial about how it changed my life will prove more effective than trying to sell jewelry because the Crown diamond sold for a record price. Higher priced items require more persuasion, which is a fine art in itself.
Because of all of this, I believe HP's rank would benefit immensely from a couple of changes:
1. Create more interaction from users. They should be able to ask questions and respond to info in hubs in an engaging, dynamic way. A comments box after the article is just plain boring and doesn't really add meaningful value, but a social tool that let user comments get posted in varied fonts in the margins next to text they select in the hub's body, so that it temporarily becomes part of the page, could not only keep content fresher, but also help users engage and connect. Right now, we talk AT users instead of WITH them. This is the kiss of death in the new social media web that has emerged over the last few years if you ask me!
2. Reward users who make purchases by creating a widget that asks them if they'd like to get recognized on the page for their purchase or asks them if they'd like to tell the world why they made the purchase they made. If they say yes, their testimonials could be added in a sidebar next to the Amazon or Google link modules. People LOVE seeing their names published!
3. Create more direct streams of revenue. Zazzle's now advertising on HP, but as writers, we could benefit from referring Zazzle products for sale or offering our own. This opens up a world of possibilities for hub topics that aren't easily featured now, such as one that features postage stamps of fighter planes or.... whatever can be imagined.
4. Adjust QAP to determine what is "promotional enough" without being spammy. Or educate writers on how to convert. Or both.
HP's traffic looks like it's dropping to its lowest point in several years, judging from the links provided earlier in the thread (adjusted for a longer time period.) The big picture is about a lot more than algorithm changes, and it's certainly about more than quality, but I'll be honest, I feel like HP just isn't "getting it" because I have only heard about changes on these same old ineffective themes for the last year.
On the whole I agree with this, but will offer just one thought. Google's search engine and AdSense, although supposed to be separate, are inextricably intertwined. Advertisers use the SE to target their ads, and I have no doubt that Google changes SE rankings based on the ads there.
Having said that, Google then needs their SE to maintain the AdSense end of the business; the end that earns them their money. If Google loses traffic to their search engine (vs Bing, Yahoo, or whatever else), they lose money.
And it is here where quality matters - that SE must return what the reader wants. In the long run, then, it is the reader that will determine the quality of an article, not Google. Google will try to match the SE to what the reader views as quality, and we will try to match what Google does, but it is ultimately the reader doing the deciding.
So quality DOES matter, at least as much as conversion, but it is quality as decided by the reader, interpreted by Google, re-interpreted by HP, and finally again by the author. For instance, I try to write my hubs aimed at tradesmen, in language they will understand and accept, regardless of grammar and pretty charts. I don't particularly care how HP rates them (as long as they are featured), but I do hope that Google will eventually figure out that those hubs are "better" because readers keep coming to them rather than to an academic article on the same subject.
If I'm successful, Google will decide those hubs are of a higher "quality" than other articles, but that will be because the readers have voted with their mouse. And Google, within the parameters of maximizing its own income, will place the hub high on the SERP. As advertisers trust Google's judgement, Google's efforts to maximize its income become less of a factor than the reader's opinion.
We're saying the same thing, except that I highlighted that we don't know what Google interprets as quality: accuracy, depth, popularity, conversion, or a mix of these?
For all we know, they roll dice. It certainly seems that way some times.
It is almost certainly a mix of those, as well as a dozen other things. Time on page, for example, should be a good indication of the value of a page to the reader, and I would expect Google to include that in their algorithm. They're still pushing the "+" button, too, and that's another indication of how the reader felt about the page.
There is no one size fits all. It's going to completely depend on the type of search query and the demographic of the searcher.