1. Sites such as "tools dot davidnaylor.co.uk/keyworddensity/" feature very prominently in the links list ("links:URL search") and appear to store huge lists of links on their servers. Could Google be mistaking them for "link farms"?
2. Subdomain staleness. I have noticed that many of the older subs that have had few or no hubs added to them have suffered traffic crashes (see the 'successes'). Many of the hubs on these sites appear to have been de-indexed. Has the subdomain structure meant that 'evergreen hubs' in stale subs are vulnerable because Google ignores them, or simply de-indexes them to keep the SERPs fresh? Google has access to last-updated information in sitemaps and cache dates.
Let me answer a few questions, and this is not theory, it is fact. Google is measuring user interaction through Analytics, Google+ and the Chrome browser, and has modeled what a quality site is. They also measure your hub traffic because there is a Google+ button on every hub (you have to be logged in to see it), and they are measuring user interaction on your hubs.
If your bounce rate is high (mostly single pageviews), they assume, correctly, that your hub page is lacking, and if too many of the hubs on your account (subdomain) have a high bounce rate, they consider it a crappy site and the Panda slaps it into nowhere.
The freshness algo puts new posts higher in the search engines for a few days to measure user interaction. If your page meets the standards, it stays; if not, it drops. It usually takes 3 to 5 days on average for them to rate the page. It has nothing to do with updating the page.
Comments do not have anything to do with freshness, but they are a good sign to them that people are being "engaged" with your content, so they are a strong signal. Facebook likes, shares and comments help and show Google that people are becoming engaged. Retweets carry weight, but multiple tweets with no retweets are a negative signal.
Quality content can suck as far as Google is concerned, and the Panda is here to repossess your ego if you do not make sure you have good, related internal links to your other hubs or to other hubbers' hubs. The days of the one-page wonder are over. For all who write on loads of different, unrelated subjects where you cannot build internal links, you had better share some of your traffic with other hubbers by linking to them as a reference, or your accounts will soon go away.
Do you think Hubpages put that internal link tool in there for their health? This is a social site, so treat it as such and you can recover or never go into the panda box.
This makes a lot of sense. I've read a few of your hubs and SEO blog as well, and find the information to all be logical and informative. I have a question for you that I will ask of you on one of your hubs, but I am curious as to your thoughts regarding the lack of a Google +1 button on our hubs.
There used to be a Google +1 button, but it is no longer there. I believe HP staff said in another forum that it wasn't being used much so that is why it is gone, but it seems to me that Google +1 is in its infancy and as we learn more about its importance, its usage may swing upward - but that opportunity is no longer present on our hubs. Obviously that activity matters to Google. Not to mention, we can view +1 Reports in Webmaster Tools, additionally signifying its importance.
Thanks for reading my hubs and visiting the site; I'm looking forward to the question on the hub.
They probably know that Google is measuring visitor interaction with it (Google+) and are trying to hide some of their info, but I think it is a mistake.
We know that Google+ is part of their ranking signal and there are some great benefits to it, so I would suggest that they put it back. I had seen one just a couple of days ago on a hub; it may have been mine, but I can't remember.
Thanks again, hope this helps.
While what you say sounds good, it doesn't correlate to what I've seen here on HP or on other sites.
In and of itself, bounce rate is not an indicator of a site's quality. In fact, I would argue that it's the opposite in many cases. Google's definition of a bounce is someone who comes to a website and leaves after visiting only one page. That's not necessarily a bad thing.
I saw this analogy somewhere else: If someone Googles "what is 2+2" and my site ranks for "4" and adequately answers why that is so, I've answered the visitor's question and they leave. I can't even fathom why they would stick around to see why 3+3=6 and 4+4=8. Now, if you look at bounce rate together with time on page, then that may be some type of quality indicator.
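To make the "bounce rate alone isn't a quality signal" point concrete, here's a toy Python sketch. The session tuples and the 60-second threshold are made up purely for illustration; this is not Google's actual math, just one way to separate "good" bounces (the page answered the question) from quick abandons:

```python
# Hypothetical session records: (pages_viewed, seconds_on_entry_page).

def bounce_rate(sessions):
    """Fraction of sessions that viewed only one page."""
    if not sessions:
        return 0.0
    return sum(1 for pages, _ in sessions if pages == 1) / len(sessions)

def satisfied_bounce_rate(sessions, min_dwell=60):
    """Among single-page sessions, the share that still spent a while
    reading -- arguably 'good' bounces where the page did its job."""
    bounces = [t for pages, t in sessions if pages == 1]
    if not bounces:
        return 0.0
    return sum(1 for t in bounces if t >= min_dwell) / len(bounces)

sessions = [(1, 5), (1, 120), (3, 300), (1, 90), (2, 45)]
print(bounce_rate(sessions))            # 3 of 5 sessions bounced -> 0.6
print(satisfied_bounce_rate(sessions))  # 2 of the 3 bounces dwelled >= 60s
```

A 60% bounce rate looks bad on its own, but two thirds of those bounces read for over a minute, which fits the 2+2 analogy above.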
Your definition of the "freshness factor" is nothing new. Historically, new sites got a "boost" by Google before being left to fend for themselves.
As to comments, let's go back to the 2+2 example. What could a visitor possibly have to say? "Hey, bud, great job! I never knew that 2+2=4. Can you amaze us with more majik?" I have hubs/pages that have retained their #1 position despite the lack of changes/comments/social markers/tweets and all of that stuff.
About your "one-page wonders" hypothesis, again, I don't see it in practice. I do concede that may have more to do with age and other factors, but I really don't think you can make a blanket statement like that.
It has something to do with quality and uniqueness. If content has quality and unique (but relevant) keywords, then it obviously gets a lot of traffic, especially when evergreen. People always search for everyday things, like crafts and weight loss.
I agree that freshness is weighted heavily now. One thing I like to look at is what types of keywords are driving traffic after these updates. I don't feel bad when I lose terms that weren't that relevant. For example, my hub on gifts for 8 year old girls was getting traffic for nine year old girls for a while. Losing that traffic seems like the right thing. However, losing traffic for 8 year old girl seems wrong.
There seems to be something in play around tail terms and plurals.
Google seems to be more aggressive at deriving dates now as well. Most studies I've seen show that fresher content enjoys higher CTRs in SERPs. That could contribute, but there are plenty of places where Google doesn't show dates. Have you ever seen a piece of content with the recipe microformat include a date in the SERPs? It seems like they have a set of criteria that determines if a date is appropriate...just not sure where they draw the line.
Just did this random search on google.com.
I typed in: recipe best chicken supreme.
These are the first set of pages to show. The 3rd, 4th and 5th positions all have dates, including two very old dates!
Sorry, I have no idea how to make the image bigger, but the dates are:
3rd: 3 Mar 2011
4th: 19 Nov 2001
5th: 14 Dec 2004
I did a search on a recipe Hub I published last week and the date (and all the other good recipe info) shows in the Google snippet.
I had the same problem with the invisibility of the small image size when uploading from my computer. Then I tried uploading it to my website first, then used that image link back to here and the image was much bigger.
So, I have to assume that you haven't raised any girls, yes? The difference between eight-year-old girls and nine-year-old girls is...NIL. I can understand how Google may not be able to understand the difference in its algorithms, but I do think you should be a bit more clear about the technical aspect. Well, at least IMHO.
So...Google controls the CTR? If I'm reading what you wrote correctly, you are saying that the newer the content, the higher the click-through rate. I'm confused, because that's one thing that Google (supposedly) doesn't control. So why would freshness be a factor all of a sudden? Are you saying that Google serves up different ads to the "fresher" content?
Confused in Connecticut
I think he is referring to click-through within search results, not ads. I have not personally seen the studies Paul has showing fresher content enjoying higher click-through-rates, but perhaps it is because fresher content is getting ranked higher, and therefore clicked on more often - just all supposition on my part though.
Plus, if you look at my comment a few above this one, searchers can also filter results based on time (i.e. any time, past 24 hours, past week, past month, etc.)
Yes, in my opinion. Comments not only show the spiders the articles are worthy due to interested readers, they also bring added SEO.
I'm certainly not the expert here, but I have seen comments come in to older content, and then the next day the traffic spikes up a bit.
So would updating hubs, the ones that can be updated, keep older hubs fresh and relevant? Or is the only way to stay fresh to keep publishing hubs often and consistently?
Updating hubs or getting comments is the only way to improve freshness of an individual hub. Publishing more hubs and back linking improves your freshness in this community and the relevance of your hubs.
Google "sees" content changes--the new version is different from the old version. Google doesn't understand "comments" directly, since there are no HTML tags that set off comments--it will just see the content and recognize changes.
Back linking from newer relevant hubs will give your hub the additional boost of relevance Google is looking for. Links from outside HubPages are even better, because they are an indicator of trust from an outside source.
I was trying to tell if Google changed the updated date based on comments. There are reports that it does, but I couldn't find a real example.
I think that they do look at more than the date on a Hub. I saw one Hub that was updated quite a bit, and it got a more recent date in the SERPs. Another hub was lightly updated, and its date remained the original published date after it was crawled.
Thanks for the info. I have noticed that when receiving a number of comments the traffic on the hub does improve. I often receive hub length comments.
I wonder if there is a percent change that they look for. I once heard someone say don't change more than 20% at a time, but they didn't back that up.
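Nobody here could back up that 20% figure, but if you want to measure how much of a hub an edit actually changed, here's a rough sketch using Python's difflib. The sample texts and any threshold you'd compare against are illustrative only:

```python
import difflib

def percent_changed(old_text, new_text):
    """Rough share of the page that differs between two versions,
    based on difflib's similarity ratio (1.0 means identical)."""
    ratio = difflib.SequenceMatcher(None, old_text, new_text).ratio()
    return (1.0 - ratio) * 100

old = "The quick brown fox jumps over the lazy dog."
new = "The quick brown fox leaps over the lazy dog."
print(round(percent_changed(old, new), 1))  # small tweak, low percentage
print(round(percent_changed(old, old), 1))  # identical versions -> 0.0
```

You could run this against saved copies of a hub before and after an edit to see whether you stayed under whatever limit you believe in.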
As far as keywords, I did not mind losing bug bites, because my hub is about bed bug bites, and should be for more specialized keywords. What I don't get is that I rank for a keyword for one hub, that I was trying to rank for a different hub. The other hub is more appropriate for that particular keyword so I don't know why the first one is getting ranked and not the one that was meant to be ranked for it.
@writeangled - thanks for posting that query. I'm not sure why I wasn't getting dates on recipe queries, but I can see them with yours.
I'm in the process of running my own freshness experiment with a Hub. Here's what I'm doing:
(1) I updated a time-sensitive Hub with relevant dates, links, and info. for some 2012 festivals and events.
(2) I submitted the URL through Google Webmaster Tools (fetch as Google) so my updated Hub gets crawled sooner.
(3) I will check to see when my updated Hub has been crawled by looking at the 'cached' information on the extended snippet in SERPs.
(4) Since dates don't always show in Google snippets, once the updated Hub has been crawled I will find the earliest 'time frame' from Google's 'show search tools' on the left of the SERPs. If it appears in a recent time frame like 'past 24 hours' or 'past week', then I know that Google considered my updates worthy enough for the Hub to be considered fresh or newer.
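For step (4), the time-frame filters in Google's search tools just add a tbs=qdr:* parameter to the query string, so you can build the filtered search URL yourself rather than clicking through the sidebar each time. A small sketch (the query string is a made-up example, and Google could change this parameter at any time):

```python
from urllib.parse import urlencode

# Mapping of human time frames to Google's qdr (query date range) codes.
FILTERS = {"day": "qdr:d", "week": "qdr:w", "month": "qdr:m", "year": "qdr:y"}

def filtered_search_url(query, timeframe):
    """Build a Google search URL restricted to a recent time frame."""
    params = urlencode({"q": query, "tbs": FILTERS[timeframe]})
    return "https://www.google.com/search?" + params

print(filtered_search_url("2012 festivals site:example.hubpages.com", "week"))
```

Opening the generated URL and seeing your updated Hub under 'past week' would be the same signal the experiment above is looking for.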
Hopefully I will have the results of my experiment in a few days.
But the fact that any searcher can easily filter results by time frame is yet another good reason to keep content fresh on Hubs where newness may be relevant.
I am still using Windows XP, but only when I need to as I run it using VM Ware Fusion on my Mac. To me it is still the most solid and reliable Windows OS ever made.
Whatever caused the drop this weekend seems to be unrelated to content freshness. I only started actively writing in January and one of my hubs that was averaging 17 hits a day went to NOTHING! My overall stats went from about 180 - 200 a day to the 60s.
I discovered that I had some scraped content showing on a site that Google had identified as malicious and I have no idea what to do about it, but I unpublished that hub temporarily. Since republishing a day later, it seems to be picking back up, but my Google stats are still lower than they were.
My traffic fell drastically and I just checked one of my hubs - Copied on 4 sites! I sent a mail to fanbox as they did remove previous copies. But, there's another site lucki13 that ranks high with my copied content. Go to that site and enter the whole URL of your hubs and I'm sure you'll find plenty. I found a few of mine
EDIT: It's another clone of HP search for your profile and all your hubs will be listed. Any idea what I need to do? DMCA?
Someone reported lucki13 in the forums a few weeks ago. It turns out it is a proxy server and not a copycat site. No need to worry or file DMCAs.
Thanks I remember reading about it. But, the previous proxies didn't pop up in search engines at least that's what I thought
Yeah, weird. I don't know enough about this stuff to comment intelligently, LOL. Speaking of weird, what's up with my previous comment being so long and narrow? Does it appear that way on your screen, too?
The only traffic I get is to older hubs. The reason I'm now writing very few new ones is that it seems impossible to get them ranked and to get traffic coming to them.
Many older pages were SEO-ed into prominence and that applies to Hubpages too, of course, which leverages its mega-site advantages.
Now that large-scale backlinking is a dead end, search results could get frozen without something to stir things up.
And I reckon that is one of the reasons for the freshness mania.
I don't think anyone is at a disadvantage writing on Hubpages these days but it is not the advantage that it used to be.
I was referring to CTR from the search results. Google derives a date that it displays next to the summary. It appears that the date impacts the CTR. Newer dates get higher CTRs.
Interesting discussion. I've been wondering in recent times if it might be better to have a relatively smaller account (50 - 150 hubs) which can be regularly updated, rather than a huge account of hundreds of hubs which are time consuming to keep up to date. Of course, Google probably draws different rules for different keywords and topics - info on Roman architecture won't change much, but info on digital technology can go out of date very quickly.
I actually wonder if HubPages should maybe consider being proactive with deleting low quality hubs that have languished for months/years without any attention given to them (editing, updating, comments etc)? Especially if nobody has signed into the account for ages?
We do have a process that closes old inactive accounts that mostly impacts the type of Hubs you describe.
I think one of the best ways to gauge what is successful is to see what the successful sites are doing. To this end, freshness is a factor, but HP has that covered simply because the overall site is continuously updated with fresh hubs. I don't think individual hubs would need to be individually refreshed as long as the subdomain (the author's entire account) was receiving fresh material. I'm sort of a noob on website structure and engineering, so if anyone knows this to be wrong, please educate me!
It seems to me that interaction with users might be a more important factor than freshness of content, at least when reviewing the most popular websites that keep people coming back again and again.
To this end, I'd love to see HP expand on the interactive aspects. We have comment modules, RSS feeds, and quizzes, which help keep readers engaged while they're on our pages, but these aren't really enough to keep the reader coming back specifically to a particular writer to see what's new - which is why people use social media so extensively. They get to contribute AND experience.
Meanwhile, forums detract from the user experience, in my opinion (I could be completely wrong, but I don't find forums to be especially user friendly for sites like HP, Google, Amazon - information can be hard to find and require a lot of time.) I wonder if there could be a way to have a status bar widget/blidget that updates like an RSS where writers could post personal nuggets similar to FB status updates, let readers ask questions, let authors click on a link they've recently updated to have it included in the blidget feed, and answer those questions - which could appear in real time within the right-hand sidebars of our hubs.
This would invite readers to develop a more personal connection with the authors they like, boost the social aspects of the site (this appears to be the next wave of rankability, in my opinion), contribute to one or more hub topics at a time, and possibly provide fresh material for ALL of our hubs at once if it can be tied into the updated status somehow. Authors wouldn't have to go into individual hubs to do updates if it could, because they'd be able to post a fresh status and have it appear on all their pages.
(Are you still reading, Paul E.?)
Ok, this is a bit out of the box and I don't know much about this stuff, but a thought occurred to me:
Could it be that the "normal" fluctuations are because of the way Google rearranges their indexes? Sort of like the way a computer defrag temporarily pulls chunks of data and holds it while it rearranges and rewrites those chunks of info to a hard drive... With all the data Google is handling, I could envision that process holding quite a bit of data in the ether and unavailable (or with skewed rankings because of all the shuffling) until it's placed back where it belongs.
I'm not highly technical, but this idea actually makes a lot of sense to me.
Over the months since Panda 1.0, I have seen some interesting fluctuations in my traffic, without my having done any single thing to any of my Hubs.
As one example, in recent weeks (I can look up the date, but I don't have it right now), my traffic fell by at least 50% - maybe more. It sat that way for well over a week. In the past 2-3 days, it has risen considerably. Today it is 3x what it was just a few days ago - not the kind of 10x surge that some have reported, but closer to what it was (and a bit above) a few weeks ago.
Mine's been sitting at 50% down since the end of May. I am hoping it will bounce back at some point, but who knows? I've done some tinkering around, but in the main I've been doing more stuff elsewhere.
Glad to hear that yours has recovered. It gives me some hope!
It's too soon to call it "recovered" ! But I am certainly encouraged that it has moved back up for these few days. My main point was - and is - that I have not done anything at all to my Hubs, and yet I have seen these fluctuations.
If my traffic had not gone back up, I would be leaning towards the "freshness" issue as the cause. I have been working on some new Hubs, not yet published, and was waiting to get them online before freshening the older ones - they do need that, to be sure.
But before I had done anything, poof! There may be some other explanation for what I have seen, but I do think jellygator's suggestion is worth considering. I have frequently wondered whether Google goes through a process over time of offering first one account then another one (with similar content) to see whether one gets better response than the other. That could be one explanation for some of the slight ups and downs I have seen. But I'm intrigued by JG's idea, and I'm going to keep it in mind going forward.
Google is all over the place at the moment. It's more turbulent than ever. I have not done anything major. I did take out some low performing hubs and also some that I thought might do better somewhere else. I have generally been concentrating my efforts elsewhere. I like the forums here though and come here quite a bit - sometimes I learn something, but other times I just use it like Facebook and see it as fun.
I think that is exactly right. Faced with five hundred pages on the same subject - ignoring the other 10,000 blatantly spammy ones - Google probably shuffles through them in terms of presenting them as search results and checks viewer responses.
So everyone gets a little taste of traffic. The ones with the best analytics - however it is measured - get the best ranking.
None of this is rocket science. It is exactly what we would do if trying to check audience responses to something. Try this, try that.
edit: Oh yeah. Before I get some self-styled expert shooting me down... how can Google measure it?
Well, they have the obvious analytics. But they also have Chrome, and the ability for a searcher to block further search results.
I think they can measure this stuff pretty well.
Good information and great reader information as well! I feel like we are trying to break a code with some of this stuff. At least we are all working together!
For some reason, I keep checking my hubpages earnings program amount every fifteen minutes. As if by some miracle the amount will triple before my eyes! Ugh... I guess I can go and buy myself that pencil sharpener I've always wanted, with the $2.50 I've made this month!
I just spent 3 weeks updating all my hubs, and I've seen a slight rise in traffic. Freshness factor maybe?
Anyway, I am still down 80% from my previous traffic highs.
I wish you, Mr Phoenix Lives, had used your normal username to make this post. I suspect you are a name we would have listened to.
As it is, I am still listening, because what you've said makes sense. At least, some of it does.
This is my normal user name for HubPages. This account is shared with my wife; she just rarely posts to the account, but she too is a professional and my partner.
I do not do commercial SEO, just our own stuff, so we stay out of the public eye and enjoy our privacy. We are not big names because we choose not to be, so you probably would not know us anyway.
Now, to make the rest of it make sense to you, let me ask you a few questions.
When you do a search and click on a page, if it has poorly written content, what do you do? You hit the back button and go to the next results, right?
What do you do if it is great content and there are no other internal related links where you can learn more from this profound writer? You hit the back button, right? You have no choice if you want more information, because you see no way to find more on the topic.
Now, if you land on a site with good content and they have other related links in the content that will help you, what do you do? You click the link and read more, correct? That is a second pageview. As far as Google is concerned, you satisfied "their" customer. If the customer did not go back to Google and do the same search, they are satisfied. Makes sense, right?
There is software that can tweet, but none that can retweet; there is Facebook software that can get you likes, but not shares or comments.
Don't think that Google has not modeled real social profiles either, they have. Chrome allows them to see this. This stops spammers in their tracks and the days of faking it are long gone. Everyone here should be celebrating and having a party over this. The playing field is now level.
I am sure Google takes into account user metrics when ranking a page. It also makes sense that the freshness factor is a way of giving every page with a decent on-page profile a chance to get into the top end of the SERPs where user interaction can be measured.
It is a big jump to assume that the important metric is bounce rate, however. There are plenty of things that could be measured including view duration.
Personally, I rarely visit 2 pages on a site if I am hunting for info. One is usually enough. A lot of other people probably do the same.
Just to chime in - I think thephoenixlives is saying that Google won't ding the Hub that got your one page view because it did in fact satisfy your information needs. And I believe he is saying that Google assumes that the page view was worthy since you didn't hit the back button and repeat the same search or select another link in SERPs.
ktrapp is correct. There are good bounces and bad bounces. If you hit the back button and change your search phrase, it is not counted against you. I am giving basics, because then you are getting into short clicks and long clicks, pogo-sticking and a lot of other detailed information that I can in no way address in a forum. But bounce rate and user "engagement" are critical. You may not lose rankings to the point of not getting any traffic, but you will never get to the top and stay there with bad site metrics any longer.
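For what it's worth, the good-bounce vs bad-bounce distinction described here can be sketched as a toy classifier: a searcher who backs out and clicks another result for the same query is pogo-sticking, while one who moves on to a new query was presumably satisfied. The event-log format is purely hypothetical, and real short-click/long-click analysis is far more involved:

```python
# Events are (query, action) tuples from one imagined search session.

def classify_session(events):
    """Return 'pogo-stick' if the searcher clicked more than one result
    for the same query (i.e. backed out and tried again), else 'satisfied'."""
    clicks_per_query = {}
    for query, action in events:
        if action == "click":
            clicks_per_query[query] = clicks_per_query.get(query, 0) + 1
            if clicks_per_query[query] > 1:
                return "pogo-stick"
    return "satisfied"

bad = [("chicken recipe", "click"), ("chicken recipe", "click")]
good = [("chicken recipe", "click"), ("chicken supreme recipe", "click")]
print(classify_session(bad))   # pogo-stick
print(classify_session(good))  # satisfied
```

The second session bounced off the first page too, but because the follow-up click was for a refined query, it would not count against the first page under this model.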
We know that Google measures CTR based on position and relation to keywords also in natural results, so when you get your metrics right, you can advance faster to the top.
Kinda like high school football. If they have an identical record, they start measuring other stats until they break the tie to advance to the district and then state championships.
This thread should be stickied so that everyone can read your words of wisdom again and again!
Internal linking seems to be of especial importance now.
After watching a webinar on it, I added internal in-text links to every single hub in a small niche subdomain and saw the traffic double overnight.
In the couple of weeks since I added those links, traffic has trebled.
It is well worth doing, though HP's link tool seems to miss out on some of the best hubs to link to when you know you don't have a hub of your own on a topic.
I never just accepted what they offered. Instead I kept a Hubpages search tab open and chose the hubs to link to very carefully.
This is important if I don't want to have to be editing broken links all the time - the link tool seems to promote new hubbers, most of whom never seem to stay.
Bounce rate and time on page has improved. At least it can be measured now, because people do really follow the links which gives Google something to work on.
I linked to several hubs on this (dead) account and have seen those hubs get some views again, clear referrals from the niche account.
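If you want to audit which hubs actually have internal links before doing an exercise like the one above, here's a quick sketch using the standard library's HTML parser. The hostname and HTML fragment are made-up examples:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count links pointing at your own subdomain vs elsewhere --
    a fast way to spot hubs with no internal links at all."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if host == self.own_host or href.startswith("/"):
            self.internal += 1
        elif host:
            self.external += 1

# Hypothetical hub fragment for illustration.
html = ('<p>See <a href="https://me.hubpages.com/hub/other">my other hub</a> '
        'and <a href="https://example.com/ref">a reference</a>.</p>')
counter = LinkCounter("me.hubpages.com")
counter.feed(html)
print(counter.internal, counter.external)  # 1 1
```

Running this over each hub's saved HTML gives a simple internal/external link tally, which is enough to find the hubs that would benefit most from hand-picked links.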
After Panda, I heard all this commotion about "authorship" regarding Google search... and it having something to do with duplicate content.
That said, post-panda & authorship introduction, I've been seeing a MUCH larger drop in traffic after a hub's been stolen than before panda and the authorship thing.
Maybe it's just me. I mean, the authorship thing has been cool in search engine CTR... but I thought it was supposed to help us out a bit with stolen content... it seems it's doing the opposite.
I think it's hard to say. I have had one hub stolen repeatedly, and I would estimate that the traffic drops due to duplication were just as bad pre and post Panda and authorship.
I just discovered today that it's been copied in part (3 full paragraphs) by a student who posted it on a blog for English class.
HubPages could stop that problem with a rel=canonical tag in each hub, since they do not allow duplicate content. Tie your Google+ account to your HubPages account and you should fix the issue. Just add a link from your Google+ profile to your hub author profile, and you're off to the races.
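For anyone who wants to check what a page actually declares, here's a small Python sketch that pulls the rel="canonical" URL out of a page's HTML. The example URL is hypothetical, and whether HubPages emits this tag is something you'd need to verify yourself by viewing a hub's source:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Find the rel="canonical" <link> in a page's HTML, which tells
    search engines which URL is the original version of the content."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Made-up page head for illustration.
page = ('<head><link rel="canonical" '
        'href="https://me.hubpages.com/hub/bed-bug-bites"/></head>')
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # the declared original URL
```

If a scraper copies the HTML wholesale, a canonical tag pointing back to the original is one signal (among many) that search engines can use to credit the source.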
I did do that some time ago (+ link on my HP profile page and HP profile link on my Google + page) and assuming I did it correctly it has not fixed the issue.
I have done this too. I did it correctly, but it still seems that, once copied, my content tanks until sometime after I get it removed. Last fall, one of my hubs was copied on over 30 sites. (One webmaster told me he bought it in a PLR pack.) Even though it was time consuming filing DMCAs for all those sites, it did seem to make the difference, as it has started to get a bit more traffic. Although, it's nowhere near where it was before it tanked.
KTrapp, you can check to see if you're doing the Google+ thing correctly by visiting Google's rich snippets tool:
Looks like you did it right.
Very cool tool melbel. Thanks for checking mine for me. I thought it was set up ok.
In the longer term, especially if authors keep sending out those DMCAs, the content thieves should realize that they are not getting anywhere stealing stuff. So theft should tail off.
In the short term though, it is obvious Google is failing to differentiate between the originators and the copiers.
In fact, Google seems to be determined to force us to police the net since it can't do it, itself.
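Since this thread keeps coming back to whether Google can tell originators from copiers, here's a rough illustration of one common duplicate-detection idea: comparing pages by their overlapping word "shingles." All the sample text is invented, and this is not Google's actual method, just the general technique:

```python
def shingles(text, k=5):
    """Set of overlapping k-word sequences ('shingles') from text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(original, candidate, k=5):
    """Fraction of the candidate's shingles that also appear in the
    original. A value near 1.0 suggests the candidate is a copy."""
    a, b = shingles(original, k), shingles(candidate, k)
    if not b:
        return 0.0
    return len(a & b) / len(b)

original = ("bed bug bites are itchy red welts that often appear "
            "in rows on exposed skin")
copy = ("bed bug bites are itchy red welts that often appear "
        "in rows on exposed skin overnight")
unrelated = "roman architecture made heavy use of the arch the vault and concrete"
print(round(overlap(original, copy), 2))       # close to 1.0
print(round(overlap(original, unrelated), 2))  # 0.0
```

Detecting the copy is the easy part; the hard part, as the posts above note, is knowing which version came first, which is where crawl dates, canonical tags, and authorship signals come in.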
I agree for now, but they have machine learning, so they will get better with time. Rome was not built in a day, and since HubPages was hit last June or July with Panda, they are still recovering their authority and trust. HubPages was full of spammers, and content was being ripped from other sites and placed here, or spun and placed here.
There are probably reasons for the lack of trust in hub authors.
I am also seeing the same pattern. Not on this account, because Google has rated every single copy out there higher than my hubs, despite my having Google authorship.
On my niche accounts, I can usually see immediately when a hub has been stolen through view drops, and they don't seem to recover even after the offending site has been removed from the SERPS, which is odd to say the least.
I have not linked those account to Google+, because I have still to see that it is worth it.
thephoenix, you're telling everything I've learned in the last three years, only more coherently.
I am of two minds about the rise of social factors as a minor metric for ranking content, but: +1 to all your posts. Several times over.
Going back to the original post, Google has put more emphasis on freshness as a ranking factor in some recent algorithm tweaks, but it's doing it intelligently: freshness still matters more on some topics (Justin Bieber) than others (Caligula).
As a case in point, I've got a stale article that pulls in 1100-1200 people a week, every week, year in, year out. Apart from adding a few new links and removing a few broken ones, I haven't updated it in 3 years. Steady traffic and clickouts seem to convince Google to rank it in about the same place from year to year. Freshness is a factor, but like most factors, the key factor for Google is, "What does a person searching for query X want? Does this page give it to them? Does it do a better job than most other pages out there?"
Just a theory, but freshness for domains is probably more important than freshness for individual articles. I haven't seen a change in pattern since I started writing online, nor with any of the updates.
I agree, some topics (particularly news of any sort) require pieces (singular articles on HP) that have new content added to them regularly. However, and I'm hopeful here, that theoretical article about Eleanor Roosevelt's favorite formal wear, from 1999, will still rank above any new, simply because it was so exhaustive and informative.
I've been busy lately, but I'd be in error if I didn't thank Greekgeek for sharing her experiences with online writing, especially comparing the various venues. I do hope you will stay on here at HP, for my own selfish reasons, and because you are so helpful to newer writers.
I agree with you 100%, you are spot on.
There is an assumption that subjects like history are unchanging. Academics would disagree. They only make careers with new ideas and data. Freshness matters with Caligula.
I believe it would be based on the rate of new information appearing on the subject, number of searches and social signals and probably a few other signals that affect it. Think Google trends. Hope this helps.
Will, I know that, but still, Justin Bieber fans want to know what happened yesterday, whereas classics majors consider an article written 3 years ago "yesterday."
well... I don't mean to contradict anybody in particular, but I have seen exactly zero *evidence* of social signals or user interaction affecting rankings or traffic.
and this statement is based on commercial SEO for 5 years across a dataset of 220 sites, 2 of which earn approx 750k GBP per month between them.
what I have seen is a great deal of speculation about *facts* and what Google are doing with their information from monitoring us.
So you are saying it is all in the backlinks? Nothing has changed for the last ten years?
primarily links yes. but I wouldn't exactly say nothing has changed, the type and quality of links needed has changed enormously in the last year, never mind ten.
what I am saying is that social and user signals seem to play no role, sites will rank just fine without, and sites that have bundles of social *votes* and thousands of users won't rank for toffee without decent links.
and as I said, I have seen no convincing evidence, just a lot of speculation based on Google's spin.
I don't know much about SEO anymore, but one of my hubs went viral on Facebook in April. This hub was nothing in Google search and now it's up near the top of the first page.
While many social signals are nofollowed or withheld from Google's spiders, Amit Singhal and Matt Cutts have both mentioned that Google has been using social signals over the past year or so. There have also been some references to social signals in Google updates and announcements over the past year (including some that have since been phased out, like site blocking data, but even those demonstrate how Google is trying to leverage social signals).
I have also seen the viral effect. When one of my articles went viral on Twitter, it raced to the front of Google's search results for the popular word "pinterest" and stayed there for about two weeks until the social buzz started to cool off.
So with all due respect, while I'm only an amateur in SEO (studying it since 2007, and before that watching how the web and methods for navigating the web have changed since I started building websites in 1993) -- I'm going to trust the Google engineers' opinions over yours about whether Google is using social signals in its algorithm.
I follow and pay attention to what Google says in its announcements and through its spokespundits. They are often misleading and play cards very close to the chest, but they do let some information leak out. (Not always on purpose!)
Greekgeek and Ktrapp, Do you recommend using Google + to promote hubs? If so do you just go to the site? I haven't really been using it because I tend to spend a lot of time there doing nothing.
Here's another of my crackpot theories. I'm not sharing my stuff. At all. If it isn't good enough for others to share, it ain't worth sharing. I found HP traffic / shares a little misleading in that respect - ooh look at this marvellous whatever, except out there in the more important world... it isn't marvellous.
So my own theory is.. I need to make it better and better. And get lucky of course.
edit: I DO use Pinterest. It's rubbish for traffic, particularly as I don't bake cakes. But it does give a small amount of feedback for images, which I need.
hey, of course everybody has to make their own minds up. I'm just sharing conclusions from the data we have, and it is also the common feeling amongst the majority of other pros I know.
this does not of course include (m)any "SEO"s who make their money by "social media marketing" because they, like Google, have a vested interest in muddying the waters.
and with regards to believing Google's public propaganda, feel free. many years ago I used to closely follow MC etc, until I realized the true nature of the information "shared"
- from their POV it's very much easier to herd the majority of the sheep in the direction you want them to go, than deal with everybody doing what is required to make their website rank properly.
this guy is in our private seo group (along with approx 200 other pros) and this is a pretty good summary IMO
http://searchenginewatch.com/article/21 … e-Rankings
Quoting from the page referenced:
"How much do social signals play into Google rankings?
Don't be daft, I haven't a clue."
So there you are.
At least he is an honest SEO.
The fact is no one knows much beyond the fact that Google wants to serve readers pages that they want to read. So produce a page that serves your readers well and Google will do its best to find it.
You can speculate about the signals that Google uses for ever.
I cut pages that readers react badly to as a matter of course, partly for the sake of avoiding Panda and partly because they will never make any money anyway.
I believe most are missing the point, as Greekgeek stated earlier. The social signals do help with SEO, and they do increase ranking. I believe the purpose is to put a page or post that is going viral up top to get some exposure and see if top bloggers write about it, giving natural links to the content if it is indeed worthy of a mention.
I have seen social signals cause links to my content on many occasions. There is no doubt that inbound links from authorities are the mainstay, but social is the main means of getting them naturally.
Manual link building will soon be a thing of the past. As the machine learning algo gets smarter, manual building of links will soon enough go the way of the penguin.
I am an old school SEO, have been since 2001, and Google and Matt C always warn before they act, usually a year to two years ahead of time. They did it with reciprocal links, they did it with article directories, and they will do it with unnatural links built by SEO companies (it has already started).
I read a report on SE Roundtable that stated 65% of SEO companies took a hit over the last year and they are working on the other 35%. They are tired of people manipulating search results and charging people to manipulate them.
Also, that group you speak of has no consensus either, no more than any other "private group". If there was a consensus, you guys would guarantee your SEO work, but you don't, and even go as far as to disclaim that no SEO can promise top positions.
Like I stated, inbounds will never be totally discounted because it is the basic infrastructure of the internet. But once the general internet population catches on to the holes, they will capitalize on it and Google will change the speed limits again so to speak.
I can promise you this as of right now: if we have equivalent inbound links, link profiles and on-page factors, and you do not have social signals as good as mine, I will beat you 100% of the time.
If social only accounts for 1% of the ranking signals, I only need a 0.1% edge to beat you if we are in a competitive market.
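The arithmetic behind this claim can be sketched with a toy weighted-sum ranking model. To be clear, this is an assumption for illustration only: the weights, signal names and scoring formula below are invented, not Google's actual algorithm. The point it demonstrates is that if two pages tie on every other signal, any nonzero weight on the remaining signal decides the order.

```python
# Hypothetical ranking model: score = weighted sum of normalized signals.
# Weights and signal values are made up; only the tie-breaking logic matters.
weights = {"links": 0.70, "onpage": 0.29, "social": 0.01}

page_a = {"links": 0.80, "onpage": 0.75, "social": 0.90}
page_b = {"links": 0.80, "onpage": 0.75, "social": 0.10}  # identical except social

def score(page):
    return sum(weights[k] * page[k] for k in weights)

# Even a 1%-weighted signal decides the ranking when everything else is equal.
assert score(page_a) > score(page_b)
print(score(page_a), score(page_b))
```

In a competitive niche where the big signals are saturated for everyone on page one, this is why a small residual signal could plausibly act as the tiebreaker.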
So, please keep thinking it does not count. You give me all the edge I need by ignoring fact and opting for fiction.
SEO is like a car, take the keys away and it is useless, give the keys back and remove the fuel and it is still worthless, give the fuel back and remove a tire and you may go for a while, but not long.
Isolating just one single factor will get you beat by guys like me who take heed of MC's "Propaganda".
You also need to watch Matt's body language, he may not disclose or tell the whole truth. Like at around the 2 min mark of this video http://www.youtube.com/watch?v=Rk4qgQdp2UA
He says you get the phrase once or twice and it is ok or even good, but if you keep going they make it hurt a "Bit"? Watch his hands! That tells the real story.
This is not rocket science, it is common sense and being able to discern between reality and B.S. to put it plainly.
Panda is about bounce rates and user metrics, period. This is how they determine how crappy a site is for the user; not necessarily the content, but the site as a whole. How user friendly is it? Bounce rate, pageviews, time on page, time on site, long clicks, short clicks, pogo sticking and much more.
Get your bounce rates under control and you are good with panda.
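For readers unfamiliar with these metrics, here is a minimal Python sketch of how the engagement numbers discussed above (bounce rate, pages per session, time on site) are derived from raw session data. The session data is invented for illustration, and nobody outside Google knows how, or even whether, Panda actually weighs these numbers.

```python
# Each session is a list of (page, seconds_on_page) tuples for one visitor
# to a hypothetical subdomain. A "bounce" is a single-pageview session.
sessions = [
    [("hub-a", 12)],                                 # one page viewed: a bounce
    [("hub-a", 95), ("hub-b", 40)],                  # two pages viewed
    [("hub-c", 8)],                                  # quick exit: a bounce
    [("hub-b", 180), ("hub-a", 60), ("hub-c", 30)],  # three pages viewed
]

bounces = sum(1 for s in sessions if len(s) == 1)
bounce_rate = bounces / len(sessions)
pages_per_session = sum(len(s) for s in sessions) / len(sessions)
avg_time_on_site = sum(t for s in sessions for _, t in s) / len(sessions)

print(f"bounce rate:       {bounce_rate:.0%}")        # 50%
print(f"pages per session: {pages_per_session:.2f}")  # 1.75
print(f"avg time on site:  {avg_time_on_site:.1f}s")  # 106.3s
```

Note that all three numbers are site-level aggregates: a single strong page cannot fix them if the rest of the account drags the averages down, which is consistent with the "whole site" framing above.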
Why did the panda slap article directories? Because they were not user friendly. EzineArticles is still suffering because there is no good internal linking or navigation. That means fewer pageviews, and that is why they are still in the tank.
HubPages came out of it because the move to subdomains isolated the problems, and that is why individual accounts get hit now, saving their main domain and the ones who build user-friendly hub accounts.
This thread is not, nor was it started about SEO in general, it was over SEO specifics for current problems that hub writers were having with their accounts. The information and thread is to help hubbers that are having specific problems.
Getting into a deep analytical discussion in this area is not productive for most here, who simply want their traffic back. Their traffic did not drop because their backlinks were discounted; they were hit by Panda user-metrics problems.
If they are hit by panda, no amount of inbounds, no matter how good they are will bring them out until they get their site metrics under control.
no denying social can get you real links, and enough people say they have seen rankings from "going viral" to at least have to consider the possibility. but without those real links, you won't rank for long.
and we may not guarantee #1 rankings, but we deliver enough of them to have some idea what I'm talking about
I have no doubt you and/or your company are good at what you do with the numbers you posted earlier. You don't get numbers like that being a rookie. I know, because I do very similar numbers, I just do them for myself. I am not a commercial SEO, I don't do SEO for hire, only my own. I used to do it commercially, but got tired of others trying to dictate my job!
My main point was that if they ( authors on hubpages) are hit by panda, inbounds are not the solution. Links are, just not inbound.
They need to get out of panda before they worry about link building.
with regards to linkbuilding for hubpages, moving forwards it looks as though they have severely messed with the way anchor text works with penguin and the follow on tweaks.
general consensus seems to be that "brand" links can't hurt you, and will probably only help things, so pointing decent links to your profile with your variations of subdomain as anchors is probably the way forwards?
it should be a fairly easy test of how Google sees your site (subdomain) to check to see if you are page 1 for your subdomain url or the individual words if appropriate.
i.e. in our case it's "seo ibiza" and the hubpages profile is indeed on page 1 (as part of a reputation management serp cleanup project to remove a certain scottish seo hubber from the results, but that's immaterial)
so that should really imply that if google sees your "homepage" as rankable for its prime keywords, inner pages (hubs) should be similar unless they have page penalties from previous linking efforts?
just thinking out loud, but would seem to make sense?
and another example
https://www.google.com/search?q=sunforg … amp;num=10
New hubbers will see hit or miss traffic on their subdomain, depending on whether or not they wrote on a popular topic, or had prior SEO knowledge.
HP is still a powerful platform.
Unfortunately, once your subdomain is 'panda'd' or Penguin'd', there is no advice from HP on how to move on.
Yet everything HP has done to the site is proving to be to our advantage, albeit after a period of time.
The interlinking of hubs was treated with howls of derision, yet those who followed that advice suffered less when Panda/Penguin hit? Am I correct?
Just asking because I don't have the data to know, one way or the other. I did interlink some orphan hubs throughout the site, but very few.
Ended up having to remove most of those links, as hubbers left.
I really think thephoenixlives should be listened to, because his description of interlinking seems the most plausible of all the possible things Google might have marked us down for - those of us who have lost traffic, that is.
To me, it is like a breath of fresh air hearing an SEO expert show us another way that doesn't involve mindless backlinking, which most of us can't be bothered with anyway.
I have no doubt backlinking still works, but writers really can't be arsed with it! Too much work when all we want to do is write!
The take home message I got from all this is that Google judges 'quality' by user engagement - time the average user spends reading the page and other pages on the site - so adding anything to keep them there for longer will help: video, good interesting stuff etc.
This article is very interesting - though I believe the facebook associations are not causal.
http://blog.searchmetrics.com/us/2012/0 … tors-2012/
Not only time on the page, but multiple pageviews is also a factor. Google looks at your subdomain (http://janderson99.hubpages.com/) as a complete website.
If every person that visits your "site" only views one page, it will not get traffic long from Google because it assumes algorithmically that it is a poor site because nobody ever visits a second page.
They want you building good "sites", not just a good page. Understand? To Google, a "page" is not credible, but a website with a lot of pageviews and a low bounce rate is. It is also good to add video and other media to help keep people on a page longer, because it is a sliding scale depending on the other sites that rank on the first page.
The ones that have the best engagement records get better positions overall.
I don't believe that for one second, and none of my websites backs up that theory.
If someone looks up "Abraham Lincoln" on Wikipedia and learns all about the man, why oh why oh why would they then go searching for something else? They wouldn't. If I looked up "how to change a tire" or "how to bake chocolate chip cookies," I would get the information I need and leave. There would be no reason for me to stick around. But that doesn't mean there was a problem with the site--quite the contrary. The site fulfilled its function perfectly.
Now if I was a new mommy looking for parenting tips or I had diabetes and was searching Mayo Clinic or Joslin for info, that might be a different story. But I don't believe Google has a differentiated algorithm for those types of searches.
I don't care if you believe it or not. You will never achieve a 0% bounce rate. If you had taken the time to read my other posts in this thread, you would see I covered short clicks, long clicks and pogo sticking. This is a little more advanced and I am trying to keep it simple.
Not all pages totally fulfill a query, or they may raise more questions for the searcher. So go ahead and keep your one page wonders, and I will beat you out every time, because I also answer the questions that may be asked afterwards, getting more pageviews, and therefore more authority and better positions.
Now, as far as Abraham goes, why does that Wikipedia page have internal links and other resources? That page has hundreds of internal links and hundreds of outbound resource links. Hmmmmm?????
I bet there are over 1,000, maybe 10 thousand, different recipes for chocolate cake. What makes you think they will go with yours? Are you or your cake that special? Only in your own mind. What if it is someone who cannot use the flour you suggest and they have to look for an alternative?
Changing a tire: can you tell me how to change a tire on a '69 Triumph Spitfire? Come on, right off the top of your head, can you? Is it the same process as an '89 to '94 Subaru Legacy? Is the tire in the same location? Is the jack the same as every other jack?
The point of my response is that you could have so much more. If you had a "change a tire" page and then covered more details on other pages, like brand-specific pages, and linked to them from that page, you would not have to worry about bounce rates. You get more pageviews, which means more chances for revenue, and if you get multiple pageviews, your site increases in both trust and authority, which brings better rankings. Every pageview builds trust and gives you authority in the visitor's eyes.
This makes more money for you, gives your site more clout with Google and then they dump more traffic on you because you have proven you will take care of "Their" customer.
You can choose to believe anything you want; I covered earlier how most will not believe and will remain at their same level.
No one here is forcing you to do anything or believe anything.
My, aren't we a little bit defensive.
You can ramble all you want, but I am talking from what I have seen across many sites, and I'm not referencing HP subdomains. I have seen absolutely zero difference in the results between pages with high bounce rates and those with low ones. Zero difference. I actually haven't even seen a difference between those with a longer time on page, although common sense says that might be a "quality" factor or whatever you want to call it. This isn't a matter of me believing "what I want," it's what I've seen.
A high bounce rate along with other unfavorable metrics MIGHT send out the wrong signal to the search engines, IDK. But I do know that by itself, it does not impact webpages. My top hub has spent the better part of the past few years in the #1 spot for its keyword. Occasionally it fluctuates to #2 or #3, but it always comes back fairly quickly. It's got a 94.35% bounce rate. Average time on page is over 5 minutes. I can't really do better than #1 no matter how low I get that darn bounce rate down.
Can I tell you how to change a tire on a 69 Triumph Spitfire? No. But if I found a webpage that told me how to do it, I doubt I'd keep browsing to find out how to do something else. I'd probably just get up and change the darn tire.
This matches my experience. If read time is good you can be pretty sure all the user metrics that matter are good.
Bounce rate is probably the least significant.
I am not defensive one bit. You can do as you please, but since you are such a hotshot, why not tell me the main keyword, let me put up a hub, and we will see how long you stick at that position? You game?
The truth is, apart from this strange fixation on bounce rate what you have said is stuff no sensible person would disagree with.
Maybe the bounce rate bone is something to play with on your own.
Sure. The keyword is (drumroll please)....."SEO guru wannabes."
That's what I thought....you're a troll!
Also, how many pages are on your site and how many have over a 50% bounce rate?
Which site are you talking about? I was talking about what I've seen over many sites -- and not all of them are my own. I just had a conversation about this maybe two weeks ago with someone else who shared his experience with the metrics for his sites, and he's got way more than I do.
Anyhoo, one of my sites has roughly 50 pages. Every single page has a bounce rate higher than 50%. The only "pages" (which aren't even real pages) less than 50% are the "search" results where someone searched a category or keyword. As I said...different type of user, different type of behavior. Seriously -- there's a huge difference between someone looking for "who was the 30th president of the United States" vs "history of the US presidency."
No, I was right, you're a troll and a liar. It is people like you that cause people not to help others.
I had already stated in this thread SEVERAL times about short clicks, long clicks and pogo sticking. Go look that up and maybe you will figure it out, because i have figured out that you are indeed a troll.
I understand the difference, stated that this was for people not into SEO for the most part, and you want to mince words? You're a troll. Go away!
"This article is very interesting - though I believe the facebook associations are not causal."
http://blog.searchmetrics.com/us/2012/0 … tors-2012/
yes, all these studies show good correlation between FB visibility and rankings, however that's where it breaks down IMO: correlation is NOT causation.
if a site ranks well and is visible, then if it's good it's likely to get more FB attention than one that's on page 12 that nobody ever sees.
certainly in many small business searches we work on the top sites have been there years, (many are shite) and have virtually no social presence, and many hundreds of sites below them have been spammed all over every SM outlet you can think of.
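The correlation-without-causation point above can be demonstrated with a small simulation. Everything here is hypothetical: a hidden "page quality" factor independently drives both a page's ranking score and its Facebook shares, so the two observed variables correlate strongly even though, in this model, shares have zero causal effect on rankings.

```python
import random

random.seed(42)  # deterministic run for the example

# Hidden confounder: intrinsic page quality, which nobody measures directly.
quality = [random.random() for _ in range(500)]

# Both observed variables depend on quality plus independent noise;
# neither one influences the other.
rank_score = [q + random.gauss(0, 0.1) for q in quality]
fb_shares  = [q + random.gauss(0, 0.1) for q in quality]

def pearson(x, y):
    """Pearson correlation coefficient, computed by hand."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(rank_score, fb_shares)
print(f"correlation between shares and ranking: {r:.2f}")  # high, by construction
```

A correlation study on data like this would report a strong link between social shares and rankings, which is exactly why the searchmetrics-style correlation numbers cannot by themselves show that social signals are a ranking factor.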
with regards to user metrics, my own best performing affiliate (test) site converts (affiliate link click) 52% of arrivals - is this a bad bounce rate?
it's been #1 or 2 for its travel-related terms for 3+ years and never seems to suffer from algo abuse despite zero social presence
They do not have to listen, but what do they have to lose at this point? Try it and fail? They are already in the can, so what does it hurt to try my advice? They stand to lose nothing and have everything to gain.
The reality is I am willing to help the ones who want help. I have offered to help hundreds in the past, but so far, only about 8 have taken the advice.
All but one of the 8 have reached great traffic and make in excess of 6 figures a year now, most within a year, the last one (no 8) is on their way to six figures as we speak. ( yes, you can do that much on hubs)
Most will not listen; it is human nature. I don't have stats on HubPages, but I practice what I preach on my own sites. The only reason I even started on HubPages was because my wife bought a huge bean bag and wanted to do a review on it, and we did not have a site that was themed correctly for it, so she wrote her review on her beloved beanbag and we pushed it up.
This one hub does not make a lot, but it makes about 1200 to 1500 a year. As I was building the dozen or so links to it, I became a little curious about them and started reading the forums and realized that many were having panda problems. So I figured I would chime in and tell people how to fix them.
Your subdomain is the same as a wordpress subdomain or a regular domain to Google, so the solution was obvious.
I have put several blogs into panda and brought them out multiple times testing this. I know it works and know the tolerances because of extensive testing, bringing them in and out of panda just to see how many times I could do it and narrow down the major factors.
Quality content is important, but you do not have to write like a pro from the AP to make a lot of money, and as you said, Hubs are powerful and a great platform.
They have some great writers here, much better than myself and it would be a total shame for the world not to see their writing compared to a lot of the trash online.
I will continue to share my experiences and tips here and on my blog for anyone who is interested in learning.
I have been blessed with my skills and now it is time to share the truth with the ones who will listen and want a better life.
To all the great writers out there on hubpages, don't waste your skills by ignoring, you can help make so many others lives so much better with your skills and knowledge, just try it, you will be amazed at the possibilities.
Are you saying people should stop thinking of their Hubs as part of HubPages, regard their sub-domain as a standalone blog, and act accordingly?
Another question - when you refer to "internal linking", are you referring to internal linking within your own sub-domain or linking to other Hubbers' Hubs?
No, but you control your content if it is within your hub. You will get more followers, you can educate them on what you are an expert in, but it is 100% OK to link to other hubs.
They put that internal linking tool in there for a reason, you just have to check them on a regular basis to make sure the links are not broken.
Google just changed the way they see internal links. Now, a link from your subdomain to another subdomain is considered an internal link. So linking to other hubbers helps both you and them. It is 100% safe and a good idea if you do not have related hubs to link to.
I believe hubpages knows this because they would not have put the internal link tool in otherwise.
So can one sub pull down another sub, because of links or because they are sub-domains and so part of hubpages.com? HP has an automated system where 'related' links are generated that are not controlled by the author. There is also concern that poor quality in some subs will taint the ranking of other quality subs. How much are we 'all in this together'?
I was not aware that they could auto link from within your content to another hub. All I have seen on mine is the ability to link to other hubs. But to answer your question, yes, linking to a bad neighbor can hurt yours. This is why I suggested that you write your own content.
Google just announced that they are working on a "disavow" feature in Webmaster Tools so that you can refuse credit for links pointing to your site. That suggests that negative SEO, as it is called, is now possible. It has always been the case that if you link to a bad site, you could be taken down also, but there is more.
Panda is not technically a punishment. It is not saying that your site is doing something wrong, but more on the lines of you are not doing enough yet. So it is not a penalty.
Hidden text, keyword stuffing, spun content syndicated to get backlinks and such are things that make a "Bad" site. I have never had trouble with a site linking to a site in the grips of panda, and I have tested this extensively.
As long as their content does not look spammy or spun, you should have no problems at all linking to other hubs.
There is little room for blackhat on hubs from what I have seen other than inbound links that would not be totally obvious.
I really do not see linking to 99% of hubs as a risk at all. But in theory, it is possible.
"I was not aware that they could auto link from within your content to another hub"
Just look at the 6 link boxes at the bottom of the page = all generated by HP - rarely the author's other pages.
Discover What Other People Are Reading
What about the tainting caused from sub to sub simply by being part of the mother URL hubpages.com?
By the way thanks for your viewpoint -its great!
One other thing, if your site falls and you suspect that is a problem, just remove them and it should come back within a couple weeks.
Hey what happened to your fabulous SEO hubs on this topic - I was just going to read through them in detail and poof they're gone???
I've also heard that a high bounce rate can hurt you. Any ideas on lowering the number? I've been studying both my dwell time and bounce rates across hubs and it's slightly better for me with the new design, but am wondering if there is anything I can do in formatting my hub or in writing to lower my bounce rate.
I do shove the picture to the top to push that right hand ad down my page. I personally don't like that ad at the top... to me it makes the site look too full of ads. I ran a test to see how my audience responded... positively. I'm just looking for more ways other than a picture at the top, lol!
Another thing I've heard (not great at SEO or anything, just something I heard), is if people regularly click on links you put in your articles/hubs, regardless of where they land (within your subdomain, another subdomain, a completely different site), then Google regards your page as a good source of information. I think I heard this on Squidoo (I don't really remember, like I said, I haven't really done any serious SEO in years.) So even if people click off your site by clicking a link you have in the body of an article, this is better than another type of bounce.
Just make sure your first paragraph is very short, then it can't appear.
Use of outbound links to quality sites has always been good practice. Anything that is good for the reader is good practice.
Write a page with genuinely good links and it can get bookmarked and repeat visits.
How would Google measure a "retweet"? Twitter gave Firehose access to Bing in early 2012 and didn't renew with G. I don't think G would be able to measure that signal.
I would love to know how?
keep in mind there are over 250,000,000 tweets per day ...
Same deal with FB - G and FB aren't playing nice anymore - both Twitter and Facebook backed out of advanced G social analytics inclusion.
G certainly built social models to work off - but real -time access to those major players is limited. I don't think those social signals are directly causal - the increased visibility just leads to increases in the factors G can access and rank.
and similar to seo ibiza, I have older sites with little to no effective social integration (target demo doesn't use social) that smash sites in the SERPs with great "social" (definitely manufactured, Fiverr etc) activity.
greetings Mr Sunforged ..have enjoyed your hubs for a long while now, nice to cross paths finally.
and new panda refresh yesterday http://searchengineland.com/official-go … 5th-125945
Hmmm...and it says in there;
"Social media signals show extremely high correlation: social signals from Facebook, Twitter and Google+ are frequently associated with good rankings in Google's index. This is interesting in particular for the UK, which hasn't had such a strong correlation with social signals up to this point."
which kinda backs up what phoenix said.
He is probably banned from the forums now because he made a personal attack, but he was right on what he said before that, no?
Sorry to see all this nastiness on what was an extremely interesting thread.
And sorry also he removed his hubs on the topic, because they made perfect sense to me and I would have referred back to them. Wish I had copied them to notebook now!
nastiness? did I miss it? he was a bit excited but I wouldn't call it nastiness
with regards to his hubs, they're all still here
https://www.google.com/search?q=site%3A … amp;num=10
with regards to the content and opinions, and that's all they are, as I said before it's entirely up to you what you believe.
I'm only sharing this information to try and save people doing loads of work they think is going to help, and in my experience probably won't help very much.
as a successful pro SEO company we achieve all the results we ever need without ever thinking about doing social. many clients decide they are going to anyway and give up when they realise how much time they have spent and compare to their analytics stats results.
I seem to recall he called someone a troll and a liar amongst other things. Unfortunately he picked someone who's very well liked on HP so I can imagine several people probably reported him. A first ban is usually only temporary (for a few days). In light of that, removing all his Hubs seems like a childish thing to do - unless he has been here before under a different name, and has a record of bans under his alias already.
So as long as one insults a hubber who's not well liked the "insulter" doesn't take much of a chance of getting banned? Ah, this explains a lot. So far, no one I've ever reported has been banned, at least to my knowledge.
You're reading too much into it, I think! I'm just making a simple observation.
If someone insults a good friend, what do you do? Protest.
If someone insults a stranger, are you as strongly motivated to protest? I don't think so.
No, I reported an "elite" for insulting me numerous times in one post. Nothing was done because they are protected, while I was "accidentally banned" for a month until they relented when I asked for an official reason for my being banned.
This is why I don't like the moderators to have anonymity. You can say what you want to about the process, but the bans are NOT handed out equally here. But hey, it's HP's game, right or wrong. They are not required to show equality to all members.
Large enough to learn what respect for other people means. But then, this could only be a few people. Numbers have nothing to do with truth, at least in my opinion.
I only ask because extreme sensitivity to nuances of equal treatment seems to be part of the big family thing.
I nearly starved living with my first serious girlfriend. If there was more food on my plate than hers she went crazy. The fact that there was always more than she could eat on hers didn't change the issue. The fact that she was tiny and weighed only two thirds of what I weighed meant nothing.
I did a lot of filling up on bread and butter after meals just to keep the peace.
She was the eldest so no doubt was obliged to sacrifice and be caring when she didn't much feel like it.
Another friend went crazy if I made coffee and the mug I gave him seemed in any way smaller (even a mouthful smaller) than mine. He was the youngest and seemed to come from a family where jungle law ruled. The biggest ate first.
Anyway, I coped with these peculiarities and I suppose I can continue to do so.
Actually, I only had one brother and one sister, Will. But thanks for the psychological comparison between desiring equality in nutrition and wishing for some sort of parity for all members of the HP community.
Fortunately for me, I never had to worry about anything to eat, or wear, or anything else as a child or as an adult. Growing up on a farm has its perks, with food being the very least of my worries.
But thanks for your crude attempts at analyzing the reasons I wish for honesty, justice, and fair treatment for everyone here in the community. The simple reasons are often the most overlooked by those who have experienced other than ideal situations as a child. I suppose this explains why you are like you are.
I've written on several sites before coming to HP and had quite a bit of success on all of them. Unfortunately, they fell into the same type of rut HP seems to be heading for with showing favoritism to their chosen few over the good of their writers.
Yes, I do speak out against things I see which appear to be unjust because I hate to see this great community fall into the same type of dead-end routine as those other writing sites. I'm not only speaking for myself, however. People contact me with questions about HP. People who are afraid their livelihood--what little they now have here now--is being destroyed by the late actions of the powers-that-be. Most are afraid to stir the pot because of the retribution they may experience if they do so. They know I will ask tough questions without fear of retaliation. And someone has to ask them, or we are doomed as a community.
But one more thing. None of those other sites tried to hide the identity of their moderators. Unlike HP, one knew exactly who they were dealing with in all instances of communication with the site. There was no sense of communicating with nameless, faceless judges who had nothing to lose had they taken a position which turned out to be wrong. They had to man up to their mistakes when they made them, with no coat tails to hide behind. I at least respected them for that.
And no, I do not wish to be a martyr, but the alternative is even more unpleasant to consider. Perhaps YOU could fill in for me as the voice of reason when controversy arises when I'm gone, Will?
Seems true, I thought it wasn't this way though. It's the sad truth
I have no reason to be untruthful about anything I've stated, Lobo. But truth in the online business world doesn't mean the same thing as it does in a face-to-face transaction. You can't look people in the eyes when you're talking to them on a computer. But even worse, they don't have to look you in the eyes either. More's the pity.
Ya, I've begun writing on Wizzley and writing my first lens on squidoo to give it a try. Wizzley looks fine. Just getting a few articles up on both these sites and will return to HP. I love HP, but I guess I should diversify a bit - The staff say it too
Yes, by all means diversify, Lobo! The shelf life of many writing sites--at least the length of time before they fall afoul of the search engines--is not very long in my experience. Some get greedy, while others forget those who made them successful in the first place. Myself? I'm weary of the whole thing at this point. It's not worth trying anywhere else anymore. I'd rather just write for my own pleasure without jumping through the hoops anymore. I'm getting too old to jump anyway. But good luck on your endeavors. You seem like a nice person.
Thanks Randy You seem nice too and I like the new pic of the snake your trademark or whatever you call it I just began writing online, so might as well give it a go. Anyway nothing to lose
Thanks, Lobo. The snake is a version of the "ewbie" Mark was so kind to draw for me when HP prohibited using live link smilies. Another great decision from the top!
Yes, don't let content sites dictate what you wish to achieve with your writing. Start your own instead. I'm more into the creative aspect of writing at this point, as I'm weary of the content site mentality of online marketing with all of its pitfalls and frustrations.
Oh! It's not exactly a stick figure though I was here when you used to use a linked smiley too
Nope, not exactly a stick figure at all, Lobo. I suspect Mark is a bit leery of showing his true talent here. I do love his stick figures, though.
Mark is acquiring European habits- he knows all the slang already- and that means tortured artist, pale as a Shelley, living underground like a hobbit.
Or perhaps he just doesn't give a monkey's.
Same here, I read his hubs because of his stick figures more than anything
his hubs indicated a different view on things to his past writings anyway
http://www.google.com/search?q=%22jim%2 … n%22%20seo
Sunforged is the SEO I can believe in.
Since Google has access to Google+, but not Twitter and Facebook, they can only peruse the data there! And not everybody uses Google+.
The relationship of social interaction with relevance is murky.
The main hub I wanted to see again is 'not available on this server' or something similar, even though it is referenced in the cache.
The social factors mean nothing to me as I hate Facebook and Twitter and any kind of mass social interaction.
It was the other stuff he said about interlinking that was spot on, so I am not disagreeing with you, just agreeing with him.
You both have good points to make, and if anyone reading here carried out the advice given by both of you, their hubs/sites should fly high.
Nobody is arguing about interlinking; it has always been critical for the last 10 years, and we have written reams on it in other places.
This isn't a him-or-me thing, you know; it's about calm assessment of the facts and evidence you have seen. To me, however, this involves only minor interest in Google's version of events, because frankly Google is a bit of a joke. I could show you multiple examples of sites ranking due to links hidden on other sites by the web designers, in competitive big-money searches.
- massive spammy subdomain farms ranking in major one-word kw searches, due primarily to heavy sitewide linkspam from porn sites.
It's not as good as you think, nowhere NEAR.
It is worth stopping by Google's webmaster forums from time to time to listen to the wailing and gnashing of teeth (of those damned by SEO).
http://productforums.google.com/forum/# … ty$20links
A few quotes:
"We got a link spam penalty at April 01, 2012, because my pre-colleague outsourced the link building work, gaining hidden and spam links from GOV, EDU Links"
'we suspect we may have some kind of manual, or algorithmic penalty on the site. ...so contacted agency and asked them to remove all the links which had been built.'
'Last July 2011 I got a -50 penalty for using an SEO company that did me no favours when it came to linking...'
Yes, it was brutal for many companies, lots of whose clients didn't even know they were using those systems until they were hit or got the notification email. We had only 4 of our own test sites hit by Penguin, to varying degrees, and they were all quite different in optimization levels, techniques, and the extent of the hit. No WMT emails for any of our own or clients' sites.
It's hard to draw any conclusions from it all, really; several test sites we thought might or should have also been vulnerable were, and are, still just fine, several months later now.
the waters are very muddy.
I am an expert SEO adviser who has worked with multi-million-dollar companies and increased their traffic by hundreds of percentile points. My clients are completely satisfied with everything I do and regularly attend my web seminars (where I also sell a little bit of MLM).
I know the entire secrets of Google having worked in the industry for twenty five years, and can tell you that the nonsense spouted above is typical of the amateurs who populate this field.
That is why I say to you, HubPage citizens, be not tempted by the cheap quick fix and promise of easy traffic – that’s crazy talk.
Fortunately I was just passing and am able to offer my services, without actually offering them, because that would be self promotion.
And I don’t want the mods to get evil on my tush.
Although I have made my own fortune, and indeed that of anyone who happened to sign up for my assistance, I find myself at a loose end. It’s boring just lazing around on a yacht all day.
Time for some words I think.
Linking, social, signals, algorithm and analytics.
There. If that doesn’t prove something to you then I don’t know what will.
If you are tempted to visit any of my output, which you mustn’t because that is self promotion, you may find that most of it consists of rather excellent drawings of stickmen. I do this to put the SEO competitors off the scent.
Relax. You can be sure with me that a fortune is just around the corner.
lol. I'm not sure if this is directed at me or not, but (the 99% of) hubbers can't afford us, that much has always been clear.
I'm not on a yacht, but can hardly make myself do any SEO anymore either, tbh. I've gotten right into trading on forex, and funnily enough do find myself bored on the internet waiting for the markets to move.
Sorry if it seemed self-promotional, but I wasn't offering anything other than advice, and really, genuinely, was just trying to help.
Hey fella, I'm being tongue-in-cheek, not singling anyone out.
There just seemed to be a lot of experts on here... thought I might as well grab a bit of the action myself.
..meta robots canonical issues as well, innit
We all know that you are expert, Mark. It is just difficult to decide exactly what you are an expert on or in.
In England, you might be called an 'expert without portfolio'.
To hell w/G00gle.. I am using other analytics and not succumbing to their demands. There are many other ways to get traffic, like these start-up social media companies do.
Not everyone has a Toyota. I am no expert but I do see not so favorable changes for them.
Google Measures Quality via Analytics (fullstop)
Google demands good quality and points to spelling and grammar etc.
But I don't believe that the dumb bot has a spell-checker or grammar checker.
All that 'read out loud' stuff is basically bulls**t.
Basically, what G is saying is that good quality will lead to good analytics.
This is an association not a 'cause and effect' relationship.
Poor quality stuff can have good analytics (time on page, bounce rate, pages per visit etc), can rank well and appear high in the SERPS.
Good quality is determined by the multiple-user response as conveyed by the analytics data (time on page etc.). If the article is good, users will read more of it, link to it, want to read other related pages you have written, and refer it to their friends (all measured by analytics).
The analytics for a site or sub is also very important as G uses this in a similar way to score all the articles within a sub. If the SUB analytics are good each of the articles will get a good score.
The take home message from this is that hubbers should work on improving their analytics both for the individual articles and for their subs.
The point is that there are some things that you can do to improve the analytics without blindly trying to improve the 'quality of the contents' in terms of the text. Basically this is about engaging the user for longer, providing what they want and encouraging them to go to other pages within your sub (by having an action type link above the fold etc.)
It is more about smart page design than quality of the text.
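To make those engagement signals concrete, here is a back-of-the-envelope sketch in Python. The visit data is entirely made up (none of us can see Google's actual numbers); it just shows how bounce rate and pages-per-visit fall out of a simple visit log:

```python
def engagement_metrics(visits):
    """visits: list of page views per visit (hypothetical sample data).

    A "bounce" is a visit that viewed exactly one page.
    """
    total = len(visits)
    bounces = sum(1 for v in visits if v == 1)  # single-page visits
    return {
        "bounce_rate": round(100.0 * bounces / total, 1),   # percent
        "pages_per_visit": round(sum(visits) / total, 2),
    }

# Four hypothetical visits: two bounces, one 3-page and one 5-page visit.
print(engagement_metrics([1, 1, 3, 5]))
# → {'bounce_rate': 50.0, 'pages_per_visit': 2.5}
```

The point of the sketch: anything that turns a 1 in that list into a 2 (an internal link clicked above the fold, say) improves both numbers at once.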
Another important aspect is to make the title and description as they appear in the SERP say
Pick Me! Pick Me!
This thread has meandered all over the place, but it's a fascinating read with so many points of view about dropping traffic (if we're still on that). Here's my thinking, for what it's worth. I may have missed the odd post, so I hope I'm not repeating anything.
This is something I learned post Panda and it makes perfect sense to me, more so post Penguin. It's a trickle-down effect, especially for older articles that have aged nicely and have been linked to organically from hither and yon. If those sites linking to HP articles get de-ranked for whatever reason Google deemed necessary, something HP writers have no control over, it diminishes the value of those links to HP pages. It follows that HP articles' rankings drop along with the diminished link value. Organic backlinks are still important, but obviously from sites that are respected by the search engines.
Does that make sense? 'Tis late here.
It does, but more likely to be related to freshness (staleness!) and being swamped by competition rather than links being killed off or downgraded. The evergreens should be developing more organic links to replace the ones being killed off - Who Knows?
It's all knock-on effects, because less traffic to evergreen hubs will inevitably lead to fewer people linking to them, and a lot of people are running scared of linking post-Penguin.
With regard to the much-vaunted "freshness factor", I present this search:
https://www.google.com/search?q=google+ … amp;num=10
Please note the date of the #1 post, and the dates and quality of the sites beneath. It has been there since a few weeks after publication, and never once flinched.
It has had zero external link building of any sort applied to it.
That is all.
Looks like a classic case of keywords in title, url etc. which has suddenly popped up as being given extra weighting.
It's a little more than that. Also, do the Google properties and Mashable etc. below us not have similar?
It is easily explained, by us anyway. Have another search..
https://www.google.com/search?q=advance … amp;num=10
No one searches for those terms - you're the king of a mole hill - just stirring!
The second one is much more significant => has a PR3 and lots of links.
Why doesn't prelovac outrank you???? Is it 'seo' in your URL?
Not many people search for it compared to the broad terms; however, we are only targeting the "advanced" niche. The main point to note is that we are above all the best WP SEOs in the world,
and it is those specific advanced WP SEO techniques that are primarily responsible for a 4 year ranking above Google (lots of them) for their own terms.
If I had just published that post and forgotten about it, it wouldn't be there.
That's all I was getting at.
According to Google insights that term gets around 60 searches a month. It doesn't register at all on the Adwords external tool.
So, not a great example.
Google recently updated their search engine algo yet again with Panda update 3.8 on June 25th. According to the releases I read (Barry Schwartz on Search Engine Land), it affects "only" 1% of queries in the world.
The last update before this was June 8th and before that it was April 26. Does that help anyone here? I don't know! Sheesh!
"...Google said there were no updates to the algorithm or changes in the signals. This was simply a basic data refresh where they ran the algorithm again..."
Break out the Brasso, maybe even the spit and polish and dust down those hubs! Clean 'em up and update them a wee bit!
I also recommend trying a few new things. If you have old links from blogs or sites to your hubs that are not ranking well, try newer linkages. They will bring new eyes to your content, and Hub Pages will benefit overall.
I have drawn an info-graphic explaining how my SEO works and the result on my traffic...
Now my traffic is dying again. Time to go focus on other stuff and hope it looks better in a month or two. It has dipped below being worthwhile.
There are a few elements of the recent Google updates that could impact people here. The first is an emphasis on going after "tag spam". Those of you who heavily tag your hubs are at an increased risk of triggering an automated demotion. In my case, this was observed through data collected from outside projects. It made me take a closer look at my hub tagging and there were several where I had tagged in excess. Within 48 hours, my traffic popped back and hubs started ranking on the first page that hadn't been there since the first rollout of Panda. Here is my advice for recovery:
1. Clean up your tags to include no more than 2-5 of the best descriptive matches for your content.
2. Avoid generic one word tags that are too broad to offer any real descriptive value to the user.
3. Make sure your hubs don't contain anything that could be interpreted as "keyword stuffing".
4. Instead of multiple repetitions of the same keyword phrase, use synonyms, plurals, and variations that would likely appear in natural conversation.
Depending on what landed you on the hot seat to begin with, you should see a recovery in traffic within anywhere from a few days to a few weeks (Penguin and Panda usually run about once a month).
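As a rough illustration of the four cleanup steps above, here is a sketch in Python. The generic-word list, the tag limits, and the stuffing threshold are all my own invented examples, not anything HubPages or Google has published:

```python
def audit_tags(tags, body_text, max_tags=5, stuffing_threshold=10):
    """Rough audit of a hub's tags against the cleanup steps above.

    tags: list of tag strings; body_text: the hub's text.
    All thresholds and the generic-word list are assumptions.
    """
    generic = {"tips", "guide", "free", "best", "online", "stuff"}
    problems = []

    # Step 1: no more than a handful of tags.
    if len(tags) > max_tags:
        problems.append("too many tags (%d > %d)" % (len(tags), max_tags))

    # Step 2: flag broad one-word tags.
    for tag in tags:
        if " " not in tag and tag.lower() in generic:
            problems.append("generic one-word tag: '%s'" % tag)

    # Step 3: crude keyword-stuffing check — a tag phrase repeated
    # heavily in the body text.
    lowered = body_text.lower()
    for tag in tags:
        count = lowered.count(tag.lower())
        if count > stuffing_threshold:
            problems.append("possible stuffing: '%s' appears %d times"
                            % (tag, count))
    return problems

# Hypothetical hub: a repeated phrase plus a generic tag get flagged.
print(audit_tags(["power tumbling", "tips"], "power tumbling " * 12))
```

Step 4 (synonyms and natural variation) is exactly the part a script can't judge for you, which is rather the point.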
I thought tags on HubPages were only used internally and Google did not see them.
I've been told that too, but suspect it is not true, as it is not what the HubPages FAQ says.
From what I can tell, that conclusion was made by a few hubbers -- I don't remember what they based it on -- and spread around here on the forums as gospel. I have never been able to find much official info about tags from HP. They do mention very briefly in a Learning Center article that tags are used by search engines. I have never seen HP give an official word on tags, such as how many is too many, how many is not enough, and what exactly they are used for.
But then again, weren't tags removed from the new layout? I was told not to worry about them for this reason.
Tags are not indexed but... they play a part in deciding what other pages HP shows against your own.
Therefore it is possible that if you go for spammy non related tags you will get spammy non related pages - which might be a small minus sign.
Caveat. I am not an SEO expert. I don't make any money. I don't know what I am talking about. Etc.
So where do you get the information on how tags are used, as mentioned above Hubpages material say otherwise.
I made it up.
No seriously, I don't know. I thought I read it somewhere, on here. It makes sense to me but hey...
Everyone has an opinion.
I am inclined to believe Hubpages materials over rumors. And they say tags are used by search engines.
Fair enough. At least we can agree that spammy tags are bad news.
If you check the Google cache of a Hub, the tags area is not visible.
Take any one of your old hubs and Google its exact url like this:
You will notice the blank area where the tag cloud should be. It ain't there. This is Google's cache of the content of your hub, and it's what it bases its search index upon. [UPDATE: the new Hubpages layout has removed the "Tags" box from the sidebar of our hubs. I can't see tags displayed anywhere on any hub; can you? But even when that tags box was there, it was not showing in Google's cache, perhaps because Hubpages was displaying the tags list with a script that only triggered upon page-load by a web browser.]
More importantly, if you check the source code of a hub, there is NO meta keywords tag. None. Nada. So Google can't see tags there either.
Regardless, Google has said it doesn't use the meta keywords tag in its algorithm, and Bing only uses it as a spammer signal -- it ignores it unless you abuse it, in which case it may downrank your page.
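For anyone who wants to check a page's source themselves, here is a small sketch using only the Python standard library. The HTML snippet in the usage line is made up; point the same parser at any saved page source:

```python
from html.parser import HTMLParser

class MetaKeywordsFinder(HTMLParser):
    """Scans page source for a <meta name="keywords"> tag."""

    def __init__(self):
        super().__init__()
        self.keywords = None  # stays None if no meta keywords tag exists

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "keywords":
                self.keywords = d.get("content")

def find_meta_keywords(html_source):
    finder = MetaKeywordsFinder()
    finder.feed(html_source)
    return finder.keywords

# Invented example source: one page with the tag, one without.
print(find_meta_keywords('<head><meta name="keywords" content="a,b"></head>'))
# → a,b
print(find_meta_keywords('<head><title>x</title></head>'))
# → None
```

Run it against a hub's actual source and, as the poster says, you should get None.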
Based on what Google's showing us of crawled content, as far as I can tell, the ONLY impact Hubpages tags can have is indirect: they help Hubpages decide which "related hubs" your hub links to. Those "related hub" links are visible to search engines. Google likes to see links to relevant pages, so in that sense, tags can have a positive or negative impact depending on whether they help you hub link to related content.
This has always been my understanding, and the fact that tags aren't even visible on our Hubs now settles it. Google can't see the tags if they're not there.
This wouldn't be the first time that HubPages has updated the way they do something, but not updated the learning centre or FAQ to reflect it.
Ugh. Here I go again with related hubs (sorry everyone). I was hoping my hub on power tumbling/trampolining/gymnastics would be picking up views with the Olympics getting closer, but it's not at all. According to the numbers at the top of the hub, there are 69 hubs about gymnastics on HP. However, none of them are showing as related hubs, even though that's the category I chose for that hub. It seems that instead of using tags to find related hubs, the algo is taking the word "power" from the hub's title because the related hubs all have to do with electricity -- except for the two that are about bicycling. I don't know how it's picking those two on bicycling over the 69 gymnastics hubs. Just don't know what tag or keyword it's using to pull those. <<smh>>
Okay, I stand corrected. It's probably a long shot, but have you reported that as a bug?
I know for a while we were getting cars hubs no matter what our topic, but I'm getting more related hubs now.
(of course, category isn't the same thing as tag, so those 69 gymnastics hubs might not happen to use any of the tags you picked, but still, if you used tags like Olympics and gymnastics, you should be getting other hubs sharing those tags.)
Thanks for the idea. I have not reported it as a bug. Several weeks ago Marisa Wright suggested I go in and delete some tags that might have been confusing the algo. I did, and it did change some of the related hubs, but the new ones are not any more related than the old ones. I guess it is time to report it and see what happens. Thanks for your input.
The tags are used to determine related Hubs, but they're not the only criteria used. The title of the Hub is used, too.
I'm not sure whether headings and keyword density are part of it - but if they are, perhaps that's part of the problem, because you've used the word "power" 75 times in the Hub (according to the density checker I used), and in every single heading.
I know this sounds counter-intuitive, but if I were you, I'd remove the tag "power tumbling" from your tags (assuming it's there). The annoying thing about tags is that if you use a phrase as a tag, HubPages uses the individual words as well as the phrase.
75 times! LOL, that is an eye opener. That word is part of the name of the sport, but 75 times! No wonder all my related hubs are about electricity. I will keep tweaking and see what happens. Thanks again for your help.
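If you'd rather not trust a third-party density checker, a crude word-frequency count is only a few lines of Python. The sample sentence here is invented, purely to show the shape of the output:

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Return the most common words with their counts and percentage share."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return [(word, count, round(100.0 * count / total, 1))
            for word, count in counts.most_common(top_n)]

# Invented sample: "power" is 3 of 8 words, i.e. 37.5% density.
print(keyword_density("power tumbling is power training with power moves",
                      top_n=1))
# → [('power', 3, 37.5)]
```

A real checker would strip stop words ("is", "with") before counting, but even this crude version makes a 75-repetition problem obvious.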
My traffic is back as of Monday July 2. Hopefully it is more than temporary. Re tags, I usually put about 13, but am now reducing them and making sure none have one word.
What can you tell us about one-word tags vs. multiple-word tags? I have often wondered which, if either, was more effective. TIA
I recently read in some SEO literature to avoid one-word tags. It also said not to use a whole lot of tags, only those of great importance to the article. How is your traffic today?
Keep in mind, any "SEO literature" you may have read outside of HubPages is not referring to the same "tag" mechanism that HP uses.
When reading anything that uses the word "tag"
1. Check the date (it's probably outdated, or badly parroting an outdated article or concept)
2. Check if they are referring to meta-tags (probably outdated)
3. Check if they are referring to Wordpress tags (or another CMS that creates a "tag" page and related internal links)
As an earlier poster stated, the tag mechanism at HP ONLY serves as an internal organizational tool.
Since the tag cloud doesn't even appear on single hubs anymore, you can't even consider it a true navigational device. It's nearly irrelevant unless you are trying to carefully craft where you might appear as "suggested/related".
I wonder why they still ask for tags when starting a hub. I never noticed they had vanished. You are so knowledgeable. I wish I knew 1/20 of what you know. Continued success to you. Do you think traffic is on its way back to HP?
I don't know about HP's traffic. I don't have very much content here anymore to judge personally.
When I peek at Quantcast, the year graph looks like a saw blade. Which isn't necessarily terrible, normal fluctuations and all - but there has been a significant amount of new content added, multiple contests etc. Why isn't there an upturn relative to the new "increased quality" hubs?
Right now, the question is how significant a change might be caused by the design overhaul, and it's still too early to tell.
As Sunforged says, anything you read about tags outside HubPages is irrelevant - they're a different kind of tag.
I've actually done the reverse, and tended to avoid multiple-word tags, because of HubPages' annoying habit of using the individual words in multiple-word tags. So for instance, if I use "belly dance" as a tag, I'll get related Hubs about losing belly fat, because of that word belly.
I need you to tackle some of my issues. You are amazing.
I use 2-3 tags and few repeated terms. So I think this is far from the main issue.
The new Panda update 3.8, which dropped on June 26, could already be hurting some "over-optimized" hubs and articles.
Hopefully that is search engine traffic, but it could be traffic from hubbers who have commented in the past and have gotten a notification that there is a new comment. They may be checking in to see what the new comment says.
Wowee! I can't keep up. Maybe I should put in for a lobotomy to make room. I leave this thread to do something, and there's even more when I come back.
Copyright © 2020 HubPages Inc. and respective owners. Other product and company names shown may be trademarks of their respective owners. HubPages® is a registered Service Mark of HubPages, Inc. HubPages and Hubbers (authors) may earn revenue on this page based on affiliate relationships and advertisements with partners including Amazon, Google, and others.
HubPages Inc, a part of Maven Inc.