First off ... these are observations on *my* hubs, and your experience may well differ from mine. Secondly, as these are only observations, I'm not claiming that what I say has any bearing on the way things actually work. I'm just sharing my recently acquired data to see if others have had the same experience or something that differed ...
Now then ... I have a program that lets me see various ranking information for the top 10 Google results for any keyword phrase, including: page rank, backlinks, keyword in title, keyword in URL, etc.
To make my experiment resemble something scientific, I took several of my hubs that rank on Google's first page and tried to analyze what put them on the front page, and what put them in the particular spot where they landed.
All chosen hubs were written around the same time and had received about the same number of page hits. What I was interested in was what might have placed them on page one and what might have determined where on that page they ranked.
Page rank wasn't a deciding factor: pages ranked anywhere from one to ten regardless of PR, with PR0 pages sometimes ranking higher and PR5 pages sometimes ranking lower. The same held true for backlinks. In fact, none of the off-page elements seemed to determine ranking for me in Google.
I then looked at on-page elements. Oddly enough, some pages ranked high without using the ranking phrase in the title, the description, the headings, or the URL.
Even when I tried to observe the rankings from a view of a combination of factors, instead of observing them one at a time, I still could make no sense of the rankings.
So ... how is Google ranking determined?
I have a thought on this, but I could be wrong. It's nothing more than an educated guess based upon observations.
Each page I observed had a good page ranking OR a good amount of backlinks, OR used the keyword phrase in the title. I could not find ANY exception to this, though I did not spend a month looking under every rock to find one.
I think it was one of these factors that gave these pages an initial push up the Google ladder and then the final ranking was determined by how many views the page received on a daily basis.
I have no way to prove this, but it would definitely make sense. Why not allow the page with the most hits to be on top? Wouldn't that be the best choice?
And it could go deeper than that. Why not measure how long a person spends on a page as well as how often it is read? Google Analytics tracks this, so it makes sense Google would take it into account.
Finally, Google tracks the click-through rate (CTR) for Google ads on each page and profits from those with the highest CTR, so wouldn't it make sense for Google to reward the pages with the highest CTR with a higher ranking?
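Purely as an illustration of the hypothesis above (one strong signal gets a page into contention, then engagement decides the final order), here is a toy scoring sketch. Every field name, threshold, and weight below is invented for the example; this is not anything Google has confirmed.

```python
# Toy model of the hypothesis: an "initial push" signal qualifies a page,
# then engagement metrics (views, time on page, ad CTR) decide the order.

def toy_rank_score(page):
    # Initial push: any one of these qualifies the page (per the observation
    # that every page-one result had at least one of them).
    push = (
        page["pagerank"] >= 3
        or page["backlinks"] >= 100
        or page["keyword_in_title"]
    )
    if not push:
        return 0.0
    # Final ordering driven by engagement; the weights are made up.
    return (
        1.0 * page["daily_views"]
        + 0.5 * page["avg_seconds_on_page"]
        + 200.0 * page["ad_ctr"]
    )

pages = [
    {"pagerank": 0, "backlinks": 0, "keyword_in_title": True,
     "daily_views": 300, "avg_seconds_on_page": 120, "ad_ctr": 0.04},
    {"pagerank": 5, "backlinks": 500, "keyword_in_title": False,
     "daily_views": 50, "avg_seconds_on_page": 30, "ad_ctr": 0.01},
]
ranked = sorted(pages, key=toy_rank_score, reverse=True)
```

Under this toy model, the PR0 page with the keyword in its title and heavy engagement outranks the PR5 page with many backlinks, which is exactly the pattern described above.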
I have found pages with NO backlinks on page one, and these weren't pages just created a few days ago ... some had been around for years and were outranking those with many backlinks!
For instance, tell me why this page:
http://answers.yahoo.com/question/index … 259AA3Nnom
ranks where it does for the exact keyword phrase "baytril for dogs".
The Yahoo page has been online for three years, so it's not a case of giving a new page a leg up the ladder. Also, the page doesn't offer much information.
This is the oddity known as Google, where rankings are a total mystery.
Again, I'm not making any assumptions ... just presenting data. I'd be interested to know what others have experienced.
On another note ... every hub I have written so far that received a lot of hits from people on HubPages has also gone on to receive a lot of hits from Google (with the exception of hubs written specifically about HubPages topics). Of course, one might say that what reads well to hubbers will inevitably read well to the Google audience too ... but I can't say that with any certainty.
Again, I'm not trying to form a global consensus ... I'm just sharing what I found and allowing you to come to your own conclusions.
I am not too sure the two are related. Page ranking is very different from CTR and profits. You will find that Google looks at the entire AdSense account, not for CTR but for conversions, or meeting a goal. Google does not stop at an advertisement click but follows through on what actions the visitor took once they got to the advertised site, such as completing an offer, filling in a newsletter subscription, etc.
It is the overall success of the account that determines the ads Google will serve to your AdSense account. It has often been said that if you have a page under 5% CTR, you should take AdSense off it to increase the account's overall percentage.
Google wants its AdWords campaigns to be successful for its advertisers, and if the ads are converting, both are happy. When both are happy, the referring site is rewarded. Of course, the opposite is also true: poorly performing ads mean less money (CPC).
I always thought page rank was all about authority: the authority each inbound link carries, as reflected in the PageRank algorithm. Nice thread you have started here.
I don't know, but it is interesting. I am new at this too!
Interesting, and I am certainly glad that you took the time to share this information, Yoshi; it is very valuable.
This is the first time I have offered up my own confusion and been thanked for it ... haha!
I'll give a cookie to anyone who can make sense of it all.
Good insight. I'm still new at this but from a background in business I had noticed this as well. Great pulling it together.
The first ranks so high because it is related to Yahoo.
This is really interesting stuff, yoshi. And yes, Google is a law unto itself - lol! It will be even more confusing to work out Google's laws with Google Caffeine changing things.
Here are a couple of things I've noticed: two of my hubs that get the most external traffic and sit at number 1 in the SERPs get given free AdSense cents. It's like Google is giving me a little bonus! Maybe this has to do with CTR and what people do once they've left my page?
The other thing is that some of my hubs with very little internal HP traffic do fabulously well on Google. So while I do think that pages that do well on HP get a big boost (they get a lot of internal links from high-PR pages), it's not essential.
I have given up on creating backlinks, BTW; like you say, it often doesn't make any difference. I'd rather spend my time looking for good long-tail keyphrases.
They have new algorithms which (speculation here) might hedge off the high and low of a variable to give a better middle-ground result and reduce link bombing. For example, if Google simply counted how many hits a page received, then people would set up farms to log onto pages continuously and pump the hit counts up to get onto the first page. They might use IP addresses to downplay, for example, extra hits from the same IP address (e.g. out of 100 hits, 98 are from the same IP three days in a row and two are not, so they might only count two of the hits from the suspicious IP address).
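The IP-discounting idea in the post above can be sketched as a tiny filter. This is pure speculation made concrete: the 90% suspicion ratio and the two-hit cap are invented for the example, and nobody outside Google knows whether anything like this actually runs.

```python
# Sketch of the speculated IP deduplication: if one IP accounts for
# nearly all of a page's hits, count only a couple of them.
from collections import Counter

def filtered_hit_count(hits, max_per_ip=2, suspicion_ratio=0.9):
    """hits: list of IP address strings, one per page view."""
    counts = Counter(hits)
    total = len(hits)
    kept = 0
    for ip, n in counts.items():
        if total > 0 and n / total >= suspicion_ratio:
            kept += min(n, max_per_ip)  # likely a farm: cap its hits
        else:
            kept += n
    return kept

# The example from the post: 98 of 100 hits come from one IP,
# so that IP contributes only 2 counted hits.
hits = ["1.2.3.4"] * 98 + ["5.6.7.8", "9.9.9.9"]
```

Here `filtered_hit_count(hits)` yields 4: two capped hits from the suspicious IP plus one each from the two others.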
Most people are not on the net searching from the point of view of a publisher, so using CTR, for example, would benefit the publisher's or marketer's experience rather than the user's. For example, if I'm looking for free kung fu movie sites but Google serves up only the paid kung fu movie sites for the sake of publishers, then the user experience is diminished because they don't find what they want.
It shouldn't be the customer/user doing the extra work to find what they want; it should be the marketer/publisher doing the work to see what works best, as they are the ones getting paid. Making it tough for people to find what they want for the sake of the marketers defeats the purpose of a fast search engine. People want results now, within less than a second, without having to dig through page after page.
It's tough to say which algorithm is best at doing this, and Google's PageRank is not all-inclusive, as many variables can be added.
I think having a rate-up or rate-down button next to each listing, for the USERS, would be more beneficial, with a built-in algorithm to detect same-IP farming. Using the link system puts the emphasis on the publisher's view of what the searcher should see. If someone with a mediocre site has lots of ambition, networks hard, and gets hyperlinks everywhere he can (like the "how to make billions on the internet" scams do), while another excellent site with fewer links just sits on what it has, then the heavily hyperlinked mediocre site will show up regardless, rather than being ranked by users.
Using time on site could be exploited by opening a page and never closing it, making the time go up. They probably set an algorithm to count only up to a certain time per IP address, or to discard sessions that are just idling open by sensing which times fall within a consistent timeframe for that particular page. I think a user up/down rating system that allowed only a single vote per IP address would be fairer to the USER experience and ten times easier. Use 50 percent of that toward rank and 50 percent from publisher web 2.0 signals to drive the ambition to publish good pages. Wow, that was longer than I anticipated, but then again you wanted honest opinions. And again, I don't think you farmed any hits; I was just showing how a certain IP may have been used remotely to send malicious hits, like happens in email spamming, through malicious code sent to all users it acquires through a compromised computer.
Wouldn't we all like to understand the Google algorithm - believed to take up to 200 factors into account. Here's a list of over 100 factors thought to be included. Courtesy of Search Engine Journal.
Domain: 13 factors
1. Domain age;
2. Length of domain registration;
3. Domain registration information hidden/anonymous;
4. Site top level domain (geographical focus, e.g. com versus co.uk);
5. Site top level domain (e.g. .com versus .info);
6. Sub domain or root domain?
7. Domain past records (how often it changed IP);
8. Domain past owners (how often the owner was changed)
9. Keywords in the domain;
10. Domain IP;
11. Domain IP neighbors;
12. Domain external mentions (non-linked)
13. Geo-targeting settings in Google Webmaster Tools
Server-side: 2 factors
1. Server geographical location;
2. Server reliability / uptime
Architecture: 7 factors
1. URL structure;
2. HTML structure;
3. Semantic structure;
4. Use of external CSS / JS files;
5. Use of canonical URLs;
6. “Correct” HTML code (?);
7. Cookies usage
Content: 14 factors
1. Content language
2. Content uniqueness;
3. Amount of content (text versus HTML);
4. Unlinked content density (links versus text);
5. Pure text content ratio (without links, images, code, etc)
6. Content topicality / timeliness (for seasonal searches for example);
7. Semantic information (phrase-based indexing and co-occurring phrase indicators)
8. Content flag for general category (transactional, informational, navigational)
9. Content / market niche
10. Flagged keywords usage (gambling, dating vocabulary)
11. Text in images (?)
12. Malicious content (possibly added by hackers);
13. Rampant mis-spelling of words, bad grammar, and 10,000 word screeds without punctuation;
14. Use of absolutely unique /new phrases.
Internal Cross Linking: 5 factors
1. # of internal links to page;
2. # of internal links to page with identical / targeted anchor text;
3. # of internal links to page from content (instead of navigation bar, breadcrumbs, etc);
4. # of links using “nofollow” attribute; (?)
5. Internal link density,
Website factors: 6 factors
1. Website Robots.txt file content;
2. Overall site update frequency;
3. Overall site size (number of pages);
4. Age of the site since it was first discovered by Google;
5. XML Sitemap;
6. Website type (e.g. blog versus informational sites in top 10)
Page-specific factors: 9 factors
1. Page meta Robots tags;
2. Page age;
3. Page freshness (frequency of edits and % of the page affected by edits);
4. Content duplication with other pages of the site (internal duplicate content);
5. Page content reading level; (?)
6. Page load time (many factors in here);
7. Page type (About-us page versus main content page);
8. Page internal popularity (how many internal links it has);
9. Page external popularity (how many external links it has relevant to other pages of this site);
Keywords usage and keyword prominence: 13 factors
1. Keywords in the title of a page;
2. Keywords in the beginning of page title;
3. Keywords in Alt tags;
4. Keywords in anchor text of internal links (internal anchor text);
5. Keywords in anchor text of outbound links (?);
6. Keywords in bold and italic text (?);
7. Keywords in the beginning of the body text;
8. Keywords in body text;
9. Keyword synonyms relating to theme of page/site;
10. Keywords in filenames;
11. Keywords in URL;
12. No “Randomness on purpose” (placing “keyword” in the domain, “keyword” in the filename, “keyword” starting the first word of the title, “keyword” in the first word of the first line of the description and keyword tag…)
13. The use (abuse) of keywords utilized in HTML comment tags
Outbound links: 8 factors
1. Number of outbound links (per domain);
2. Number of outbound links (per page);
3. Quality of pages the site links in;
4. Links to bad neighborhoods;
5. Relevancy of outbound links;
6. Links to 404 and other error pages.
7. Links to SEO agencies from clients site
8. Hot-linked images
Backlink profile: 21 factors
1. Relevancy of sites linking in;
2. Relevancy of pages linking in;
3. Quality of sites linking in;
4. Quality of web page linking in;
5. Backlinks within network of sites;
6. Co-citations (which sites have similar backlink sources);
7. Link profile diversity:
1. Anchor text diversity;
2. Different IP addresses of linking sites,
3. Geographical diversity,
4. Different TLDs,
5. Topical diversity,
6. Different types of linking sites (blogs, directories, etc);
7. Diversity of link placements
8. Authority Link (CNN, BBC, etc) Per Inbound Link
9. Backlinks from bad neighborhoods (absence / presence of backlinks from flagged sites)
10. Reciprocal links ratio (relevant to the overall backlink profile);
11. Social media links ratio (links from social media sites versus overall backlink profile);
12. Backlinks trends and patterns (like sudden spikes or drops of backlink number)
13. Citations in Wikipedia and Dmoz;
14. Backlink profile historical records (ever caught for link buying/selling, etc);
15. Backlinks from social bookmarking sites.
Each Separate Backlink: 6 factors
1. Authority of TLD (.com versus .gov)
2. Authority of a domain linking in
3. Authority of a page linking in
4. Location of a link (footer, navigation, body text)
5. Anchor text of a link (and Alt tag of images linking)
6. Title attribute of a link (?)
Visitor Profile and Behavior: 6 factors
1. Number of visits;
2. Visitors’ demographics;
3. Bounce rate;
4. Visitors’ browsing habits (which other sites they tend to visit)
5. Visiting trends and patterns (like sudden spikes in incoming traffic)
6. How often the listing is clicked within the SERPs (relevant to other listings)
Penalties, Filters and Manipulation: 11 factors
1. Keyword over-usage / keyword stuffing;
2. Link buying flag;
3. Link selling flag;
4. Spamming records (comment, forum, and other link spam);
5. Hidden text;
6. Duplicate content (external duplication);
7. History of past penalties for this domain;
8. History of past penalties for this owner;
9. History of past penalties for other properties of this owner (?);
10. Past hacker attack records;
11. 301 flags: double redirects / redirect loops, or redirects ending in a 404 error
More Factors (6):
1. Domain registration with Google Webmaster Tools;
2. Domain presence in Google News;
3. Domain presence in Google Blog Search;
4. Use of the domain in Google AdWords;
5. Use of the domain in Google Analytics;
6. Business name / brand name external mentions.
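However the real algorithm works, a factor list like the one above is usually imagined as some weighted combination of per-factor scores. Here is a minimal sketch of that idea: the factor names are drawn from the categories listed above, but the scores and weights are entirely invented, since the real weights (if these factors are even used) are unknown.

```python
# Toy weighted-sum combination of hypothetical ranking factors.

def combined_score(factor_scores, weights):
    """factor_scores, weights: dicts keyed by factor name.
    Scores are assumed to be normalized to [0, 1]; factors
    without an explicit weight default to 1.0."""
    return sum(weights.get(name, 1.0) * score
               for name, score in factor_scores.items())

# Invented example values for a single page.
scores = {
    "domain_age": 0.8,
    "content_uniqueness": 0.9,
    "keywords_in_title": 1.0,
    "backlink_quality": 0.4,
    "page_load_time": 0.7,
}
weights = {"content_uniqueness": 2.0, "backlink_quality": 1.5}
```

The point of the sketch is only that with 200-odd factors and secret weights, two pages can easily swap positions for reasons no outside observer can reconstruct, which matches the confusion described in this thread.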
I think Google has the power to manually control certain articles.
I had the number 1 position, number 1 page, and number 1 keyword - then it disappeared from the rankings totally! There was some activity where I couldn't get to Blogger for a while to list. I really think they didn't want me to be listed, so they manually used an override. Someone, somewhere didn't want this newbie to have success!
Also, on another hub, the forum thread rated higher than the hub itself. Strange, eh?
Wow, very interesting! Thanks for letting us know your observations Yoshi.
I think the bottom line is that Google is always refining their system to find the best content online for the search terms. They will always be changing the way they do it. Beyond basic SEO, I think people are best off just providing good content.
Copyright © 2020 HubPages Inc. and respective owners. Other product and company names shown may be trademarks of their respective owners. HubPages® is a registered Service Mark of HubPages, Inc. HubPages and Hubbers (authors) may earn revenue on this page based on affiliate relationships and advertisements with partners including Amazon, Google, and others.