I was checking my AdSense account, and in the reports section a number of my URLs appeared as blocked URLs; the reason stated was the Robots.txt file. What is a Robots.txt file?
Does a blocked URL mean that the clicks on ads on that page are not counted by Google AdSense?
Also, the help section mentions that the Robots.txt file is on the domain. Does that mean that the administrator at HubPages needs to help out here?
You're from India, so someone from your country may have tried to open your article in Sanskrit or another language written there. When the person found your article on the Google search engine, they probably hit the translate-to-Sanskrit button, and the page got served in Sanskrit and indexed on a server over there with a slightly different translated URL. That server then blocks it with a robots.txt file because it is not in English. Your original indexed English copy is still on Google's index server, and the next time someone searches for your article and reads it off that server, the robots.txt notice will be gone. The copy on HubPages' internal site server, read by other hubbers, will not release that notice; the article has to be grabbed by a person finding it through the search engine, and thus off the server where it is being indexed and served in English.

That shouldn't block any AdSense clicks, though, because the next time it is served in English that will take care of it. You can give me the name of the hub and I can search the hub name plus your username and pick it off the search page, and that should clear it up, or you can just wait until an actual new reader finds it. I won't go into explaining how or why the .txt file is used, but here's a link if you're interested in the technical aspect: http://www.google.com/support/webmaster … wer=156449 or http://www.robotstxt.org/
I believe Google adds that .txt file to any translated page before it sends it to the nearest local server, since it is not the original, and Google doesn't want the article's URL to be indexed twice, once in the original and once in translated text, to prevent clogging the index servers with excess unneeded pages. Otherwise it could ultimately be translated over and over through different languages, obscuring the original written intent, sort of the same thing seen with many different religious texts throughout the ages.
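As for the "what is a Robots.txt file" part of the question: it is just a plain text file that sits at the root of a domain (for example, example.com/robots.txt) and tells search engine crawlers which paths they are allowed to fetch. Purely as a made-up illustration (this is not HubPages' actual file), one might look like this:

    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /private/

The first rule lets the AdSense crawler (its user agent is Mediapartners-Google) fetch everything, while the second keeps all other crawlers out of anything under /private/. When AdSense reports a URL as blocked by robots.txt, it generally means its crawler ran into a Disallow rule covering that page, so it couldn't read the content there to target ads.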
Thank you so much for answering my query. I re-checked the site diagnostics page after I read your response, and it does seem like some people have translated my hubs. From what you say, I gather that it is not a matter of great concern, so I can relax. It seems like the block may happen a number of times if people wish to read the content in different languages, which is not something I can directly control.
Thank you once again!!
No problema... in fact, I learned about how .txt files work from another hubber, so I'll let the thanks go to whoever that other hubber was. Can't remember now.
by cashmere, 6 years ago
In my site diagnostics in Google AdSense these hubs of mine are being shown as blocked. Reason blocked is Robots.txt file. Can you tell me how I can unblock my following hubs? My favourite Tarot layouts, How to teach...
by Quilligrapher, 6 years ago
Google AdSense Site Diagnostics reports that a Robots.txt file is blocking their crawler from accessing one of my hubs. Can anyone suggest how I can resolve this issue?
by pinkytoky, 4 years ago
When I do URL submissions, sometimes I get a message saying my robots.txt is blocking the robot crawlers. Also, it says my robots.txt file does not comply with the current “Robots Exclusion Standard”...
by Eileen Hughes, 6 years ago
I posted this in a follow-up in Maddie's article on this, but didn't know if anyone would go there. I went into Diagnostics for the first time and some of mine are blocked by the robots.txt file. Google AdSense says...
by WriteAngled, 6 years ago
In my adsense report it says that one of my hubs is blocked because there is a Robots.txt file on it. The hub in question was written for a Hubmob topic. Its current score is 79. I obtained the information it contained...
by puter_dr, 6 years ago
When I logged into AdSense today, it gave me an Analytics diagnostic warning for HubPages, saying that they could not crawl because of a Robots.txt error. Anyone else getting this?