I was checking my AdSense account, and in the reports section a number of my URLs appeared as blocked URLs; the reason stated was the robots.txt file. What is a robots.txt file?
Does a blocked URL mean that the clicks on ads on that page are not counted by Google AdSense?
Also, in the help section they mention that the robots.txt file is in the domain. Does that mean that the administrator at HubPages needs to help out here?
You're from India, so someone from your country may have tried to open your article in Sanskrit or another local language. When the person found your article on Google's search engine, they probably hit the translate button, and the page got served in translation and indexed on a server over there with a slightly different translate URL. That server then blocks it with a robots.txt file because it is not the English original. Your original indexed English copy is still on Google's index server, and the next time someone searches for your article and reads it off that server, the block will be gone. The copy on HubPages' internal site server, read by other Hubbers, will not release that statement; it must be picked up by a reader finding the article through the search engine, off the server where it is indexed and served in English. That shouldn't block any AdSense clicks, though, because the next time it is served in English will take care of it. You can give me the name of the hub, and I can search for the hub name plus your username and pick it off the search page, which should clear it up, or you can just wait until an actual new reader finds it. I won't go into explaining how or why the robots.txt file is used, but here are links if you're interested in the technical aspect: http://www.google.com/support/webmaster … wer=156449 or http://www.robotstxt.org/
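To make the mechanics concrete: a robots.txt is just a plain-text file at a site's root that tells crawlers which paths they may fetch. Here is a minimal sketch using Python's standard-library parser; the site, paths, and rules are hypothetical examples for illustration, not HubPages' actual robots.txt.

```python
# Sketch of how a crawler interprets a robots.txt file, using Python's
# standard-library parser. example.com and the /translated/ path are
# made-up illustrations, not real HubPages rules.
from urllib.robotparser import RobotFileParser

# A typical robots.txt: block Googlebot from /translated/, allow everything else.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /translated/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot may not fetch anything under /translated/ ...
print(parser.can_fetch("Googlebot", "https://example.com/translated/my-hub"))  # False
# ... but the original URL is still crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/my-hub"))  # True
```

A blocked URL in the diagnostics report simply means the crawler hit a Disallow rule like the one above for that path.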
I believe Google adds that robots.txt block to any translated page before it sends it to the nearest local server, since it is not the original. Google doesn't want the article's URL to be indexed twice, once in the original and once in translated text, which would clog the index servers with excess unneeded pages. Otherwise the article could ultimately be translated over and over through different languages, obscuring the original written intent, rather like what has happened with many religious texts through the ages.
Thank you so much for answering my query. I re-checked the site diagnostics page after I read your response, and it does seem like some people have translated my hubs. From what you say, I gather that it is not a matter of great concern, so I can relax. It seems like the block may happen a number of times if people wish to read the content in different languages, which is not something I can directly control.
Thank you once again!!
No problema... in fact, I learned how these robots.txt files work from another Hubber, so I'll let the thanks go to whoever that other Hubber was. Can't remember now.
by cashmere 8 years ago
In my site diagnostics in Google AdSense these hubs of mine are being shown as blocked. The reason given is the robots.txt file. Can you tell me how I can unblock my following hubs? My favourite Tarot layouts, How to teach English, and Multilevel marketing - the good the bad and ugly. This has happened since...
by Brinafr3sh 6 years ago
Have any of your Hubs been blocked due to a robots.txt file? Did you ever go to one of your Hubs' URLs and it said "robots.txt file" instead of showing your hub page?
by Quilligrapher 8 years ago
Google AdSense Site Diagnostics reports that a Robots.txt file is blocking their crawler from accessing one of my hubs. Can anyone suggest how I can resolve this issue?
by Carolyn Blacknall 8 years ago
I was looking at Google Analytics, and it said that one of my hubs (Disney-Snow-White-Birthday-Party-Ideas) is blocked from Google's crawler because of a robots.txt file. Their help file says: How can I make sure my robots.txt file grants access to your crawler? Simply add the following two lines of text...
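For anyone wondering what that help page is referring to: the snippet Google's AdSense documentation gives for granting its ad crawler (user agent Mediapartners-Google) access looks like the following. Quoted from memory of the docs, so verify against the help page itself before relying on it.

```text
User-agent: Mediapartners-Google
Disallow:
```

An empty Disallow line means "disallow nothing," i.e. the AdSense crawler may fetch every page, so ad targeting and click accounting work even if other crawlers are restricted elsewhere in the file.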
by pinkytoky 7 years ago
When I do URL submissions, sometimes I get a message saying my robots.txt is blocking the robot crawlers. Also, it says my robots.txt file does not comply with the current “Robots Exclusion Standard” and this may cause problems for some search engines.Is there anything I can do about...
by Eileen Hughes 9 years ago
I posted this in a follow-up in Maddie's article on this, but didn't know if anyone would go there. I went into Diagnostics for the first time and some of mine are blocked by the robots.txt file. Google AdSense says this: Robots.txt file: If you have a robots.txt file, the page requesting Google ads may be...
Copyright © 2018 HubPages Inc. and respective owners. Other product and company names shown may be trademarks of their respective owners. HubPages® is a registered Service Mark of HubPages, Inc. HubPages and Hubbers (authors) may earn revenue on this page based on affiliate relationships and advertisements with partners including Amazon, Google, and others.