
Blocked URL and what is a Robots.txt file?

  1. Vibhavari posted 7 years ago

    I was checking my AdSense account, and in the reports section a number of my URLs appeared as blocked URLs; the reason stated was the robots.txt file. What is a robots.txt file?

    Does a blocked URL mean that clicks on ads on that page are not counted by Google AdSense?

    And in the help section they mention that the robots.txt file is in the domain. Does that mean that the administrator at HubPages needs to help out here?

  2. mel22 posted 7 years ago

    You're from India, so someone from your country may have tried to open your article in Sanskrit or whatever language you write in over there. When the person found your article on the Google search engine, they probably hit the translate button, and the page got served in Sanskrit and indexed on a server over there with a slightly different translated URL. That server then blocks it with a robots.txt file because it is not in English. Your original indexed English copy is still on Google's index server, and the next time someone searches for your article and reads it off that server, the blocked status will be gone. The copy on HubPages' internal site server, read by other hubbers, will not clear that status; it has to be picked up by someone grabbing the article from the search engine, off the server where it is indexed and served in English.

    That shouldn't block any AdSense clicks, though, because the next time it's served in English that will take care of it. You can give me the name of the hub, and I can search for the hub name plus your username and pick it off the search page, which should clear it up; or just wait until an actual new reader finds it. I won't go into explaining how or why the robots.txt file is used, but here are some links if you're interested in the technical aspect: http://www.google.com/support/webmaster … wer=156449 or http://www.robotstxt.org/

    I believe Google adds that robots.txt block to any translated page before it sends it to the nearest local server, since it is not the original, and doesn't want the article's URL indexed twice, once in the original and once in translated text, to prevent clogging the index servers with excess unneeded pages. Otherwise it could ultimately be translated over and over through different languages, obscuring the original written intent; sort of the same thing seen with many different religious texts throughout the ages.
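    If you're curious what a robots.txt rule actually looks like, here's a small sketch using Python's standard-library robots.txt parser. The Disallow rule and URLs below are made up for illustration, not HubPages' or Google's actual file; Mediapartners-Google is the user-agent string the AdSense crawler identifies itself with.

    ```python
    # Sketch: checking whether a robots.txt rule blocks a URL.
    # The rules and URLs here are hypothetical examples.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: *",
        "Disallow: /translate/",  # hypothetical rule blocking translated copies
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # A URL under /translate/ is disallowed for every crawler, including
    # the AdSense crawler, so ads can't be crawled on that copy.
    print(rp.can_fetch("Mediapartners-Google", "http://example.com/translate/my-hub"))  # False
    print(rp.can_fetch("Mediapartners-Google", "http://example.com/my-hub"))           # True
    ```

    So a "blocked URL" in the AdSense report just means the crawler hit a Disallow rule like the one above for that address; the original, allowed URL is unaffected.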

    1. Vibhavari posted 7 years ago in reply to this

      Hi mel22,

      Thank you so much for answering my query. I re-checked the site diagnostics page after I read your response, and it does seem like some people have translated my hubs. From what you say, I gather that it is not a matter of great concern, so I can relax. It seems the block may happen a number of times if people wish to read the content in different languages, which is not something I can directly control.
      Thank you once again!!

  3. mel22 posted 7 years ago

    No problema... in fact, I learned how robots.txt files work from another hubber, so I'll let the thanks go to whoever that other hubber was. Can't remember now.