Robots.txt
A robots file (robots.txt) tells search engine bots what they are and are not allowed to crawl. Well-behaved robots will find your robots file and follow its instructions. Some badly behaved robots, however, will ignore the file and crawl the things you've told them not to. Still, the majority of robots will respect the instructions in your robots file.
Creating a robots file is fairly simple once you know how to give the robots instructions, so let's run through them. First, specify which robot you're instructing by entering "User-agent:" followed by the robot's name. An asterisk ( * ) can be used to address all robots at once. With your robot identified, go down one line and enter "Disallow:" followed by a path. Use a "/" to tell the robot it is not allowed to crawl anything, or leave the value blank to tell the robot it is allowed to crawl everything. To block multiple paths, hit enter again and add another "Disallow:" line for each one.
Here are examples of how it should look:
Allow all robots to crawl all files:
User-agent: *
Disallow:
Disallow all robots from crawling any files (not recommended at all):
User-agent: *
Disallow: /
Disallow all robots from crawling a specific file:
User-agent: *
Disallow: /file-name/
Allow a specific robot to crawl all files:
User-agent: Robot Name
Disallow:
Disallow a specific robot from crawling all files:
User-agent: Robot Name
Disallow: /
Disallow a specific robot from crawling a specific file:
User-agent: Robot Name
Disallow: /file-name/
Disallow a specific robot from crawling multiple specific files:
User-agent: Robot Name
Disallow: /file-name/
Disallow: /another-file-name/
Now let’s see a real example. We’re going to tell EmailCollector not to crawl a page that we'll call "Fire Safety", whose path on our website is /fire-safety/:
User-agent: EmailCollector
Disallow: /fire-safety/
If you wanted to tell EmailCollector not to crawl any pages, you would just replace /fire-safety/ with /.
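To double-check how crawlers that honor robots.txt will read your rules, you can parse them with Python's standard-library urllib.robotparser module. This is just an illustrative sketch; example.com stands in for your own domain.

```python
from urllib.robotparser import RobotFileParser

# The EmailCollector rules from the example above.
rules = """\
User-agent: EmailCollector
Disallow: /fire-safety/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# EmailCollector may not crawl /fire-safety/, but other paths are fine,
# and robots with no matching User-agent group are unrestricted.
print(parser.can_fetch("EmailCollector", "https://example.com/fire-safety/"))  # False
print(parser.can_fetch("EmailCollector", "https://example.com/about/"))        # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/fire-safety/"))    # True
```

The same can_fetch check is what a polite crawler runs before requesting each page.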
Now on to creating the actual file. Open a simple text editor such as Notepad or WordPad (the program must be able to save files as .txt). You can also right-click on your desktop and create a new text file, which will be a .txt file by default, then edit that file. Add your allow/disallow rules to your liking, then save the file and name it "robots", being sure it saves as .txt. Bots will be looking for a file called "robots.txt", so it must be named exactly that. If you don't know any robot names, a quick Google search will turn up lists of current robots, or take a look at my robots file for some examples.
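If you'd rather generate the file with a script than a text editor, a few lines of Python will do it. The rules below are only a sample; swap in your own.

```python
# Write a sample robots.txt; replace these rules with your own.
rules = (
    "User-agent: *\n"
    "Disallow: /private/\n"
)

with open("robots.txt", "w") as f:
    f.write(rules)
```

Run this in the folder you plan to upload from, and you'll have a correctly named robots.txt ready to go.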
With your robots file (robots.txt) completed, just go to your hosting cPanel and upload it into your website's root directory; bots only look for the file at the root (for example, yoursite.com/robots.txt). You can then use a robots.txt verifier to ensure everything is in order.
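If you want a quick local sanity check before (or instead of) an online verifier, a small script can flag lines that don't follow the "Field: value" shape robots.txt expects. The check_robots_lines helper below is hypothetical, not part of any library.

```python
def check_robots_lines(text):
    """Return (line_number, line) pairs for lines that don't look like
    'Field: value'. Blank lines and # comments are fine."""
    problems = []
    for number, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        if ":" not in stripped:
            problems.append((number, stripped))
    return problems

good = "User-agent: *\nDisallow: /private/\n"
bad = "User-agent *\nDisallow: /private/\n"   # missing colon on line 1

print(check_robots_lines(good))  # []
print(check_robots_lines(bad))   # [(1, 'User-agent *')]
```

This only catches formatting slips, not wrong paths, so it's a complement to a real verifier rather than a replacement.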
If you're a WordPress user, you have it much easier. Simply go to the plugins tab in your wp-admin panel, search for "pc robots", and install the PC Robots plugin. Check over the robots file it creates to be sure it's to your liking, but you've now got a robots file installed and set.
Brought to you by Internet Marketing For Newbies. Free internet marketing info, tips, tricks, and more.