How to Create a robots.txt File for SEO

It has been half a month since ThemeLib.com was created, not a long time, huh? Today, I decided to check how my site is doing in two popular search engines: Google and Yahoo.

Here are the results:

  • Google: 48 results returned for the keyword http://csshook.com
  • Yahoo: 35 results returned for the keyword http://csshook.com

All of the results in Google are fine. It indexed my posts, my tags, my keywords, and so on. But there are some problems with my site in Yahoo: Yahoo indexed my wp-login page and my download links! How did that happen, huh? If you know anything about SEO (Search Engine Optimization), you know these results are not good at all. After examining ThemeLib.com for a few minutes, I realized that I had not created a robots.txt file yet! Oh man, how could I forget it?

Introduction

The robots.txt file is used to instruct search engine robots about which pages on your website should be crawled and consequently indexed. Most websites have files and folders that are not relevant to search engines (such as images, download links, and admin files), so creating a robots.txt file can actually improve your website's indexation.

How to Create a robots.txt File

A robots.txt file is just a plain text file that can be created with any text editor, such as Notepad. If you are using WordPress, a sample robots.txt file might look like this:

User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /download/

“User-agent: *” means that all the search engine (Google, Yahoo, MSN and so on) should use those instructions to crawl your website.

“Disallow: /wp-” makes sure that search engines will not crawl the WordPress core files. This line excludes all files and folders whose names start with “wp-” from indexation, avoiding duplicate content and admin files. The same goes for Disallow: /feed/, Disallow: /trackback/, and so on.
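If you want to sanity-check rules like these before uploading the file, you can feed them to Python's built-in urllib.robotparser. This is just a quick local check; the base URL below is purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# The sample rules from this article, as the parser would read them.
rules = """\
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /download/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

base = "http://example.com"  # illustrative; any site root works the same way
print(parser.can_fetch("*", base + "/wp-login.php"))   # False: blocked by the /wp- prefix
print(parser.can_fetch("*", base + "/feed/"))          # False: blocked by /feed/
print(parser.can_fetch("*", base + "/2009/my-post/"))  # True: ordinary posts stay crawlable
```

Disallow rules match by path prefix, which is why a single “/wp-” line covers wp-login.php, wp-admin, wp-content, and the rest.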

After you have created the robots.txt file, just upload it to your root directory and you are done!
