How can you create a robots.txt file?

A robots.txt file is a plain text file placed on your server that asks search engine spiders not to crawl or index certain pages or sections of your site. You can use it to block indexing entirely, to keep specific areas of your site from being indexed, or to issue individual instructions to particular search engines.
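For instance, a simple robots.txt might look like the fragment below (the blocked folder is a hypothetical example; use your own paths):

```
# Keep all crawlers out of a private folder
User-agent: *
Disallow: /admin/

# Give one specific crawler different instructions
User-agent: Googlebot
Disallow:
```

An empty `Disallow:` line means nothing is blocked for that user agent.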

The robots.txt file itself can be created in Notepad. You need to save it to the root directory of your site, the same directory that contains your index page or home page.

How do you create a robots.txt file?

It is not difficult to create a basic robots.txt file. You can use Notepad or whatever your favorite text editor is: just create a text document and save it as robots.txt. Do not use an HTML editor unless it can save a plain text (ASCII) document. On most computers, you can create the file with Notepad.

•    Right-click on your desktop
•    Choose New
•    Choose Text Document
•    Open the document you just created
•    Insert your instructions to robots
•    Click Save As
•    Save the document as robots.txt
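If you prefer, the same result can be produced with a small script instead of Notepad. The sketch below, in Python, writes a robots.txt file with illustrative directives (the blocked folder is an assumption, not part of the original steps):

```python
# A minimal sketch: create a robots.txt file programmatically
# rather than by hand in a text editor.

rules = "\n".join([
    "User-agent: *",      # applies to all crawlers
    "Disallow: /admin/",  # hypothetical folder to keep out of the index
])

# Save as plain text; robots.txt must live in your site's root directory
with open("robots.txt", "w", encoding="ascii") as f:
    f.write(rules + "\n")
```

Remember that the file only takes effect once it is uploaded to the root of your site, so that it is reachable at yoursite.com/robots.txt.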

The robots.txt file is used to instruct search engine robots about which pages of your website should be crawled and consequently indexed. Many websites have folders and files that are not relevant to search engines (such as admin or image folders).
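You can check how crawlers will interpret your rules before uploading the file. A sketch using Python's standard `urllib.robotparser` module, with hypothetical admin and images folders blocked:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking folders that are not relevant to search engines
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /images/",
]

parser = RobotFileParser()
parser.parse(rules)

# Blocked path: crawlers obeying the rules will not fetch it
print(parser.can_fetch("*", "https://example.com/admin/login.php"))  # False
# Regular page: still crawlable
print(parser.can_fetch("*", "https://example.com/index.html"))       # True
```

This is the same logic well-behaved crawlers apply when they read your live robots.txt.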

So, I must say that creating a robots.txt file can improve how your website is indexed.
