Robots.txt Generator



The generator takes the following inputs:

Default - All Robots are: whether all robots are allowed or refused by default
Crawl-Delay: the delay between successive crawler requests
Sitemap: the URL of your sitemap (leave blank if you don't have one)
Search Robots: per-crawler settings for Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, and MSN PicSearch
Restricted Directories: the directories to exclude; each path is relative to the root and must contain a trailing slash "/"



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
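What the tool produces depends on the options you select. As a rough sketch, a file generated from typical inputs might look like the following; the directory name, delay value, and sitemap URL are placeholders only:

User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml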


About Robots.txt Generator

Free Robots.txt Generator

If you don't have a robots.txt file, search engines will still crawl and index your website; you simply won't be able to tell them which pages or folders they should not crawl. That will not have much of an impact when you first start a blog and do not have a lot of content. However, as your site grows and accumulates more content, you will likely want more control over how it is crawled and indexed.

Search bots have a crawl budget for each website, meaning they crawl a specific number of pages during each crawl session. If they don't finish crawling all of your pages, they come back and resume crawling in the next session. That can slow down the rate at which your website is indexed.

You can fix this by disallowing search bots from crawling irrelevant pages like your WordPress admin pages, plugin files, and themes folder.
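For instance, a short robots.txt along these lines would keep crawlers out of those areas on a typical WordPress install; the exact paths depend on your setup and are given here only as an illustration:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/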

By blocking unnecessary pages, you save your crawl quota, which helps search engines crawl more of your important pages and index them as quickly as possible. Another good reason to use the robots.txt file is to stop search engines from indexing a post or page on your website. It is not the most reliable way to hide content from the general public, but it will help keep those pages out of search results.
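To block a single post or page, you list its path as its own Disallow rule. A minimal sketch, with placeholder URLs:

User-agent: *
Disallow: /private-page/
Disallow: /2021/03/draft-post/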

robots.txt is an important file that must be placed in the root folder of your site to help search engines index your website more efficiently. Search engines like Google use a website crawler, or robot, that reviews all the articles, images, and other content on your website. There may be parts of your site that you do not want crawled or included in user search results, such as the admin pages. You can add these pages to the file so that they are explicitly ignored. Robots.txt files use something called the Robots Exclusion Protocol. Our free tool will quickly generate the file for you from the pages you enter to be excluded.
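Under the Robots Exclusion Protocol, rules are grouped by User-agent, so each crawler you select in the Search Robots list above can receive its own section. A brief sketch with placeholder paths, using Googlebot-Image as the Google Image crawler:

User-agent: Googlebot-Image
Disallow: /

User-agent: *
Disallow: /admin/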