Robots.txt Generator



The generator offers the following options:

Default - All Robots are: choose whether all robots are allowed or refused by default.

Crawl-Delay: an optional delay (in seconds) that cooperating crawlers should wait between successive requests.

Sitemap: the full URL of your XML sitemap (leave blank if you don't have one).

Search Robots: set individual rules for Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, and MSN PicSearch.

Restricted Directories: the directories to block from crawling. Each path is relative to root and must contain a trailing slash "/".
Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
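
For reference, a generated file might look like the sketch below; the directory names and sitemap URL are placeholders to be replaced with your own values.

    # Sample output - placeholder paths and sitemap URL
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/
    Crawl-delay: 10
    Sitemap: https://www.example.com/sitemap.xml

Note that Crawl-delay is not honored by every crawler; Google, for example, ignores it.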


About Robots.txt Generator

When search engines crawl a site, they first look for a robots.txt file at the domain root. If found, they read the file's list of directives to see which directories and files, if any, are blocked from crawling. A robots.txt generator creates this file for you, so Google and other search engines can tell which pages on your site should be excluded. In other words, the file created by a robots.txt generator is like the opposite of a sitemap, which indicates which pages to include.
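
For instance, before crawling anything on example.com (a placeholder domain), a crawler requests https://www.example.com/robots.txt. A minimal file that blocks a single directory while still pointing crawlers at the sitemap could look like this:

    # Block one (placeholder) directory for all crawlers
    User-agent: *
    Disallow: /drafts/
    Sitemap: https://www.example.com/sitemap.xml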

The robots.txt generator

With a robots.txt generator you can easily create a new robots.txt file for your site or edit an existing one. To upload an existing file and pre-populate the tool, type or paste the root domain URL in the top text box and click Upload. Use the tool to create directives with either Allow or Disallow rules (Allow is the default; click to change) for user agents (use * for all crawlers, or click to select just one) applied to specified content on your site. Click Add directive to add the new directive to the list. To edit an existing directive, click Remove directive and then create a new one.
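
As a sketch, directives built this way end up as per-agent rule groups in the finished file. The user agent and paths below are illustrative only:

    # Google's main crawler: block a tree, but carve out one
    # subdirectory with an Allow rule
    User-agent: Googlebot
    Disallow: /private/
    Allow: /private/reports/

    # Every other crawler: block the whole tree
    User-agent: *
    Disallow: /private/

For crawlers such as Googlebot, the most specific matching rule wins, which is how an Allow entry can open up part of an otherwise disallowed directory.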