Robots.txt Generator


Default robot access

Additional rules (action, robot, files or directories)

Sitemap (optional)

Your robots.txt file


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
   
Restricted Directories: The path is relative to the root and must contain a trailing slash "/".
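For example, a generated file that restricts two directories might look like the sketch below; the paths, crawl delay, and sitemap URL are placeholders, not output from this tool:

```text
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```

Note that each restricted path ends with the trailing slash required above.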
 
 
 
 
 
 
   



Now, copy and paste this text into a blank text file called "robots.txt" (don't forget the "s" on the end of "robots") and put it in your root directory. Like all other files on your server, make sure its permissions are set so that visitors (including search engine crawlers) can read it.
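As a sketch, this copy-and-save step can also be scripted. The directives below are placeholders, and the file is written to the current directory for illustration; on a real server it must go in the web root:

```python
from pathlib import Path

# Placeholder directives; paste the generator's output here instead.
directives = """User-agent: *
Disallow:
"""

# The file must be named exactly "robots.txt".
path = Path("robots.txt")
path.write_text(directives)

# Crawlers fetch the file over HTTP, so it must be world-readable (e.g. mode 644).
path.chmod(0o644)
print(path.read_text(), end="")
```

The chmod call mirrors the permissions note above: if search engines cannot read the file, it is treated as if it did not exist.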


About Robots.txt Generator

When search engines crawl a site, they first look for a robots.txt file at the domain root. If one is found, they read its list of directives to see which directories and files, if any, are blocked from crawling. You can create this file with a robots.txt generator; Google and other search engines then use the generated file to figure out which pages on your site should be excluded. In other words, the file created by a robots.txt generator is like the opposite of a sitemap, which indicates which pages to include.
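The effect of such directives can be checked with Python's standard-library robots.txt parser; the rules and URLs below are hypothetical examples, not output from this generator:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical directives of the kind a robots.txt generator would emit.
rules = """User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blocked directory: crawling is refused for any robot.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False

# Anything not disallowed remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

This is the same logic a crawler applies: only the listed paths are excluded, and everything else is fair game, which is why the file is the opposite of a sitemap.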

The robots.txt generator

Robots.txt Generator is an easy-to-use tool for creating proper robots.txt directives for your site: easily copy and tweak robots.txt files from other sites, or create your own. The blocked-file rules it produces are, in some ways, the opposite of a website's sitemap, which lists the pages a search engine should include when it crawls the site.

To see a side-by-side comparison of how your site currently handles search bots versus how the proposed new robots.txt will work, type or paste your site's domain URL (or the URL of a page on your site) into the text box, and then click the Create Robots.txt button.

 

 


Robots.txt Generator Feedback