Robots.txt Generator



The generator lets you set:

  • Default policy — whether all robots are allowed or refused by default
  • Crawl-Delay — an optional delay between crawler requests
  • Sitemap — your sitemap URL (leave blank if you don't have one)
  • Search Robots — per-bot rules for Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, and MSN PicSearch
  • Restricted Directories — paths to disallow; each path is relative to the root and must end with a trailing slash "/"



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
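The step above can be sketched in code. This is a hypothetical illustration (the function name and parameters are not part of the tool) of how the generator's fields assemble into a robots.txt file written at the site root:

```python
def build_robots_txt(disallow=(), crawl_delay=None, sitemap=None, agent="*"):
    """Assemble robots.txt content from the generator's fields."""
    lines = [f"User-agent: {agent}"]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for path in disallow:
        # Paths are relative to root and should end with a trailing slash.
        lines.append(f"Disallow: {path}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

content = build_robots_txt(disallow=["/wp-admin/"], crawl_delay=10,
                           sitemap="http://www.example.com/sitemap.xml")

# Write the result to the root directory of the site:
with open("robots.txt", "w") as f:
    f.write(content)
```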


About Robots.txt Generator


Use our Robots.txt generator to create a Robots.txt file for your website.

Each rule you add consists of three fields: the URL (or path) it applies to, the user agent it targets, and the action (allow or disallow) to take.

Unsure of what you’re doing? Read our robots.txt guide first.

Link to this tool:

<a href="http://www.geekiti.com/seo-tools/robots-txt-generator/">Robots.txt Generator</a>

What is a Robots.txt File?

A robots.txt file is simply a set of instructions, or preferences, for search engines. You can use it to let Google and other search engines know that you’d like them to ignore certain files or directories, ultimately excluding them entirely from being indexed in search results.

Example of a robots.txt file:

User-Agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /out/
Sitemap: http://www.example.com/sitemap.xml
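You can verify how crawlers will interpret a file like the example above using Python's standard-library robots.txt parser. A minimal sketch (the rules and URLs here are just the example values from above):

```python
from urllib.robotparser import RobotFileParser

# The example rules from above, as a list of lines.
rules = [
    "User-Agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
    "Disallow: /out/",
]

rp = RobotFileParser()
rp.parse(rules)

# Disallowed directories are blocked for all agents...
print(rp.can_fetch("*", "http://www.example.com/wp-admin/"))  # False
# ...while everything else remains crawlable.
print(rp.can_fetch("*", "http://www.example.com/blog/"))      # True
```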

Supported User-Agents

  • Googlebot
  • Googlebot-Mobile
  • Googlebot-Image
  • Googlebot-Video
  • Mediapartners-Google
  • AdsBot-Google
  • Bingbot
  • Adidxbot
  • MSNBot
  • BingPreview

 

The first version of our Robots.txt file generator has launched. It allows unlimited items to be added, with the All and Googlebot User-Agents available to kick things off; these will soon expand to include the Google Image bot as well as other search engine bots. Upon creation, a ‘robots.txt’ file will instantly be downloaded to your PC or laptop. Robots.txt file guidance will be added to this page later on.