## GENERAL SETTINGS

## Enable robots.txt rules for all crawlers
User-agent: *

## Crawl-delay parameter: number of seconds to wait between successive
## requests to the same server. Set a custom crawl rate if you're
## experiencing traffic problems with your server.
Crawl-delay: 30

## Do not crawl development files and folders: CVS and SVN directories, and dump files
Disallow: /CVS
Disallow: /*.svn$
Disallow: /*.idea$
Disallow: /*.sql$
Disallow: /*.tgz$

## Do not crawl the Magento admin page
Disallow: /controlsection/

## Do not crawl common Magento technical folders
Disallow: /app/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /shell/
Disallow: /var/
Disallow: /cgi-bin/
Disallow: /js/

## Do not crawl the duplicate copy of the home page (example.com/index.php/)
Disallow: /index.php/

## Do not crawl links with session IDs
Disallow: /*?SID=

## Do not crawl checkout and user account pages
Disallow: /onestepcheckout/index/
Disallow: /checkout/cart/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow: /control/
Disallow: /customer/
Disallow: /customize/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /wishlist/

## Do not crawl search pages and non-SEO-optimized catalog links
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
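
## Optional: point crawlers at the store's XML sitemap. The URL below is a
## placeholder, not part of the original rules: replace example.com with your
## own domain, and adjust the path if your Magento sitemap is generated at a
## different location.
Sitemap: https://example.com/sitemap.xml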