The robots.txt file is one of the factors in search engine optimization, but Magento does not include it by default. So, let's fix this and create the file ourselves:
# "*" indicates that the rules are defined for all bots
User-agent: *

# throttle bots so that pages do not take too long to load
Crawl-delay: 10

# Now come the rules: restrict robots from indexing the following pages:
Disallow: /admin/
Disallow: /app/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /shell/
Disallow: /var/
Disallow: /api.php
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /get.php
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /RELEASE_NOTES.txt

# These rules are supported only by major bots
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*
Disallow: /*?SID=

# Do not index checkout, customer account, and catalog search pages:
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
These are the basic rules; the robots.txt file should be customized for each project individually.
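Before deploying, you can sanity-check rules like these with a parser. Below is a minimal sketch using Python's standard urllib.robotparser module; the example.com URLs are placeholders for your store's domain, and the rule set is a hypothetical subset of the file above. Note that this parser implements only the original prefix-matching specification, so the wildcard patterns marked "supported only by major bots" (such as /*?SID=) are not evaluated by it.

from urllib.robotparser import RobotFileParser

# Hypothetical subset of the rules above; wildcard patterns like
# "Disallow: /*?SID=" are omitted because urllib.robotparser only
# implements prefix matching, not wildcards.
RULES = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# example.com is a placeholder for your store's domain.
for url in (
    "http://example.com/checkout/cart/",
    "http://example.com/customer/account/login/",
    "http://example.com/some-category.html",
):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")

print("crawl delay:", rp.crawl_delay("*"))  # 10, per the Crawl-delay rule

The first two URLs fall under the Disallow prefixes and print "blocked", while the category page prints "allowed"; this is a quick way to verify you have not accidentally blocked pages you want indexed.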
Useful links:
http://en.wikipedia.org/wiki/Robots.txt
http://www.mcanerin.com/EN/search-engine/robots-txt.asp