If I want to allow crawlers to access only index.php, will this work?
User-agent: *
Disallow: /
Allow: /index.php
Most major crawlers (Google, Bing) resolve conflicts by applying the most specific matching rule, so the following should work for them. Note that `/$` matches only the root URL itself; the `$` end-anchor is an extension supported by the big search engines but not part of the original standard:

User-agent: *
Allow: /$
Allow: /index.php
Allow: /sitemap.xml
Allow: /robots.txt
Disallow: /

Sitemap: http://www.your-site-name.com/sitemap.xml
Try swapping the order of Disallow / Allow:
User-agent: *
Allow: /index.php
Disallow: /
See this info from wikipedia:
"Yet, in order to be compatible to all robots, if you want to allow single files inside an otherwise disallowed directory, you need to place the Allow directive(s) first, followed by the Disallow, for example:"
http://en.wikipedia.org/wiki/Robots.txt
Still, I wouldn't expect it to work consistently across all crawlers, since Allow was not part of the original robots.txt standard and some robots ignore it entirely.
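You can sanity-check the Allow-before-Disallow file locally with Python's standard-library robots.txt parser, which (like the original spec) honors the first rule that matches the path. The host name below is just a placeholder:

```python
import urllib.robotparser

# The swapped-order file suggested above: Allow first, then Disallow.
ROBOTS_TXT = """\
User-agent: *
Allow: /index.php
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# index.php is allowed; everything else is blocked.
print(rp.can_fetch("*", "http://www.example.com/index.php"))   # True
print(rp.can_fetch("*", "http://www.example.com/about.html"))  # False
```

This only tells you how a spec-compliant parser reads the file; individual crawlers may still interpret it differently, which is why testing against each engine's own tools (e.g. Google Search Console's robots.txt report) is worthwhile.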