
Robots.txt: Disallow subdirectory but allow directory

Tags: robots.txt

I want to allow crawling of files in:

/directory/

but not crawling of files in:

/directory/subdirectory/

Is the correct robots.txt instruction:

User-agent: *
Disallow: /subdirectory/

I'm afraid that if I disallowed /directory/subdirectory/, I would be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:

User-agent: *
Disallow: /subdirectory/
asked Mar 22 '11 by user523521


2 Answers

You're overthinking it:

User-agent: *
Disallow: /directory/subdirectory/

is correct.
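
To see this concretely, here is a minimal sketch using Python's standard-library urllib.robotparser; the example.com host and the page names are placeholders, not from the question:

from urllib import robotparser

# Parse the exact rules from this answer
# (example.com and the page paths below are hypothetical).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /directory/subdirectory/",
])

# A file directly under /directory/ stays crawlable:
print(rp.can_fetch("*", "https://example.com/directory/page.html"))  # True

# A file under /directory/subdirectory/ is blocked:
print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))  # False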

answered Oct 29 '22 by Matthew Flaschen


User-agent: *
Disallow: /directory/subdirectory/

Spiders aren't stupid, they can parse a path :)
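
To illustrate that: a Disallow rule matches any URL whose path starts with the given value, and paths not matched by any rule remain crawlable by default, so no Allow line is needed here (the file names in the comments are hypothetical):

User-agent: *
# Blocks /directory/subdirectory/page.html and anything else
# whose path begins with this prefix:
Disallow: /directory/subdirectory/
# /directory/page.html matches no rule, so it stays crawlable.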

answered Oct 29 '22 by alex