 

Disallow certain page directories but NOT that page itself

Tags:

robots.txt

Let's say I have a dynamic page that creates URLs from user input. For example: www.XXXXXXX.com/browse <-------- ("browse" being the page)

Every time a user enters a query, it generates more pages. For example: www.XXXXXXX.com/browse/abcd <-------- (abcd being the new page)

Now, I want Google to crawl this "browse" page, but not the sub-pages generated by it.

I'm thinking of adding this to my robots.txt file: "Disallow: /browse/"

Would that be the right thing to do, or would it also prevent Googlebot from crawling the "browse" page itself? What should I do to get the optimal result?

asked Dec 25 '15 by Raj Sandhu


1 Answer

The "browse" URL itself doesn't end with a slash:

www.XXXXXXX.com/browse

The trailing slash matters here: Disallow: /browse/ only matches paths that begin with /browse/, such as /browse/abcd, so /browse itself stays crawlable. Therefore this rule should work:

User-agent: *
Disallow: /browse/
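
If you want to sanity-check this behavior locally, here is a minimal sketch using Python's standard urllib.robotparser module. The domain is the placeholder from the question, not a real site:

import urllib.robotparser

# Parse the proposed rules directly, without fetching a live robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /browse/",
])

# /browse itself is allowed; anything under /browse/ is blocked
print(rp.can_fetch("*", "http://www.XXXXXXX.com/browse"))       # True
print(rp.can_fetch("*", "http://www.XXXXXXX.com/browse/abcd"))  # False

Googlebot follows the same matching logic, so the generated sub-pages are blocked while the "browse" page remains crawlable.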
answered Oct 13 '22 by znurgl