 

django serving robots.txt efficiently

Here is my current method of serving robots.txt

url(r'^robots\.txt/$', TemplateView.as_view(template_name='robots.txt',
                                            content_type='text/plain')),

I don't think this is the best way. It would be better served as a purely static resource. But the way my Django app is structured, the static root and all static files live under

http://my.domain.com/static/stuff-here

Any thoughts? I'm an amateur at Django, but

    TemplateView.as_view(template_name='robots.txt',
                         content_type='text/plain')

looks a lot more resource-consuming than a plain static request to my static directory, which is served by nginx.

Lucas Ou-Yang asked Aug 24 '13 23:08

2 Answers

Yes, robots.txt should not be served by Django if the file is static. Try something like this in your Nginx config file:

location /robots.txt {
    alias /path/to/static/robots.txt;
}

See here for more info: https://nginx.org/en/docs/http/ngx_http_core_module.html#alias

Same thing applies to the favicon.ico file if you have one.
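Since nginx is already fronting the static files, both files can even be handled in one block. A sketch (the regex form and the `root` path are assumptions to adapt to your setup; with `root`, nginx appends the full URI, so `/robots.txt` maps to `/path/to/static/robots.txt`):

```nginx
location ~ ^/(robots\.txt|favicon\.ico)$ {
    root /path/to/static;
}
```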

The equivalent code for Apache config is:

Alias /robots.txt /path/to/static/robots.txt
HankMoody answered Oct 16 '22 09:10


I know this is a late reply; I was looking for a similar solution for when you don't have access to the web server config. For anyone else in the same situation, I found this page: http://www.techstricks.com/adding-robots-txt-to-your-django-project/

which suggests adding this to your project's urls.py:

from django.conf.urls import url
from django.http import HttpResponse

urlpatterns = [
    #.... your project urls
    url(r'^robots\.txt$', lambda request: HttpResponse("User-Agent: *\nDisallow:", content_type="text/plain"), name="robots_file"),
]

which I think should be slightly more efficient than using a template file, although it could make your URL rules untidy if you need multiple 'Disallow:' options.
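If you do need several Disallow rules, one way to keep the URL pattern tidy is to build the response body in a small helper and pass that to the lambda. A sketch (the helper name `robots_txt_body` and the example paths are hypothetical, not from the linked page):

```python
def robots_txt_body(disallow_paths):
    """Build a robots.txt body: allow everything except the given paths."""
    lines = ["User-Agent: *"]
    if disallow_paths:
        lines += ["Disallow: %s" % path for path in disallow_paths]
    else:
        # An empty Disallow value means "allow everything"
        lines.append("Disallow:")
    return "\n".join(lines) + "\n"
```

Then the URL rule stays a one-liner, e.g. `url(r'^robots\.txt$', lambda request: HttpResponse(robots_txt_body(["/admin/", "/private/"]), content_type="text/plain"))`.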

Stephen Bridgett answered Oct 16 '22 08:10