Multiple robots.txt for subdomains in Rails

I have a site with multiple subdomains, and I want the robots.txt served on the named subdomains to be different from the one served on www.

I tried using .htaccess, but FastCGI doesn't look at it.

So I tried setting up routes, but it doesn't seem that you can do a direct rewrite, since every route needs a controller:

map.connect '/robots.txt', :controller => ?, :path => '/robots.www.txt', :conditions => { :subdomain => 'www' }
map.connect '/robots.txt', :controller => ?,  :path => '/robots.club.txt'

What would be the best way to approach this problem?

(I am using the request_routing plugin for subdomains)

asked May 01 '10 by Christopher

4 Answers

As of Rails 6.0 this has been greatly simplified.

By default, if you use the :plain option, the text is rendered without using the current layout. If you want Rails to put the text into the current layout, you need to add the layout: true option and use the .text.erb extension for the layout file. Source

class RobotsController < ApplicationController
  def robots
    subdomain = request.subdomain # whatever logic you need
    robots = File.read("#{Rails.root}/config/robots.#{subdomain}.txt")
    render plain: robots
  end
end

In routes.rb

get '/robots.txt', to: 'robots#robots'
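If a request can arrive on an unexpected subdomain, the File.read above will raise on a missing file. A minimal guard, assuming the files live in config/ and that falling back to the www file is acceptable (the whitelist below is purely illustrative):

class RobotsController < ApplicationController
  # Illustrative whitelist; adjust to the subdomains you actually serve.
  KNOWN_SUBDOMAINS = %w[www club].freeze

  def robots
    # Fall back to the www file for anything unexpected, so File.read never raises.
    subdomain = KNOWN_SUBDOMAINS.include?(request.subdomain) ? request.subdomain : "www"
    render plain: File.read(Rails.root.join("config", "robots.#{subdomain}.txt"))
  end
end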
answered by Ryan Romanchuk

Why not use Rails' built-in views?

In your controller add this method:

class StaticPagesController < ApplicationController
  def robots
    render :layout => false, :content_type => "text/plain", :formats => :txt
  end
end

In the views, create a file app/views/static_pages/robots.txt.erb with your robots.txt content.

In routes.rb place:

get '/robots.txt' => 'static_pages#robots'

Delete the file /public/robots.txt

You can add specific business logic as needed, but this way we don't have to read any custom files ourselves.
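For instance, app/views/static_pages/robots.txt.erb can branch on the subdomain itself; the subdomain names and policy below are only an assumption of what yours might look like:

<% if request.subdomain == "www" %>
User-agent: *
Disallow: /admin
<% else %>
# Illustrative policy: keep crawlers off the named subdomains entirely.
User-agent: *
Disallow: /
<% end %>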

answered by ramigg

Actually, you probably want to register a MIME type in config/initializers/mime_types.rb and render inside a respond_to block so the response isn't returned as text/html:

Mime::Type.register "text/plain", :txt

Then, your routes would look like this:

map.robots '/robots.txt', :controller => 'robots', :action => 'robots'

For Rails 3:

match '/robots.txt' => 'robots#robots'

and the controller something like this (put the file(s) wherever you like):

class RobotsController < ApplicationController
  def robots
    subdomain = request.subdomains.first # get the subdomain; escape/whitelist it before using it in a path
    robots = File.read(RAILS_ROOT + "/config/robots.#{subdomain}.txt")
    respond_to do |format|
      format.txt { render :text => robots, :layout => false }
    end
  end
end

At the risk of overengineering it, I might even be tempted to cache the file read operation...
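Something along these lines would do it; Rails.cache and the one-hour expiry are just one option, and the cache key and file paths are assumptions carried over from above:

def robots
  subdomain = request.subdomains.first # escape/whitelist as noted above
  # Cache the disk read per subdomain so repeated requests skip the filesystem.
  robots = Rails.cache.fetch("robots/#{subdomain}", :expires_in => 1.hour) do
    File.read(RAILS_ROOT + "/config/robots.#{subdomain}.txt")
  end
  respond_to do |format|
    format.txt { render :text => robots, :layout => false }
  end
end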

Oh, yeah, you'll almost certainly have to remove/move the existing 'public/robots.txt' file.

Astute readers will notice that you can easily substitute RAILS_ENV for subdomain...
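For example, keeping the rest of the action unchanged (the per-environment file names here are assumptions):

# e.g. config/robots.production.txt allows crawling while config/robots.staging.txt disallows everything
robots = File.read(RAILS_ROOT + "/config/robots.#{RAILS_ENV}.txt") # Rails.root / Rails.env in Rails 3+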

answered by TA Tyree


For Rails 3:

Create a controller RobotsController:

class RobotsController < ApplicationController
  # This controller renders the correct 'robots' view depending on your subdomain.
  def robots
    subdomain = request.subdomain # you should also check for emptiness
    render "robots.#{subdomain}"
  end
end

Create robots views (one per subdomain):

  • views/robots/robots.subdomain1.txt
  • views/robots/robots.subdomain2.txt
  • etc...

Add a new route in config/routes.rb: (note the :txt format option)

match '/robots.txt' => 'robots#robots', :format => :txt

And of course, you should declare the :txt format in config/initializers/mime_types.rb:

Mime::Type.register "text/plain", :txt

Hope it helps.

answered by levandch