
Hosting multiple rails services on a single server + architecture of an "api enabled" website

I've just finished reading Paul Dix's book Service-Oriented Design with Ruby and Rails and I'd like to create a Rails 3 web app based on what I've just learned.

I think I've got the basic architecture right, but a simple question is blocking me: how am I supposed to host several REST services on the same server?

Here's how I see things for the moment:

  • Create *Service apps (UserService, XYZFeatureService, ...) based on Sinatra (I guess), which provide REST endpoints to access resources
  • Have a front-end Rails app with controllers/views/... that consumes data from the different services. End users would access it through http://www.myapp.com for instance.
  • And finally have a standalone "API" app to handle calls to https://api.myapp.com/* or https://www.myapp.com/api/*, publishing an external API that consumes the same services, with authentication, throttling, and so on layered on top.
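To make the first bullet concrete, here is a minimal sketch of what one such `*Service` app could look like. A Rack app is just an object responding to `call(env)` and returning `[status, headers, body]`; Sinatra adds routing sugar on top of exactly this. The service name, route, and data are hypothetical, not from the book:

```ruby
require 'json'

# Hypothetical UserService as a bare Rack-style app (no gems needed to
# define or invoke it). In production you'd likely use Sinatra instead.
USERS = { "1" => { "id" => 1, "name" => "Alice" } }

UserService = lambda do |env|
  case env["PATH_INFO"]
  when %r{\A/users/(\d+)\z}
    user = USERS[$1]
    if user
      [200, { "Content-Type" => "application/json" }, [user.to_json]]
    else
      [404, { "Content-Type" => "application/json" }, [{ "error" => "not found" }.to_json]]
    end
  else
    [404, { "Content-Type" => "text/plain" }, ["unknown endpoint"]]
  end
end

# Direct invocation, as Rack (or a test) would do:
status, _headers, body = UserService.call("PATH_INFO" => "/users/1")
```

Because the whole service is "a function of the request", it can be unit-tested without booting a web server at all.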

Does this sound like a good start to you?

As far as the implementation goes, from what I've read in the book I plan on creating gems to handle communication between the Rails app and the services (I may throw in some RabbitMQ, but that's another story).

However, as I only have one physical server, I'm wondering how I am going to make all those apps/services live together? My first guess is to launch each service app on localhost:xxxx where xxxx is a different unprivileged port for each service. I could configure each client gem in my rails app to use those ports.
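A sketch of what such a per-service client gem might look like, using only the standard library. The class name, default port, and endpoint are hypothetical; the point is that the host/port live in one place, so moving a service to its own machine later is a one-line configuration change:

```ruby
require 'net/http'
require 'json'
require 'uri'

# Hypothetical client for the UserService running on localhost:9001.
class UserServiceClient
  def initialize(host: "localhost", port: 9001)
    @base = URI("http://#{host}:#{port}")
  end

  # Build the full URL for a service path.
  def uri_for(path)
    URI.join(@base.to_s, path)
  end

  # Fetch a user; returns a Hash on success, nil otherwise.
  def find_user(id)
    response = Net::HTTP.get_response(uri_for("/users/#{id}"))
    JSON.parse(response.body) if response.is_a?(Net::HTTPSuccess)
  end
end

client = UserServiceClient.new(port: 9001)
client.uri_for("/users/42").to_s  # => "http://localhost:9001/users/42"
```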

Along with that, I'd probably run Apache 2 + Passenger to serve my rails front-end and the API service, using something like Rack::URLMap (or virtual hosts if using a subdomain) to direct requests to the right app. Should I then use Passenger to run my services too in a production environment?
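For the Rack::URLMap option mentioned above, the dispatch could be sketched in a single `config.ru` served by Passenger. The app constants and paths here are hypothetical placeholders, and this assumes both apps can boot in the same Ruby process:

```ruby
# config.ru -- hypothetical sketch: dispatch by path prefix.
# FrontendApp and ApiApp stand in for whatever Rack apps you boot.
run Rack::URLMap.new(
  "/"    => FrontendApp,
  "/api" => ApiApp
)
```

With subdomains (api.myapp.com), separate Apache virtual hosts pointing at separate Passenger app roots achieve the same separation without sharing a process.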

Is it the right way to go?! It feels consistent with what I've read and learned, and could easily be split across several physical servers later if needed, but I'd like to be sure I'm not missing something. Would you build things differently?

Thanks a lot for your input!

Update

Main questions I'd like to see answered are:

  • Is the described architecture appropriate to build a web app with external API endpoints?
  • Is it OK to have services running on a single server on different ports?

Thanks!

asked Dec 22 '11 by Olivier Lance


2 Answers

So this question is a bit more than 3 years old, and I think it could benefit from a reasonably objective answer.

It's funny to read this question again and see it's been upvoted recently when the simple, "high level" answer is just: do what you want/need to do!

There's no magic rule to observe, although I guess that's what I was looking for at the time. There are, however, some key things to try to keep in mind:

  • Designing a Service-Oriented Architecture means we're preparing for scaling. Each service is meant to run on its own and not depend on being run alongside the other services of the stack on the same server. Don't couple your services; they need to remain independent.

  • However, do not "over-prepare" for this: the temptation is high to spend a lot of time designing the perfect architecture, when what you actually need to do is ship your v1!

  • When you're building separate services, don't make it more complicated than needed: a simple web stack with REST(-like) endpoints will probably be sufficient to start with. RabbitMQ and other message queues are great too, but they solve problems you may not really have.

  • As far as servers are concerned, well... in an ideal world you'd want a server per service, all in a datacenter, with replication over a second (or more!) set of servers in another physically separated datacenter... this requires time and money to set up and to maintain. If you're in a big organization that might be fine, but if that's the case you probably didn't need to read this answer.
    So yeah, you can start small! One or two servers, or a "big one" with virtualized servers on it... it all depends on your confidence in administering the thing, or on hiring a sysadmin. A single machine can be enough, and do not hesitate to run several services on it, provided they can all share the same system and memory.
    Today, I'd probably use nginx to dispatch requests to the correct services, depending on hostnames or ports, and run private services on different ports with a firewall (Shorewall for instance) to block requests from the outside on those ports.
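A minimal sketch of that nginx dispatch setup (hostnames and ports are hypothetical; the private service ports would additionally be blocked at the firewall so only nginx reaches them):

```
# Public front-end and API dispatched by hostname; backends on local ports.
server {
    listen 80;
    server_name www.myapp.com;

    location / {
        proxy_pass http://127.0.0.1:3000;   # Rails front-end
        proxy_set_header Host $host;
    }
}

server {
    listen 80;
    server_name api.myapp.com;

    location / {
        proxy_pass http://127.0.0.1:9001;   # API service
        proxy_set_header Host $host;
    }
}
```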

There it is... like I said, there's no magical answer, only solutions to devise for each problem there is to solve. What I've learned during the past 3 years, working mostly alone on medium/large projects, is that starting simple is key.

answered Oct 13 '22 by Olivier Lance


I use the Apache + Passenger combo and a script (see below), but I've read a lot of benchmarks favouring Node.js behind an Nginx load balancer - at least for serving the web services API, it might make sense.

My script is:

def build_a_new_oxenserver
  site = siteurl.gsub(/\./, "_")
  system("rake add_site['#{siteurl}','#{site}','#{id}']") if Rails.env.production?

  default_tmpl = File.read(Rails.root.join("public", "default_template.html"))
  tmpl = Template.create(:ox_id => id, :name => "first template", :content => default_tmpl)
  pg = Page.create(:ox_id => id, :language_id => 1, :template_id => tmpl.id, :title => "Home",
                   :menu_label => "Zu Hause", :ancestry => nil, :root => true)

  # add the Paragraph element to this ox's toolbox
  self.elements << Element.find(1)

  # add an Article, a Paragraph, and a Post
  pe = PageElement.create(:element_id => Element.find(1).id)
  pe.elementable = Paragraph.create(:content => "This is written *in bold* -")
  pe.save
  pg.page_elements << pe
end

The add_site rake task does a remote job on the production server - creating the necessary folders, configuration files, and linked scripts to get a new 'instance' running. This way I'm able to extend my services, and with a little effort I'd be able to extend the load-balancing capabilities as well.

Please observe that this solution is a 'shared-source' version.

The rake script looks like this:

#
#   rake add_site["www.domain.tld", "www_domain_tld", 131]
desc "Task for adding new oxenserver site"
task :add_site, :domain, :site, :ox_id do |t, args|
  service_path = "/data/www/html/app_service"
  site_string = %{
    <VirtualHost *:80>
        ServerName #{args[:domain]}
        DocumentRoot #{service_path}/sites/#{args[:site]}/public
        PassengerAppRoot #{service_path}/sites/#{args[:site]}
        SetEnv OX_ID #{args[:ox_id]}
        <Directory #{service_path}/sites/#{args[:site]}/public>
                AllowOverride all
                Options -MultiViews
        </Directory>
    </VirtualHost>
  }
  File.open("tmp/#{args[:site]}.conf", "w") do |f|
    f.write site_string
  end

  site_start = %{
    mv #{service_path}/current/tmp/#{args[:site]}.conf /data/apache/conf.d/#{args[:site]}.conf
    service httpd restart
  }

  File.open("tmp/#{args[:site]}.sh", "w") do |f|
    f.write site_start
  end

  #
  sites_dir = "#{service_path}/sites/#{args[:site]}"
  shared_sites_dir = "#{service_path}/shared/sites/#{args[:site]}"
  shared_oxen_dir = "#{service_path}/shared/sites/oxen"
  current = "#{service_path}/current"


  # prepare system files/directories
  system "mkdir #{sites_dir} #{shared_sites_dir} #{shared_sites_dir}/public"
  system "cd #{sites_dir} && cp -rus #{current}/* ."
  system "cd #{shared_sites_dir}/public && cp -r #{shared_oxen_dir}/public/* ."
  system "cd #{shared_sites_dir} && mkdir log tmp && cd #{sites_dir} && rm -rf public log tmp && ln -s #{shared_sites_dir}/public public && ln -s #{shared_sites_dir}/log log && ln -s #{shared_sites_dir}/tmp tmp"
  system "cd #{sites_dir} && touch tmp/restart.txt log/production.log"
  system "mv tmp/#{args[:site]}.sh public/system/job_queue/#{args[:site]}.sh"
end
answered Oct 13 '22 by walt_die