Deploying to multiple EC2 servers with Fabric

I'm wondering if anyone has experience deploying to multiple servers behind a load balancer on EC2 with Fabric.

I have used Fabric for a while now and have no issues with it, or with deploying to multiple servers. What I would like to do in this scenario (let's say I have ten instances running) is: de-register half (5) of the boxes from my load balancer, deploy my code to them, and run a smoke test. If everything looks good, register them with the load balancer again, then de-register the remaining 5 instances, deploy to them, and register them back with the load balancer.

I have no problem accomplishing any of the individual tasks (de-registering, running tests, deploying, etc.); I just don't know how to organize my hosts in a simple fashion so that I can deploy to the first half, then the second half. Fabric seems to be set up to run the same tasks on all hosts in order (task 1 on host 1, task 1 on host 2, task 2 on host 1, task 2 on host 2, and so on).
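For what it's worth, splitting the pool itself is the easy part. Assuming the instance hostnames are already in a plain Python list, a helper like this (the name `split_pool` is illustrative, not part of Fabric) divides it in two:

```python
# Split a host pool into two halves for a rolling deploy.
# Purely an illustrative helper; not part of Fabric's API.
def split_pool(hosts):
    mid = len(hosts) // 2
    return hosts[:mid], hosts[mid:]

first_half, second_half = split_pool(
    ['web1', 'web2', 'web3', 'web4', 'web5',
     'web6', 'web7', 'web8', 'web9', 'web10']
)
# first_half  -> ['web1', 'web2', 'web3', 'web4', 'web5']
# second_half -> ['web6', 'web7', 'web8', 'web9', 'web10']
```

The hard part, as the answers below discuss, is getting Fabric to run a whole task sequence against one half before starting on the other.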

My first thought was to create a task to handle the first part (de-registering, deploying, and testing) and then set env.hosts to the second half of the servers, but that felt a bit hokey.

Has anyone modeled something similar to this with Fabric before?

asked May 01 '12 by MattoTodd

2 Answers

You can simplify this by defining roles (named groups of hosts), executing your tasks on one role, running your tests, and then deploying to the second role.

Example of roledefs:

from fabric.api import env, execute

env.roledefs = {
    'first_half': ['host1', 'host2'],
    'second_half': ['host3', 'host4'],
}

def deploy_server():
    ...
    # deploy to one host of the current role here

def deploy():
    # first half:
    execute(deploy_server, roles=['first_half'])
    test()  # smoke-test the freshly deployed servers here
    # second half:
    execute(deploy_server, roles=['second_half'])

More links:

  • env.roledefs documentation
  • env.roles documentation
  • execution model
  • execute() documentation
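Neither half of this covers the load-balancer step from the question, but the register/de-register calls can be wrapped around the deploy. The sketch below is an assumption-heavy illustration: `with_instances_rotated` and its arguments are hypothetical names, and `elb` stands in for any object exposing `deregister_instances`/`register_instances` methods (boto's ELB `LoadBalancer` object has methods by these names); no real AWS calls are made here:

```python
# Sketch: take a group of instances out of the load balancer, run the
# deploy, and put them back afterwards. The function and argument names
# are hypothetical; `elb` only needs deregister_instances() and
# register_instances() methods.
def with_instances_rotated(elb, instance_ids, deploy_fn):
    elb.deregister_instances(instance_ids)
    try:
        return deploy_fn(instance_ids)
    finally:
        # Re-register in all cases; you may prefer to skip this on
        # failure so broken boxes stay out of rotation.
        elb.register_instances(instance_ids)
```

You would call this once per half of the pool, with `deploy_fn` doing the actual Fabric work.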
answered Oct 17 '22 by Tadeck

You want to use the execute() function. This will allow you to do something like this:

from fabric.api import execute

def update():
    deallocate()
    push_code()
    smoke_test()  # could fail fast
    reallocate()

def deploy():
    # first_five / last_five are your two lists of hosts
    execute(update, hosts=first_five)
    execute(update, hosts=last_five)

You could also make each of the deallocate, push_code, and smoke_test tasks its own execute() call inside deploy(); then you would run all the deallocates, then all the code pushes, and so on.

Then add a check between the two groups, and only proceed to run the same tasks on the remaining hosts if it passes.
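The fail-fast flow between groups can be sketched independently of Fabric (all names below are illustrative):

```python
# Deploy group by group; stop before touching the next group if the
# smoke test on the current one fails. All names are illustrative.
def phased_deploy(groups, deploy_fn, smoke_test_fn):
    for group in groups:
        deploy_fn(group)
        if not smoke_test_fn(group):
            raise RuntimeError('smoke test failed on %r' % (group,))
```

With Fabric, `deploy_fn` would be a wrapper around `execute(update, hosts=group)`, so the second half of the pool is never touched if the first half fails its smoke test.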

answered Oct 17 '22 by Morgan