
Continuous deployment & AWS autoscaling using Ansible (+Docker?)

My organization's website is a Django app running on front end webservers + a few background processing servers in AWS.

We're currently using Ansible for both:

  • system configuration (from a bare OS image)
  • frequent manually-triggered code deployments.

The same Ansible playbook is able to provision either a local Vagrant dev VM, or a production EC2 instance from scratch.

We now want to implement autoscaling in EC2, and that requires some changes towards a "treat servers as cattle, not pets" philosophy.

The first prerequisite, moving from a statically managed Ansible inventory to a dynamic, EC2 API-based one, is done.
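As an illustration of the dynamic inventory step, here is a minimal sketch using Ansible's `aws_ec2` inventory plugin (the original approach at the time was the `ec2.py` script, which works similarly; the region, tags and group prefix below are assumptions, not from the question):

```yaml
# inventory_aws_ec2.yml -- illustrative aws_ec2 inventory plugin config
plugin: amazon.aws.aws_ec2
regions:
  - eu-west-1
filters:
  # only pick up running instances tagged for this app
  tag:app: mysite
  instance-state-name: running
keyed_groups:
  # build groups such as role_web / role_worker from each instance's "role" tag
  - key: tags.role
    prefix: role
```

With this in place, `ansible-inventory -i inventory_aws_ec2.yml --graph` shows the live EC2 hosts grouped by tag, so playbooks no longer depend on a static hosts file.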

The next big question is how to deploy in this new world where throwaway instances come up & down in the middle of the night. The options I can think of are:

  1. Bake a new fully-deployed AMI for each deploy, create a new AS launch configuration and update the AS group with it. Sounds very, very cumbersome, but also very reliable because of the clean-slate approach, and it ensures that any system changes the code requires will be there. Also, no additional steps are needed on instance boot-up, so instances are up & running more quickly.
  2. Use a base AMI that doesn't change very often, automatically get the latest app code from git upon boot-up, start the webserver. Once it's up, just do manual deploys as needed, like before. But what if the new code depends on a change in the system config (a new package, permissions, etc.)? It looks like you have to start tracking dependencies between code versions and system/AMI versions, whereas the "just do a full Ansible run" approach was more integrated and more reliable. Is this more than just a potential headache in practice?
  3. Use Docker? I have a strong hunch it could be useful, but I'm not sure yet how it would fit our picture. We're a relatively self-contained Django front-end app with just RabbitMQ + memcache as services, which we're never going to run on the same host anyway. So what are the benefits of building a Docker image with Ansible that contains system packages + the latest code, rather than having Ansible just do it directly on an EC2 instance?
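For option 2, the boot-time deploy step might be a small piece of cloud-init user-data that runs `ansible-pull` against the repo when the instance comes up (a hypothetical sketch; the repo URL, playbook name and service name are placeholders, not from the question):

```yaml
#cloud-config
# Illustrative user-data for option 2: base AMI + fetch latest code on boot.
# Repo URL, playbook and service names are placeholders.
runcmd:
  # run the same playbook that provisions the app, pulling the latest code
  - ansible-pull -U https://git.example.com/mysite.git site.yml
  # start the application server once the code is in place
  - service gunicorn start
```

Note that this sketch inherits exactly the drawback option 2 describes: if HEAD of the repo assumes a system change the base AMI doesn't have, the boot-time run fails or drifts.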

How do you do it? Any insights / best practices? Thanks!

asked Apr 14 '14 by renaudg


1 Answer

This question is very opinion-based. But just to give you my take: I would go with pre-baking the AMIs with Ansible, then use CloudFormation to deploy your stacks with autoscaling, monitoring and your pre-baked AMIs. The advantage of this is that if most of the application stack is pre-baked into the AMI, scaling up happens faster.
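A rough sketch of what that CloudFormation stack could look like (resource names, instance type, sizes and subnet IDs are illustrative only; each deploy would bake a new AMI and update the `BakedAmiId` parameter):

```yaml
# Illustrative CloudFormation fragment: autoscaling group on a pre-baked AMI.
Parameters:
  BakedAmiId:
    Type: AWS::EC2::Image::Id   # updated on each deploy with the freshly baked AMI
Resources:
  WebLaunchConfig:
    Type: AWS::AutoScaling::LaunchConfiguration
    Properties:
      ImageId: !Ref BakedAmiId
      InstanceType: t2.small
  WebAutoScalingGroup:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      LaunchConfigurationName: !Ref WebLaunchConfig
      MinSize: "2"
      MaxSize: "10"
      VPCZoneIdentifier:
        - subnet-aaaa1111   # placeholder subnet IDs
        - subnet-bbbb2222
```

Rolling out a new deploy is then a stack update with the new `BakedAmiId`; CloudFormation replaces the launch configuration, and new instances boot from the fully deployed image with no bootstrap steps.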

Docker is another approach, but in my opinion it adds an extra layer to your application that you may not need if you are already using EC2. Docker can be really useful if, say, you want to containerize multiple applications on a single server. Maybe you have some spare capacity on a server, and Docker will allow you to run that extra application on the same server without interfering with the existing ones.

Having said that, some people find Docker useful not as a way to optimize resources on a single server, but rather because it lets you pre-bake your applications into containers. So when you deploy a new version or new code, all you have to do is copy/replicate these Docker containers across your servers, then stop the old container versions and start the new ones.
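That container-replacement flow might look roughly like this on each host (an ops sketch only; the registry, image name, tag and port are placeholders):

```shell
# Illustrative rolling replacement of one app container on a host.
# Registry, image name, tag and port are placeholders.
docker pull registry.example.com/mysite:v42    # fetch the new pre-baked image
docker stop mysite && docker rm mysite         # stop & remove the old version
docker run -d --name mysite -p 8000:8000 \
    registry.example.com/mysite:v42            # start the new version
```

In practice you'd drain the host from the load balancer before the stop/start and re-add it after, so the swap is invisible to users.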

My two cents.

answered Sep 24 '22 by Rico