How to automate multi-server deployment using Docker

Here's my situation:

  • I have a project written in Go, stored on GitHub
  • I have 3 app servers behind a load balancer (app1, app2, app3)
  • I have a Dockerfile as part of the project in git which, when used to build an image, installs all my app dependencies (including Go) and produces a working environment for my app
  • I have containers running on all 3 app servers and everything is working marvellously

Now I want to change some code and redeploy my changes to those 3 servers. I can think of 3 possible ways to automate this:

  1. As part of my Dockerfile I can add a step that pulls my code from GitHub and builds it. To redeploy, I need a script that logs into the 3 servers, rebuilds the images and runs the containers, pulling all new code in the process. At most, all I ever need to push to a server is the Dockerfile.
  2. As part of my Dockerfile I can have an ADD command to bundle my code into the container. I would then need to deploy my entire project to each server using something like Capistrano or Fabric, then kill the old container, rebuild and run.
  3. I can use a nominated machine (or my dev environment) to build a new image from the current source code, push this image to the registry, and then have a script which logs into my servers, pulls the new image down, kills the old container and runs the new one. (A sketch of a Dockerfile that bakes the code into the image, as in options 2 and 3, follows this list.)
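
Something like the following minimal Dockerfile would cover the option 2 / option 3 style, where the source is baked into the image at build time. This is only a sketch: the base image, binary name and port are placeholders, and a go.mod at the project root is assumed.

    # Sketch only: copy the source into the image and compile it there, so the
    # image fully captures exactly one version of the app.
    FROM golang:1.21

    WORKDIR /src

    # Copy the project (the git checkout used as the build context) into the image.
    COPY . .

    # Compile the app; assumes a go.mod at the project root.
    RUN go build -o /usr/local/bin/myapp .

    # Placeholder port; adjust to whatever the app listens on.
    EXPOSE 8080
    CMD ["/usr/local/bin/myapp"]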

Number 1 seems the easiest, but most of the other discussion I've read about Docker leans towards something like option 3, which seems rather long-winded to me.

What is the best option here (or is there a better one I haven't listed)? I'm new to Docker, so have I missed something? I asked someone who knows Docker and their response was "you're not thinking in the Docker way", so what is the Docker way?

asked Jun 04 '14 by Matt Harrison

People also ask

Can Docker run multiple processes?

It's ok to have multiple processes, but to get the most benefit out of Docker, avoid one container being responsible for multiple aspects of your overall application. You can connect multiple containers using user-defined networks and shared volumes.
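
A small, hypothetical illustration of that pattern using stock images (the network, volume and container names are placeholders):

    # Each container does one job; they cooperate via a user-defined network
    # and a shared volume.
    docker network create app-net
    docker volume create app-data

    docker run -d --name web    --network app-net -v app-data:/data nginx
    docker run -d --name worker --network app-net -v app-data:/data alpine sleep 3600

    # On the same user-defined network, containers can reach each other by
    # name, e.g. the worker can resolve "web".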

How Docker compose deploys multi containerized application?

The docker-compose.yml file allows you to configure and document all your application's service dependencies (other services, cache, databases, queues, etc.). Using the docker-compose CLI, you can create and start one or more containers for each dependency with a single command (docker-compose up).
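
A minimal, hypothetical docker-compose.yml along those lines (the service names, images and port are placeholders):

    version: "3"
    services:
      app:
        build: .            # built from the project's own Dockerfile
        ports:
          - "8080:8080"
        depends_on:
          - redis
          - db
      redis:
        image: redis:7
      db:
        image: postgres:15
        environment:
          POSTGRES_PASSWORD: example

Running docker-compose up -d then builds and starts all three services together.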

Can a Docker container connect to multiple networks?

You can create multiple networks with Docker and add containers to one or more networks. Containers can communicate within networks but not across networks. A container with attachments to multiple networks can connect with all of the containers on all of those networks.
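
For example (the network, container and image names below are placeholders, not from the answer above):

    # Two isolated networks.
    docker network create frontend
    docker network create backend

    # "web" only joins the frontend network, "db" only joins the backend one,
    # so they cannot reach each other.
    docker run -d --name web --network frontend nginx
    docker run -d --name db  --network backend -e POSTGRES_PASSWORD=example postgres:15

    # A container attached to both networks can talk to containers on both.
    docker run -d --name app --network frontend myapp:latest   # placeholder image
    docker network connect backend app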


1 Answer

I think the idea behind option 3 is that you build the image only once, which means all servers run the same image. The other two approaches may produce different images.

For example, in a slightly more involved scenario with option 1, the three builds could even pick up different commits.
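
A hedged sketch of what that build-once, deploy-everywhere flow could look like; the registry address, image name, hostnames and port are placeholders and not part of the original question:

    #!/usr/bin/env bash
    set -euo pipefail

    # Tag the image with the commit being deployed so every server gets the
    # exact same build.
    TAG=$(git rev-parse --short HEAD)
    IMAGE="registry.example.com/myapp:${TAG}"

    docker build -t "$IMAGE" .   # build exactly once
    docker push "$IMAGE"         # publish to the registry

    # On each server: pull that tag and replace the running container.
    for host in app1 app2 app3; do
      ssh "$host" "docker pull $IMAGE \
        && (docker rm -f myapp || true) \
        && docker run -d --name myapp -p 8080:8080 $IMAGE"
    done

Because every host pulls the same tag, the three servers cannot drift onto different commits.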

answered Oct 07 '22 by ivant