
Where keep deployment files in Multi Container, Multi Repository Project?

I have a stack that consists of three git repositories, each holding its own application:

  • The Frontend Application
  • The Backend Application
  • The Worker Application

They all work together through a docker-compose.yml file that also includes the other resources and images needed to deploy the app.
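
A file along these lines, where the image names and the extra db service are only placeholders to show the shape of it:

version: "3"
services:
  frontend:
    image: myregistry/frontend:latest   # built from the frontend repository
    ports:
      - "80:80"
  backend:
    image: myregistry/backend:latest    # built from the backend repository
    depends_on:
      - db
  worker:
    image: myregistry/worker:latest     # built from the worker repository
    depends_on:
      - db
  db:
    image: postgres:10                  # one of the "other resources"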

When I deploy the application, I only need to push the docker-compose.yml file (or, in the case of AWS, a Dockerrun.aws.json file). So my question is: where should I keep this file? It does not belong to any of the repositories above, since it orchestrates all of them.

Should I set up a 4th repository called "The Deployment Repository" that keeps all my deployment files and configuration? Or do I pick one of the repositories above and decide that it is "the repo I deploy from", keeping everything there? Or do I store copies of the deployment files in all repositories, so that I can deploy the full application no matter which repository I'm working on?

asked Nov 27 '17 by Marcus Lind


2 Answers

A few months after posting this question, I've come to the following conclusions:

1. Mono Repositories

In some cases the simplest solution might be to create a single repository that holds all the applications. This goes against the Twelve-Factor methodology published by Heroku; however, if all you're doing is managing a frontend and a backend application, it might save you a lot of extra work and DevOps overhead.

Your git repository would then be:

.git/
frontend_app/
  ...
  Dockerfile
backend_app/
  ...
  Dockerfile
deployment/
  nginx/
  docker-compose-dev.yml
  docker-compose-prod.yml
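
In a mono repo, the compose files can build the images directly from the subdirectories. A minimal sketch of what deployment/docker-compose-dev.yml might contain, assuming the layout above (the ports and the nginx service are placeholders):

version: "3"
services:
  frontend:
    build: ../frontend_app    # uses frontend_app/Dockerfile
    ports:
      - "3000:3000"
  backend:
    build: ../backend_app     # uses backend_app/Dockerfile
    ports:
      - "8000:8000"
  nginx:
    image: nginx:alpine
    volumes:
      - ./nginx:/etc/nginx/conf.d:ro    # configs from deployment/nginx/
    ports:
      - "80:80"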

2. Multi Repositories

If you stick with the idea of using multiple repositories, perhaps because you're strict about following best practices and want to keep the code separated, or because you have a multi-service infrastructure that might swell to dozens or hundreds of repositories, I recommend the following:

  1. Each application has its own repository, its own CI/CD pipeline, and its own build process. Whenever a repo is updated, it triggers a build that creates a Docker image. It does not "deploy" anything.

  2. You have a separate deployment repository that only contains the files that orchestrate the system, plus perhaps web server configs or other deployment configuration. This is the repo that gets deployed (see the sketch after this list).

  3. Whenever your deployment detects changes, it pulls down the latest image that was built and updates your application. This can be done either by the CI/CD pipeline from the first step triggering the update, or by an agent sitting on the server listening for updates.
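
To make point 2 concrete: the deployment repository's compose file only references the images that each application's pipeline has pushed to a registry, and never builds anything itself. A sketch, with a made-up registry name:

version: "3"
services:
  frontend:
    image: registry.example.com/frontend:latest   # pushed by the frontend repo's pipeline
    ports:
      - "80:80"
  backend:
    image: registry.example.com/backend:latest    # pushed by the backend repo's pipeline
  worker:
    image: registry.example.com/worker:latest     # pushed by the worker repo's pipeline

With this layout, an update boils down to docker-compose pull followed by docker-compose up -d, regardless of whether the pipeline or a server-side agent ends up running it.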

The third part is the tricky one, and there are multiple ways to solve it. Several tools assist with managing this type of infrastructure, such as AWS ECS, Docker Swarm, or Kubernetes.
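
As one concrete example of the "agent on the server" approach (my choice of tool here, not the only option), something like Watchtower can be added to the services in the compose file above to poll the registry and restart containers when a newer image appears:

  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets it manage the other containers
    command: --interval 300                         # poll for new images every 5 minutes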

answered by Marcus Lind


Try an arbitrary approach, evaluate the pros and cons and reconsider it later. I don't think there is a standard approach.

You can imagine "packaging" or merely "exposing" the docker-compose definition outside of a source control repository (it's a deployment artifact at this point, not source), though of course the original source file should be under revision control.

We like git even for deployment as it allows us to also track configuration changes per-deployment (with deployment-specific notes in the commit log). The context might be a little particular; the applications in question will often need to be configured very differently across sites, so the actual docker-compose.yml files are in fact different per-deployment. As a result, we do use one git repository per deployment. YMMV.
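
For example, one pattern that fits this setup is a shared base file plus a small per-deployment override file kept in that deployment's repository (the file names and the environment variable below are only illustrative):

# docker-compose.yml -- shared base
version: "3"
services:
  backend:
    image: registry.example.com/backend:latest

# docker-compose.site-a.yml -- per-deployment overrides, under revision control in that site's repo
version: "3"
services:
  backend:
    environment:
      - API_URL=https://site-a.example.com

The deployment then runs docker-compose -f docker-compose.yml -f docker-compose.site-a.yml up -d, and configuration changes show up as ordinary commits in the per-deployment repository.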

answered by tne