I have a stack that consists of three git repositories, one application per repository.
They all work together through a docker-compose.yml
file that also includes the other resources and images needed to deploy the app.
When I deploy the application, I only need to push the docker-compose.yml
file (or, in the case of AWS, a Dockerrun.aws.json file).
file. So my question is... Where should I keep this file? It does not belong to any of the repositories above since it orchestrates all of them.
Should I set up a fourth repository, "The Deployment Repository", that keeps all my deployment files and configurations? Or do I pick one of the repositories above as "the repo I deploy" and keep everything there? Or do I store copies of the deployment files in every repository, so that I can deploy the full application no matter which repository I'm working in?
A few months after posting this question I've come up with the following conclusion:
In some cases the simplest solution might be to create a single repository that holds all the applications. This goes against the Twelve-Factor methodology written by Heroku, but if all you're managing is a frontend and a backend application, it might save you a lot of extra work and DevOps overhead.
Your git repository would then be:
    .git/
    frontend_app/
        ...
        Dockerfile
    backend_app/
        ...
        Dockerfile
    deployment/
        nginx/
        docker-compose-dev.yml
        docker-compose-prod.yml
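For illustration, here is a minimal sketch of what `deployment/docker-compose-dev.yml` could look like in this layout. The service names, ports, and nginx wiring are assumptions for the example, not part of the original setup:

```yaml
# deployment/docker-compose-dev.yml -- illustrative sketch only
services:
  frontend:
    build: ../frontend_app        # builds from frontend_app/Dockerfile
  backend:
    build: ../backend_app         # builds from backend_app/Dockerfile
  nginx:
    image: nginx:alpine
    volumes:
      - ./nginx:/etc/nginx/conf.d:ro   # the nginx/ config directory above
    ports:
      - "80:80"
    depends_on:
      - frontend
      - backend
```

Running `docker compose -f deployment/docker-compose-dev.yml up` from the repo root then builds and starts the whole stack from a single checkout.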
If you stick with multiple repositories, perhaps because you're strict about following best practices and want to keep the codebases separate, or because you have a multi-service infrastructure that might swell to dozens or hundreds of repositories, I recommend the following:
Each application has its own repository, its own CI/CD pipeline, and its own build process. Whenever a repo is updated, it triggers a build and creates a Docker image. It does not "deploy" anything.
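As a sketch of what that per-repo pipeline might look like (GitHub Actions syntax used as one example; the registry URL and image name are placeholder assumptions):

```yaml
# .github/workflows/build.yml -- builds and pushes an image; deploys nothing
name: build-image
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and push the Docker image
        env:
          REGISTRY_USER: ${{ secrets.REGISTRY_USER }}
          REGISTRY_PASS: ${{ secrets.REGISTRY_PASS }}
        run: |
          echo "$REGISTRY_PASS" | docker login registry.example.com -u "$REGISTRY_USER" --password-stdin
          docker build -t registry.example.com/myorg/frontend_app:$GITHUB_SHA .
          docker push registry.example.com/myorg/frontend_app:$GITHUB_SHA
```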
You have a separate repository for the deployment that contains only the files that orchestrate the system, plus any web server configs or other deployment configuration. This is the repo that gets deployed.
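In that deployment repo, the compose file references only prebuilt images, never `build:` contexts. A hedged sketch, with registry and tag names as assumptions:

```yaml
# docker-compose-prod.yml in the deployment repo -- sketch only
services:
  frontend:
    image: registry.example.com/myorg/frontend_app:1.4.2   # pushed by the frontend pipeline
  backend:
    image: registry.example.com/myorg/backend_app:2.0.1    # pushed by the backend pipeline
  nginx:
    image: nginx:alpine
    volumes:
      - ./nginx:/etc/nginx/conf.d:ro
    ports:
      - "80:80"
```

Bumping a service version then becomes a one-line commit in this repo, which doubles as your deployment history.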
Whenever your deployment detects changes, it pulls down the latest image that was built and updates your application. This can be done either by the CI/CD pipeline in the first step triggering the update, or by an agent sitting on the server listening for updates.
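One concrete example of the "agent on the server" option is Watchtower, which polls the registry and restarts containers whose image has changed. A minimal sketch, with an arbitrary poll interval as an assumption:

```yaml
# Added alongside the other services in the deployment compose file
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets it manage the other containers
    command: --interval 300                          # check for new images every 5 minutes
```

Note that this pairs best with mutable tags such as `:latest`; with pinned version tags like the previous example, you'd trigger the update from the pipeline instead.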
That third step is the tricky part, and there are multiple ways to solve it. Several tools assist with managing this kind of infrastructure, such as AWS ECS, Docker Swarm, or Kubernetes.
Pick an approach, evaluate the pros and cons, and reconsider it later; I don't think there is a standard one.
You can imagine "packaging" or merely "exposing" the docker-compose definition outside of a source control repository (at that point it's a deployment artifact, not source), though of course the original source file should be under revision control.
We like git even for deployment, as it lets us also track configuration changes per deployment (with deployment-specific notes in the commit log). Our context may be a little particular: the applications in question often need to be configured very differently across sites, so the actual docker-compose.yml files genuinely differ per deployment. As a result, we use one git repository per deployment. YMMV.