
Setting up a development environment with docker

I'm facing some issues setting up Docker as a development environment for my team. So far:

  1. I used a base image to start a container

    docker run -t -i ubuntu:latest "/bin/bash"
    
  2. I installed all the compile and build tools in it

  3. I committed that container as an image and pushed it to our local Docker registry

    docker commit e239ab... 192.168.10.100:5000/team-dev:beta
    

So far so good. Now, acting as a team member:

  1. I pull the dev environment image on my computer

    docker pull 192.168.10.100:5000/team-dev:beta
    
  2. I start a container:

    docker run -t -i 5cca4... "/bin/bash"
    

At this point, I'm thinking of my container as a sort of remote machine that I can SSH into and work in.

I try to do a git clone from inside the container, but it fails because of a public key problem. I copy the id_rsa* files into the container manually, and the clone works. Then I try to edit some source files, but my vim configuration, bash configuration, and everything else are thrown off because this is a fresh OS environment. What does work really well is my entire dependency-versioned build environment.


These are the possible solutions I'm thinking of to help me work around this.

  1. After pulling the base image, use a Dockerfile to add all the environment customizations from the host to the image.

    Cons: every time my host environment (bash/vim/git) changes, I need to update the Dockerfile.

  2. Use a volume mounted from the host into the container. Git clone and edit files on the host; run the build scripts and compilations from inside the container (something like the sketch just below).

    Cons: content from a data volume cannot be used to update the image if that were ever needed. I don't know if this is something I should care about anyway.
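
Concretely, option 2 could look something like this (the source path and dotfile list are just examples; the image is the one pushed to our registry above):

    # pull the shared dev image from the local registry
    docker pull 192.168.10.100:5000/team-dev:beta

    # run it with the source and a few personal files mounted from the host:
    # the project is cloned and edited on the host, built inside the container,
    # and ~/.ssh plus dotfiles are mounted read-only so git and vim behave normally
    docker run -t -i \
        -v ~/code/project:/src \
        -v ~/.ssh:/root/.ssh:ro \
        -v ~/.vimrc:/root/.vimrc:ro \
        -v ~/.gitconfig:/root/.gitconfig:ro \
        192.168.10.100:5000/team-dev:beta "/bin/bash"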

Or am I approaching this the wrong way?

Asked by iamnat, Aug 08 '14


1 Answer

As a relatively new Docker user, I'll try to explain how I use it. I mostly use it for two things: isolating services and containerizing complex environments.

1. Isolate services

Abstract

Think of it like the Separation of Concerns principle.

Why? For reusability and scalability (and for easier debugging and maintenance, by the way).
For example, for the dev environment of a PHP Laravel website, I would run a few containers:

  • Mysql
  • Apache-PHP
  • Redis
  • ..

Each of these containers (services) would be linked to each other so they can work together. For example:

Apache <== Mysql (port 3306).

The Apache container can now open a TCP connection to the Mysql container through the exposed port 3306.

Similar projects can rely on the same Docker image while running in separate containers. Every tool an app requires should be containerized so the whole team works against the same environment.

Source code management

I never put source code directly in a container. I prefer to mount it into the container as a volume (the docker run -v option).

When I want to run a command that alters my source code, like a build, the test suite, or an npm update, I do it either from the host or from inside the container, depending on how much configuration the tool needs.
The more complicated or app-specific it is, the more I lean toward doing it in the container.
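
For a quick, repeatable way to run such commands, I usually spin up a throwaway container with the source mounted. A minimal sketch (the image is the same one used in the fig.yml below; composer and phpunit are just examples and assume the image ships them):

$ cd myapp
$ docker run --rm -v $(pwd):/app -w /app lighta971/laravel-apache composer install    # resolve PHP dependencies
$ docker run --rm -v $(pwd):/app -w /app lighta971/laravel-apache vendor/bin/phpunit  # run the test suite

The --rm flag removes the container as soon as the command exits, so nothing piles up.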

Running containers

Following the example above, I'll run a myapp container and a mysql container.

$ cd myapp

# mysql container:
#   -d             run as daemon (detached)
#   -p 82:80       bind container's port 80 to host's 82
#   -p 22002:22    bind container's port 22 to host's 22002
#   --expose 3306  expose 3306 to other containers linked with it
$ docker run -d \
  -p 82:80 \
  -p 22002:22 \
  --expose 3306 \
  --name mysql \
  author/mysqlimage

# myapp container:
#   -p 81:80            bind container's port 80 to host's 81
#   -p 22001:22         bind container's port 22 to host's 22001
#   -v $(pwd):/var/www  mount the current host directory to container's /var/www
#   --link mysql:db     give access to the mysql container (its ports) under the alias db
$ docker run -d \
  -p 81:80 \
  -p 22001:22 \
  -v $(pwd):/var/www \
  --link mysql:db \
  --name myapp \
  author/apacheimage

The myapp container can now talk to the mysql container. To test it, $ telnet db 3306 from inside myapp should work.
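
If you wonder how the alias works: as far as I know, the legacy --link adds a hosts entry for the alias and injects a set of environment variables (prefixed with the uppercased alias) into the linking container. A quick way to check from inside myapp:

$ grep db /etc/hosts     # the "db" alias resolves to the mysql container's IP
$ env | grep DB_PORT     # e.g. DB_PORT_3306_TCP_ADDR, DB_PORT_3306_TCP_PORT
$ telnet db 3306         # should reach the MySQL server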

Running containers using Fig

As you can see, those docker command lines quickly become a nightmare to type, so I found another great tool, Fig, which lets me replace them with this clear YAML file (fig.yml) at my project's root:

web:
    image: lighta971/laravel-apache
    links:
        - db
    ports: 
        - "81:80"
        - "22001:22"
    volumes:
        - .:/app
db:
    image: lighta971/phpmyadmin-apache
    ports:
        - "82:80"
        - "22002:22"
    expose:
        - "3306"

And then $ cd myapp && fig up gives the same result as the commands above :)
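
Fig wraps the day-to-day commands as well, so the usual workflow looks roughly like this (service names are the ones from the fig.yml above):

$ fig up -d          # start web and db in the background
$ fig ps             # list the running services
$ fig logs web       # follow the web container's output
$ fig run web bash   # open a shell in a one-off web container
$ fig stop           # stop everything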

2. Containerize complex environments

I'm also using Docker for Android development. A basic Android/Cordova setup is big (gigabytes of downloads) and takes time to set up.
This is why I put all the components into a single "swiss army knife" container:

  • Android SDK
  • Android NDK
  • Apache Ant
  • Android tools
  • Java
  • ...

It results in an image with everything I need to set up my Cordova environment:

# --privileged                  special permission for USB access
# -v /dev/bus/usb:/dev/bus/usb  mount the USB bus into the container
# -v $(pwd):/app                mount the current folder at /app
# -p 8001:8000                  bind container's port 8000 to host's 8001
$ docker run -i \
  --privileged \
  -v /dev/bus/usb:/dev/bus/usb \
  -v $(pwd):/app \
  -p 8001:8000 \
  lighta971/cordova-env

Which I aliased as cdv:

$ alias cdv='docker run -i --privileged -v /dev/bus/usb:/dev/bus/usb -v $(pwd):/app -p 8001:8000 lighta971/cordova-env'

Now I can transparently use all the programs inside the container as if they were installed on my system. For instance:

$ cdv adb devices            # List USB devices
$ cdv cordova build android  # Build the app
$ cdv cordova run android    # Run the app on a device
$ cdv npm update             # Update the app's dependencies

All these commands work against the current directory thanks to the volume mount option $(pwd):/app.

Dockerfile

All that said, there are other things to know, like:

  • understanding the build process to make efficient images
  • dealing with persistent data
  • keeping image packages up to date (see the example below)
  • etc.
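
For instance, coming back to your team image: once the setup lives in a Dockerfile, you can rebuild and redistribute it through your local registry instead of committing containers by hand (the registry address is the one from your question):

$ docker build -t 192.168.10.100:5000/team-dev:beta .   # rebuild the image from the Dockerfile
$ docker push 192.168.10.100:5000/team-dev:beta         # publish it to the local registry
$ docker pull 192.168.10.100:5000/team-dev:beta         # teammates pick up the update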

Hope that was clear to you :)

Answered by Aurel