We have several Python 2.6 applications running on Linux. Some of them are Pylons web applications; others are simply long-running processes that we run from the command line using nohup. We're also using virtualenv, both in development and in production. What is the best way to deploy these applications to a production server?
In development we simply get the source tree into any directory, set up a virtualenv and run. Easy enough. We could do the same in production, and perhaps that really is the most practical solution, but it just feels a bit wrong to run svn update in production. We've also tried fab, but it never works the first time; for every application, something else goes wrong. It strikes me that the whole process is just too hard, given that what we're trying to achieve is fundamentally very simple. Here's what we want from a deployment process.
That's really it! How hard can it be?
Development and deployment of Python code is made much easier by setuptools in combination with virtualenv and pip.
The trickiest part, I've found, is running a development environment that mirrors the deployed setup as closely as possible, while at the same time respecting Pythonic tools and idioms. But it turns out that this is very easy to achieve with pip and setuptools, which together allow you to "install" a development tree into a Python environment without moving the files around. (Actually setuptools does this all by itself, but pip acts as a front end that handles dependencies better.)
Another key issue is preparing a clean environment with a known package set across both environments. Python's virtualenv is a god-send in this respect, allowing you to configure a completely customised Python environment with your own choice of packages, without requiring root access, or OS packages (rpm or dpkg), and without being constrained by whatever packages and versions thereof that happen to be installed on your distro.
Finally, one annoying bugbear is the difficulty of creating command-line scripts that play nice with PYTHONPATH. This is also dealt with quite elegantly by setuptools.
(To keep things simple, this is fairly prescriptive. Feel free to diverge as appropriate.)
From the working directory, set up a new Python virtual environment:
$ python <untarred_directory>/virtualenv.py venv
You'll want to do most of your work from inside this virtual environment. Use this command to do so (. is a shortcut for source):
$ . venv/bin/activate
Install pip:
$ easy_install pip
Create a directory for each installable package you want to build.
Once your tree structure is ready, you are almost ready to begin coding. But right now, packages that depend on each other can't see each other as they will in the deployed environment. This problem is resolved with a neat little trick that setuptools offers, and which pip makes use of. For each package you are developing, run the following command (make sure you are in the virtual environment for your project, as per step 3, above):
$ pip install -e pkg1
This command will install pkg1 into your virtual environment, and it does so without copying any of your files. It simply adds a link in the site-packages directory pointing to the package's development root, and creates an egg-info directory in that root. You can also do this without pip, as follows:
$ cd pkg1
$ python setup.py develop
And it will usually work, but if you have third-party dependencies (which should be listed in setup.py, as explained in the setuptools documentation), pip is smarter about finding them.
One caveat to note is that neither setuptools nor pip has any smarts about finding dependencies amongst your own packages. If PkgB in directory B depends on PkgA in directory A, then pip install -e B will fail, because pip has no way of knowing that PkgA can be found in directory A; it will instead try, and fail, to download PkgA from its online repository sources. The workaround is simply to install each package after its dependencies.
At this point, you can start python, load up one of your modules and start toying with it. You can edit code, and it will be immediately available the next time you import it.
Finally, if you want to create command-line tools with your packages, don't write them by hand. You'll end up with a horrible mess of PYTHONPATH hacks that never quite works properly. Just read up on automatic script creation in the setuptools documentation; it will spare you a lot of grief.
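To make that concrete, here is a hedged sketch of the entry_points argument you would pass to setup() in setup.py; the script name mytool and the module path pkg1.cli are made up for illustration:

```python
# Hypothetical console_scripts entry point: from this, setuptools generates a
# "mytool" wrapper script on PATH that imports pkg1.cli and calls main(), with
# sys.path already configured correctly -- no manual PYTHONPATH hacks needed.
entry_points = {
    "console_scripts": [
        "mytool = pkg1.cli:main",
    ],
}

# The target referenced above must be a zero-argument callable; it reads
# sys.argv itself if it needs command-line arguments.
def main():
    print("hello from mytool")
```

After an editable install, the generated script tracks your development tree, so edits to main() take effect without reinstalling.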
When your packages are ready for action, you can use setup.py to create deployment packages. There are way too many options to go into here, but the following should get you started:
$ cd pkg1
$ python setup.py --help
$ python setup.py --help-commands
Due to the broad nature of the question, this answer is necessarily incomplete. I haven't dealt with long-running servers, web frameworks or the actual deployment process itself (in particular, use of pip install's --index-url to manage a private repository of third-party and internal packages, and -e vcs+..., which will pull packages out of svn, git, hg or bzr). But I hope I've given you enough rope to tie it all together (just don't hang yourself with it :-).