I have a few scripts that each had their own copy of some functions, so I extracted these functions into a module and had the scripts import it. These scripts are to be copied over to a bunch of Linux servers and executed there. When the scripts worked standalone, I would simply copy the files over to the servers and run "python <script>".
I have a central management server that will copy and run the scripts on the different servers.
I've done some reading on Python eggs, but could use some advice on which approach to go for. The way I do it today is simply to copy the scripts and run them. Since this works fine, I was thinking there might be a way to bundle the scripts together with the (in-house) module they depend on, copy the bundle over to the servers, and execute it there. I don't see why I would need to install anything using "pip".
Now, what kind of setup would you recommend? Should I build eggs on my local computer and have the management server copy the egg file over to the servers? I would prefer to push everything the servers need from the management server rather than have the servers pull down dependencies themselves, so I won't have to punch more holes in all the firewalls. Since eggs typically need to pull down dependencies, maybe eggs are not the way to go?
Most of my servers are running Python 2.6, but I also have some running Python 2.4 and 3.2.
Greetings, Kenneth
You could use the zipapp module from the standard library (available since Python 3.5) to create executable Python zip archives. One way to create a bundle is to add a top-level file named __main__.py, which is the script Python runs when the zip archive is executed.
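For example, assuming your scripts and the shared module live in a directory with a top-level __main__.py (the directory name myapp here is made up), something like this should produce a single archive you can copy to a server and run:
$ python3 -m zipapp myapp -p "/usr/bin/env python3" -o myapp.pyz
$ python3 myapp.pyz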
You might want to consider looking at Twitter's PEX library, which can create executable files from Python packages: https://pex.readthedocs.org/en/latest/whatispex.html
.pex files are just carefully constructed zip files with a #!/usr/bin/env python shebang and a special __main__.py.
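As a rough sketch (the package, entry point, and output names below are made up, the project directory is assumed to have a setup.py, and the exact flags may differ between pex versions):
$ pip install pex
$ pex . -e mypackage.myscript:main -o myscript.pex
$ ./myscript.pex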
Update 2016: wagon helps build wheel packages together with their dependencies for offline installation.
For simple projects, keeping all the source together in one folder and copying it over as a whole is good enough. You can use git to push your code to a central repository and pull it onto your servers without building any packages. Fabric and Ansible are two tools that can help you automate the deployment process (for example, remotely run git pull and delete all your .pyc files).
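A minimal sketch of that manual step, assuming the code is already cloned at /opt/myproject on the target host (the path and host name are hypothetical):
$ ssh user@server1 'cd /opt/myproject && git pull && find . -name "*.pyc" -delete'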
If you have shared dependencies between projects, pip and wheels are the modern alternatives to eggs:
You can create a simple bundle that contains all of the dependencies you wish to install using:
$ tempdir=$(mktemp -d /tmp/wheelhouse-XXXXX)
$ pip wheel -r requirements.txt --wheel-dir=$tempdir
$ cwd=`pwd`
$ (cd "$tempdir"; tar -cjvf "$cwd/bundled.tar.bz2" *)
Once you have a bundle, you can then install it using:
$ tempdir=$(mktemp -d /tmp/wheelhouse-XXXXX)
$ (cd $tempdir; tar -xvf /path/to/bundled.tar.bz2)
$ pip install --force-reinstall --ignore-installed --upgrade --no-index --use-wheel --no-deps $tempdir/*
(From pip docs)