Include run-time dependencies in Python wheels

I'd like to distribute a whole virtualenv, or a bunch of Python wheels of exact versions with their runtime dependencies, for example:

  • pycurl
    • pycurl.so
      • libcurl.so
        • libz.so
        • libssl.so
        • libcrypto.so
        • libgssapi_krb5.so
          • libkrb5.so
            • libresolv.so

I suppose I could rely on the system to have libssl.so installed, but surely not libcurl.so of the correct version and probably not Kerberos.

What is the easiest way to package one library in a wheel together with all of its run-time dependencies?

Or is that a fool's errand, and should I package the entire virtualenv instead? If so, how do I do that reliably?

P.S. Compiling on the fly is not an option: some modules are patched.

asked Sep 24 '14 by Dima Tisnek




1 Answer

AFAIK, there is no good standard way to portably install dependencies with your package. Continuum has made conda for precisely this purpose. The numpy guys wrote their own distutils submodule in their package to install some complicated dependencies, and now at least some of them advocate conda as a solution. Unfortunately, you may have to make conda packages for some of these dependencies yourself.

If you're fine without portability, then targeting the package manager of the target machines will obviously work. Otherwise, for a portable package manager, conda is the only option I know of.

Alternatively, from your post ("compiling on the fly is not an option") it sounds like portability may not be an issue for you, in which case you could also install all the requirements to a prefix directory (most installers I've come across support a configure --prefix=/some/dir/ option). If you have a guaranteed single architecture, you could probably prefix-install all your dependencies to a single directory and pass that around like a file. The conda approach would probably be cleaner, but I've used prefix installs quite a bit and they tend to be one of the easiest solutions to get going.
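As a concrete sketch of the prefix-install approach, the commands below show one way it could look for the pycurl case in the question. All paths, version numbers, and the patched-source layout are placeholders, not details from the original post:

```
# Sketch only: prefix-install a patched libcurl plus pycurl into one directory.
PREFIX=/opt/bundle

# 1. Build the patched C library into the prefix
(cd curl-7.38.0 && ./configure --prefix="$PREFIX" && make && make install)

# 2. Build the Python extension against that libcurl, into the same prefix
(cd pycurl-7.19.5 && \
 python setup.py --curl-config="$PREFIX/bin/curl-config" install --prefix="$PREFIX")

# 3. At run time, point the dynamic loader and Python at the prefix
export LD_LIBRARY_PATH="$PREFIX/lib:$LD_LIBRARY_PATH"
export PYTHONPATH="$PREFIX/lib/python2.7/site-packages:$PYTHONPATH"
```

The resulting `$PREFIX` directory can then be copied to any machine of the same architecture, which is the "pass that around like a file" idea above.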

Edit: As for conda, it is simultaneously a package manager and a "virtualenv"-like environment/Python install. While virtualenv is added on top of an existing Python install, conda takes over the whole install, so you can be more sure that all the dependencies are accounted for. Compared to pip, it is designed for adding generalized non-Python dependencies, instead of just compiling C/C++ extensions. For more info, I would see:

  • pip vs conda (also recommends buildout as a possibility)
  • conda as a python install

As for how to use conda for your purpose, the docs explain how to create a recipe:

Conda build framework

Building a package requires a recipe. A recipe is a flat directory which contains the following files:

  • meta.yaml (metadata file)
  • build.sh (Unix build script which is executed using bash)
  • bld.bat (Windows build script which is executed using cmd)
  • run_test.py (optional Python test file)
  • patches to the source (optional, see below)
  • other resources, which are not included in the source and cannot be generated by the build scripts.

The same recipe should be used to build a package on all platforms.
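For the pycurl case in the question, a minimal `meta.yaml` might look roughly like this. The version pin, patch filename, and source URL are illustrative, not taken from the post:

```
# meta.yaml -- illustrative recipe skeleton
package:
  name: pycurl
  version: "7.19.5"

source:
  url: https://pypi.python.org/packages/source/p/pycurl/pycurl-7.19.5.tar.gz
  patches:
    - my-local-fix.patch      # hypothetical patch applied before building

requirements:
  build:
    - python
    - curl                    # pulls in libcurl and its own dependencies
  run:
    - python
    - curl

test:
  imports:
    - pycurl
```

The accompanying `build.sh` is typically a one-liner such as `$PYTHON setup.py install`.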

When building a package, the following steps are invoked:

  1. read the metadata
  2. download the source (into a cache)
  3. extract the source in a source directory
  4. apply the patches
  5. create a build environment (build dependencies are installed here)
  6. run the actual build script. The current working directory is the source directory with environment variables set. The build script installs into the build environment
  7. do some necessary post processing steps: shebang, rpath, etc.
  8. add conda metadata to the build environment
  9. package up the new files in the build environment into a conda package
  10. test the new conda package:
    • create a test environment with the package (and its dependencies)
    • run the test scripts

There are example recipes for many conda packages in the conda-recipes repo (https://github.com/continuumio/conda-recipes).

The conda skeleton command can help to make skeleton recipes for common repositories, such as PyPI (https://pypi.python.org/pypi).
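For a package that already exists on PyPI, the skeleton step could look something like this (the package name is just an example):

```
# Generate a starting recipe directory (meta.yaml, build.sh, bld.bat)
# from the package's PyPI metadata
conda skeleton pypi pycurl

# Edit the generated recipe (add patches, pin dependencies), then build
conda build pycurl
```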

Then, as a client, you would install the package much as you would install one with pip.
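For example, if you publish the built packages to a local channel, the client-side install might look like this (the channel path and package name are placeholders):

```
# Install the custom-built package from a local channel directory
conda install --channel file:///path/to/my/channel pycurl
```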

Lastly, Docker may also be interesting to you, though I haven't seen it used much for Python.

answered Oct 01 '22 by metaperture