Is it possible to have distutils just run the Python module dependency analysis (and possibly install missing modules) without actually installing the Python module in question? I imagine a command as follows:
./setup.py check-dependencies
that would report if any dependent modules are missing on the target system.
The distutils package provides support for building and installing additional modules into a Python installation. The new modules may be either 100%-pure Python, or may be extension modules written in C, or may be collections of Python packages which include modules coded in both Python and C.
Pip itself offers a way to check dependencies after installation: because pip does not currently verify dependency compatibility at install time, the pip check command can be used to confirm that the dependencies in your project have been installed properly. For example:

$ pip check
No broken requirements found.
Setuptools is a collection of enhancements to the Python distutils that allow developers to more easily build and distribute Python packages, especially ones that have dependencies on other packages. Packages built and distributed using setuptools look to the user like ordinary Python packages based on the distutils.
Dependencies in Python packaging are a confusing subject. For a long time, the only standard was PEP 314, which defines the requires, provides and obsoletes parameters of the distutils.core.setup function. The elements used for these arguments are Python module names, for example provides=['xml', 'xml.utils']. The PEP was not very clear about standard library dependencies (do I have to depend on Python >= 2.5, or do I have to require 'xml'?), and as it turned out, no tool ever made use of these fields (not even distutils itself).
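For reference, here is a minimal sketch of what a setup script using these PEP 314 fields looks like (the project and module names are made up for illustration):

from distutils.core import setup

# PEP 314-style, module-level dependency fields. As noted above,
# no tool (not even distutils itself) actually consumes these.
setup(
    name='example',
    version='1.0',
    requires=['xml.utils'],               # Python modules this code imports
    provides=['example', 'example.core'], # Python modules this project supplies
)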
Then came setuptools. It introduced other kinds of dependencies which used project names instead of modules, so for example you can have setup(..., install_requires=['PyXML', 'Pylons'], tests_require=['nose']), which is immensely more useful: people release software on PyPI using unique project names, you can use these same names in your setup script to depend on them, and with easy_install or pip you get these dependencies installed, modules, scripts and all.
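As an illustration, a complete minimal setuptools-based setup script with such project-level dependencies might look like this (the metadata is invented):

from setuptools import setup

# setuptools-style, project-level dependencies: the names refer to PyPI
# projects, and easy_install/pip will fetch and install them for you.
setup(
    name='example',
    version='1.0',
    install_requires=['PyXML', 'Pylons'],  # needed at runtime
    tests_require=['nose'],                # needed only to run the test suite
)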
When the reins of distutils were taken up again a few years ago, the community standardized some of setuptools’ ideas about dependencies to produce PEP 345, which is now implemented in distutils2, the project intended to replace distutils and setuptools.
To sum it all up:
- you may have distutils-style module-level dependencies in your setup script, which are useless
- you may have setuptools-style project-level deps, which are used by setuptools-based tools
- you can have PEP 345-compliant project-level deps in a setup.cfg file, which are used by distutils2 (sketched below)
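For completeness, a PEP 345-style setup.cfg for distutils2 looked roughly like this; take it as a sketch, with invented project names and the field spelling as I remember it from the distutils2 docs:

[metadata]
name = example
version = 1.0
# project-level dependencies, i.e. PEP 345 Requires-Dist metadata
requires-dist =
    PyXML
    Pylons (>=0.9.7)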
So, to answer your question, we need to know which kind you have. For all practical purposes, distutils-style module deps should not be used, which leaves setuptools project deps or the new PEP 345-style ones, which are still new and not widespread yet. distutils2 has a compatibility layer for setuptools, so it may be possible to use it to get the info you want out of a setuptools-based setup.py script.
Unrelated to packaging tools, there is also a tool that can scan your code to find the modules you’re using: the modulefinder module in the standard library, which is not well known or widely used, judging by the sad state of its code. This tool won’t tell you whether a module comes from the stdlib or from a third-party project, and it can’t tell you the project name to use in your setup.py or setup.cfg file.
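That said, it can still give you a raw list of imports. A quick sketch of how you might use it (the script path is made up):

from modulefinder import ModuleFinder

# Scan a script and report every module it ends up importing.
finder = ModuleFinder()
finder.run_script('myscript.py')  # the code you want to analyze

for name in sorted(finder.modules):
    print(name)

# Imports that could not be resolved -- possibly missing dependencies:
print('Not found:', sorted(finder.badmodules))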
HTH
I think the closest you can get is:

setup.py install -v -n

which runs a dry run (-n) in verbose (-v) mode.
You could also look at the distutils.dep_util module, though it wouldn’t work as an option of setup.py.
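Be aware that dep_util is about file-level dependencies (make-style timestamp checks), not Python module dependencies, so it only partially overlaps with what you’re asking for. A minimal sketch, with illustrative file names:

from distutils.dep_util import newer

# newer(source, target) is true if source is more recent than target,
# or if target doesn't exist yet -- i.e. target needs regenerating.
if newer('spam.c', 'spam.o'):
    print('spam.o is out of date')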
HTH!