Package managers for JavaScript like `npm` and `yarn` use a `package.json` to specify 'top-level' dependencies, and create a lock-file to keep track of the specific versions of all packages (i.e. top-level and sub-level dependencies) that are installed as a result. In addition, the `package.json` allows us to make a distinction between types of top-level dependencies, such as production and development.

For Python, on the other hand, we have `pip`. I suppose the `pip` equivalent of a lock-file would be the result of `pip freeze > requirements.txt`.
However, if you maintain only this single `requirements.txt` file, it is difficult to distinguish between top-level and sub-level dependencies (you would need e.g. `pipdeptree -r` to figure those out). This can be a real pain if you want to remove or change top-level dependencies, as it is easy to be left with orphaned packages (as far as I know, `pip` does not remove sub-dependencies when you `pip uninstall` a package).
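To illustrate the orphaned-package problem, here is a minimal sketch. The dependency graph and package names below are made-up example data (not read from `pip` or `pipdeptree`); the point is only that uninstalling a top-level package leaves its sub-dependencies behind:

```python
# Sketch: which installed packages become orphaned after uninstalling
# a top-level dependency? The graph below is hypothetical example data.

def reachable(top_level, deps):
    """Return every package reachable from the top-level set."""
    seen = set()
    stack = list(top_level)
    while stack:
        pkg = stack.pop()
        if pkg not in seen:
            seen.add(pkg)
            stack.extend(deps.get(pkg, []))
    return seen

# package -> its sub-dependencies (illustrative)
deps = {
    "requests": ["urllib3", "idna", "certifi"],
    "flask": ["werkzeug", "jinja2"],
    "jinja2": ["markupsafe"],
}
installed = {"requests", "urllib3", "idna", "certifi",
             "flask", "werkzeug", "jinja2", "markupsafe"}

# `pip uninstall flask` removes only flask itself:
still_installed = installed - {"flask"}
orphans = still_installed - reachable({"requests"}, deps)
print(sorted(orphans))  # -> ['jinja2', 'markupsafe', 'werkzeug']
```

Nothing prunes those three packages automatically, which is exactly the bookkeeping a lock-file-aware tool does for you.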
Now, I wonder: is there some convention for dealing with different types of these `requirements` files and distinguishing between top-level and sub-level dependencies with `pip`?
For example, I can imagine having a `requirements-prod.txt` which contains only the top-level requirements for the production environment, as the (simplified) equivalent of `package.json`, and a `requirements-prod.lock`, which contains the output of `pip freeze` and acts as my lock-file. In addition, I could have a `requirements-dev.txt` for development dependencies, and so on and so forth.
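A small sketch of one check this split makes possible: verifying that every top-level requirement is pinned in the freeze output. The file contents are inlined as example strings, and the file names follow the convention proposed above (they are not a `pip` standard):

```python
# Sketch: confirm every package in requirements-prod.txt (top-level,
# loose constraints) appears pinned in requirements-prod.lock
# (the `pip freeze` output). Contents are illustrative.

top_level = """\
flask
requests>=2.0
"""

lock = """\
certifi==2023.7.22
flask==2.3.3
jinja2==3.1.2
requests==2.31.0
urllib3==2.0.4
"""

def names(text):
    """Crude parse: the package name is everything before the first comparator."""
    result = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            for sep in ("==", ">=", "<=", "~=", ">", "<"):
                line = line.split(sep)[0]
            result.append(line.lower())
    return result

missing = set(names(top_level)) - set(names(lock))
print(missing)  # -> set(): every top-level dep is pinned in the lock
```

With only a single `requirements.txt` there is nothing to compare against, which is the core of the problem described above.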
I would like to know if this is the way to go, or if there is a better approach.
P.S. The same question could be asked for `conda`'s `environment.yml`.
Note that the equivalent of npm's `package.json` is the `Pipfile`!
`Pipfile.lock` takes advantage of some great new security improvements in `pip`. By default, the `Pipfile.lock` will be generated with the sha256 hashes of each downloaded package.
`package-lock.json` is automatically generated for any operations where `npm` modifies either the `node_modules` tree or `package.json`. It describes the exact tree that was generated, such that subsequent installs are able to generate identical trees, regardless of intermediate dependency updates.
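For comparison, a heavily abridged sketch of what such a lock entry looks like (package name, version, and hash are illustrative, and many fields are omitted):

```json
{
  "name": "my-app",
  "lockfileVersion": 3,
  "packages": {
    "node_modules/left-pad": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
      "integrity": "sha512-..."
    }
  }
}
```

The `integrity` hash plays the same role as the sha256 hashes in `Pipfile.lock`: subsequent installs can verify they are getting byte-identical packages.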
There are at least three good options available today:
Poetry uses `pyproject.toml` and `poetry.lock` files, much in the same way that `package.json` and lock files work in the JavaScript world. This is now my preferred solution.
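A minimal `pyproject.toml` might look like this (project name and version constraints are illustrative; the `group.dev` table requires a reasonably recent Poetry):

```toml
[tool.poetry]
name = "my-app"                 # hypothetical project name
version = "0.1.0"
description = ""
authors = ["Me <me@example.com>"]

[tool.poetry.dependencies]
python = "^3.10"
requests = "^2.31"              # loose, top-level constraint

[tool.poetry.group.dev.dependencies]
pytest = "^7.0"                 # development-only dependency

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry lock` then resolves the full tree into `poetry.lock` (with exact versions and hashes), and `poetry install` reproduces it, mirroring the `package.json` / lock-file split from the question.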
Pipenv uses `Pipfile` and `Pipfile.lock`, also much like the JavaScript files you describe.
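A sketch of a `Pipfile` (TOML syntax; versions are illustrative), showing the production/development split directly:

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = ">=2.31"     # production dependency

[dev-packages]
pytest = "*"            # development-only dependency

[requires]
python_version = "3.10"
```

`pipenv lock` resolves this into `Pipfile.lock` with pinned versions and sha256 hashes, as quoted above.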
Both Poetry and Pipenv do more than just dependency management. Out of the box, they also create and maintain virtual environments for your projects.
pip-tools provides the `pip-compile` and `pip-sync` commands. Here, `requirements.in` lists your direct dependencies, often with loose version constraints, and `pip-compile` generates locked-down `requirements.txt` files from your `.in` files.
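As a sketch of that workflow (package versions illustrative, output abridged):

```
# requirements.in -- hand-maintained, direct dependencies only
requests>=2.28

# requirements.txt -- generated by `pip-compile requirements.in`
certifi==2023.7.22
    # via requests
charset-normalizer==3.2.0
    # via requests
idna==3.4
    # via requests
requests==2.31.0
    # via requests.in (-r requirements.in)
urllib3==2.0.4
    # via requests
```

The `# via` annotations make the top-level vs. sub-level distinction from the question explicit in the generated file itself.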
This used to be my preferred solution. It's backwards-compatible (the generated `requirements.txt` can be processed by `pip`), and the `pip-sync` tool ensures that the virtualenv exactly matches the locked versions, removing things that aren't in your "lock" file.