We have a project with the following high-level directory structure:*
```
./datascience/
├── core
│   └── setup.py
├── notebooks
│   └── Pipfile
└── web
    └── Pipfile
```

*All irrelevant files and directories excluded for brevity.
The `core` package is a library. It's a dependency of both the `notebooks` and `web` applications.

The `core` package, being a library, has its dependencies specified in `setup.py`:
```python
import setuptools

setuptools.setup(
    install_requires=[
        'some-dependency',
        'another-dependency',
    ]
)
The `web` and `notebooks` applications use pipenv for dependency management. Their dependencies are specified in a `Pipfile`. For example, here's how the `web` dependencies are specified in `web/Pipfile`:
```toml
[packages]
datascience-core = {path = "./../core"}
flask = "~= 1.0"
```
Notice how the `core` dependency is a local dependency, hence the relative path.

Doing a `pipenv install` from inside the `web` or `notebooks` directory does not install the dependencies of the `core` library, contrary to what I expected!
I also tried using a `Pipfile` for `core`, hoping that pipenv would pick it up in its graph and download all the nested dependencies. But it doesn't.

How can dependencies of the `core` package be installed automatically when pipenv is installing dependencies for the `web` or `notebooks` app?
Found a solution thanks to this comment in a pipenv issue thread: https://github.com/pypa/pipenv/issues/209#issuecomment-337409290
I've continued listing `core`'s dependencies in `setup.py`.

I've changed the `web` and `notebooks` apps to use an editable installation of the `core` package. This was done by running the following in both the `web` and `notebooks` directories:

```shell
pipenv install --editable ../core
```
It produced this diff:

```diff
 [packages]
-datascience-core = {path = "./../core"}
+datascience-core = {editable = true,path = "./../core"}
```
Now running `pipenv install` from the `web` and `notebooks` directories results in the installation of the `core` package and its dependencies!

It also solved another very annoying problem: having to run `pipenv install` every time there was a change in `core`. Now it picks up development changes without having to re-install the local package!