I have a Python project consisting of a Jupyter notebook, several scripts in a bin directory, and modules in a src directory, with dependencies in a Pipfile:

myproject
├── myproject.ipynb
├── Pipfile
├── Pipfile.lock
├── bin
│   ├── bar.py
│   └── foo.py
└── src
    ├── baz.py
    └── qux.py
The scripts foo.py and bar.py use the standard shebang #!/usr/bin/env python and can be run with pipenv shell:

mymachine:myproject myname$ pipenv shell
(myproject-U308romt) bash-3.2$ bin/foo.py
foo
However, I can't easily access the modules in src from the scripts. If I add

import src.baz as baz

to foo.py, I get:

ModuleNotFoundError: No module named 'src'
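The error happens because when bin/foo.py runs, sys.path[0] is the bin directory, not the project root, so src is nowhere on the search path. One common in-script workaround (a sketch of my own, not part of the question; the helper name is hypothetical) is to prepend the project root manually:

```python
import sys
from pathlib import Path

def add_project_root(script_path: str) -> str:
    """Prepend the script's grandparent directory (the project root)
    to sys.path, so that `import src.baz` can resolve.

    In bin/foo.py this would be called as add_project_root(__file__).
    """
    root = str(Path(script_path).resolve().parent.parent)
    if root not in sys.path:
        sys.path.insert(0, root)
    return root
```

Called at the top of each script before importing from src, this makes the scripts runnable from any working directory, at the cost of repeating the boilerplate in every script.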
One solution I tried is to add a .env file under myproject:

PYTHONPATH=${PYTHONPATH}:${PWD}

This works thanks to pipenv's automatic loading of .env, but checking the .env file into the project's git repository would collide with the traditional use of .env to store secrets such as passwords -- in fact, my default .gitignore for Python projects already excludes .env for just this reason:

$ git add .env
The following paths are ignored by one of your .gitignore files:
.env
Use -f if you really want to add them.
Alternatively, I could move src under bin, but then the Jupyter notebook would have to reference the modules as bin.src.baz etc., which is also a hassle.

My current workaround is just to add a symlink:

myproject
├── Pipfile
├── Pipfile.lock
├── bin
│   ├── bar.py
│   ├── foo.py
│   └── src -> ../src
└── src
    ├── baz.py
    └── qux.py

This works, and I suppose has the benefit of being transparent, but it seems like there should be some way to leverage pipenv to solve the same problem.
Is there a portable, distributable way to put these modules on the search path?
As background, Python searches for modules in this order:

- the directory containing the script being run (or the current directory, for interactive sessions),
- the folders listed in the PYTHONPATH environment variable, if it is set,
- an installation-dependent list of folders configured when Python was installed.
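The PYTHONPATH part of that search order is easy to verify. The sketch below (my own illustration, using a made-up /tmp/extra entry) launches a child interpreter with PYTHONPATH set and confirms the entry appears in the child's sys.path:

```python
import os
import subprocess
import sys

# Run a child interpreter with PYTHONPATH pointing at a made-up folder
# and check that the folder shows up on the child's module search path.
env = dict(os.environ, PYTHONPATH="/tmp/extra")
result = subprocess.run(
    [sys.executable, "-c", "import sys; print('/tmp/extra' in sys.path)"],
    env=env, capture_output=True, text=True,
)
print(result.stdout.strip())  # → True
```

The folder does not need to exist; PYTHONPATH entries are added to sys.path as given.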
I'm not sure there's a perfect solution for this, but in the interest of being explicit rather than implicit (PEP 20), I've decided to check in a file that needs to be sourced before running any script. This is one extra manual step, but you can put it in a Makefile, for instance.

env.sh

export PYTHONPATH=${PYTHONPATH}:${PWD}

Makefile

.PHONY: bar
bar:
	. env.sh && pipenv run python scripts/bar.py
The solution is a bit similar to the approach Go takes with its GOPATH.

I think the other solutions are not as good: pipenv aims to solve dependency management; I could be wrong, but I did not find anything in it related to the PYTHONPATH problem.

(Came here for an answer, ended up giving one instead.)
I have a similar project folder structure, so I had the same problem.

Thanks to your tip, my solution was to add a .env file at the same level as the Pipfile, with the following content:

$ cat .env
PYTHONPATH=${PYTHONPATH}:src

Now, launching my app with something like

$ pipenv run python -m package.subpackage.app

seems to work fine from inside my project's folder and also from its subfolders.
Side note (although this is not a good/clean way to do things): regarding your ModuleNotFoundError: No module named 'src' problem, the "problem" is that the src folder is not a package. To fix that, you could simply add an (empty) __init__.py file inside the src folder, thus making it a "package", which in turn would make import src.baz possible.
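To make the side note concrete, here is a self-contained sketch (using a throwaway temporary directory, not the actual project) that builds a src package with an empty __init__.py and imports from it. Note that the parent of src must also be on sys.path, which is exactly the original poster's missing piece:

```python
import sys
import tempfile
from pathlib import Path

# Build a throwaway layout:  <tmp>/src/{__init__.py, baz.py}
root = Path(tempfile.mkdtemp())
pkg = root / "src"
pkg.mkdir()
(pkg / "__init__.py").write_text("")        # empty marker file: src is now a package
(pkg / "baz.py").write_text("VALUE = 42\n") # hypothetical module contents

sys.path.insert(0, str(root))               # the package's parent must be on the path
import src.baz as baz
print(baz.VALUE)  # → 42
```

Without the sys.path.insert line, the import still fails regardless of __init__.py, because the package's parent directory is not searched.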
(Later edit) Actually, this adds a record <project_folder>/${PYTHONPATH} to sys.path, which is useless, so the correct content of the .env file should be only PYTHONPATH=src.