I am trying to build good habits when developing my Python applications and to organize them as if they will always be used by others in the future. One crucial part of that is packaging.
I have read a number of posts and discussions, as well as PEP 517 and PEP 518. However, I still don't fully understand how to properly organize my files for packaging.
I have decided to follow the setup.cfg path instead of the pyproject.toml path. That much is clear: I shall declare the packages needed at runtime in setup.cfg.
I am also using virtualenv, and I understood that development-related packages (black, pytest, ...) should be defined in a file separate from setup.cfg, usually one called requirements.txt. One thing that is not clear: should the dependencies in setup.cfg be a subset of those in requirements.txt, and therefore repeat some information? That seems like bad practice; the two files will at some point definitely become disconnected and hard to maintain.
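For concreteness, here is the kind of duplication I mean, with hypothetical package names (requests as a runtime dependency; black and pytest as dev-only):

```ini
# setup.cfg — runtime dependencies
[options]
install_requires =
    requests

# requirements.txt — dev environment, which would then repeat requests:
#     requests
#     black
#     pytest
```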
I have tried to find an answer to that question. I found this article, but I don't understand how adding this
--index-url https://pypi.python.org/simple/
-e .
to the requirements.txt helps avoid the issue, or how it even handles the development dependencies.
I am even more lost when the author presents the following possibility:
--index-url https://pypi.python.org/simple/
-e https://github.com/foo/bar.git#egg=bar
-e .
What does this achieve?
Finally, is the setup.cfg approach compatible with building wheel packages?
I usually follow the two approaches below to separate the list of dev dependencies from the runtime dependencies in my packages.
In my setup.py configuration file I use the extras_require field to define development-related dependencies, like this:
setup.py
from setuptools import find_packages, setup

INSTALL_REQUIRES = ["python-dotenv"]
EXTRAS_REQUIRE = {
    "dev": ["flake8", "black", "mypy"],
}

setup(
    name="mypackage",
    version="0.1.0",
    description="My awesome package",
    packages=find_packages(),
    install_requires=INSTALL_REQUIRES,
    extras_require=EXTRAS_REQUIRE,
    python_requires=">=3.8",
)
As you can see above, I have an EXTRAS_REQUIRE dictionary with a dev key where I list my dev dependencies.
Then I can use pip to install the package. If I want to leave out the dev dependencies, I just do pip install .; this way only the required python-dotenv package (and obviously mypackage itself) will be installed in the given environment.
However, if you want to include the development-related dependencies, you can request the extra like this: pip install ".[dev]" (the quotes keep shells like zsh from interpreting the brackets); then all your dev deps will get installed in the environment as well.
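Since the question is specifically about the setup.cfg route: the same split can be expressed declaratively in setup.cfg instead of setup.py, using the [options] and [options.extras_require] sections. A sketch mirroring the setup.py above:

```ini
[metadata]
name = mypackage
version = 0.1.0
description = My awesome package

[options]
python_requires = >=3.8
packages = find:
install_requires =
    python-dotenv

[options.extras_require]
dev =
    flake8
    black
    mypy
```

pip install . and pip install ".[dev]" behave exactly the same with this file, and standard tools (pip wheel ., python -m build) will build wheels from it, so yes, the setup.cfg way is compatible with building wheel packages.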
In case you insist on having your dev requirements in a separate requirements file, you can also do that. Let's call it requirements-dev.txt and populate it like this:
requirements-dev.txt
. # note the dot here: it tells pip to install the package itself, which pulls in the install_requires deps from your setup.py
flake8
black
mypy
and modify the previous setup.py by deleting the extras dictionary:
setup.py
from setuptools import find_packages, setup

INSTALL_REQUIRES = ["python-dotenv"]

setup(
    name="mypackage",
    version="0.1.0",
    description="My awesome package",
    packages=find_packages(),
    install_requires=INSTALL_REQUIRES,
    python_requires=">=3.8",
)
Then you can install your package together with the dev dependencies via pip install -r requirements-dev.txt, and if you don't want the dev dependencies you just do the regular pip install ., and that's it.
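A third variant, not shown above but worth noting because it directly avoids the duplication the question worries about, is to combine the two approaches: keep the dev list only in extras_require (or [options.extras_require]) and make requirements-dev.txt a one-liner that references the extra:

```ini
# requirements-dev.txt — editable install of the package plus its "dev" extra
-e .[dev]
```

This way the dev dependencies are declared in exactly one place, and pip install -r requirements-dev.txt still sets up the full development environment.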