For my package, foo, I'm using the following setup.py:
from setuptools import setup

setup(name='foo',
      version='0.0.1',
      description='Lol',
      url='https://github.com/foo/foo',
      author='legend',
      author_email='[email protected]',
      license='GPLv3',
      packages=['foo'],
      install_requires=["bar"],
      entry_points={'console_scripts': ['foo = foo:main']},
      keywords=['foo'],
      zip_safe=False)
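For completeness, the console_scripts entry point expects a main() callable in the package; foo/__init__.py is roughly this (trimmed down):
# foo/__init__.py -- the real main() does more than this
def main():
    print("Hello from foo")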
When testing on my Arch system, the script was added to my PATH automatically, so I could just run foo on the command line and it would call main(). Then I booted up a VM and tested it on Windows 7. pip installed the package just fine, but the script wasn't on my PATH!
Help?
setuptools, pip and easy_install don't modify the system PATH variable. The <python directory>\Scripts directory, where all of them install scripts by default, is normally added to PATH by the Python installer during installation.
If the Scripts folder was not added to your PATH during installation, you can fix that by running <python directory>\Tools\scripts\win_add2path.py. (See How can I find where Python is installed on Windows?)
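If you are not sure where that Scripts directory is on a particular machine, Python can tell you; this uses only the standard library sysconfig module:
python -c "import sysconfig; print(sysconfig.get_path('scripts'))"
On Windows this should print the <python directory>\Scripts path mentioned above.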
The above sample setup.py file worked fine for me (with the Scripts directory in PATH), by the way. I tested it with
python setup.py bdist_wheel
pip install dist\foo-0.0.1-py3-none-any.whl
and
python setup.py sdist
pip install dist\foo-0.0.1.zip
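Either way, you can double-check where pip put the console script; pip show -f lists the files that were installed (paths are shown relative to the Location it reports):
pip show -f foo
On Windows the generated launcher (foo.exe) should show up under the Scripts directory; on Linux it lands in the environment's bin directory.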
Do not expect pip or easy_install to modify your PATH; their job is to install a package into the current environment.
On Linux, if you use the global Python environment, you are likely to need root privileges, so you typically do:
$ sudo pip install <package>
However, this is not a recommended method, as it pollutes the system-wide Python environment (imagine two applications with slightly different requirements for the same package version and you have a problem).
The recommended method is to use some sort of virtualenv, which installs Python packages into a separate Python environment that is easy to remove and recreate.
It seems you have a custom Python-based script that you want to use on your system. For this scenario I use the following method (assuming the virtualenv tool is installed in the system-wide Python):
$ mkdir ~/apps
$ mkdir ~/apps/myutil
$ cd ~/apps/myutil
$ virtualenv .env
$ source .env/bin/activate
(.env)$ pip install <package-or-more>
Now the ~/apps/myutil/.env/bin directory contains all the scripts installed by pip; let us call yours myscript (there can be more).
The remaining step is to make a symlink from some directory that is already on PATH, e.g. /usr/local/bin:
$ cd /usr/local/bin
$ sudo ln -s ~/apps/myutil/.env/bin/myscript
From now on, you can call myscript even without the virtualenv being activated.
If you need to install a later version of the script:
$ cd ~/apps/myutil
$ source .env/bin/activate
(.env)$ pip install --upgrade <package-or-more>
As the script is symlinked, the latest version will automatically be available.
virtualenvwrapper allows you to create multiple named virtualenvs and gives you easy activation and deactivation. In that case I do the following:
$ mkvirtualenv bin-myscript
(bin-myscript)$ pip install <package-or-more>
(bin-myscript)$ which myscript
~/.Envs/bin-myscript/bin/myscript
(bin-myscript)$ cd /usr/local/bin
(bin-myscript)$ sudo ln -s ~/.Envs/bin-myscript/bin/myscript
Upgrade is even simpler:
$ workon bin-myscript
(bin-myscript)$ pip install --upgrade <package-or-two>
and you are done.
tox is a great tool for automating the creation of virtualenvs and for testing. I use it to create virtualenvs in directories of my choosing. For more information, see my other SO answer.
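If you go the tox route, a minimal tox.ini along these lines is enough to get a virtualenv created in a directory of your choosing (the package name foo and the .env location are just placeholders carried over from the example above; adjust to taste):
[tox]
envlist = py3
skipsdist = true

[testenv]
# create the virtualenv in ./.env instead of the default .tox/py3
envdir = {toxinidir}/.env
deps = foo
# replace with whatever smoke test makes sense for your script
commands = foo --help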