I'm trying to integrate TravisCI into my workflow, and realized I had some dependencies because of my old directory structure (not having self-contained, virtualenv-able git repos).
When I try to run nosetests locally, it runs the tests just fine; when TravisCI tries to run them, it fails with an import error. Specifically, I have, as one of the lines in my test script:
from myproject import something
My directory structure inside my git repo myproject is something like:
.travis.yml
requirements.txt
something.py
tests/
    test_something.py
I've added nose to its requirements.txt, and the tests always pass locally. I feel like I still haven't understood absolute-vs-relative imports, and I can't tell if that's coming into play here, or if I'm just doing something obvious and dumb in my project.
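For reference, the test file itself is nothing fancy; simplified, it boils down to something like this (the placeholder test below just stands in for my real assertions):

# tests/test_something.py (simplified; the real assertions are omitted)
from myproject import something  # the import that fails on TravisCI

def test_something_basic():
    # placeholder standing in for the real tests
    assert something is not None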
Desired outcome: figure out why TravisCI is failing, and fix my repo accordingly, so that I can commit and have things build correctly, both locally and on TravisCI. If that requires more drastic changes like "you should have a setup.py that does blah-blah to the environment" or similar, please let me know. I'm new to this aspect of Python, and find the current documentation overwhelmingly unclear.
As an FYI, I found this question, and adding --exe doesn't help, nor does it seem to be the same issue.
I see there are no answers and I encountered the same issue, so I am posting here in the hope of helping somebody:
The quick fix for me was to add this line, export PYTHONPATH=$PYTHONPATH:$(pwd), to the .travis.yml (it simply puts the repository root on the import path):
before_install:
- "pip install -U pip"
- "export PYTHONPATH=$PYTHONPATH:$(pwd)"
Another option, which should be the default as it is the most elegant, is to have a setup.py configured like:
from setuptools import setup, find_packages
setup(name='MyPythonProject',
      version='0.0.1',
      description='What it does',
      author='',
      author_email='',
      url='',
      packages=find_packages(),
      )
And then add this line to the .travis.yml:
before_install:
- "pip install -U pip"
- "python setup.py install"
A third option is to change the layout of the project so that the tests folder sits under the application one (the one with your core Python code), such as:
.travis.yml
requirements.txt
app
|_ tests
| |_ test_application.py
|_ application.py
And then run the tests in Travis with coverage and nosetests like:
script:
- "nosetests --with-coverage --cover-package app"