This is my current directory structure. I am trying to import a function from src/helpers/log.py into src/data/download_dataset.py. I have followed this answer, but it still does not work.
|-- AUTHORS.rst
|-- CONTRIBUTING.rst
|-- HISTORY.rst
|-- LICENSE
|-- MANIFEST.in
|-- Makefile
|-- README.rst
|-- data
| |-- external
| |-- interim
| |-- processed
| `-- raw
| `-- wine-quality.csv
|-- docs
| |-- Makefile
| |-- authors.rst
| |-- conf.py
| |-- contributing.rst
| |-- history.rst
| |-- index.rst
| |-- installation.rst
| |-- make.bat
| |-- readme.rst
| `-- usage.rst
|-- dvc_mlflow
| |-- __init__.py
| `-- dvc_mlflow.py
|-- logs
|-- models
|-- requirements_dev.txt
|-- setup.cfg
|-- setup.py
|-- src
| |-- data
| | |-- __init__.py
| | `-- download_dataset.py
| |-- features
| | `-- __init__.py
| |-- helpers
| | |-- __init__.py
| | `-- log.py
| `-- models
| `-- __init__.py
|-- tests
| |-- __init__.py
| `-- test_dvc_mlflow.py
`-- tox.ini
I am importing the function log_error in src/data/download_dataset.py like so:
from helpers.log import log_error
But when I try to run the file using python3 src/data/download_dataset.py I get the error ModuleNotFoundError: No module named 'helpers'. I am a bit confused, because I already added __init__.py files to each of the directories to make them packages, but the issue still persists.
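For context on why this fails: when you run python3 src/data/download_dataset.py, Python puts src/data/ (the script's own directory) on sys.path, not the project root, so helpers is not visible as a top-level package. A common fix is to run the script as a module from the project root with an absolute import. A minimal sketch, using a hypothetical throwaway project layout mirroring yours:

```shell
# Build a tiny stand-in project (hypothetical paths/names for illustration).
mkdir -p proj/src/helpers proj/src/data
touch proj/src/__init__.py proj/src/helpers/__init__.py proj/src/data/__init__.py
echo 'def log_error(msg): print("ERROR:", msg)' > proj/src/helpers/log.py
# The script uses an absolute import rooted at the project directory.
printf 'from src.helpers.log import log_error\nlog_error("it works")\n' \
    > proj/src/data/download_dataset.py
# Run it as a module FROM THE PROJECT ROOT, not by file path:
cd proj && python3 -m src.data.download_dataset   # prints: ERROR: it works
```

Running with -m keeps the current directory (the project root) on sys.path, so the src package resolves without any path hacks.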
You can try the sys.path.append method: find the path to the directory containing the modules you want to import and append it to sys.path.
Example:
If my script is /home/user_name/Desktop/Scripts/Main.py and I want to import some file Factorial.py which lives in /home/user_name/Documents/OtherScripts/, I can do the following:
# Inside your Main.py file
import sys

# Add the directory that contains Factorial.py to the module search path.
sys.path.append("/home/user_name/Documents/OtherScripts/")

from Factorial import *
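A variation that avoids hardcoding an absolute path is to compute the project root relative to the script itself. A sketch, assuming the script lives two directories below the project root (as src/data/download_dataset.py does in the question):

```python
import sys
from pathlib import Path

# parents[0] is src/data, parents[1] is src, parents[2] is the project root
# (this indexing assumes the file sits at src/data/download_dataset.py).
project_root = Path(__file__).resolve().parents[2]
sys.path.append(str(project_root))

# With the project root on sys.path, an absolute import now resolves:
# from src.helpers.log import log_error
```

This keeps the script runnable by file path from any working directory, since the appended entry is derived from the file's own location.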
I found an even simpler way that avoids appending the path with sys.path.append(): just set the Python path in the command line using export PYTHONPATH="${PYTHONPATH}:/path/to/your/project/".
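To check that the export took effect, you can ask Python whether the directory landed on its module search path. A quick sketch (the project path is a placeholder; substitute your own root):

```shell
# Append a hypothetical project root to PYTHONPATH for this shell session.
export PYTHONPATH="${PYTHONPATH}:/path/to/your/project"
# Entries from PYTHONPATH show up in sys.path, so this should print True:
python3 -c 'import sys; print("/path/to/your/project" in sys.path)'
```

Note that export only affects the current shell session; add the line to your shell profile (or a .env file) if you want it to persist.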