I have this file structure:
/home/test
├── dirA
│   └── ClassA.py
└── dirB
    └── Main.py
With the following content in the files:
ClassA.py:
class ClassA:
    def __str__(self):
        return 'Hi'
Main.py:
from dirA.ClassA import ClassA

class Main:
    def main():
        a = ClassA()

if __name__ == '__main__':
    Main.main()
I change the current dir to:
$ cd /home/test/dirB
This works:
$ PYTHONPATH=/home/test python Main.py
This doesn't:
$ python Main.py
Traceback (most recent call last):
  File "Main.py", line 1, in <module>
    from dirA.ClassA import ClassA
ModuleNotFoundError: No module named 'dirA'
Adding these lines to Main.py has no effect:
import os, sys
# Get the top level dir.
path = os.path.dirname(os.path.dirname(__file__))
sys.path.append(path)
The module still can't be found! There are plenty of similar questions, but I couldn't make this work programmatically (i.e. without the PYTHONPATH env var). I understand that dirs are not modules, files are, but this works in PyCharm (is the IDE fixing PYTHONPATH?)
You need to make sure that you've altered your sys.path before you attempt to load any package that might depend on the altered path; otherwise your script will fail the moment it encounters an import statement. In other words, make sure your Main.py begins with:
import os
import sys

# Compute the parent of the directory containing this file (/home/test in the
# layout above) and add it to the module search path *before* any project import.
path = os.path.join(os.path.dirname(__file__), os.pardir)
sys.path.append(path)

from dirA.ClassA import ClassA
This ensures that the final import statement operates on the altered path.
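Equivalently, if you prefer pathlib, a variant along the same lines might look like this minimal sketch (it assumes the /home/test/dirA/ClassA.py layout from the question); the key point is still that the path manipulation happens before the project import:

import sys
from pathlib import Path

# Resolve the parent of the directory containing this file (/home/test) and
# put it at the front of the module search path before importing from dirA.
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from dirA.ClassA import ClassA  # now resolvable without setting PYTHONPATH

Using insert(0, ...) rather than append() makes the project root take precedence over any identically named module elsewhere on the path. Another option that should avoid editing sys.path entirely is to run the script as a module from the project root, e.g. cd /home/test && python -m dirB.Main, since Python then puts the current directory on the search path for you.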