I'm currently refactoring a project (formerly one big file) into several separate Python files, each of which runs a specific part of my application. E.g., GUIthread.py runs the GUI, Computethread.py does some maths, etc. Each thread uses functions from imported modules like math, time, numpy, etc.
I already have a file globalClasses.py containing class definitions for my datatypes etc., which each .py file imports at the start, as per the recommendation here: http://effbot.org/pyfaq/how-do-i-share-global-variables-across-modules.htm . This is working well.
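For illustration, the setup looks roughly like this (the DataPoint class below is a simplified stand-in; only the file name globalClasses.py matches my real project):
globalClasses.py
# globalClasses.py -- shared datatype definitions, imported by every other file
class DataPoint:
    # simplified stand-in for my real datatypes
    def __init__(self, timestamp, value):
        self.timestamp = timestamp
        self.value = value
GUIthread.py
# GUIthread.py -- imports the shared definitions at the start, like every other file
from globalClasses import DataPoint

point = DataPoint(0.0, 42.0)  # the shared datatype is usable here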
What I would like to do is have all my third-party module imports in the globals file as well, so that I can write, for example, import math once but have all of my project files able to use math functions.
Questions:
1. Is this possible?
2. Is it a good idea/is it good Python practice?
My current solution is just to put
import math
import time
import numpy
...
(plus imports for all the other modules I'm using as well)
at the top of every file in my project... But that doesn't seem very tidy, and it's easy to forget to move a dependency's import statement when moving chunks of code from file to file...
Yeah, I guess there is a more elegant way of doing this which will save redundant lines of code. Suppose you want to import some modules, say math, time and numpy; then you can create a file importing_modules (say) and import the various modules in it as from module_name import *, so importing_modules.py may look something like this:
importing_modules.py
from math import *
from numpy import *
from time import *
main.py
from importing_modules import *
# Now you can call the methods of those modules directly
print(sqrt(25))  # sqrt() can be called directly in place of math.sqrt() or importing_modules.sqrt()
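One detail worth knowing if you take this route (a side note beyond the sketch above): a bare from importing_modules import * re-exports every public name that file pulled in. You can narrow that down by defining __all__ in importing_modules.py, for example:
importing_modules.py
# Variant that restricts what a star import re-exports
from math import sqrt, pi
from time import time

__all__ = ["sqrt", "pi", "time"]  # only these names leak through "import *"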
The other answer shows how what you want is (sort of) possible, but doesn't address your second question about good practice.
Using import * is almost invariably considered bad practice. See "Why is import * bad?" and "Importing * from a package" from the docs.
Remember from PEP 20 that explicit is better than implicit. With explicit, specific imports (e.g. from math import sqrt) in every module, there is never confusion about where a name came from, your module's namespace includes only what it needs, and bugs are prevented.
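To see the kind of confusion star imports invite (an illustrative sketch, not taken from the linked docs): when two star imports provide the same name, the later one silently wins, so which sqrt you get depends on import order:
from numpy import *
from math import *  # silently rebinds sqrt from numpy's version to math's

print(sqrt(2.0))         # fine: math.sqrt handles a scalar
# sqrt([1.0, 4.0, 9.0])  # TypeError: math.sqrt rejects sequences; numpy's sqrt would not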
The downside of having to write a couple of import statements per module does not outweigh the potential problems introduced by trying to get around writing them.
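For comparison, one of your modules written with explicit imports stays short anyway (the rms helper is just an illustration):
Computethread.py
# Every dependency is declared explicitly at the top of the file
from math import sqrt  # import only the names this module actually uses
import numpy as np     # the conventional alias keeps numpy usage explicit

def rms(values):
    # illustrative helper: root-mean-square of a sequence of numbers
    return sqrt(np.mean(np.square(values)))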