I'm working on a relatively large Python application, and there are several resources that I would like to keep as global variables accessible throughout several different modules. These values are things like the version number, version date, the global configuration, and some static paths to resources. I've also included a DEBUG
flag that gets set by a command line option so that I can run my application in a debug mode without needing the full environment.
I've been careful to ensure that the values I'm importing do not change over the course of running the program, and I've documented them as global constants that should not be touched. My code looks essentially like this:
```python
# Main.py
import wx
from gui import Gui

DEBUG = False
GLOBAL_CONFIG = None
VERSION = '1.0'
ICON_PATH = 'some/path/to/the/app.ico'

def main():
    global DEBUG, GLOBAL_CONFIG
    # Simplified
    import sys
    DEBUG = '--debug' in sys.argv
    GLOBAL_CONFIG = load_global_config()
    # Other set-up for the application, e.g. setting up logging, configs, etc
    app = wx.App()
    gui = Gui()
    app.MainLoop()

if __name__ == '__main__':
    main()
```
```python
# gui.py
import wx
from __main__ import DEBUG, GLOBAL_CONFIG, ICON_PATH

import controller

class Gui(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None)
        icon = wx.Icon(ICON_PATH, wx.BITMAP_TYPE_ICO)
        self.SetIcon(icon)
        # Always make a copy so we don't accidentally modify it
        conf = GLOBAL_CONFIG.copy()
        self.controller = controller.Controller(conf)
        # More setup, building the layout, etc
```
```python
# controller.py
from __main__ import DEBUG

import logging
log = logging.getLogger('controller')

class Controller(object):
    def __init__(self, conf):
        if DEBUG:
            log.info("Initializing controller in DEBUG mode")
        self.conf = conf
        # Other setup ...
```
This is obviously far stripped down from what my application actually is, and neglects error handling, documentation, and basically all implementation details.
Now, I've seen it said that this is a bad idea, but without an explanation of why. Since most results when googling for variants of "python import __main__" are questions about what `if __name__ == '__main__'` means, it's hard to find solid information on this topic. So far I've had no problems with it, and it's actually been really convenient.
So is this considered good Python practice, or is there a reason I should avoid this design?
I think there are two main (ha ha) reasons one might prescribe an avoidance of this pattern:

1. What `from __main__ import foo` actually imports is ambiguous: `__main__` is simply whichever module was used as the entry point, so the import changes meaning (or breaks outright) if the application ever gains a second entry point, or if a module that relies on it is reused elsewhere.
2. It creates a hidden ordering constraint: the names must already be assigned in the main module before any dependent module is imported, which is the PEP 8 problem discussed further down.

If you have total control over the application and there will never be another entry point or another use for your features, and you're sure you don't mind the ambiguity, I don't think there's any objective reason why the `from __main__ import foo` pattern is bad. I don't like it personally, but again, that's basically for the two reasons above.
I think a more robust/developer-friendly solution is something like this: create a special module specifically for holding these super-global variables. You can then import that module and refer to `module.VAR` anytime you need the setting. Essentially, you are creating a dedicated module namespace in which to store super-global runtime configuration.
```python
# conf.py (for example)
# This module holds all the "super-global" stuff.

def init(args):
    global DEBUG
    DEBUG = '--debug' in args
    # set up other global vars here.
```
You would then use it more like this:
```python
# main.py
import conf
import app

if __name__ == '__main__':
    import sys
    conf.init(sys.argv[1:])
    app.run()
```
```python
# app.py
import conf

def run():
    if conf.DEBUG:
        print('debug is on')
```
Note the use of `conf.DEBUG` rather than `from conf import DEBUG`. This construction means that you can alter the variable during the life of the program, and have that change reflected elsewhere (assuming a single thread/process, obviously).
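The difference matters because `from conf import DEBUG` copies the current binding at import time, while `conf.DEBUG` reads the module attribute on every access. A minimal sketch of that distinction, using a synthetic stand-in module so it runs standalone:

```python
import sys
import types

# Synthetic stand-in for the conf module above, registered in sys.modules
# so this snippet is self-contained.
conf = types.ModuleType('conf')
conf.DEBUG = False
sys.modules['conf'] = conf

from conf import DEBUG    # copies the binding as it is right now: False

conf.DEBUG = True         # later, e.g. conf.init() flips the flag

print(DEBUG)              # still False -- the from-import copy never updates
print(conf.DEBUG)         # True -- attribute access sees the live value
```

This is exactly why the answer recommends keeping the `conf.` prefix everywhere the setting is read.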
Another upside is that this is a fairly common pattern, so other developers will readily recognize it. It's easily comparable to the `settings.py` file used by various popular apps (e.g. `django`), though I avoided that particular name because `settings.py` is conventionally a bunch of static objects, not a namespace for runtime parameters. Other good names for the configuration namespace module described above might be `runtime` or `params`, for example.
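For instance, a `runtime.py` along those lines might parse its flags with `argparse` instead of scanning `sys.argv` by hand. This is only a sketch; the module and option names are illustrative:

```python
import argparse

# Module-level default; init() overwrites it once at startup.
DEBUG = False

def init(args=None):
    """Populate the module namespace from command-line arguments."""
    global DEBUG
    parser = argparse.ArgumentParser()
    parser.add_argument('--debug', action='store_true')
    DEBUG = parser.parse_args(args).debug

init(['--debug'])   # in main.py you would call runtime.init(sys.argv[1:])
print(DEBUG)        # True
```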
Doing so requires violating PEP 8, which specifies:

"Imports are always put at the top of the file, just after any module comments and docstrings, and before module globals and constants."

In order for `gui.py` to successfully run `from __main__ import DEBUG`, you would have to assign `DEBUG` before `import gui`. As `Main.py` is written above, `from gui import Gui` executes before `DEBUG = False`, so the import in `gui.py` would actually raise an `ImportError`; fixing that means moving module globals above your imports, which is exactly the ordering PEP 8 forbids.
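That hidden ordering constraint can be simulated in a single file by registering a stand-in module (here called `fake_main`, a made-up name used only so this sketch runs on its own) and attempting the from-import before and after the attribute exists:

```python
import sys
import types

# Stand-in for the real __main__ module.
main = types.ModuleType('fake_main')
sys.modules['fake_main'] = main

try:
    from fake_main import DEBUG   # DEBUG has not been assigned yet
except ImportError:
    print('fails: DEBUG must be set before dependent modules import it')

main.DEBUG = True                 # as main() would do at startup
from fake_main import DEBUG       # now the same import succeeds
print('DEBUG =', DEBUG)
```

The same thing happens with the real `__main__`: the from-import works only if the assignment has already run, which forces globals above imports.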