A specific setup requires me to create local variables in __init__.py that mask modules from the same package. E.g. the variable y (in the local context of __init__.py) shall hide the module y.py, and the statement import x.y shall yield the local variable instead of loading the module.
If you do not want to read about the specific setup, scroll down to the question; it is understandable without the details.
I have implemented a set of Python 2.7 packages, each of which may require individual configuration settings. For convenience, I was planning to provide configuration defaults per package that can be locally overwritten by whoever uses one of the packages.
(The rationale for this is to distribute default settings when deploying an app to a machine running a specific environment (a server, workstation, laptop, etc.), but at the same time to allow overriding configurations without messing up the local repository or resetting local adaptations on code updates.)
The directory structure example is:
~/pkg/
|
+- package_a/
| |
| +- __init__.py
| +- mod_x.py
| +- mod_y.py
|
+- package_b/
| |
| +- __init__.py
| +- mod_z.py
|
+- config/
| |
| +- __init__.py
| +- package_a.py # Should locally override <pkg>_sample.py
| +- package_a_sample.py
| +- package_b_sample.py
|
+- test_this.py
I'd like to access the settings stored under config/ like regular module imports, e.g.:
# ~/pkg/test_this.py
import config.package_a as cfg_a
... but have it implicitly switch to the overriding file, if it exists.
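Spelled out by hand, that fallback would look something like the sketch below at every import site; this is exactly the boilerplate I want to avoid (names as in the layout above):

# manual fallback, repeated wherever the configuration is needed
try:
    import config.package_a as cfg_a         # local override, if present
except ImportError:
    import config.package_a_sample as cfg_a  # shipped defaults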
In order to somehow automate the process, I am dynamically creating local variables pointing to the correct configuration file imports. Using the imp module, I can import a source file and name it at the same time. (I.e., at runtime you cannot distinguish whether <pkg>_sample.py or <pkg>.py was loaded to serve the configuration.)
I finally ended up with this:
# ~/pkg/config/__init__.py
import os
import imp

__all__ = ['datastore']
_cfgbase = os.path.dirname(os.path.realpath(__file__))

for cfgmodule in __all__:
    # prefer the local override <name>.py, fall back to <name>_sample.py
    if os.path.isfile(os.path.join(_cfgbase, cfgmodule + '.py')):
        locals()[cfgmodule] = imp.load_source(
            cfgmodule, os.path.join(_cfgbase, cfgmodule + '.py'))
    else:
        locals()[cfgmodule] = imp.load_source(
            cfgmodule, os.path.join(_cfgbase, cfgmodule + '_sample.py'))
This actually creates a local reference to the required source file (skipping <pkg>_sample.py when <pkg>.py exists in config/).
I can use it from other modules/scripts with from config import package_a as cfg_a.
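For example (the attributes of cfg_a are whatever the configuration files define; some_setting is a made-up name here):

# ~/pkg/test_this.py
from config import package_a as cfg_a  # resolves to package_a.py or package_a_sample.py
print cfg_a.some_setting               # hypothetical setting defined in the config file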
Essentially, this question may boil down to the well-known import x.y vs. from x import y thing. But there is a difference here. I know that import x.y requires y to be a module. Is there any possibility to hide a module in its package's __init__.py and to provide a local variable instead on import?
- from x import y yields the local variable y from x's __init__.py.
- import x.y always imports the module, even if a local variable y exists in __init__.py.

I cannot force everyone to always use the former import statement; people like to use the latter one in their code (the sketch below shows the difference with my loader in place).
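To make the difference concrete, assume only package_a_sample.py exists in config/ and the __init__.py above has already run:

# with only config/package_a_sample.py on disk:
from config import package_a  # works: binds the attribute set in __init__.py
import config.package_a       # ImportError: the import system finds no
                              # 'config.package_a' entry in sys.modules and
                              # no config/package_a.py file to load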
Any advice here?
Edited: Fixed title. Sorry.
Thanks @martijn-pieters for pointing out sys.modules. Actually, my approach would have worked perfectly without explicitly adding the new import to sys.modules; I had just failed to properly name the new imports:
locals()[cfgmodule] = imp.load_source(
    'config.' + cfgmodule, os.path.join(_cfgbase, cfgmodule + '.py'))
This solves the issue, as it no longer registers the new submodule under its bare name (here: package_a) but as a submodule of my config package (config.package_a).
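For completeness, the whole loop with the corrected naming; I have restructured it slightly here so the load call appears only once, but the behavior is the same:

# ~/pkg/config/__init__.py (fixed)
import os
import imp

__all__ = ['datastore']
_cfgbase = os.path.dirname(os.path.realpath(__file__))

for cfgmodule in __all__:
    _path = os.path.join(_cfgbase, cfgmodule + '.py')
    if not os.path.isfile(_path):
        _path = os.path.join(_cfgbase, cfgmodule + '_sample.py')
    # the dotted name makes imp.load_source register the module in
    # sys.modules as 'config.<name>', so `import config.<name>` finds it
    locals()[cfgmodule] = imp.load_source('config.' + cfgmodule, _path)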
Thanks a lot!
import x.y does not really require y to be a module. import x.y looks up the 'x' and 'x.y' keys in the sys.modules structure. If both are found, then x is bound to sys.modules['x']. Only if 'x.y' does not exist is Python going to look for a module to load.
The trick, then, is to stuff your y into sys.modules:
sys.modules['x.y'] = y
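Applied to the question's layout, a minimal sketch (note that imp.load_source already registers the module under the name you pass, so the explicit assignment only spells the trick out):

# ~/pkg/config/__init__.py
import os
import sys
import imp

_cfgbase = os.path.dirname(os.path.realpath(__file__))
package_a = imp.load_source('config.package_a',
                            os.path.join(_cfgbase, 'package_a_sample.py'))
sys.modules['config.package_a'] = package_a  # `import config.package_a` now succeeds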