I was wondering whether passing module objects to a class or object that requires them, rather than using import directly, might be a good idea, since it avoids hard dependencies on those modules.
Could someone more versed in the Zen of Python than I am explain why this is (or is not) a terrible idea?
There is no way to pass parameters to a module at import time; however, you can restructure your code a bit and import the values it needs from a module as module-level (global) names.
Python's language reference for assignment statements states that if the target is an object's attribute that supports assignment, then the object will be asked to perform the assignment on that attribute. If you pass the object as an argument to a function, then its attributes can be modified in place.
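A minimal sketch of that point (the `Config` class and `enable_debug` function are hypothetical names, invented here for illustration): assigning to an attribute of an object passed into a function mutates that object in place, and the caller sees the change.

```python
class Config:
    """Simple container object; attribute assignment is supported."""
    debug = False

def enable_debug(cfg):
    # The assignment target is an attribute of the passed-in object,
    # so the object itself performs the assignment -- the change is
    # visible to the caller afterwards.
    cfg.debug = True

c = Config()
enable_debug(c)
print(c.debug)  # True
```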
__import__() parameters:
- name: the name of the module you want to import.
- globals and locals: determine how to interpret name.
- fromlist: objects or submodules that should be imported by name.
- level: specifies whether to use absolute or relative imports.
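As a brief illustration of those parameters, `__import__` is the low-level hook behind the import statement (for everyday dynamic imports, `importlib.import_module` is the recommended interface):

```python
# Import a module by name at runtime.
math_mod = __import__("math")
print(math_mod.sqrt(16.0))  # 4.0

# fromlist matters for dotted names: without it, __import__("os.path")
# returns the top-level package (os); with a non-empty fromlist it
# returns the submodule itself.
os_path = __import__("os.path", fromlist=["join"])
print(os_path.join("a", "b"))
```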
What you're talking about is called dependency injection and is considered a good practice for making your code testable. I don't think there's anything about Python that would make it unPythonic or a bad practice.
There are other ways you could do it in Python, for example by importing different modules depending on some kind of flag you pass in:
class Foo(object):
    def __init__(self, testing=False):
        if testing:
            import module_test as module
        else:
            import module
        self.module = module
But passing a reference to the module you wish to use is more flexible, separates concerns better, and is no less Pythonic than passing a reference to a class or instance (or string or integer) you wish to use.
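A small sketch of that pattern, using the standard json module as the real dependency (the `Serializer` and `FakeCodec` names are hypothetical, invented for this example): the class takes any module-like object exposing the API it needs, so tests can swap in a stub without any import machinery.

```python
import json

class Serializer:
    """Accepts any module (or module-like object) exposing dumps()."""
    def __init__(self, codec=json):
        self.codec = codec

    def save(self, obj):
        return self.codec.dumps(obj)

# Production use: the real json module is injected by default.
print(Serializer().save({"a": 1}))  # {"a": 1}

# Test use: a stub standing in for the module works just as well.
class FakeCodec:
    @staticmethod
    def dumps(obj):
        return "stubbed"

print(Serializer(FakeCodec).save({"a": 1}))  # stubbed
```

Because the injected object only needs to satisfy the interface the class actually uses, this works equally well with real modules, stub classes, or mock objects.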
For the ordinary (non-test) use case, you can use a default argument value:
class Foo(object):
    def __init__(self, module=None):
        if module is None:
            import module
        self.module = module