Why don't I ever see the following in Python code?
    class A:
        def __init__(self, ...):
            # something important

    class B(A):
        __init__ = A.__init__
It seems to work on my box with Python 2.5, 2.6, 2.7, and PyPy 1.8.
I see the following a lot, instead:
    class B(A):
        def __init__(self, *args, **kwargs):
            A.__init__(self, *args, **kwargs)

or something using super.
I like my first example better (explicit is better than implicit!) but I'm worried that it's not kosher for some reason. Is there something wrong or bad with it?
EDIT: Yep, I meant A.__init__, not self.__init__.
Sure, it'll work (assuming you get it right, which your example does not), but there's no reason to do it. If you want B to call A's __init__(), just don't define __init__() on B and that will happen automatically. That's the whole point of inheritance, after all.
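Here's a minimal sketch of that default behaviour, with made-up names:

    class A(object):
        def __init__(self, value):
            # the "something important" initialization
            self.value = value

    class B(A):
        pass  # no __init__ defined, so attribute lookup finds A.__init__

    b = B(42)
    print(b.value)  # 42 -- A.__init__ ran automatically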
The only reason you'd ever want to write an __init__() for B is if you wanted to do something when initializing a B instance in addition to what is done to initialize an A instance. In that case you'd write additional code besides just calling A.__init__(), so a simple assignment wouldn't suit there, either.
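In other words, the usual reason to write __init__() on B at all looks roughly like this (the attribute names are just illustrative):

    class A(object):
        def __init__(self, value):
            self.value = value

    class B(A):
        def __init__(self, value, extra):
            A.__init__(self, value)   # or super(B, self).__init__(value)
            self.extra = extra        # the extra work that justifies a new __init__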
I suppose you could use this if you wanted to protect B's __init__() from being changed by a monkey-patch to A. But that seems like rather an edge case. Worse, it's unexpected behavior and a blatant denial of Python's dynamic nature.