In Python, you can use a dictionary as the first argument to dict.fromkeys(), e.g.:
In [1]: d = {'a': 1, 'b': 2}
In [2]: dict.fromkeys(d)
Out[2]: {'a': None, 'b': None}
I tried to do the same with a dict-like object, but that always raises a KeyError, e.g.:
In [1]: class SemiDict:
   ...:     def __init__(self):
   ...:         self.d = {}
   ...:
   ...:     def __getitem__(self, key):
   ...:         return self.d[key]
   ...:
   ...:     def __setitem__(self, key, value):
   ...:         self.d[key] = value
   ...:
In [2]: sd = SemiDict()
In [3]: sd['a'] = 1
In [4]: dict.fromkeys(sd)
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
C:\bin\Console2\<ipython console> in <module>()
C:\bin\Console2\<ipython console> in __getitem__(self, key)
KeyError: 0
What exactly is happening here? And can it be resolved, other than using something like dict.fromkeys(sd.d)?
To create the dict, fromkeys iterates over its argument, so the argument must be iterable. SemiDict defines __getitem__ but no __iter__, so Python falls back to the legacy sequence-iteration protocol and calls sd[0], sd[1], and so on; the first call, sd[0], raises KeyError: 0 because the underlying dict has no key 0. One way to make it work is to add an __iter__ method to your dict-like class:
def __iter__(self):
    return iter(self.d)
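Putting it together, a minimal sketch of the class from the question with the suggested __iter__ method added:

```python
class SemiDict:
    def __init__(self):
        self.d = {}

    def __getitem__(self, key):
        return self.d[key]

    def __setitem__(self, key, value):
        self.d[key] = value

    def __iter__(self):
        # Delegate iteration to the underlying dict's keys,
        # so fromkeys (and any other consumer of iterables) works.
        return iter(self.d)


sd = SemiDict()
sd['a'] = 1
sd['b'] = 2
print(dict.fromkeys(sd))  # {'a': None, 'b': None}
```

With __iter__ in place, iter(sd) no longer falls back to the index-based protocol, so the KeyError disappears.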
An instance of SemiDict is neither a sequence nor iterable, so iterating over it fails. I'd imagine the most obvious solution would be to inherit from dict; why don't you do that?
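For illustration, a sketch of that suggestion: subclassing dict gives you iteration, len, containment checks, and fromkeys compatibility for free, and you only override the methods whose behavior you want to customize.

```python
class SemiDict(dict):
    """A dict subclass: inherits __iter__, __getitem__, __setitem__, etc.

    Override individual methods here only if custom behavior is needed.
    """
    pass


sd = SemiDict()
sd['a'] = 1
print(dict.fromkeys(sd))  # {'a': None}
```

If wrapping an internal dict (as in the original class) is a hard requirement, another option not mentioned in the answers is collections.abc.MutableMapping, which fills in the mapping interface from a few required methods.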