I'm seeing really weird behavior with the hash function in Python. When I run the following command on Mac OS X (10.10), I get different values from different calls.
$ python -c "print hash(None)"
-9223372036579216774
$ python -c "print hash(None)"
-9223372036582852230
On the other hand, when I run the same thing on Ubuntu 14.04 I get:
$ python -c "print hash(None)"
596615
$ python -c "print hash(None)"
596615
To me it looks like, on OS X, Python is somehow using the memory address, while on Ubuntu it is not. From that I gather the hash function is probably implementation dependent. But shouldn't it be based only on the "value" of None? What do those numbers represent, and why does the behavior differ across operating systems even on the same Python version?
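For what it's worth, here is the quick check that made me suspect the address is involved (assuming CPython, where id() returns the object's memory address); running it twice on the Mac, both numbers change together:

$ python -c "print hash(None), id(None)"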
None.__hash__ maps to the _Py_HashPointer hashing function, so essentially the object's pointer is used as its hash. Since None is a singleton, this is safe, but it is not deterministic across runs. For a pointer cast to an integer type p of adequate size, the hash value is calculated as follows:
(p >> 4) | (p << (8 * SIZEOF_VOID_P - 4))
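Here is a minimal Python sketch of that rotation, assuming a 64-bit build (SIZEOF_VOID_P == 8) and an interpreter where None's hash is pointer-based (e.g. Python 2.7); the helper name pointer_hash is just for illustration:

import sys

def pointer_hash(address, pointer_size=8):
    """Sketch of the rotation used by CPython's _Py_HashPointer.

    `address` is the object's address (id(obj) in CPython) and
    `pointer_size` stands in for SIZEOF_VOID_P in bytes.
    """
    bits = 8 * pointer_size
    mask = (1 << bits) - 1
    # Rotate right by 4: the bottom bits of an address are usually zero.
    y = ((address >> 4) | (address << (bits - 4))) & mask
    # Reinterpret the unsigned result as a signed Py_hash_t.
    if y >= 1 << (bits - 1):
        y -= 1 << bits
    # CPython reserves -1 as an error value, so -1 is mapped to -2.
    return -2 if y == -1 else y

pointer_size = 8 if sys.maxsize > 2 ** 32 else 4
print(pointer_hash(id(None), pointer_size))
print(hash(None))  # should match where None's hash comes from its pointer

On interpreters that derive None's hash from its address, the two printed values agree; recent Python 3 releases give None a fixed hash, so there the second line differs.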
A comment in the source code explains the rotation:
bottom 3 or 4 bits are likely to be 0; rotate y by 4 to avoid excessive hash collisions for dicts and sets
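As a quick, CPython-specific illustration of why those bottom bits matter (a rough check, not a guarantee): object addresses are at least 8-byte aligned, so the low bits of id() are almost always zero, which is exactly what the rotation compensates for.

objs = [object() for _ in range(5)]
print([hex(id(o)) for o in objs])           # addresses end in 0 or 8
print(all(id(o) % 8 == 0 for o in objs))    # expected: True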