I defined a class:
class A:
    ''' hash test class

    >>> a = A(9, 1196833379, 1, 1773396906)
    >>> hash(a)
    -340004569

    This is weird, 12544897317L expected.
    '''
    def __init__(self, a, b, c, d):
        self.a = a
        self.b = b
        self.c = c
        self.d = d

    def __hash__(self):
        return self.a * self.b + self.c * self.d
Why does the hash() function return a negative integer in the doctest?
It appears to be limited to 32 bits. Judging by this question, your code might well have produced the expected result on a 64-bit machine (with those particular values, since the result fits in 64 bits).
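As a quick check (this assumes a 64-bit CPython 2 build, where a long that fits in the native word hashes to itself):

    >>> hash(12544897317L)
    12544897317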
The results of the built-in hash function are platform dependent and constrained to the native word size. If you need a deterministic, cross-platform hash, consider using the hashlib module.
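A minimal sketch of that approach; stable_hash is a hypothetical helper (not part of hashlib), and it assumes the four fields are integers:

    import hashlib

    def stable_hash(a, b, c, d):
        # Hash a canonical string form of the fields; the sha256
        # digest is identical on every platform and word size.
        data = '%d,%d,%d,%d' % (a, b, c, d)
        return int(hashlib.sha256(data).hexdigest(), 16)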
See object.__hash__ in the language reference, and notice this note:

    Changed in version 2.5: __hash__() may now also return a long integer object; the 32-bit integer is then derived from the hash of that object.
In your case, the expected 12544897317L is a long integer object, so Python derived the 32-bit integer as hash(12544897317L): it folds the value into signed 32-bit chunks and sums them, ((12544897317 & 0xFFFFFFFF) - (1 << 32)) + (12544897317 >> 32) = -340004571 + 2 = -340004569.
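You can check that arithmetic directly (assuming a Python 2 interpreter; print is used so a 32-bit build doesn't show a trailing L):

    >>> print (12544897317 & 0xFFFFFFFF) - (1 << 32) + (12544897317 >> 32)
    -340004569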
The algorithm is something like this:
    def s32(x):
        # Reduce x to its low 32 bits, interpreted as a signed value.
        x = x & ((1 << 32) - 1)
        if x & (1 << 31):
            return x - (1 << 32)
        else:
            return x

    def hash(x):
        # Sum the signed 32-bit chunks of x, least significant first.
        h = 0
        while x:
            h += s32(x)
            x >>= 32
        return h
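Applying it to the value from the doctest reproduces the result (note that this hash shadows the builtin; it is only an illustration):

    >>> print hash(12544897317L)
    -340004569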