I defined a class:
class A:
    ''' hash test class
    >>> a = A(9, 1196833379, 1, 1773396906)
    >>> hash(a)
    -340004569
    This is weird, 12544897317L expected.
    '''
    def __init__(self, a, b, c, d):
        self.a = a
        self.b = b
        self.c = c
        self.d = d
    def __hash__(self):
        return self.a * self.b + self.c * self.d
Why does the hash() function return a negative integer in the doctest?
hash() appears to be limited to 32 bits on your machine. Judging by this question, your code would have produced the expected result on a 64-bit build, since 12544897317 fits comfortably in 64 bits. On a 32-bit build the value returned by __hash__ is reduced to fit a signed 32-bit integer, which is where the negative number comes from: on 32-bit Python 2, a long returned from __hash__ is itself hashed, which effectively reduces it modulo 2**32 - 1, and 12544897317 % (2**32 - 1) == 3954962727, which reinterpreted as a signed 32-bit value is exactly -340004569.
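On Python 3 the same reduction is observable directly: a large integer returned from __hash__ is hashed again as an int, i.e. reduced modulo sys.hash_info.modulus (2**61 - 1 on 64-bit builds, 2**31 - 1 on 32-bit ones). A small sketch, assuming CPython:

```python
import sys

class A:
    """Same class as in the question, minus the doctest."""
    def __init__(self, a, b, c, d):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __hash__(self):
        return self.a * self.b + self.c * self.d

a = A(9, 1196833379, 1, 1773396906)
raw = a.a * a.b + a.c * a.d   # 12544897317 -- too big for a 32-bit hash

# CPython folds an out-of-range __hash__ result by hashing it as an int,
# i.e. reducing it modulo sys.hash_info.modulus
print(raw)                                      # 12544897317
print(hash(a) == raw % sys.hash_info.modulus)   # True
```

On a 64-bit build the raw value fits under the modulus, so hash(a) equals 12544897317 unchanged; on a 32-bit build it would be folded down, mirroring what you observed on Python 2.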
The results of the built-in hash() function are platform dependent and constrained to the native word size. If you need a deterministic, cross-platform hash, consider using the hashlib module.
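For example, you could derive a stable hash from a canonical byte representation of the fields. A minimal sketch with hashlib; stable_hash is a hypothetical helper, and the four parameters mirror the class above:

```python
import hashlib
import struct

def stable_hash(a, b, c, d):
    """Platform-independent 64-bit hash of four integers.

    Unlike __hash__, the result does not depend on the word size
    of the interpreter or on hash randomization.
    """
    # Pack the fields into a fixed-width, fixed-endianness byte string
    payload = struct.pack("<4q", a, b, c, d)
    digest = hashlib.sha256(payload).digest()
    # Take the first 8 digest bytes as an unsigned 64-bit integer
    return int.from_bytes(digest[:8], "little")

h = stable_hash(9, 1196833379, 1, 1773396906)
print(h)  # same value on every platform and Python version
```

Note that a digest like this is suitable for persisting or comparing values across machines, but returning it from __hash__ is usually unnecessary, since dict and set only need consistency within one process.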