Non-integer arguments to `random.seed()` are normally
replaced by the integer hash of the seed supplied. The value of
`hash(x)` appears to differ between 32-bit and 64-bit architectures
when the seed `x` is a string; the values are the same when the
seed is an integer.
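To make the difference visible, a small probe script can print the build width alongside the string's hash and the first draw after seeding, so the output can be compared across machines. This is only a sketch; note that Python 3 randomizes string hashes by default (`PYTHONHASHSEED`) and no longer feeds `hash(x)` to `random.seed()` for strings, so the cross-platform effect described here is specific to Python 2.

```python
import random
import sys

# Probe: print whether this is a 64-bit build, the hash a string seed
# reduces to, and the first value drawn after seeding. Comparing this
# output across machines shows whether two builds derive the same
# generator state from the same string seed.
seed = "hello"
print("64-bit build?", sys.maxsize > 2**32)
print("hash(seed)  =", hash(seed))

random.seed(seed)
print("first draw  =", random.random())

# Re-seeding with the same value must reproduce the same draw locally,
# even when hash(seed) differs across platforms.
random.seed(seed)
print("reproduced  =", random.random())
```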
This problem came up among several people trying to run the same code on
different platforms under Python 2.5 and 2.6.
Overall, there appear to be two different sets of results when
`x` is a string rather than an integer, corresponding to 32- vs.
64-bit builds. Regarding the `random.seed(x)` function, Beazley (Python Essential Reference, 4th ed., p. 254) writes: "If
`x` is not an integer, it must be a hashable object and the value of
`hash(x)` is used as a seed."
The Python documentation website says, "All of Python’s immutable
built-in objects are hashable, while no mutable containers (such as
lists or dictionaries) are. Objects which are instances of user-defined
classes are hashable by default; they all compare unequal, and their
hash value is their id()."
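That documented default is easy to check directly; a quick sketch (shown in Python 3, where the default hash is derived from `id()` rather than being equal to it):

```python
class Thing:
    # No __eq__ or __hash__ defined: instances fall back to the
    # identity-based defaults described in the docs.
    pass

a, b = Thing(), Thing()
print(a == b)              # distinct instances compare unequal
print(hash(a) == hash(b))  # and carry distinct, id-derived hashes
```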
Strings are immutable in Python. Integers that fit in the platform's hash width simply hash to their own values, so they're naturally the same on all installations.
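The integer case can be confirmed in CPython, where small integers are their own hash (true in both 2.x and 3.x), so an integer seed introduces no platform dependence; a minimal check:

```python
# In CPython, small integers hash to themselves, so hash(x) adds
# no platform dependence when the seed x is an integer.
for n in (0, 1, 42, 12345):
    assert hash(n) == n
print("small ints hash to themselves")
```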
Edit: I've added a little piece of code I used to explore the Python 2.6 hashing function to my public repository on BitBucket: https://bitbucket.org/dpb/show_hash/overview