Josh Taylor
I have a class that wraps a large file and tries to make it look like a
string w.r.t. slicing. Here, "large file" means on the order of
hundreds of GB. All the slicing/indexing stuff through __getitem__()
works fine, but len() is quite broken. It seems to be converting the
value returned by __len__() to a 32-bit integer. If the conversion
yields a negative number, it raises an exception.
I'm running Python 2.4.1 on an Opteron running RedHat FC3. It's a
64-bit processor, and Python ints appear to be 64-bit as well, so even
if len() only works with ints, it should still be able to handle 64-bit
values.
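For reference, here's one way to check what the build thinks the native word size is (`struct.calcsize` is standard library; `BigFile` below is just my wrapper class, not anything standard). On this box both checks come back 64-bit:

```python
import struct
import sys

# Size in bits of a C long on this build; CPython 2.x plain ints are C longs,
# so this should report 64 on a 64-bit build.
print(struct.calcsize("l") * 8)

# Largest index/size value this build claims to support.
# (On Python 2.4 the analogous constant is sys.maxint.)
print(sys.maxsize)
```

So the interpreter itself clearly knows about 64-bit integers; the truncation seems to happen somewhere inside len().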
Here's a simple example that shows what I'm talking about (class name is just a stand-in for my wrapper):

>>> import sys
>>> sys.version
'2.4.1 (#1, Jun 22 2005, 16:00:46) \n[GCC 3.4.2 20041017 (Red Hat 3.4.2-6.fc3)]'
>>> class BigFile(object):
...     def __len__(self):
...         return 49000000000
...
>>> len(BigFile())
1755359744
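For what it's worth, 1755359744 is exactly what you get if you reinterpret 49000000000 as a signed 32-bit C int. A quick sketch of that arithmetic (truncate_to_int32 is just an illustrative helper I wrote, not anything in the stdlib):

```python
def truncate_to_int32(n):
    """Simulate storing an integer into a signed 32-bit C int."""
    n &= 0xFFFFFFFF           # keep only the low 32 bits
    if n >= 0x80000000:       # high bit set -> negative in two's complement
        n -= 0x100000000
    return n

print(truncate_to_int32(49000000000))  # -> 1755359744, matching len() above
```

When the low 32 bits happen to have the high bit set, the result comes out negative, which would explain the exception I see for some file sizes (len() refuses negative lengths).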
Is this a bug, a design decision, or do I have something misconfigured
in my Python build?
-Josh