Nate Smith
Hello,
Is there any way to get around hash values being too big (getting
promoted to Bignums)? I don't mean the Hash class itself, but the
fixnum hash value assigned to each bucket. For example, in my LR
parser, each Production's hash is the sum of the hashes of its elements:
    def hash # Production#hash
      @nonTerminal.hash + @action.hash + @expansion.hash
    end
When @expansion.hash is very large, the sum gets large too. Values
like 67, 4, and -946450216 for @nonTerminal.hash, @action.hash, and
@expansion.hash, respectively, generate the following error:

    in `hash': bignum too big to convert into `int' (RangeError)
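For what it's worth, one common workaround is to combine the element
hashes with XOR and mask the result down to a Fixnum-sized value, so
the combined hash can never overflow into a Bignum. A minimal sketch
(the constructor, accessors, and eql? here are hypothetical filler
around the real Production class):

```ruby
class Production
  attr_reader :non_terminal, :action, :expansion

  def initialize(non_terminal, action, expansion)
    @non_terminal = non_terminal
    @action       = action
    @expansion    = expansion
  end

  # XOR the element hashes, then keep only the low 30 bits -- a value
  # that fits in a Fixnum on both 32- and 64-bit builds.
  def hash
    (@non_terminal.hash ^ @action.hash ^ @expansion.hash) & 0x3FFF_FFFF
  end

  # Objects used as Hash keys should define eql? consistently with hash.
  def eql?(other)
    other.is_a?(Production) &&
      non_terminal == other.non_terminal &&
      action == other.action &&
      expansion == other.expansion
  end
end
```

Masking loses some bits, but for hashing that only costs a few extra
collisions; eql? still distinguishes the colliding keys.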
My gut tells me there's no way around this, and that the only fix is
to restructure Production#hash to produce something smaller. Anyone
have any ideas? Thanks,
Nate