Eric said:
That's 100-133 significand bits. H-format numbers on
the VAX could approach the lower end (I don't recall just
how many of its 128 bits were exponent), but I'm not aware
of any C compiler that made use of H-format.
The big question, though, is why do you want so many
digits? What meaning could you assign to a plus-or-minus
one fluctuation in the fortieth decimal place? If you're
trying to calculate the radius of the Universe in Angstrom
units, forty places are far more than you need.
I'll be sorry I did this, but...
angstrom
n : a metric unit of length equal to one ten billionth of a
meter (or 0.0001 micron); used to specify wavelengths of
electromagnetic radiation [syn: angstrom unit, A]
...and so 1 meter is 10,000,000,000 angstroms.
The speed of Light is 300 million meters per second. In angstroms we have
1e10 * 3e8. That would bring us to 3e18 or
3,000,000,000,000,000,000
angstroms per second, Light speed. That's only 19 digits, but it's only one
second. I'm not sure of the 'correct' number of seconds per year, but 60 * 60 * 24 *
365.25 yields 31,557,600 seconds. One Light Year would be about 9.46728e25
angstroms. Let's say 1e26 angstroms.
Ah now, what is the radius of the Universe? Even if we thought it could be
10 billion Light Years, that would be 1e10 Light Years, or only 1e36
angstroms. Eric is right. 40 digits will be enough.
How many digits can a 128-bit int represent? 39, as it turns out: 2^128 - 1 is about 3.4e38.
I'm right after all. I'm sorry I did this. ;-)