Hi,
I'm trying to plot a histogram of some data that I receive at runtime.
I'm measuring distances between two points, and these distances can
range from 1 to 5,000,000. I want to use a logarithmic scale covering
-2^29 to +2^29. How can I determine, at any point in time, which
stratum (i.e., which cohort or portion of the scale) a value belongs to
(without using the square or square-root functions)?
I could keep mask bits for all possible scales, such as 2^2, 2^3, 2^4
up to 2^29, and AND each of these with the value to figure it out, but
is there a cleaner way of doing this?
Thanks a lot.
--Andre