> However, I've come across the problem in JavaScript even when
> multiplying and dividing integers. To me, that seems REALLY weird.
First of all, you've come across the problem in dividing integers, not
in multiplying them. Multiplying two integers produces another integer,
which is represented exactly as long as it fits into the 53-bit
mantissa; dividing two integers, on the other hand, can produce a
fraction that has no finite binary representation.
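A minimal sketch (any JavaScript console will do; the comments show the
values actually produced):

  // Integer multiplication stays exact while the result fits in the
  // 53-bit mantissa of an IEEE-754 double; integer division can yield
  // a value with no finite binary expansion.
  3 * 7;         // 21 (exact)
  123456789 * 8; // 987654312 (exact)
  5 / 4;         // 1.25 (exact: the denominator is a power of 2)
  1 / 3;         // 0.3333333333333333 (rounded)
  1 / 10;        // displayed as 0.1, but stored only approximately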
Secondly, it is not a problem particular to JavaScript: it holds for any
machine that computes with binary (dyadic) numbers internally while
reading and writing decimal (decadic) numbers, which means for every
discrete computer that exists now or has ever existed. This is why it is
always amusing to read nostalgic references of the kind "and model X did
not have it".
It is always a programming choice which behaviour to implement: make the
output look "as expected by an average user", accepting some loss of
precision, or output the value "as it really is, even if unexpected by
an average user". JavaScript takes the second approach; some other
environments may offer the first.
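A short illustration of the two approaches in JavaScript itself (toFixed
is the standard way to opt into rounded, "as expected" output for
display):

  var x = 0.1 + 0.2;
  x;             // 0.30000000000000004  ("as it really is")
  x.toFixed(2);  // "0.30"               ("as expected", rounded for display)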
Basically, the decadic rationals (numbers with a finite decimal
expansion) are a rather small subset of the real numbers, and those of
them that are also representable as dyadic rationals (numbers with a
finite binary expansion) form a much smaller subset still. Loosely
speaking, only some numbers can be written exactly as a finite sequence
of digits in the decimal system, and out of those, only some can also be
written as a finite sequence of digits in the binary system.
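Concretely, in JavaScript:

  // 0.5 = 1/2 is a dyadic rational, so it is stored exactly.
  0.5 + 0.5 === 1;          // true
  // 0.2 = 1/5 terminates in decimal but not in binary, so it is stored
  // only approximately and the errors accumulate.
  0.2 + 0.2 + 0.2 === 0.6;  // false
  (0.2).toPrecision(21);    // "0.200000000000000011102", the value actually stored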
The full mathematics describing all of these relations is rather
involved, but I can post it here if needed. From the practical point of
view,
http://www.jibbering.com/faq/index.html#FAQ4_6
and
http://www.jibbering.com/faq/index.html#FAQ4_7
provide enough information, though I would argue with some of the
rounding ideas in FAQ4_6.
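For reference, here is the usual "scale, round, unscale" trick for
rounding to a given number of decimal places (a generic sketch, not
necessarily the exact function FAQ4_6 gives), together with the kind of
edge case I would argue about:

  function roundTo(value, digits) {
    // Scale into the integer range, round, and scale back.
    var factor = Math.pow(10, digits);
    return Math.round(value * factor) / factor;
  }
  roundTo(0.30000000000000004, 2);  // 0.3
  roundTo(1.005, 2);                // 1, not 1.01: 1.005 is stored as
                                    // 1.00499999..., so the half that
                                    // Math.round should round up is
                                    // already gone after scaling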
For