Input is one of the most basic things a program has to do.
It is also inherently difficult when the programmer does not control what is
entered.
For instance, the input 100000000000000000000000 will overflow an integer and
be approximated in a floating-point type. However, it is still a number, and it
may even be a valid number for the problem domain. So you have to be very
careful about what you tell the user.
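To make this concrete, here is a minimal C++ sketch (the language and the use
of std::stoll/std::stod are my choice for illustration, nothing the question
specifies): the same string is out of range as a 64-bit integer, yet parses
fine as a double, just not as the value the user typed.

    #include <iostream>
    #include <stdexcept>
    #include <string>

    int main() {
        const std::string input = "100000000000000000000000";  // 10^23

        // Parsing as a 64-bit integer fails: the value exceeds the
        // maximum long long (9223372036854775807).
        try {
            long long n = std::stoll(input);
            std::cout << "integer: " << n << '\n';
        } catch (const std::out_of_range&) {
            std::cout << "integer parse: out of range\n";
        }

        // Parsing as a double succeeds, but the stored value is only
        // an approximation of what the user entered.
        double d = std::stod(input);
        std::cout.precision(17);
        std::cout << "double:  " << d << '\n';  // 9.9999999999999992e+22
    }

So the honest error message is not "that is not a number"; it is that the
number cannot be held exactly in the types you have.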
100000000000000000000000 has only one significant decimal digit, but that
alone does not make it exact in a binary floating-point type: 10^23 =
2^23 * 5^23, and 5^23 needs 54 bits, one more than fits in an IEEE 754
double's 53-bit significand, so a double stores it as
99999999999999991611392. (10^22 is the largest power of ten a double can
represent exactly.)
Now, 100000000000000000000001 is another story again: it rounds the other
way, to 100000000000000008388608.
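A short sketch of that rounding, assuming IEEE 754 doubles (the usual case on
modern hardware):

    #include <cstdio>

    int main() {
        // Both literals exceed what a 53-bit significand can hold exactly,
        // so the compiler rounds each to the nearest double.
        double a = 100000000000000000000000.0;  // 10^23: a tie, rounds down
        double b = 100000000000000000000001.0;  // past the tie, rounds up

        std::printf("%.0f\n", a);  // 99999999999999991611392
        std::printf("%.0f\n", b);  // 100000000000000008388608
        std::printf("a == b: %s\n", a == b ? "true" : "false");  // false
    }

10^23 lies exactly halfway between two adjacent doubles, so round-half-to-even
picks the lower one; adding 1 pushes the value past the midpoint, and it
rounds up instead.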