Patricia said:
To me, this is a prime example of inappropriate extension.
Before extending a method to a wider domain, it is essential to consider
any underlying assumptions, and decide whether they remain valid over
the new domain. There is often far more to widening the domain than just
picking bigger data types.
In this case, the underlying assumption is the Gregorian calendar, and
its approximation to the relationship between day and year length. Leap
seconds currently arrive at a rate of about one a year, and most of them
are in the same direction. At one second per year, it would take only
86,400 years to
accumulate a day of leap seconds. Obviously, something is going to have
to change in time management well within the next hundred thousand years.
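As a back-of-the-envelope sketch of that arithmetic (the
one-second-per-year rate is the assumption stated above):

    public class LeapSecondDrift {
        public static void main(String[] args) {
            // Assumption from above: roughly one leap second per year,
            // almost all in the same direction.
            int secondsPerDay = 24 * 60 * 60;   // 86,400
            int leapSecondsPerYear = 1;
            // Years needed to accumulate a full day of leap seconds.
            int years = secondsPerDay / leapSecondsPerYear;
            System.out.println(years + " years");   // 86400 years
        }
    }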
Sometimes generality is free, sometimes it costs, in programming and
testing time, and in runtime efficiency. In this case, the generality
gain from using int rather than short is effectively free, and there is
some risk that short would not be wide enough.
On the other hand, int is simple, efficient, and handles very
effectively the entire domain over which the algorithm makes any sense.
It is a fine data type for this job. BigInteger, or even long, would be
overkill.
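For concreteness, a rough sketch of the ranges involved. Treating the
stored value as a count of days is an assumption for illustration only:

    public class IntRange {
        public static void main(String[] args) {
            // short tops out quickly; int comfortably covers any span
            // over which a Gregorian-calendar algorithm still makes sense.
            System.out.println("short max: " + Short.MAX_VALUE);   // 32767
            System.out.println("int max:   " + Integer.MAX_VALUE); // 2147483647

            // If the value were a day count (hypothetical), int holds
            // roughly 5.9 million years' worth of days.
            double gregorianYearDays = 365.2425;
            System.out.printf("~%.1f million years of days fit in an int%n",
                    Integer.MAX_VALUE / gregorianYearDays / 1_000_000);
        }
    }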
Patricia
By the very same reasoning, int, and in fact Java itself, is a poor
choice. The data type int falls short of the (assumed) requirement of
the user. To paraphrase that requirement (since requirements cannot be
accurately expressed in English - or in a programming language, Mr.
TDD!): a data type that extends from negative to positive infinity is
required to express the arrow of time. Of course, this cannot be
achieved - quantum mechanics notwithstanding - since it comes down to
things like physical hardware limitations.
The question now is: why is the user of the programming language
restricted to expressing his/her implementation of requirements using a
tool - the programming language - that falls short of the target? Why is
the restriction not imposed instead by other physical limitations? And
more importantly, why does the programming language even care about such
a restriction? More often than not I see requirements being exceeded;
here we are observing the less common case, which makes things a little
more difficult to argue. That is, it is much easier to argue from the
premise of excess requirements, since that case is far more common, and
then digress to this one, so I am going out on a limb.
Assuming Java provided a data type that permitted extension to infinity
(this excludes BigInteger, for other reasons that I have omitted for
brevity and hope are not picked up on), then it would certainly make int
wear the "overkill" label hitherto applied to BigInteger. int could then
be viewed as being exactly the same as the infinite data type, with the
addition of a restriction to finite bounds. One could go on to say that
this restriction is indeed an "addition", and since this addition
"exceeds requirement", it is therefore "overkill".
The moral of the story is: formally express your requirements, even if
only in your own head (admittedly, this is extremely difficult to do in
the face of the prolific "propaganda"). Unfortunately, Java works
against this ability in more ways than one, so we end up with a
contrived and blurred understanding of "what the hell it is we are
doing" - to state it loosely, but not to undermine its importance.
Nevertheless, I concede the immense power that
marketers/evangelists/proclaimed "experts"/etc. have...
In the meantime, I continue to document (though not publicly for now -
my recent resignation from IBM (i.e. The Filth) now permits me to do so) what
I, and a select few others, believe is a sound refutation and logical
proof of the invalidity of many of the tools (including Java itself)
that so many assume to exist legitimately within our common axioms.