Wanja said:
In my opinion it's been an unfortunate decision to have a "final"
keyword instead of a "var" keyword with a "final" default.
I respect that opinion but don't hold it myself, mostly because I came to Java
after it was invented and didn't have any say in the matter.
Now, I'm a fan of using the "final" keyword everywhere possible apart
from method and class declarations and I'll give you my reasons behind
it.
However, I do hold that this style is, formally, desirable.
Except you should use it on method and class declarations, too. As Bloch says,
you should design and document for inheritance or else prohibit it.
a) It expresses more exactly what the code really does. I find a certain
precision of expression beautiful. This is a rather stylistic view.
Indeed, it is a matter of style, and everywhere I've worked has had a style view of its own.
b) As soon as you're used to having every possible assignment declared
"final", each missing "final" will stand out like an alarm sign - you
immediately spot those assignments that change their value. It greatly
helps me understand the code. So it is of practical value, once you're
used to it.
Defensive programming is a defensible practice.
Detractors should understand that there is nothing of "helping the compiler"
about this. These are all justifications of semantics, and therefore relevant.
c) If I mistakenly (a typo, a code-completion slip) assign a value to
a "final" reference, the compiler will warn me immediately. Since I
always declare "final" what is never intended to change, this stops me
from making stupid mistakes. I find this is of huge practical value.
It's part of the larger strategy of getting the compiler to do as much work for you
as possible.
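A minimal sketch of that idea (my own example, not Wanja's code): every binding that never changes is "final", so the lone mutable accumulator stands out, and any accidental reassignment of a "final" variable is a compile error.

```java
// Sketch: 'final' everywhere possible; the one mutable local stands out.
public class FinalDemo {
    static int sum(final int[] values) {
        final int bonus = 10;     // never reassigned; the compiler enforces it
        int total = 0;            // the lone non-final: this one mutates
        for (final int v : values) {
            total += v;
        }
        // bonus = 20;            // would not compile:
        //                        // "cannot assign a value to final variable bonus"
        return total + bonus;
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3}));  // prints 16
    }
}
```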
My twisted mind says that Java's lack of runtime generics is an asset in this regard.
It forces the programmer to work much harder to get the type semantics correct at
compile time, and yes, oh, dear, to use an explicit run-time type token (RTTT) when
runtime genericity is de rigueur. But then at least you have whatever type assertions
possible handled by the compiler.
Okay, yes, I would appreciate runtime generics, but it's still a good idea to push things
to compile time.
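To make the RTTT point concrete, here is a sketch of an explicit run-time type token: since the JVM erases generics, we pass a Class&lt;T&gt; object around and let it perform a checked cast at runtime. The class and method names are my own illustration (the pattern echoes Bloch's typesafe heterogeneous container).

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: a Class<T> object serves as a run-time type token, recovering
// type assertions that erasure would otherwise discard.
public class Favorites {
    private final Map<Class<?>, Object> map = new HashMap<>();

    public <T> void put(final Class<T> type, final T instance) {
        map.put(type, type.cast(instance)); // cast() asserts the type at runtime
    }

    public <T> T get(final Class<T> type) {
        return type.cast(map.get(type));    // checked cast instead of an unchecked one
    }

    public static void main(String[] args) {
        final Favorites f = new Favorites();
        f.put(String.class, "hello");
        f.put(Integer.class, 42);
        System.out.println(f.get(String.class));  // prints hello
    }
}
```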
d) I hardly ever have a problem with using a value in an anonymous class,
and when I do, it tells me that I'm probably doing something stupid.
I think I get what you're saying here, but I'm not sure.
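My reading of that point, sketched in code (names are my own illustration): a local captured by an anonymous class must be final (or effectively final), so wanting to mutate it from inside the anonymous class is exactly the "something stupid" the compiler flags.

```java
// Sketch: an anonymous class capturing a final local.
public class CaptureDemo {
    interface Task { int run(); }

    static Task makeTask() {
        final int base = 40;           // captured by the anonymous class below
        return new Task() {
            @Override public int run() {
                // base++;             // would not compile: base is final
                return base + 2;       // reading the captured value is fine
            }
        };
    }

    public static void main(String[] args) {
        System.out.println(makeTask().run());  // prints 42
    }
}
```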
e) Adding parameters to methods is a code smell, each "final" keyword
augments that by growing the signature even faster. I find it funny
that one of the biggest design mistakes in the language (no final
default) is actually of some benefit here.
It's a choice. I disagree that it's a mistake. At least you can put 'final'
everywhere, even though it's annoying that you must.
Once you always use "final" where appropriate, and have become used to it,
the visual clutter starts to disappear.
I don't think the anti-'final'ists buy this one.
I don't agree that adding parameters to methods is per se a code smell, but
the practice is not to be abused either. If you factor your model correctly you
usually don't need many overloads, and the ones you do need stay manageable. Anyway.
But you do add parameters when you need to, though pundits now suggest
using builders instead. That makes the use of a mutable type (sorry, purists)
explicit and temporary, which is usually right.
And again there are exceptions to the rule. It's like a rubber band - the farther
you get from rules of thumb like "keep parameters few and overloads fewer",
the greater the tension. It doesn't mean never do it, it does mean understand
that the tension is there for a reason, and you should minimize it.
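A sketch of that builder alternative to a growing parameter list (class and field names are my own illustration): the mutability is confined to the builder, explicit and temporary, while the built object keeps all its fields final.

```java
// Sketch: a builder confines mutation; the built object is immutable.
public class Connection {
    private final String host;
    private final int port;
    private final boolean useTls;

    private Connection(final Builder b) {
        this.host = b.host;
        this.port = b.port;
        this.useTls = b.useTls;
    }

    public static final class Builder {   // the one explicitly mutable piece
        private String host = "localhost";
        private int port = 80;
        private boolean useTls = false;

        public Builder host(final String h)    { this.host = h;   return this; }
        public Builder port(final int p)       { this.port = p;   return this; }
        public Builder useTls(final boolean t) { this.useTls = t; return this; }
        public Connection build()              { return new Connection(this); }
    }

    @Override public String toString() {
        return host + ":" + port + (useTls ? " (tls)" : "");
    }

    public static void main(String[] args) {
        final Connection c = new Connection.Builder()
                .host("example.org").port(443).useTls(true).build();
        System.out.println(c);  // prints example.org:443 (tls)
    }
}
```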
Truly, everywhere I've worked, people would stick my head in the teapot like the
dormouse if I used 'final' like that. As with essay and paper publication,
every house has a house style. And it isn't really all that necessary
anyway.
Another rule of thumb is to keep methods short. In a short space you hardly
need 'final' to tell you not to change a local variable. You hardly have time
to use it once, let alone fear reassignment. So I cheerfully follow the seat-of-
the-pants crowd on this one.