> The problem with the "Java way" is that it tries too hard to be
> some sort of "pure" GC'd language, trying to protect programmers
> from themselves, when real problems or real programmers don't
> need such "pureness" or guaranteed safety 100% of the time.[1]
> [1] Java doesn't guarantee your safety in many other areas. For
> example, try writing an equals() without a hashCode() (or vice
> versa) and see what kind of trouble you get into. So why should
> Java be so fascist about memory safety? It seems heavily
> lopsided, as if memory problems were the only problems with
> programs.
Although programmer-safety is nice, it isn't the main goal behind Java's
memory model. The more important concern is security-safety. To
understand this, you have to see that Java tries something that is
somewhat unique among mainstream programming languages (but common for
scripting languages). It offers a security system above and beyond that
provided by the operating system. That security system guarantees that
it is impossible to perform certain actions such as reading and writing
local files, even when the OS would allow it.
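To make this concrete, here is a minimal sketch (the class name and
the blanket-permissive checkPermission are my own) of the VM refusing
a file read that the OS itself would happily allow:

    import java.io.FileReader;

    public class SandboxDemo {
        public static void main(String[] args) throws Exception {
            System.setSecurityManager(new SecurityManager() {
                // For this sketch, allow everything else...
                @Override
                public void checkPermission(java.security.Permission p) { }
                // ...but veto every file read, whatever the OS says.
                @Override
                public void checkRead(String file) {
                    throw new SecurityException("VM forbids reading " + file);
                }
            });
            // Throws SecurityException even if the OS grants read access.
            new FileReader("/etc/passwd").close();
        }
    }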
To provide this guarantee, Java needs to GUARANTEE that there is NEVER
an instance of completely undefined behavior in any application. There
is partially undefined behavior, such as what System.getProperty returns
when passed "os.name"... but there remains a requirement that the return
value will either be null or point to a valid String. If it were ever
possible to get a pointer to some random non-allocated point in space,
then the security guarantee would be thwarted.
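A trivial sketch of that paragraph's point (the class name is mine):

    public class OsNameDemo {
        public static void main(String[] args) {
            // Partially defined: the value varies by platform, and the
            // spec allows null; but the reference is never anything other
            // than null or a valid String. There is no third possibility.
            String os = System.getProperty("os.name");
            System.out.println(os == null ? "unknown" : os);
        }
    }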
In other words, the concern isn't over whether a benevolent programmer
will suffer from accidental memory corruption... it's whether a
malicious programmer can intentionally modify memory to which they
haven't been given a pointer. If the latter were
possible, then it becomes a feasible jump from there to figuring out a
way to modify the state of an instance of SecurityManager and force it
to let you trash someone's hard drive.
That's the important difference between memory issues and failing to
override hashCode. The latter generates incorrect BUT DEFINED behavior.
It does not pose a security risk.
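To see the difference, take the equals()/hashCode() trap from the
footnote. A quick sketch (class and field names are mine):

    import java.util.HashSet;
    import java.util.Set;

    public class BrokenHashing {
        // Overrides equals() but not hashCode(), breaking the contract
        // documented on java.lang.Object.
        static class Point {
            final int x, y;
            Point(int x, int y) { this.x = x; this.y = y; }
            @Override
            public boolean equals(Object o) {
                return o instanceof Point
                        && ((Point) o).x == x && ((Point) o).y == y;
            }
            // No hashCode() override: equal Points get identity hash
            // codes, so they usually land in different buckets.
        }

        public static void main(String[] args) {
            Set<Point> set = new HashSet<>();
            set.add(new Point(1, 2));
            // Almost always prints "false": wrong, but perfectly defined.
            System.out.println(set.contains(new Point(1, 2)));
        }
    }

The lookup gives a wrong answer, but every step of it is defined
behavior; nothing ever escapes the type system.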
> If Java had some sort of reference-counting feature combined
> with GC, the documentation could clearly state something like:
> "Don't create cycles. Just don't. Really. If you do, you'll
> just have to wait for the regular mark-and-sweep GC to clean up
> your mess," and have this in *addition* to the current GC,
> perhaps by the introduction of a CountedReference class.
Here's the problem, though. First, what classes have destructors?
Let's say there are a hundred of them. Next, what kinds of references
can refer to them? Any reference to a superclass or superinterface
MIGHT refer to a class with a destructor. Because class loading in Java
is dynamic, really any reference to any non-final type MIGHT refer to a
class with a destructor. Remember that reference-counting has to happen
ALL the time or NONE of the time for any given object. Half-way
reference counting is otherwise known as memory corruption.
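To illustrate, here is a hypothetical sketch; FileHandle is a made-up
class that would need a destructor:

    public class AliasingDemo {
        static class FileHandle { /* pretend this one has a destructor */ }

        public static void main(String[] args) {
            Object a = new StringBuilder();  // nothing to destruct
            Object b = new FileHandle();     // counted object behind a
                                             // plain Object reference

            // The static type at both stores below is just Object, so
            // the VM cannot tell which stores touch a counted object.
            // It would have to emit count updates for every store:
            a = b;      // must increment FileHandle's count
            b = null;   // must decrement it
            System.out.println(a.getClass().getName());
        }
    }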
So where does reference-counting have to occur? Ultimately, in AT LEAST
80% of all objects in the VM... and probably a lot closer to 95 to 100%.
Furthermore, some of the most common reference types (such as Object)
are in the list that must be reference-counted. Garbage collection
can be made to perform acceptably, but it's a tough job, and now you
are introducing a SECOND, redundant form of garbage collection into the
system, and it is one that exhibits horrible performance characteristics
compared to more advanced algorithms like copying collection. This is
looking like a pretty dismal future for Java performance.
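To see where the cost comes from: every reference store would have to
expand into something like the following retain/release pair. This is
CountedReference as the quoted proposal imagines it; the class is
entirely my own sketch, and a real version would need VM support at
every assignment site.

    import java.util.concurrent.atomic.AtomicInteger;

    // Entirely hypothetical; nothing like this exists in java.lang.ref.
    final class CountedReference<T> {
        private final AtomicInteger refs = new AtomicInteger(1);
        private T referent;
        private final Runnable destructor;

        CountedReference(T referent, Runnable destructor) {
            this.referent = referent;
            this.destructor = destructor;
        }

        T get() { return referent; }

        CountedReference<T> retain() {   // on every aliasing store
            refs.incrementAndGet();
            return this;
        }

        void release() {                 // on every overwrite/scope exit
            if (refs.decrementAndGet() == 0) {
                destructor.run();        // prompt cleanup, unless the
                referent = null;         // object sits on a cycle and
            }                            // the count never hits zero
        }
    }

Two atomic counter updates per reference store, applied to 80 to 100%
of all objects, is exactly the overhead a copying collector never pays.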
There's a good reason why deferred GC is commonly used in the first
place, and it's not cyclical references (a problem which, incidentally,
is basically solved, albeit with some overhead for the solution).
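For the record, here is the cycle case that naive counting leaks and
that a tracing collector reclaims with no special handling (the demo
class is my own):

    public class CycleDemo {
        static class Node {
            Node next;
            byte[] payload = new byte[1 << 20];  // 1 MB, to make a leak obvious
        }

        public static void main(String[] args) {
            for (int i = 0; i < 10000; i++) {
                Node a = new Node();
                Node b = new Node();
                a.next = b;
                b.next = a;  // two-node cycle, garbage after this iteration
            }
            // Naive refcounting would leak roughly 20 GB here; Java's
            // tracing GC reclaims each cycle, so this runs in a small heap.
            System.out.println("done");
        }
    }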
--
www.designacourse.com
The Easiest Way To Train Anyone... Anywhere.
Chris Smith - Lead Software Developer/Technical Trainer
MindIQ Corporation