Dealing with inheritance anomalies in Java

Colin Paul Gloster

Dear all,

In Java there are a few ways to cope with what is called the inheritance
anomaly problem (in the context of Matsuoka, S. and Yonezawa, A.,
"Analysis of inheritance anomaly in object-oriented concurrent programming
languages" in "Research Directions in Concurrent OO Programming" MIT
Press, 1993), such as:

* use an object-based design instead of an object-orientated design (in
which case, why not use Ada 83 instead of Java?);

* use an object-orientated design without multithreading (in which case,
why not use C++ instead of Java?);

* use Java variants (e.g. Jeeg; JR; RAVENSCAR (Reliable Ada Verifiable
Executive Needed for Scheduling Critical Applications in
Real-time) Java);

* aspect orientated programming is the main defense mentioned in Giuseppe
Milicia and Vladimiro Sassone, "The inheritance anomaly: ten years
after", "Proceedings of the 2004 ACM symposium on Applied computing";

* be sensible in your class design / be prepared to refactor (so much for
reuse without copying and pasting!);

* don't give it much thought and hope for the best.

Are there any other approaches which are applicable to Java?

Which approach(es) should be used, and why? Which approaches are actually
idiomatic?

Regards,
Colin Paul Gloster


P.Hill

Colin said:
> Dear all,
>
> In Java there are a few ways to cope with what is called the inheritance
> anomaly problem (in the context of Matsuoka, S. and Yonezawa, A.,
> "Analysis of inheritance anomaly in object-oriented concurrent programming
> languages" in "Research Directions in Concurrent OO Programming" MIT
> Press, 1993), such as:
>
> * use an object-based design instead of an object-orientated design (in
> which case, why not use Ada 83 instead of Java?);

The choice of language is often based on economic issues and available
resources; thus in an academic setting you might choose to actually use
Ada, while the rest of the world could not find enough Ada programmers on
their staff, or available within a project's constraints of time or money,
to actually consider using it.

> * use an object-orientated design without multithreading (in which case,
> why not use C++ instead of Java?);

Readability, portability, an existing code base, ... There are all kinds of
issues for why to use Java over C++. What fantasy world do you live in?

> * use Java variants (e.g. Jeeg; JR; RAVENSCAR (Reliable Ada Verifiable
> Executive Needed for Scheduling Critical Applications in
> Real-time) Java);

See comments regarding Ada above, but at least if these are runtime
compatible, you reduce the problem to needing fewer resources in the
lesser-known language.

> * aspect orientated programming is the main defense mentioned in Giuseppe
> Milicia and Vladimiro Sassone, "The inheritance anomaly: ten years
> after", "Proceedings of the 2004 ACM symposium on Applied computing";

Until something else comes along, this seems like a good choice.

> * be sensible in your class design / be prepared to refactor (so much for
> reuse without copying and pasting!);

What does copying and pasting re-use have to do with refactoring? They are
opposites.

> Which approach(es) should be used, and why? Which approaches are actually
> idiomatic?

As in all COOL (Concurrent OO) Languages, the anomaly exists, so deal with
it.


In general, the parenthetical comments on the bullet list above seem naive
and not often worth considering in real-world projects.

-Paul
 
Stefan Schulz

On 23 Sep 2004 10:34:27 -0700, Colin Paul Gloster wrote:

XPost forcibly cut down to size.
> In Java there are a few ways to cope with what is called the inheritance
> anomaly problem (in the context of Matsuoka, S. and Yonezawa, A.,
> "Analysis of inheritance anomaly in object-oriented concurrent
> programming languages" in "Research Directions in Concurrent OO
> Programming" MIT Press, 1993), such as:

Uhm... forgive my ignorance, but what exactly is the "Inheritance
Anomaly"? I do not have the possibility to check an 11-year-old
publication at the moment, so a three-sentence explanation would have
worked wonders on my understanding. It might even have given me the
ability to answer your question.

See you
Stefan
 
Paul Colin Gloster

There is disagreement as to what its true cause is (and hence what would be
an appropriate name), but a three-sentence mainstream description (perhaps
not an explanation) follows:

"The combination of the object-oriented paradigm with mechanisms for
concurrent programming may give rise to the so-called inheritance anomaly
(Matsuoka and Yonezawa, 1993). An inheritance anomaly exists if the
synchronization between operations of a class is not local but may depend
on the whole set of operations present for the class. When a subclass adds
new operations, it may become necessary to change the synchronization
defined in the parent class to account for these new operations (Matsuoka
and Yonezawa, 1993)."

Those three sentences are from page 268 of Alan Burns and Andy Wellings,
"Real-time systems and programming languages: Ada 95, real-time Java, and
real-time POSIX", 3rd edition, 2001, Pearson Education.
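To make those three sentences concrete, here is a minimal sketch in plain
Java (the class names and the getAfterPut constraint are my own invention
for illustration; this is not code from any of the cited texts). The
subclass adds one operation whose guard depends on history, and to keep
that guard correct it is forced to override and re-synchronize the very
operations it inherits:

```java
// A one-slot buffer; its synchronization is entirely local to the class.
class Buffer {
    protected Object item; // null means empty

    public synchronized void put(Object o) throws InterruptedException {
        while (item != null) wait(); // block while full
        item = o;
        notifyAll();
    }

    public synchronized Object get() throws InterruptedException {
        while (item == null) wait(); // block while empty
        Object o = item;
        item = null;
        notifyAll();
        return o;
    }
}

// HistoryBuffer adds getAfterPut(), which may only proceed when the most
// recently completed operation was a put. Because the new guard depends
// on history, put() and get() must both be overridden just to maintain
// the lastWasPut flag: the parent's synchronization cannot be reused
// as-is, which is the anomaly the quoted description names.
class HistoryBuffer extends Buffer {
    private boolean lastWasPut = false;

    @Override
    public synchronized void put(Object o) throws InterruptedException {
        super.put(o); // same monitor, so this is a reentrant acquisition
        lastWasPut = true;
    }

    @Override
    public synchronized Object get() throws InterruptedException {
        Object o = super.get();
        lastWasPut = false;
        return o;
    }

    public synchronized Object getAfterPut() throws InterruptedException {
        while (!lastWasPut) wait();
        return get(); // clears lastWasPut via the override
    }
}
```

In single-threaded use, put(x) followed by getAfterPut() simply returns x;
the point is only that the new constraint could not be added without
touching the inherited operations.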
 
Paul Colin Gloster

P.Hill wrote:
"[..] There are all kinds of issues
for why to use Java over C++."

Indeed there are.

"What fantasy world do you live in?"

Real Time Systems in Space. Unfortunately C++ has crept in too far.

"[..]

Until something else comes along, this [aspect orientated programming]
seems like a good choice."

Okay.

"> * be sensible in your class design / be prepared to refactor (so much for
> reuse without copying and pasting!);

What does copying and pasting re-use have to do with refactoring? They are
opposites."

Okay, I had not said that properly. If you need to change a superclass to
accommodate a new subclass, you obviously will not be able to keep the
superclass as is, so you would change (i.e. not reuse) the superclass.
Before someone who has never heard of inheritance anomalies reaches this
point of the class hierarchy's evolution, a naive specialization in the
subclass might be attempted, which might soon be appreciated to be buggy;
various copy-and-paste hacks may or may not be tried later, before finding
out that this is an already discovered and documented feature of
mainstream COOLs.
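For instance (a sketch of my own, assuming a conventional wait/notify
bounded buffer rather than any class from this thread), the naive
specialization might look like this:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// A conventional bounded buffer guarded with wait/notify.
class BoundedBuffer {
    private final Deque<Object> items = new ArrayDeque<>();
    private final int capacity;

    BoundedBuffer(int capacity) { this.capacity = capacity; }

    public synchronized void put(Object o) throws InterruptedException {
        while (items.size() == capacity) wait();
        items.addLast(o);
        notifyAll();
    }

    public synchronized Object get() throws InterruptedException {
        while (items.isEmpty()) wait();
        Object o = items.removeFirst();
        notifyAll();
        return o;
    }
}

// The naive specialization: build get2() out of the inherited get().
// It compiles and works single-threaded, but it is not atomic: another
// consumer can take an item between the two inherited get() calls, so
// the returned pair need not be adjacent, and the caller can block
// mid-pair while already holding one item. Making get2() genuinely
// atomic ends up requiring changes to the parent's synchronization --
// the anomaly in miniature.
class GetTwoBuffer extends BoundedBuffer {
    GetTwoBuffer(int capacity) { super(capacity); }

    public Object[] get2() throws InterruptedException {
        return new Object[] { get(), get() };
    }
}
```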

"> Which approach(es) should be used, and why? Which approaches are actually
> idiomatic?

As in all COOL (Concurrent OO) Languages, the anomaly exists, so deal with
it."

Some OO-Ph.D. candidates may disagree that all COOLs must intrinsically
have the inheritance anomaly.


"In general, the parenthetical comments on the bullet list above seem naive
and not often worth considering in real-world projects."

I would rather discount something as naive than not think of the
possibilities at all, though I would not want to go through every
(non-)option every time.
 
P.Hill

Paul Colin Gloster said:
"> As in all COOL (Concurrent OO) Languages, the anomaly exists, so deal
> with it.

Some OO-Ph.D. candidates may disagree that all COOLs must intrinsically
have the inheritance anomaly."

If you are going to get all pie-in-the-sky and ivory-tower on us and worry
about academic curiosities, I suggest you get the tense of the sentences
right. I said _exists_ -- present tense. OO-Ph.D. candidates having an
opinion about whether "COOLs must intrinsically have inheritance anomaly",
a statement about the future, is not in any way in agreement or
disagreement with you or me _currently_ having the limitation.

In case you also can't read usenet newsgroup headers and don't know how to
organize your discussion, this is a newsgroup about Java, an existing
language, not a free-for-all place to contemplate your navel or the future
of programming languages.

cheers,
-Paul
 
