Why Generics?

  • Thread starter David Blickstein

Andrew Thompson

And IMHO it is a "waste of time" if we're guarding shampoo plants INSTEAD of
guarding borders.

Since you are going off on a wider tangent, and mentioned it:

Who will guard the guards?

Instead, why don't you pay the peasants that make the
shampoo a little more, and spend a little more on
disposing of the toxic waste products of the shampoo
manufacturing process, ..so they don't want to kill you
in the *first* *place*?
 

Thomas G. Marshall

Andrew Thompson coughed up:
Since you are going off on a wider tangent, and mentioned it:

Who will guard the guards?

Instead, why don't you pay the peasants that make the
shampoo a little more, and spend a little more on
disposing of the toxic waste products of the shampoo
manufacturing process, ..so they don't want to kill you
in the *first* *place*?


Please let's not go down there. I'm fairly sure that DB wasn't making a
political statement, so don't start us off in that direction. This issue is
far more complex than can sensibly be discussed here.


--
I've seen this a few times--Don't make this mistake:

Dwight: "This thing is wildly available."
Smedly: "Did you mean wildly, or /widely/ ?"
Dwight: "Both!", said while nodding emphatically.

Dwight is exposed as having made a grammatical
error and tries to cover it up by thinking
fast. This is so painfully obvious that he
only succeeds in looking worse.
 

Dale King

Thomas said:
David Blickstein coughed up:

Well, I agree that he has lost a lot of respect in my eyes.

I should have mentioned that he is not just some crackpot. He is the
author of "Thinking in Java" which is probably the most widely read
introduction to Java since he makes it available for free.

But that doesn't necessarily make him right. And in this case I think he
went off the deep end.

I'm simplifying his position a bit. What he really wants Java to have is
something he calls latent typing. Latent typing is where you can invoke
method foo() on an instance of the template's type parameter, and it works
if the type has a method foo(), but it does not require the type parameter
to be a subclass of any specific class.
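
To make the distinction concrete, here is a sketch (my own names, not Eckel's
actual example). In Java today the capability has to be declared through an
interface bound before foo() may be called on the type parameter; latent typing
would accept any type that merely happens to have a foo() method:

import java.util.List;

// Java today: T must be bounded by an interface that declares foo().
interface HasFoo {
    void foo();
}

class Caller<T extends HasFoo> {
    void callAll(List<T> items) {
        for (T item : items) {
            item.foo();            // legal only because of the "extends HasFoo" bound
        }
    }
}

// Latent typing (what Eckel asks for; NOT legal Java): any T with a foo()
// method would be accepted, with no interface requirement at all:
//
//   class Caller<T> {
//       void callAll(List<T> items) {
//           for (T item : items) item.foo();   // hypothetical, does not compile today
//       }
//   }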

It's nice to know that C# gets the same complaints:
http://dotnetguy.techieswithcats.com/archives/004273.shtml
 

Dimitri Maziuk

Dale King sez:
[Eckel on generics]
I'm simplifying his position a bit. What he really wants Java to have is
something he calls latent typing.

Apples and oranges. Sun's article on generics (in "new features"
in the 1.5 SDK docs) and JSR 14 state pretty clearly that the main goal
of generics was type-safe collections and eliminating run-time
casts.

Templates (latent typing) are about code reuse: you write, e.g., one
sort() method and it'll work on any argument type as long as that
type comes with a "[]" operator. That's what makes it "generic".

Compile-time type safety in collections does not make code "generic".
Sun really should've come up with a better name for it -- if anything,
List< MyType > is "less generic" than List< (*any*) Object >, so
"specifics" would be a better name.

Dima
 

David Blickstein

Please let's not go down there. I'm fairly sure that DB wasn't making a
political statement, so don't start us off in that direction. This issue is
far more complex than can sensibly be discussed here.

You can be totally sure I wasn't making a political statement.

I was likening the attempt to justify generics based on the remote
possibility that they might someday find a bug to a common debating ploy,
often used by politicians to shift focus onto a useless accomplishment
and away from non-accomplishment and lost resources.

The safety provided by generics seems just this side of useless from a
pragmatic standpoint (IMHO). I'm rapidly coming to believe that the joke
someone made about it being there because "C++ weenies wanted it" might be
more true than not.

And yes, with regard to (stated) purpose, Java generics are apples compared
with C++ templates' oranges. As far as I can tell C++ templates are there
for two reasons:

1) Bjarne Stroustrup was throwing things in without much consideration.
(People say PL/I is the "kitchen sink language", but C++ is far more
deserving of that title.)

2) Templates are there to work around a mistake Bjarne made that Java
doesn't have: C++ lacks a type that can be used for container classes
(Object serves that role in Java). Bjarne's mistake was not forcing a
"root" class in the class hierarchy.
 

John English

David said:
The safety provided by generics seems just this side of useless from a
pragmatic standpoint (IMHO). I'm rapidly coming to believe that the joke
someone made about it being there because "C++ weenies wanted it" might be
more true than not.

Hmm. Generics help you catch some problems at compile time rather than
at run time:

ArrayList foo = new ArrayList();
foo.add("a string");
Date d = (Date)foo.get(0); // ClassCastException at runtime

as opposed to:

ArrayList<Date> foo = new ArrayList<Date>();
foo.add("a string"); // compile time error

Anything which saves me debugging time trying to find the source
of runtime errors is well worth it, IMHO, and since generics are
elided from the bytecode (the example above generates the same
bytecodes but you *know* the data is being handled correctly)
there is no runtime penalty.
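
Roughly speaking (a sketch of the idea, not the exact bytecode), the generic
version erases to the raw-type code plus a compiler-inserted cast, which is
why the class files come out the same:

Date someDate = new Date();
ArrayList foo = new ArrayList();             // a raw ArrayList at runtime
foo.add(someDate);                           // argument type already checked at compile time
Date d = (Date) foo.get(0);                  // the cast is inserted by the compiler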

Just my $0.02 :)

-----------------------------------------------------------------
John English | mailto:[email protected]
Senior Lecturer | http://www.it.bton.ac.uk/staff/je
Dept. of Computing | ** NON-PROFIT CD FOR CS STUDENTS **
University of Brighton | -- see http://burks.bton.ac.uk
-----------------------------------------------------------------
 

Thomas G. Marshall

John English coughed up:
Hmm. Generics help you catch some problems at compile time rather than
at run time:

ArrayList foo = new ArrayList();
foo.add("a string");
Date d = (Date)foo.get(0); // ClassCastException at runtime

as opposed to:

ArrayList<Date> foo = new ArrayList<Date>();
foo.add("a string"); // compile time error

Anything which saves me debugging time trying to find the source
of runtime errors is well worth it,


1. You *don't* know that.
2. You *cannot* know that.
3. It depends upon what the impact on the bottom line is.

Spelled out: It's entirely possible that the extra time you spend
implementing generics is GREATER than the amount of time you would spend at
runtime detecting the errors if you didn't use them.

We all have got to *stop* oversimplifying the variables involved in this.
 

David Blickstein

Hmm. Generics help you catch some problems at compile time rather than
at run time:

John, you've probably missed some of the discussion.

We're aware of that stated purpose but a lot of us here have never ever
encountered that bug and/or feel it's an exceedingly unlikely/rare bug.
Thus, we don't feel that "justifies" the inclusion of generics, hence the
alternative theory ("C++ people wanted it").
 

Dimitri Maziuk

David Blickstein sez:
John, you've probably missed some of the discussion.

We're aware of that stated purpose but a lot of us here have never ever
encountered that bug and/or feel it's an exceedingly unlikely/rare bug.
Thus, we don't feel that "justifies" the inclusion of generics, hence the
alternative theory ("C++ people wanted it").

A lot of us have never ever seen

an ancient Greek,
an atom,
a black hole,
a dinosaur,
Earth from orbit,
God (any of the three of Him),
<add more here/>

Therefore, we feel that these things are exceedingly unlikely to exist,
or exceedingly rare, at best. Thus, we must inevitably conclude that
they are included in our culture only because C++ people wanted it.

Dima (good sigmonster)
 

David Blickstein

A lot of us have never ever seen
an ancient Greek,
an atom,
a black hole,
a dinosaur,
Earth from orbit,
God (any of the three of Him),
<add more here/>

Therefore, we feel that these things are exceedingly unlikely to exist,
or exceedingly rare, at best. Thus, we must inevitably conclude that
they are included in our culture only because C++ people wanted it.

Ah, one of those "trust me this follows from your logic" arguments. :)

While I might agree that "I haven't seen it happen" isn't sufficient to
invalidate generics, I would hope you agree that "it might happen" isn't
sufficient to validate them.

I've not yet heard ANYONE say they've encountered this bug and I'd need to
hear way more than one person say it in order to think it happens with
enough regularity to validate its addition to the language. Especially
given its complexity.
 

Tor Iver Wilhelmsen

David Blickstein said:
While I might agree that "I haven't seen it" happen isn't sufficient to
invalidate generics, I would hope you agree that "it might happen" isn't
sufficient to validate it.

Yes, it is. It's the same as with testing for null values - you can
say a variable is "unlikely" to hold a null, so you don't test for it.
You can divide ints, not bothering to check the "unlikely" case where you
divide by 0. Et cetera.

The issue is not how likely a bug is, it's how devastating that bug
will be at runtime, when your precious little mission critical system
crashes.

Generics improve documentability, usability and type safety. What is
wrong with new features? Generics are optional - if you don't want
them, don't use them, just like some people stuck to AWT after Swing
was introduced.
 

John English

David said:
We're aware of that stated purpose but a lot of us here have never ever
encountered that bug and/or feel it's an exceedingly unlikely/rare bug.
Thus, we don't feel that "justifies" the inclusion of generics, hence the
alternative theory ("C++ people wanted it").

I've spent too much time in the past dealing with safety-critical situations
to have much faith in the unlikelihood or rarity of errors. If they *can*
occur, it's bad news.

Point is, you may have an error (a coding bug) which under some (but not all)
circumstances produces a fault (the wrong type of data is stored in an
ArrayList) which much later (and in a completely different part of your
code, or on some remote system that you've sent the ArrayList to via RMI)
produces a failure (a ClassCastException is raised). You then have to trace
back from the failure to find the point at which the original fault occurred
(a long time ago in a galaxy far, far away) and reproduce the circumstances
which caused it (maybe a multithreading race condition or something equally
time-dependent) in order to identify the original error and fix it. This
might be a couple of years after your code has been deployed, of course,
because this is an exceedingly unlikely and rare bug.
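
In code, that scenario might look something like the sketch below (hypothetical
class and method names); the point is that the bad add() and the cast that blows
up live in completely different places:

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class FaultVsFailure {
    static List events = new ArrayList();      // raw type, so nothing polices what goes in

    static void recordEvent(Object when) {
        events.add(when);                       // the error: a String slips in here
    }

    static Date firstEvent() {
        return (Date) events.get(0);            // the failure: ClassCastException, far from the error
    }

    public static void main(String[] args) {
        recordEvent("2005-06-30");              // fault introduced silently
        // ... threads, RMI hops, or years of deployment in between ...
        System.out.println(firstEvent());       // blows up here at runtime
    }
}

With a List<Date> field and a recordEvent(Date when) signature, the bad call
would not compile at all.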

Me, I much prefer the compiler to save me the trouble. At *whatever* cost
in extra coding time. Having spent several weeks some years ago trying to
track down transient bugs in a networking system due to buggy C code with
subtle type errors due to data which was *sometimes* corrupted by a bad
pointer in an entirely different part of the system, I now have quite
strong (or strongly-typed?) views on how I like to spend my time.

-----------------------------------------------------------------
John English | mailto:[email protected]
Senior Lecturer | http://www.it.bton.ac.uk/staff/je
Dept. of Computing | ** NON-PROFIT CD FOR CS STUDENTS **
University of Brighton | -- see http://burks.bton.ac.uk
-----------------------------------------------------------------
 

Dimitri Maziuk

John English sez:
....
I've spent too much time in the past dealing with safety-critical situations
to have much faith in the unlikelihood or rarity of errors. If they *can*
occur, it's bad news.

<AOL/> I once didn't test for an obscure failure mode that
I didn't even realise existed. That was a credit card payment processing
system, so when our beancounters tried to balance the books at the end
of the month, they came up about $25k short.

I was smart enough to log all transaction details so with a few lines
of perl and much swearing I was able to back-charge the customers.
From the logs it looked like, in addition to genuine undercharging,
there were one or two customers who actually noticed the bug and
came back to exploit it. Repeatedly.

Moral: if a failure mode seems unlikely to you, that does not make
it any less devastating IRL.

Dima
 

Dale King

David said:
John, you've probably missed some of the discussion.

We're aware of that stated purpose but a lot of us here have never ever
encountered that bug and/or feel it's an exceedingly unlikely/rare bug.
Thus, we don't feel that "justifies" the inclusion of generics, hence the
alternative theory ("C++ people wanted it").

Or the slight variant viewpoint that I have been trying to share:

That bug is rare and not likely to ever make it into production, *but*
generics have other benefits, like better documenting intent and making
the code easier to use by eliminating casts, that justify adding them
to the language. My only complaint is that they weren't really added to
the language and instead we have erasure.
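
To illustrate the casting point, a minimal before-and-after sketch (not from
the original post, just an illustration):

import java.util.ArrayList;
import java.util.List;

// Pre-generics: the element type appears nowhere in the declaration,
// and every retrieval needs a cast.
List names = new ArrayList();
names.add("Ada");
String first = (String) names.get(0);        // cast on every read

// With generics: the declaration documents intent and the casts disappear.
List<String> typed = new ArrayList<String>();
typed.add("Ada");
for (String s : typed) {                     // no cast needed
    System.out.println(s.toUpperCase());
}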
 

Robert Maas, see http://tinyurl.com/uh3t

Andrew McDonagh said:
You created 15 classes all copied and pasted, then edited each one to
make their variable names and error messages reflect their real intent?

There were some other things different: Each class was used to retrieve
data from a different table in a relational database, each related to
other tables in different ways. Some tables were leaves, with only a
primary key and table-specific data. Other tables were parents with
foreign keys linking to records of other tables. In addition to the
main body of the code in each class, which simply named the table
differently, named the variables differently, and worded the error messages
differently, there were variations in the additional fields beyond
the primary key, the name of the primary key, and the name and linkage
of the foreign key, including instance fields that link lower-level records
into the parent record. So while I could possibly have factored out
some of the basic code in cases where the fields in the record matched
up exactly and only their names varied, that would have been a lot of
work, and I'd still have to deal with all the pieces that didn't line up,
such as extra fields for some records and foreign keys with corresponding
instance-variable links.

So maybe I told only a half-truth when I described only the parts I
copied and renamed and failed to mention the parts that had to be
written anew for many of the classes.

Also, this was just a homework assignment to see if we could design a
structure of database tables appropriate for this data model, write
a Java JDBC program with a different class for each table to
appropriately reference the database, and get the whole thing working,
our very first non-trivial JDBC program ever. If this had been a
serious program to be maintained for years with real users, I would
still have used copy&paste to get the first draft working, to see if
the users were satisfied with the behaviour, and then, if there was extra
funding, I would have refactored it to make the design cleaner.

But this was just one-time throwaway code for a class assignment, never to be
used again, so if I had spent time refactoring before turning it in I would
have been late and gotten only half credit, and if I had turned in that
first draft but then spent effort refactoring it on my own, all that
effort would have been wasted because (1) nobody would ever see the
refactored version, and I wouldn't get extra credit for that additional
work, (2) I would have spent time on that instead of on my next
homework assignment, thereby detracting from it, and (3) I already know
how to refactor and have done it many times, whereas I had never done a
serious JDBC program before, so the original copy&paste program
taught me useful stuff about various SQL commands I'd never used before,
whereas refactoring would have taught me nothing new.

So will you forgive me for writing throw-away copy&paste code in a time
crunch when nothing more was required or wanted?

Why not create one class which at runtime you give it the error
message etc?

Because I'd have to figure out a way to parameterize all the other
aspects of the various database tables that aren't uniform, i.e. invent
a whole new language for representing the relationships between
database tables, such that I could have a single db-relationship
compiler or emulator that somehow handled both the cases of uniformity
between tables and the cases of serious differences between tables, or else
have code that is half refactored and half not, which would have been
uglier than having it *all* uniformly copy&paste.

By the way, in Lisp it would have been easier to refactor and write an
emulator, because the syntax for data structures as literals exists
within the language. In Java, by comparison, I would have needed to write
the syntax for table relationships in XML and then learn how to parse
XML using DOM (because SAX would have been too painful for such an
application), and we weren't supposed to do any XML parsing yet at that
point in the course. As it turns out we *never* were required to do any
XML parsing in that class.

But a week and a half ago, during an idle day, I got the urge to teach
myself XML parsing, and in about six hours I taught myself enough to write
a SAX parser (in Java) for a simple XML file format I invented for
specifying updates to a database. My program maintained a stack of the
XML elements it was inside, used that to fully validate the XML structure,
and built a hash table for each update request; upon the end-of-element
event it converted the hash table into a SQL command to perform the update.
But I never got credit for that because it wasn't an assignment for the course.

The next day I spent a similar amount of time teaching myself JavaScript.
I was going to teach myself DOM next, but haven't gotten around to it yet.
I did browse that part of the XML-parsing tutorial, and it looks like the
actual parsing will be very easy, and since I don't plan on building a JTree
I can skip the rest of the tutorial. Instead I would use BeanShell
to interactively explore the DOM tree that has been built. But like I
said, there hasn't been another day where I felt like doing such a
thing, so far anyway.
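
For the curious, here is a rough reconstruction of the kind of SAX handler
described above; the element names and file format are my own invention, not
the actual format from the assignment:

import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical input:
// <updates><update table="..."><field name="...">value</field>...</update></updates>
public class UpdateHandler extends DefaultHandler {
    private final LinkedList<String> context = new LinkedList<String>(); // stack of enclosing elements
    private Map<String, String> fields;                                   // one hash table per <update>
    private String table;
    private String currentField;
    private final StringBuilder text = new StringBuilder();

    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        // Use the stack of enclosing elements to validate the nesting.
        if ("update".equals(qName) && !"updates".equals(context.peek())) {
            throw new SAXException("<update> must appear directly inside <updates>");
        }
        if ("field".equals(qName) && !"update".equals(context.peek())) {
            throw new SAXException("<field> must appear directly inside <update>");
        }
        if ("update".equals(qName)) {
            table = atts.getValue("table");
            fields = new HashMap<String, String>();
        } else if ("field".equals(qName)) {
            currentField = atts.getValue("name");
            text.setLength(0);
        }
        context.addFirst(qName);
    }

    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);
    }

    public void endElement(String uri, String local, String qName) {
        context.removeFirst();
        if ("field".equals(qName)) {
            fields.put(currentField, text.toString().trim());
        } else if ("update".equals(qName)) {
            // Convert the collected hash table into a SQL UPDATE.
            // (Sketch only: real code would use a PreparedStatement, not string pasting.)
            StringBuilder sql = new StringBuilder("UPDATE " + table + " SET ");
            for (Map.Entry<String, String> e : fields.entrySet()) {
                sql.append(e.getKey()).append(" = '").append(e.getValue()).append("', ");
            }
            System.out.println(sql.substring(0, sql.length() - 2));
        }
    }
}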
 
