lies about OOP

beliavsky

A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html

Les Hatton, "Does OO sync with the way we think?", IEEE Software, 15(3),
pp. 46-54:
"This paper argues from real data that OO based systems written in C++
appear to increase the cost of fixing defects significantly when
compared with systems written in either C or Pascal. It goes on to
suggest that at least some aspects of OO, for example inheritance, do
not fit well with the way we make mistakes."

His comments under "invited feedback" are amusing and confirm my
impression that OOP is partly (but perhaps not entirely) hype:

"I should not that this paper because it criticised OO had an unusually
turbulent review period. 2 reviewers said they would cut their throats
if it was published and 3 said the opposite. The paper was only
published if the OO community could publish a rebuttal. I found this
very amusing as my paper contains significant data. The rebuttal had
none. This sort of thing is normal in software engineering which mostly
operates in a measurement-free zone."

What papers have scientific evidence for OOP?

Paul Graham's skeptical comments on OOP are at
http://www.paulgraham.com/noop.html .

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?
 
Paul McGuire

Steve Holden said:
Well, that's not true either, and the fact that Bill Gates was
denigrating it implies that he at least knew about it, even if he chose
not to adopt it (then: of course nowadays Microsoft call almost all
their technologies "object oriented"; sometimes this description is as
accurate as when Gates speaks about "our open Windows environment").

We could all make our own choices, but anyone who's been programming
*seriously* since the 60s will likely remember Simula as the birth of
many of the ideas later picked up by Alan Kay and promoted by the Xerox
PARC Smalltalk group.

I visited that group in 1981 (after Kay left, unfortunately; by then it
was headed by Adele Goldberg, who is now, coincidentally, promoting the
delights of Python at conferences like OSCON), and object-oriented
programming was certainly something that was being taken pretty
seriously in the academic world as a potential solution to some serious
software engineering problems.

The fact that it took the technology a relatively long time to appear
"in the wild", so to speak, is simply the natural maturation of any new
technology. Given that UNIX was developed in the early 1970s I'd say it
took UNIX 20 years to start becoming mainstream. But a lot of people
knew about it before it *became* mainstream, especially those who had to
place their technology bets early. The same is true of object-oriented
concepts.

I guess this is just to say that I'd dispute your contention that
Smalltalk was a curiosity - unless you define anything of interest
mostly to the academic world as a curiosity, in which case there's no
way to overcome your objection. It was the first major implementation of
an entire system based exclusively on OO programming concepts and, while
far from ideal, was a seminal precursor to today's object-oriented systems.

regards
Steve

Good points all. And yes, I recall the BYTE article on Smalltalk. I guess
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP
was widely known". He didn't say "OOP all began when..." or "OOP was widely
known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree. This was
the year of the first OOPSLA conference, but as PyCon people know, just
having a conference doesn't guarantee that a technology is widely and
joyfully accepted. Just as my commercial-centric view may understate
academic interest in some topics, an academic-centric view may overestimate
the impact of topics that are ripe for research, or technically "cool," but
little understood or adopted outside of a university setting.

I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

(And I apologize for characterizing Smalltalk as a "curiosity." I admit my
bias is for software that is widely commercially deployed, and even the most
ardent Smalltalkers will have difficulty citing more than a handful of
applications, compared to C, C++, VB, COBOL, Delphi, etc. I personally have
seen Smalltalk-based factory control and automation systems, but they are
rapidly self-marginalizing, and new customers are extremely reluctant to
enfold Smalltalk into an already patchwork mix of technologies, as is
typically found in factory settings.)

-- Paul
 
Steve Holden

Paul said:
[some stuff]


Good points all. And yes, I recall the BYTE article on Smalltalk. I guess
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP
was widely known". He didn't say "OOP all began when..." or "OOP was widely
known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree. This was
the year of the first OOPSLA conference, but as PyCon people know, just
having a conference doesn't guarantee that a technology is widely and
joyfully accepted. Just as my commercial-centric view may understate
academic interest in some topics, an academic-centric view may overestimate
the impact of topics that are ripe for research, or technically "cool," but
little understood or adopted outside of a university setting.

I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

(And I apologize for characterizing Smalltalk as a "curiosity." I admit my
bias is for software that is widely commercially deployed, and even the most
ardent Smalltalkers will have difficulty citing more than a handful of
applications, compared to C, C++, VB, COBOL, Delphi, etc. I personally have
seen Smalltalk-based factory control and automation systems, but they are
rapidly self-marginalizing, and new customers are extremely reluctant to
enfold Smalltalk into an already patchwork mix of technologies, as is
typically found in factory settings.)

Nothing to disagree with here.

regards
Steve
 
 
Jarek Zgoda

projecktzero said:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

There's no magic in OOP. It's just more natural to the way humans think,
so you should point your friend to the works of Aristotle or Sancti
Thomae Aquinatis, especially their writings on "natural sciences" and
the theory of species.
 
Tomas Christiansen

projecktzero said:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP ... He thinks that OOP has more overhead and is slower
than programs written the procedural way.

He may be right, but consider the alternatives.

Think of an integer. An integer is an object!

You can assign a new integer-value to the object.
You can read the integer-value of the object.
(The integer can be part of complex expressions.)

Usually you are unaware of (or don't care) _how_ the object is
implemented. Whether the bits are red, green, turned upside-down or
inverted - you don't really care, as long as it can hold the values that
you want it to hold and can be used in the relevant contexts (addition,
multiplication, ...).

Some languages give you the choice of many integer implementations, some
languages give you only a few choices, and some languages give you only
one choice.

Surely we can agree that the presence of an integer object is extremely
useful! If you had to do all the integer handling in machine code _every_
time, you would soon be very, very tired of working with integers. There
is no doubt that objects are (or can be) extremely useful, time-saving
and very efficient. Chances are that your own machine-code integer
implementation is not nearly as good as the one made by a team of
top-tuned programmers (no offense) programming the integer
implementation "object".

Whether the language should give you the choice of one, some, many or
extremely many integer implementations depends entirely on your needs
("what a pervert - he needs an integer!"). Lowering the number of
implementation choices raises the chances of having to choose a "not
very good" implementation. Letting the language automatically choose the
right one frees your mind for other things, but at the risk of some kind
of run-time overhead.
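
Python itself is a handy concrete illustration of that last trade-off:
the language picks the integer implementation for you, switching to
arbitrary precision behind the scenes, at the cost of some run-time
overhead. A minimal sketch:

    # The language chooses the integer implementation; the program
    # only ever sees "an integer".
    n = 2 ** 62        # fits in a machine word on most hardware
    n = n * 1000       # silently outgrows 64 bits: no overflow, no choice to make
    print(n)           # 4611686018427387904000
    print(type(n))     # <class 'int'> - still just an integer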
 
Terry Reedy

I did not really 'get' OOP until after learning Python. The relatively
simple but powerful user class model made more sense to me than C++. So
introducing someone to Python, where OOP is a choice, not a mandate, is how
*I* would introduce a procedural programmer to the subject. YMMV.

Terry J. Reedy
 
Roy Smith

Terry Reedy said:
I did not really 'get' OOP until after learning Python. The
relatively simple but powerful user class model made more sense to
me than C++. So introducing someone to Python, where OOP is a
choice, not a mandate, is how *I* would introduce a procedural
programmer to the subject. YMMV.

OOP is a choice in C++ too. You can write procedural C++ code; no
need to use classes at all if you don't want to. Something like Java
is a different story. Java *forces* you to use classes. Nothing
exists in Java that's not part of some class.

I think the real reason Python is a better teaching language for
teaching OO concepts is because it just gives you the real core of OO:
inheritance, encapsulation, and association of functions with the data
they act on.
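
A minimal Python sketch of that core - the class and method names below
are invented purely for illustration:

    # Data bundled with the functions that act on it, plus inheritance.
    class Shape:
        def __init__(self, name):
            self.name = name              # encapsulated state

        def describe(self):
            return "%s with area %.1f" % (self.name, self.area())

    class Square(Shape):                  # inheritance
        def __init__(self, side):
            Shape.__init__(self, "square")
            self.side = side

        def area(self):                   # behaviour tied to its data
            return self.side * self.side

    print(Square(3).describe())           # -> square with area 9.0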

C++ has so much stuff laid on top of that (type bondage, access
control, function polymorphism, templates) that it's hard to see the
forest for the trees. You get C++ jocks who are convinced that that
stuff is part and parcel of OO, and if it doesn't have (for example),
private data, it can't be OO.
 
Martijn Faassen

Paul McGuire wrote:
[snip]
I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

IMMEDIATE NOTICE TO ALL PYTHON SECRET UNDERGROUND MEMBERS.

Classified. Any disclosure to non-PSU members prohibited. Offenders will
be apprehended and removed from the time stream, permanently.

Words in human languages typically consist of a combination of vowels
and consonants, at least up until the start of the posthumanist
revolution in 3714, when the Morning Light Confederation's ships reached
the ablethik-seganichek world of Kaupang again (on Hellenberg consensus
time streams with catalog marker AB-7). Alphabetic scripts are a typical
way to represent them. Even in the posthuman phase on Kaupang they were
widely appreciated as a quaint representation.

The language English, an indo-european tongue of the west-germanic
persuasion (expressiveness rating 7, comprehensiveness rating 12, fits
in the moderate Y group of the Lespan pan-species language
classification system), is widely in use throughout a surprisingly long
period on many time streams. This language does not have overly long
consonant combinations.

The language Dutch, though closely related to the language English, has
a slightly different sound-to-glyph mapping system. Dutch is, of course,
the true language of the Python Secret Underground and the official
native language of Python users. In the language Dutch, a certain vowel
sound is expressed as a combination of the glyphs 'i' and 'j'. The glyph
'j' however is exclusively used for consonants in the English language,
unlike in Dutch, where 'j' serves a dual role.

Human brains used to the English language cannot cope with glyph
representations that express consonants in too long a sequence, without
any space left for vowels. A combination like 'jkstr' in the English
language is inevitably considered to be a spelling error, and corrective
procedures automatically attempt to correct the spelling of such a word
to a more acceptable combination.

This happens frequently to the name 'Dijkstra', a name that originated
in the Dutch natural language. The English eye cannot accept such a
ridiculous combination of consonants (j k s t *and* r?), and desperately
tries to resolve the situation. As a result, the glyphs 'i' and 'j'
are frequently reversed.

This is extremely unfortunate, as Djikstra is well known to be a primary
moniker for the leader of the Insulationist faction within the Gheban
coalition. The Insulationist faction is, of course, a prominent member of
the alliance that produced the Alien Whitespace Eating Nanovirus.
Djikstra is therefore an enemy of the Python programming language. All
that we stand for. All our hopes. All our dreams will come to naught if
Djikstra gets his way.

The moniker Djikstra is to be avoided in public utterances. PSU members
can give themselves away and draw unwanted attention from the
Insulationist overlord at this critical junction. What's worse,
innocents might be caught up in this cosmic web of intrigue. While most
innocents can of course be safely ignored, any innocent of temporal
tension rating 17 and above (revised scale) should not be exposed to
undue danger, as they may be essential for our time stream manipulations.

It is therefore important to avoid the utterance of Djikstra's name at
all costs!

ADDENDUM FOR PSU MEMBERS OF CLASSES NE-100 AND HIGHER

The relation between Djikstra and Dijkstra's name is of course not a
coincidence. As was already evidenced in the famous "Considered Harmful"
article, the great philosopher Dijkstra was on to a monumental cosmic
secret: that reality is bound by a term rewriti
 
Martijn Faassen

A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html
[snip description of paper that compares C++ versus Pascal or C]
What papers have scientific evidence for OOP?

That's of course a good question. I'm sure also that comp.object has
argued about this a thousand times. I'll just note that one paper is
just a single data point with specific circumstances. The OO languages
under investigation might have caused higher or lower failure rates
for reasons other than their (lack of) object-orientedness, for
instance. It is of course possible to come up with a lot of other
explanations for a single data point besides a conclusion that OOP can
lead to more buggy software. It is, for instance, certainly not
surprising to me that C++ can lead to more buggy software than some
other languages. :)

[snip]
If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?

Because C++ is not an ideal object oriented language? Because the Linux
kernel has very stringent predictability requirements for the machine
code that gets generated, which C meets and which are much harder to
satisfy with C++? There are other reasons to choose C as well, such as
portability, ubiquity and performance.

Some of the same reasons probably apply to Perl and Python, though to a
lesser degree. I do not know a lot about Perl's implementation. I do
know that Guido van Rossum has in fact considered rewriting Python in
C++ in the past. And right now, there are various projects that are
using object oriented languages to reimplement Python, including Python
itself.

Finally, it is certainly possible to program in object oriented style in
C. It is more cumbersome than in a language that supports it natively,
but it is certainly possible. Such OO in C patterns occur throughout the
Linux kernel, which needs a pluggability architecture for its various
types of drivers. It can also be seen in many aspects of Python's
implementation. Another example of a C-based system that uses object
oriented technologies is the GTK+ widget set.
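
The core of that OO-in-C pattern is a struct carrying data alongside
pointers to the functions that operate on it. A rough Python rendering of
the same shape (the driver names here are invented, not taken from any
real kernel code):

    # A record of data plus "function pointers", dispatched by hand,
    # the way C code fakes objects with structs of function pointers.
    def make_char_driver(name):
        driver = {'name': name, 'buffer': []}

        def write(data):
            driver['buffer'].append(data)

        def flush():
            out = ''.join(driver['buffer'])
            driver['buffer'] = []
            return out

        driver['write'] = write   # the hand-built method table
        driver['flush'] = flush
        return driver

    dev = make_char_driver('console')
    dev['write']('hello ')
    dev['write']('world')
    print(dev['flush']())         # -> hello world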

Anyway, this question is using a few data points to make an overly
generic argument, and the data points themselves do not really support
the argument so very well either.

Regards,

Martijn
 
Paul McGuire

Roy Smith said:
I think the real reason Python is a better teaching language for
teaching OO concepts is because it just gives you the real core of OO:
inheritance, encapsulation, and association of functions with the data
they act on.

C++ has so much stuff laid on top of that (type bondage, access
control, function polymorphism, templates) that it's hard to see the
forest for the trees. You get C++ jocks who are convinced that that
stuff is part and parcel of OO, and if it doesn't have (for example),
private data, it can't be OO.

+1, QOTW!! :) (esp. "type bondage"!)
 
Paul McGuire

Martijn Faassen said:
Paul McGuire wrote:
[snip]
I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

IMMEDIATE NOTICE TO ALL PYTHON SECRET UNDERGROUND MEMBERS.

Classified. Any disclosure to non-PSU members prohibited. Offenders will
be apprehended and removed from the time stream, permanently.

<snip - it's "Dijkstra" not "Djikstra", you dolt! :)>

Yikes! (or better, "Jikes!" or even "Yijkes!"?) - my bad.
And he was on faculty at UT right here in Austin, too.

Red-faced-ly yours -
-- Paul
 
Mike Meyer

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?

Because C++ combines the worst features of C and OO programming. It
also makes some defaults go the wrong way, and forces decisions onto
the programmer that are best left up to the compiler, as the
programmer is liable to get them wrong.

C, on the other hand, is a very nice portable assembler language. I've
seen cases where a good optimizing compiler wrote faster code than a
bright human writing assembler (though it was less readable). C is
enough like assembler that some HLLs generate C instead of assembler,
thus making them portable. I've seen those generate C code as clean as
a human being might write, given the right options.

<mike
 
Jive

Paul McGuire said:
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP
was widely known".

I (Jive Dadson) said that. I guess I figured that if I knew about it, it
was widely known. But in retrospect, I had an information edge. I was in
Silicon Valley, working on the Next Big Thing, and I was wired into USENET.
My earliest dejagoogle hit is from '86. (It's not under "Jive Dadson", a
more recent nom du net.)

He didn't say "OOP all began when..." or "OOP was widely
known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree.

Well, it was widely known by everyone who read the mottos I stuck up on
my cubicle walls. :)

Jive
 
Daniel T.

A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html

Sure, OOP *can* lead to more buggy software; that doesn't mean it always
does.

Les Hatton, "Does OO sync with the way we think?", IEEE Software, 15(3),
pp. 46-54:
"This paper argues from real data that OO based systems written in C++
appear to increase the cost of fixing defects significantly when
compared with systems written in either C or Pascal. It goes on to
suggest that at least some aspects of OO, for example inheritance, do
not fit well with the way we make mistakes."

So, he has data that shows that C++ *appears* to increase the cost of
fixing defects, then *suggests* that it's because C++ is an OO language?
Sounds like he is ignoring his own data to me...

Mr. Hatton suffers from the same problem that many OO critics suffer. He
thinks that the language choice decides whether the program written is
an OO program. I've seen plenty of very non-OO systems written in OO
languages, I've seen expert OO systems written in non-OO languages. OOP
isn't a language choice, it is a style of problem solving.

I'm happy to accept that it could take longer to fix bugs in programs
written in C++ when compared to either C or Pascal; the language itself
is quite a bit more complicated than either of the other two.

You know, it tends to take longer to repair a 2004 Mustang than it does
a 1964 Mustang. Does that mean the newer car is not as good?

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?

All three of the systems in question were begun before C++ was
standardized. Python has also been implemented in Java; does that mean
OO in languages other than C++ is fine? Of course not. The fact that the
three projects in question were implemented in C is not an indictment of
OO in any way.
 
Mike Meyer

Instead of copy and paste, I use functions for code reuse. I didn't see
the light of OOP yet. I use Python but never did anything with OOP. I
just can't see what can be done with OOP that can't be done with
standard procedural programming.

There are cases where functions just don't do the job.

I at one time needed to use the little-used (and probably little-known)
"account" feature of an ftp server to automate a regular file
transfer. Both Perl and Python come with ftp modules. Perl's was (is?)
procedural, Python's is OO. Neither supported the account feature.

Now, if I used perl to do this, I'd have to either modify the module
in place, meaning I'd have to remember to re-apply the mods every time
we updated perl if I couldn't get them to accept the patch, or I could
make a local copy of the module, meaning it wouldn't get any bug fixes
that might come with new versions of perl.

With the Python version, I created a subclass of the FTP connection
class, rewrote just the login method, and installed that locally. Now
I don't have to worry about installing new versions of Python, as my
code is outside the distribution. But I still get the benefit of any
bug fixes that show up outside the login method. I also submitted the
new login method, and it's now part of the standard module.
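
In outline, the override looked something like the sketch below. This is
not Mike's actual patch - and today's ftplib.FTP.login already accepts an
acct argument, presumably thanks to exactly this kind of submission - but
it shows the subclass-and-override move:

    import ftplib

    class AcctFTP(ftplib.FTP):
        """FTP connection that also performs the ACCT step after login."""

        def login(self, user='', passwd='', acct=''):
            # Reuse the standard login handshake unchanged...
            resp = ftplib.FTP.login(self, user, passwd)
            # ...then layer the rarely used ACCT command on top.
            if acct:
                resp = self.sendcmd('ACCT ' + acct)
            return resp

    # Hypothetical usage:
    # ftp = AcctFTP('ftp.example.com')
    # ftp.login('user', 'secret', 'billing-account')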

This kind of code reuse just isn't possible with procedural code.

<mike
 
Fredrik Lundh

Mike said:
Because C++ combines the worst features of C and OO programming. It
also makes some defaults go the wrong way, and forces decisions onto
the programmer that are best left up to the compiler, as the
programmer is liable to get them wrong.

that's a nice rant about C++, but it's not the right answer to the question. the
Python core developers are perfectly capable of writing working C++ code,
and both the core and most extensions would benefit from C++ features (just
look at Boost and other C++ layers).

but C++ wasn't a serious contender back when CPython development was
started, and nobody's going to convert the existing codebase...

</F>
 
Gregor Horvath

Daniel said:
Sure, OOP *can* lead to more buggy software; that doesn't mean it always
does.

I think that the cost (= time) to develop and maintain software depends
not on whether it is based on OOP or not, but on two factors:

* Number of NEW code lines to solve the given problem
* Complexity of this new code

The answer to the question of whether OOP is better is: it depends.

If the given problem is solved with less code and complexity in OOP, then
it is the better approach; if not, the reverse is true.

That's why I like Python: it does not force you to use OOP or
procedural programming.

But isn't the main argument for coding in Python (or other high-level
languages) ease of use and compact code?

Python code should therefore be less buggy and cheaper to develop and
maintain. Are there any papers on that?
 
