lies about OOP


projecktzero

I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

Thanks.
 

James Stroud

It goes something like this (re-hashed a little):

"Every program of any complexity written in a procedural language will have a
[half-assed] implementation of object oriented design."

I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

Thanks.

--
James Stroud, Ph.D.
UCLA-DOE Institute for Genomics and Proteomics
611 Charles E. Young Dr. S.
MBI 205, UCLA 951570
Los Angeles CA 90095-1570
http://www.jamesstroud.com/
 

Tim Daneliuk

projecktzero said:
I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

Thanks.

https://www.tundraware.com/Technology/Bullet/
 

Alan Morgan

I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way.

In the world of computers, the statement "X is slower than Y" is true
for almost every value of X and Y under some circumstances.

IMHO, "loves perl" doesn't mesh with either "old school" or "cares
about overhead", but that's just me.

Alan
 

Peter Hansen

projecktzero said:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly.

The question in your first paragraph is largely answered (albeit
indirectly) by your second. You are doing web programming. It's
highly unlikely that you currently are near your limits in terms
of either "overhead" (I'll take that to mean memory usage) or
performance, and you are almost certainly limited by bandwidth.

In other words, you're I/O bound and not CPU or memory bound, so
any fuzzy concerns about the supposed sluggishness of OOP code
are seriously misplaced.

If I'm wrong, and your company has only just been surviving in
the market, solely by virtue of the incredibly quick and
lightweight code crafted by your wizardly but dated co-worker,
then I'll happily go to work disproving his ludicrous claim.

Until then, it's hardly worth the discussion... a clear case
of premature optimization, and in this case costing your
company huge benefits in lowered maintenance costs, higher
code quality, greater reuse, access to more up-to-date programmers
than your co-worker ;-) and so on.

-Peter
 

Jeremy Bowers

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

OO is a huge, ginormous, amazingly large, unspeakably awesome,
can't-believe-anyone-ever-lived-without-it win... but not necessarily OO
as it is presented in Software Engineering class due to the unusual nature
of web programming.

Tell him to check out HTML::Mason, and be sure to work with it long enough
to actually use some of its features. Once he's hooked (and it really is
an awesome framework; Amazon is supposed to use it and while I'm sure it
is highly customized I can definitely see it), explain to him that the
various components are really objects, complete with quite a lot of the
object features like inheritance, even if it doesn't look it.

If he resists this and declares Mason to be "crap", then with all due
respect you've got a blowhard who refuses to learn on your hands, and in a
perfect world he'd be stripped of code responsibility and moved somewhere
where he can't hurt anything. (He may merely not like it; I reserve the
strong statements in the previous sentence for the case where he actually
dismisses it with prejudice.) In the meantime, I've had great luck in Perl
environments programming in OO anyhow, as long as you have reasonably
independent responsibilities, and eventually the advantages do not go
unnoticed. Perl gets bashed on around here (in a good natured way) but
there are certainly worse languages; generally when I want to do something
the Right Way it provides a way to avoid code duplication, though it is
usually more circuitous and complex than in Python.

Ultimately, of course, the true problem isn't that you aren't coding OO,
it is the use of Copy and Paste Programming. OO is one path out, but only
one. (Perl is strong enough that one can make a case for going functional,
though I strongly prefer a functional/OO hybrid that builds on OO but
freely borrows functional paradigms at will.)

http://www.c2.com/cgi/wiki?CopyAndPasteProgramming

The problem with web programming is that you can *get away with*
"procedural" programming because the division of the problem into web
pages provides a primitive but marginally effective decomposition into
discrete components. Thus, unless you are running
*everything* through the exact same "web page" (CGI script probably in
this case), you aren't doing true procedural; the CGI scripts function as
primitive objects themselves, enough to let you get farther than you could
in a monolithic program and fool yourself into thinking you're safe, but
not strong enough to build a large-scale system with high-quality code
(i.e., low duplication).

But you still suffer.

ObPython (seriously, though): Which Python framework is the most Mason-like?
(I'm more interested in the component infrastructure than the exact syntax
of the files; I'm not so worried about embedding Python into the HTML. I
think it might be Zope but I haven't tried enough of them to know.)
 

Jive

projecktzero said:
I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP.

Just how old *is* his school? I saw the light in the 70's. For those of
you too young to remember, those were menacing and sinister days, when pant
legs were too wide at the bottom, and the grotesque evil of "top down
programming" was on the land. But by '86, the Joy of OOP was widely known.
Flowers bloomed and birds chirped. Pant legs narrowed. I believe that was
the year I attended the first C++ conference in Santa Fe.
 

Adam DePrince

I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

Thanks.

Code reuse is not copying and pasting. This truly misses what code can
be. Code isn't, or at least shouldn't be, a static entity that is written
once and then forgotten. It is gradually enhanced, generalized, factored,
improved, optimized, rethought, and so on.

A Properly Written (tm) application will have each abstract concept
implemented just once; in a properly written application a single change
is propagated throughout the system. In what you describe, a change
entails hunting down the code you have pasted and changing it in a number of
locations. Depending on the size of your program and how badly your
application begs for code reuse, you can find yourself changing your
code in hundreds of places just to change a single data structure.

Seriously, ever put off changing an array to a linked list, a list to a
map, or some other similar change, simply because you don't want to do the
coding and testing? In a proper OOP application, different parts of
your program will *ask* for some abstract task to be performed, but only
one small part will actually deal with the details of doing it. Change
that one part and nothing else knows any better.
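
To make that concrete, here is a toy Python sketch (the names and the task
are invented, not from any real codebase): callers only ever ask the store
to add or find things, so swapping the underlying data structure touches
exactly one class.

class TaskStore:
    """Callers add and look up descriptions; the storage choice stays private."""

    def __init__(self):
        self._tasks = []                     # today: a plain list of pairs

    def add(self, task_id, description):
        self._tasks.append((task_id, description))

    def find(self, task_id):
        for tid, description in self._tasks:
            if tid == task_id:
                return description
        return None


class DictTaskStore(TaskStore):
    """Same interface, different structure: only this class had to change."""

    def __init__(self):
        self._tasks = {}                     # tomorrow: a dict keyed by id

    def add(self, task_id, description):
        self._tasks[task_id] = description

    def find(self, task_id):
        return self._tasks.get(task_id)


store = DictTaskStore()                      # caller code is identical either way
store.add(42, "recalculate the billing total")
print(store.find(42))

Every caller keeps saying store.add(...) and store.find(...); none of them
care, or even know, which structure sits underneath.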

The "overhead" of OOPLs is bogus. C++ was explicitly designed so that
each and every OO operation was as fast as or faster than faking it in
C. Do you use structures in C with special functions to act on them?
Then you are already using objectish methods ... except that proper C++
object methods will be no slower and a good deal cleaner.

Even in instances where this is the case, for instance comparing early
Smalltalk interpreters to your friendly C compiler, it is almost always
the case that the expressive power and abstraction of an OOPL allow for
the use of greater algorithmic sophistication. So, sure, your C linked
list searches might beat my Smalltalk linked list search, but in the
same amount of programmer time I'd be able to implement something
better.

I really don't care to prove my point, only to point out that if your
assertion that this individual does not understand OOP is true, then
his point likely isn't coming from knowledge and experience, but from
fear of the unknown.

Now, if you said that your co-worker was old school and into functional
programming, I'd agree to disagree and point out functional programming's
weaknesses with respect to complexity and the ability to partition
knowledge.

Forget Google. Go to Amazon and get some texts on OOPLs. Learn C++,
Java, or Python for that matter. Practice casting problems as classes in
Python and submit them here for praise and criticism.

Lastly, Perl is an OOPL in its own right ... like Python and quite
unlike Java, it doesn't jam its OOP-ness down your throat.


Adam DePrince
 

Paul Robson

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

Oh, he's probably telling the truth, in that unless you need the type of
an object determined at run time, a straight procedural call is going to
be quicker, because classic "procedural" code has a very tight mapping to
the underlying hardware.

Of course, the issue is not about raw speed, which in many cases does not
matter (and in the few where it does, you can work around it); it's about
maintainability, modularity and so on.
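
To see the sort of difference being talked about, a crude Python
micro-benchmark along these lines will do (illustrative only; the absolute
numbers depend entirely on the interpreter and the machine):

import timeit

def area(width, height):                    # plain procedural call
    return width * height

class Rect:                                  # the same thing behind a method
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def area(self):
        return self.width * self.height

r = Rect(3, 4)
print(timeit.timeit("area(3, 4)", globals=globals()))   # function call
print(timeit.timeit("r.area()", globals=globals()))     # attribute lookup + method call

The method call pays for an extra attribute lookup and instance binding,
which is exactly the kind of overhead that rarely matters next to a
database query or a network round trip.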

I once worked at a place (this would be the mid-1980s) where the other
coders would not accept that it was "better" to use names for subroutines
such as CalculateBillingTotal or variables such as StaffName. The argument
was "well, gosub 13000 and S$ are the same thing"... which misses the point.

If he's that obsessed with speed, what is he doing coding in Perl (hell, I
like Perl), which is compiled to a bytecode that is then interpreted? Why
not code in 'C' or even assembler, then it'll be really quick? Answer: you
like the facilities of the language. So it is a trade-off.
 

Paul McGuire

But by '86, the Joy of OOP was widely known.

"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still being programmed in C and Forth. Borland didn't ship Turbo-Pascal with
Object-Oriented programming until 1989, and Turbo-C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).

I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice, sufficient
to say that the Joy of OOP could be said to be "widely known."

-- Paul
 

Timo Virkkala

projecktzero said:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

Sounds like your co-worker has a major case of premature optimization. I don't
know about speed issues with OO, but for large projects, using OOP makes data
encapsulation so much easier. Writing correct code with minimum effort should be
the first goal; speed issues (at that level) should be brought into the game
later on.

You should ask your co-worker if he also puts all his data in global variables :)

*wink*
 

Mike Thompson

Paul said:
"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still being programmed in C and Forth. Borland didn't ship Turbo-Pascal with
Object-Oriented programming until 1989, and Turbo-C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).

In my view THAT Byte article on Smalltalk in the early '80s was the
beginning.

Then came Brad Cox's book.

Then there was Glockenspiel's C++ for PC in about '87 or '88. And, of
course, cfront on Unix from about, what, '85?

Across the late '80s there was, of course, Eiffel, which seemed a
remarkable piece of work for the time. And it was backed by a terrific book
by Meyer.

Then it all seemed to take off once C++ version 2.0 was minted.

I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice, sufficient
to say that the Joy of OOP could be said to be "widely known."

In actual fact, virtually all the design patterns came from the
InterViews C++ GUI toolkit written in the early '90s. What an utterly
brilliant piece of work that was.
 

Jive

Then came Brad Cox's book.

I read it.

Then there was Glockenspiel's C++ for PC in about '87 or '88.

I didn't PC in those days. I Unixed.

And, of course, cfront on Unix from about, what, '85?

That's about when I got it. I used to chat with B.S. on the phone,
discussing and proposing features. Now he's rich and famous. Me? Would
you believe rich? How about not destitute?

Across the late '80s there was, of course, Eiffel, which seemed a
remarkable piece of work for the time. And was backed by a terrific book
by Meyer.

I puzzled long over whether to adopt C++ or Eiffel at the company I was with
at the time. I went with C++, despite the fact that cfront was slow as
death and buggy. C++ made it bigtime and the company went public. Lucky
guesses? Hah!

Ah, nostalgia isn't what it used to be.

Jive
 

Craig Ringer

In actual fact, virtually all the design patterns came from the
InterViews C++ GUI toolkit written in the early '90s. What an utterly
brilliant piece of work that was.

As somebody who has just been bowled over by how well Qt works, and how
it seems to make OOP in C++ work "right" (introspection, properties,
etc), I'd be interested in knowing what the similarities or lack thereof
between Qt and InterViews are.

I've been pleasantly astonished again and again by how I can write
something in C++ with Qt like I would write it in Python, and have it
just work. Alas, this doesn't extend as far as:

instance = Constructor(*args)

though if anybody knows how to do this in C++ I would be overjoyed to
hear from them. Qt _does_ provide a pleasant (if somewhat limited) analogue of
the Python getattr() and setattr() calls.
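
For anyone who hasn't seen the Python side of that one-liner, a tiny sketch
of what it relies on (the Point class and the registry are invented for
illustration): classes are ordinary runtime objects, so you can pick one by
name, call it with an unpacked argument list, and poke attributes dynamically.

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

registry = {"Point": Point}            # could equally be getattr(some_module, name)
cls = registry["Point"]                # choose the constructor at run time
args = (3, 4)
instance = cls(*args)                  # instance = Constructor(*args)
setattr(instance, "label", "made up")  # roughly what dynamic properties give you
print(instance.x, instance.y, getattr(instance, "label"))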
 

Miki Tebeka

Hello projecktzero,
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

Try http://www.dreamsongs.com/Essays.html (search for "Objects Have Failed")
for an interesting discussion.

Bye.
 

sbassi

Hello,

Instead of copy and paste, I use functions for code reuse. I haven't seen
the light of OOP yet. I use Python but have never done anything with OOP. I
just can't see what can be done with OOP that can't be done with
standard procedural programming.
 

Peter Hansen

I just can't see what can be done with OOP that can't be done with
standard procedural programming.

Well, there's absolutely nothing you can do with OOP that
can't be done with "standard procedural programming" (SPP).

But that's hardly the point. After all, anything you can
do with OOP or SPP can be done with assembly language as
well.

OOP is way of approaching the design and construction of
the software. As a starting point, consider the advantages
of procedural programming over using raw assembly language.

Now consider that there might be similar advantages in
using OOP instead of procedural programming.

And, lastly, to bring this on topic for this forum, consider
that there might be advantages in using *Python*, specifically,
for doing this OOP programming, compared to many other
languages. Not that you can do things in Python you can't
do in other languages (such as, say, assembly). Just that
you can do them much more easily, and the resulting code
will be much more readable to you and others.

(To be fair, for certain tasks using OOP provides basically
no advantages, and in fact might represent a more awkward
model for the code than a simple procedural program would.
If that's the sort of program you are faced with writing,
by all means stick with SPP and leave OOP to those who
write complex applications that really benefit from it.)
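
A throwaway sketch of that difference in approach (an invented example,
nothing more): the procedural version leans on a module-level variable,
while the class keeps the state and the operations together.

hits = 0                                 # procedural: one global counter

def record_hit():
    global hits
    hits += 1

class HitCounter:                        # OO: state and behaviour travel together
    def __init__(self):
        self.hits = 0

    def record(self):
        self.hits += 1

record_hit()
counter_a = HitCounter()                 # a second, independent counter is now
counter_b = HitCounter()                 # trivial, which the single global is not
counter_a.record()
print(hits, counter_a.hits, counter_b.hits)

The class starts to pay for itself only once you need several counters, or
want to change how the count is stored, which is exactly the "complex
applications" case above.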

-Peter
 

Stefan Seefeld

Craig said:
As somebody who has just been bowled over by how well Qt works, and how
it seems to make OOP in C++ work "right" (introspection, properties,
etc), I'd be interested in knowing what the similarities or lack thereof
between Qt and InterViews are.

Qt provides widgets that a client app can compose into a GUI.
InterViews provides 'glyphs' [*] that form a scene graph in a display
server. Although InterViews usually was compiled into a client-side
library, it provided all the functionality required by a display server
such as redisplay and pick traversals. Indeed, the X Consortium
supported InterViews (and its successor Fresco) for a while as the next
generation of its 'X Window System', until it was dropped (for
mostly political reasons, as usual) around '95.
(Fresco had been nominated, together with OpenDoc, as a candidate for a
'Compound Document Architecture' RFP at the Object Management Group.
OpenDoc won.)

[*] The term 'glyph' reflects the fact that the scene graph nodes in
InterViews are extremely fine-grained, i.e. glyphs can represent
individual characters or elements of vector graphics such as paths.
That's unlike any conventional 'toolkit' such as Qt, where a 'widget'
is quite coarse-grained, and the display of such 'widgets' is typically
not that of a structured graphic, but procedural.

Regards,
Stefan
 

Steve Holden

Paul said:
"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still being programmed in C and Forth. Borland didn't ship Turbo-Pascal with
Object-Oriented programming until 1989, and Turbo-C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).

Well, that's not true either, and the fact that Bill Gates was
denigrating it implies that he at least knew about it, even if he chose
not to adopt it (then: of course nowadays Microsoft call almost all
their technologies "object oriented"; sometimes this description is as
accurate as when Gates speaks about "our open Windows environment").
I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice, sufficient
to say that the Joy of OOP could be said to be "widely known."

We could all make our own choices, but anyone who's been programming
*seriously* since the 60s will likely remember Simula as the birth of
many of the ideas later picked up by Alan Kay and promoted by the Xerox
PARC Smalltalk group.

I visited that group in 1981 (after Kay left, unfortunately, when it was
being headed by Adele Goldberg, who is now coincidentally promoting the
delights of Python at conferences like OSCON), and object-oriented
programming was certainly something that was being taken pretty
seriously in the academic world as a potential solution to some serious
software engineering problems.

The fact that it took the technology a relatively long time to appear
"in the wild", so to speak, is simply the natural maturation of any new
technology. Given that UNIX was developed in the early 1970s I'd say it
took UNIX 20 years to start becoming mainstream. But a lot of people
knew about it before it *became* mainstream, especially those who had to
place their technology bets early. The same is true of object-oriented
concepts.

I guess this is just to say that I'd dispute your contention that
Smalltalk was a curiosity - unless you define anything of interest
mostly to the academic world as a curiosity, in which case there's no
way to overcome your objection. It was the first major implementation of
an entire system based exclusively on OO programming concepts and, while
far from ideal, was a seminal precursor to today's object-oriented systems.

regards
Steve
 
