OO programming - illumination?

Juha Laiho

Followups set to comp.object; apologies for disturbing comp.lang.forth
readers, but this seems to have a sideline into that direction as well.


I've been trying my hand at programming Java for a while now, but had
failed to see the "big" difference between procedural and OO
programming until very recently (or this is how I currently feel). I'm
inviting comments on whether or not I finally have the correct idea. My
background is mostly in procedural languages with or without objects
(mostly C, perl), but I've occasionally tried to understand Smalltalk
as well as Forth. In Java I've mostly been limited to server-side Java
(servlets, with an occasional touch of EJBs).

Now, to the thought that occurred to me: in Java, much of the code is
actually not real OO code, but procedural code that utilises objects.
Still, it would in many places be possible to write Java in a more rigid
OO fashion. This'd explain my perplexity at not seeing that much
difference between what I consider "good" perl code and Java.

Contrasting Java with Smalltalk: in Java, it seems common to write
methods in a very procedural form (no return values on methods, lots
of branching constructs). In Smalltalk, it seems that pretty much
every method makes some selection of which object to return.
Additionally, methods tend to be very short (esp. compared to some Java
servlet code I've seen).

The Smalltalk style appears to me to be built on very long (contrasted
with Java) message chains, which begin by building some new object, then
doing numerous pieces of small-scale processing on that object (and in
the end perhaps/often returning a completely different object than was
originally created). I think I've seen a reference to "message-passing
OO" somewhere.

What I often see with Java is that a single statement contains just
a single method call - and for the cases where the method returns
something, the return value is stored in a variable, to be used
perhaps once or twice later on. So, in many cases, code very similar
to the Smalltalk messaging style would be possible, but is not written.
This is the issue that seems odd to me: is this just a cultural issue
(Java programmers largely having a strong background in procedural
programming, and the paradigm shift not happening), or are there
technical issues somewhere in the history of Java?
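
For illustration, roughly the two styles I mean, using StringBuffer
(whose append() returns the receiver object) - just a sketch:

// Common Java style: one call per statement, each result in a local.
String name = "world";
StringBuffer sb = new StringBuffer();
sb.append("Hello, ");
sb.append(name);
sb.append("!");
String greeting = sb.toString();

// Chained, Smalltalk-like style: the same work as one expression.
String greeting2 = new StringBuffer()
    .append("Hello, ").append(name).append("!")
    .toString();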

Further, in the Smalltalk code it seems that actual branches are not
very common; instead, behaviour differences come from very heavy use of
polymorphism (which seems to be much less often used in Java). Again,
is this just a culture issue? Or is the whole issue that the Java code
I've seen is just procedurally written crap - and that in "real life"
(for some value thereof...) Java code much more resembles my picture of
Smalltalk?

So, summarizing: what I feel is non-OO in the way I see Java code written:
- many methods written to return no value (void)
- if/else constructs used instead of object polymorphism to achieve
differences in program behaviour
- a lot of avoidable local variables used (references to which could then
be accidentally leaked somewhere, and thus be ineligible for garbage
collection)

This still leaves me to think about the need for exceptions in OO code:
I think exceptions have their place, but are often misused in Java.
My experience with Smalltalk doesn't yet cover exception use, so I
cannot make any comparisons on this (does Smalltalk even have exceptions?).

... and a further question: is "message-passing OO" as used in Smalltalk
the only form of object-oriented programming, or are there other variants
of "pure OO" (as in "not tainted by procedural programming")?


Finally, to the sideline into Forth. I seem to see a strong resemblance
between the Forth idea of threading small pieces of code to modify a set
of data and the Smalltalk object-message style of programming -- just
that the "raw" data of the Forth program is presented as objects in
Smalltalk. Am I just seeing ghosts here, or do these two share some
conceptual similarity?


And thanks to anyone for shedding any light on this all-too-long
rambling - but I seriously need handholding to get my thoughts on this
into any sensible shape.
 
 
Tom Dyess

I'm a little hungover, so bear with me. Lol.
Now, to the thought that occurred to me: in Java, much of the code is
actually not real OO code, but procedural code that utilises objects.

OOP is basically objects which have procedures, so for code to be OO
doesn't mean it has to be devoid of procedural programming. Each object's
method is a procedure, but the difference is that the procedure is part
of an object as opposed to just being a procedure sitting on its own. The
way OOP utilizes this is that each object is responsible for procedures
that it can handle. For example, let's say I have a Patient object in a
servlet and I want to produce some HTML describing that patient. With the
Patient object, I can have a simple drawHTML method so that each patient
can draw itself. To take this to another level, say I have a PatientList
object which is a container of Patient objects. If I want to draw the
entire list, I can fashion a drawHTML method on the PatientList object
(note: this is not polymorphism, it's aggregation) which then calls each
Patient's drawHTML method. So, to draw patients, I call PatientList's
drawHTML method, which then calls each Patient's drawHTML method. This
way, I don't have to have a single procedure that is responsible both for
determining the list of patients to draw AND for drawing each patient.
This is beneficial if I want to change the list in any way. Say I want to
draw a list of patients whose names start with the letter A: I just have
to modify the PatientList drawHTML method, which will not affect the
Patient's drawHTML method. It isolates the Patient's drawHTML code so I
don't have to worry about it. Furthermore, say I want multiple procedures
in PatientList to draw patients: one for patients beginning with the
letter A, one for the complete list of patients. I never have to change
the Patient's drawHTML code. Of course, you can do this procedurally by
having a drawPatientHTML(String name, String id, etc), but with OOP I can
just pass a Patient object. It's cleaner to have the patient draw itself
in OOP.
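
To sketch it in code (hypothetical classes, trimmed down to the idea):

import java.util.ArrayList;
import java.util.List;

class Patient {
    private final String name;
    Patient(String name) { this.name = name; }
    String getName() { return name; }

    // Each patient knows how to draw itself.
    String drawHTML() { return "<li>" + name + "</li>"; }
}

class PatientList {
    private final List<Patient> patients = new ArrayList<Patient>();
    void add(Patient p) { patients.add(p); }

    // Aggregation: the list delegates drawing to each Patient.
    String drawHTML() {
        StringBuffer html = new StringBuffer("<ul>");
        for (Patient p : patients) {
            html.append(p.drawHTML());
        }
        return html.append("</ul>").toString();
    }

    // A second drawing method; Patient.drawHTML() stays untouched.
    String drawHTMLStartingWithA() {
        StringBuffer html = new StringBuffer("<ul>");
        for (Patient p : patients) {
            if (p.getName().startsWith("A")) {
                html.append(p.drawHTML());
            }
        }
        return html.append("</ul>").toString();
    }
}
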
Still, it would in many places be possible to write Java in a more rigid
OO fashion. This'd explain my perplexity at not seeing that much
difference between what I consider "good" perl code and Java.

Just because a language is OO doesn't mean that it forces developers to
code in an OO fashion. Java is VERY OO in the sense that everything is an
object except the simple data types, and those have associated objects as
well (i.e. int and the Integer object).
Contrasting Java with Smalltalk: in Java, it seems common to write
methods in a very procedural form (no return values on methods, lots
of branching constructs). In Smalltalk, it seems that pretty much
every method makes some selection of which object to return.
Additionally, methods tend to be very short (esp. compared to some Java
servlet code I've seen).

It's good procedural form to keep your methods short (typically I keep
them under a "screen page") just so you don't get lost in the code. I was
coding after someone who didn't quite understand this and had formed a
very complex procedure of around 16 "printer pages", which was very
difficult to work with. Before refactoring a huge chunk of the app, I
broke it down into more manageable procedures that the main procedure
called.
The Smalltalk style appears to me to be built on very long (contrasted
with Java) message chains, which begin by building some new object, then
doing numerous pieces of small-scale processing on that object (and in
the end perhaps/often returning a completely different object than was
originally created). I think I've seen a reference to "message-passing
OO" somewhere.

I know absolutely nothing about Smalltalk, but this seems very strange
to me. Are you talking about inheritance?
Further, in the Smalltalk code it seems that actual branches are not
very common; instead, behaviour differences come from very heavy use of
polymorphism (which seems to be much less often used in Java). Again,
is this just a culture issue? Or is the whole issue that the Java code
I've seen is just procedurally written crap - and that in "real life"
(for some value thereof...) Java code much more resembles my picture of
Smalltalk?

I would say "procedurally written crap." All of the Java and Delphi code
I write is heavily OO. I can't stand using procedural languages, but
unfortunately in designing web apps I have to use JS, which is very
procedural and not strongly typed (which is another issue).
So, summarizing: what I feel is non-OO in the way I see Java code written:
- many methods written to return no value (void)

This doesn't have anything to do with OO/procedural coding that I can
think of. Let's say I want to add a Patient object to the PatientList. If
I were storing the Patients in a HashMap and created an add(Patient pat)
method on the PatientList object, I would not need to return a value from
this method. If it were an Array or ArrayList, I could return the index,
but that is dependent on the container I'm using, not on whether I'm
using OO or procedural programming.
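
A quick sketch of that point (hypothetical container class):

import java.util.ArrayList;
import java.util.HashMap;

class Registry {
    private final HashMap<String, Object> byId = new HashMap<String, Object>();
    private final ArrayList<Object> items = new ArrayList<Object>();

    // Map-backed: there is no meaningful value to hand back, so void fits.
    void addToMap(String id, Object item) { byId.put(id, item); }

    // List-backed: the index exists, so it can be returned.
    int addToList(Object item) {
        items.add(item);
        return items.size() - 1;
    }
}
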
- if/else constructs used instead of object polymorphism to achieve
differences in program behaviour

I'd have to see an example to determine if this is correct or not, but
just like databases and normalization, sometimes it's better to
denormalize just a bit to gain speed. In Java, polymorphism might not be
warranted if it's a simple if/else statement. But like I said, I'd need
to see an example.
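
For concreteness, the kind of trade-off I mean (contrived sketch):

class AreaDemo {
    // Branching on a type code...
    static double area(String kind, double a, double b) {
        if (kind.equals("circle")) return Math.PI * a * a;
        else if (kind.equals("rect")) return a * b;
        else throw new IllegalArgumentException(kind);
    }
}

// ...versus letting each object answer for itself.
interface Shape { double area(); }

class Circle implements Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

class Rect implements Shape {
    private final double w, h;
    Rect(double w, double h) { this.w = w; this.h = h; }
    public double area() { return w * h; }
}
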
- a lot of avoidable local variables used (references to which could then
be accidentally leaked somewhere, and thus be ineligible for garbage
collection)

You should always define variables in the lowest scope possible. There
is nothing wrong with local variables; on the contrary, it's bad form to
have a bunch of global variables, but you will probably always have some,
so just try to keep them to a minimum. For example, in a servlet, I keep
my DB connection/pool in global scope in the servletContext. It doesn't
make sense to create them locally.
This still leaves me to think about the need for exceptions in OO code:
I think exceptions have their place, but are often misused in Java.
My experience with Smalltalk doesn't yet cover exception use, so I
cannot make any comparisons on this (does Smalltalk even have exceptions?).

Exceptions are very useful in any type of programming; they keep your
program from "farting" at the user. You trap the exception, handle it
gracefully if possible, and the user doesn't know any different. For
example, let's say you have an HTML form that requires some sort of
number. Well, HTML isn't strongly typed, so the value comes in as a
string. What if they put "A" instead of a number? You could trap the
exception and report to the user via HTML that "A number is required,"
or you could just let your servlet crash. I'd say most would prefer the
former. Lol.
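
Roughly like this (a trimmed sketch; the names are made up):

class FormHelper {
    // Parse a form field, reporting errors as HTML instead of crashing.
    static String handleAgeField(String ageParam) {
        try {
            int age = Integer.parseInt(ageParam);  // "A" throws here
            return "<p>Age accepted: " + age + "</p>";
        } catch (NumberFormatException e) {
            return "<p>A number is required.</p>"; // graceful report
        }
    }
}
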
... and a further question: is "message-passing OO" as used in Smalltalk
the only form of object-oriented programming, or are there other variants
of "pure OO" (as in "not tainted by procedural programming")?

I'm not sure what message-passing OO is, but a good book on OOP is C++: The
Core Language. It goes through OOP very well.
And thanks goes to anyone for shedding any light on this all-too-long
rambling - but I seriously need handholding to get my thoughts on this
to any sensible shape.

You're welcome. Sorry I couldn't help you with your Smalltalk and Forth
comparisons though. Wish me luck on my hangover. Lol.
 
xarax

/snip/
So, summarizing: what I feel is non-OO in the way I see Java code written:
- many methods written to return no value (void)

This is quite normal for event-driven programming,
because the driver doesn't care what the listener
will do with the event.
- if/else constructs used instead of object polymorphism to achieve
differences in program behaviour

Unfortunately, most of the beginner tutorials have
awkward procedural examples of using listeners and
such, which devolve into if-else testing using
"instanceof". However, re-writing such tutorials
to use polymorphism would confuse beginners too much.

I am a firm supporter of polymorphism and multiple
inheritance. I also use "instanceof" only in limited
down-casting circumstances where I do not author the
class/interface definitions. In those cases where I author
the definitions, I can always avoid "instanceof" and use
implicit down-casting provided by the JVM as appropriate;
I never need to use explicit down-casting or generics in
JDK 1.5, although generics are convenient.
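
A sketch of the contrast (hypothetical event classes):

abstract class UiEvent {
    abstract void dispatch();   // polymorphic entry point
}

class ClickEvent extends UiEvent {
    void dispatch() { System.out.println("handle click"); }
}

class KeyPressEvent extends UiEvent {
    void dispatch() { System.out.println("handle key press"); }
}

class Dispatcher {
    // Tutorial style: if-else testing with "instanceof" and down-casts.
    static void handleWithTests(UiEvent e) {
        if (e instanceof ClickEvent) {
            ((ClickEvent) e).dispatch();
        } else if (e instanceof KeyPressEvent) {
            ((KeyPressEvent) e).dispatch();
        }
    }

    // Polymorphic style: the virtual call selects the right body.
    static void handle(UiEvent e) {
        e.dispatch();
    }
}
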
- a lot of avoidable local variables used (references to which could then
be accidentally leaked somewhere, and thus be ineligible for garbage
collection)

Local variables disappear when the method returns. If the
method is long running, then I explicitly nullify the locals.
I also use block-scoped variables to reduce the visibility,
and I still nullify them. (When a block-scoped variable goes
out of scope, its value is not necessarily nullified.)

I use local variables to help with debugging, so that I can
single-step through the method and see the exact values
returned/calculated. I rarely use a method call within another
method's parameter list, nor will I use chaining (like the
chaining provided by StringBuffer). I doubt there are any
significant performance issues with my style, and it is
much more readable and maintainable.
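
For example, the style I describe looks roughly like this (hypothetical
Report class):

import java.util.List;

class Report {
    String render(List<String> lines) {
        String html;
        {   // block scope limits the buffer's visibility
            StringBuffer buf = new StringBuffer();
            for (int i = 0; i < lines.size(); i++) {
                String line = lines.get(i); // named local: easy to inspect
                buf.append(line);           // while single-stepping
            }
            html = buf.toString();
            buf = null;                     // explicit nulling, as above
        }
        return html;
    }
}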

This still leaves me to think about the need for exceptions in OO code:
I think exceptions have their place, but are often misused in Java.

Exceptions should be truly exceptional. However, the "finally{}" block
is quite useful by itself without catching an exception.
My experience with Smalltalk doesn't yet cover exception use, so I
cannot make any comparisons on this (does Smalltalk even have exceptions?).

... and a further question: is "message-passing OO" as used in Smalltalk
the only form of object-oriented programming, or are there other variants
of "pure OO" (as in "not tainted by procedural programming")?

All OO programming eventually devolves into procedural programming.
At some point, an algorithm or calculation must be performed, which
is procedural, by definition. OOP merely aids the programmer in
expressing the semantics needed to arrive at the proper procedural
processing.

/snip/
 
Hans Bezemer

Tom Dyess said:
OOP is basically objects which have procedures, so for code to be OO
doesn't mean it has to be devoid of procedural programming. Each object's
method is a procedure, but the difference is that the procedure is part
of an object as opposed to just being a procedure sitting on its own. The
way OOP utilizes this is that each object is responsible for procedures
that it can handle.

Basically, I never liked object oriented languages at all. Somehow it
seems wrong to assume that everything is part of another, greater
thing that is totally encapsulated and has to take care of itself. If
the world was organized that way, a screw would either screw itself
into the wall or come with its own screwdriver.

As a matter of fact, when you look under the hood of an object, all
you see is a structure with a few pointers to functions. You don't
need an object oriented programming language to produce such code, C
will do just fine, thank you very much.

BTW, structures are not very efficient. A structure usually gives rise
to a frenzy of alignment problems that can only be solved by fillers.
Actually, you really don't need structures, since they can also be
represented by a host of loosely related arrays. It doesn't make much
difference whether you write 'row.column' or 'column [row]', except
that arrays of the same type are much easier to handle for both the
compiler and the memory allocators.

Basically, there are only one-dimensional arrays - or do you really
believe that by magic memory is wrapped into matrices and cubes? The
compiler conveniently translates your 'array [x] [y]' notation into
real offsets like 'array + ((x * size) + y)'.
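
In Java terms the same flattening can be written out directly (a
sketch; a Java int[][] is itself just an array of arrays):

// grid[x][y] and flat[x * cols + y] address the same logical cell.
int rows = 3, cols = 4;
int[][] grid = new int[rows][cols];
int[] flat = new int[rows * cols];

int x = 1, y = 2;
grid[x][y] = 42;
flat[x * cols + y] = 42;   // the compiler does this arithmetic for you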

There are a lot of fancy datatypes, but basically they are just arrays
of arrays. As a matter of fact, there are only two real datatypes, a
word and a byte. A pointer is usually a word. The different pointer
types are just created to let the compiler do some work for you, so you
don't have to remember what the real size of e.g. a character or an
integer is. Thus, for a 'char* p', 'p + 1' can be translated into
'p + sizeof (char)'.

I wonder why people need a 'typeof()' operator: after all these
declarations to let the compiler do most of the work, they seem to have
forgotten halfway through what type 'p' actually was. After all the
help their object-oriented compiler gave them, they have completely
lost track of what they were doing. So, if it doesn't help, what good
is it?

As a matter of fact (hush!) there is no such thing as a pointer. It's
just an address; in other words, it is a variable that holds a
location in memory. A NULL pointer is just a variable that points to
the very first byte in memory (address 0), and by (compiler) convention
that is not a valid address. A NULL string is a pointer to a byte in
memory that just holds a terminator, which is 0 by convention. Note,
these are just conventions. You could define another convention which
might work just as well or even better.

Usually, I don't need much more than a stack (where I push my
parameters (all words)) and a way to allocate an array of words and
bytes. When I push a byte on the stack, it is expanded to a word, and
when I store a value from the stack into a byte, its 'most significant
bits' are lost. When I'm done, I terminate the allocation. I don't
need a paranoid garbage collector to clean up my mess, thank you.
Even less do I need a compiler to do the 'instantiation' or
'destruction' of 'objects'; I'm quite capable of allocating or
deallocating a bunch of bytes or words myself.

BTW, the latter seems a lot easier. Not only that, but my programs
seem to run a lot faster and are a lot smaller than most others. I
could forgive C for being typed, even though I was forced to do some
casts to get rid of those ugly warnings. It still seems odd that I have
to prove to a dumb machine that I know what I am doing. The problem is
that with C++ I have to figure out what it is doing, by using ugly
things like class browsers. Normally, I don't even need a debugger, so
why the heck should I use something as hideous as a class browser? When
I'm really in trouble, the assembly switch of a C compiler does
wonders, even though most have forgotten it's in there.

The streams library of C++ is notoriously difficult to use. With a
file, you open it, dump a bunch of bytes there and close it. End of
story. A file handle is just another word. Most of the Forth programs
I write professionally are between 25 and 150 lines long and written
in minutes, not hours. They do their job and are developed quicker
than even a C programmer can manage. I recently wrote a program that
interfaces with Graphviz. It scans a Forth source file and dumps a .DOT
source file, which draws a graphic scheme of its call sequences. 150
lines, less than a 1-hour hack. Most definitions are a line long (as
they should be in good Forth programs). Try to beat me on that. And
don't say maintenance is a horror. It is not. Since most definitions
are a line long, it is easy to pinpoint what should be changed.

A compiler should be created humbly, not arrogantly. Why not: it's
there to help me; I forked out the bucks, so don't bug me. If it has
never heard of a 'byte class' or a 'word class', it should take some
classes, not me. And don't start to ask me for a 'constructor method':
if I didn't allocate it, I didn't need it, and you don't have to
'construct' it for me.

Sometimes the world is not an object. Better: if you look at nature,
there are plenty of networks, but few pure, natural hierarchies. A
hierarchy is a sub-class of a network anyway. Somehow, it seems strange
to me that people want to tackle 'real life problems' with such limited
tools.

Hans Bezemer
 
kjc

Hans said:
Tom Dyess said:
OOP is basically objects which have procedures, so for code to be OO
doesn't mean it has to be devoid of procedural programming. Each object's
method is a procedure, but the difference is that the procedure is part
of an object as opposed to just being a procedure sitting on its own. The
way OOP utilizes this is that each object is responsible for procedures
that it can handle.


Basically, I never liked object oriented languages at all. Somehow it
seems wrong to assume that everything is part of another, greater
thing that is totally encapsulated and has to take care of itself. If
the world was organized that way, a screw would either screw itself
into the wall or come with its own screwdriver.

/snip/
Without debunking most of everything you stated: it's clear to me that
you know nothing about OO concepts. Especially the modelling part.
Objects are all around you. You yourself are an object composed of many
organs; those organs can be further modeled to include cells, and so on
and so forth. Each organ has its own responsibility. An organ, when
needed, collaborates with other organs.

Are you getting any of this?

Oh, BTW: the screw and its ability to screw itself in is just plain
empty-headed.
 
Oscar kind

Hans Bezemer said:
Basically, I never liked object oriented languages at all. Somehow it
seems wrong to assume that everything is part of another, greater
thing that is totally encapsulated and has to take care of itself. If
the world was organized that way, a screw would either screw itself
into the wall or come with its own screwdriver.

As a matter of fact, when you look under the hood of an object, all
you see is a structure with a few pointers to functions. You don't
need an object oriented programming language to produce such code, C
will do just fine, thank you very much.

That's not the issue here: OO is just another way of looking at the same
types of problems you can solve with "traditional" programming.

BTW, structures are not very efficient. [...]

Very true. But with the availability of more and more memory, many feel
that memory efficiency is not that important anymore. Especially if you
note that while it is more efficient in terms of space to model numbers
as bytes, words, etc., it is not in terms of processor efficiency: the
processor works with 32 bits exclusively, which is twice the size of a
word.

Why is OO used then? Because it makes modelling and design easier.
Especially for complex problems. The result has neither the smallest
footprint nor the fastest execution, but it is _good_ _enough_.

However, large systems can be created faster, and that _saves_ _money_.
And in the end, it's the euros/dollars/<insert your currency here>
allocated by beancounters that determine what is being used.
 
Thomas Gagne

Hans said:
Basically, I never liked object oriented languages at all. Somehow it
seems wrong to assume that everything is part of another, greater
thing that is totally encapsulated and has to take care of itself. If
the world was organized that way, a screw would either screw itself
into the wall or come with its own screwdriver.

<snip>

I think everyone's taking Hans' post too seriously. I don't think he
meant it that way. In the end he basically ended up suggesting that FORTH
and Assembly are all that's required. I'm surprised he stopped when he
did, and didn't suggest that a little microcode is all anyone needs,
since that's where all the action is. Anything above microcode is
unnecessary fluff. If the hardware weren't so restricting, we would be
able to assemble bits ourselves into groups however large we needed them,
without those arbitrary 8-, 16-, 32- and 64-bit restrictions.

A little early for April Fools' Day, but this newsgroup is always in need of a
little levity, however dry or masked.
 
Lee Fesperman

Hans said:
Basically, I never liked object oriented languages at all. Somehow it
seems wrong to assume that everything is part of another, greater
thing that is totally encapsulated and has to take care of itself. If
the world was organized that way, a screw would either screw itself
into the wall or come with its own screwdriver.

I snipped most of your comments, but without prejudice. I do appreciate
your thoughts.

I, too, am bothered by the hype surrounding OO. It is an amalgam of
techniques that have been used successfully for a long time. In fact,
Forth was one of the first languages to effectively combine data and
behavior, with its BUILDS/DOES (and earlier) constructs. OO is no silver
bullet, but it is an excellent technique ... I would call it the best
modern programming facility by far.

I truly love Forth and have done some major development in the language,
including doing my own port to an oddball architecture. I learned a lot
from Forth but am dismayed by its lack of traction in the mainstream.
IOW, it's not often a practical solution in the real world.
BTW, structures are not very efficient. A structure usually gives rise
to a frenzy of alignment problems that can only be solved by fillers.

I'm afraid you're behind the times. Java is very efficient because of the
amazing strides made in runtime compiling/optimization. Alignment is not
a consideration; Java doesn't even let you know whether two members are
stored adjacently. The compiler is free to use the most efficient
organization.

I also really love Assembler and C. I've probably written far more
assembly-language systems than you will ever contemplate. I would have no
problem going back to those languages to develop systems (actually, I
have, since I'm currently working on an ODBC driver). However, I also
know I can produce more understandable/maintainable/bug-free systems in
Java with greater ease.

I understand your stance that you want control and can handle it. I just don't see it as
valid these days, in the general case. It's a waste of development resources. My first
programming was all in machine code, because I didn't want to cede any control to the
assembler. I'm much smarter now ;^)
 
Reply7471859353

Please take some time to look at machine forths (F21, B16, C18 ...).

The formula SMP MPP FORTH VLIW didn't have a name like F21, B16 or C18,
but was/is a 16-bit instruction set (with an optional provision for a
packed 5-bit set; similar to ARM, but more like the F-code instruction
model from the Inmos Transputer, with 31 (0 reserved) dynamic local
methods).

PLEASE WRITE 100,000 PAGES OF IBM DEFENSE DOCUMENTATION THAT EXHAUSTS
ALL SUB-ITERATIONS OF THE COUPLE HUNDRED I SENT THEM.
 
Andreas Klimas

kjc said:
Hans Bezemer wrote:
/snip/

Without debunking most of everything you stated: it's clear to me that
you know nothing about OO concepts. Especially the modelling part.
Objects are all around you. You yourself are an object composed of many
organs; those organs can be further modeled to include cells, and so on
and so forth. Each organ has its own responsibility. An organ, when
needed, collaborates with other organs.

Are you getting any of this?

Oh, BTW: the screw and its ability to screw itself in is just plain
empty-headed.

Sorry, but Hans' view is just right. An object-oriented view of the
model is not the only one. I prefer the simplest possible one - and the
object-oriented view by far isn't the simplest one. OO modeling ends in
myriads of classes and methods, only manageable with highly
sophisticated browsers. For the last 10 years I have developed large
Smalltalk and Java applications. Smalltalk is fun, Java isn't. Since
performance was sometimes an issue, I have rewritten some applications
in C. Those rewrites were an order of magnitude smaller and three
orders of magnitude faster. Not that JIT compilers in Smalltalk (or
even in Java) produce slow code - it's just the opposite.

As Hans mentioned, applications written in Forth become even simpler.
Complexity from high to low: Smalltalk -> C -> Forth.
At least IMO.
 
Andreas Klimas

Oscar said:
Hans Bezemer said:
/snip/

That's not the issue here: OO is just another way of looking at the same
types of problems you can solve with "traditional" programming.


BTW, structures are not very efficient. [...]


Very true. But with the availability of more and more memory, many feel
that memory efficiency is not that important anymore. Especially if you
note that while it is more efficient in terms of space to model numbers
as bytes, words, etc., it is not in terms of processor efficiency: the
processor works with 32 bits exclusively, which is twice the size of a
word.

Why is OO used then? Because it makes modelling and design easier.
Especially for complex problems. The result has neither the smallest
footprint nor the fastest execution, but it is _good_ _enough_.

However, large systems can be created faster, and that _saves_ _money_.

This is simply _not_ true. I have seen too much evidence against this
hypothesis.
And in the end, it's the euros/dollars/<insert your currency here>
allocated by beancounters that determine what is being used.

Wrong again. If developers want to be competitive, they have to learn
to keep things simple. But in the UML/CASE and OO world this just isn't
possible: too many self-produced problems. IT becomes the customer.
 
Bernd Paysan

Andreas said:
Wrong again. If developers want to be competitive, they have to learn
to keep things simple. But in the UML/CASE and OO world this just isn't
possible: too many self-produced problems. IT becomes the customer.

There's no real surprise here. All early industries are known to have
huge internal costs, and few external ones. The cotton industry in
Manchester had 95% internal costs (steam engines, cotton mills,
spinneries, automatic looms, etc.), most of that machine costs, not
direct manpower costs. The machine costs certainly were manpower costs,
too, but externalized manpower (someone had to build the machines, dig
out the coal and so on). Over time, the internal friction is removed,
and the internal costs drop. ATM, I think the cotton industry is
dominated by external costs on the sales side.

Now, the cotton industry had 200 years to mature; the IT industry
didn't. Perhaps it's also not paying enough to the sales channel. "Keep
it simple" often has problems selling, while "Keep it compatible" is
easier to sell. With "Keep it compatible", you accumulate a huge legacy,
and you have to pay the price for that.

But most of the time, the IT industry implements complexity for its own
sake. Complexity seems to be a selling point - "we have XXX, YYY, and
ZZZ" - without questioning whether all of that is necessary. "OOP" is
just another of those selling points, due to the hype around it.
Sometimes it's necessary, sometimes it isn't. It's no silver bullet, but
it has something to offer.
 
Oscar kind

Andreas Klimas said:
This is simply _not_ true. I have seen too much evidence against this
hypothesis.

Please enlighten me with examples then. I'd like to correct myself if I'm
wrong, but I need something to replace my current knowledge with.

Also, were the correct techniques being used? Using OO to implement a
process-centric solution, for example, is a mistake. As is using a
procedural approach to implement a data-centric solution.

Wrong again. If developers want to be competitive, they have to learn
to keep things simple. But in the UML/CASE and OO world this just isn't
possible: too many self-produced problems. IT becomes the customer.

Keeping things simple is the most difficult thing by far (IMHO). This is
not related to the specific technique used, although some techniques make
it easier to keep things simple. For administrative systems (my area of
expertise), I find OO does this.

This does not negate the fact, however, that developers only work on
stuff as a hobby, or on what they get paid for. And for this last part, a
developer has no final authority. He only has as much influence as his
manager gives him, which hopefully increases with his/her skill in making
things simple.

However, if a manager has the idea that OO makes life easiest (even if it's
misplaced), then that's what is being used.
 
Hans Bezemer

"I think everyone's taking Hans' post too seriously. I don't think he
meant it
that way."

I've taken a few quotes from previous posts. First, I want to debunk
the myth that this was just flame-bait. Ok, it was a little over the
top, true, but certainly no flame-bait.. And I wasn't kidding either..

"Sorry, but Hans view is just right. an object oriented view of the
model is not the only one. I prefere the simplest possible one - and
the
object oriented view by far isn't the simplest one. OO modeling
ends in myriads of classes and methods, only manageable with highly
sophisticated browsers."

Right, that is my experience too. And I'm sick and tired of trying to
cram a simple problem into a flawed paradigm. What most people seem to
forget is that OO is a _programming_technique_, not a "one size fits
all" model. I use OO techniques in my programs too, if I want (or need)
to do abstraction. Yes, in C. You can easily do that. X was written that
way, and Axel-Tobias Schreiner wrote a whole book on that subject in
1993.

Sometimes things are not simpler than they are. This little piece of
code parses a text file. Tell me where OO helps. Tell the file it has
to parse and rewrite itself? How does that help? This thing is only
70 lines (including comments).

: Field>   bl parse-word ;          ( -- a n)
: Fields>> 0 do Field> 2drop loop ; ( n --)
: Lines>>  0 do refill drop loop ;  ( n --)

: GetCard ( --)
  2 Fields>> Field> Card place      \ skip two fields, get card number
  2 Fields>> Field> Phone place     \ skip two fields, get phone number
  3 Lines>>                         \ skip three lines
;

"I, too, am bothered by the hype surrounding OO. It is an amalgam of
techniques that have been used successfully for a long time. In fact,
Forth was one the first languages to effectively combine data and
behavior with its BUILDS/DOES (and earlier) constructs. OO is no
silver bullet, but it is an excellent technique ... I would call it
the best modern programming facility by far."

But even that can be taken too far. When very complex datatypes are
developed, you sometimes find yourself wondering what this thingy does
again. Sometimes I rather prefer to write a word that does an explicit
conversion, just to remind myself that I took the address of a CREATEd
item and converted it to something much more complex.

OO can be useful in some instances, but most of the time it is too
limited, and horrifying designs emerge to make it fit. Lego stones
(Forth) are much more convenient. If it fits, you can slam them
together and let the data pass by itself. Even distinct datatypes
don't help much, except to add more conversion. If you want easily
maintainable code, factor and make small routines and small programs.
If you need monstrous amounts of code to write a single function,
routine or program, you're doing something wrong. Let systems
communicate; don't try to cram all and everything into a single
program.

Wording is very important. I've shown more than once that you can tell
the computer a story while actually writing Forth code.

"You yourself are an object composed on many organs, those
organs can be further modeled to include cells and so on and so forth.
Each organ has it's own respsonsiblity. An organ, when needed
collaborates with other organs."

Someone doesn't know too much about biology, I'm afraid. The
stratification you mention is fairly artificial and arbitrary. The
pancreas, for example, has two distinct functions and two very
different kinds of cells. There are a lot of tissues that do not
distinctly belong to any organ. Mitochondria are very independent
organelles with their own DNA. Cut up a human and you'll see that
everything is not as distinct as your biology books try to make you
believe. FYI, I have a BA in biology, so I know what I'm talking
about.

Hans
 
cpu16x1832

Dagfinn said:
I knew someone would get me for this. ;-) I was referring to the
specific example of the Command pattern without considering all other
possible ways you could replace a design pattern with a language construct.

Iterators are much more pervasive than the Command pattern--in Java,
anyway. So the potential difference in productivity is greater. In other
words, you have a point.

I was thinking more in terms of what I would consider a normal use of
design patterns. The way iterators are used in Java seems aberrant to me.

pattern question? ( FORTH -> C/JAVA => SCHEME/LISP ) VON NEUMANN

BOOKMARK THIS LINK

http://groups-beta.google.com/group...A0qQiEMNXvzco-t51HUQzYrjcvJMYp3afiZB9NxWdHgA&

AND/OR THIS LINK

http://mywebpage.netscape.com/mawcowboy/homepage.html
 
Doug Hoffman

Hans said:
What most people seem to forget is that OO is a
_programming_technique_, not a "one size fits all"
model.

You might not want to lump most everyone who uses some object
techniques into that category. Especially in a Forth forum
where OOP is used as an *extension* to Forth, not as a *replacement*
for Forth.


I use OO techniques in my programs too if I want (or need) to
do abstraction. Yes, in C. You can easily do that. X was written that
way and Axel-Tobias Schreiner wrote a whole
book on that subject in 1993.

Sometimes things are not simpler than they are. This little piece of
code parses a textfile. Tell me where OO helps.

Make up your mind. First you say you use OOP if you want, and then you
slam it?? Give it a rest, guy. I don't think there is quite the amount
of "hype" over OOP that you suggest. At least not in Forth circles. Why
does this topic bother you?

Choose your tool and program in peace.

Regards,

-Doug
 
cpu16x1832

Doug said:
You might not want to lump most everyone who uses some object
techniques into that category. Especially in a Forth forum
where OOP is used as an *extension* to Forth, not as a *replacement*
for Forth.




Make up your mind. First you say you use OOP if you want, and then you
slam it?? Give it a rest, guy. I don't think there is quite the amount
of "hype" over OOP that you suggest. At least not in Forth circles. Why
does this topic bother you?

Choose your tool and program in peace.

Regards,

-Doug

WARNING: Failure to perform the following instructions
may lead to misunderstanding.

Maybe start simple,


OO (program development) example:

Tire object,
Road object,
Adhesion property negotiated between the Tire and Road objects.


Regards,

Mark A. Washburn

---

In comp.lang.forth the OT subject line should read as

OO programming - illumination!


Keep learning C/Java/Scheme, also.


Then, continue your search using the following view

http://groups.google.com/groups?q=g...1&as_maxd=31&as_maxm=12&as_maxy=2005&filter=0

Regards,
 
Andreas Klimas

Oscar said:
Please enlighten me with examples then. I'd like to correct myself if I'm
wrong, but I need something to replace my current knowledge with.

There is no enlightenment at all, just experience:

1. The project starts out with a framework (say JSP/J2EE).
2. JSP/J2EE seems too complicated, so as a next step a
new framework is developed, which of course doesn't
replace the other one.
3. A company-wide architecture is released, which
ends in a new couple of libraries - on top of the
second framework.
4. Now the real requirements come into play - which of
course don't fit (well) into those frameworks and architectures.
So we have to work around them and hope that we get
those requirements running. But at this point we
have to deal with so many problems - introduced only
by 'framework developers' and 'senior architects'.

Well, with all that stuff we need about three weeks to
develop a new simple HTML form.
Starting the application takes some hours and 16GB
of RAM. Response time approx. 3 min.

I have seen the exact same approach at a couple
of very big customers in Switzerland and Germany.

My solution:
A simple web server (essentially an HTTP protocol handler)
written in C. 200 lines of code, written in two hours.
The application written in C, very lean, with only the real
requirements included, and no dynamic behavior. Written in
some weeks: 10,000 lines of C code (compared to 150,000
lines of Java code), 512 MB of RAM (compared to 16GB).


Believe it or not,
most time spent is not development; it's in understanding
the problem domain. Once one has grasped the requirements,
implementation is easy. Now some hints:
-> minimize runtime exceptions (out of memory, out of
disk space, TCP errors etc.)
-> minimize the use of frameworks; the minimum is zero
-> avoid implementing hooks for future extensions
-> use a straight implementation of the real problem (the business
problem)
-> don't avoid thinking.

.... and so on
Keeping things simple is the most difficult thing by far (IMHO). This is
not related to the specific technique used, although some techniques make
it easier to keep things simple. For administrative systems (my area of
expertise), I find OO does this.

The problem I see (nearly every day) is that people are more
interested in writing technically correct and pretty OO code
than in solving the real problem. As I mentioned above, this
ends in myriads of classes and methods, by far too oversized
for the given problems. Well, I see: it is much more fun to play
the technical game.

One question:
how many classes / methods are needed to convert one currency
into another?
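
My answer: about one (a sketch in Java):

class CurrencyConverter {
    // The whole business problem, straight: amount times rate.
    static double convert(double amount, double rate) {
        return amount * rate;
    }
}
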
This does not negate the fact, however, that developers only work on
stuff as a hobby, or on what they get paid for. And for this last part, a
developer has no final authority. He only has as much influence as his
manager gives him, which hopefully increases with his/her skill in making
things simple.

However, if a manager has the idea that OO makes life easiest (even if it's
misplaced), then that's what is being used.

Sorry again. If software development becomes too expensive, managers
decide to outsource it, for example to India - no joke!
And anyway, those people do exactly the same job, only three times
cheaper.

best wishes
Andreas Klimas
 
Hans Bezemer

Doug Hoffman said:
You might not want to lump most everyone who uses some object
techniques into that category. Especially in a Forth forum
where OOP is used as an *extension* to Forth, not as a *replacement*
for Forth.
Fortunately. FYI, I didn't start this topic. I'm just giving my
reaction under the First Amendment. ;-)
Make up your mind. First you say you use OOP if you want, and then you
slam it??
No, I use OOP as a technique. Like lookup tables, calculation, and any
other programming technique or algorithm you can think of. That
doesn't mean I find every problem suitable to be solved using OOP.
In other words, I've got nothing against OOP; I just don't find it the
right paradigm for EVERY problem:

this->add (this->value);

Is simply less clear and evident than:

a += a;

Or even:

a dup @ swap +!
Why does this topic bother you?
Choose your tool and program in peace.
When making money, it is not always up to me to choose my tool. And
working around the limitations, or building procedural interfaces around
OOP, just plain gets me down. I wanna do something useful with my time.
Even when working.

Hans
 
