Can a low-level programmer learn OOP?


Chris Carlen

Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective in making their case than
the descriptions of OOP I've read. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?

The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.

But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?

Ultimately I don't care what the *name* is for how I program. I just
need to produce results. So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.

Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.

2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.

Possible solutions:

Form 1: Use C and choose a library that will enable cross-platform GUI
development.

Pro: Don't have to learn new language.
Con: Probably will have difficulty with cross-platform implementation
of serial comms, which will probably need to be done twice. This will
waste time.

Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

Form 3: Use LabVIEW

Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python. In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.


Comments appreciated.


--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
(e-mail address removed)
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.
 

Marc 'BlackJack' Rintsch

Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.

That's not something tied to OOP. Automatic memory management is also
possible with procedural languages.
But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?

Creating classes and organizing the program in an OOP language isn't
different from creating structs and organizing the program in C.

On one side Python is a very OOP language as everything is an object. On
the other side it is possible to write parts of the program in procedural
or even functional style. Python is not Java, you don't have to force
everything into classes.

From my experience Python makes it easy to "just write the code". Easier
than C because I don't have to deal with so much machine details, don't
have to manage memory, don't need extra indexes for looping over lists and
so on. And the "crashes" are much gentler, telling me what the error is
and where instead of a simple "segfault" or totally messed up results.
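
That difference is easy to see in a few lines of plain procedural
Python (the readings are made up, and no classes are in sight):

```python
# Summing a list in C needs a length and an index variable;
# in Python you iterate over the values directly.
readings = [12, 7, 42, 19]

total = 0
for r in readings:       # no index, no off-by-one risk
    total += r

readings.append(30)      # the list grows on demand: no malloc/realloc/free

print(total)             # -> 80
print(len(readings))     # -> 5
```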

Ciao,
Marc 'BlackJack' Rintsch
 

John Nagle

Chris said:
Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective than the descriptions of
OOP I've read in making a case. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?

The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

Why?

I've written extensively in C++, including hard real-time programming
in C++ under QNX for a DARPA Grand Challenge vehicle. I have an Atmel
AVR with a cable plugged into the JTAG port sitting on my desk right now.
Even that little thing can be programmed in C++.

You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.

If it has state and functions, it probably should be an object.
The instances of the object can be static in C++; dynamic memory
allocation isn't required in C++, as it is in Python.

Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.

John Nagle
 

Dennis Lee Bieber

However, those articles were no more objective than the descriptions of
OOP I've read in making a case. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?
OOP is NOT meant to improve execution time or memory usage... It is
supposed to improve code maintainability and design/implementation
effort. Theoretically by making it easier to modularize stuff and
reuse...
completely foreign language of OOP repelled me. My work was in analog

OOP is not a "language" per se, but a concept...
But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?
Depending upon the tool set available, OOP doesn't really have
"programmers" writing code... It has analysts designing objects in the
abstract, feeding them to the toolset, and the toolset creates the code
templates to be filled in with actual implementation details.
Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.
The serial port, as exposed by PySerial, "is" an "object"... You can
probably use it without ever designing a class of your own (unlike Java,
Python can be used as a plain procedural language with libraries that
minimize one's exposure to objects).
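
That point can be tried without any hardware: any Python object with the
right methods works the same way. A plain file object has the same
write()/read()/close() shape as pyserial's Serial, so this sketch uses one
as a stand-in (the command string is made up):

```python
import os
import tempfile

# A file object stands in for a serial port here; with pyserial it
# would be opened as serial.Serial("/dev/ttyS0", 9600) (or "COM1"),
# and used with the same write()/read()/close() calls.
fd, path = tempfile.mkstemp()
os.close(fd)

port = open(path, "w")
port.write("SET FREQ 1000\n")   # plain procedural use: no class of our own
port.close()

port = open(path)
echoed = port.read()
port.close()
os.remove(path)

print(echoed)                   # prints: SET FREQ 1000
```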

As for OOAD/OOP... A somewhat poorly developed example would be a
radio:

A radio (object/class) is composed of:
    A tuner (object/class), composed of:
        AGC
        mixer
        VFO
        display
    An audio output, composed of:
        audio amplifier
        volume/tone controls
        speaker(s)
    A power supply
    An antenna

Say you have classes for Analog Tuner and Digital Tuner; as long as
both tuners support the same interface: up/down frequency control,
antenna input, power input, AF output, you can model analog and digital
radios by just changing the tuner component.

Now... the above can be extended further (though not easily with the
analog tuner <G>)... Say you have a real radio with a computer control
interface. In the software, you would create a virtual radio class,
having an interface of the "manual" controls of the real radio. An
instance of this class would respond to things like:
radio.tune_up(Hz)... Inside the class, the tune_up() method would
translate the Hz value to whatever the computer interface needs, and
send the appropriate command to the radio.

As far as a user of this "radio" is concerned, the interface to the
physical radio is transparent -- the radio could be built into the
computer itself, could be a "software defined radio", could be across
the country and accessed over the internet...
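
Rendered as Python, the radio sketch above might read like this (all
class names and numbers are invented for illustration):

```python
class DigitalTuner:
    def __init__(self):
        self.freq_hz = 100_000_000

    def tune_up(self, hz):
        self.freq_hz += hz          # e.g. step a frequency synthesizer


class AnalogTuner:
    """Same up/down interface, different internals -- interchangeable."""
    def __init__(self):
        self.freq_hz = 100_000_000

    def tune_up(self, hz):
        self.freq_hz += hz          # pretend this nudges a varactor bias


class Radio:
    def __init__(self, tuner):
        self.tuner = tuner          # composition: a radio *has* a tuner

    def tune_up(self, hz):
        self.tuner.tune_up(hz)      # forward the "manual control"


# Either tuner slots in; the Radio code never changes.
radio = Radio(DigitalTuner())
radio.tune_up(200_000)
print(radio.tuner.freq_hz)          # -> 100200000
```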
--
Wulfraed Dennis Lee Bieber KD6MOG
(e-mail address removed) (e-mail address removed)
HTTP://wlfraed.home.netcom.com/
(Bestiaria Support Staff: (e-mail address removed))
HTTP://www.bestiaria.com/
 

Simon Hibbs

Chris,

I can fully relate to your post. I trained as a programmer in the 80s
when OOP was an academic novelty, and didn't learn OOP until around
2002. However, now I find myself naturally thinking in OOP terms,
although I'm by no means an expert - I'm a sysadmin who writes the
occasional utility. I found learning OOP with Python very easy because
it has such a stripped-down and convenient syntax.

The advantages of OOP aren't in performance or memory, they're in the
fact that OOP simplifies the ways in which we can think about and
solve a problem. OOP packages up the functionality of a program into
logical units (objects) which can be written, debugged and maintained
independently of the rest of the program, almost as if they were
completely separate programs of their own, with their own data and
'user interface' in the form of callable functions (actually methods).

Here's a really excellent tutorial on Python that's fun to follow.
Downloading and installing Python, and following this tutorial, will
probably take about as long as it took to write your post in the first
place. At the end of it you'll have a good idea how OOP works, and how
Python works. Learning OOP this way is easy and painless, and what you
learn about the theory and principles of OOP in Python will be
transferable to C++ if you end up going in that direction.

I hope this was helpful.

Simon Hibbs
 

Chris Carlen

John said:
Chris Carlen wrote:[edit]
Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.

Why?

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.
I've written extensively in C++, including hard real-time programming
in C++ under QNX for a DARPA Grand Challenge vehicle.

Did the vehicle win?
I have an Atmel
AVR with a cable plugged into the JTAG port sitting on my desk right now.
Even that little thing can be programmed in C++.
Yes.

You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.

That's interesting. But why is this any different than using
preprocessor macros in C?
If it has state and functions, it probably should be an object.
The instances of the object can be static in C++; dynamic memory
allocation isn't required in C++, as it is in Python.

Why? Why is OOP any better at explaining a state machine to a computer?
I can write state machines all over the place in C, which tend to be
the core of most of my embedded programs. I can write them with
hardcoded logic if that seems like the easy thing to do and the
probability of extensive changes is extremely low. They are extremely
easy to read and to code. I have written a table-driven state machine
with arbitrary-length input condition lists. The work was all in
designing the data structures. The code to update the state machine was
about 4 lines.

Why would OOP be better? Different is not better. Popular is not
better. What the academics say is not better. Fewer lines of code might
be better, if the priority is ease of programming. Or, less machine
execution time or memory usage might be better, if that is the priority.

Until I can clearly understand why one or the other of those goals might
better be realized for a given problem with OOP vs. procedures, I just
don't get it.

I will keep an open mind however, that until I work with it for some
time there is still the possibility that I will have some light go on
about OOP. So don't worry, I'm not rejecting your input.
Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.

Yes, I certainly wouldn't consider Python for that.

Thanks for your comments.


 

Evan Klitzke

You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.

This is a bit off topic, but inline has been a keyword in C since C99.
 

Cousin Stanley

....
2. Must be cross-platform: Linux + Windows.

This factor can have a big impact on whether it is necessary
to learn a new language, or stick with C.

If my platform was only Linux I could just learn GTK
and be done with it.
....

Chris ....

The Python bindings for GTK in the form of a mechanism
dubbed PyGTK are also available for Windows and provide
a diverse set of widgets for building GUI applications ....

http://www.pygtk.org

The Applications page lists a rather large and wide variety
of the types of programs that have been built using PyGTK ....

http://www.pygtk.org/applications.html

There is plenty of decent documentation available
and a dedicated newsgroup for assistance if needed ....
 

Neil Cerutti

That's interesting. But why is this any different than using
preprocessor macros in C?

This is OT, however: inline functions have a few benefits over
preprocessor macros.

1. They are type-safe.
2. They never evaluate their arguments more than once.
3. They don't require protective parentheses to avoid precedence errors.
4. In C++, they have the additional benefit of being defined in a
namespace, rather than applying globally to a file.

As an experienced C programmer you're probably used to coping
with the problems of preprocessor macros, and may even take
advantage of their untyped nature occasionally. Even C++
programmers still use them advisedly.
I will keep an open mind however, that until I work with it for
some time there is still the possibility that I will have some
light go on about OOP. So don't worry, I'm not rejecting your
input.

In my opinion OOP is usefully thought of as a type of design
rather than a means of implementation. You can implement an OO
design in a procedural language just fine, but presumably an OO
programming language facilitates the implementation of an OO
design better than does a procedural language.

Going back to the state machine question, and using it as an
example: Assume you design your program as a state machine.
Wouldn't it be easier to implement in a (hypothetical)
state-machine-based programming language than in a procedural
one? I think John was insinuating that a state-machine is more
like an object than it is like a procedure.
 

samwyse

John said:
Chris Carlen wrote:[edit]
Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.

What can this do for me that I can do already with the procedural way
of thinking? Absolutely nothing; it's all Turing machines under the
hood.

Why rearrange my thinking to a new terminology? Because new
terminologies matter a lot. There's nothing that you can do with
pointers that can't be done with arrays; I know because I wrote a lot
of FORTRAN 77 code back in the day, and without pointers I had to
write my own memory allocation routines that worked off of a really
big array.

Likewise, there's nothing that you can do in C that can't be done with
C++ (especially since C++ was originally a preprocessor for C);
however C++ will keep track of a lot of low-level detail for you so
you don't have to think about it. Let's say that you have an embedded
single-board computer with a serial and a parallel port. You probably
have two different routines that you use to talk to them, and you have
to always keep track which you are using at any given time.

It's a lot easier to have a single CommPort virtual class that you use
in all of your code, and then have two sub-classes, one for serial
ports and one for parallel. You'll be especially happy for this when
someone decides that as well as logging trace information to a
printer, it would be nice to also log it to a technician's handheld
diagnostic device.
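
In Python terms the CommPort idea might be sketched like this (all
names are invented for illustration; a real version would wrap the
actual port I/O):

```python
class CommPort:
    """Common interface; subclasses supply the actual transport."""
    def send(self, data):
        raise NotImplementedError


class SerialPort(CommPort):
    def __init__(self):
        self.sent = []
    def send(self, data):
        self.sent.append(data)      # real code would write to the UART


class ParallelPort(CommPort):
    def __init__(self):
        self.sent = []
    def send(self, data):
        self.sent.append(data)      # real code would strobe the port


def log_trace(port, message):
    # Client code sees only the CommPort interface; adding a
    # handheld-diagnostic subclass later would not change this function.
    port.send(message + "\n")


ser, par = SerialPort(), ParallelPort()
log_trace(ser, "boot ok")
log_trace(par, "boot ok")
print(ser.sent)                     # -> ['boot ok\n']
```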
 

Chris Carlen

Neil said:
Going back to the state machine question, and using it as an
example: Assume you design your program as a state machine.
Wouldn't it be easier to implement in a (hypothetical)
state-machine-based programming language than in a procedural
one? I think John was insinuating that a state-machine is more
like an object than it is like a procedure.

I think at this point, I should stop questioning and just learn for a while.

But regarding state machines, I had probably written a few in C in the
past before really understanding that it was a state machine. Much later I
grasped state machines from digital logic. Then it became much clearer
how to use them as a tool and to code them intentionally.

Once I have written a state table, I can implement it using flip-flops
and gates, or in C as either a state variable and a switch statement or
something table driven. The switch code can be written as fast as I can
read through the state table. That's the easiest implementation, but
the least easy to change later unless it's fairly small.

I will be eager to see how to do this in Python.
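
That switch-on-a-state-variable pattern actually carries over to Python
almost unchanged, typically as a dict mapping each state to a handler
function. A minimal sketch, with made-up states and events:

```python
# Each handler returns the next state -- the Python analogue of
# `switch (state)` plus assignments in C.
def in_idle(event):
    return "running" if event == "start" else "idle"

def in_running(event):
    return "idle" if event == "stop" else "running"

handlers = {"idle": in_idle, "running": in_running}

state = "idle"
for event in ["start", "tick", "stop"]:
    state = handlers[state](event)

print(state)   # -> idle
```

A table-driven variant is the same idea with the dict keyed on
(state, event) pairs instead of just the state.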

I have found the comments in response to my doubts about OOP very
encouraging. I will do some learning, and come back when I have more
Python specific problems...

Thanks for the input!


 

Chris Carlen

Simon said:
Sorry, here's the tutorial link:

http://hetland.org/writing/instant-python.html


Simon Hibbs


Thanks Simon. Actually, that's the tutorial that I've started with.

Your comments are encouraging. I'll keep learning.


 

Chris Carlen

Bruno said:
Chris Carlen wrote:
>[edit]
Must possibly learn a completely new way of thinking (OOP)

Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) -, it
doesn't *force* you into OO (IOW : you don't have to define classes to
write a Python program). You can as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are usually a mix of the three.
not just a new language syntax.

You forgot one of the most important parts of a language: idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.

I see. That's very promising. I guess some articles I read painted a
picture of religiosity among OOP programmers. But that is not the
impression I am getting at all on the street.
IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
add a very valuable tool to your toolbox - the missing link between C
and shell scripts.


Thanks for the comments!



 

Bruno Desthuilliers

Chris Carlen wrote:
Bruno said:
Chris Carlen wrote:
Must possibly learn a completely new way of thinking (OOP)


Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) -,
it doesn't *force* you into OO (IOW : you don't have to define classes
to write a Python program). You can as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are usually a mix of the three.
not just a new language syntax.


You forgot one of the most important parts of a language: idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.


I see. That's very promising. I guess some articles I read painted a
picture of religiosity among OOP programmers.

That's alas a common disease - I'd say the best way to be definitely
disgusted with OO is to read comp.lang.object :(
But that is not the
impression I am getting at all on the street.

Heck. As you said, the important thing is to get things done. And I
guess that's why we all (here) love Python. Last time I had to work on
a Pascal program (actually Delphi's ObjectPascal, but the whole thing
was almost caricaturally procedural), I found myself having to write
tens of lines of code for things that would have been no-brainer
one-liners in Python, and define new types (records - Pascal's structs)
where Python's builtin dict type would have done the trick. It's not a
matter of procedural vs OO vs functional, it's a matter of using the
appropriate tool for the job.
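
The records-vs-dict point, as a throwaway example (the field names are
made up):

```python
# In Pascal or C this would need a declared record/struct type;
# in Python a literal dict does the job.
channel = {"name": "ch0", "gain": 2.5, "offset": -0.1}

channel["gain"] *= 2          # fields are just keys
channel["enabled"] = True     # and new fields can be added on the fly

print(channel["gain"])        # -> 5.0
```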
 

Dave Baum

Chris Carlen said:
Why would OOP be better? Different is not better. Popular is not
better. What the academics say is not better. Fewer lines of code might
be better, if the priority is ease of programming. Or, less machine
execution time or memory usage might be better, if that is the priority.

Short answer: Increasing programmer productivity is better, and OO
frequently accomplishes this.

Longer answer:

Consider OOP as one tool in the toolbox. It isn't "best" for every
conceivable problem, but neither is procedural programming, functional
programming, table driven state machines, or any other style of design
and/or programming. Each works well in some situations and poorly in
others. Having a large number of tools at your disposal, and knowing
which ones to use, is a big plus.

Let's talk about design versus implementation for a moment, since OO
really applies to both, but in different ways. You mentioned state
machines, and that is a good example of a design technique. You can
look at a problem and convert it to a state machine (design), then
implement the state machine in whatever language your computer
understands. Over the years, finite state machines have proven to be
very effective models because:

1) there are plenty of real world problems that map cleanly to a state
machine

2) state machines are easy to implement reliably in most computer
languages

3) if you have already implemented a state machine for problem A, then
implementing it for problem B is pretty easy - the real work is
translating problem B into a state machine

OOD is similar. There are a large number of problems for which an
object oriented design is a good fit. Once you have an OO design, you
then need to implement it, and languages with good OO support make this
a lot easier.

From what I have seen, the advantages of OO tend to increase with the
size of the project. For example, polymorphism generally lets M clients
work with N different kinds of objects by writing M+N chunks of code
rather than M*N. When M or N is small, this difference is minor, but as
M and N increase, it becomes significant.
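
A toy illustration of that M+N arithmetic, with M = N = 2 (the shapes
are invented for the example):

```python
# Two kinds of objects (N=2) behind one interface...
class Square:
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side * self.side

class Triangle:
    def __init__(self, base, height):
        self.base, self.height = base, height
    def area(self):
        return self.base * self.height // 2

# ...and two clients (M=2), each written once: 2 + 2 chunks of code,
# not 2 * 2 specialized versions.
def total_area(shapes):
    return sum(s.area() for s in shapes)

def describe(shapes):
    return [f"{type(s).__name__}={s.area()}" for s in shapes]

shapes = [Square(2), Triangle(4, 3)]
print(total_area(shapes))   # -> 10
print(describe(shapes))     # -> ['Square=4', 'Triangle=6']
```

Adding a third shape, or a third client, adds one chunk of code rather
than a whole new row or column of special cases.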

By combining state and function, objects provide a good means of
encapsulating operations and keeping client code independent of lower
level code. This is a very big win since it allows for the evolution of
the lower level code without breaking all of the client code. As with
polymorphism, the benefits of encapsulation tend to increase with the
size of the project.

Even before OO languages were popular, it was quite common to use some
basic OO design in order to increase encapsulation. If you look at
almost any GUI framework from the 80's or early 90's you'll find lots of
C code with structs that the user is not supposed to mess with, and then
functions that take pointers/handles/indexes to those "magic" structs as
the first argument. This is really OO design implemented in a
procedural language. In fact, GUI frameworks are an excellent example
of a domain for which OO has established itself a very good way to model
the problem.

I could probably spend a lot more time on the merits of OO, but I think
if you really want to understand its benefits you need to work with it
in a domain for which OO is useful. It is possible that the specific
projects you work on really wouldn't benefit much from OO, and that is
why you haven't had the "a-ha!" moment. Forcing an OO model onto a
problem just for the sake of OO will only result in frustration.

Dave
 

Wayne Brehaut

Chris Carlen wrote:

=== 8< ===
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

As a few more enlightened have noted in more than one thread here, the
Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
did was define "OOPL", but then didn't notice (apparently--though this
may have been a "convenient oversight") that Simula satisfied all the
criteria so was actually the first OOPL--and at least 10 years earlier
than Smalltalk!

So Kay actually invented NONE of the concepts that make a PL an OOPL.
He only stated the concepts concisely and named the result OOP, and
invented yet another implementation of the concepts-- based on a
LISP-like functional syntax instead of an Algol-60 procedural syntax,
and using message-passing for communication amongst objects (and
assumed a GUI-based IDE) (and introduced some new terminology,
especially use of the term "method" to distinguish class and instance
procedures and functions, which Simula hadn't done).

As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
major ideas in Smalltalk are generally credited to Alan Kay with many
roots in Simula, LISP and SketchPad." Too many seem to assume that
some of these other "features" of Smalltalk are part of the definition
of an OOP, and so are misled into believing the claim that it was the
first OOPL. Or they claim that certain deficiencies in Simula's object
model--as compared to Smalltalk's--somehow disqualifies it as a "true
OOPL", even though it satisfies all the criteria as stated by Kay in
his definition. See http://en.wikipedia.org/wiki/Simula and related
pages, and "The History of Programming Languages I (HOPL I)", for
more details.

Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===
 

Aahz

From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)
Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.

Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.
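
Aahz's point that Python doesn't force an OOP style, but tends to lead you into one, can be illustrated with a small sketch (a hypothetical running-average filter, the kind of signal-conditioning task under discussion, not anything from the thread):

```python
# Procedural style: state (total, count) is passed around explicitly.
def update_average(total, count, sample):
    return total + sample, count + 1

total, count = 0.0, 0
for s in (1.0, 2.0, 3.0):
    total, count = update_average(total, count, s)
print(total / count)  # 2.0

# The same idea drifts naturally into a class once the state and the
# functions that touch it clearly belong together.
class RunningAverage:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, sample):
        self.total += sample
        self.count += 1

    @property
    def value(self):
        return self.total / self.count

avg = RunningAverage()
for s in (1.0, 2.0, 3.0):
    avg.add(s)
print(avg.value)  # 2.0
```

Nothing in the second version is required by the language; it is the same arithmetic with the bookkeeping bundled up, which is roughly how the transition Aahz describes tends to happen.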
 
T

Tony23

Chris said:
John said:
Chris Carlen wrote:[edit]
Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.

Why?

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.

I have been programming since 1978. I started off with BASIC, learned
Assembly and Pascal, and much later eventually moved on to JavaScript,
Perl, and PHP. All of my work was done procedurally.

Recently, I have been working on a very large project involving a lot of
OO JavaScript. For what we are doing on the project, OO makes sense. I
really didn't get OOP until working on this project - probably because I
never did anything that really needed it.

I have found myself leaning more toward the OO paradigm since doing
this, after 25+ years of procedural programming, and now I find myself
doing more work with OO concepts, and getting things done even faster,
and with less work, than I used to.

But I still have a problem with STRICT OOP - which is why I like Python.
Use OO where it's useful, use procedural when that works best.

I suspect that the reason it isn't clicking for you is twofold: 1) You
don't do anything currently that has an obvious need for OOP, and 2) You
haven't done anything with OOP.

A couple ideas:

1) Maybe you can try building a relatively trivial program that would
more naturally use an OO methodology - perhaps a simple videogame like
Pac-man? The 'monsters' would be objects, with properties such as color,
X-position, Y-position, etc. - make yourself work in OO terms

2) This may seem silly, but download & play with "Scratch"
(http://scratch.mit.edu) - it's basically an introduction to programming
for kids, but it's completely OO, and super easy to use. It might be
useful to help you to see the 'grand view' better.

3) Give in to the dark side :)
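
The monster idea in (1) could be sketched in Python like this (all names here are hypothetical, just to make the suggestion concrete):

```python
# Hypothetical sketch of idea (1): each monster is an object that
# bundles its own state (position, color) with its behavior (move).
class Monster:
    def __init__(self, name, color, x=0, y=0):
        self.name = name
        self.color = color
        self.x = x
        self.y = y

    def move(self, dx, dy):
        # A real game would also check walls and collisions here.
        self.x += dx
        self.y += dy

    def __repr__(self):
        return f"{self.name}({self.color}) at ({self.x}, {self.y})"

# Four monsters, one class. The procedural equivalent would be
# parallel arrays of x's, y's, and colors updated by free functions;
# here each monster carries its own data and the code that updates it.
monsters = [Monster("Blinky", "red"), Monster("Pinky", "pink"),
            Monster("Inky", "cyan"), Monster("Clyde", "orange")]
for m in monsters:
    m.move(1, 0)
```

The payoff Tony23 is pointing at shows up when the monsters stop being identical: a subclass can override `move` with different chase behavior without touching the loop that drives them all.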

Good luck - after so much time invested in one way of thinking, it's not
easy to change.
 
C

Chris Carlen

Aahz said:
For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.


Newbie. ;-)

(I started with BASIC in 1976.)

Heh heh, I actually first programmed when the RadioShack TRS-80 came
out. I think I saw it first in 1978 when I was 11. I would hang out in
the store for hours writing crude video games.
My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.

Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.


Thanks for the input!


--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
(e-mail address removed)
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.
 
