Python was designed (was Re: Multi-threading in Python vs Java)


Steven D'Aprano

>>> int = "five"
>>> [int(i) for i in ["1","2","3"]]
TypeError: 'str' object is not callable

Now how are you going to get the original int type back?


Trivially easy:

py> int
<type 'int'>
py> int = "five" # Oops!
py> int(42.5)
Traceback (most recent call last):
File "<stdin>", line 1, in ?
TypeError: 'str' object is not callable
py> del int
py> int(42.5) # Phew!
42


Thank you for bringing this back up. Was it you who suggested that
built-ins are re-assignable?


It's not just a suggestion, it is a fact. The built-ins are just a
namespace, like any other module, class, or other namespace.

(Of course, if you break something in the built-ins, the consequences are
likely to be significantly more wide-ranging, but that's another issue.)

However, in the code shown above, you don't actually reassign a built-in.
You merely shadow it within the current module. Do you understand the
difference? In the above, the *builtin* int still exists, but your code
can no longer get direct access to it because a *global* int gets in the
way. Using Python 2.7:

py> import __builtin__ as builtins
py> builtins.int
<type 'int'>
py> int = "five"
py> int
'five'
py> builtins.int
<type 'int'>

So deleting the global "int" simply reveals the otherwise hidden
builtins.int instead.

However, if you rebind the builtin, Python doesn't remember what the old
value was (although in principle it could):

py> del int # get rid of the global
py> int is builtins.int
True
py> builtins.int = "six" # oh oh, this could be bad
py> int
'six'
py> del int
Traceback (most recent call last):
File "<stdin>", line 1, in ?
NameError: name 'int' is not defined



In this case, deleting the builtin doesn't magically recover it, it just
deletes it:

py> del builtins.int
py> int
Traceback (most recent call last):
File "<stdin>", line 1, in ?
NameError: name 'int' is not defined


At this point, in general, you've buggered up the current Python
environment and would normally need to restart the interpreter. But in
this specific case, all is not quite so lost: we can recover from this if
only we can find another reference to the int built-in type, and restore
it to the builtins:


py> builtins.int = type(42)
py> int("23")
23


I see no reason why Python couldn't create a read-only "backup builtins"
namespace, but on the other hand, why bother?

Because this is a bad idea for the reasons I just showed.

"Breaking things" is always a bad idea. But re-binding is not necessarily
a bad thing. Let's say I want to find out how often the built-in "len"
function is called by some module:


py> count = 0
py> def mylen(x):
...     global count
...     count += 1
...     return _len(x)
...
py> _len = len  # save the real len
py> builtins.len = mylen  # monkey-patch the builtins
py> import mymodule
py> count
3

Now obviously this is a trivial example. But there are more interesting,
and useful, reasons for monkey-patching builtins, usually for testing and
debugging purposes. Such a technique should be used with caution, but it
can be used.
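
If you do this a lot, it's worth wrapping the save-and-restore dance up
so that the original can never be lost. Here's a minimal sketch of such
a helper, assuming Python 3's builtins module (under 2.x, import
__builtin__ instead):

import builtins

class patched_builtin:
    """Temporarily replace a builtin, restoring it on exit."""
    def __init__(self, name, replacement):
        self.name = name
        self.replacement = replacement
    def __enter__(self):
        # Save the original, so the rebinding is always reversible.
        self.original = getattr(builtins, self.name)
        setattr(builtins, self.name, self.replacement)
        return self
    def __exit__(self, *exc_info):
        # Restore even if the block raised an exception.
        setattr(builtins, self.name, self.original)

calls = []
_len = len  # keep a reference to the real len
with patched_builtin("len", lambda x: calls.append(x) or _len(x)):
    len("abc")  # goes through the wrapper
assert len is _len  # the real len is back
assert calls == ["abc"]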
 

Steven D'Aprano

Even Python, which isn't strongly typed

I see that in a later message you have stepped away from that
misconception, but I think it is well worth reading this essay:

https://cdsmith.wordpress.com/2011/01/09/an-old-article-i-wrote/

previously known as "What To Know Before Debating Type Systems".


I think the author goes a little too far to claim that "strong" and
"weak" are meaningless terms when it comes to type systems. I think it is
quite reasonable to accept that there is no hard and fast line dividing
"strongly" and "weakly" typed languages, without concluding that the
terms are meaningless. I think it is reasonable to say that Haskell has a
very strong type system, since it will (almost?) never allow any
operation on an unexpected type, or automatically convert one type to
another. Pascal is a little weaker, since it will automatically convert
numeric types but nothing else. Perl and PHP are a lot weaker, since they
will convert strings to numbers and vice versa. If you want to draw the
line between "strong" and "weak" so that Pascal is on one side and Perl
on the other, that seems reasonable to me.
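
To make the spectrum concrete with Python itself: it will not coerce
strings to numbers the way Perl or PHP will, but like Pascal it converts
freely between numeric types. A quick 2.x-style session (the exact
wording of the error varies between versions):

py> "1" + 1
Traceback (most recent call last):
File "<stdin>", line 1, in ?
TypeError: cannot concatenate 'str' and 'int' objects
py> 1 + 2.5  # int is quietly widened to float
3.5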

One thing he missed is that there are untyped languages where everything
is the same type. If everything is the same type, that's equivalent to
there being no types at all. Examples include TCL and Hypertalk, where
everything is a string, and Forth, where everything is a two-byte word.

But I digress. Apart from those couple of little criticisms, I think it
is a very useful article to read.
 

rusi

I learned a new word yesterday: ultracrepidarian. :)

Hehe!
And if you had uttered 'ultracrepidarian' before yesterday you would have been
ultracrepidarian. After that not.
[With frank and free respect to the power of google and cut-n-paste]
 

Chris Angelico

One thing he missed is that there are untyped languages where everything
is the same type. If everything is the same type, that's equivalent to
there being no types at all. Examples include TCL and Hypertalk, where
everything is a string, and Forth, where everything is a two-byte word.

But I digress. Apart from those couple of little criticisms, I think it
is a very useful article to read.

Continuing the digression slightly: If everything's a string, how do
you handle aggregate types like arrays? Are they outside the type
system altogether (like in C, where an array-of-int isn't something
you can pass around, though pointer-to-int is)? The only language I've
worked with that has "everything is strings" is REXX, and it does some
fancy footwork with variable names to do mappings, with a general
convention around the use of stem.0 to create ersatz arrays (probably
how JavaScript got the idea).

ChrisA
 

Steven D'Aprano

Continuing the digression slightly: If everything's a string, how do you
handle aggregate types like arrays? Are they outside the type system
altogether (like in C, where an array-of-int isn't something you can
pass around, though pointer-to-int is)?

I don't know about TCL, but in Hypertalk, when I said everything is a
string, I meant it. If you want a list of strings, you create one big
string using some delimiter (usually spaces, commas or newlines). So I
might say something like:

# it's been a few years, I may have some of the syntax wrong
put field "list of stuff" into text
for i = 1 to the number of lines of text:
    get line i of text
    if word 3 of item 6 of it is "stop" then do_stop()
    else do_start(word 1 of item 2 of it)

Hypertalk uses (almost) natural language chunking: lines are chunks of
text separated by newlines, items are separated by commas, and words are
separated by spaces. So you can easily implement up to three dimensional
arrays:

a b,c d,e f
g h,i j,k l
m n,o p,q r

is a list of three lines, each line having three items, each item having
two words. (Oh, and there's one more layer of chunking: the character or
char. Guess what that does?)
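
The same chunking is easy to model in Python, if that helps (a rough
sketch of the idea, not real Hypertalk -- and note that Hypertalk counts
from 1 where Python counts from 0):

py> text = "a b,c d,e f\ng h,i j,k l\nm n,o p,q r"
py> line2 = text.split("\n")[1]  # line 2 of text
py> line2
'g h,i j,k l'
py> item3 = line2.split(",")[2]  # item 3 of line 2
py> item3
'k l'
py> item3.split(" ")[0]  # word 1 of item 3 of line 2
'k'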


Actually, perhaps it's not quite true that everything is a string.
Hypertalk also has fields, which are text fields in the GUI environment.
Fields have properties such as the textsize and the font, as well as
contents, which are strings. There are also buttons, which don't have
contents, although some of them can have state like On or Off. There are
cards, which contain fields and buttons, and backgrounds, which contain
cards, and stacks, which contain backgrounds. So it actually was rather
object-oriented in a way, but the objects were completely tied to the GUI
environment. You couldn't directly create an abstract field object,
instead you treated it like a macro playback and did things like this:

choose menu item "New Field" from "Tools" menu
set the name of field "New Field" to "foo"
set the rect of field "foo" to 25,25,100,200

or if you were really keen, or perhaps foolish:

select field tool
drag from 25,25 to 100,200
set the name of field (the number of fields) to "foo"


Despite its flaws, it was great fun to program in, and the best
integrated GUI programming environment I've ever seen, by far.
 

Chris Angelico

I don't know about TCL, but in Hypertalk, when I said everything is a
string, I meant it. If you want a list of strings, you create one big
string using some delimiter (usually spaces, commas or newlines).

Fair enough. As a system, that works reasonably cleanly... if a little
inefficiently, since you need to delimit everything. But hey, your
arrays are first-class objects by definition, and that's a good thing!

ChrisA
 

Peter Cacioppi

I think the author goes a little too far to claim that "strong" and
"weak" are meaningless terms when it comes to type systems.

I can live with that, actually.

The important language classifications are more along the lines of static vs. dynamic typing, procedural vs. functional, no objects vs. object based vs. true OO.

That probably starts another flame war, but this thread is already running around with its hair on fire.

I still say that object-based is a distinct and meaningful subset of object-oriented programming. The former can be implemented elegantly in a wide range of languages without much in the way of specific language support, the latter needs to be designed into the language to allow a modicum of polymorphic readability.

It's an important distinction, because a project that is constrained to C should (IMHO) target an object-based design pattern but not an object-oriented one. That said, I'm open to disputation-by-example on this point, provided the example is reasonably small and pretty. (If the only polymorphic C code is ugly and non-small, it sort of proves my point).
 

Mark Lawrence

I can live with that, actually.

The important language classifications are more along the lines of static vs. dynamic typing, procedural vs. functional, no objects vs. object based vs. true OO.

That probably starts another flame war, but this thread is already running around with its hair on fire.

I still say that object-based is a distinct and meaningful subset of object-oriented programming. The former can be implemented elegantly in a wide range of languages without much in the way of specific language support, the latter needs to be designed into the language to allow a modicum of polymorphic readability.

It's an important distinction, because a project that is constrained to C should (IMHO) target an object-based design pattern but not an object-oriented one. That said, I'm open to disputation-by-example on this point, provided the example is reasonably small and pretty. (If the only polymorphic C code is ugly and non-small, it sort of proves my point).

As far as I'm concerned all of the above belongs on
comp.theoretical.claptrap, give me practicality beats purity any day of
the week :)

--
Roses are red,
Violets are blue,
Most poems rhyme,
But this one doesn't.

Mark Lawrence
 

Peter Cacioppi

give me practicality beats purity any day of the week :)

Without some notion of theory you will end up with php instead of python (see how I looped the thread back around on track ... you're welcome).

If you think php is no worse than python for building reliable, readable code bases then God love you. Readability is huge for allowing efficient team development of larger projects, and readability flows from these sorts of discussions.
 

rusi

I still say that object-based is a distinct and meaningful subset of
object-oriented programming.

Yes, that is what is asserted by
http://www-public.int-evry.fr/~gibson/Teaching/CSC7322/ReadingMaterial/Wegner87.pdf
-- a classic though old reference.

The former can be implemented elegantly in a wide range of languages without much in the way of specific language support, the latter needs to be designed into the language to allow a modicum of polymorphic readability.

Three examples were given: (1) Python's C implementation, (2) OS/2, (3) the Linux kernel.
About (2) I don't know anything, though I believe GDK and GObject are more contemporary examples.

About (1) I have reservations -- see below.

IMO the Linux kernel is the closest approximation to what you are asking for.
The details are here: http://lwn.net/Articles/444910/
The top-level summary is in the opening paras of
http://lwn.net/Articles/446317/

It's an important distinction, because a project that is constrained to C
should (IMHO) target an object-based design pattern but not an
object-oriented one. That said, I'm open to disputation-by-example on this
point, provided the example is reasonably small and pretty. (If the only
polymorphic C code is ugly and non-small, it sort of proves my point).

Yes, this is an important point though hard to argue in a definitive way -- I call such arguments philosophical rather than scientific; i.e. it is important but it can't really be settled once and for all.

To see this one can start with two extremes:
Extreme 1: Computability (aka Turing) theory. From this pov every language/system/computer is equivalent to every other, and people designing 'newer' and 'better' ones are wasting their own and others' time, just like fashion designers who keep alternating pant-hems from Elvis Presley to narrow.

Extreme 2: Semicolon as separator differs from semicolon as terminator;
P4 processor is different from P2 etc etc -- essentially treating any difference as a substantive difference.

Clearly both extremes are unsatisfactory: the first lands us in the Turing tar-pit; the second makes a discussion of close-but-different impossible.

Just as the study of algorithms arose out of a desire to study program efficiency but with the nitty-gritty details of machines abstracted away, in the same way programming language semantics arose in order to study broad classes of languages with details hidden away.

Unfortunately, even after 50 years of trying, semantics has been a dismal failure in defining the what and where and whither of OOP.
In a sane world this would have signified that perhaps OOP as a concept needs to be questioned.
Considering that the opposite has happened -- programming language semantics as an area has become distinctly 'old-fashioned' and not-so-respectable -- I can only conclude that the world is not sane.

Well the tide is slowly turning -- here's a growing bunch of people questioning the validity of OOP:
http://en.wikipedia.org/wiki/Object-oriented_programming#Criticism

Of these I find two noteworthy:
1. Stepanov, who in C++ circles is next only to Stroustrup, calls OOP a hoax.
2. Carnegie Mellon University has eliminated OOP as "unsuitable for a modern CS curriculum".

And which is why I sympathize with Mark Janssen's passion to clean up the OOP act.
 

Chris Angelico

Three examples were given: (1) Python's C implementation, (2) OS/2, (3) the Linux kernel.
About (2) I don't know anything, though I believe GDK and GObject are more contemporary examples.

Good point, I believe you're right there. I haven't worked with
GTK/GDK in C, but when poking around the docs to work out how to use
them in Python or Pike, I've seen something of how it's done. And yes,
they're polymorphic and properly object oriented.

ChrisA
 

Steven D'Aprano

I don't know if I want to step into the flames here,

Go on, be bold! You learn a lot by making bold claims and having them
shot down. Or at least, I did. Now I know everything, so I can afford to
be humble.

*wink*

but my
understanding has always been that in the absence of polymorphism the
best you can do is "object based" programming instead of "object
oriented" programming.

Well, that surely depends on the semantics of what you mean by "object
based" versus "object oriented", and I don't think there is any one hard,
universally agreed upon definition of those.

Object based programming is a powerful step forward: the insight that by
associating data structures and methods together you can significantly
improve readability and robustness.

This implies that "object-based" simply means that you have *syntax* for
associating methods with data, i.e. objects. I don't think I would agree
with that definition. For instance, I often describe Python as "object-
based" in the sense that *all* values in Python are objects, even things
which would be low-level primitives in some other languages, although you
can still write procedural, imperative or functional-style code.
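
For instance, even an integer literal or a builtin function is a
full-blown object (a quick illustration using the same 2.7 session style
as above):

py> (42).bit_length()  # ints have methods
6
py> isinstance(42, object)
True
py> type(len)  # the builtins themselves are objects too
<type 'builtin_function_or_method'>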

Object oriented programming takes things further, most significantly by
introducing the idea that the object reference you are referencing might
be a run time dependent sub-class.

And I *strongly* disagree with this. I wonder whether you have read this?

http://en.wikipedia.org/wiki/Object-oriented_programming#Fundamental_features_and_concepts

Quote:

Benjamin C. Pierce and some other researchers view any attempt
to distill OOP to a minimal set of features as futile. He
nonetheless identifies fundamental features that support the
OOP programming style in most object-oriented languages:
[list of five features]

Similarly, in his 2003 book, Concepts in programming languages,
John C. Mitchell identifies four main features: [...] Michael
Lee Scott in Programming Language Pragmatics considers only
[three features]


It is notable that polymorphism is *not* one of the three features listed
by Scott (although it is included by the other two). So I don't agree
that subtype polymorphism is necessary for OOP.

I can easily conceive of object-oriented languages with inheritance but
no subtype polymorphism. For instance, prototype-based OOP languages have
inheritance, but since they don't really have types in the class-based
OOP sense, they don't have subtypes, hence no subtype polymorphism.
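
To make that concrete, here's a toy sketch of the prototype idea in
Python (illustrative only -- real prototype languages like Self or
JavaScript differ in the details): attribute lookup delegates to a
parent *object*, so behaviour is inherited without any class or subtype
relationship existing at all.

class Proto:
    """A minimal prototype-style object: no classes, just delegation."""
    def __init__(self, parent=None, **slots):
        self.__dict__.update(slots)
        self.parent = parent
    def __getattr__(self, name):
        # Called only when normal lookup fails: walk the prototype chain.
        if self.parent is None:
            raise AttributeError(name)
        return getattr(self.parent, name)

animal = Proto(noise="some noise")
dog = Proto(parent=animal)  # "inherits" noise by delegation
print(dog.noise)            # some noise -- and no subtype in sight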


Even Python, which isn't strongly typed,

That's not the case, although that's been discussed in other posts.

manages polymorphism by allowing the self argument to a sub-class
of the method class.

I must admit I don't really understand what this sentence is supposed to
mean.

There are many wonderful examples of object based programming in C. I
believe VB (not VB.net, the older VBA language) is object based but not
object oriented.

True object oriented programming
http://en.wikipedia.org/wiki/True_Scotsman_fallacy


seems to require proper support from
the language itself, because the run-time resolution of the "this/self"
reference needs specific constructs in the language.

Again, I don't understand what you are trying to say here. Provided that
the "this/self" reference has a type, what more does the language need to
provide? The reference itself is enough to identify the instance (since
it is the instance!) and the instance's type is enough to identify the
type (since it is the type!).

Bear in mind that my usual disclaimer when wading into the flames like
this is to quote Randy Newman ... "I may be wrong .... but I don't think
so!!"

:)
 

Chris Angelico

Go on, be bold! You learn a lot by making bold claims and having them
shot down.

Yes, it's a very effective technique. I just learned another meaning
of the word "trepan" via Savoynet that way. (It's a synonym for its
anagram "entrap", as well as being a surgical operation on the skull.
So now you know, too!)

I must admit I don't really understand what this sentence is supposed to
mean.

As I understand it, there's a little word missing: "... allowing the
self argument to BE a subclass...". That is, in this example:

class A:
    def foo(self):
        return "spam"

class B(A):
    pass

x = B()
print(x.foo())

the method named foo and defined in class A might not get, as its
'self' argument, an instance of class A, but might instead get an
instance of a subclass thereof. Thus, polymorphism. Similarly, this C
example cheats a bit, but does work:

struct A
{
    /* ... declare members here */
};

struct B
{
    struct A base;  /* must be first, so a B* can be treated as an A* */
    /* ... more members */
};

int foo(struct A *self)
{
    /* ... */
    return 0;
}

int main()
{
    struct B obj;
    foo((struct A *)&obj);
    return 0;
}

It depends on the compiler not tinkering with the layout of the
structure at all, which I don't believe is guaranteed but is fairly
safe to assume. (The equivalent C++ code could use real inheritance,
and then it is guaranteed, plus the pointer can be cast implicitly.
But we already know C++ does object oriented code more cleanly.) As
far as foo() is concerned, it's been given a 'struct A', albeit one
with a few extra members after it.

Again, I don't understand what you are trying to say here. Provided that
the "this/self" reference has a type, what more does the language need to
provide? The reference itself is enough to identify the instance (since
it is the instance!) and the instance's type is enough to identify the
type (since it is the type!).

See above C example - except that true support would include implicit
upcasting, and would thus disallow cross-casting (which the C example
above would have problems with - you could cast any pointer to any
type with the exact same syntax and no compiler warning or error).

ChrisA
 

Steven D'Aprano

Yes, that is what is asserted by
http://www-public.int-evry.fr/~gibson/Teaching/CSC7322/ReadingMaterial/Wegner87.pdf
-- a classic though old reference.

The truth of a definition is not really something that is amenable to
proof, only to agreement. In my experience, there is no widespread
agreement on what the terms "object oriented" or "object based"
programming mean. I expect that most people would consider them synonyms,
or might consider the first to have some rigorous meaning while the
second is just an informal term for a language that in some sense is
based on objects in some way.

Even if we agreed that there was a distinction between the two -- and I
don't think we do -- there is certainly no agreement as to what that
distinction actually is. There are far too many mediocre programmers with
limited experience outside of their narrow field who assume that whatever
sliver of C++/Java/Python/whatever that they learned is the One True
Definition of object-oriented programming. And too many academics looking
for hard boundaries between concepts which, fundamentally, exist in a
continuum.

It's all just part of the human tendency to pigeon-hole. According to
some, Java, which has many low-level machine primitive types, is an
object-oriented language, while Python, which has no machine primitives
and where every value is an object, is not. Explain that one, if you can.



[...]
Define ugly :)

One of the reasons multiple languages exist is because people find that
useful programming idioms and styles are *hard to use* or "ugly" in some
languages, so they create new languages with different syntax to make
those useful patterns easier to use. But syntax is not everything.
Whether you write:

object.method(arg) // Python, VB, Ruby, Java
object#method arg // OCaml
object:method arg // Lua
method object arg // Haskell, Mercury
object method arg // Io
object->method(arg) // C++, PHP
method(object, arg) // Ada, Dylan
send method(arg) to object // XTalk family of languages


etc. does not really change the fact that you are calling a method on an
object, despite the change in syntax. Even Forth has frameworks that let
you write object-[oriented|based] code using a stack and reverse Polish
notation syntax.


[...]
Just as the study of algorithms arose out of a desire to study program
efficiency but with the nitty-gritty details of machines abstracted
away, in the same way programming language semantics arose in order to
study broad classes of languages with details hidden away.

I don't think that programming language semantics arose so much in order
to *study* languages, more to *program*. Programming languages pre-date
the study of programming languages :)

Unfortunately, even after 50 years of trying, semantics has been a
dismal failure in defining the what and where and whither of OOP. In a
sane world this would have signified that perhaps OOP as a concept
needs to be questioned. Considering that the opposite has happened --
programming language semantics as an area has become distinctly
'old-fashioned' and not-so-respectable -- I can only conclude that the
world is not sane.

All the words are in English, yet somehow I can't quite make sense of
this paragraph. You seem to be using "semantics" in a way that doesn't
quite make sense to me. To me, "programming language semantics" is the
*meaning* of code: "var = func(arg)" is the syntax, "call func with arg
and assign the result to var" is the semantics. What do you mean by it?

Well the tide is slowly turning -- here's a growing bunch of people
questioning the validity of OOP:
http://en.wikipedia.org/wiki/Object-oriented_programming#Criticism

*Criticism* is not the same as questioning the validity. Certainly I can
criticise OOP:

- if you care about machine efficiency, shoving small ints into a bulky
object wrapper is bad for machine efficiency;

- ravioli code is to OOP as spaghetti code was to BASIC;

- large, deep object hierarchies are complex and hard to understand and
learn;

etc. But then I can also criticise functional programming, declarative
programming, imperative programming, logic programming, etc. There is no
One True Programming Paradigm suitable for every task, just as there is
no One True Programming Language.

Of these I find two noteworthy:
1. Stepanov, who in C++ circles is next only to Stroustrup, calls OOP a hoax.
2. Carnegie Mellon University has eliminated OOP as "unsuitable for a
modern CS curriculum".

I can't imagine what Stepanov means by calling OOP a hoax. It certainly
exists. The fact that he equates it with AI as a hoax speaks more about
him than about OOP -- AI is quietly and without fanfare changing the
world. It's not just showy, high-profile stuff like the fact that the
best chess masters in the world are now machines, but little stuff, often
astonishingly important, that people don't necessarily associate with AI:

- autonomous weapons
- natural language parsing
- speech recognition
- data mining

etc. I think what happens is that whenever AI leads to progress, some
people redefine that area as "not AI". Or they presume that anything less
than HAL or C-3PO means AI is a failure.

As far as Carnegie Mellon University, in isolation that doesn't say much.
It could mean that they've decided to shift to more academically
fashionable languages, like Mercury and Haskell; or it could mean that
they've decided that the business of "a modern CS curriculum" is to churn
out as many half-trained PHP monkeys as possible. (Substitute Java for
PHP, and that's the philosophy of a lot of CS departments.)

Sure enough, reading the quote in full, the answer is that they've
shifted to teaching functional languages, which is great for academic
rigor and not so great for either programmer productivity or machine
efficiency.
 

Roy Smith

Steven D'Aprano said:
According to
some, Java, which has many low-level machine primitive types, is an
object-oriented language, while Python, which has no machine primitives
and where every value is an object, is not. Explain that one, if you can.

That's easy to explain. Python doesn't have private data, therefore it
isn't OO. Well, according to the Java-esque gospel, at any rate.

Whenever somebody tells you what OO means, ask him what part of the
elephant he's touching.
 

rusi

That's easy to explain. Python doesn't have private data, therefore it
isn't OO. Well, according to the Java-esque gospel, at any rate.

Whenever somebody tells you what OO means, ask him what part of the
elephant he's touching.

Nice one! Complementarily, one can ask: what does a typical Picasso cubist painting,
e.g. http://www.artchive.com/artchive/p/picasso/cadaques.jpg
represent? The avant-gardists will wax eloquent.
And some (of us philistines) will of course say: nothing.
Just as there may be an elephant which we are too blind to see, it's good to keep in mind the possibility that there may be big words without anything actually there.

Colloquially known as the emperor's new clothes.

All the words are in English, yet somehow I can't quite make sense of
this paragraph. You seem to be using "semantics" in a way that doesn't
quite make sense to me. To me, "programming language semantics" is the
*meaning* of code: "var = func(arg)" is the syntax, "call func with arg
and assign the result to var" is the semantics. What do you mean by it?

Yes, it's always good to be skeptical about the emperor's new clothes…

I thought I'd point to wikipedia http://en.wikipedia.org/wiki/Semantics_(computer_science) but it's doing a less than stellar job of it, so here are my thoughts:


Programming language semantics categorises roughly into:
1. Operational
2. Denotational
3. Axiomatic
4. Algebraic

Some historical context:

1. proof theory vs model theory

In logic, the proof theorists want to study how to prove or reason about something.
The model theorists, more platonically inclined, want to say what something *is*.
This corresponds to the philosophical divide between epistemology and ontology/metaphysics.
The latter can be more 'grand' but is therefore subject to the most penetrating criticism.
Historically, the big contributions of Aristotle and Plato were under the title of metaphysics, and the big contributions of Kant and Hume were to show that for the most part metaphysics is bullshit.
In short, these kinds of arguments are a bit older than you and me!!

In programming language research, the model theorists tend towards denotational semantics;
the proof theorists tend towards axiomatic and algebraic semantics;
the implementers come from the operational camp.

2. programs ∈ languages

To me, "programming language semantics" is the *meaning* of code.

Yes, this is an eminently sensible pov (and has precedents like Hoare, Dijkstra, Floyd etc. -- the 'axiomatic' guys). However, you cannot really talk of the meaning of programs without talking of the meaning of programming languages. Clearly this -- the meaning of the language -- is a grander project and, just as in philosophy, is as liable to be called out as bullshit. As an example, the recurring argument about what a variable IS indicates the crying need to discuss this more metaphysically -- i.e. denotational-semantically -- because if we don't, the result is just a mess:

- in C a variable *means* this
- in python it means that
- in math it means something
- etc

But then I can also criticise functional programming, declarative programming,
imperative programming, logic programming, etc. There is no One True Programming
Paradigm suitable for every task, just as there is no One True Programming Language.

Analogy:

When we deal with an existing code-base, we often find flaws/issues/gaffes/whatever in the design or architecture. On the other hand, sometimes we find there is no design or architecture whatsoever.
I believe there is a big difference between the two. In PLs, think php.

I think Stepanov (and most of the others at http://en.wikipedia.org/wiki/Object-oriented_programming#Criticism ) is making exactly this distinction.

My own view in summary: between programming and mathematics, mathematics is the prior. This is obviously true as a matter of history, including the name 'computer' and all its modifications. To the extent that "ontogeny recapitulates phylogeny", this is true in fact as well.

The OOP aficionados are straining to break their umbilical cord with X. That X may -- depending on context/background -- be called math, logic, theory, philosophy, semantics, academics, etc.
That name is less relevant than the lack of consideration: where will our sustenance come from?

And I am reminded of Mark Twain:
------
When I was 13 years old I thought my father an utter fool. When I became 21 I was amazed at how much the old man had learnt in 8 years.
------

Do you want to make a date for opening this discussion again after 8 years <wink>??

Here's my side of the bet: OOP will be as respectable in 2021 as the goto is today.

Saying which reminds me of a personal memory. My teacher of programming once told me:
What the goto does to control structure, the assignment does to data structure.
Much of my trajectory as a programmer and teacher of programming is an elaboration of that statement.
 

Peter Cacioppi

I've written a fair bit of code in pure C, C++, C#, Java and now getting there in Python.

The difference between C# and Java is fairly minor.

The others have large and significant differences between them. Garbage collectors or not is huge. Exceptions or not is huge. Dynamic or static typing is huge. Language support for polymorphism or not is huge.

C code involves a very meaningful overhead of memory management. The task of ensuring that memory doesn't leak is far larger here than in garbage-collected languages, and much more difficult than in C++ (which can guarantee stack-based destruction).

This is just one language feature. I could go on and on. The idea that the differences between these languages are just syntactic sugar and aesthetics is so profoundly misguided that I can only assume that this misconception was proposed as a bizarre form of trolling.
 

Chris Angelico

One of the reasons multiple languages exist is because people find that
useful programming idioms and styles are *hard to use* or "ugly" in some
languages, so they create new languages with different syntax to make
those useful patterns easier to use. But syntax is not everything.
Whether you write:

object.method(arg) // Python, VB, Ruby, Java
object#method arg // OCaml
object:method arg // Lua
method object arg // Haskell, Mercury
object method arg // Io
object->method(arg) // C++, PHP
method(object, arg) // Ada, Dylan
send method(arg) to object // XTalk family of languages


etc. does not really change the fact that you are calling a method on an
object, despite the change in syntax. Even Forth has frameworks that let
you write object-[oriented|based] code using a stack and reverse Polish
notation syntax.

There seems to be a school of thought that "if it doesn't have special
syntax, the language doesn't support it". This is true to an extent
(eg C doesn't support sockets, the code is all in the standard
library), and it does distinguish C from C++ in object orientation
(C's version is simply dedicating the first function argument to
'self', where C++ and Python and so on have an alternative syntax that
puts the self/this argument before the function name), but I don't
know that it's the whole story. Python 2.x has special (and somewhat
magical) syntax for file I/O; Python 3.x, as far as I know, has none -
open and print are just builtins, with no magic around them. Which is
better? I would say that standard functions are inherently more
flexible (you can shadow print and call the original, for instance),
but that would mean that the better way is to NOT "support" I/O.
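
For instance, print is just a name in the builtins namespace, so a
module can shadow it and still reach the original (a small Python 3
sketch of that flexibility):

import builtins

def print(*args, **kwargs):
    builtins.print("[log]", *args, **kwargs)

print("hello")  # [log] hello
del print       # drop the shadow...
print("hello")  # hello -- the builtin shows through again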

If an analogy helps, let's look at the trading card game Magic: The
Gathering. The general design principle is that unusual effects get
written out on the card, rather than having actual rules to support
them; rulebook text is used only when printing abilities on the card
is unable to do everything. (There are also keyword abilities, which
are like Python's builtins - "fear" simply means "can't be blocked
except by artifact and/or black creatures", and it could just as well
be written out on the card.) Does this mean that Magic supports
lifelink (which has entries in the Comprehensive Rulebook) but doesn't
support defenders (which simply can't attack)? I think not; the game
is most definitely designed for both.

Granted, C probably wasn't designed with object orientation in mind;
but it doesn't take very much effort to make it work (just pointer
casts and some argument order conventions). Sure, a bit of real
support helps (preventing accidental cross-casts), but it's pretty
easy to manage without. Adding to the list of object oriented C
systems is NIH:

http://ifdeflinux.blogspot.com/2012/05/quick-libnih-tutorial.html

It's small enough and light enough to be used by really early systems
like Upstart (a Linux init daemon) and it's a garbage-collected
object-oriented C library.

ChrisA
 

Chris Angelico

I've written a fair bit of code in pure C, C++, C#, Java and now getting there in Python.

The difference between C# and Java is fairly minor.

The others have large and significant differences between them. Garbage collectors or not is huge. Exceptions or not is huge. Dynamic or static typing is huge. Language support for polymorphism or not is huge.

This is just one language feature. I could go on and on. The idea that the differences between these languages are just syntactic sugar and aesthetics is so profoundly misguided that I can only assume that this misconception was proposed as a bizarre form of trolling.

I don't think anyone's denying that there are differences. If there
weren't, we'd all be using the same language! But most of what you
said isn't object orientation. Garbage collection is huge, but it's
nothing to do with OOP. Exceptions are completely separate again. (And
exception *usage* is separate from exceptions. C++ and PHP both
support exceptions, but most operations don't raise them. Python, on
the other hand, uses exceptions for everything.)
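
That last point is easy to demonstrate: even ordinary loop termination
is exception-driven in Python. Every for loop stops by catching
StopIteration (a small illustrative snippet):

it = iter([1, 2])
print(next(it))  # 1
print(next(it))  # 2
try:
    next(it)
except StopIteration:
    print("this is how a for loop knows to stop")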

ChrisA
 

Chris Angelico

[ a whole lot of stuff ]

As my crystal ball is once again being mended, would you please be kind
enough to tell all of us who and exactly what you're replying to.

Mine is in service at the moment. It says that Peter was actually trying to say:

"I use Google Groups and it sucks, so I delete all the context because
then nobody can see how much it sucks at showing context."

Peter, please can you use a different posting method? GG doesn't wrap
text properly, so it often results in really long lines with only a
single angle-bracket marking it as a quote, which makes the _next_
level of citation ugly.

ChrisA
 
