Does Python really follow its philosophy of "Readability counts"?


Paul Rubin

Carl Banks said:
Guess what systems I worked on that didn't even use scoping? I wrote
code for the GP7000 (equipped on some Airbus 380s) and the F-136
(which will be equipped on some F-35 fighters) engine controllers.
Neither one used any data hiding. The language was C (not C++), but
it was generated from schematic diagrams.

Generated from a schematic by a program you mean? In that case, the C
was used sort of like assembly code emitted by a compiler. Not really
the same situation.
Would you like to adopt GE's practice of schematic-generated C with no
namespaces or data hiding? No? Then don't be telling me I have to
embrace Boeing's.

All you're telling us is that GE makes foolish choices.
 

Paul Rubin

Bruno Desthuilliers said:
Nope, but your suggestion would have the same practical result as far
as I'm concerned.

Sorry, I don't comprehend that. The rest of your post makes no sense
as a consequence.

Python already had such a change when it deprecated and later got rid
of string exceptions. It's still Python.
 

Russ P.

I thought you were done wasting time with this nonsense.

So did I.
Guess what systems I worked on that didn't even use scoping?  I wrote
code for the GP7000 (equipped on some Airbus 380s) and the F-136
(which will be equipped on some F-35 fighters) engine controllers.
Neither one used any data hiding.  The language was C (not C++), but
it was generated from schematic diagrams.

Would you like to adopt GE's practice of schematic-generated C with no
namespaces or data hiding?  No?  Then don't be telling me I have to
embrace Boeing's.

Well, that's interesting. But you say the code was "generated from
schematic diagrams." Does that mean it was automatically generated by
machine? If so, then the concerns about encapsulation may no longer
apply. In that case, the schematics were the implementation
"language," and the code that was generated was essentially a higher
level version of assembly or machine code (because humans don't work
with it directly).

I know some researchers in software engineering who believe that the
ultimate solution to software reliability is automatic code
generation. They don't really care much which language is used, because
it would only be an intermediate form that humans don't interact with
directly. In that scenario, humans would essentially use a "higher
level" language such as UML or some such thing.

I personally have a hard time seeing how that could work, but that may
just be due to my own lack of understanding or vision.
 

Paul Rubin

Russ P. said:
I know some researchers in software engineering who believe that the
ultimate solution to software reliability is automatic code
generation. They don't really care much which language is used, because
it would only be an intermediate form that humans don't interact with
directly. In that scenario, humans would essentially use a "higher
level" language such as UML or some such thing.

I personally have a hard time seeing how that could work, but that may
just be due to my own lack of understanding or vision.

The usual idea is that you would write a specification, and a
constructive mathematical proof that a certain value meets that
specification. The compiler then verifies the proof and turns it into
code. Coq (http://coq.inria.fr) is an example of a language that
works like that. There is a family of jokes that go:

Q. How many $LANGUAGE programmers does it take to change a lightbulb?
A. [funny response that illustrates some point about $LANGUAGE].

The instantiation for Coq goes:

Q. How many Coq programmers does it take to change a lightbulb?
A. Are you kidding? It took two postdocs six months just to prove
that the bulb and socket are threaded in the same direction.

Despite this, a compiler for a fairly substantial C subset has been
written mostly in Coq (http://compcert.inria.fr/doc/index.html). But,
this stuff is far far away from Python.

I have a situation which I face almost every day, where I have some
gigabytes of data that I want to slice and dice somehow and get some
numbers out of. I spend 15 minutes writing a one-off Python program
and then several hours waiting for it to run. If I used C instead,
I'd spend several hours writing the one-off program and then 15
minutes waiting for it to run, which is not exactly better. (Or, I
could spend several hours writing a parallel version of the Python
program and running it on multiple machines, also not an improvement).
Often, the Python program crashes halfway through, even though I
tested it on a few megabytes of data before starting the full
multi-gigabyte run, because it hit some unexpected condition in the
data that could have been prevented with more compile time checking
that made sure the structures understood by the one-off script matched
the ones in the program that generated the input data.

I would be ecstatic with a version of Python where I might have to
spend 20 minutes instead of 15 minutes writing the program, but then
it runs in half an hour instead of several hours and doesn't crash. I
think the Python community should be aiming towards this.
 

Bruno Desthuilliers

Paul Rubin wrote:
Sorry, I don't comprehend that.

IIRC, your suggestion was that one should have to explicitly allow
"dynamic binding" (i.e. outside the initializer) of new attributes, and
that the default would be to disallow them. That's at least what I
understood from:

"""
There are cases where this is useful but they're not terribly common.
I think it would be an improvement if creating new object attributes
was by default not allowed outside the __init__ method. In the cases
where you really do want to create new attributes elsewhere, you'd
have to explicitly enable this at instance creation time, for example
by inheriting from a special superclass:

class Foo (DynamicAttributes, object): pass
"""

(snip)
Python already had such a change when it deprecated and later got rid
of string exceptions.

I really don't get how this would be comparable with the above
suggestion. I can well understand your concerns about Python's
performance (even if I don't agree with your proposed solutions), but
this one "argument" really looks like a straw man.
 

Steve Holden

Paul said:
(snip)

I would be ecstatic with a version of Python where I might have to
spend 20 minutes instead of 15 minutes writing the program, but then
it runs in half an hour instead of several hours and doesn't crash. I
think the Python community should be aiming towards this.

RPython might help, but of course it wouldn't allow you to use the full language.

regards
Steve
 

Brian Allen Vanderburg II

Here is a piece of C code this same guy showed me, saying Pythonic
indentation would make this hard to read -- well, let's see then!

I swear, before god, this is the exact code he showed me. If you don't
believe me I will post a link to the thread.

// Warning ugly C code ahead!
if( is_opt_data() < sizeof( long double ) ) { // test for insufficient data
    return TRUE; // indicate buffer empty
} // end test for insufficient data
if( is_circ() ) { // test for circular buffer
    if( i < o ) { // test for data area divided
        if( ( l - o ) > sizeof( long double ) ) { // test for data contiguous
            *t = ( ( long double * ) f )[ o ]; // return data
            o += sizeof( long double ); // adjust out
            if( o >= l ) { // test for out wrap around
                o = 0; // wrap out around limit
            } // end test for out wrap around
        } else { // data not contiguous in buffer
            return load( ( char * ) t, sizeof( long double ) ); // return data
        } // end test for data contiguous
    } else { // data are not divided
        *t = ( ( float * ) f )[ o ]; // return data
        o += sizeof( long double ); // adjust out
        if( o >= l ) { // test for out reached limit
            o = 0; // wrap out around
        } // end test for out reached limit
    } // end test for data area divided
} else { // block buffer
    *t = ( ( long double * ) f )[ o ]; // return data
    o += sizeof( long double ); // adjust data pointer
} // end test for circular buffer

I do a bit of C and C++ programming and even I think that is ugly and
unreadable. First of all, there are 'way' too many comments -- why does he
comment every single line? Second of all, I've always found that
brace/indent style to lead toward harder-to-read code, IMHO. I think the
Allman style is the most readable, followed perhaps by the Whitesmiths style.

Brian Vanderburg II
 

Steven D'Aprano

Bruno Desthuilliers said:

IIRC, your suggestion was that one should have to explicitly allow
"dynamic binding" (i.e. outside the initializer) of new attributes, and
that the default would be to disallow them.



Lots of heat and noise in this discussion, but I wonder, just how often
do Python programmers use this dynamism *in practice*? I hardly ever do.
I like that it is there, I like that Python is so easy to use without the
overhead of Java, but I rarely need *all* the dynamism available.

[sweeping generalization] Most objects people use are built-ins, and you
can't add attributes to them. I don't think I've ever subclassed a
built-in just to get dynamic attributes:

class DynamicInt(int):
    pass

x = DynamicInt(2)
x.attribute = "something"


As far as non built-in classes go:

>>> from decimal import Decimal
>>> d = Decimal("1.1")
>>> d.something = "some value"
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'Decimal' object has no attribute 'something'

If I recall correctly, the first implementation of Decimal was written in
Python. Did anyone object to Decimal being re-written in C because they
missed the ability to add arbitrary attributes to Decimal instances?

And if they did, I'm pretty sure the answer given would have been: deal
with it. Subclass Decimal, or use delegation. Don't penalise 99% of uses
of Decimal for that 1% of uses where you need dynamism.

I think that's a wise decision.
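
For the 1% case the workaround really is tiny; a minimal sketch of the
subclassing route (the attribute name and value are invented for
illustration):

from decimal import Decimal

class MyDecimal(Decimal):
    # A plain subclass gets an instance __dict__ back, so dynamic
    # attributes work again even though Decimal itself restricts them.
    pass

d = MyDecimal("1.1")
d.source = "sensor 3"            # fine on the subclass
assert d + 1 == Decimal("2.1")   # still behaves like a Decimal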
 

Steven D'Aprano

That is a good point; we somehow lost sight of that in this thread.


I have had the impression that this is a somewhat accidental side effect
and shouldn't be relied on.

Not accidental, but people complain if you use slots for the purpose of
prohibiting attribute creation. They say "That's not what __slots__ was
designed for!". That's okay though, computers were designed for breaking
German ciphers and calculating the trajectory of cannon shells, but
they're not the only things we use computers for these days.
 

Paul Rubin

Steven D'Aprano said:
Not accidental, but people complain if you use slots for the purpose of
prohibiting attribute creation. They say "That's not what __slots__ was
designed for!". That's okay though, computers were designed for breaking
German ciphers and calculating the trajectory of cannon shells, but
they're not the only things we use computers for these days.

It's hard to figure out from the docs exactly what slots is supposed
to do:

http://docs.python.org/reference/datamodel.html#id3

I don't think it's good style to rely on accidental, undocumented
characteristics of some language feature. But, I may start using
slots for some of the stuff I'm doing, just to reduce space
consumption and keep more instances around, which is what it's really
intended for.
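
A short sketch of both effects, using a made-up class: __slots__ replaces
the per-instance __dict__ (which is where the space saving comes from),
and as a side effect assigning an attribute that isn't declared in
__slots__ raises AttributeError:

class Record(object):
    __slots__ = ('key', 'value')   # no per-instance __dict__ is created

    def __init__(self, key, value):
        self.key = key
        self.value = value

r = Record("a", 1)
r.value = 2        # declared in __slots__: fine
r.note = "extra"   # not declared: raises AttributeError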
 

Luis Zarrabeitia

Quoting Paul Rubin:
Often, the Python program crashes halfway through, even though I
tested it on a few megabytes of data before starting the full
multi-gigabyte run, because it hit some unexpected condition in the
data that could have been prevented with more compile time checking
that made sure the structures understood by the one-off script matched
the ones in the program that generated the input data.

Wait, do you _really_ believe that _static_ checks could prevent problems
arising from _unexpected_ conditions in the _data_?
 

Paul Rubin

Luis Zarrabeitia said:
Wait, do you _really_ believe that _static_ checks could prevent problems
arising from _unexpected_ conditions in the _data_?

The data does not arrive from outer space on a magtape stapled to a
meteor. It comes out of another program. Most of the problems in
processing it come from mismatches between the processing programs and
the generation programs. Static checks would help eliminate those
mismatches.
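
It is not the compile-time checking being asked for, but a fail-fast
runtime check of each record against the schema the script assumes would
at least surface a mismatch on the first bad line instead of hours into a
multi-gigabyte run. A rough sketch (the file name, field names and types
are all invented for illustration):

# The layout this one-off script assumes the generating program used.
EXPECTED_FIELDS = [("user_id", int), ("timestamp", float), ("status", str)]

def parse_record(line, lineno):
    parts = line.rstrip("\n").split("\t")
    if len(parts) != len(EXPECTED_FIELDS):
        raise ValueError("line %d: expected %d fields, got %d"
                         % (lineno, len(EXPECTED_FIELDS), len(parts)))
    # Convert every field, so a type mismatch fails here and now.
    return [typ(value) for (_, typ), value in zip(EXPECTED_FIELDS, parts)]

for lineno, line in enumerate(open("input.tsv"), 1):
    user_id, timestamp, status = parse_record(line, lineno)
    # ... the actual slicing and dicing goes here ...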
 

alex23

I would be ecstatic with a version of Python where I might have to
spend 20 minutes instead of 15 minutes writing the program, but then
it runs in half an hour instead of several hours and doesn't crash.

Paul, have you looked into Cython at all? I've only used it trivially
but it does seem to provide the potential for static typing & improved
compile-time checking (as well as increased performance) while
retaining as Pythonic a representation as possible.
 

Paul Rubin

alex23 said:
Paul, have you looked into Cython at all? I've only used it trivially
but it does seem to provide the potential for static typing & improved
compile-time checking (as well as increased performance) while
retaining as Pythonic a representation as possible.

I haven't. Maybe I should. I've looked at Pyrex and PyPy. I've
spent quite a bit of time over the past year or so studying Haskell,
which has been very educational and worthwhile, but not all that
useful for my day-to-day tasks in the present state of things.

I just looked up Cython and see that it's based on Pyrex. Worth
knowing about, I guess; but basically I think C is evil.

I may start looking into Pig Latin (http://wiki.apache.org/pig) for
these large data crunching tasks. Unfortunately it is written in Java
and my attitude towards Java is probably the one typical of Python
programmers, if you know what I mean.
 

Paul Rubin

Scott David Daniels said:
But, the research on the language "Self" shows that even in the face
of a language with more dynamism than Smalltalk (or Python), performance
can be obtained using compiler technology. It turns out you don't have
to type any extra keystrokes. Compilers capable of doing
strong optimization already have to do enough analysis that they can
discover the static typing that is available in the code you write
naturally. The way to get to such performance on Python is through
efforts like PyPy.

I'd be interested in seeing any publications about that Self research,
which I remember someone else mentioning in another thread as well.
However, part of the idea of the extra keystrokes is to allow the
compiler (or an external tool like Pylint) to flag any
type-inconsistency for closer programmer inspection, instead of just
quietly treating it as dynamic and generating the extra code for it.
The keystrokes let the tool know exactly when the dynamism is
intentional. As has been mentioned a few times already, Python 3.0
has some rudimentary features for type annotation, so this concept
isn't completely anathema to the Python developers.
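
For reference, the Python 3.0 feature being referred to is function
annotations (PEP 3107): the interpreter records them but attaches no
meaning to them, so a checker along the lines described above would have
to consume them itself. A minimal sketch with a made-up function:

def scale(vector: list, factor: float) -> list:
    # Annotations are purely informational; Python itself never checks them.
    return [x * factor for x in vector]

print(scale.__annotations__)
# {'vector': <class 'list'>, 'factor': <class 'float'>, 'return': <class 'list'>}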
 

Paul Rubin

James Mills said:
In fact, Python borrows features from the Functional Paradigm. Does
this make it a Functional Language ? No. Why ? Because one of the
clear requirements of the Functional Paradigm is that functions
cannot have side effects.

I'd say functional programming emphasizes a style that avoids use
of side effects, but saying "functions cannot have side effects" is
a bit of an overstatement. Even Haskell lets you code effectfully
when necessary, using the type system (IO Monad) to separate
effectful code from pure code.
 

Rhodri James

Yes, that would indeed be nice. I am certainly not the only one who
could use a language that is excellent for both research prototyping
*and* the final, safety-critical system. Then perhaps the prototype
could just be cleaned up and "hardened" for the end product rather
than rewritten in another language -- by programmers in another state
who may fail to understand many of the details that the prototype
developer agonized over.

You should always plan on throwing away your prototype anyway (though
few people do), because *something* in the architecture will turn out
to be not what you thought. And if the programmers in the other state
fail to understand the details, maybe the developer should have
documented a whole lot more (he says through gritted teeth, having
had the original code as the sole documentation far too many times).

We return you to your regularly scheduled original point...
 

Paul Rubin

Russ P. said:
I started looking at Scala a while back. It has many nice features.
It seamlessly combines object orientation with advanced functional
programming. It is statically typed, but the types often do not need
to be explicitly declared. It is fully "compatible" with Java, but
(like Python) it is much less verbose and more elegant, typically
requiring less (in some cases, much less) than half the lines of code
for a given task.

I haven't examined Scala closely at all; my mile-high view was that it
was very ugly and closely bound to Java and that the main reason for
finding it interesting would be to interoperate with Java programs in
a JVM. Since I didn't care about that, I didn't pursue Scala further.
But I am reluctant to use Scala for several reasons. For example, it
doesn't have "continue" and "break." The Scala folks claim they are
unnecessary, but they are sure handy sometimes. Maybe my style is bad,
but I use continue and break a lot, particularly continue. Getting by
without them could be hard. The other thing is that Scala has no
default arguments and no argument passing by keyword. Those are a
couple of my favorite Python features. So I'm basically screwed.

I think in functional programming, "continue" and "break" lose some of
their importance since you normally don't code any loops at all (you
express iteration in terms of functions like "map").
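
To make that concrete, a loop that leans on "continue" to skip records
can usually be rephrased as a filter plus a transformation, so the
control-flow keywords simply never come up. A small sketch:

records = [3, -1, 4, -1, 5]

# Imperative style, using "continue" to skip unwanted records:
squares = []
for r in records:
    if r < 0:
        continue
    squares.append(r * r)

# Functional style: the filter and the map replace the loop entirely.
squares = [r * r for r in records if r >= 0]
# or, closer to "map": list(map(lambda r: r * r, filter(lambda r: r >= 0, records)))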

Anyway, you might look at Haskell, although it has its own weirdness.
This old paper might be of interest:

http://www.haskell.org/papers/NSWC/jfp.ps
 
