Does Python really follow its philosophy of "Readability counts"?


Russ P.

(...)


Amen! The first thing said right in this entire thread! (one of)

--JamesMills

Wait a minute. Aren't you the guy who just took me to task about the
definition of functional programming? So the definition of functional
programming is written in stone, but the definition of OO programming
is written in smoke?

Just for the record, I really don't care much about the definition of
OO programming. I brought it up only because someone tried to claim
that "enforced" encapsulation is a terrible idea. Well, as far as I
can tell, the majority of OO "programmers" (and software engineers,
software architects, etc.) seem to think otherwise. Maybe they are
wrong -- but I seriously doubt it.

As I said before, enforced encapsulation may not be appropriate for
every application, but it is definitely appropriate for some. Not
every door needs a lock, but certainly some do.
 

Paul Rubin

Michele Simionato said:
I would be fine having something like pylint built-in in the language
and running at every change of the source code (unless disabled with a
command line switch). I think this is the only reasonable way to get
whatever additional protection we can hope for. A true change of the
language is IMO impossible, technically, politically, and for legacy
reasons. Also, I am not convinced it would be a good idea, even
theoretically. It is easier to write a new Python-like language
from scratch than to add type checking to Python (I think you
were not proposing adding type checking in this post, right?).

I think this sub-thread has been mostly about dynamically creating new
class instance attributes, but yes, at one point I did suggest adding
type checking (ML-like inference) to pylint, presumably with feedback
to the compiler for optimization purposes. I noted that Python 3.0 in
fact has some features to support annotations for the purpose of
static type checking, so it's not as far off the wall as it might
sound.
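For illustration only (this code is not from the thread): Python 3's function annotations record type information on the function object without enforcing anything, which is exactly why an external checker of the kind discussed here would still be needed. The function name and signature below are made up for the example:

```python
# Python 3 function annotations (PEP 3107): the syntax stores type
# information on the function object, but nothing in the core language
# checks it -- an external tool would have to.
def scale(x: float, factor: float = 2.0) -> float:
    return x * factor

# The annotations are just metadata, available for a checker to inspect:
print(scale.__annotations__)

# Nothing stops a "wrong" call at runtime; the int is silently accepted:
print(scale(3))  # 6.0
```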
 

alex23

I just looked up Cython and see that it's based on Pyrex.  Worth
knowing about, I guess; but basically I think C is evil.

I feel much the same way, but the recent modifications to Cython that
provide a "pure Python"[1] approach mean that you can minimise the
amount of C you need to write and still take advantage of features
like static types:

import cython

@cython.locals(s=cython.double, i=int, n=int)
def harmonic_sum(n):
    s = 0
    for i in range(1, n + 1):
        s += 1.0 / i
    return s

The same code will both run in Python and compile via Cython. (I'm
surprised the new "pure" approach isn't more clearly promoted on the
site.)

1: http://wiki.cython.org/pure
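To see why the same source runs under plain CPython, here is a sketch (my own illustration, not code from the post) of a minimal stand-in for the cython module; Cython itself ships a shim like this for pure mode, but the class and names below are invented for the example:

```python
# Minimal stand-in for the "cython" module, so the decorated code also
# runs under plain CPython when Cython is not installed. (Illustrative
# sketch only; the real shim ships with Cython.)
class _CythonShim(object):
    double = float

    @staticmethod
    def locals(**types):
        def decorator(func):
            return func  # the type declarations are simply ignored here
        return decorator

cython = _CythonShim()

@cython.locals(s=cython.double, i=int, n=int)
def harmonic_sum(n):
    s = 0
    for i in range(1, n + 1):
        s += 1.0 / i
    return s

print(harmonic_sum(10))  # ≈ 2.928968
```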
 

Russ P.

I don't like that. Scala was designed with the idea of putting
together the two worlds, but I think the result was to get the
complications of both worlds. I don't think an SML-like functional
language needs to be integrated with object orientation; it is fine
just without it. The old quote
applies:

"""
Programming languages should be designed not by piling feature on top
of feature, but by removing the weaknesses and restrictions that make
additional features appear necessary. -- William Clinger
"""

I haven't used Scala yet, but I can't see any inherent problem with
combining OO and functional programming. Doesn't Python itself do that
to some extent, after all? The two paradigms are actually
complementary: FP is stateless and OOP is state-based.

Some problems are best solved using OOP and some are best solved using
FP. Are you suggesting that a programmer should use a different
language for each problem? What if a software project involves many
subproblems, some of which are best solved using OOP and others of
which are best solved using FP? Should one language be used for one
part of the project and another language for the other? I don't think
so.
 

James Mills

Wait a minute. Aren't you the guy who just took me to task about the
definition of functional programming? So the definition of functional
programming is written in stone, but the definition of OO programming
is written in smoke?

Did anyone say that? OO concepts are much
studied. However -you- still miss the basic
point: OO programming -is- a model, not a paradigm.
Functional programming -is- a paradigm.

Just for the record, I really don't care much about the definition of
OO programming. I brought it up only because someone tried to claim
that "enforced" encapsulation is a terrible idea. Well, as far as I
can tell, the majority of OO "programmers" (and software engineers,
software architects, etc.) seem to think otherwise. Maybe they are
wrong -- but I seriously doubt it.

Ever thought that perhaps you might be the one
that's wrong? Not that it really matters, but I am
a Software Engineer myself.

As I said before, enforced encapsulation may not be appropriate for
every application, but it is definitely appropriate for some. Not
every door needs a lock, but certainly some do.

Your analogy is terrible. We are talking about
machines that execute instructions in a sequence.

At the most basic level, do you really think a machine
cares whether -you- the programmer
have illegally accessed something you shouldn't have?

--JamesMills
 

Luis Zarrabeitia

Quoting Paul Rubin:
The data does not arrive from outer space on a magtape stapled to a
meteor. It comes out of another program. Most of the problems in
processing it come from mismatches between the processing programs and
the generation programs. Static checks would help eliminate those
mismatches.

No, copy and paste from the original data structures would eliminate those
mismatches. A compiler checking the reimplementation of said data structures,
whatever the language, has no way of knowing if the structure matches.
 

Roy Smith

Paul Rubin said:
basically I think C is evil.

C is not evil. It's a tool. Would you call a hammer evil because it's not
very good at driving screws? C is a very good tool for doing the kind of
thing it was designed for, which is highly efficient, low-level, portable
programming.

The fact that C has been used to write all sorts of large-scale
applications doesn't mean that it's good at that kind of stuff. It just
means that all the alternatives suck more than it does for that kind of
stuff.

If you want evil, look at C++.
 

Steven D'Aprano

I think this sub-thread has been mostly about dynamically creating new
class instance attributes, but yes, at one point I did suggest adding
type checking (ML-like inference) to pylint, presumably with feedback to
the compiler for optimization purposes. I noted that Python 3.0 in fact
has some features to support annotations for the purpose of static type
checking, so it's not as far off the wall as it might sound.

I fear anything that will lead Python moving towards C/Pascal/Java type
declarations, but I feel great enthusiasm at the thought that maybe by
the time we get to Python 4.0 there will be type inference which could
enable compiler optimizations.

Exciting times people, exciting times.
 

Steven D'Aprano

Wait a minute. Aren't you the guy who just took me to task about the
definition of functional programming? So the definition of functional
programming is written in stone, but the definition of OO programming is
written in smoke?

Be fair -- James just admitted that everything he's written in this
thread is wrong. If Michele's post was, and I quote James, "the first
thing said right in this entire thread", then obviously everything James
wrote previously was wrong.

*wink*
 

Paul Rubin

Luis Zarrabeitia said:
No, copy and paste from the original data structures would eliminate those
mismatches.

The whole point is that this would be possible if Python had data structure
definitions ("types") that were possible to copy and paste from some
single location, instead of building up structures dynamically, adding
fields on the fly in ways that have become obscure over the evolution
of the code.
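As a sketch of the contrast being drawn (my example, not Paul's code; the class and field names are invented): a class with __slots__ defines its fields in one place and rejects attributes invented on the fly, which is the kind of single-location definition being asked for:

```python
# One central definition of the structure's fields...
class Record(object):
    __slots__ = ('name', 'count')

    def __init__(self, name, count):
        self.name = name
        self.count = count

r = Record('spam', 3)
r.count = 4          # fine: this field is declared above

# ...versus the usual dynamic behaviour, where a typo would silently
# create a brand-new attribute instead of failing:
try:
    r.cuont = 5      # misspelled "count"
except AttributeError:
    print('typo caught at the point of assignment')
```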
 

Paul Rubin

Roy Smith said:
C is not evil. It's a tool. Would you call a hammer evil because it's not
very good at driving screws?

I would call a hammer evil if it were built in a way that made it
unnecessarily likely to hit your thumb.

C is a very good tool for doing the kind of thing it was designed
for, which is highly efficient, low-level, portable programming.
The fact that C has been used to write all sorts of large-scale
applications doesn't mean that it's good at that kind of stuff. It just
means that all the alternatives suck more than it does for that kind of
stuff.

I don't think so: http://www.adaic.org/whyada/ada-vs-c/cada_art.html
 

r

I have a situation which I face almost every day, where I have some
gigabytes of data that I want to slice and dice somehow and get some
numbers out of.  I spend 15 minutes writing a one-off Python program
and then several hours waiting for it to run.  If I used C instead,
I'd spend several hours writing the one-off program and then 15
minutes waiting for it to run, which is not exactly better.   [snip]
I would be ecstatic with a version of Python where I might have to
spend 20 minutes instead of 15 minutes writing the program, but then
it runs in half an hour instead of several hours and doesn't crash.  I
think the Python community should be aiming towards this.

You -- and everybody else -- would be "ecstatic" if this could
happen. But first someone has to design such a complex implementation.
You want everything, but there is a trade-off.

You said you wrote this program in 15 min. How much testing did you
actually do on this data before running it? If you told me you spent
more than 15 minutes I would not believe you. Look, Python is not a
compiled language -- and for good reason -- so for now you need to do
more initial testing if you plan to run a "15 min hack script" on a
multi-GB data source file, rather than throw a temper tantrum when the
damn thing blows chunks!

If Python could give the benefits of compiled languages whilst being
interpreted (without taking the "fun" out of Python), that would be
wonderful -- but can you implement such a system? Can anybody at this
point?

If you can, I can assure you that you will be worshipped as a god.
 

r

I would call a hammer evil if it were built in a way that made it
unnecessarily likely to hit your thumb.


I don't think so:  http://www.adaic.org/whyada/ada-vs-c/cada_art.html

Hammers are not evil; they have no logic. Interpreters and compilers
are not evil either -- you and I control their every move. The hammer
will go exactly where you guide it to -- if that happens to be your
thumb...??

Python does exactly what it's told: if you tell Python to smash your
thumb, Python will gladly comply :)
 

r

Here's a little food for thought.
Maybe you did tell Python to hit the nail head, but your calculations
of the direction vector were slightly off. Instead of a direct hit,
the hammer grazed the head and now the resultant vector aims straight
for your thumb -- who's to blame here?
 

Paul Rubin

r said:
You said you wrote this program in 15 min. How much testing did you
actually do on this data before running it? If you told me you spent
more than 15 minutes I would not believe you.

I would say hours, in the sense that the program ran correctly for
that long, processing several GB's of data before hitting something
obscure that it couldn't handle. This is not a single incident, it's
something that happens all the time; write the program, run it til it
crashes, fix what made it crash, run some more, etc. In some cases
where the program is a background process listening to external
events, it runs for weeks before hitting something it can't handle.

To be fair, that kind of thing is notoriously difficult to make
airtight in real world systems, which is why Erlang uses a "let it
crash" philosophy that emphasizes recovery from failures rather than
trying to avoid them at all costs. But, at least in the stuff I'm
hacking, I think a lot of these errors could be avoided with more
automated ways to check for type consistency.
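A crude sketch of what such an automated consistency check might look like at a program boundary (my own illustration, not a tool from this thread; the decorator and function names are invented): fail fast when the shape of the data is wrong, instead of crashing hours into the run:

```python
# Hypothetical runtime guard: verify argument types at the boundary so
# a mismatch fails immediately, not gigabytes into the processing.
def expects(*types):
    def decorator(func):
        def wrapper(*args):
            for value, expected in zip(args, types):
                if not isinstance(value, expected):
                    raise TypeError('expected %s, got %r'
                                    % (expected.__name__, value))
            return func(*args)
        return wrapper
    return decorator

@expects(list)
def total(records):
    return sum(records)

print(total([1, 2, 3]))  # 6
# total(None) raises TypeError right away: "expected list, got None"
```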
 

r

Paul Rubin said:
I would say hours, in the sense that the program ran correctly for
that long, processing several GB's of data before hitting something
obscure that it couldn't handle.  This is not a single incident, it's


So what was the fatal error, care to post a traceback?
 

Paul Rubin

r said:
So what was the fatal error, care to post a traceback?

Usually it's "expected to find some value but got None", or got a
list, or expected some structure but got a different one, or some
field was missing, etc. It's not a single traceback, it's a recurring
theme in developing this stuff.
 

r

Usually it's "expected to find some value but got None", or got a
list, or expected some structure but got a different one, or some
field was missing, etc.  It's not a single traceback, it's a recurring
theme in developing this stuff.  

Sounds like the result of poor testing and a lack of good program
design and logic.
 

r

It would sure be nice if the language made it easier, not harder.

I am for anything that makes debugging easier, as long as that "thing"
doesn't take away the freedom I enjoy while writing Python code. If
you can give me both then I will support your efforts -- the world
does not need two Javas!

Python's existence resides in a unique niche: simplistic, elegant
programming bliss. Python promotes self-reliance; you don't get the
safety net you do with other languages. You must consider all the
consequences/possible side-effects of your code.

If you are going to use Python to design enormous systems (or operate
on enormous data sources) -- at this point -- then you will need to
do some enormous testing.
 
