reduce() anomaly?

Thread starter: Stephen C. Waterbury

Patrick Maupin

Douglas said:
Your claim is silly. sum() is not *way* simpler than reduce(), and
reduce() can be explained to anyone in 10 seconds: "reduce() is just like
sum(), only with reduce() you can specify whatever addition function
you would like."

Maybe reduce() can be explained in 10 seconds to someone who has used
sum() a few times, but that has no bearing whatsoever on trying to
explain reduce() to someone if sum() is not available and hasn't
been used by them.
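
For reference, the equivalence under debate is easy to state in code (a
sketch; reduce is a builtin in the 2.x Python of this thread, and lives
in functools in 3.x):

import operator

seq = [1, 2, 3, 4]
assert reduce(operator.add, seq) == sum(seq)   # sum, with a pluggable "+"
assert reduce(operator.mul, seq) == 24         # any binary function works
assert reduce(max, seq) == max(seq)
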
they were not taught to my son in _his_ equivalent course [it used
Pascal], and are not going to be taught to my daughter in _her_
equivalent course [it uses C].

Then your children were done a great disservice by receiving a poor
education. (Assuming, that is, that they wanted to learn Computer
Science, and not Programming in Pascal or Programming in C.)

I'm sorry, but from a pure CS viewpoint, reduce() is way off the
radar screen. Especially for a 101 course. If either of my daughters
wanted to discuss reduce() while taking such a course, I'd want to
see the curriculum to figure out what they _weren't_ being taught.
You should be assuming that your audience are the smart people
that they are, rather than the idiots you are assuming them to be.

Ignorance is not stupidity. I have yet to see a language which
can be used by stupid people with consistent success; however I
have seen a great deal of evidence that Python can be used very
successfully by ignorant people. It gently eases them in and
allows them to perform useful work while slowly reducing their
ignorance.
I sure hope that Python doesn't try to emulate C. It's a terrible,
horrible programming language that held back the world of software
development by at least a decade.

I used to hate C. But then, when it borrowed enough good concepts
from Pascal and other languages, and the compilers got smart enough
to warn you (if you cared to see the warnings) about things like
"if (x = y)" I stopped using Modula-2. C held software back 10
years in the same manner as Microsoft did, e.g. by helping to
standardize things to where I can buy a $199 system from WalMart
which would cost over $20,000 if everybody kept writing code like
the pointy-headed ivory tower academics thought it ought to be written.
For certain problem domains (where domain includes the entire system
of software, hardware, real-time constraints, portability, user
expectations, maintainability by a dispersed team, etc.), C is an
excellent implementation language. But don't take my word for it --
open your eyes and look around. Could there be a better implementation
language? Sure. Would C acquire the bare minimum features needed
to compete with the new language? Absolutely.

(But I freely admit I'm a luddite: I prefer Verilog to VHDL, as well.)
The reason for Python's wide acceptance isn't because it is
particularly well-designed compared to other programming languages
that had similar goals of simplicity and minimality (it also isn't
poorly designed compared to any of them -- it is on par with the
better ones) -- the reason for its success is that it was in the right
place at the right time, it had a lightweight implementation, was
well-suited to scripting, and it came with batteries included.

I'd vote this as the statement in this group most likely to start
a religious flamewar since the lisp threads died down.

I'm not particularly religious, but I _will_ bite on this one:

1) In what way was it at the "right place at the right time?" You
didn't name names of other languages, but I'll bet that if you can
name 5 which are similar by your criteria, at least two of them
were available when Python first came out.

2) What part of "lightweight implementation, well suited to scripting"
contradicts, or is even merely orthogonal to "particularly well-designed"?

3) Do you _really_ think that all the batteries were included when
Python first came out? Do you even think that Python has more batteries
_right_ _now_ than Perl (via CPAN), or that some competing language
couldn't or hasn't already been designed which can co-opt other languages'
batteries?

I can accept the premise that, for Python to enjoy the acceptance
it does today, Guido had to be lucky in addition to being an excellent
language designer. But if I were to accept the premise that Python's
popularity is due to sheer luck alone, my only logical course of action
would be to buy Guido a plane ticket to Vegas and front him $10,000
worth of chips, because he has been extremely lucky for many years now.

Pat
 

Michele Simionato

Douglas Alan said:
C'mon -- all reduce() is is a generalized sum or product. What's
there to think about? It's as intuitive as can be. And taught in
every CS curriculum. What more does one want out of a function?

|>oug

Others pointed out that 'reduce' is not taught in every CS curriculum
and that many (most?) programmers didn't have a CS curriculum as you
intend it, so let me skip on this point. The real point, as David Eppstein
said, is readability:

reduce(operator.add, seq)

is not readable to many people, even if is readable to you.
That's the only reason why I don't use 'reduce'. I would have preferred
a better 'reduce' rather than a new ad hoc 'sum': for instance something
like

reduce([1,2,3],'+')

in which the sequence goes *before* the operator. It is interesting to
notice that somebody recently suggested in the Scheme newsgroup that

(map function sequence)

was a bad idea and that

(map sequence function)

would be better!

Of course, I do realize that there is no hope of changing 'reduce' or 'map'
at this point; moreover the current trend is to make 'reduce' nearly
useless, so I would not complain about its death in Python 3.0.
Better no reduce than an unreadable reduce.

Also, I am not happy with 'sum' as it is, but at least it is
dead easy to read.
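
A sequence-first reduce of the sort suggested above takes only a few lines
to prototype (hypothetical names, just to make the proposal concrete):

import operator

_ops = {'+': operator.add, '*': operator.mul}

def myreduce(seq, op, *init):
    # hypothetical: myreduce([1,2,3], '+') -- sequence before operator
    func = _ops.get(op, op)   # accept '+', '*', or any binary function
    return reduce(func, seq, *init)

assert myreduce([1, 2, 3], '+') == 6
assert myreduce([1, 2, 3], '*') == 6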

Michele Simionato
 

Michele Simionato

Douglas Alan said:
The reason for Python's wide acceptance isn't because it is
particularly well-designed compared to other programming languages
that had similar goals of simplicity and minimality (it also isn't
poorly designed compared to any of them -- it is on par with the
better ones) -- the reason for its success is that it was in the right
place at the right time, it had a lightweight implementation, was
well-suited to scripting, and it came with batteries included.

.... and it is free!!! ;)

More seriously, the fact that it has a standard implementation and a
BDFL ensuring the consistency of the language (no committee for Python!)
is also a big plus. Moreover, it has good documentation, a very active
community and a wonderful newsgroup. Also, the time between the
submission of a bug (there are very few of them, BTW) and its fix
is surprisingly short. This is something I value a lot. Finally, the
language is still evolving at a fast pace and you get the sense
that it has a future. Probably the same things can be said for Ruby
and Perl, so you have a choice if you don't like the Zen of Python ;)

Michele Simionato
 

Georgy Pruss

| <...>
| in which the sequence goes *before* the operator. It is interesting to
| notice that somebody recently suggested in the Scheme newsgroup that
|
| (map function sequence)
|
| was a bad idea and that
|
| (map sequence function)
|
| would be better!
|
| <...>

Yes, that's right. Some scientific studies have shown that humans think and
express their thoughts in subject/object-action order. So OOP is very
"human" in this aspect, btw.

G-:
 

Robin Becker

Georgy said:
| <...>
| in which the sequence goes *before* the operator. It is interesting to
| notice that somebody recently suggested in the Scheme newsgroup that
|
| (map function sequence)
|
| was a bad idea and that
|
| (map sequence function)
|
| would be better!
|
| <...>

Yes, that's right. Some scientific studies have shown that humans think and
express their thoughts in subject/object-action order. So OOP is very
"human" in this aspect, btw.

G-:
On OOP: nobody in this thread has pointed out that we could have

sequence.sum()
sequence.reduce(operator.add[,init])

or even

sequence.filter(func) etc etc

and similar. That would make these frighteningly incomprehensible ;)
concepts seem less like functional programming. Personally I wouldn't
like that to happen.
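
A toy version of that method-style API, just to make the idea concrete (a
hypothetical subclass, not a proposal for the core):

import operator

class Seq(list):
    def sum(self):
        return sum(self)
    def reduce(self, func, *init):
        return reduce(func, self, *init)
    def filter(self, func):
        return Seq([x for x in self if func(x)])

s = Seq([1, 2, 3, 4])
assert s.sum() == 10
assert s.reduce(operator.add) == 10
assert s.filter(lambda x: x % 2 == 0) == [2, 4]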
 

Douglas Alan

Patrick Maupin wrote:
Maybe reduce() can be explained in 10 seconds to someone who has used
sum() a few times, but that has no bearing whatsoever on trying to
explain reduce() to someone if sum() is not available and hasn't
been used by them.

Describing reduce() in 10 seconds is utterly trivial to anyone with an
IQ above 100, whether or not they have ever used sum():

"To add a sequence of numbers together:

reduce(add, seq)

To multiply a sequence of numbers together:

reduce(mul, seq)

To subtract all of the numbers of a sequence (except the first
number) from the first number of the sequence:

reduce(sub, seq)

To divide the first number in a sequence by all the remaining
numbers in the sequence:

reduce(div, seq)

Any two-argument function can be used in place of add, mul, sub, or
div and you'll get the appropriate result. Other interesting
examples are left as an exercise for the reader."
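
(Spelled out with the operator module, those four reductions run as
written; note that reduce folds left to right, and that operator.div is
the 2.x spelling -- truediv in 3.x:)

from operator import add, mul, sub, div

seq = [24, 4, 3]
assert reduce(add, seq) == 31    # 24 + 4 + 3
assert reduce(mul, seq) == 288   # 24 * 4 * 3
assert reduce(sub, seq) == 17    # (24 - 4) - 3
assert reduce(div, seq) == 2     # (24 / 4) / 3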


If someone can't understand this quickly, then they shouldn't be
programming!
I'm sorry, but from a pure CS viewpoint, reduce() is way off the
radar screen. Especially for a 101 course.

I'm sorry, but you are incorrect. When I took CS-101, we learned
assembly language, then were assigned to write a text editor in
assembly language, then we learned LISP and were assigned to write
some programs in LISP, and then we learned C, and then we were
assigned to implement LISP in C.

If you can write a !$#@!!%# LISP interpreter in C, you no doubt can
figure out something as mind-achingly simple as reduce()!
Ignorance is not stupidity.

Assuming that your audience cannot learn the simplest of concepts is
assuming that they are stupid, not that they are ignorant.
I used to hate C. But then, when it borrowed enough good concepts
from Pascal and other languages, and the compilers got smart enough
to warn you (if you cared to see the warnings) about things like
"if (x = y)" I stopped using Modula-2. C held software back 10
years in the same manner as Microsoft did, e.g. by helping to
standardize things to where I can buy a $199 system from WalMart
which would cost over $20,000 if everybody kept writing code like
the pointy-headed ivory tower academics thought it ought to be written.

You score no points for C by saying that it is like Microsoft. That's
a strong damnation in my book. And you really don't know how the
world would have turned out if a different programming language had
been adopted rather than C for all those years. Perhaps computers
would be more expensive today, perhaps not. On the other hand, we
might not have quite so many buffer overflow security exploits.
Perhaps we'd have hardware support for realtime GC, which might be
very nice. On the other hand, perhaps people would have stuck with
assembly language for developing OS's. That wouldn't have been so
pretty, but I'm not sure that that would have made computers more
expensive. Perhaps a variant of Pascal or PL/1 would have taken the
niche that C obtained. Either of those would have been better, though
no great shakes either.

Many of the pointy-headed ivory tower academics, by the way, thought
that code should look something like Python. The reason these
languages are not widely used is because typically they either did not
come with batteries, or there was no lightweight implementation
provided, or they only ran on special hardware, or all of the above.
I'd vote this as the statement in this group most likely to start
a religious flamewar since the lisp threads died down.

The only way it could start a religious flamewar is if there are
people who wish to present themselves as fanboys. I have said nothing
extreme -- just what is obvious: There are many nice computer
programming languages -- Python is but one of them. If someone
wishes to disagree with this, then they would have to argue that there
are no other nice programming languages. Now that would be a flame!
I'm not particularly religious, but I _will_ bite on this one:
1) In what way was it at the "right place at the right time?"

Perl was in the right place at the right time because system
administrators had gotten frustrated with doing all their scripts in a
mishmash of shell, awk, sed, and grep, etc. And then web-scripting
kicked Perl into even more wide acceptance. Python was in the right
place in the right time because many such script-writers (like yours
truly) just could not stomach Perl, since it is an ugly monstrosity,
and Python offered such people relief from Perl. If Perl had been a
sane OO language, Python would never have had a chance.
You didn't name names of other languages, but I'll bet that if you
can name 5 which are similar by your criteria, at least two of them
were available when Python first came out.

I'm not sure what you are getting at. There were many nice
programming languages before Python, but not many of them, other than
Perl, were portable and well-suited to scripting.

Oh, yeah, I forgot to mention portability in my list of reasons why
Python caught on. That's an essential one. Sure you could elegantly
script a Lisp Machine with Lisp, and some Xerox computers with
Smalltalk, but they didn't provide versions of these languages
well-suited for scripting other platforms.
2) What part of "lightweight implementation, well suited to
scripting" contradicts, or is even merely orthogonal to
"particularly well-designed"?

Again, I'm not sure what you are getting at. "Lightweight
implementation" and "well-suited to scripting" do not contradict
"well-designed", as Python proves. Lightweightedness and capability
at scripting are certainly orthogonal to the property of being
well-designed, however, since there are a plethora of well-designed
languages that are not suited to scripting. They just weren't
designed to address this niche.
3) Do you _really_ think that all the batteries were included when
Python first came out?

It certainly was not a particularly popular language until it came
with pretty hefty batteries. There are many other languages that
would have been equally popular before Python started coming with
batteries.
Do you even think that Python has more batteries _right_ _now_ than
Perl (via CPAN), or that some competing language couldn't or hasn't
already been designed which can co-opt other languages' batteries?

Um, the last time I checked Perl was still a lot more popular than
Python, so once again I'm not sure what you are getting at. Regarding
whether or not some future language might also come with batteries and
therefore steal away Python's niche merely due to having more
batteries: Anything is possible, but this will be an uphill battle for
another language because once a language takes a niche, it is very
difficult for the language to be displaced. On the other hand, a new
language can take over a sub-niche by providing more batteries in a
particular area. PHP would be an example of this.
I can accept the premise that, for Python to enjoy the acceptance it
does today, Guido had to be lucky in addition to being an excellent
language designer. But if I were to accept the premise that
Python's popularity is due to sheer luck alone my only logical
course of action would be to buy Guido a plane ticket to Vegas
and front him $10,000 worth of chips, because he has been extremely
lucky for many years now.

I never claimed *anything* like the assertion that Python's popularity
is due to luck alone!

|>oug
 

Alex Martelli

David Eppstein wrote:
...
If I'm not mistaken, this is buggy when seq is an iterable, and you need

Sorry, I should have said something like "re-iterable" -- an object such
that e.g.:
it1 = iter(seq)
val1 = it1.next()
it2 = iter(seq)
val2 = it2.next()
assert val1 == val2
holds (and keeps holding as you keep next'ing:). list, tuple, dict, etc.
In particular, when the idiom zip(seq, seq[1:]) works, so should this one
(note in passing that, in said idiom, there is no need to slice the first
seq in the zip call to seq[:-1] -- zip truncates at the end of the
_shorter_ sequence anyway).
to do something like
seq1,seq2 = tee(seq)
izip(seq1,islice(seq2,1,None))
instead.

Yes, this is totally general. However, even though tee has now (2.4)
been implemented very smartly, this overall approach is still way
"conceptually heavy" (IMHO) when compared to, e.g.:

def window_by_2(it):
    it = iter(it)
    first = it.next()
    for second in it:
        yield first, second
        first = second

in any case, I do think that such 'windowing' is a general enough
need that it deserves its own optimized itertool...
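
Such an itertool is a short recipe over tee and islice -- essentially the
idiom quoted above, packaged up (a sketch in 2.4-era spelling; izip is
plain zip in 3.x):

from itertools import tee, izip, islice

def pairwise(iterable):
    # s -> (s0,s1), (s1,s2), (s2,s3), ...
    a, b = tee(iterable)
    return izip(a, islice(b, 1, None))

assert list(pairwise([1, 2, 3, 4])) == [(1, 2), (2, 3), (3, 4)]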


Alex
 

Christopher A. Craig

I'm sorry, but from a pure CS viewpoint, reduce() is way off the
radar screen. Especially for a 101 course. If either of my daughters
wanted to discuss reduce() while taking such a course, I'd want to
see the curriculum to figure out what they _weren't_ being taught.

I'd tend to agree, and I've never found myself on the not teaching
generalities side of this debate before. An intro CS class should be
about intro to Computer Science and not any particular language, but
reduce() is a fairly specific functional programming concept. To say
that it should be taught in an intro class is like saying you should
deal with metaclasses, function pointers, or the stack pointer
immediately to teach good computer science. Algorithms, data
structures, state machines and computational theory are fairly basic
computer science; functional programming can be used to present those,
but is by no means needed.

Also the assumption isn't that Python users are stupid, it's that they
may have little to no CS training. Python wasn't designed for the
exclusive use of CS trained individuals (which is a good thing,
because I use it to teach computers to Boy Scouts).

Back on topic, I very rarely use reduce() in Python and then only for
optimization (which I rarely do). The vast majority of the time I
just use a for loop; it just seems to flow better.
 

eichin

re cs 101: I'm told (by friends who were undergrads there in the early
90's) that Yale used to have an intro to CS course taught by the late
Alan Perlis. The usual high level concepts you'd expect - but since
it helps a lot to have a language to give examples in, it used APL.

APL has the kind of array manipulation tools that these days you find
in matlab or other specialized tools - I'd say "that other languages
aspire to" but from the looks of it other languages *don't* aspire to
that kind of array handling. But the point here is that "reduce" is
fundamental: x/i5 (where x is the multiplication sign and i is iota) is
a lot like reduce(int.__mul__, range(1,6)); it's "readable" if
you're comfortable with the notation (and more general -- I can't find
a builtin way to say "coerced multiply" without a lambda in 30 seconds
of poking around). On the other hand, that readability does assume
you're thinking in terms of throwing arrays around, which can be
an... *odd* way of looking at things, though of course when it fits,
it's very nice.

In other words, I'd expect anyone who had a reasonably rich CS
background to have been exposed to it, either from the historical
perspective, the "languages influence how you think" perspective, or
the mathematical operation perspective.

At the same time, I'll admit to not having used it. I've only been
using python for a year, and will more likely write an
accumulator-style block, since it will always be clearer than a lambda
(which is what you generally need for reduce -- if you are going to
write a function anyway, it's easier to just write an n-adic instead
of an only-dyadic function, and skip the need for reduce altogether).
And since python has a "bias for functions", the gap between "sum" and
"write a function for the whole thing" is fairly narrow.
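
To make that concrete, here is the same computation both ways -- the
accumulator-style block, and the lambda that reduce would need (a sketch):

words = ['a', 'bb', 'ccc']

# accumulator-style block
total = 0
for w in words:
    total += len(w)

# reduce needs the lambda because "add the length of the next item"
# is not itself a binary operator
total2 = reduce(lambda acc, w: acc + len(w), words, 0)

assert total == total2 == 6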

(Hmm, r5rs doesn't have reduce, but mit-scheme does.)
 

Michael T. Babcock

sequence.sum()
sequence.reduce(operator.add[,init])
sequence.filter(func) etc etc

That would make these frighteningly incomprehensible ;)
concepts seem less like functional programming. Personally I wouldn't
like that to happen.

I'm hoping you were being sarcastic ... but I get the feeling you aren't.

Why, pray-tell, would you want an OO program to do:

results = [ func(x) for x in sequence ]

... instead of ...

results = sequence.map(func) ??

I can understand if you're writing highly LISP-like Python (see IBM's
articles on functional programming with Python for example). However, I
don't see the 'harm' in offering functional methods to lists/arrays.

Looking at this code I wrote today:

matches = [ get_matches(file) for file in duplicates ]
todelete = [ get_oldest(files) for files in matches ]

... would end up being ...

matches = duplicates.map(get_matches)
todelete = matches.map(get_oldest)

... or better ...

todelete = duplicates.map(get_matches).map(get_oldest)

... and I somewhat like that as I look at it.
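
(The chained form works as soon as map returns the same type, so the calls
compose -- a toy sketch with a hypothetical FList subclass; get_matches and
get_oldest are the functions from the snippet above:)

class FList(list):
    def map(self, func):
        return FList([func(x) for x in self])

# todelete = FList(duplicates).map(get_matches).map(get_oldest)
assert FList([1, 2, 3]).map(lambda x: x * 2).map(str) == ['2', '4', '6']
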
--
Michael T. Babcock
CTO, FibreSpeed Ltd. (Hosting, Security, Consultation, Database, etc)
http://www.fibrespeed.net/~mbabcock/

 

Dave Brueck

sequence.sum()
sequence.reduce(operator.add[,init])
sequence.filter(func) etc etc

That would make these frighteningly incomprehensible ;)
concepts seem less like functional programming. Personally I wouldn't
like that to happen.

I'm hoping you were being sarcastic ... but I get the feeling you aren't.

Why, pray-tell, would you want an OO program to do:

results = [ func(x) for x in sequence ]

... instead of ...

results = sequence.map(func) ??

Because I find the first much more readable (and IMO the "an OO program to
do" bit is irrelevant from a practical point of view).

Also, someone not familiar with either version is likely to correctly guess
what the first one does. It's not an issue of whether or not a person can be
taught what 'map' means. It's subjective, yes, but not _completely_
subjective, because the "guessability" of the first form is higher: it
uses other well-known keywords to do its thing. (FWIW I don't think map() is
that big of a deal, but you asked... :) ).

-Dave
 

eichin

you're comfortable with the notation (and more general, I can't find
a builtin way to say "coerced multiply" without lambda, in 30 seconds
of poking around.) ...

Sigh, I just realized -- everyone talking about operator.mul was being
*literal*, not abstract; there's actually an operator module :) Oops.
Not that reduce(operator.mul, range(1,6)) is any more readable -- I'd
still define Product around it...
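
(That Product wrapper is a one-liner over the operator module -- a sketch,
with 1 as the multiplicative identity for the empty case:)

import operator

def product(seq):
    return reduce(operator.mul, seq, 1)

assert product(range(1, 6)) == 120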
 

Dave Brueck

sequence.sum()
sequence.reduce(operator.add[,init])
sequence.filter(func) etc etc

That would make these frighteningly incomprehensible ;)
concepts seem less like functional programming. Personally I wouldn't
like that to happen.

I'm hoping you were being sarcastic ... but I get the feeling you aren't.

Why, pray-tell, would you want an OO program to do:

results = [ func(x) for x in sequence ]

... instead of ...

results = sequence.map(func) ??

My apologies: my goofy mailer isn't adding the "On <date>, <person> wrote:"
lines, so it sort of confuses what Robin wrote with what Michael wrote. Sorry!

-Dave
 

Douglas Alan

Andrew Dalke said:
Douglas Alan:
I agree with Alex on this. I got a BS in CS but didn't learn about
lambda, reduce, map, and other aspects of functional programming
until years later, and it still took some effort to understand it.
(Granted, learning on my own at that point.)
But I well knew what 'sum' did.

How's that? I've never used a programming language that has sum() in
it. (Or at least not that I was aware of.) In fact, the *Python* I
use doesn't even have sum() in it! I've used a number of languages
that have reduce(). If I didn't already know that it exists, I
wouldn't even think to look in the manual for a function that adds up
a sequence of numbers, since such a function is so uncommon and
special-purpose.

(Someone pointed out that different versions of Scheme vary on whether
they give you reduce. In Scheme reduce would have fewer uses since
Scheme uses prefix notation, and operators like "+" can already take
any number of arguments (this is a definite advantage to prefix
notation). Therefore, in Scheme, to add up all of the numbers in a
list, you can just use apply and '+', like so: "(apply + my-list)".)
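
(The Scheme trick has no direct Python analogue, because operator.add is
strictly binary -- which is exactly the gap reduce fills:)

import operator

assert operator.add(1, 2) == 3
# operator.add(1, 2, 3) raises TypeError (add takes exactly two
# arguments), so (apply + my-list) does not translate directly; the
# Python equivalent is reduce(operator.add, my_list).
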
Was I not taught Computer Science?

I would say that if you didn't get introduced to at least the concept
of functional programming and a hint of how it works, then something
was seriously wrong with your CS education.
Your predicate (that it's understood by anyone who has studied
CS) is false so your argument is moot.

It's irrelevant whether or not many people have received poor CS
educations -- there are many people who haven't. These people should
be pleased to find reduce() in Python. And the people who received
poor or no CS educations can learn reduce() in under a minute and
should be happy to have been introduced to a cool and useful concept!

|>oug
 

Andrew Dalke

Me:
Douglas Alan
How's that? I've never used a programming language that has sum() in
it.

1) From Microsoft Multiplan (a pre-Excel spreadsheet). That
was one of its functions, which I used to help my Mom manage
her cheese co-op accounts in ... 1985?

2) I wrote such a routine many times for my own projects.
By 1987 I was using a BASIC version (QuickBasic) which
had functions, and I was using Pascal. But I don't recall in either of
those languages ever passing functions into an object until I
started using non-trivial C a couple of years later. (Probably
for a numerical analysis class.)
I wouldn't even think to look in the manual for a function that adds up
a sequence of numbers, since such a function is so uncommon and
special-purpose.

Uncommon? Here's two pre-CS 101 assignments that use
exactly that idea:
- make a computerized grade book (A=4.0, B=3.0, etc.) which
can give the grade point average
- make a program to compute the current balance of a bank
account given the initial amount and the list of deposits and withdrawals

That's not to say that the BASIC I used at that time had support
for functions. It's only to say that the functionALITY is common
and more easily understood than reduce.
I would say that if you didn't get introduced to at least the concept
of functional programming and a hint of how it works, then something
was seriously wrong with your CS education.

It may be. That was my third-place major, which I did mostly
for fun. I focused more on my math and physics degrees, so might
have skipped a few things I would have learned had I been more
rigorous in my studies.

It may also be that my department focused on other things. Eg,
we had a very solid "foundations of computer science" course
compared to some CS departments, and we learned some fuzzy
logic and expert systems because those were a focus of the
department.
It's irrelevant whether or not many people have received poor CS
educations -- there are many people who haven't. These people should
be pleased to find reduce() in Python. And the people who received
poor or no CS educations can learn reduce() in under a minute and
should be happy to have been introduced to a cool and useful concept!

Actually, your claim is that 'reduce() can be explained to anyone in 10 seconds' ;)

I tell you this. Your estimate is completely off-base. Reduce is
more difficult to understand than sum. It requires knowing that
functions can be passed around. That is non-trivial to most,
based on my experience in explaining it to other people (which
for the most part have been computational physicists, chemists,
and biologists).

It may be different with the people you hang around -- your
email address says 'mit.edu' which is one of the *few* places
in the world which teach Scheme as the intro language for
undergrads, so you already have a strong sampling bias.

(I acknowledge my own sampling bias from observing people
in computational sciences. I hazard to guess that I know more
of those people than you do people who have studied
computer science.)

Andrew
 

Douglas Alan

Andrew Dalke said:
Douglas Alan
1) From Microsoft Multiplan (a pre-Excel spreadsheet). That
was one of its functions, which I used to help my Mom manage
her cheese co-op accounts in ... 1985?

Okay, well I can certainly see that a spreadsheet program should have
a built-in sum() function, since that's about 50% of what spreadsheets
do! But general-purpose programming languages rarely have it.
Uncommon? Here's two pre-CS 101 assignments that use
exactly that idea:
- make a computerized grade book (A=4.0, B=3.0, etc.) which
can give the grade point average
- make a program to compute the current balance of a bank
account given the initial amount and

I'm not saying that it's uncommon to want to sum a sequence of numbers
(though it's not all that common, either, for most typical programming
tasks) -- just that it's uncommon to build into the language a special
function to do it. reduce(+, seq) or apply(+, seq) is much more common,
since reduce and/or apply can do the job fine and are more general.
Or just a good old-fashioned loop.
Actually, your claim is that 'reduce() can be explained to anyone in 10 seconds' ;)

Now you want consistency from me? Boy, you ask a lot!

Besides, 10 seconds is under a minute, is it not?

Also, I said it could be explained in 10 seconds. Perhaps it takes a
minute to learn because one would need the other 50 seconds for it to
sink in.
I tell you this. Your estimate is completely off-base. Reduce is
more difficult to understand than sum. It requires knowing that
functions can be passed around.

Something that anyone learning the language should learn by the time
they need a special-purpose summing function! Before then, they can
use a loop. They need the practice anyway.
That is non-trivial to most, based on my experience in explaining it
to other people (which for the most part have been computational
physicists, chemists, and biologists).

I find this truly hard to believe. APL was a favorite among
physicists who worked at Johns Hopkins Applied Physics Laboratory
where I lived for a year when I was in high school, and you wouldn't
survive five minutes in APL without being able to grok this kind of
thing.
It may be different with the people you hang around -- your email
address says 'mit.edu' which is one of the *few* places in the world
which teach Scheme as the intro language for undergrads, so you
already have a strong sampling bias.

Yeah, and using Scheme was the *right* way to teach CS-101, dangit!
But, like I said, I was taught APL in high-school in MD, and no one
seemed troubled by reduce-like things, so it was hardly just an MIT
thing. In fact, people seemed to like reduce() and friends -- people
seemed to think it was a much more fun way to program, rather than
using boring ol' loops.
(I acknowledge my own sampling bias from observing people in
computational sciences. I hazard to guess that I know more of those
people than you do people who have studied computer science.)

Hmm, well I work for X-ray astronomers. Perhaps I should take a
poll.

|>oug
 

Patrick Maupin

Douglas said:
Describing reduce() in 10 seconds is utterly trivial to anyone with an
IQ above 100, whether or not they have ever used sum()

Well, yeah, but they may not need or want to learn or remember it.
And then there are the corner cases, e.g. sum([]) vs.
reduce(operator.add,[]) (which throws an exception).
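
(The corner case, concretely: sum defaults to 0 for an empty sequence,
bare reduce raises, and reduce needs an explicit initializer to match:)

import operator

assert sum([]) == 0
try:
    reduce(operator.add, [])
except TypeError:
    pass   # "reduce() of empty sequence with no initial value"
assert reduce(operator.add, [], 0) == 0   # initializer restores the identity
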
If someone can't understand this quickly, then they shouldn't be
programming!

Again, it's not "can't", it's whether they need to or not.
I'm sorry, but you are incorrect. When I took CS-101, we learned
assembly language, then were assigned to write a text editor in
assembly language, then we learned LISP and were assigned to write
some programs in LISP, and then we learned C, and then we were
assigned to implement LISP in C.

If you can write a !$#@!!%# LISP interpreter in C, you no doubt can
figure out something as mind-achingly simple as reduce()!

Ahh, the lisp background. I _knew_ that would come out sometime :)

Seriously, though, even in this scenario -- you don't really need
reduce() to create a LISP interpreter (as I'm sure you found
when you wrote one in C).

Assuming that your audience cannot learn the simplest of concepts is
assuming that they are stupid, not that they are ignorant.

As I and others have pointed out, it's not a matter of assuming they
can't learn, it's a matter of assuming they have better things to
do. Many people can write all the useful programs they will ever
need without reduce, and sum() makes the percentage of Python
users who can do this even higher. (Having said that, I never
personally argued that reduce() should be removed from the language,
but I do agree that it does not have to be part of "core" Python,
and could easily be relegated to a module.)
You score no points for C by saying that it is like Microsoft. That's
a strong damnation in my book. And you really don't know how the
world would have turned out if a different programming language had
been adopted rather than C for all those years. Perhaps computers
would be more expensive today, perhaps not. On the other hand, we
might not have quite so many buffer overflow security exploits.
Perhaps we'd have hardware support for realtime GC, which might be
very nice. On the other hand, perhaps people would have stuck with
assembly language for developing OS's. That wouldn't have been so
pretty, but I'm not sure that that would have made computers more
expensive. Perhaps a variant of Pascal or PL/1 would have taken the
niche that C obtained. Either of those would have been better, though
no great shakes either.

I agree that I cannot know how the world would have turned out
without C and Microsoft; but likewise, you cannot know for sure
that computer science would be ten years farther along by now :)

(And I personally feel my alternate universe is more realistic
than yours, but then everybody should feel that way about their
own private alternate universe.)
The only way it could start a religious flamewar is if there are
people who wish to present themselves as fanboys. I have said nothing
extreme -- just what is obvious: There are many nice computer
programming languages -- Python is but one of them. If someone
wishes to disagree with this, then they would have to argue that there
are no other nice programming languages. Now that would be a flame!

Well, I guess I may have read more into your original statement
than you put there. You wrote "similar goals of simplicity
and minimality", and to me, the language is pretty much a gestalt
whole, in the sense that when I read "similar goals" I was thinking
about all the features that, to me, make Python Python. These goals
actually include the lightweight implementation, the portability,
the suitability to scripting, etc. In one way or another, I feel
that these contribute to its simplicity and minimality, and on
rereading your words, I think you were probably mainly referring
to the syntax and semantics. (Even there, however, as I think Alex
has pointed out, design decisions were made which might make the
semantics less than optimal, yet contribute heavily to the small
size and portability of the language.)
I'm not sure what you are getting at. There were many nice
programming languages before Python, but not many of them, other than
Perl, were portable and well-suited to scripting.

I was just challenging you to defend a position which it appears
in hindsight you didn't really take :)
It certainly was not a particularly popular language until it came
with pretty hefty batteries. There are many other languages that
would have been equally popular before Python started coming with
batteries.

Here is one area where I think the genius of the design shows
through. Even _before_ the batteries were included, in a crowded
field of other languages, Python was good enough to acquire enough
mindshare to start the snowball rolling, by attracting the kind of
people who can actually build batteries.
Um, the last time I checked Perl was still a lot more popular than
Python, so once again I'm not sure what you are getting at. Regarding
whether or not some future language might also come with batteries and
therefore steal away Python's niche merely due to having more
batteries: Anything is possible, but this will be an uphill battle for
another language because once a language takes a niche, it is very
difficult for the language to be displaced. On the other hand, a new
language can take over a sub-niche by providing more batteries in a
particular area. PHP would be an example of this.

We are in agreement that Perl has more batteries than Python,
and also more "marketshare." To me, this is yet another testament
to Python's good design -- it is in fact currently on a marketshare
ramp, mostly because it attracts the kind of people who can do an
excellent job of writing the batteries.
I never claimed *anything* like the assertion that Python's popularity
is due to luck alone!

In the post I was responding to, you wrote "The reason for Python's wide
acceptance isn't because it is particularly well-designed compared to
other programming languages," and you also used the phrase "in the right
place at the right time." To me, these statements taken together implied
that you thought the process leading to Python's ascending popularity was
mostly stochastic.

Your later posting (and a more careful reading of your original post) help
me to put your words in the proper context, and seem to indicate that our
opinions on the subject are not as divergent as I first thought they were.

Regards,
Pat
 

Douglas Alan

Patrick Maupin wrote:
Well, yeah, but they may not need or want to learn or remember it.
And then there are the corner cases, e.g. sum([]) vs.
reduce(operator.add,[]) (which throws an exception).
If someone can't understand this quickly, then they shouldn't be
programming!
Again, it's not "can't", it's whether they need to or not.

If you don't want to learn a cool concept that will only take you 60
seconds to learn, then you shouldn't be programming! Or you can stick
to loops.

The argument that some programmers might be too lazy to want to learn
powerful, simple, and elegant features that can be taught in seconds,
is no good reason to remove such features from Python and bloat Python
by replacing them with a plethora of less powerful, less elegant
features.
Ahh, the lisp background. I _knew_ that would come out sometime :)

Knowing about reduce() doesn't come from a LISP background, since it
is uncommon to use reduce() in LISP. There are few binary operators
in LISP, so instead of doing reduce(+, seq), in LISP, you would
typically do apply(+, seq). Knowing about reduce() comes from the
part of your CS education in which they give you a small taste of what
it is like to program in a purely combinator-based style. E.g., you
might have a problem set where they ask you to solve the same problem
in three different ways: (1) using iteration, (2) using recursion, (3)
using only combinators.

Besides, if you weren't exposed at all to LISP (or a LISP-like
language) while getting a CS degree, it wasn't a very good CS
program! They're going to teach you AI techniques in a different
language? That would be rather silly.
As I and others have pointed out, it's not a matter of assuming they
can't learn, it's a matter of assuming they have better things to
do. Many people can write all the useful programs they will ever
need without reduce, and sum() makes the percentage of Python
users who can do this even higher.

And as I have pointed out, it goes against the principle of simplicity
and expressiveness to replace an easy-to-use, easy-to-learn, simple,
and powerful feature with a slew of specific, tailored features. If
reduce() can be relegated to a library or for the user to implement
for himself, then so can sum(). If the language is to only have one,
it should be reduce().
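
Indeed, sum is a one-line special case of reduce, while the reverse does
not hold (a sketch, with a hypothetical my_sum standing in for the builtin):

import operator

def my_sum(seq, start=0):
    return reduce(operator.add, seq, start)

assert my_sum([1, 2, 3]) == sum([1, 2, 3]) == 6
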
I agree that I cannot know how the world would have turned out
without C and Microsoft; but likewise, you cannot know for sure
that computer science would be ten years farther along by now :)

I didn't say it held back Computer Science -- Computer Science went
along fine. I said it held back software development.

That's not to say that something else wouldn't have taken C's place in
holding back software development, but, in that case, I'd be railing
against that instead.
Here is one area where I think the genius of the design shows
through. Even _before_ the batteries were included, in a crowded
field of other languages, Python was good enough to acquire enough
mindshare to start the snowball rolling, by attracting the kind of
people who can actually build batteries.

I think that Python always came with batteries, since it was designed
to be the scripting language for an OS called Amoeba that Guido was
working on. There were precious few other well-designed,
well-implemented languages around at the time that were aimed at
being scripting languages (and didn't run only on Lisp machines or
what have you).

|>oug
 

Douglas Alan

Douglas Alan said:
Knowing about reduce() doesn't come from a LISP background, since it
is uncommon to use reduce() in LISP. There are few binary operators
in LISP, so instead of doing reduce(+, seq), in LISP, you would
typically do apply(+, seq).

Ah, that reminds me -- both sum() and reduce() can be removed from
Python by extending operator.add so that it will take any number of
arguments. Then you can add up a sequence of numbers by doing
add(*seq). The same thing goes for every other binary operator where
it makes sense to operate on more than two arguments at a time.

Now that's clean, simple, and powerful.
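
(A sketch of such a variadic add -- a hypothetical function, since the real
operator.add is fixed at exactly two arguments; under the hood it is, of
course, reduce again:)

import operator

def add(*args):
    # variadic add: add(1, 2, 3) or add(*seq)
    return reduce(operator.add, args)

seq = [1, 2, 3, 4]
assert add(*seq) == 10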

|>oug
 

Robin Becker

Michael T. Babcock said:
I'm hoping you were being sarcastic ... but I get the feeling you aren't.

Why, pray-tell, would you want an OO program to do:

results = [ func(x) for x in sequence ]

... instead of ...

results = sequence.map(func) ??
..... well actually I'm quite happy with reduce as a function or as a
method on sequences. I actually feel uncomfortable with all the plethora
of iteration tools, comprehensions etc etc. However, I'm not forced to
use them.
 
