Python was designed (was Re: Multi-threading in Python vs Java)


Chris Angelico

Along with "batteries included" and "we're all adults", I think Python needs a pithy phrase summarizing how well thought out it is. That is to say, the major design decisions were all carefully considered, and as a result things that might appear to be problematic are actually not barriers in practice. My suggestion for this phrase is "Guido was here".

"Designed".

You simply can't get a good clean design if you just let it grow by
itself, one feature at a time. You'll end up with something where you
can do the same sort of thing in three different ways, and they all
have slightly different names:

http://me.veekun.com/blog/2012/04/09/php-a-fractal-of-bad-design/#general

(Note, I'm not here to say that PHP is awful and Python is awesome
(they are, but I'm not here to say it). It's just that I can point to
a blog post that shows what I'm saying.)

Design is why, for instance, Python's builtin types all behave the
same way with regard to in-place mutator methods: they don't return
self. I personally happen to quite like the "return self" style, as it
allows code like this:

GTK2.MenuBar()
    ->add(GTK2.MenuItem("_File")->set_submenu(GTK2.Menu()
        ->add(menuitem("_New Tab", addtab)->add_accelerator(...))
        ->add(menuitem("Close tab", closetab)->add_accelerator(...))
        ... etc ...
    ))
    ->add(GTK2.MenuItem("_Options")->set_submenu(GTK2.Menu()
        ->add(menuitem("_Font", fontdlg))
        ... etc ...
    ))
    ... etc ...

It's a single expression (this is from Pike, semantically similar to
Python) that creates and sets up the whole menu bar. Most of Pike's
object methods will return this (aka self) if it's believed to be of
use. The Python equivalent, since the .add() method on GTK objects
returns None, is a pile of code with temporary names. But that's a
smallish point of utility against a large point of consistency;
newbies can trust that a line like:

lst = lst.sort()

will trip them up immediately (since lst is now None), rather than
surprise them later when they try to make a sorted copy of the list:

sorted_lst = lst.sort()

which, if list.sort returned self, would leave you with sorted_lst is
lst, almost certainly not what the programmer intended.
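To make that concrete, a quick sketch in ordinary Python (nothing here beyond the builtins):

```python
# list.sort() sorts in place and returns None, by design
lst = [3, 1, 2]
result = lst.sort()
print(result)              # None -- the trap is sprung immediately
print(lst)                 # [1, 2, 3]

# To get a sorted copy, the sorted() builtin is the right tool
lst = [3, 1, 2]
sorted_lst = sorted(lst)
print(sorted_lst)          # [1, 2, 3]
print(lst)                 # [3, 1, 2] -- original untouched
print(sorted_lst is lst)   # False
```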

Oh, and the use of exceptions everywhere is a sign of design, too.
Something went wrong that means you can't return a plausible value?
Raise.
ValueError: Expecting object: line 1 column 0 (char 0)
EOFError
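Both of those messages are the standard library raising instead of handing back an error code; for instance (a tiny sketch, and the exact message text varies between versions):

```python
import json

try:
    json.loads("")            # malformed JSON input
except ValueError as err:     # json's decode error subclasses ValueError
    print("raised:", err)
```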

Etcetera. PHP borrows from C in having piles and piles of "was there
an error" functions; there's no consistency in naming, nor (in many
cases) in the return values. Pike generally raises exceptions, but I/O
failure usually results in a zero return and the file object's errno
attribute set; but at least they're consistent error codes.

This is design. Python has a king (Guido). It wasn't built by a
committee. Maybe you won't like some aspect of Python's design, but it
has one, it's not just sloppily slapped together.

ChrisA
 

Steven D'Aprano

This is design. Python has a king (Guido). It wasn't built by a
committee. Maybe you won't like some aspect of Python's design, but it
has one, it's not just sloppily slapped together.


While I agree with your general thrust, I don't think it's quite so
simple. Perl has a king, Larry Wall, but his design is more or less
"throw everything into the pot, it'll be fine" and consequently Perl is,
well, *weird*, with some pretty poor^W strange design decisions.

- Subroutines don't have signatures, you have to parse arguments
yourself by popping values off the magic variable @_ .

- More special variables than you can shake a stick at: @_ $_ $a $b @ARGV
$& ${^ENCODING} $. $| $= $$ $^O $^S @F and many, many more.

- Context sensitivity: these two lines do very different things:

$foo = @bar
@foo = @bar

and so do these two:

my($foo) = `bar`
my $foo = `bar`

- Sigils. Sigils everywhere.

- Separate namespaces for scalars, arrays, hashes, filehandles,
and subroutines (did I miss anything?), co-existing in the same
scope, all the better for writing code like this:

$bar = &foo($foo, $foo[1], $foo{1})

If you think that all three references to $foo refer to the same
variable, you would be wrong.

- Two scoping systems (dynamic and lexical) which don't cooperate.

- Strangers to Perl might think that the way to create a local variable
is to define it as local:

local $foo;

but you'd be wrong. "local" does something completely different. To
create a local variable, use "my $foo" instead.


More here: http://perl.plover.com/FAQs/Namespaces.html
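By way of contrast, a quick Python sketch of the same territory (the names here are made up for illustration): one namespace per scope, real signatures, and a single lexical scoping rule.

```python
foo = 10            # a name is a name: no sigils, no parallel namespaces

def bar(a, b):      # functions declare real signatures
    return a + b

def counter():
    n = 0                  # plain assignment creates a lexically scoped local
    def bump():
        nonlocal n         # explicit rebinding, not a separate local/my system
        n += 1
        return n
    return bump

c = counter()
print(c(), c())     # 1 2
print(bar(2, 3))    # 5
```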


Likewise Rasmus Lerdorf, king of PHP (at least initially), but he had no
idea what he was doing:

"I had no intention of writing a language. I didn't have a clue how to
write a language. I didn't want to write a language," Lerdorf explained.
"I just wanted to solve a problem of churning out Web applications very,
very fast."

http://www.devshed.com/c/a/PHP/PHP-Creator-Didnt-Set-Out-to-Create-a-Language/
 

Chris Angelico

While I agree with your general thrust, I don't think it's quite so
simple. Perl has a king, Larry Wall, but his design is more or less
"throw everything into the pot, it'll be fine" and consequently Perl is,
well, *weird*, with some pretty poor^W strange design decisions.

My apologies, I wasn't exactly clear. Having a king doesn't in any way
guarantee a clean design...
Likewise Rasmus Lerdorf, king of PHP (at least initially), but he had no
idea what he was doing:

"I had no intention of writing a language. I didn't have a clue how to
write a language. I didn't want to write a language," Lerdorf explained.
"I just wanted to solve a problem of churning out Web applications very,
very fast."

... yeah, what he said; but having no king pretty much condemns a
project to design-by-committee. Python has a king and a clear design.

In any case, we're broadly in agreement here. It's design that makes
Python good. That's why the PEP system and the interminable
bike-shedding on python-dev is so important... and why, at the end of
the day, the PEP's acceptance comes down to one person (Guido or a
BDFL-Delegate).

ChrisA
 

Roy Smith

Steven D'Aprano said:
While I agree with your general thrust, I don't think it's quite so
simple. Perl has a king, Larry Wall, but his design is more or less
"throw everything into the pot, it'll be fine" and consequently Perl is,
well, *weird*, with some pretty poor^W strange design decisions.

To be fair to Larry, there were different design drivers working there.

Pre-perl, people built humungous shell scripts, duct-taping together
little bits of sed, grep, awk, and other command-line tools. What perl
did was make it easier to use the functionality of those disparate
tools together in a single language. By holding on to the little bits
of syntax from the precursor languages, he kept the result familiar
feeling, so Unix sysadmins (who were the main audience for perl) were
willing to adopt it.

It was wildly successful, not because it was perfect, but because it
beat the pants off what came before it.
 

rusi

To be fair to Larry, there were different design drivers working there.

One more thing to be said for perl:

I remember when some colleague first told me about perl (I guess early 90s); I was incredulous that the *same* language could run on DOS and on Unix unchanged.
Yeah, in principle we all talked about portability; in practice, we found that the only program that would run on all systems was the asymptotic null C program:
main() {;}

So a full scale language whose programs ran unchanged on all systems was BIG back then.

That we take it for granted today indicates the shoulders of the giants we are standing on.
 

John Nagle

"Designed".

You simply can't get a good clean design if you just let it grow by
itself, one feature at a time.

No, Python went through the usual design screwups. Look at how
painful the slow transition to Unicode was, from just "str" to
Unicode strings, ASCII strings, byte strings, byte arrays,
16 and 32 bit character builds, and finally automatic switching
between rune widths. Old-style classes vs. new-style classes. Adding a
boolean type as an afterthought (that was avoidable; C went through
that painful transition before Python was created). Operator "+"
as concatenation for built-in arrays but addition for NumPy
arrays.

Each of those reflects a design error in the type system which
had to be corrected.

The type system is now in good shape. The next step is to
make Python fast. Python objects have dynamic operations suited
to a naive interpreter like CPython. These make many compile
time optimizations hard. At any time, any thread can monkey-patch
any code, object, or variable in any other thread. The ability
for anything to use "setattr()" on anything carries a high
performance price. That's part of why Unladen Swallow failed
and why PyPy development is so slow.

John Nagle
 

Peter Cacioppi

"Designed".

[snip]

This is design. Python has a king (Guido). It wasn't built by a
committee. Maybe you won't like some aspect of Python's design, but it
has one, it's not just sloppily slapped together.

ChrisA

So Python was designed reasonably well, with a minimum of hacky-screw-ups. This happened because Python's growth was effectively managed by an individual who was well suited to the task. In other words, "Guido was here".

Good thread, I learned a lot from it, thanks.
 

Mark Lawrence

So Python was designed reasonably well, with a minimum of hacky-screw-ups. This happened because Python's growth was effectively managed by an individual who was well suited to the task. In other words, "Guido was here".

Good thread, I learned a lot from it, thanks.

Would you be kind enough to learn something from this please
https://wiki.python.org/moin/GoogleGroupsPython

--
Roses are red,
Violets are blue,
Most poems rhyme,
But this one doesn't.

Mark Lawrence
 

Chris Angelico

No, Python went through the usual design screwups. Look at how
painful the slow transition to Unicode was, from just "str" to
Unicode strings, ASCII strings, byte strings, byte arrays,
16 and 32 bit character builds, and finally automatic switching
between rune widths. Old-style classes vs. new-style classes. Adding a
boolean type as an afterthought (that was avoidable; C went through
that painful transition before Python was created). Operator "+"
as concatenation for built-in arrays but addition for NumPy
arrays.

Each of those reflects a design error in the type system which
had to be corrected.

Oh, Python's design wasn't perfect - that's a pretty much impossible
goal anyway. Sometimes you don't learn what you ought to have done
till it's been in production for a while - that's why, for instance,
these happened:

https://wiki.theory.org/YourLanguageSucks#Fixed_in_Python_3

You'd have to be completely omniscient to avoid that kind of
misjudgment, and breaking backward compatibility is such a major cost
that sometimes design errors just have to be kept. But you'll still
end up with something far cleaner than would come from ad-hoc
undirected changes; it'll be a design with warts, rather than a lack
of design.
The type system is now in good shape. The next step is to
make Python fast. Python objects have dynamic operations suited
to a naive interpreter like CPython. These make many compile
time optimizations hard. At any time, any thread can monkey-patch
any code, object, or variable in any other thread. The ability
for anything to use "setattr()" on anything carries a high
performance price. That's part of why Unladen Swallow failed
and why PyPy development is so slow.

Yeah, this does make things hard. But that dynamism is a fundamental
part of Python's design, even if it's used by almost nothing. I'd say
this isn't proof of a design error, just a consequence of a design
decision. Python isn't for everyone, nor for every task - sometimes
it'll be too slow for what you want. So be it! There are plenty of
places where it's good. And there are similar languages (hi Pike!) for
when you want a bit more performance.
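To illustrate the dynamism in question, a minimal sketch (Greeter is a made-up example) of the monkey-patching an optimizer must assume can happen at any time:

```python
class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())                 # hello

# At runtime, any code can rebind attributes on the class...
setattr(Greeter, "greet", lambda self: "patched")
print(g.greet())                 # patched

# ...so even method lookups can't be cached naively by the interpreter.
```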

ChrisA
 

Chris Angelico

So Python was designed reasonably well, with a minimum of hacky-screw-ups. This happened because Python's growth was effectively managed by an individual who was well suited to the task. In other words, "Guido was here".

Good thread, I learned a lot from it, thanks.

Pretty much, yeah. We're saying the same thing, only I'm focusing on
the importance of design rather than deifying the person who designed
it. But yes, that comes to much the same result.

ChrisA
 

Chris Angelico

No, Python went through the usual design screwups.
Each of [the below] reflects a design error in the type system which
had to be corrected.

I'll pick up each one here as I think some of them need further discussion.
Look at how painful the slow transition to Unicode was,
from just "str" to Unicode strings, ASCII strings, byte strings, byte
arrays, 16 and 32 bit character builds, and finally automatic
switching between rune widths.

I'm not sure what you mean by all of these - I've known Python for
only a (relatively) short time, wasn't there in the 1.x days (much
less the <1.0 days). But according to its history page, the early 1.x
versions of Python predate the widespread adoption of Unicode, so it's
a little unfair to look with 2013 eyes and say that full true Unicode
support should have been there from the start. If anyone invents a
language today that doesn't handle Unicode properly, I would be very
much disappointed; but changing the meaning of quoted string literals
is a pretty major change. I'm just glad it got sorted out for 3.0. As
to the 16/32 bit builds, there aren't actually very many languages
that get this right; Python's now a blazing torch, showing the way for
others to follow. (Pike's had something very similar to PEP 393 for
years, but nobody looks to obscurities.) I hope we'll see other
languages start to follow suit.
Old-style classes vs. new-style classes.

By the time I started using Python, new-style classes existed and were
the recommended way to do things, so I never got the "feel" for
old-style classes. I assume there was a simplicity to them, since
new-style classes were described as having a performance cost, but one
worth paying. My guess is it comes under the category of "would have
to be omniscient to recognize what would happen"; Steven, maybe you
can fill us in?
Adding a
boolean type as an afterthought (that was avoidable; C went through
that painful transition before Python was created).

I don't know about that. Some languages get by just fine without a
dedicated boolean type. Python didn't have them, then it had them as
integers, now it has them as bools. Is it a major problem? (Apart from
adding them in a point release. That's an admitted mistake.) Python
doesn't have a 'vector' type either, you just use a tuple. Some things
don't need to be in the language, they can be pushed off to the
standard library. And speaking of which...
Operator "+" as concatenation for built-in arrays but addition
for NumPy arrays.

... NumPy definitely isn't part of the language. It's not even part of
the standard library, it's fully third-party. The language decrees
that [1,2] + [3,4] = [1,2,3,4], and that custom_object1 +
custom_object2 = custom_object1.__add__(custom_object2) more or less,
and then leaves the implementation of __add__ up to you. Maybe you'll
make an "Entropy" class, where entropy+=int blocks until it's acquired
that much more entropy (maybe from /dev/random), and entropy-int
returns a random number based on its current state. It makes a measure
of sense, if not what you normally would want. You can shoot yourself
in the foot in any language; and if you write something as big and
popular as NumPy, you get to shoot other people in the foot too! :)
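A minimal sketch of that machinery (Vec is a made-up toy class, not NumPy):

```python
class Vec:
    # A toy class whose + means element-wise addition, NumPy-style,
    # while the language keeps + as concatenation for plain lists.
    def __init__(self, *xs):
        self.xs = list(xs)

    def __add__(self, other):
        return Vec(*(a + b for a, b in zip(self.xs, other.xs)))

    def __repr__(self):
        return "Vec(%s)" % ", ".join(map(str, self.xs))

print([1, 2] + [3, 4])        # [1, 2, 3, 4] -- concatenation, per the language
print(Vec(1, 2) + Vec(3, 4))  # Vec(4, 6)    -- addition, per our __add__
```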

ChrisA
 

Mark Janssen

No, Python went through the usual design screwups.

I hesitate to poke my nose in here, but Python is fine. No one knows
how to design the perfect language from the start, otherwise it would
be here. But Python has set the precedent for allowing
backwards-incompatibility to fix language problems and that's what
will keep it from breaking.
Look at how
painful the slow transition to Unicode was, from just "str" to
Unicode strings, ASCII strings, byte strings, byte arrays,

This is where I wish I could have been involved with the discussion,
but I was outside of civilization at the time, and was not able to
contribute.
16 and 32 bit character builds, and finally automatic switching
between rune widths. Old-style classes vs. new-style classes. Adding a
boolean type as an afterthought (that was avoidable; C went through
that painful transition before Python was created). Operator "+"
as concatenation for built-in arrays but addition for NumPy
arrays.

All of this will get fixed, but the problem is that you are stirring
up issues without really understanding the problem. The problem is
something that is at the bleeding-edge of Computer Science itself and
settling on a theory of types. I've answered this by creating a
unified object model, but no one has understood why the hell anyone
needs one, so I'm sitting around waiting for a friend.
Each of those reflects a design error in the type system which
had to be corrected.

To call it a "design error" makes it seem like someone made a decision
that resulted in a mistake, but it isn't (wasn't) that simple.
The type system is now in good shape. The next step is to
make Python fast.

Woah, boy. There's no reason to make an incomplete design faster, for
pseudo-problems that no one will care about in 5-10 years. The field
has yet to realize that it needs an object model, or even what that
is.
Python objects have dynamic operations suited
to a naive interpreter like CPython.

Naive, no.
These make many compile
time optimizations hard. At any time, any thread can monkey-patch
any code, object, or variable in any other thread. The ability
for anything to use "setattr()" on anything carries a high
performance price. That's part of why Unladen Swallow failed
and why PyPy development is so slow.

Yes, and all of that is because the world has not settled on some
simple facts. It needs an understanding of type systems. It's been
throwing terms around, some of which are well-defined, but others,
not: there has been enormous cross-breeding that has made mutts out
of everybody and someone's going to have to eat a floppy disk for
feigning authority where there wasn't any.

Mark J
Tacoma, Washington
 

Terry Reedy

I'm not sure what you mean by all of these - I've known Python for
only a (relatively) short time, wasn't there in the 1.x days (much
less the <1.0 days). But according to its history page, the early 1.x
versions of Python predate the widespread adoption of Unicode, so it's
a little unfair to look with 2013 eyes and say that full true Unicode
support should have been there from the start.

The first versions of Python and unicode were developed and released
about the same time. No one knew that either would be as successful as
they have become over two decades.
By the time I started using Python, new-style classes existed and were
the recommended way to do things, so I never got the "feel" for
old-style classes. I assume there was a simplicity to them, since

Too simple. All user classes were instances of the userclass type. All
user instances were instances of the userinstance type, or something
like that. They were otherwise separate from builtin types. I have
forgotten the details and have no wish to remember.

The system was usable but klutzy. I believe it was an add-on after the
initial release. People wanted to be able to subclass builtins even back
in 1.4 days, but Guido did not realize how to use the obscure metaclass
hook to do so until 2.2 was being developed. Most core devs are happy to
be rid of them (except when patching 2.7).
 

Chris Angelico

Naive, no.

"Naive", in this instance, means executing code exactly as written,
without optimizing things (and it's not an insult, btw). For instance,
a C compiler might turn this into simple register operations:

int x = 5;

int foo()
{
    x += 3;
    return x * 2;
}

The two references to 'x' inside foo() can safely be assumed to be the
same 'x', and the value as written by the += MUST be the one used to
calculate *2. If you declare x to be volatile, that assumption won't
be made, and the interpretation will be naive. Now here's the CPython
equivalent:

x = 5
def foo():
    global x
    x += 3
    return x * 2

  3           0 LOAD_GLOBAL              0 (x)
              3 LOAD_CONST               1 (3)
              6 INPLACE_ADD
              7 STORE_GLOBAL             0 (x)

  4          10 LOAD_GLOBAL              0 (x)
             13 LOAD_CONST               2 (2)
             16 BINARY_MULTIPLY
             17 RETURN_VALUE

Note that the global is stored, then reloaded. This is the naive
approach, assuming nothing about the relations between operations.
It's an easy way to be thread-safe, it just costs performance.
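(That listing comes from the standard dis module; to reproduce it yourself — the exact offsets and opcode names vary a little between CPython versions:)

```python
import dis

x = 5

def foo():
    global x
    x += 3
    return x * 2

# Prints the bytecode, including the STORE_GLOBAL / LOAD_GLOBAL round-trip
dis.dis(foo)
```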

ChrisA
 

Roy Smith

Terry Reedy said:
The first versions of Python and unicode were developed and released
about the same time. No one knew that either would be as successful as
they have become over two decades.

Much the same can be said for IPv6 :)
 

Mark Janssen

Python objects have dynamic operations suited
"Naive", in this instance, means executing code exactly as written,
without optimizing things (and it's not an insult, btw).

In that case, you're talking about a "non-optimizing" interpreter, but
then, that is what is supposed to happen. I don't think it's fair to
call it "naive". An interpreter can't guess what you mean to do in
every circumstance (threading?). It's better to do it right (i.e.
well-defined), *slowly* than to do it fast, incorrectly.

MarkJ
Tacoma, Washington
 

Chris Angelico

In that case, you're talking about a "non-optimizing" interpreter, but
then, that is what is supposed to happen. I don't think it's fair to
call it "naive". An interpreter can't guess what you mean to do in
every circumstance (threading?). It's better to do it right (i.e.
well-defined), *slowly* than to do it fast, incorrectly.

The only thing that's unfair is the interpretation of "naive" as
meaning somehow inferior.

https://en.wikipedia.org/wiki/Naivety#Science

As you say, it's better to do it right slowly than wrong quickly. The
naive method is more easily proven.

ChrisA
 

rusi

Yes, and all of that is because the world has not settled on some
simple facts. It needs an understanding of type systems. It's been
throwing terms around, some of which are well-defined, but others,
not: there has been enormous cross-breeding that has made mutts out
of everybody and someone's going to have to eat a floppy disk for
feigning authority where there wasn't any.

Objects in programming languages (or 'values', if one is more functional-programming oriented) correspond to things in the world.
Types, on the other hand, correspond to our classifications, and so are things in our minds.
So for the world 'to settle' on a single universal type system is about as nonsensical and self-contradictory as you and I having the same thoughts.

To see how completely nonsensical the classification system of a so-called alien culture can appear, please read:
http://en.wikipedia.org/wiki/Celestial_Emporium_of_Benevolent_Knowledge

And then reflect that the passage is implying that CONVERSELY our natural/obvious/FACTual classifications would appear similarly nonsensical to them.

The same in the world of programming languages:

Here's an APL session:

$ ./apl

Welcome to GNU APL version 1.0
      1 + 2
3
      1 + 2 3 4
3 4 5
      1 = 2
0
      1 2 3 = 2 3 4
0 0 0
      1 = 1 2 3
1 0 0
      2 ≥ 1 2 3
1 1 0


a perfectly good (and, for many of us old-timers, a very beautiful) type
system, but completely incompatible with anything designed in the last 40
years! [Hell, it does not even have a prompt! Also note the character set
(≥ not >=) -- long before Unicode, not an emasculated deference to ASCII.]
 

Steven D'Aprano

No, Python went through the usual design screwups. Look at how
painful the slow transition to Unicode was, from just "str" to Unicode
strings, ASCII strings, byte strings, byte arrays, 16 and 32 bit
character builds, and finally automatic switching between rune widths.

Are you suggesting that Guido van Rossum wasn't omniscient back in 1991
when he first released Python??? OH MY GOD!!! You ought to blog about
this, let the world know!!!!

But seriously... although the Unicode standard began as early as
1987, the first official release of the standard wasn't until nine months
after the first public release of Python. Do you really consider it a
"design screwup" that Guido didn't build support for Unicode into Python
since the beginning?

Given the constraints of backwards-compatibility, and that Unicode didn't
even exist when Python was first created, I don't think the history of
Unicode support in Python is a screw-up in the least. And if it is a
screw-up, it's *Unicode's* screw-up, because they're the ones that
thought that 16-bit chars would have been enough in the first place.

While it would have been nice if Python had invented the idea of using
different rune widths back in Python 2.2, I don't think we can hold it
against GvR or the other Python devs that they didn't. They're only
human. As far as I know, only one other language does such a thing,
namely Pike, which is not exactly high-profile.

Old-style classes vs. new-style classes. Adding a boolean type as an
afterthought (that was avoidable; C went through that painful transition
before Python was created). Operator "+" as concatenation for
built-in arrays but addition for NumPy arrays.

Each of those reflects a design error in the type system which
had to be corrected.

Perhaps the first one -- had GvR not decided in the first place that
built-in types should be separate from user-defined classes, the old vs
new style class thing would have been unnecessary. But bools are not an
example. The decision to leave out bools as a separate type was, and
remains, a perfectly legitimate decision. Perhaps one might argue that
Python-with-bools is *better* than Python-without-bools, but we would be
foolish to argue that Python-without-bools was a screw-up. Bools are a
nice-to-have, not a must-have.
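For what it's worth, when bools did arrive (in 2.3), they were slotted in as a subclass of int, which is why the transition was as painless as it was; a quick sketch:

```python
print(isinstance(True, int))     # True -- bool subclasses int
print(True + True)               # 2   -- old integer-truth code keeps working
print(True == 1, False == 0)     # True True
```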

And as for numpy arrays, well, if a long-standing Python developer such
as yourself doesn't yet understand that this is a feature, not a mistake,
there's a serious problem, and it's not with Python. Operator overloading
exists precisely so that custom classes aren't limited to the exact same
behaviour as built-ins. The fact that the numpy devs made a different
decision as to what + means than the Python devs is not a sign that the
design was screwed up, it is a sign that the system works.

It is true that numpy has a problem with Python operators in that there
aren't enough of them. There have been various attempts to work out a
good syntax for adding arbitrary additional operators, so that numpy can
have *both* element-wise operators and array-wise operators at the same
time. But the lack of this is not a design screw-up. It's a hard problem
to solve, and sometimes it is better to do without a feature than to add
it poorly.

The type system is now in good shape. The next step is to
make Python fast.

Whenever I see somebody describing a *language* as "fast" or "slow",
especially when the next few sentences reveal that they are aware of the
existence of multiple *implementations*:
Python objects have dynamic operations suited to a
naive interpreter like CPython. [...] That's
part of why Unladen Swallow failed and why PyPy development is so slow.

as if "fast" and "slow" were objective, concrete and most importantly
*fixed* standards that are the same for everybody, then I suspect
trolling.

Or to put it another way: Python is already fast. Using PyPy, you can
write pure-Python code that is faster than the equivalent optimized C
code compiled using gcc. Even using vanilla CPython, you can write pure
Python code that (for example) checks over 12,000 nine-digit integers for
primality per second, on a relatively old and slow computer. If that's
not *fast*, nothing is.
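A minimal sketch of the kind of pure-Python primality check being described (my toy code, not the actual benchmark):

```python
def is_prime(n):
    # Trial division up to sqrt(n); plenty fast for nine-digit integers.
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

print(is_prime(97))          # True
print(is_prime(999999999))   # False (digit sum 81, divisible by 3)
```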

Whether it is *fast enough* is a completely different question, and one
which leads to the question "fast enough for what?". But people who like
to complain about "Python being slow" don't like that question.
 

rusi

Are you suggesting that Guido van Rossum wasn't omniscient back in 1991
when he first released Python??? OH MY GOD!!! You ought to blog about
this, let the world know!!!!

You are making a strawman out of John's statements:
Python went through the usual design screwups.
[screwup list which perhaps pinches John most]
Each of those reflects a design error in the type system which had to be corrected.

The reasonable interpretation of John's statements is that propriety and even truth is a function of time: It was inappropriate for GvR to have put in unicode in 1990. It was appropriate in 2008. And it was done. You may call that being-human-not-God. I call that being real.

To have reality time-invariant would imply, for example, that Abraham Lincoln was a racist because he used the word 'negro' (see speech
http://en.wikipedia.org/wiki/Abraham_Lincoln_and_slavery#Legal_and_political )

Or that it is ok to do so today.
 
