Confessions of a Python fanboy

Masklinn

In no particular order, and not necessarily exhaustive:

* The risk of obfuscation in your code. That's fairly minimal for lambdas, because they're just a single expression, but for a large anonymous code block (ACB) defined inside a named function, it may be difficult for the reader to easily distinguish which bits are the outer function and which are the ACB.

I believe that one's unadulterated BS.
* Loss of useful debugging information. Take this example from Python:

>>> f = lambda n: 2/(n-3)
>>> def main():
...     return f(3)
...
>>> main()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in main
  File "<stdin>", line 1, in <lambda>
ZeroDivisionError: integer division or modulo by zero

>>> def my_special_function(n):
...     return 2/(n-3)
...
>>> def main():
...     return my_special_function(3)
...
>>> main()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in main
  File "<stdin>", line 2, in my_special_function
ZeroDivisionError: integer division or modulo by zero

If your code has only one anonymous function (whether a lambda or a full multi-line block), then it's easy to identify which lambda raised the exception: there is only one it could be. But if your code uses lots of lambdas, the lack of a function name makes it hard to distinguish one <lambda> from another <lambda>. Anonymity makes identification harder.

The traceback gives you the line of the anonymous function (even in Python), so unless you have several anonymous functions on the same line, there's no reason why that would be much of an issue. Furthermore, Python doesn't provide any more information when the error happens outside of a function (in a `for` or a `with`), so it's not like there's much of a difference here between Ruby's block-based approach and Python's statement-based approach.
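
To illustrate the point (a sketch with hypothetical callbacks; note the Python 3 division), the line number in the traceback is enough to tell two <lambda>s apart:

import traceback

callbacks = [
    lambda n: 2 / (n - 3),   # this one divides by zero when n == 3
    lambda n: 2 / (n - 4),   # this one is fine for n == 3
]
for cb in callbacks:
    try:
        cb(3)
    except ZeroDivisionError:
        # the frame still reads "<lambda>", but its line number pins it down
        traceback.print_exc()
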
* Risk of code-duplication and breaking the principle of Once And Only Once. Anonymous functions are generally created, used, then immediately thrown away -- or at least made more-or-less inaccessible for reuse. An anonymous function stored in a callback still exists, but the coder isn't able to easily re-use it for another callback somewhere else in the code. Consequently, there's a temptation for the coder to write the same function multiple times:

add_button("Parrot", colour=blue, callback=lambda x: x.stuff('a'))
add_button("Cheese", flavour=tasty, callback=lambda x: x.thing('b'))
add_button("Canary", colour=yellow, callback=lambda x: x.stuff('a'))

instead of:

def bird_callback(x):
    return x.stuff('a')

add_button("Parrot", colour=blue, callback=bird_callback)
add_button("Cheese", flavour=tasty, callback=lambda x: x.thing('b'))
add_button("Canary", colour=yellow, callback=bird_callback)

Yes, that one I can give you, though I don't think that's a big issue. And it's not like it's hard to extract the anonymous function into a named one and then use that on the third strike, so I really don't believe that point holds much water.
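
For instance (a sketch; Bird is a hypothetical stand-in for the GUI object in the example above), functools.partial makes the extraction even cheaper by letting one named function cover several tags:

import functools

def stuff_callback(tag, x):
    return x.stuff(tag)          # named once, reusable for every button

bird_callback = functools.partial(stuff_callback, 'a')

class Bird:                      # hypothetical stand-in for the GUI object
    def stuff(self, tag):
        return 'stuffed ' + tag

print(bird_callback(Bird()))     # -> stuffed a
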
* Recursion is more or less impossible without fragile tricks.

(At least for Python. I don't know how recursion operates in Ruby.)

Code blocks are rarely if ever used recursively. If an operation is using anonymous functions recursively, then there's often something very wrong with the idea leading to that code. So I find this objection irrelevant.
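
For completeness, the "fragile trick" in Python looks something like this: the lambda can only recurse through whatever name it happens to be bound to, so rebinding that name breaks it:

fact = lambda n: 1 if n <= 1 else n * fact(n - 1)
print(fact(5))      # 120

f, fact = fact, None
# f(5) would now raise TypeError: the lambda's body still looks up the
# global name "fact", which no longer refers to the function.
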
 
Steven D'Aprano

You could say that Ruby doesn't either,

Well you could say a lot of things. Admittedly you don't need a separate
"constant" declaration, but creating a variable name with an initial
uppercase letter is sufficient to make it a (supposed) constant:

irb(main):049:0* Thing = 5
=> 5
irb(main):050:0> Thing = 7
(irb):50: warning: already initialized constant Thing
=> 7

As you can see, Ruby (well, irb at least) considers that Thing is a
constant, and then goes ahead and changes it anyway.

Apart from the use of an arbitrary naming convention instead of an
explicit "make this constant" declaration, and the feeble way Ruby
capitulates when you change it, I think having actual write-once
constants would actually be a plus.
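
For what it's worth, a sketch (one approach among several, nothing standard) of a genuinely write-once holder in Python, which has no constant declaration either:

class Constants(object):
    def __setattr__(self, name, value):
        if name in self.__dict__:
            raise AttributeError("constant %r is already set" % name)
        object.__setattr__(self, name, value)

consts = Constants()
consts.Thing = 5
consts.Thing = 7    # raises AttributeError instead of warning and proceeding
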

you just need to read the
documentation. Ruby's unwritten motto is "flexibility über alles". In
this regard, it is consistent (1).

"It's consistent in its inconsistency" sort of thing perhaps?

Not much is really bolted down in
Ruby. You get encapsulation, but it's so easy to break that it's mostly
symbolic.

Out of curiosity, can you read/write class and instance attributes from
outside the class without using a getter/setter?
 
Emmanuel Surleau

Well you could say a lot of things. Admittedly you don't need a separate
"constant" declaration, but creating a variable name with an initial
uppercase letter is sufficient to make it a (supposed) constant:

*shrugs* I see it as a way to encourage (but not force) people to follow a
coding convention. Is this such a bad thing?
irb(main):049:0* Thing = 5
=> 5
irb(main):050:0> Thing = 7
(irb):50: warning: already initialized constant Thing
=> 7

As you can see, Ruby (well, irb at least) considers that Thing is a
constant, and then goes ahead and changes it anyway.

I'm quite aware of how constants work in Ruby, yes :)
Apart from the use of an arbitrary naming convention instead of an
explicit "make this constant" declaration, and the feeble way Ruby
capitulates when you change it, I think having actual write-once
constants is actually a plus.

Nothing wrong with naming conventions. This encourages a uniform coding style,
something which Python could really use.
"It's consistent in its inconsistency" sort of thing perhaps?

No, just consistent.
Out of curiosity, can you read/write class and instance attributes from
outside the class without using a getter/setter?

If you have an instance f of a class with an attribute @foo which doesn't
have an accessor, you could do:

f.instance_eval("@foo='bar'")

And yes, instance_eval is evil.
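
(Ruby's instance_variable_get / instance_variable_set do the same job without eval'ing a string.) For contrast, a sketch of the Python side of that question, where attributes are open by default and name mangling is only a convention:

class Widget(object):
    def __init__(self):
        self.foo = 'spam'      # plain attribute: readable and writable anywhere
        self.__bar = 'eggs'    # mangled to _Widget__bar, not truly private

w = Widget()
w.foo = 'bar'                  # no setter needed
print(w._Widget__bar)          # -> eggs; mangling is a speed bump, not a wall
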

Cheers,

Emm
 
Bruno Desthuilliers

Steven D'Aprano wrote:
Incorrect.

Correct for all relevant cases, except this one:
>>> class K(object):
...     class_attribute = 'No @@ required.'
...
>>> K.class_attribute
'No @@ required.'

Once again: Ruby's attributes are *private*, so you can't access an
attribute (class or instance) from outside a method. IOW, the above
example is irrelevant.

(snip)
Disadvantages: your code is filled with line noise. It's an arbitrary
choice between @ meaning instance attribute and @@ meaning class
attribute -- there's no logical reason for choosing one over the other,
so you have to memorise which is which. It's easy to get it wrong.

So far that's something I have no difficulty living with.
What did I misread from here?

Nowhere - it's me that got it wrong here, sorry.

(snip)

It made me feel good.

Why ???

You don't like Ruby ? Fine, don't use it. Period. I can't see the point
of all these pissing contests.
But seriously, while I admit that I have very little Ruby experience, and
so am not in a great position to judge, it seems to me that Ruby doesn't
have anything like Python's over-riding design principles (the Zen). If
there is a design principle to Ruby, I can't see what it is.

Fulfill the tastes of Matz ?-)

(snip)
Just because Smalltalk had a particular (mis?)feature doesn't mean that
other languages should copy it.

You can drop the 'mis' part IMHO. The point of code blocks in Smalltalk
is that once you have something as powerful as the message+code blocks
combo, you just don't need any other 'special form' for control flow.

Nope. But OTOH, Python is famous for all the features it copied from
other languages !-)
 
Steven D'Aprano

So far that's something I have no difficulty living with.

I don't like arbitrary symbols. Most people don't -- that's why "line
noise" is unpopular. It's hard to read, hard to write, hard to maintain,
and hard to talk about. The more line-noise, the worse the language.

Of course, *ultimately* every symbol is arbitrary. There's no reason why
"2" should mean the integer two, or "list" should mean a sequence type,
but some symbols have such a long history, or have some other connection
(say, with human languages), that the arbitrariness is lost. For
instance, "+" is the obvious, non-arbitrary choice for the addition
operator in any programming language using Latin symbols, and probably
any programming language on Earth. (Not the *only* choice, but the
obvious one.)

I have a similar dislike for decorator syntax, because "@" ("at" in
English) has nothing to do with decorations. It's an arbitrary symbol.
One might argue that "$" would have been a more logical choice, because
we turn numerals into currency by decorating them with a $ sign. (At least
in the US, Canada, Australia, and a few other countries.) I use
decorators all the time, and they are a fantastic invention, but the
arbitrariness of the @ syntax is a negative. Oh well, one negative out of
a whole lot of positives isn't too bad.

At least I only have to deal with *one* such arbitrary symbol that needs
memorizing. There's no need to distinguish between @@function_decorator
and @class_decorator (or should it be the other way around?). Similarly,
Python's choice of syntax for attributes is consistent: object.attribute
works for everything, whether object is a class, an instance, a module,
and whether attribute is callable or not. You can even use it on ints,
provided you are clever about it:

>>> (42).__class__
<type 'int'>


Why ???

You don't like Ruby ? Fine, don't use it. Period. I can't see the point
of all these pissing contests.

Criticism of a language is a pissing contest?

Yeah, okay, I was a tad dismissive. I unapologetically jump to strong
impressions about languages based on minimal use -- but I'm also willing
to change my mind. Ruby certainly looks to me like it has some nice
features. Syntax that looks like Perl isn't one of them though.

You can drop the 'mis' part IMHO. The point of code blocks in Smalltalk
is that once you have something as powerful as the message+code blocks
combo, you just don't need any other 'special form' for control flow.

Well, maybe, but remember, programming languages are only partly for
communication to the compiler. They also have the requirement to
communicate with human programmers as well, and that's even more
important, because

(1) humans spend a lot more time working with code than compilers do;

(2) human programmers charge much more money than compilers do;

(3) and you can modify the compiler to suit human needs much more easily
than you can modify programmers to suit the compiler's needs.

So I'd ask, does Smalltalk's message passing model match the way human
beings think? If not, then that's a good sign it might be a misfeature.

Nope. But OTOH, Python is famous for all the features it copied from
other languages !-)

Absolutely! There's nothing wrong with copying *good* features :)
 
Bruno Desthuilliers

Steven D'Aprano wrote:
> On Tue, 04 Aug 2009 10:03:53 +0200, Bruno Desthuilliers wrote:
>
> I don't like arbitrary symbols.

Neither do I - when there are too many, at least. But I can certainly
live with a couple of them. Now, the point wasn't about my personal tastes,
but about the fact that this particular pair of "arbitrary symbols" was
IMHO still usable - IOW, I wouldn't dismiss Ruby on this point alone.
> Most people don't -- that's why "line
> noise" is unpopular. It's hard to read, hard to write, hard to maintain,
> and hard to talk about. The more line-noise, the worse the language.

OTOH, too much verbosity is a pain too. Ever programmed in AppleScript ?
"set the attribute XXX of object YYY of collection ZZZ to SomeValue"...
Yuck. I bet you prefer "zzz[yyy].xxx = SomeValue" - which uses three
arbitrary symbols.

(snip)
> Oh well, one negative out of
> a whole lot of positives isn't too bad.

Indeed. You could perhaps learn a bit more about Ruby's positives if you
don't fixate on what you perceive (rightly or not) as negative points ?-)

(snip)
>
> Criticism of a language is a pissing contest?

Not necessarily. But:
> Yeah, okay, I was a tad dismissive. I unapologetically jump to strong
> impressions about languages based on minimal use

Possibly, yes.
> -- but I'm also willing
> to change my mind. Ruby certainly looks to me like it has some nice
> features. Syntax that looks like Perl isn't one of them though.

Not my cup of tea either, FWIW. But Ruby is nowhere near Perl in terms
of "line noise".
>
>
> Well, maybe, but remember, programming languages are only partly for
> communication to the compiler. They also have the requirement to
> communicate with human programmers as well, and that's even more
> important, because
>
> (1) humans spend a lot more time working with code than compilers do;
>
> (2) human programmers charge much more money than compilers do;
>
> (3) and you can modify the compiler to suit human needs much more easily
> than you can modify programmers to suit the compiler's needs.
>
> So I'd ask, does Smalltalk's message passing model match the way human
> beings think?

Do all human beings think the same way ? And aren't human beings able
to learn new ways ?

Smalltalk's only control flow construct might seem a bit weird at first
when all you've been exposed to so far are more "traditional" special
constructs, but it's not hard to learn, and it is way more uniform and
flexible than having a special construct for each and every possible
situation.
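
To make that concrete, a rough Python sketch of the Smalltalk scheme: ifTrue:ifFalse: is an ordinary message whose arguments are blocks, resolved by polymorphic dispatch with no special "if" form anywhere:

class STrue(object):
    def if_true_if_false(self, true_block, false_block):
        return true_block()

class SFalse(object):
    def if_true_if_false(self, true_block, false_block):
        return false_block()

answer = STrue() if 3 > 2 else SFalse()   # stand-in for a comparison message
print(answer.if_true_if_false(lambda: "yes", lambda: "no"))   # -> yes
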

One could ask whether functional programming or OO "matches the way human
beings think". From experience, some of us find FP and/or OO just
obvious, and some will never get it.

FWIW, there are quite a few features and idioms in Python that I _now_
find readable and obvious, but that would have puzzled me ten years ago.
This reminds me of a shop where the CTO had forbidden using any OO
feature of the main language used there because "nobody would understand
it" (needless to say, I only stayed there a couple weeks...).
>
>
> Absolutely! There's nothing wrong with copying *good* features :)

Well... at least when they make sense and integrate smoothly into the
target language.
 
Just Another Victim of the Ambient Morality

Steven D'Aprano said:
I don't like arbitrary symbols. Most people don't -- that's why "line
noise" is unpopular. It's hard to read, hard to write, hard to maintain,
and hard to talk about. The more line-noise, the worse the language.

It's not "line noise" if it conveys information...

Of course, *ultimately* every symbol is arbitrary. There's no reason why
"2" should mean the integer two, or "list" should mean a sequence type,
but some symbols have such a long history, or have some other connection
(say, with human languages), that the arbitrariness is lost. For
instance, "+" is the obvious, non-arbitrary choice for the addition
operator in any programming language using Latin symbols, and probably
any programming language on Earth. (Not the *only* choice, but the
obvious one.)

I have a similar dislike for decorator syntax, because "@" ("at" in
English) has nothing to do with decorations. It's an arbitrary symbol.
One might argue that "$" would have been a more logical choice, because
we turn numerals into currency by decorating them with a $ sign. (At least
in the US, Canada, Australia, and a few other countries.) I use
decorators all the time, and they are a fantastic invention, but the
arbitrariness of the @ syntax is a negative. Oh well, one negative out of
a whole lot of positives isn't too bad.

You can think of "@" as describing something being "at" the instance or
the class. "$" is totally arbitrary to me 'cause I don't think of my code as
currency...

At least I only have to deal with *one* such arbitrary symbol that needs
memorizing. There's no need to distinguish between @@function_decorator
and @class_decorator (or should it be the other way around?). Similarly,
Python's choice of syntax for attributes is consistent: object.attribute
works for everything, whether object is a class, an instance, a module,
and whether attribute is callable or not. You can even use it on ints,
provided you are clever about it:

You can think of "@" as being at an instance, or "@@" as being more
emphatically (as the Japanese do in their language) integrated with a class,
available to all instances... or you can simply understand that
instance variables are more common than class variables, so the shorter
notation is used for the more common case...
You want to talk about arbitrariness? Why is len() a function you pass
objects into while objects can have methods that describe themselves to you?
At least Ruby defines operators using the actual name of the operator
instead of you having to remember an arbitrary magic incantation that
corresponds to the operator...

class Test
  attr_reader :data

  def initialize(data)
    @data = data
  end

  # This is the operator part...
  def +(right)
    Test.new(@data + right.data)
  end
end
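
For reference, the Python spelling of the same operator, the "magic incantation" being the specially named __add__ method:

class Test(object):
    def __init__(self, data):
        self.data = data

    def __add__(self, right):          # "+" dispatches here
        return Test(self.data + right.data)

print((Test(1) + Test(2)).data)        # -> 3
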

Criticism of a language is a pissing contest?

Yeah, okay, I was a tad dismissive. I unapologetically jump to strong
impressions about languages based on minimal use -- but I'm also willing
to change my mind. Ruby certainly looks to me like it has some nice
features. Syntax that looks like Perl isn't one of them though.

Yeah, that would be the "pissing contest" part.
You could simply have gone to the Ruby newsgroup and posted some
criticisms to see what was behind those decisions. However, that would have
avoided the pissing contest and perhaps you wanted one...

Well, maybe, but remember, programming languages are only partly for
communication to the compiler. They also have the requirement to
communicate with human programmers as well, and that's even more
important, because

(1) humans spend a lot more time working with code than compilers do;

(2) human programmers charge much more money than compilers do;

(3) and you can modify the compiler to suit human needs much more easily
than you can modify programmers to suit the compiler's needs.

So I'd ask, does Smalltalk's message passing model match the way human
beings think? If not, then that's a good sign it might be a misfeature.

I think this is overstated. Not all humans think in exactly the same
way so it's very presumptuous. It also ignores our intellect. We can
change the way we think and we can use this versatility to our advantage.
We can increase our productivity in certain tasks by changing the way we
think to some other paradigm that better serves a given purpose...

Absolutely! There's nothing wrong with copying *good* features :)

The trick, of course, is figuring out which features are good!
 
Luis Zarrabeitia

#each is simply a method that takes a function (called a block in
Ruby). One could call it a higher-order method, I guess.

It's an implementation of the concept of internal iteration: instead
of collections yielding iterator objects, and programmers using those
through specially-built iteration constructs (e.g. `for…in`),
collections control iteration over themselves (the iteration is
performed "inside" the collection, thus the "internal" part) and the
programmer provides the operations to perform at each iterative step
through (usually) a function.
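
In Python terms, a minimal sketch of that idea (EachList is hypothetical): the collection runs the loop and calls back into user code at each step:

class EachList(object):
    def __init__(self, items):
        self._items = list(items)

    def each(self, fn):
        for item in self._items:    # the loop lives *inside* the collection
            fn(item)

def show(item):
    print(item)

EachList([1, 2, 3]).each(show)      # the caller supplies only the per-item step
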

Interesting. I know what internal iteration is, and I suspected it was along
these lines when I saw the syntax and that .each was a function and not a
keyword. But if it is internal iteration and the .each method is receiving an
anonymous function, I wonder what the scope of the variables in that
function is. In pseudo-Python terms (using your example),

x = 5
some_list.each((def (item):
    do_something(item, x)
    x = do_something_else(item)
))

(or something like that). In Python, the inner function would be invalid
(the 'x' is local). For that construct to be equivalent to a for loop, the
anonymous function shouldn't create a new scope. Is that what is happening?
(If it is, this would be a big difference between anonymous and
non-anonymous functions.)
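
For the record, a sketch of what Python actually does with that pseudo-code: the assignment makes x local to the inner function, and you need Python 3's nonlocal (or a mutable cell) to get the block-like rebinding the question assumes:

def demo():
    x = 5
    def body(item):
        nonlocal x      # without this, "x += item" raises UnboundLocalError
        x += item
    for item in [1, 2, 3]:
        body(item)
    return x

print(demo())   # -> 11
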

Anyway, this is OT. Thank you for your reply.
(ah, sorry for taking so long... I was out of town)
 
