Attack a sacred Python Cow

Discussion in 'Python' started by Jordan, Jul 24, 2008.

  1. Jordan

    Jordan Guest

    Hi everyone,

    I'm a big Python fan who used to be involved semi-regularly in
    comp.lang.python (lots of lurking, occasional posting) but kind of
    trailed off a bit. I just wrote a frustration-inspired rant on my
    blog, and I thought it was relevant enough as a wider issue to the
    Python community to post here for your discussion and consideration.

    This is not flamebait. I love Python, and I'm not out to antagonise
    the community. I also realise that one of the issues I raise is way
    too ingrained to be changed now. I'd just like to share my thinking on
    a misstep in Python's guiding principles that has done more harm than
    good IMO. So anyway, here's the post.

    I've become utterly convinced that at least one criticism leveled at
    my favourite overall programming language, Python, is utterly true and
    fair. After quite a while away from writing Python code, I started
    last night on a whim to knock up some code for a prototype of an idea
    I once had. It's going swimmingly; the Python Imaging Library, which
    I'd never used before, seems quick, intuitive, and has all the
    features I need for this project. As for Python itself, well, my heart
    still belongs to whitespace delimitation. All the basics of Python
    coding are there in my mind like I never stopped using them, or like
    I've been programming in this language for 10 years.

    Except when it comes to classes. I added some classes to code that had
    previously just been functions, and you know what I did - or rather,
    forgot to do? Put in the 'self'. In front of some of the variable
    accesses, but more noticeably, at the start of *every single method
    argument list.* This can no longer be blamed on a hangover from
    Java - I've written far more code, far more recently, in Python than
    in Java or any other OO language. What's more, every time I go back to
    Python after a break of more than about a week or so, I start making
    this 'mistake' again. The perennial justification for this 'feature'
    of the language? That old Python favourite, "Explicit is better than
    implicit."

    I'm sorry, but EXPLICIT IS NOT NECESSARILY BETTER THAN IMPLICIT.
    Assembler is explicit FFS. Intuitive, clever, dependable, expected,
    well-designed *implicit* behaviour is one of the chief reasons why I
    use a high level language. Implicitly garbage collect old objects for
    me? Yes, please!

    I was once bitten by a Python wart I felt was bad enough to raise on
    comp.lang.python and spend some effort advocating change for (never
    got around to doing a PEP; partly laziness, partly young and
    inexperienced enough to be intimidated at the thought. Still am,
    perhaps.)

    The following doesn't work as any sane, reasonable person would
    expect:

    # Blog code, not tested
    class A():
        def __eq__(self, obj):
            return True

    a = A()
    b = []
    assert a == b
    assert not (a != b)

    The second assertion fails. Why? Because coding __eq__, the most
    obvious way to make a class have equality based comparisons, buys you
    nothing from the != operator. != isn't (by default) a synonym for the
    negation of == (unlike in, say, every other language ever); not only
    will Python let you make them mean different things, without
    documenting this fact - it actively encourages you to do so.
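
    For the record, the usual workaround (a minimal, untested sketch, not
    from the original discussion) is to spell __ne__ out yourself in terms
    of __eq__:

    # Workaround sketch: define __ne__ explicitly as the negation of __eq__.
    class A(object):
        def __eq__(self, obj):
            return True
        def __ne__(self, obj):
            # Delegate to __eq__ so the two operators can never disagree.
            return not self.__eq__(obj)

    a = A()
    b = []
    assert a == b
    assert not (a != b)   # now passes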

    There was a disturbingly high number of people defending this
    (including one quite renowned Pythonista; I think it might have been
    Effbot). Some had the temerity to fall back on "Explicit is better
    than implicit: if you want != to work, you should damn well code
    __ne__!"

    Why, for heaven's sake, should I have to, when in 99.99% of use cases
    (and of those 0.01% instances quoted in the argument at the time, only
    one struck me as remotely compelling) every programmer is going to
    want __ne__ to be the logical negation of __eq__? Why, dear Python,
    are you making me write evil Java-style, language-power-reducing
    boilerplate to do the thing you should be doing yourself anyway?
    What's more, every programmer is going to unconsciously expect it to
    work this way, and be as utterly mystified as me when it fails to
    do so. Don't tell me to RTFM and don't tell me to be explicit. I'll
    repeat myself - if I wanted to be explicit, I'd be using C and
    managing my own memory, thank you very much. Better yet, I'd
    explicitly and graphically swear - swear in frustration at this
    entrenched design philosophy madness that afflicts my favourite
    language.

    I think the real problem with "explicit is better than implicit",
    though, is that while you can see the underlying truth it's trying to
    get at (which is perhaps better expressed by Ruby's more equivocal,
    less dependable, but more useful Principle of Least Surprise), in its
    stated form it's actually kind of meaningless and is used mainly in
    defence of warts - no, let's call them what they are, language
    design *bugs*.

    You see, the problem is, there's no such thing as explicit in
    programming. It's not a question of not doing things implicitly; it's
    a question of doing the most sensible thing implicitly. For example,
    this Python code:

    some_obj.some_meth(some_arg1, some_arg2)

    is implicitly equivalent to

    SomeClass.some_meth(some_obj, some_arg1, some_arg2)

    which in turn gives us self as a reference to some_obj, and Python's
    OO model merrily pretends it's the same as Java's when in fact it's a
    smarter version that just superficially looks the same.
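
    A quick way to see that equivalence (a small sketch with made-up
    names):

    # Sketch: the bound call and the explicit 'unbound' call do the same thing.
    class Greeter(object):
        def greet(self, name):
            return "Hello, %s" % name

    g = Greeter()
    assert g.greet("world") == Greeter.greet(g, "world")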

    The problem is that the explicit requirement to have self at the start
    of every method is something that should be shipped off to the
    implicit category. You should have to be explicit, yes - explicit when
    you want the *other* behaviour, of self *not* being an argument,
    because that's the more unusual, less likely case.

    Likewise,

    a != b

    is implicitly equivalent to something like calling this function (it
    may not be correct; it's a while since I was heavily involved in this
    issue):

    def not_equal(a, b):
        if hasattr(a, "__ne__"): return a.__ne__(b)
        if hasattr(b, "__ne__"): return b.__ne__(a)
        if hasattr(a, "__cmp__"): return not (a.__cmp__(b) == 0)
        if hasattr(b, "__cmp__"): return not (b.__cmp__(a) == 0)
        return not (a is b)

    There's absolutely nothing explicit about this. I wasn't arguing for
    making behaviour implicit; I was arguing for changing the stupid
    implicit behaviour to something more sensible and less surprising.

    The sad thing is there are plenty of smart Python programmers who will
    justify all kinds of idiocy in the name of their holy crusade against
    the implicit.

    If there was one change I could make to Python, it would be to get
    that damn line out of the Zen.
    Jordan, Jul 24, 2008
    #1

  2. Jordan

    Jordan Guest

    On Jul 24, 3:41 pm, Jordan <> wrote:
    > Hi everyone,
    >
    > [...snip...]


    P.S. Forgive the typos; it was blogged in extreme haste and then only
    quickly proofread and edited before posting here. Hopefully the point
    I'm making is not diminished by your reduced respect for me as a
    result of my carelessness :)
    Jordan, Jul 24, 2008
    #2

  3. pluskid

    pluskid Guest

    On Jul 24, 1:41 pm, Jordan <> wrote:
    > Hi everyone,
    >
    > I'm a big Python fan who used to be involved semi-regularly in
    > comp.lang.python (lots of lurking, occasional posting) but kind of
    > trailed off a bit. I just wrote a frustration-inspired rant on my
    > blog, and I thought it was relevant enough as a wider issue to the
    > Python community to post here for your discussion and consideration.
    >

    [...snip...]

    +1 for most of your opinion. I was also bitten by the __eq__/__ne__
    problem this morning. :)
    pluskid, Jul 24, 2008
    #3
  4. Jordan wrote:

    > Except when it comes to classes. I added some classes to code that had
    > previously just been functions, and you know what I did - or rather,
    > forgot to do? Put in the 'self'. In front of some of the variable
    > accesses, but more noticeably, at the start of *every single method
    > argument list.* This can no longer be blamed on a hangover from
    > Java - I've written far more code, far more recently, in Python than
    > in Java or any other OO language. What's more, every time I go back to
    > Python after a break of more than about a week or so, I start making
    > this 'mistake' again. The perennial justification for this 'feature'
    > of the language? That old Python favourite, "Explicit is better than
    > implicit."


    Do you seriously think that Python is designed by mindless application
    of a set of humorous and somewhat self-deprecating observations posted
    to a newsgroup a number of years ago?

    </F>
    Fredrik Lundh, Jul 24, 2008
    #4
  5. Jordan

    Jordan Guest

    Of course not.

    I just think "Explicit is better than implicit" is taken seriously by
    a large segment of the Python community as a guiding principle, and
    overall its influence does more harm than good.

    Clearly, self being in every argument list was a decision arrived at
    long before the Zen was ever coined. It's merely an example of what I
    feel is a shortcoming in the conventional 'pythonic' approach to
    thinking about problems.

    The reluctance to admit that the __eq__ behaviour is a poor design
    choice is further evidence; it's something (unlike self) that quite
    conceivably could be changed, and should be changed, but it's somehow
    seen (by certain people) as the way that Python should do things.
    Jordan, Jul 24, 2008
    #5
  6. Jordan wrote:

    (snip rant about self and __eq__ / __ne__)

    1/ about the __eq__ / __ne__ stuff:

    Please get your facts straight: the behaviour *is* actually fully
    documented:

    """
    There are no implied relationships among the comparison operators. The
    truth of x==y does not imply that x!=y is false. Accordingly, when
    defining __eq__(), one should also define __ne__() so that the operators
    will behave as expected.
    """
    http://docs.python.org/ref/customization.html

    FWIW, the __lt__ / __le__ / __eq__ / __ne__ / __gt__ / __ge__ method
    set, known as "rich comparisons", was added in Python 2.1 to give more
    fine-grained control over comparisons. If you don't need such
    granularity, just implement the __cmp__ method and you'll have all
    comparison operators working as expected.
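
    For instance (a rough Python 2 sketch, not from the original post), a
    single __cmp__ is enough to drive all six operators:

    # Python 2 sketch: one __cmp__ method, every comparison operator derived from it.
    class Version(object):
        def __init__(self, number):
            self.number = number
        def __cmp__(self, other):
            # cmp() returns -1, 0 or 1; ==, !=, <, <=, >, >= all fall back to this.
            return cmp(self.number, other.number)

    assert Version(1) == Version(1)
    assert Version(1) != Version(2)
    assert Version(1) < Version(2)
    assert Version(2) >= Version(1)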

    2/ wrt self in function signatures:

    How would you handle this case with an implicit 'self':

    class Foo(object):
        pass

    def bar(self):
        print self

    Foo.bar = bar
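
    (A quick sketch of how that plays out, assuming the snippet above:)

    f = Foo()
    f.bar()       # prints the Foo instance; the explicit 'self' receives f
    Foo.bar(f)    # the equivalent 'unbound' call works just as well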
    Bruno Desthuilliers, Jul 24, 2008
    #6
  7. Jordan

    Jordan Guest

    On Jul 24, 7:40 pm, Torsten Bronger <-aachen.de>
    wrote:
    > Hi there!
    >
    > Bruno Desthuilliers writes:
    > > [...]
    > >
    > > How would you handle this case with an implicit 'self':
    > >
    > > class Foo(object):
    > >     pass
    > >
    > > def bar(self):
    > >     print self
    > >
    > > Foo.bar = bar
    >
    > Just like this.  However, the compiler could add "self" to
    > non-decorated methods which are defined within "class".
    >
    > Bye,
    > Torsten.
    >
    > --
    > Torsten Bronger, aquisgrana, europa vetus
    >                    Jabber ID: -aachen.de


    Yeah, forget what I said - Torsten's reply is much better :)
    Jordan, Jul 24, 2008
    #7
  8. In message
    <>, Jordan
    wrote:

    > Except when it comes to classes. I added some classes to code that had
    > previously just been functions, and you know what I did - or rather,
    > forgot to do? Put in the 'self'. In front of some of the variable
    > accesses, but more noticeably, at the start of *every single method
    > argument list.*


    The reason is quite simple. Python is not truly an "object-oriented"
    language. It's sufficiently close to fool those accustomed to OO ways of
    doing things, but it doesn't force you to do things that way. You still
    have the choice. An implicit "self" would take away that choice.
    Lawrence D'Oliveiro, Jul 24, 2008
    #8
  9. Jordan <>:

    > # Blog code, not tested
    > class A():
    >     def __eq__(self, obj):
    >         return True
    >
    > a = A()
    > b = []
    > assert a == b
    > assert not (a != b)
    >
    > The second assertion fails. Why? Because coding __eq__, the most
    > obvious way to make a class have equality based comparisons, buys you
    > nothing from the != operator. != isn't (by default) a synonym for the
    > negation of == (unlike in, say, every other language ever);


    This is just plain wrong for at least C# and C++. C# wants you to
    explicitly overload "!=" if you have overloaded "=="; C++ complains
    about "!=" not being defined for class A. If you had derived A from
    another class in C++, the compiler would happily use the operator from
    the base class instead of doing silly aliasing of "!=" to "! ==" ...

    > The sad thing is there are plenty of smart Python programmers who will
    > justify all kinds of idiocy in the name of their holy crusade against
    > the implicit.
    >
    > If there was one change I could make to Python, it would be to get
    > that damn line out of the Zen.


    Fortunately, Python isn't designed according to your ideas, and won't
    change, so consider your posting a waste of time. If you feel like
    bringing up such old "issues" again next time, spend your time
    learning another programming language instead, as you would obviously
    not be happy with Python anyway ...

    --
    Freedom is always the freedom of dissenters.
    (Rosa Luxemburg)
    Sebastian \lunar\ Wiesner, Jul 24, 2008
    #9
  10. Jordan

    Jordan Guest

    OK, it seems my original reply to Bruno got lost in the Aether
    (apologies therefore if a paraphrased "quantum duplicate" of this
    message is eventually forthcoming.)

    Torsten has adequately responded to his second point, so I need only
    replicate what I said for the first.

    > Please get your facts straight: the behaviour *is* actually fully
    > documented:


    I have the facts. I know full well the behaviour is documented - it
    was pointed out at the time of the original discussion. Documenting a
    confusing, unintuitive design decision (whether it's in a programming
    language, an end-user GUI app or anything in between) doesn't justify
    it.

    To attack a strawman: "foolanguage uses the bar IO library; printing
    to stdout takes about 10 mins on the average machine. But that's ok,
    because look, it's documented right here."

    > FWIW, the __lt__ / __le__ / __eq__ / __ne__ / __gt__ / __ge__ method
    > set, known as "rich comparisons", was added in Python 2.1 to give more
    > fine-grained control over comparisons. If you don't need such
    > granularity, just implement the __cmp__ method and you'll have all
    > comparison operators working as expected.


    First, the most serious justification for rich comparisons I remember
    seeing was that scipy "needed" them. I never saw a good reason scipy
    couldn't use methods like the rest of us mortals, nor why it was
    justifiable to introduce a wart into the entire language for the sake
    of mildly conveniencing an (admittedly important and widely used)
    library.

    Second, fine, have silly C++-operator-overloading-style rich
    comparisons that confuse people reading your code if you must. Why
    does it have to be the default behaviour? It's people wanting __ne__
    to do something other than not __eq__ who should have to be explicit
    about it.

    Third, __cmp__ is no good as a fix. Most classes that want equality
    comparison (== and !=) don't want order-based comparison (>= etc.)
    thrown in as well. I shouldn't have to implement __cmp__ unless I want
    my class to implement every order comparison operator.
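
    To illustrate that with a rough sketch (a hypothetical class, not from
    the thread): once __cmp__ is there, ordering comes along whether you
    wanted it or not.

    # Python 2 sketch: __cmp__ gives you the equality you wanted...
    class Token(object):
        def __init__(self, value):
            self.value = value
        def __cmp__(self, other):
            return cmp(self.value, other.value)

    assert Token(1) == Token(1)
    # ...but also an ordering you may never have asked for:
    assert Token(1) < Token(2)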

    Fourth, I'm trying to examine the wider implications of the Explicit >
    Implicit mantra here, not resurrect an old campaign to change !=
    behaviour, which I think is probably a lost cause (if it happens as a
    side effect, though, that'd be kinda cool.)
    Jordan, Jul 24, 2008
    #10
  11. Jordan

    Jordan Guest


    > This is just plain wrong for at least C# and C++.  C# wants you to
    > explicitly overload "!=" if you have overloaded "==";


    While this is as inconvenient as Python, at least it doesn't catch you
    unawares. C# 1 (or maybe 0.5), Python 0.

    > C++ complains
    > about "!=" not being defined for class A.  


    See above. C++ 1, Python 0.

    So in showing my clearly hyperbolic comment was technically incorrect
    (something I could have told you myself), you have merely shown that
    two languages I find vastly inferior to Python overall are actually
    better than it in this case.

    > Fortunately, Python isn't designed according to your ideas, and won't
    > change, so consider your posting a waste of time. If you feel like
    > bringing up such old "issues" again next time, spend your time
    > learning another programming language instead, as you would obviously
    > not be happy with Python anyway ...


    OK, if that's your response, that's sad. Of course, I try to learn new
    languages all the time. Python is still IMO the best. If the attitude
    in the community in response to feedback/criticism has gone from
    "maybe you've got a point" to "you're a lunatic, we'll never change",
    well, only Python will suffer in the long term.
    Jordan, Jul 24, 2008
    #11
  12. Jordan

    Jordan Guest

    On Jul 24, 8:01 pm, Lawrence D'Oliveiro <l...@geek-
    central.gen.new_zealand> wrote:
    > In message
    > <>, Jordan
    > wrote:
    >
    > > Except when it comes to classes. I added some classes to code that had
    > > previously just been functions, and you know what I did - or rather,
    > > forgot to do? Put in the 'self'. In front of some of the variable
    > > accesses, but more noticeably, at the start of *every single method
    > > argument list.*

    >
    > The reason is quite simple. Python is not truly an "object-oriented"
    > language. It's sufficiently close to fool those accustomed to OO ways of
    > doing things, but it doesn't force you to do things that way. You still
    > have the choice. An implicit "self" would take away that choice.


    You could still explicitly request a non-implicit self on a
    method-by-method basis.
    Jordan, Jul 24, 2008
    #12
  13. Kay Schluehr

    Kay Schluehr Guest

    On 24 Jul., 11:40, Torsten Bronger <-aachen.de>
    wrote:
    > Hi there!
    >
    > Bruno Desthuilliers writes:
    > > [...]
    > >
    > > How would you handle this case with an implicit 'self':
    > >
    > > class Foo(object):
    > >     pass
    > >
    > > def bar(self):
    > >     print self
    > >
    > > Foo.bar = bar
    >
    > Just like this. However, the compiler could add "self" to
    > non-decorated methods which are defined within "class".


    And $self2, $self3, ... to the object methods of nested classes and
    $cls2, $cls3, ... to the classmethods of those classes...?

    And when we are at it, here is a nice little exercise for the
    proponents of compiler magic.

    Write a decorator that takes and returns a method and prints the
    object the method is bound to. It's very easy to do when the object is
    passed explicitly:

    def print_self(func):
        def call(self, *args, **kwd):
            print self
            return func(self, *args, **kwd)
        return call
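
    (A short usage sketch, assuming the decorator above:)

    class Widget(object):
        @print_self
        def resize(self, w, h):
            return (w, h)

    Widget().resize(3, 4)   # prints the Widget instance, then returns (3, 4)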

    Conceptual clarity isn't always an entirely bad thing to have.
    Kay Schluehr, Jul 24, 2008
    #13
  14. alex23

    alex23 Guest

    On Jul 24, 8:21 pm, Jordan <> wrote:
    > If the attitude
    > in the community in response to feedback/criticism has gone from
    > "maybe you've got a point" to "your a lunatic, we'll never change",
    > well, only Python will suffer in the long term.


    Personally, I think it has more to do with statements like "there are
    plenty of smart Python programmers who will
    justify all kinds of idiocy in the name of their holy crusade" than
    with your position. You don't begin a discussion by discrediting
    anyone who might disagree with you as some kind of religious bigot
    while simultaneously holding that you are the only sane voice
    speaking.
    alex23, Jul 24, 2008
    #14
  15. Jordan wrote:
    > OK, it seems my original reply to Bruno got lost in the Aether
    > (apologies therefore if a paraphrased "quantum duplicate" of this
    > message is eventually forthcoming.)
    >
    > Torsten has adequately responded to his second point,


    Not MHO, by far.

    > so I need only
    > replicate what I said for the first.
    >
    >> Please get your facts straight: the behaviour *is* actually fully
    >> documented:

    >
    > I have the facts. I know full well the behaviour is documented


    Then why do you write, let me quote:

    """
    (snip) coding __eq__ (snip) buys you
    nothing from the != operator. != isn't (by default) a synonym for the
    negation of == (unlike in, say, every other language ever); not only
    will Python let you make them mean different things, without
    documenting this fact - it actively encourages you to do so.
    """


    > - it
    > was pointed out at the time of the original discussion. Documenting a
    > confusing, unintuitive design decision (whether it's in a programming
    > language, an end-user GUI app or anything in between) doesn't justify
    > it.


    I was not commenting on the actual design choice, just stating that it
    is actually documented.

    > To attack a strawman: "foolanguage uses the bar IO library; printing
    > to stdout takes about 10 mins on the average machine. But that's ok,
    > because look, it's documented right here."


    And you're talking about strawmen??? Come on, you can obviously tell
    the difference between a one-line statement and your strawman argument
    above, can't you?

    Please understand that I'm not arguing about this particular design
    choice (and FWIW, I'd mostly agree on the point that having a != b
    different from not (a == b) is actually a wart). I'm just correcting
    your statement about the behaviour of __eq__ / __ne__ not being
    documented, which is obviously false.

    (snip)
    Bruno Desthuilliers, Jul 24, 2008
    #15
  16. Torsten Bronger wrote:
    > Hi there!
    >
    > Bruno Desthuilliers writes:
    >
    >> [...]
    >>
    >> How would you handle this case with an implicit 'self':
    >>
    >> class Foo(object):
    >>     pass
    >>
    >> def bar(self):
    >>     print self
    >>
    >> Foo.bar = bar

    >
    > Just like this. However, the compiler could add "self" to
    > non-decorated methods which are defined within "class".


    What's defined within classes are plain functions. It's actually the
    lookup mechanism that wraps them into methods (and manages to insert
    the current instance as first argument).
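
    A small Python 2 sketch of what that lookup does, to make the point
    concrete:

    class Foo(object):
        def bar(self):
            print self

    print type(Foo.__dict__['bar'])  # <type 'function'> - what the class stores
    print type(Foo.bar)              # <type 'instancemethod'> - wrapped at lookup
    print type(Foo().bar)            # <type 'instancemethod'> - bound, self filled in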
    Bruno Desthuilliers, Jul 24, 2008
    #16
  17. Torsten Bronger wrote:
    > Hi there!
    >
    > Kay Schluehr writes:
    >
    >> On 24 Jul., 11:40, Torsten Bronger <-aachen.de>
    >> wrote:
    >>> Bruno Desthuilliers writes:
    >>>
    >>>> [...]
    >>>>
    >>>> How would you handle this case with an implicit 'self':
    >>>>
    >>>> class Foo(object):
    >>>>     pass
    >>>>
    >>>> def bar(self):
    >>>>     print self
    >>>>
    >>>> Foo.bar = bar
    >>> Just like this. However, the compiler could add "self" to
    >>> non-decorated methods which are defined within "class".

    >> And $self2, $self3, ... to the object methods of nested classes
    >> and $cls2, $cls3, ... to the classmethods of those classes...?

    >
    > One could surely find ways to realise this. However, the design
    > goal should be: Make the frequent case simple, and the rare case
    > possible.


    Given the (more and more prominent) use of decorators, metaclasses and
    other meta-programming techniques in Python, I'm not sure the cases
    where you really need access to the internals of Python's object model
    are that "rare". Not in my code, at least.
    Bruno Desthuilliers, Jul 24, 2008
    #17
  18. Jordan <>:

    >> Fortunately, Python isn't designed according to your ideas, and won't
    >> change, so consider your posting a waste of time. If you feel like
    >> bringing up such old "issues" again next time, spend your time
    >> learning another programming language instead, as you would obviously
    >> not be happy with Python anyway ...

    >
    > OK, if that's your response, that's sad. Of course, I try to learn new
    > languages all the time. Python is still IMO the best. If the attitude
    > in the community in response to feedback/criticism has gone from
    > "maybe you've got a point" to "your a lunatic, we'll never change",
    > well, only Python will suffer in the long term.


    I don't really mind what you think about my response. Python will
    suffer from it as little as it will suffer from your complaints: these
    things will not change, whatever any of us says about them. So this
    discussion is unlikely to produce any new insight, especially because
    this has been discussed over and over again in the past, without any
    effect on Python.

    Let's just drop this, and if you want to complain next time, complain
    about something that is really worth complaining about, like, for
    instance, old and outdated modules in the standard library, or real
    showstoppers in Python (e.g. the GIL).

    --
    Freedom is always the freedom of dissenters.
    (Rosa Luxemburg)
    Sebastian \lunar\ Wiesner, Jul 24, 2008
    #18
  19. Lawrence D'Oliveiro wrote:
    > In message
    > <>, Jordan
    > wrote:
    >
    >> Except when it comes to classes. I added some classes to code that had
    >> previously just been functions, and you know what I did - or rather,
    >> forgot to do? Put in the 'self'. In front of some of the variable
    >> accesses, but more noticeably, at the start of *every single method
    >> argument list.*

    >
    > The reason is quite simple. Python is not truly an "object-oriented"
    > language.


    Oh yes? What's missing, exactly? You have objects that have an id,
    state and behaviour, and you have a message-passing mechanism.

    You meant "Python is not truly a mainstream class-based language", I think.

    > It's sufficiently close to fool those accustomed to OO ways of
    > doing things,


    s/OO/class-based/

    > but it doesn't force you to do things that way. You still
    > have the choice. An implicit "self" would take away that choice.


    It's not even a question of OO/non-OO. An implicit "self" would take
    away some of the things that make Python's *object* model so powerful.
    Bruno Desthuilliers, Jul 24, 2008
    #19
  20. Jordan

    Jordan Guest

    > Personally, I think it has more to do with statements like "there are
    > plenty of smart Python programmers who will
    > justify all kinds of idiocy in the name of their holy crusade" than
    > with your position. You don't begin a discussion by discrediting
    > anyone who might disagree with you as some kind of religious bigot
    > while simultaneously holding that you are the only sane voice
    > speaking.


    I didn't set out to discredit anyone who might disagree with me; in
    fact, I didn't in any way try to pre-empt anyone who might disagree
    with my thesis. I merely stated an observation - I have in the past
    seen seemingly intelligent people take silly stands in the name of
    Explicit is greater than Implicit (not just on comp.lang.python, and
    not just concerning != or self).

    I wish in retrospect I'd had the time, patience and focus to edit the
    initial post to make it more measured and less inflammatory, because
    it's clear the tone detracts from the argument I'm making, which I
    feel still stands.

    So if you wish, ignore the specifics of the frustration that inspired
    me and consider only the thrust of what I'm saying:

    "Explicit is better than Implict" considered harmful. Discuss.
    Jordan, Jul 24, 2008
    #20
