Confused with methods


Dan Perl

jfj said:
I don't understand.
We can take a function and attach it to an object, and then call it
as an instance method as long as it has at least one argument:

#############
class A:
    pass

def foo(x):
    print x

A.foo = foo
a = A()
a.foo()
#############

However this is not possible for another instance method:

############
class A:
    pass

class B:
    def foo(x, y)
        print x, y

b = B()
A.foo = b.foo
a = A()

# error!!!
a.foo()
##############

Python complains that 'foo() takes exactly 2 arguments (1 given)'.
But by calling "b.foo(1)" we prove that it is indeed a function which
takes
exactly one argument.

Isn't that inconsistent?

You called b.foo(1) but a.foo(). Note one argument in the first call and no
arguments in the second. Had you called a.foo(1), you would have gotten the
same result as with b.foo(1). I suppose that was just a small omission on
your part, but what are you trying to do anyway? It's a very strange use of
instance methods.
 

Alex Martelli

jfj said:
I don't understand.
We can take a function and attach it to an object, and then call it
as an instance method as long as it has at least one argument:

#############
class A:
    pass

def foo(x):
    print x

A.foo = foo
a = A()
a.foo()
#############

Right. If you want to understand how this happens, look at the
foo.__get__ special method, which makes foo (like every other function)
a *descriptor*. Once you've set foo as an attribute of class A, the
access a.foo calls foo.__get__(a, A), which returns an object of class
"instance method".

However this is not possible for another instance method:

############
class A:
    pass

class B:
    def foo(x, y)
        print x, y

b = B()
A.foo = b.foo

What you're assigning here as an attribute of class A is not a
descriptor: it does not have a special method __get__.
a = A()

# error!!!
a.foo()

Since A.foo does not have a method __get__, it's just returned when you
access it (on instance a). It's already an instance method -- but it's
bound to instance b of class B.
##############
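Alex's diagnosis is easy to confirm in Python 3 as well; a sketch (with return instead of print, so the binding is visible):

```python
class B:
    def foo(x, y):
        return (x, y)

class A:
    pass

b = B()
A.foo = b.foo    # a bound method: storing it elsewhere changes nothing
a = A()

# No rebinding happens on access: a.foo is still bound to b, not to a.
assert a.foo is A.foo
assert a.foo(1) == (b, 1)    # x is b, y is 1 -- a is never passed
```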

Python complains that 'foo() takes exactly 2 arguments (1 given)'.
But by calling "b.foo(1)" we prove that it is indeed a function which takes
exactly one argument.

Calling something does not prove it's a function. Python has a lot of
callable objects, and functions are just one callable type among many.
So you're mistaken if you think that by calling (whatever) you prove
said whatever is a function (of any kind).

The error message you're seeing does come from a function -- the im_func
attribute of b.foo, which is a function foo taking two arguments
(created and set into class B by the "def foo(x, y):" above -- even
though you forgot the colon, I'm sure you intended to type it).

Isn't that inconsistent?

That Python has many callable types, not all of which are descriptors?
I don't see any inconsistency there. Sure, a more generalized currying
(argument-prebinding) capability would be more powerful, but not more
consistent (there's a PEP about that, I believe).

If what you want is to focus on the error message, you can completely
separate it from the idea of having set b.foo as an attribute of class
A. Just try calling b.foo() without arguments and you'll get exactly
the same error message -- from the underlying function foo, which is
also b.foo.im_func (the function's FIRST argument is bound to
b.foo.im_self, which is b; whence the error being about function foo
taking exactly two arguments and only one having been given).
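In Python 3 the im_func and im_self attributes Alex names are spelled __func__ and __self__; a sketch of the same point:

```python
class B:
    def foo(x, y):
        return (x, y)

b = B()
m = b.foo

# A bound method packages a plain two-argument function together with
# the instance pre-bound as that function's first argument.
assert m.__func__ is B.foo          # the underlying function (im_func)
assert m.__self__ is b              # the pre-bound first argument (im_self)
assert m.__func__(1, 2) == (1, 2)   # the raw function wants both arguments
assert m(2) == (b, 2)               # the bound method supplies b itself

try:
    m()   # like b.foo() with no arguments: only __self__ is supplied
except TypeError as e:
    assert 'y' in str(e)   # Python 3 words the complaint differently
```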


Alex
 

Diez B. Roggisch

I expected that when we add this "x" to a class's dictionary and
then we request it from an instance of that class, it will be
converted to a bound method and receive its --one-- argument
from the referring instance.

Here you are wrong: your b.foo is a bound method - it already _has_ its
first argument (an instance of B) bound to it. And just passing it around
doesn't change that. You can assign it to a name in whatever scope you like
- that won't change its nature.

Now if you want to use foo in A as instancemethod, you could do this:

A.foo = b.foo.im_func
a = A()
a.foo(200)


That works because im_func is a "pure" function:
<function foo at 0xb7ac47d4>


If things worked as you wanted them to, that would mean that passing a
bound method as argument to a class and storing it there in an instance
variable would "eat up" the arguments - surely not the desired behaviour.
 

Alex Martelli

jfj said:
If I say:

x=b.foo
x(1)

Then, without looking at the previous code, one can say that "x" is a
function which takes one argument.

One can say whatever one wishes, but saying it does not make it true.

One can say that x is a green frog, but that's false: x is a
boundmethod.

One can say that x is a function, but that's false: x is a boundmethod.

One can say that x is a spade, but that's false: x is a boundmethod.


You can call green frogs (with a suitable whistling), you can call
functions, and you can call a spade (a spade). The fact that they're
all callable (since you can call them), and a boundmethod is also
callable, does not necessarily make them equivalent to each other, nor
any of them equivalent to a boundmethod, in any other way whatsoever.


Alex
 

Alex Martelli

jfj said:
Thanks for the explanation.

The inconsistency I see is that if I wanted this kind of behavior
I would've used the staticmethod() builtin (which in a few words
alters __get__ to return the function unmodified).

The staticmethod builtin would NOT have given you the behavior: "bind
the first argument". I do not know what you mean by "this kind of
behavior", but making a boundmethod most assuredly DOES give you
extremely different behavior from wrapping a function in staticmethod.

I still wouldn't see any "inconsistency" even if two different ways of
proceeding gave the same result in a certain case. That would be like
saying that having x-(-y) give the same result as x+y (when x and y are
numbers) is ``inconsistent''... the word ``inconsistent'' just doesn't
MEAN that!

"Inconsistent" means sort of the reverse: one way of proceeding giving
different results. But the fact that the same operation on objects of
different types may well give different results isn't _inconsistent_ --
it's the sole purpose of HAVING different types in the first place...!


Alex
 

jfj

I don't understand.
We can take a function and attach it to an object, and then call it
as an instance method as long as it has at least one argument:

#############
class A:
    pass

def foo(x):
    print x

A.foo = foo
a = A()
a.foo()
#############

However this is not possible for another instance method:

############
class A:
    pass

class B:
    def foo(x, y)
        print x, y

b = B()
A.foo = b.foo
a = A()

# error!!!
a.foo()
##############

Python complains that 'foo() takes exactly 2 arguments (1 given)'.
But by calling "b.foo(1)" we prove that it is indeed a function which takes
exactly one argument.

Isn't that inconsistent?

Thanks,

Gerald.
 

jfj

Dan said:
You called b.foo(1) but a.foo(). Note one argument in the first call and no
arguments in the second. Had you called a.foo(1), you would have gotten the
same result as with b.foo(1). I suppose that was just a small omission on
your part, but what are you trying to do anyway? It's a very strange use of
instance methods.

No omission.
If I say:

x=b.foo
x(1)

Then, without looking at the previous code, one can say that "x" is a
function which takes one argument. Continuing with "x":

A.foo = x
# this is ok
A.foo(1)
a=A()
# this is not ok
a.foo()

I expected that when we add this "x" to a class's dictionary and
then we request it from an instance of that class, it will be
converted to a bound method and receive its --one-- argument
from the referring instance.

So "a.foo()" == "A.foo(a)" == "x(a)" == "b.foo(a)" == "B.foo(b,a)",
or at least "why not?" (head exploded?:)

I'm not trying to do something specific with this though.


G.
 

jfj

Alex said:
That Python has many callable types, not all of which are descriptors?
I don't see any inconsistency there. Sure, a more generalized currying
(argument-prebinding) capability would be more powerful, but not more
consistent (there's a PEP about that, I believe).

Thanks for the explanation.

The inconsistency I see is that if I wanted this kind of behavior
I would've used the staticmethod() builtin (which in a few words
alters __get__ to return the function unmodified).

So I would write

A.foo = staticmethod(b.foo)

But now it always acts as a staticmethod :(

Anyway, if there's a PEP about it, I'm +1 because it's "pythonic".


G.
 

jfj

Alex said:
I still wouldn't see any "inconsistency" even if two different ways of
proceeding gave the same result in a certain case. That would be like
saying that having x-(-y) give the same result as x+y (when x and y are
numbers) is ``inconsistent''... the word ``inconsistent'' just doesn't
MEAN that!

"Inconsistent" means sort of the reverse: one way of proceeding giving
different results. But the fact that the same operation on objects of
different types may well give different results isn't _inconsistent_ --
it's the sole purpose of HAVING different types in the first place...!

Ok! I said I was confused in the first place!


G.
 

Alex Martelli

I understand that a function and a boundmethod are *different* things.
For one, a *boundmethod* has the attributes im_self and im_class, which
a function does not have (nor does a green frog). Thus they are not
the same thing.

Great! So, do *NOT* ``say that "x" is a function'', when you know that
is false.
HOWEVER, what I ask is WHY don't we set the tp_descr_get of
the boundmethod object to be the same as the func_descr_get???
Or WHY do they *have* to differ in this specific part?

Why we don't change boundmethods' binding behavior NOW is obvious --
can't break backwards compatibility. If you're talking about Python
3.0, which _will_ be allowed to break backwards compatibility when it
comes, then I, personally, have no special objection to changing that
binding behavior; a PEP may be necessary, of course, but it may also be
possible to get the idea into the existing PEP 3000 by presenting a
good case on python-dev (not here -- no decisions on Python's future
are made here).
I quickly looked at the source of python and it seems that a
one-liner would be enough to enable this. So it's not that it

You looked too quickly. Currently, PyMethod objects are used to
implement both bound and unbound methods, and function
instancemethod_descr_get (around line 2430 of classobject.c) has the
logic to implement the differences in __get__ behavior between them.
Recently GvR mused about doing away with unbound-methods per se, but
after some discussion on python-dev he currently seems to have
withdrawn the proposal; you may want to read that discussion in the
python-dev archives since it's relevant to prototyping your idea. In
any case, a one-liner would not be enough, although:
would be hard to implement it, or inefficient.

....that doesn't necessarily make it either hard or inefficient.
Is there a good reason that the __get__ of a boundmethod does not
create a new boundmethod wrapper over the first boundmethod?

There might be some inefficiencies in the multiple indirections, but
that remains to be proven until we have a prototype. Besides, if there
are, the right approach would probably be to have some generalization
of boundmethods able to bind the first N arguments rather than one at a
time.
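The "bind the first N arguments" generalization Alex sketches is essentially what PEP 309 later delivered as functools.partial (new in Python 2.5); a sketch:

```python
from functools import partial

def f(x, y, z):
    return (x, y, z)

g = partial(f, 1)     # prebind the first argument
h = partial(g, 2)     # bindings stack, one argument at a time
assert h(3) == (1, 2, 3)
assert partial(f, 1, 2)(3) == (1, 2, 3)   # or bind the first N at once
```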


Alex
 

Diez B. Roggisch

HOWEVER, what I ask is WHY don't we set the tp_descr_get of
the boundmethod object to be the same as the func_descr_get???
Or WHY do they *have* to differ in this specific part?

I quickly looked at the source of python and it seems that a
one-liner would be enough to enable this. So it's not that it
would be hard to implement it, or inefficient.

A bound method would *still* be a boundmethod.
We could additionally have:

<boundmethod of <boundmethod of <boundmethod of....<__main__.A
instance at 0x>>>

Is there a good reason that the __get__ of a boundmethod does not
create a new boundmethod wrapper over the first boundmethod?

I already gave you the good reason:

class A:
    def callback(self, arg):
        print self, arg

def callback(arg):
    print arg

class DoSomethingWithCallback:
    def __init__(self, cb):
        self.cb = cb

    def run(self):
        for i in xrange(100):
            self.cb(i)

u = DoSomethingWithCallback(A().callback)
v = DoSomethingWithCallback(callback)

# would crash if your suggestion worked
u.run()
v.run()


Bound methods aren't a general-purpose currying scheme - they exist
solely for the OO-style implicit first argument. That they exist at all
is a big difference from e.g. C++, where you can't pass around callbacks
that are instance methods.

If you are after currying - look at the cookbook, there are recipes for
that.
 

Antoon Pardon

On 2005-02-06, Alex Martelli said:
That Python has many callable types, not all of which are descriptors?
I don't see any inconsistency there. Sure, a more generalized currying
(argument-prebinding) capability would be more powerful, but not more
consistent (there's a PEP about that, I believe).

I think python is a bit inconsistent here, by using 'def' for
two different things. I think it would have been more consistent
if def always produced a function to be used as such, and methods
had their own keyword. That would make the magic of bound methods
more explicit.
 

Alex Martelli

Diez B. Roggisch said:
I already gave you the good reason:

Hmmm, not sure the code below is ``a good reason'' to avoid changing the
__get__ behavior of boundmethods come Python 3.0 time. Assume all
classes are newstyle (they'll be, in 3.0), not that it matters here (I
think):
class A:
    def callback(self, arg):
        print self, arg

def callback(arg):
    print arg

class DoSomethingWithCallback:
    def __init__(self, cb):
        self.cb = cb
Here we're setting an INSTANCE attribute,
    def run(self):
        for i in xrange(100):
            self.cb(i)

....and here we're fetching it back. So, it doesn't matter if the cb
argument is a descriptor of whatever variant -- it must be callable, and
that's all we ask of it. IOW, cb.__get__ -- whether it exists at all,
and what it does -- just doesn't *matter*; cb.__call__ is all that does
matter.
u = DoSomethingWithCallback(A().callback)
v = DoSomethingWithCallback(callback)

# would crash if your suggestion worked
u.run()
v.run()

I don't think there would be any crash, because the idea is about
changing some __get__ behavior, NOT any __call__ behavior, and no
__get__ is involved in this example.
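Alex's point -- that only cb.__call__ matters here, never cb.__get__ -- can be checked with a trimmed Python 3 version of the example (return instead of print, one iteration instead of a hundred, and the module-level callback renamed plain_callback for clarity):

```python
class A:
    def callback(self, arg):
        return (self, arg)

def plain_callback(arg):
    return arg

class DoSomethingWithCallback:
    def __init__(self, cb):
        self.cb = cb        # instance attribute: the descriptor protocol
                            # plays no part in setting or fetching it
    def run(self):
        return self.cb(1)   # plain __call__; cb.__get__ never runs

inst = A()
u = DoSomethingWithCallback(inst.callback)
v = DoSomethingWithCallback(plain_callback)
assert u.run() == (inst, 1)   # still bound to inst -- no crash
assert v.run() == 1
```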

If strong use cases exist for a boundmethod's __get__ keeping its
current behavior (apart from the obvious backwards compatibility issues
which mean that any change will have to wait for 3.0), I don't think
this can be one.

Bound methods aren't a general-purpose currying scheme - they exist
solely for the OO-style implicit first argument. That they exist at all
is a big difference from e.g. C++, where you can't pass around callbacks
that are instance methods.

This is historically true. However, this historical fact is not
necessarily a reason for keeping the current lower-power behavior of
boundmethods, when 3.0 allows breaking backwards compatibility.

If you are after currying - look at the cookbook, there are recipes for
that.

Sure, I wrote some of them;-). But since we're talking about future
prospects, I think the PEP on partial application is more relevant than
the current pure-Python workarounds. They _are_ workarounds, btw --
partial application, aka currying, is an important general idea; it can
be implemented in Python, but it's not directly part of it. The PEP's
idea is to have that concept directly implemented.

Actually, I believe that if you call new.instancemethod you CAN already
get that effect -- let's see...:

>>> import new
>>> def f(x, y, z): print x, y, z
...
>>> g = new.instancemethod(f, 23, object)
>>> g(45, 67)
23 45 67

Yep -- just fine (in __call__ terms). Not so in __get__ terms:

>>> class X(object): pass
...
>>> X.f = g
>>> X().f(45)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: f() takes exactly 3 arguments (2 given)


So, for generic currying (in the original sense: prebinding the FIRST
argument of any callable), new.instancemethod is OK (even though you
have to pass a third argument of 'object', no big deal), and you can
call it repeatedly. We're only missing a descriptor with a __get__ that
does this appropriately when you set such a ``curried'' callable as a
class attribute.
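new.instancemethod is gone in Python 3, but types.MethodType performs the same first-argument prebinding (without needing the third 'object' argument); a sketch of both the working __call__ and the failing __get__:

```python
import types

def f(x, y, z):
    return (x, y, z)

g = types.MethodType(f, 23)       # prebind the first argument to 23
assert g(45, 67) == (23, 45, 67)  # fine in __call__ terms

h = types.MethodType(g, 45)       # and it can be applied repeatedly
assert h(67) == (23, 45, 67)

class X:
    pass
X.f = g          # but set it as a class attribute ...
raised = False
try:
    X().f(45)    # ... and access performs no further binding: z is missing
except TypeError:
    raised = True
assert raised    # the Python 2 session shows the same failure
```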

I'm not sure if the best approach is to change boundmethods' own
__get__, or to have another descriptor for the purpose:

>>> class binder(object):
...     def __init__(self, callable): self.callable = callable
...     def __get__(self, obj, cls): return curry(self.callable, obj, cls)
...
23 45 <__main__.X object at 0x402dfa8c>

This ``binder'' approach would have the advantage of allowing ANY
callable whatsoever, e.g.:
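A Python 3 sketch of the binder idea, substituting functools.partial for the curry helper Alex assumes, and binding only the instance for simplicity:

```python
from functools import partial

class binder:
    def __init__(self, callable_):
        self.callable = callable_
    def __get__(self, obj, cls):
        # On attribute access, hand back the callable with the instance
        # pre-bound -- the same job function.__get__ does for functions.
        return partial(self.callable, obj)

def f(x, y):
    return (x, y)

class X:
    pass
X.f = binder(f)

x = X()
assert x.f(45) == (x, 45)   # ANY callable gets bound like a method
```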

Of course, a custom metaclass could easily wrap 'binder' around any
class attributes which are callable but aren't descriptors, to get the
same effect more transparently. Hmmmm -- the tradeoffs aren't clear to
me at this time. That's exactly what the PEP process is for... ensure
that any tradeoffs ARE discussed in depth before any change is made.


Alex
 

Alex Martelli

Antoon Pardon said:
I think python is a bit inconsistent here, by using 'def' for
two different things.

It doesn't.
I think it would have been more consistent
if def always produces a function

It does.

def g(self): pass

class A(object):
    f = g

class B(object):
    def f(self): pass

class C(object): pass
C.f = g

class D(object):
    f = B.f

These four classes are essentially equivalent. def always produces a
function object and binds it to the name coming after keyword 'def'.
Any such function object, no matter how produced and how bound hither
and thither, always behaves in exactly the same way.
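The four-way equivalence is easy to verify; a sketch in Python 3 syntax (where B.f is simply the function itself, unbound methods having been dropped):

```python
def g(self):
    return self

class A:
    f = g

class B:
    def f(self):
        return self

class C:
    pass
C.f = g

class D:
    f = B.f      # in Python 3, B.f is the plain function

for cls in (A, B, C, D):
    obj = cls()
    assert obj.f() is obj   # identical binding behavior in all four
```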

You're free to like or dislike this arrangement, but calling it
"inconsistent" is a madman's folly, as it is TOTALLY consistent.


Alex
 

John Lenton

def always produces a
function object and binds it to the name coming after keyword 'def'.
Any such function object, no matter how produced and how bound hither
and thither, always behaves in exactly the same way.

This isn't exactly true:

>>> class D:
...     def __new__(*a): pass
...
>>> D.__new__
<unbound method D.__new__>

of course, __new__ is special-cased (*some*body should've read "import
this", especially the part "explicit is better than implicit").

--
John Lenton ([email protected])
 

Antoon Pardon

On 2005-02-07, Alex Martelli said:
It doesn't.


It does.

def g(self): pass

class A(object):
    f = g

class B(object):
    def f(self): pass

class C(object): pass
C.f = g

class D(object):
    f = B.f

These four classes are essentially equivalent. def always produces a
function object and binds it to the name coming after keyword 'def'.
Any such function object, no matter how produced and how bound hither
and thither, always behaves in exactly the same way.

You're free to like or dislike this arrangement, but calling it
"inconsistent" is a madman's folly, as it is TOTALLY consistent.

Yes it is inconsistent with the rest of python. That you found
a subset in which it is consistent doesn't change that.

And what if you do:

c = C()
c.f = g


The fact that a function in a class performs a lot of magic if
it is called through an instance, that isn't performed otherwise,
makes python inconsistent here. You may like the arrangement
(and it isn't such a big deal IMO) but that doesn't make it consistent.
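The difference Antoon points at -- c.f = g versus C.f = g -- comes down to the fact that only *class* attributes go through the descriptor protocol on access; a Python 3 sketch:

```python
def g(self):
    return self

class C:
    pass

c = C()
c.f = g             # instance attribute: fetched back as-is, unbound
assert c.f is g
assert c.f(c) is c  # the caller must supply "self" explicitly

C.f = g             # class attribute: access through an instance binds
d = C()             # (a fresh instance, so c's own f doesn't shadow it)
assert d.f() is d
```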
 

John Lenton

The fact that a function in a class performs a lot of magic if
it is called through an instance, that isn't performed otherwise,
makes python inconsistent here. You may like the arrangement
(and it isn't such a big deal IMO) but that doesn't make it consistent.

I vote for accepting the fact (it goes with the "practicality beats
purity" bit, because otherwise all our methods would have to start
with an @instancemethod). This doesn't justify (IMVVHO) a new keyword,
like you (was it you?) seemed to imply (do you really mean for
instancemethod (or somesuch) and classmethod to become keywords?). I
think it's fine the way it is: there's an implicit @instancemethod,
and @staticmethod (which you'd have be the default, if I read you
right) has to be explicit, when in classes. I think __new__ being an
exception to this is a (minor) wart; in fact it feels like premature
optimization (how many __new__s do you write, that you can't stick a
@staticmethod in front of them?).

 

Antoon Pardon

On 2005-02-07, John Lenton said:

I vote for accepting the fact (it goes with the "practicality beats
purity" bit,

But before one can produce the "practicality beats purity" argument,
one has to accept this isn't pure. That was all I was saying here,
python is not consistent/pure here. Now there can be good arguments
to have this that offset the inconsistency, but the python people
shouldn't then try to argue that it is consistent anyway.
because otherwise all our methods would have to start
with an @instancemethod). This doesn't justify (IMVVHO) a new keyword,
like you (was it you?) seemed to imply (do you really mean for
instancemethod (or somesuch) and classmethod to become keywords?). I
think it's fine the way it is: there's an implicit @instancemethod,
and @staticmethod (which you'd have be the default if I read you
right) has to be explicit, when in classes.

I'm not saying I would have it so, I'm saying that would be the
pure/consistent way to do it.
 

Alex Martelli

Antoon Pardon said:
Yes it is inconsistent with the rest of python. That you found
a subset in which it is consistent doesn't change that.

And what if you do:

c = C()
c.f = g


The fact that a function in a class performs a lot of magic if
it is called through an instance, that isn't performed otherwise,
makes python inconsistent here. You may like the arrangement
(and it isn't such a big deal IMO) but that doesn't make it consistent.

Any descriptor (be it a function or otherwise) has its __get__ method
called, when _accessed_ by attribute syntax, if and only if that
descriptor is in a class. _ALL_ of Python is perfectly consistent on
this point, and if I didn't already know the kind of crazy and obviously
false assertions that you post *QUITE* consistently, I would be
astonished to see you claim otherwise. Knowing your posts, this latest
idiocy is perfectly "par for the course".

"A lot of magic" is simply a stupid and imprecise way to describe "the
__get__ method gets called". Saying that any of this happens when the
function is CALLED is a definitely more serious mistake, since it's
absolutely obvious that the __get__ method is called when the function
(or any other attribute) is *ACCESSED* -- the call operation (on
whatever object __get__ returns) happens AFTERWARDS.
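Alex's access-versus-call distinction can be made concrete with a minimal descriptor (a Python 3 sketch; the Tracer class is invented for illustration):

```python
events = []

class Tracer:
    def __get__(self, obj, cls):
        events.append('get')                   # runs at *access* time
        return lambda: events.append('call')   # what the access hands back

class C:
    attr = Tracer()

c = C()
f = c.attr    # attribute access alone invokes __get__ ...
assert events == ['get']
f()           # ... the call on the returned object happens afterwards
assert events == ['get', 'call']
```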

Why you, and a few other habitual trolls, keep lowering the signal to
noise ratio of this newsgroup with your blatherings, I don't know; I'm
sure this behavior must be giving you guys some kind of satisfaction.
Whether the damage you do to the clarity of the issues, and to the
understanding of newbies who are unfortunate enough to read and trust
the many imprecise and/or utterly false assertions you keep making, with
totally unjustified airs of competence, is part of your jollies, or just
a side effect you don't care a whit about, I don't know either. Guess
I'll just killfile you for another month now -- wish MacSOUP had a
simple way to do permanent killfiling, since it's pretty obvious by now
that it's quite unlikely you'll ever post anything worthwhile at all.


Alex
 

Alex Martelli

John Lenton said:
I think __new__ being an
exception to this is a (minor) wart, in fact it feels like premature
optimization (how many __new__s do you write, that you can't stick a
@staticmethod in front of them?).

I personally think it's quite reasonable for Python's infrastructure to
consider each special name specially -- that's what the double
underscores before and after are FOR, after all. Since __new__ MUST be
a staticmethod, and there's no use case for it ever being otherwise, it
seems quite sensible for the default metaclass to MAKE it a staticmethod
(and otherwise treat it specially, as it needs to be). I don't see what
optimization has to do with it (quite apart from the fact that __new__
was introduced before the splatsyntax for decorators, so that rather
than sticking anything in front the alternative would have been to
demand serious boilerplate, a '__new__ = staticmethod(__new__)' after
every definition of such a method). Reducing boilerplate with no ill
effects whatsoever seems quite a worthy goal to me, when reachable.
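The implicit wrapping Alex describes is observable in Python 3 as well; a sketch:

```python
class A:
    def __new__(cls):
        # No decorator needed: type special-cases __new__ into a
        # staticmethod when the class is created.
        return super().__new__(cls)

assert isinstance(vars(A)['__new__'], staticmethod)
assert type(A()) is A   # and instantiation works as usual
```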


Alex
 
