is decorator the right thing to use?


Dmitry S. Makovey

Hi,

After hearing a lot about decorators and never actually using one, I have
decided to give them a try. My particular use case is that I have a class that
acts as a proxy to other classes (i.e. it passes messages along to those
classes); however, hand-coding this type of class is rather tedious, so I
decided to use a decorator for it. Can somebody tell me if what I'm doing
is a potential shot-in-the-foot, or am I on the right track? (Note: it's a
rather rudimentary proof-of-concept implementation and not the final
solution I'm about to employ, so there is no optimization or
signature-preserving code there yet, just the idea.)

Here's the code:

class A:
    b=None
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a


def Aproxy(fn):
    def delegate(*args,**kw):
        print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
        args=list(args)
        b=getattr(args[0],'b')
        fnew=getattr(b,fn.__name__)
        # get rid of original object reference
        del args[0]
        fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

class B:
    def __init__(self):
        self.val='bval'

    @Aproxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @Aproxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val

b=B()
b.bmethod('foo')
a=A(b)
b=B()
b.val='newval'
a.bmethod('bar')
a.bmethod2('zam')
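
Since there is no signature-preserving code yet: a minimal sketch of how the
delegate could at least keep the wrapped function's name and docstring, using
functools.wraps (an illustration only, not part of the code above):

import functools

def Aproxy(fn):
    @functools.wraps(fn)              # copy fn.__name__ and fn.__doc__ onto the wrapper
    def delegate(*args, **kw):
        print "%s::%s" % (args[0].__class__.__name__, fn.__name__)
        b = getattr(args[0], 'b')
        fnew = getattr(b, fn.__name__)
        return fnew(*args[1:], **kw)  # drop the proxy instance, forward the rest
    setattr(A, fn.__name__, delegate)
    return fn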
 

showell

Hi,

After hearing a lot about decorators and never actually using one, I have
decided to give them a try. My particular use case is that I have a class that
acts as a proxy to other classes (i.e. it passes messages along to those
classes); however, hand-coding this type of class is rather tedious, so I
decided to use a decorator for it. Can somebody tell me if what I'm doing
is a potential shot-in-the-foot, or am I on the right track? (Note: it's a
rather rudimentary proof-of-concept implementation and not the final
solution I'm about to employ, so there is no optimization or
signature-preserving code there yet, just the idea.)

Your code below is very abstract, so it's kind of hard to figure out
what problem you're trying to solve, but it seems to me that you're
using the B proxy class to decorate the A target class, which means
you want one of these options:

1) Put decorators over the methods in A, not B. Isn't it the
methods of A that are being decorated here?

2) Eliminate the decorator syntax and make your code more
expressive:

a = SomeClass()
# first call it directly
x = a.foo()
y = a.bar()
# now decorate it
debug_proxy = ClassThatDecoratesMethodCallsToObjectWithDebuggingCode(a)
debug_proxy.decorate_methods('foo', 'bar')

The decorate_methods method would be magical, in terms of overwriting
a's innards, while still preserving the same interface for its users.
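
A rough sketch of what that "magic" might look like (all names here are
hypothetical, just to make the idea concrete):

class SomeClass(object):                    # hypothetical target class
    def foo(self):
        print "real foo"
    def bar(self):
        print "real bar"

class DebugProxyExample(object):            # hypothetical short name for the decorating class
    def __init__(self, target):
        self._target = target

    def _make_wrapper(self, name, original):
        def wrapped(*args, **kw):
            print "calling %s" % name       # the "debugging code"
            return original(*args, **kw)
        return wrapped

    def decorate_methods(self, *names):
        # overwrite the bound methods on the target instance itself,
        # so callers keep using a.foo() / a.bar() unchanged
        for name in names:
            original = getattr(self._target, name)
            setattr(self._target, name, self._make_wrapper(name, original))

a = SomeClass()
a.foo()                                     # real foo
debug_proxy = DebugProxyExample(a)
debug_proxy.decorate_methods('foo', 'bar')
a.foo()                                     # calling foo, then real foo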

But again, I'm just guessing here, because it's hard to know what
problem you're really solving.

Cheers,

Steve

Code quoted below:
Here's the code:

class A:
    b=None
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

def Aproxy(fn):
    def delegate(*args,**kw):
        print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
        args=list(args)
        b=getattr(args[0],'b')
        fnew=getattr(b,fn.__name__)
        # get rid of original object reference
        del args[0]
        fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

class B:
    def __init__(self):
        self.val='bval'

    @Aproxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @Aproxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val

b=B()
b.bmethod('foo')
a=A(b)
b=B()
b.val='newval'
a.bmethod('bar')
a.bmethod2('zam')
 

Aaron \Castironpi\ Brady

Hi,

After hearing a lot about decorators and never actually using one, I have
decided to give them a try. My particular use case is that I have a class that
acts as a proxy to other classes (i.e. it passes messages along to those
classes); however, hand-coding this type of class is rather tedious, so I
decided to use a decorator for it. Can somebody tell me if what I'm doing
is a potential shot-in-the-foot, or am I on the right track? (Note: it's a
rather rudimentary proof-of-concept implementation and not the final
solution I'm about to employ, so there is no optimization or
signature-preserving code there yet, just the idea.)

Here's the code:

class A:
    b=None
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

def Aproxy(fn):
    def delegate(*args,**kw):
        print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
        args=list(args)
        b=getattr(args[0],'b')
        fnew=getattr(b,fn.__name__)
        # get rid of original object reference
        del args[0]
        fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

class B:
    def __init__(self):
        self.val='bval'

    @Aproxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @Aproxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val

b=B()
b.bmethod('foo')
a=A(b)
b=B()
b.val='newval'
a.bmethod('bar')
a.bmethod2('zam')

It might help to tell us the order of events that you want in your
program. You're not using 'mymethod' or 'mymethod2', and you probably
want 'return fnew' for the future. Something dynamic with __getattr__
might work. Any method call to A, that is an A instance, tries to
look up a method of the same name in the B instance it was initialized
with.

a.foo( ) -> 'in a.foo' -> 'calling b.foo' -> 'return' -> 'return'
a.mymethod( ) -> 'in a.mymethod' -> 'calling b.mymethod' ->
AttributeError: 'b' has no attribute 'mymethod'.
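
A minimal sketch of that __getattr__ idea, assuming the A instance keeps its B
instance in self.b (this is not from the original post):

class A(object):
    def __init__(self, b):
        self.b = b

    def mymethod(self, a):            # A's own methods are found by normal lookup
        print "A::mymethod, ", a

    def __getattr__(self, name):
        # only invoked when normal attribute lookup fails,
        # so it never shadows A's own methods
        return getattr(self.b, name)  # AttributeError propagates if B lacks it too

# a = A(B()); a.bmethod('foo')  would then end up calling b.bmethod('foo')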
 

Dmitry S. Makovey

Your code below is very abstract, so it's kind of hard to figure out
what problem you're trying to solve, but it seems to me that you're
using the B proxy class to decorate the A target class, which means
you want one of these options:

Sorry for the lack of clarity in the original post. Basically, A aggregates an
object of class B (example with no decorators; again, it's oversimplified):

class A:
    b=None
    def __init__(self,b):
        self.b=b

    def amethod(self,a):
        print "A::amethod ", a

    def bmethod(self,a):
        print "A::bmethod ",a
        return self.b.bmethod(a)

    def bmethod2(self,a,z):
        print "A::bmethod2 ",a,z
        return self.b.bmethod2(a,z)


class B:
    def __init__(self):
        self.val='bval'

    def bmethod(self,a):
        print "B::bmethod ",a

    def bmethod2(self,a,z):
        print "B::bmethod2 ",a,z


b=B()
a=A(b)
a.bmethod('foo')
a.bmethod2('bar','baz')

In my real-life case A is a proxy to B, C and D instances/objects, not just
one. As you can see in the above code, whenever I write a new method in B, C
or D I have to modify A, and even when I change a signature (say, add a
parameter x to bmethod) in B, C or D I have to make sure A stays synchronized.
I was hoping to use a decorator to do that automatically for me. Since the
resulting code is virtually identical for all those proxy methods, it seems a
good place for automation. Or am I wrong in assuming that? (Since this is my
first time using decorators, I honestly don't know.)

The above code illustrates what I am doing right now. My original post
is an attempt to make things more automated/foolproof.
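
Just to show how mechanical that boilerplate is: the same forwarding methods
could be stamped out in a loop instead of written by hand (a sketch against
the A and B classes above; the list of method names is the only thing that
would need maintaining):

def make_forwarder(name):
    def forward(self, *args, **kw):
        print "A::%s " % name, args
        return getattr(self.b, name)(*args, **kw)
    return forward

for _name in ('bmethod', 'bmethod2'):
    setattr(A, _name, make_forwarder(_name))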
 

Dmitry S. Makovey

Aaron "Castironpi" Brady wrote:
It might help to tell us the order of events that you want in your
program. You're not using 'mymethod' or 'mymethod2', and you probably
want 'return fnew' for the future. Something dynamic with __getattr__
might work. Any method call to A, that is an A instance, tries to
look up a method of the same name in the B instance it was initialized
with.

Well, 'mymethod' and 'mymethod2' were there just to show that A doesn't
function as a pure proxy - it has methods of its own. See my response to
Steve - I proxy messages to more than one aggregated object. Going over
them in __getattr__ to look up methods just doesn't seem to be really
efficient to me (I might be wrong though). Decorators seemed to present a
good opportunity to simplify the code (well, except for the decorator
function itself :) ), make the code a bit more "fool-proofed" (and give me the
opportunity to test decorators in real life, he-he).

So the decorators inside of B just identify that those methods will be proxied
by A. On one hand, from a logical standpoint it's kind of weird to tell a class
that it is going to be proxied by another class, but the declaration would be
real close to the original function definition, which helps to identify where
it is used.

Note that my decorator doesn't change the original function - it's a subversion
of decorators to a certain degree, as I'm just hooking into Python machinery
to add methods to A upon their declaration in B (or so I think).
 

Dmitry S. Makovey

Dmitry said:
In my real-life case A is a proxy to B, C and D instances/objects, not
just one.

forgot to mention that above would mean that I need to have more than one
decorator function like AproxyB, AproxyC and AproxyD or make Aproxy smarter
about which property of A has instance of which class etc.

Unless I'm totally "out for lunch" and there are better ways of implementing
this (other than copy-pasting stuff whenever anything in B, C or D
changes).
 

Diez B. Roggisch

Dmitry said:
forgot to mention that above would mean that I need to have more than one
decorator function like AproxyB, AproxyC and AproxyD or make Aproxy smarter
about which property of A has instance of which class etc.

Unless I'm totally "out for lunch" and there are better ways of implementing
this (other than copy-pasting stuff whenever anything in B, C or D
changes).

__getattr__?

class Proxy(object):

    def __init__(self, delegate):
        self._delegate = delegate

    def __getattr__(self, attr):
        v = getattr(self._delegate, attr)
        if callable(v):
            class CallInterceptor(object):
                def __init__(self, f):
                    self._f = f

                def __call__(self, *args, **kwargs):
                    print "Called " + str(self._f) + " with " + str(args) + str(kwargs)
                    return self._f(*args, **kwargs)
            return CallInterceptor(v)
        return v
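
For illustration, the Proxy above would be used roughly like this (a quick
sketch with a made-up Target class; untested):

class Target(object):                 # hypothetical delegate
    def greet(self, name):
        print "hello,", name

p = Proxy(Target())
p.greet('world')     # prints the "Called ..." line, then "hello, world"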


Decorators have *nothing* to do with this. They are syntactic sugar for


def foo(...):
...

foo = a_decorator(foo)

Nothing less, nothing more.

Diez
 

Bruno Desthuilliers

Dmitry S. Makovey wrote:
Well, 'mymethod' and 'mymethod2' were there just to show that A doesn't
function as a pure proxy - it has methods of its own. See my response to
Steve - I proxy messages to more than one aggregated object. Going over
them in __getattr__ to look up methods just doesn't seem to be really
efficient to me (I might be wrong though). Decorators seemed to present a
good opportunity to simplify the code (well, except for the decorator
function itself :) ), make the code a bit more "fool-proofed" (and give me the
opportunity to test decorators in real life, he-he).

So the decorators inside of B just identify that those methods will be proxied
by A. On one hand, from a logical standpoint it's kind of weird to tell a class
that it is going to be proxied by another class,

Indeed - usually, proxied objects shouldn't have to be aware of the
fact. That doesn't mean your variation on the proxy pattern is
necessarily bad design (hard to tell without lot of context anyway...),
but still there's some alarm bell ringing here IMHO - IOW : possibly the
right thing to do, but needs to be double-checked.
but the declaration would be real close to the original function definition,
which helps to identify where it is used.

Note that my decorator doesn't change the original function - it's a subversion
of decorators to a certain degree, as I'm just hooking into Python machinery
to add methods to A upon their declaration in B (or so I think).

I wouldn't call this a "subversion" of decorators - it's even a pretty
common idiom to use decorators to flag some functions/methods for
special use.

Now I'm not sure I really like your implementation. Here's a possible
rewrite using a custom descriptor:

class Proxymaker(object):
    def __init__(self, attrname):
        self.attrname = attrname

    def __get__(self, instance, cls):
        def _proxied(fn):
            fn_name = fn.__name__
            def delegate(inst, *args, **kw):
                target = getattr(inst, self.attrname)
                #return fn(target, *args,**kw)
                method = getattr(target, fn_name)
                return method(*args, **kw)

            delegate.__name__ = "%s_%s_delegate" % \
                (self.attrname, fn_name)

            setattr(cls, fn_name, delegate)
            return fn

        return _proxied

class A(object):
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    proxy2b = Proxymaker('b')

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

class B(object):
    def __init__(self):
        self.val='bval'

    @A.proxy2b
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @A.proxy2b
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val
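
A quick usage sketch of the code above (same objects as in the original
example; untested):

b = B()
a = A(b)
a.bmethod('bar')     # goes through the generated A.bmethod and ends up in b.bmethod
a.bmethod2('zam')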


My point is that:
1/ you shouldn't have to rewrite a decorator function - with basically
the same code - for each possible proxy class / attribute name combination;
2/ making the decorator an attribute of the proxy class makes the
dependencies clearer (well, IMHO at least).

I'm still a bit uneasy wrt the high coupling between A and B, and if I was
to end up with such a design, I'd probably take some time to be sure
it's really ok.

My cents...
 

Dmitry S. Makovey

Diez said:
__getattr__?

See, in your code you're assuming that there's only one property ('b')
inside of A that needs proxying. In reality I have several, so in your code
self._delegate would have to be at least a tuple or a list. Plus, what you're
doing is promiscuously passing any method not found in Proxy on to
self._delegate, which is not what I need: I need to pass only a subset of
calls. So now your code needs to acquire a dictionary of "allowed" calls and
go over all the delegates to find which one has the method, which is not
efficient, since there IS a 1:1 mapping of A::method -> B::method, so lookups
shouldn't be necessary IMO (for performance reasons).
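
A sketch of what a multi-delegate variant with an explicit "allowed" mapping
might look like (attribute names are invented for illustration):

class Proxy(object):
    # maps proxied method name -> attribute name of the delegate holding it
    _allowed = {'bmethod': '_b', 'cmethod': '_c'}

    def __init__(self, b, c):
        self._b = b
        self._c = c

    def __getattr__(self, name):
        try:
            holder = self._allowed[name]
        except KeyError:
            raise AttributeError(name)
        return getattr(getattr(self, holder), name)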
class Proxy(object):

    def __init__(self, delegate):
        self._delegate = delegate

    def __getattr__(self, attr):
        v = getattr(self._delegate, attr)
        if callable(v):
            class CallInterceptor(object):
                def __init__(self, f):
                    self._f = f

                def __call__(self, *args, **kwargs):
                    print "Called " + str(self._f) + " with " + str(args) + str(kwargs)
                    return self._f(*args, **kwargs)
            return CallInterceptor(v)
        return v
Decorators have *nothing* to do with this. They are syntactic sugar for
def foo(...):
...
foo = a_decorator(foo)

Exactly, and in my case they would've simplified code reading/maintenance.
However, the introduced "tight coupling" (A knows about B and B should know
about A) is something that worries me, and I'm trying to figure out whether
there is another way to use decorators for my scenario, or another way of
achieving the same thing without decorators and without bloating up the code
with the alternative solution.

Another way could be to use a metaclass to populate the class with methods
upon declaration, but that presents quite a bit of "special" cruft, which is
more than I have to do with decorators :) (but maybe it's all *necessary*?)
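
For comparison, a rough sketch of what that metaclass route could look like
(all names are invented; this is just one way the "special" cruft might be
arranged):

def make_delegate(holder_attr, name):
    def delegate(self, *args, **kw):
        return getattr(getattr(self, holder_attr), name)(*args, **kw)
    return delegate

class ProxyMeta(type):
    def __new__(meta, clsname, bases, namespace):
        # 'proxied' maps method name -> name of the attribute holding the delegate,
        # e.g. {'bmethod': 'b', 'cmethod': 'c'}
        for name, holder_attr in namespace.get('proxied', {}).items():
            if name not in namespace:
                namespace[name] = make_delegate(holder_attr, name)
        return type.__new__(meta, clsname, bases, namespace)

class A(object):
    __metaclass__ = ProxyMeta
    proxied = {'bmethod': 'b', 'bmethod2': 'b'}

    def __init__(self, b):
        self.b = b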
 

Dmitry S. Makovey

Thanks Bruno,

your comments were really helpful (so was the "improved" version of code).

My replies below:

Bruno said:
Indeed - usually, proxied objects shouldn't have to be aware of the
fact. That doesn't mean your variation on the proxy pattern is
necessarily bad design (hard to tell without lot of context anyway...),
but still there's some alarm bell ringing here IMHO - IOW : possibly the
right thing to do, but needs to be double-checked.

I'm kind of looking at options and not dead-set on decorators, but I can't
find any other "elegant enough" solution which wouldn't lead to such tight
coupling. The problem I'm trying to solve is not much more complicated than
what I have already described so if anybody can suggest a better approach -
I'm all for it.
Now I'm not sure I really like your implementation. Here's a possible
rewrite using a custom descriptor:

Yeah, that was going to be my next step - I was just aiming for
proof-of-concept more than efficient code :)
class Proxymaker(object):
    def __init__(self, attrname):
        self.attrname = attrname

    def __get__(self, instance, cls):
        def _proxied(fn):
            fn_name = fn.__name__
            def delegate(inst, *args, **kw):
                target = getattr(inst, self.attrname)
                #return fn(target, *args,**kw)
                method = getattr(target, fn_name)
                return method(*args, **kw)

            delegate.__name__ = "%s_%s_delegate" % \
                (self.attrname, fn_name)

            setattr(cls, fn_name, delegate)
            return fn

        return _proxied

class A(object):
    def __init__(self,b):
        self.val='aval'
        self.b=b
        b.val='aval'

    proxy2b = Proxymaker('b')

    def mymethod(self,a):
        print "A::mymethod, ",a

    def mymethod2(self,a):
        print "A::another method, ",a

class B(object):
    def __init__(self):
        self.val='bval'

    @A.proxy2b
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @A.proxy2b
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val
My point is that:
1/ you shouldn't have to rewrite a decorator function - with basically
the same code - for each possible proxy class / attribute name combination;
2/ making the decorator an attribute of the proxy class makes the
dependencies clearer (well, IMHO at least).

agreed on all points
I'm still a bit uneasy wrt the high coupling between A and B, and if I was
to end up with such a design, I'd probably take some time to be sure
it's really ok.

that is the question that troubles me at this point - thus my original post
(read the subject line ;) ). I like the clarity decorators bring to the
code and the fact that it's a solution pretty much "out-of-the-box" without
need to create something really-really custom, but I'm worried about tight
coupling and somewhat backward logic that they would introduce (the way I
envisioned them).
 

Bruno Desthuilliers

Dmitry S. Makovey wrote:
Thanks Bruno,

your comments were really helpful (so was the "improved" version of code).

My replies below:



I'm kind of looking at options and not dead-set on decorators, but I can't
find any other "elegant enough" solution which wouldn't lead to such tight
coupling. The problem I'm trying to solve is not much more complicated than
what I have already described

Well... You didn't mention why you need a proxy to start with !-)

so if anybody can suggest a better approach -
I'm all for it.
(snip code)
agreed on all points


that is the question that troubles me at this point - thus my original post
(read the subject line ;) ). I like the clarity decorators bring to the
code and the fact that it's a solution pretty much "out-of-the-box" without
need to create something really-really custom, but I'm worried about tight
coupling and somewhat backward logic that they would introduce (the way I
envisioned them).

Well... The canonical solution for delegation in Python is using
__getattr__. Your problem - according to this post and your answer to
Diez - is that your proxy may have to
1/ delegate to more than one object
2/ don't necessarily delegate each and any attribute access

I can envision one solution using both __getattr__ and a simple decorator:

def proxy(func):
    func._proxied = True
    return func

class A(object):
    def __init__(self, delegates):
        self._delegates = delegates

    def __getattr__(self, name):
        for d in self._delegates:
            # default of None so a delegate lacking the attribute
            # doesn't abort the search
            func = getattr(d, name, None)
            if callable(func) and getattr(func, '_proxied', False):
                return func
        raise AttributeError(
            "object %s has no attribute %r" % (self.__class__, name)
            )


class B(object):
    def __init__(self):
        self.val='bval'

    @proxy
    def bmethod(self,a):
        print "B::bmethod"
        print a, self.val

    @proxy
    def bmethod2(self,a):
        print "B::bmethod2"
        print a, self.val


class C(object):
    def __init__(self):
        self.val='cval'

    @proxy
    def cmethod(self,a):
        print "C::cmethod"
        print a, self.val

    @proxy
    def cmethod2(self,a):
        print "C::cmethod2"
        print a, self.val


a = A([B(), C()])

# not tested...


This solves most of the coupling problems (B and C still have to make
clear which methods are to be proxied, but at least they need not know
which class will be used as proxy), and it makes sure only 'allowed' method
calls are delegated. But I wouldn't call it a perfect solution either:
if you do have more than one object with a method xxx, only the first
one will match... And let's not talk about the lookup penalty.

There's a possible variant that avoids the call to __getattr__ (in
short: attaching delegation instancemethods to the A instance in the
initializer, for each proxied method of the delegates), but that won't solve
the problem of potential name clashes.
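
That variant could look something like this (a sketch reusing the proxy flag
above; untested, and still subject to the same name-clash caveat):

class A(object):
    def __init__(self, delegates):
        self._delegates = delegates
        for d in delegates:
            for name in dir(d):
                attr = getattr(d, name)
                if callable(attr) and getattr(attr, '_proxied', False):
                    # store the delegate's bound method directly on the proxy
                    # instance, so later calls skip __getattr__ entirely
                    setattr(self, name, attr)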


My 2 cents...
 

Bruno Desthuilliers

Aaron "Castironpi" Brady wrote:
(snip)
You should write it like this:

class B(object):
    @A.proxy
    def bmethod(self,a):

Making 'proxy' a class method on A.

That's exactly what I wanted to avoid here: making B depend on A.

(snip)
I agree that __setattr__ is the canonical solution to proxy,

Err... I assume you mean '__getattr__' ???
but you
have stated that you want each proxied method to be a member in the
proxy class.

This doesn't necessarily imply that "proxied" classes need to know about
the "proxying" class. FWIW, that was the whole point: decoupling.
 

Aaron \Castironpi\ Brady

Dmitry S. Makovey wrote:


Thanks Bruno,
your comments were really helpful (so was the "improved" version of code).
My replies below:
I'm kind of looking at options and not dead-set on decorators, but I can't
find any other "elegant enough" solution which wouldn't lead to such tight
coupling. The problem I'm trying to solve is not much more complicated than
what I have already described

Well... You didn't mention why you need a proxy to start with !-)
so if anybody can suggest a better approach -
I'm all for it.

(snip code)


agreed on all points
that is the question that troubles me at this point - thus my original post
(read the subject line ;) ).  I like the clarity decorators bring to the
code and the fact that it's a solution pretty much "out-of-the-box" without
need to create something really-really custom, but I'm worried about tight
coupling and somewhat backward logic that they would introduce (the way I
envisioned them).

Well... The canonical solution for delegation in Python is using
__getattr__. Your problem - according to this post and your answer to
Diez - is that your proxy may have to
1/ delegate to more than one object
2/ don't necessarily delegate each and any attribute access

I can envision one solution using both __getattr__ and a simple decorator:

def proxy(func):
    func._proxied = True
    return func

class A(object):
     def __init__(self, delegates):
         self._delegates = delegates

     def __getattr__(self, name):
         for d in self._delegates:
             func = getattr(d, name, None)
             if callable(func) and getattr(func, '_proxied', False):
                 return func
         raise AttributeError(
               "object %s has no attribute %r" % (self.__class__, name)
               )

class B(object):
     def __init__(self):
         self.val='bval'

     @proxy
     def bmethod(self,a):
         print "B::bmethod"
         print a, self.val

     @proxy
     def bmethod2(self,a):
         print "B::bmethod2"
         print a, self.val

class C(object):
     def __init__(self):
         self.val='cval'

     @proxy
     def cmethod(self,a):
         print "C::cmethod"
         print a, self.val

     @proxy
     def cmethod2(self,a):
         print "C::cmethod2"
         print a, self.val

a = A([B(), C()])

# not tested...

This solves most of the coupling problems (B and C still have to make
clear which methods are to be proxied, but at least they need not know
which class will be used as proxy), and it makes sure only 'allowed' method
calls are delegated. But I wouldn't call it a perfect solution either:
if you do have more than one object with a method xxx, only the first
one will match... And let's not talk about the lookup penalty.

There's a possible variant that avoids the call to __getattr__ (in
short: attaching delegation instancemethods to the A instance in the
initializer, for each proxied method of the delegates), but that won't solve
the problem of potential name clashes.

My 2 cents...

You should write it like this:

class B(object):
    @A.proxy
    def bmethod(self,a):

Making 'proxy' a class method on A. In case different A instances (do
you have more than one BTW?) proxy different objects, you could make
it a plain old method.

a = A()
class B(object):
    @a.proxy
    def bmethod(self,a):

I recommend this solution so that if you add a method to a B instance
later, 'a' can be notified simply:

b.meth3 = a.proxy( meth3 )

The big problem with that is if 'a' has 'b' in its constructor. You
can reverse that, since B 'knows' about its proxy object quite a bit
anyway.

What you've said implies that you only have one B instance, or only
one per A instance. Is this correct?

I agree that __setattr__ is the canonical solution to proxy, but you
have stated that you want each proxied method to be a member in the
proxy class.
 

Dmitry S. Makovey

Aaron said:
You should write it like this:

class B(object):
    @A.proxy
    def bmethod(self,a):

Making 'proxy' a class method on A.

makes sense.
In case different A instances (do
you have more than one BTW?)

yep. I have multiple instances of class A, each one has properties (one per
class) of classes B, C and D:

class A:
    b=None
    c=None
    d=None
    def __init__(self,b,c,d):
        self.b=b
        self.c=c
        self.d=d

    ...magic with proxying methods goes here...

class B:
    def bmethod(self,x): pass # we proxy this method from A
    def bmethod2(self,x): pass # this is not proxied
class C:
    def cmethod(self,x): pass # we proxy this method from A
class D:
    def dmethod(self,x): pass # we proxy this method from A

a=A(B(),C(),D())
x='foo'
a.bmethod(x)
a.cmethod(x)
a.dmethod(x)
a.bmethod2(x) # raises error as we shouldn't proxy bmethod2

above is the ideal scenario.
What you've said implies that you only have one B instance, or only
one per A instance. Is this correct?

yes. as per above code.
I agree that __setattr__ is the canonical solution to proxy, but you
have stated that you want each proxied method to be a member in the
proxy class.

well. kind of. if I can make it transparent to the consumer so that he
shouldn't do:

a.b.bmethod(x)

but rather:

a.bmethod(x)

As I'm trying to keep b, c and d as private properties and would like to
filter which calls are allowed to those. Plus proxied methods in either one
always expect certain parameters like:

class B:
    def bmethod(self,c,x): pass

and A encapsulates 'c' already and can fill in that blank automagically:

class A:
    c=None
    b=None
    def bmethod(self,c,x):
        if not c:
            c=self.c
        self.b.bmethod(c,x)

I kept this part of the problem out of this discussion, as I'm pretty sure I
can fill those in once I figure out the basic problem of auto-populating the
proxy methods, since for each class/method they are going to be nearly
identical. If I can autogenerate them on the fly, I'm pretty sure I can add
some extra logic to them as well, including signature changes where
A::bmethod(self,c,x) would become A::bmethod(self,x), etc.
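
A sketch of what such a generated forwarder could look like once the "fill in
c automatically" logic is folded in (the helper name and defaults are
invented):

def make_forwarder(name, holder_attr='b', default_attr='c'):
    def forward(self, x, c=None):
        if c is None:
            c = getattr(self, default_attr)   # fill in the blank automagically
        return getattr(getattr(self, holder_attr), name)(c, x)
    return forward

# A::bmethod(self, x) now forwards to self.b.bmethod(c, x), with c defaulted from self.c
A.bmethod = make_forwarder('bmethod')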
 

Aaron \Castironpi\ Brady

makes sense.


yep. I have multiple instances of class A, each one has properties (one per
class) of classes B, C and D:

class A:
        b=None
        c=None
        d=None
        def __init__(self,b,c,d):
                self.b=b
                self.c=c
                self.d=d

        ...magic with proxying methods goes here...

class B:
        def bmethod(self,x): pass # we proxy this method from A
        def bmethod2(self,x): pass # this is not proxied
class C:
        def cmethod(self,x): pass # we proxy this method from A
class D:
        def dmethod(self,x): pass # we proxy this method from A

a=A(B(),C(),D())
x='foo'
a.bmethod(x)
a.cmethod(x)
a.dmethod(x)
a.bmethod2(x) # raises error as we shouldn't proxy bmethod2

above is the ideal scenario.


yes. as per above code.


well. kind of. if I can make it transparent to the consumer so that he
shouldn't do:

a.b.bmethod(x)

but rather:

a.bmethod(x)

As I'm trying to keep b, c and d as private properties and would like to
filter which calls are allowed to those. Plus proxied methods in either one
always expect certain parameters like:

class B:
        def bmethod(self,c,x): pass

and A encapsulates 'c' already and can fill in that blank automagically:

class A:
        c=None
        b=None
        def bmethod(self,c,x):
                if not c:
                        c=self.c
                self.b.bmethod(c,x)

I kept this part of the problem out of this discussion, as I'm pretty sure I
can fill those in once I figure out the basic problem of auto-populating the
proxy methods, since for each class/method they are going to be nearly
identical. If I can autogenerate them on the fly, I'm pretty sure I can add
some extra logic to them as well, including signature changes where
A::bmethod(self,c,x) would become A::bmethod(self,x), etc.

Do you want to couple instances or classes together?

If A always proxies for B, C, and D, then the wrapper solution isn't
bad. If you're going to be doing any instance magic, that can change
the solution a little bit.

There's also a revision of the first implementation of Aproxy you
posted, which could stand alone as you have it, or work as a
classmethod or staticmethod.
def Aproxy(fn):
    def delegate(*args,**kw):
        print "%s::%s" % (args[0].__class__.__name__,fn.__name__)
        args=list(args)
        b=getattr(args[0],'b')
        fnew=getattr(b,fn.__name__)
        # get rid of original object reference
        del args[0]
        fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

def Aproxy(fn):
    def delegate(self,*args,**kw):
        print "%s::%s" % (self.__class__.__name__,fn.__name__)
        fnew=getattr(self.b,fn.__name__)
        return fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn
 

Dmitry S. Makovey

Aaron said:
Do you want to couple instances or classes together?

It would be nice to have objects of B, C and D classes not knowing that they
are proxied (as they are used on their own too, not only inside of A
objects).
If A always proxies for B, C, and D, then the wrapper solution isn't
bad.

the whole purpose of A is pretty much to proxy and filter. It's got some
extra logic to combine and manipulate b, c and d objects inside of A class
objects.
If you're going to be doing any instance magic, that can change
the solution a little bit.

There's also a revision of the first implementation of Aproxy you
posted, which could stand alone as you have it, or work as a
classmethod or staticmethod.

def Aproxy(fn):
    def delegate(self,*args,**kw):
        print "%s::%s" % (self.__class__.__name__,fn.__name__)
        fnew=getattr(self.b,fn.__name__)
        return fnew(*args,**kw)
    setattr(A,fn.__name__,delegate)
    return fn

yep, that does look nicer/cleaner :)
 

George Sakkis

It would be nice to have objects of B, C and D classes not knowing that they
are proxied (as they are used on their own too, not only inside of A
objects).

I'm not sure if the approach below deals with all the issues, but one
thing it does is decouple completely the proxied objects from the
proxy:

#======== usage ================================================================

from proxies import Proxy

class B(object):
    def __init__(self): self.val = 'bval'
    def bmethod(self,n): print "B::bmethod",n
    def bmethod2(self,n,m): print "B::bmethod2",n,m

class C(object):
    def __init__(self): self.val = 'cval'
    def cmethod(self,x): print "C::cmethod",x
    def cmethod2(self,x,y): print "C::cmethod2",x,y
    cattr = 4

class A(Proxy):
    DelegateMap = {
        'bmethod' : B,
        'bmethod2': B,
        'cmethod': C,
        # do NOT delegate C.cmethod2
        #'cmethod2': C,
        'cattr' : C,
    }

    def __init__(self, b, c):
        print "init A()"
        # must call Proxy.__init__(*delegates)
        super(A,self).__init__(b,c)

    def amethod(self,a):
        print "A::mymethod",a


if __name__ == '__main__':
    a = A(B(), C())
    a.amethod('foo')

    # test bounded methods
    a.bmethod('foo')
    a.bmethod2('bar','baz')
    a.cmethod('foo')
    try: a.cmethod2('bar','baz')
    except Exception, ex: print ex

    # works for unbound methods too
    A.bmethod(a,'foo')
    A.bmethod2(a,'bar','baz')
    A.cmethod(a, 'foo')
    try: A.cmethod2(a,'bar','baz')
    except Exception, ex: print ex

    # non callable attributes
    print A.cattr

#====== output ==================================
init A()
A::mymethod foo
B::bmethod foo
B::bmethod2 bar baz
C::cmethod foo
'A' object has no attribute 'cmethod2'
B::bmethod foo
B::bmethod2 bar baz
C::cmethod foo
type object 'A' has no attribute 'cmethod2'
4

#======== proxies.py =========================

class _ProxyMethod(object):
    def __init__(self, name):
        self._name = name
        def unbound(proxy, *args, **kwds):
            method = proxy._get_target_attr(name)
            return method(*args, **kwds)
        self._unbound = unbound

    def __get__(self, proxy, proxytype):
        if proxy is not None:
            return proxy._get_target_attr(self._name)
        else:
            return self._unbound


class _ProxyMeta(type):
    def __new__(meta, name, bases, namespace):
        for attrname,cls in namespace.get('DelegateMap', {}).iteritems():
            if attrname not in namespace:
                attr = getattr(cls, attrname)
                if callable(attr):
                    namespace[attrname] = _ProxyMethod(attrname)
                else:
                    namespace[attrname] = attr
        return super(_ProxyMeta,meta).__new__(meta, name, bases, namespace)


class Proxy(object):
    __metaclass__ = _ProxyMeta

    def __init__(self, *delegates):
        self._cls2delegate = {}
        for delegate in delegates:
            cls = type(delegate)
            if cls in self._cls2delegate:
                raise ValueError('More than one %s delegates were given' % cls)
            self._cls2delegate[cls] = delegate

    def _get_target_attr(self, name):
        try:
            cls = self.DelegateMap[name]
            delegate = self._cls2delegate[cls]
            return getattr(delegate, name)
        except (KeyError, AttributeError):
            raise AttributeError('%r object has no attribute %r' %
                                 (self.__class__.__name__, name))

HTH,
George
 

Dmitry S. Makovey

George said:
I'm not sure if the approach below deals with all the issues, but one
thing it does is decouple completely the proxied objects from the
proxy:

class _ProxyMeta(type):

<snip/>

It smelled more and more like a metaclass to me too, I was just trying to
avoid them :)

Your code looks awfully close to what I'm trying to do, except it looks a bit
heavier than decorators. Seems like decorators are not going to happen in
this part of the project for me anyway; however, the whole discussion gave me
a lot to think about. Thank you Bruno, Aaron, Diez and George.

Thanks for the concrete code with the metaclass. I'm going to study it
thoroughly to see if I can spot caveats/issues for my use cases, but it
seems to put me on the right track. I never used metaclasses before, and
decorators seemed a bit more straightforward to me :) ...oh well.
 

Bruno Desthuilliers

Dmitry S. Makovey wrote:
(snip)
I never used metaclasses before, and
decorators seemed a bit more straightforward to me :) ...oh well.

Don't be afraid !-) While it's true that they can be a bit confusing at
first, metaclasses are just classes - whose instances happen to be
classes themselves.

And while you're certainly right not to jump on metaclasses *before*
having spent a bit of time thinking about other possible solutions, there's
nothing wrong with using metaclasses when that's really what you need...
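
A tiny illustration of that point, using nothing but the standard machinery:

class Meta(type):
    pass

class Foo(object):
    __metaclass__ = Meta          # Foo is built by (i.e. is an instance of) Meta

print isinstance(Foo, Meta)       # True - the class itself is an instance of the metaclass
print isinstance(Foo(), Foo)      # True - instances work as usual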
 

Paul McGuire

See, in your code you're assuming that there's only one property ('b')
inside of A that needs proxying. In reality I have several, so in your code
self._delegate would have to be at least a tuple or a list. Plus, what you're
doing is promiscuously passing any method not found in Proxy on to
self._delegate, which is not what I need: I need to pass only a subset of
calls. So now your code needs to acquire a dictionary of "allowed" calls and
go over all the delegates to find which one has the method, which is not
efficient, since there IS a 1:1 mapping of A::method -> B::method, so lookups
shouldn't be necessary IMO (for performance reasons).

No, really, Diez has posted the canonical Proxy form in Python, using
__getattr__ on the proxy, and then redirecting to the contained
delegate object. This code does *not* assume that only one property
('b'? where did that come from?) is being redirected - __getattr__
will intercept all attribute lookups and redirect them to the
delegate.

If you need to get fancier and support this single-proxy-to-multiple-
delegates form, then yes, you will need some kind of map that says
which method should delegate to which object. Or, if it is just a
matter of precedence (try A, then try B, then...), then use hasattr to
see if the first delegate has the given attribute, and if not, move on
to the next.
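
For instance, a precedence-based lookup might look roughly like this (a sketch
along those lines, not taken from any of the code above):

class MultiProxy(object):            # hypothetical name
    def __init__(self, *delegates):
        self._delegates = delegates  # tried in order of precedence

    def __getattr__(self, name):
        for d in self._delegates:
            if hasattr(d, name):
                return getattr(d, name)
        raise AttributeError("%s has no attribute %r"
                             % (self.__class__.__name__, name))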
I'm trying to figure out whether there is another way to use decorators for
my scenario, or another way of achieving the same thing without decorators
and without bloating up the code with the alternative solution.

Another way could be to use a metaclass to populate the class with methods
upon declaration, but that presents quite a bit of "special" cruft, which is
more than I have to do with decorators :) (but maybe it's all *necessary*?)

Your original question was "is decorator the right thing to use?" For
this application, the answer is "no". It sounds like you are trying
to force this particular solution onto your problem, but you are
probably better off giving __getattr__ interception another look.

-- Paul
 
