[RFC] Parametric Polymorphism


Catalin Marinas

Hi,

Sorry if this was previously discussed but it's something I miss in
Python. I get around this using isinstance() but it would be cleaner
to have separate functions with the same name but different argument
types. I think the idea gets quite close to the Lisp/CLOS
implementation of methods.

Below is just a simple implementation example (class functions are
not supported), but it can be further extended/optimised/modified for
better type detection, e.g. using issubclass(). The idea is similar to
the @accepts decorator:


methods = dict()

def method(*types):
    def build_method(f):
        assert len(types) == f.func_code.co_argcount

        if not f.func_name in methods:
            methods[f.func_name] = dict()
        methods[f.func_name][str(types)] = f

        def new_f(*args, **kwds):
            type_str = str(tuple([type(arg) for arg in args]))
            assert type_str in methods[f.func_name]
            return methods[f.func_name][type_str](*args, **kwds)
        new_f.func_name = f.func_name

        return new_f

    return build_method


And its utilisation:

@method(int)
def test(arg):
    print 'int', arg

@method(float)
def test(arg):
    print 'float', arg

test(1) # succeeds
test(1.5) # succeeds
test(1, 2) # assert fails
test('aaa') # assert fails


Let me know what you think. Thanks.
 

Diez B. Roggisch

Catalin said:
Hi,

Sorry if this was previously discussed but it's something I miss in
Python. I get around this using isinstance() but it would be cleaner
to have separate functions with the same name but different argument
types. [...]

google for gnosis utils and multimethods to see a more "oldfashioned"
implementation. But your approach certainly is interesting - however, I
_rarely_ need such functionality. Usually duck-typing suits me well.

Regards,

Diez
 

Catalin Marinas

Diez B. Roggisch said:
google for gnosis utils and multimethods to see a more "oldfashioned"
implementation.

I now remember having seen it, but it requires a lot of typing and
you end up calling a different function name from the one you define,
which reduces code clarity.
But your approach certainly is interesting - however,

The idea is not new. Lisp/CLOS implements the defmethod macro which
creates a generic function (i.e. a dispatcher) with the same name as
the function you define:
http://www.lisp.org/HyperSpec/Body/mac_defmethod.html. Actually, the
defmethod macro is the base for implementing object polymorphism.
I _rarely_ need such functionality. Ususally duck-typing suits me
well.

Of course, duck-typing is simple to use, but parametric polymorphism
is useful when the types are unrelated. Let's say you want to
implement a colour-print function which should support basic types
like ints and floats as well as lists and dictionaries. In this case,
and thanks to function decorator support, the code would be clearer.

Another use case is to extend the functionality of a class using
functions when you cannot easily modify the class or create a subclass
(e.g. the objects are generated by some factory implemented in a
third-party library). Of course, you might be able to get around this,
but parametric polymorphism could reduce the amount of code written.

This idea only needs an optimised (and deterministic) implementation
of the best-parameter match based on subclass-superclass relations. It
can also be extended to implement polymorphism based on parameter
values (something people have tried to do in C++ with complicated
templates but, well, only at compile time, with obvious drawbacks).
 

Tom Anderson

Catalin Marinas said:
[...] The idea is similar to the @accepts decorator:

methods = dict()

def method(*types):
    def build_method(f):
        assert len(types) == f.func_code.co_argcount

        if not f.func_name in methods:
            methods[f.func_name] = dict()
        methods[f.func_name][str(types)] = f

        def new_f(*args, **kwds):
            type_str = str(tuple([type(arg) for arg in args]))
            assert type_str in methods[f.func_name]
            return methods[f.func_name][type_str](*args, **kwds)
        new_f.func_name = f.func_name

        return new_f

    return build_method

Neat. I'd come up with the same general idea myself, but since i am a
worthless slob, i never actually implemented it.

Is there any reason you have to stringify the type signature? Types are
hashable, so a tuple of types is hashable, so you can just use that as a
key. Replace "methods[f.func_name][str(types)] = f" with
"methods[f.func_name][types] = f" and "type_str = str(tuple([type(arg) for
arg in args]))" with "type_str = tuple(type(arg) for arg in args)". And
then rename type_str to types throughout.

Also, we can exploit the closureness of new_f to avoid a dict lookup:

f_implementations = methods[f.func_name]
def new_f(*args, **kwds):
    types = tuple(type(arg) for arg in args)
    return f_implementations[types](*args, **kwds)


tom
 

Kay Schluehr

Catalin said:
Hi,

Sorry if this was previously discussed but it's something I miss in
Python. I get around this using isinstance() but it would be cleaner
to have separate functions with the same name but different argument
types. I think the idea gets quite close to the Lisp/CLOS
implementation of methods.

Guido himself addressed multimethods in his Artima blog:

http://www.artima.com/weblogs/viewpost.jsp?thread=101605

See also the subsequent discussion about subtyping problems.

Kay
 

Catalin Marinas

Tom Anderson said:
Is there any reason you have to stringify the type signature? Types
are hashable, so a tuple of types is hashable, so you can just use
that as a key. Replace "methods[f.func_name][str(types)] = f" with
"methods[f.func_name][types] = f" and "type_str = str(tuple([type(arg)
for arg in args]))" with "type_str = tuple(type(arg) for arg in
args)". And then rename type_str to types throughout.

You're right. I initially had a list in there and it wasn't hashable
(and I thought tuples weren't either). Anyway, the implementation is
just an example; it should use isinstance() instead. The link posted
by Kay shows a better implementation.
 

Pierre Barbier de Reuille

Kay Schluehr a écrit :
Guido himself addressed multimethods in his Artima blog:

http://www.artima.com/weblogs/viewpost.jsp?thread=101605

See also the subsequent discussion about subtyping problems.

Kay

Well, as said in the comments, what he proposes is *not* working!
You cannot look up the type of the argument using a dictionary of
definition types; it just doesn't work because of polymorphism.

Now suppose you have two classes A and B, B subclassing A.
If you define:

@method(A)
def myfct(f):
    do_something_with_f

Then you want it to work with any object of type B... but with either
Guido's or this implementation, it won't! The problem is much more
complex and cannot be solved in constant (or even linear) time. What I
read concerning Lisp is that they use a cache to optimize method resolution.

Pierre
 

Donn Cave

Catalin Marinas said:
Of course, duck-typing is simple to use, but parametric polymorphism
is useful when the types are unrelated. Let's say you want to
implement a colour-print function which should support basic types
like ints and floats as well as lists and dictionaries. In this case,
and thanks to function decorator support, the code would be clearer.

What you're describing (and implementing) is not what I would
call parametric polymorphism, though. See

http://en.wikipedia.org/wiki/Polymorphism_(computer_science)

You're talking about "ad-hoc" polymorphism.

Personally, I can't agree that, in principle, this practice
makes code clearer. In more common, less formal implementations
you get functions like this --

def dosome(cmd):
    if type(cmd) == StringType:
        cmd = [cmd]
    ...

Of course, the first problem with this is that it unnecessarily
constrains the input type: if your API supports a string
parameter, then in Python you should expect any value that
supports string operations to work. This isn't a hypothetical
matter; you can see relatively major Python applications have
trouble with Unicode for exactly this reason.

Secondly, rather than clarifying the API, it confuses it. How
many programmers will observe this usage and erroneously assume
that dosome() _just_ takes a string, when the list parameter
may in fact be the more ideal usage? This isn't hypothetical
either.

Your example is a fine one, and some kind of table to resolve
the function according to type of input argument is a good idea.
I'm just saying that more general application of this idea is
best left to languages like C++.

Donn Cave, (e-mail address removed)
 

Bengt Richter

Hi,

Sorry if this was previously discussed but it's something I miss in
Python. I get around this using isinstance() but it would be cleaner
to have separate functions with the same name but different argument
types. I think the idea gets quite close to the Lisp/CLOS
implementation of methods.

[...]

Let me know what you think. Thanks.
I am reminded of reinventing multimethods, but it's an interesting twist, so I'll add on ;-)
The following could be made more robust, but it avoids a separately named dictionary
and lets the function name select a dedicated-to-the-function-name dictionary (a dict subclass)
directly, instead of looking it up centrally with two levels involved.

Just thought of this variant of your idea, so not tested beyond what you see ;-)
>>> def method(*types):
...     def mkdisp(f):
...         try: disp = eval(f.func_name)
...         except NameError:
...             class disp(dict):
...                 def __call__(self, *args):
...                     return self[tuple((type(arg) for arg in args))](*args)
...             disp = disp()
...         disp[types] = f
...         return disp
...     return mkdisp
...
>>> @method(int)
... def test(arg):
...     print 'int', arg
...
>>> @method(float)
... def test(arg):
...     print 'float', arg
...
>>> test(1, 2)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 7, in __call__
KeyError: (<type 'int'>, <type 'int'>)
>>> test('aaa')
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 7, in __call__
KeyError: (<type 'str'>,)
>>> test
{(<type 'int'>,): <function test at 0x02EEAE2C>, (<type 'float'>,): <function test at 0x02EEAE64>}

You could give it a nice __repr__ ...

Hm, I'll just cheat right here instead of putting it in the decorator's class where it belongs:

...     fname = self.values()[0].func_name
...     types = [tuple((t.__name__ for t in sig)) for sig in self.keys()]
[...]
>>> def test(s1, s2): return s1, s2
[...]
int 123

That __repr__ could definitely be improved some more ;-)

Regards,
Bengt Richter
 

Catalin Marinas

Pierre Barbier de Reuille said:
Now suppose you have two classes A and B, B subclassing A.
If you define :

@method(A)
def myfct(f):
    do_something_with_f

Then you want it to work with any object of type B... but with either
Guido's or this implementation, it won't! The problem is much more
complex and cannot be solved in constant (or even linear) time. What I
read concerning Lisp is that they use a cache to optimize method
resolution.

Yes, I realised that subclass/superclass relations cannot be quickly
resolved. I haven't looked at the Lisp implementation and can't
comment on it. I suspect that using something like splay trees to
re-adjust the search graph at every search would speed things up a bit.
 
