Something is rotten in Denmark...


Ian Kelly

So there should be a way to replace the closure of a function with a
snapshot of it at a certain time. If there were an internal function with
access to the read-only attribute func_closure, capable of changing or
creating a cell object, it could be used as a decorator for a function to
be "closure-snapshotted".

So in

funcs = []
for i in range(100):
    @closure_snapshot
    def f(): return i
    funcs.append(f)

each f's closure cells would be changed to point not to the given
variables, but to cells referenced nowhere else, initialized with the
references held by the original cells at the given time.

For CPython 3.2:

import functools
import types

def makecell(value):
    def f():
        return value
    return f.__closure__[0]

def closure_snapshot(f):
    if f.__closure__:
        snapshot = tuple(makecell(cell.cell_contents)
                         for cell in f.__closure__)
    else:
        snapshot = f.__closure__
    g = types.FunctionType(f.__code__, f.__globals__.copy(), f.__name__,
                           f.__defaults__, snapshot)
    functools.update_wrapper(g, f, functools.WRAPPER_ASSIGNMENTS +
                             ('__kwdefaults__',))
    return g
>>> funcs = []
>>> for i in range(10):
...     @closure_snapshot
...     def f(): return i
...     funcs.append(f)
...
>>> [f() for f in funcs]
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> funcs = [closure_snapshot(lambda: i) for i in range(10)]
>>> [f() for f in funcs]
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]


It doesn't really seem any more straightforward to me than the "i=i"
trick. Also, I don't know how portable this is to different Python
implementations or future versions. Finally, note that in order to
make this work correctly in all cases (such as the first example
above, where i is a global, not a cell) we have to snapshot the
globals as well, which could cause further confusion.
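For readers unfamiliar with the "i=i" trick Ian compares this to, here is a
minimal sketch: a default argument is evaluated once, at function definition
time, so it captures the current value of i rather than the shared variable.

```python
funcs = []
for i in range(10):
    def f(i=i):  # default evaluated now, snapshotting the current i
        return i
    funcs.append(f)

print([f() for f in funcs])  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```
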

Cheers,
Ian
 

harrismh777

Alain said:
The reason why we have the kind of lambdas we have in python (and
scheme, and javascript, etc.) is just that it is way easier to
implement. That's all I've said. And people have gotten used to it,
without ever realizing they are using something completely different
from what Church called the "lambda abstraction".

This is why I'm willing to accept Terry's 'hypnotized'
characterization. The term 'lambda' carries some baggage with it that
Python has chosen to ignore. Using the term 'lambda' as short-hand for
'an easier way to code in-line functions' causes some of the hypnotizing
effect, and much of the misunderstanding.

Frankly, having thought this over for several days, I am now
convinced that the issue at hand is two-fold: 1) the closure should
provide option(s) for snapshotting, and 2) the lambda should be implemented
in a 'purely' functional way or eliminated... if eliminated, another
term could be invented to represent in-line function short-hand.

This is clearing up for me... but probably just beginning to simmer
for others.


kind regards,
m harris
 

Gregory Ewing

Alain said:
You must be kidding. Like many others, you seem to think that Scheme is
a typical functional language, which it is not.

I never said that Scheme is a functional language -- I'd be
the first to acknowledge that it's not. I do know what real
functional languages are like.

However, Scheme is more relevant to this discussion than
Haskell, precisely because it's *not* purely functional --
it does allow existing bindings to be changed. Yet its
lambdas are late-binding, and nobody seems to get tripped
up by that the way they do in Python.

Why not? It's because Scheme encourages a style of programming
which favours creation of new bindings rather than changing
existing ones, so most of the time the bindings captured by
a lambda don't change later.
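The same style works in Python: route the value through a function
parameter, so each closure gets a fresh binding. A minimal sketch
(`make_adder` is an illustrative name, not from the thread):

```python
def make_adder(i):
    # Each call creates a new scope, hence a new cell for i.
    return lambda n: n + i

fs = [make_adder(i) for i in range(10)]
print([f(1) for f in fs])  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```
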
 

rusi

rusi said:
So I tried:
Recast the comprehension as a map
Rewrite the map into a fmap (functionalmap) to create new bindings
def fmap(f,lst):
    if not lst: return []
    return [f(lst[0])] + fmap(f, lst[1:])
Still the same effects.
Obviously I am changing it at the wrong place...

   >>> fs = [(lambda n : n + i) for i in range(10)]
   >>> [f(1) for f in fs]
   [10, 10, 10, 10, 10, 10, 10, 10, 10, 10]

   >>> fs = list(map(lambda i : lambda n : n + i, range(10)))
   >>> list(map(lambda f : f(1), fs))
   [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

Thanks Jussi for that code -- but I am not fully able to wrap my head
around it.

Is the problem in the lambda? ZF?
Are you trying to say that map works (functionally) and ZF is
imperative?
 

Jussi Piitulainen

rusi said:
rusi said:
So I tried:
Recast the comprehension as a map
Rewrite the map into a fmap (functionalmap) to create new bindings
def fmap(f,lst):
    if not lst: return []
    return [f(lst[0])] + fmap(f, lst[1:])
Still the same effects.
Obviously I am changing it at the wrong place...

   >>> fs = [(lambda n : n + i) for i in range(10)]
   >>> [f(1) for f in fs]
   [10, 10, 10, 10, 10, 10, 10, 10, 10, 10]

   >>> fs = list(map(lambda i : lambda n : n + i, range(10)))
   >>> list(map(lambda f : f(1), fs))
   [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

Thanks Jussi for that code -- but I am not fully able to wrap my head
around it.

Oops, sorry, I seem to have edited out a question that I meant to ask.
The question was this: How do -you- write the list comprehension in
terms of your fmap? What -is- the expression that still has the same
effects? That is, where do you bind the i's?

The obvious-to-me way is shown above, but there it is the outer lambda
that establishes the distinct i's for the different closures.

(The composition list(map(...)) works in both versions of Python.)
Is the problem in the lambda? ZF?
Are you trying to say that map works (functionally) and ZF is
imperative?

Sorry, what is ZF?

I'm saying that your fmap works, but in itself it does not provide the
bindings that we are talking about, and you didn't show what does. The
outer lambda in my example does that.

The Python list comprehension [... for i in ...] binds (or assigns to)
just one i which is shared by all the closures above. They end up
having the same value for i because it's the same i.

I hope this is less obscure now.
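In CPython the shared binding Jussi describes can be observed directly
through the `__closure__` attribute (a sketch; relies on CPython exposing
cell objects):

```python
fs = [lambda: i for i in range(3)]

# All three lambdas close over the *same* cell object ...
cells = [f.__closure__[0] for f in fs]
print(cells[0] is cells[1] is cells[2])  # True

# ... so they all see the final value of i.
print([f() for f in fs])  # [2, 2, 2]
```
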
 

rusi

Jussi said:
[... earlier fmap example snipped ...]

Oops, sorry, I seem to have edited out a question that I meant to ask.
The question was this: How do -you- write the list comprehension in
terms of your fmap? What -is- the expression that still has the same
effects? That is, where do you bind the i's?

The obvious-to-me way is shown above, but there it is the outer lambda
that establishes the distinct i's for the different closures.

(The composition list(map(...)) works in both versions of Python.)
Is the problem in the lambda? ZF?
Are you trying to say that map works (functionally) and ZF is
imperative?

Sorry, what is ZF?

I'm saying that your fmap works, but in itself it does not provide the
bindings that we are talking about, and you didn't show what does. The
outer lambda in my example does that.

The Python list comprehension [... for i in ...] binds (or assigns to)
just one i which is shared by all the closures above. They end up
having the same value for i because it's the same i.

I hope this is less obscure now.

I was wondering why the list(...) wrapper was needed.
Now I see that map returns normal lists in python2 and some generator-
like-thing in 3
I would have said: Shall we just stick to 2 (for this discussion) but
then 2 seems to have a double error that self-corrects this example...
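The point about map can be checked quickly (a sketch):

```python
m = map(lambda i: (lambda n: n + i), range(3))
print(type(m).__name__)  # 'map' -- a lazy iterator in Python 3, a plain list in Python 2

fs = list(m)               # force the iterator to get the list of closures
print([f(0) for f in fs])  # [0, 1, 2] -- each inner lambda binds its own i
```
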

OOOOFFF -- too many variables...

[ZF is Zermelo-Fraenkel -- old name for list comprehension -- I guess
my age leaks like python's ZF (sorry comprehension) :) ]

Anyway... My (summary,tentative) conclusion is that python's
comprehensions leak.
The one leak fixed in python3 has not fixed the other (revealed in
this thread)

All this has little to do with lambda (whose scope rules were fixed
around python 2.2 IIRC)

I'd be interested in ur take on this...
 

Jussi Piitulainen

rusi said:
[... earlier exchange snipped ...]

I was wondering why the list(...) wrapper was needed.
Now I see that map returns normal lists in python2 and some generator-
like-thing in 3
I would have said: Shall we just stick to 2 (for this discussion) but
then 2 seems to have a double error that self-corrects this example...

OOOOFFF -- too many variables...

[ZF is Zermelo-Fraenkel -- old name for list comprehension -- I guess
my age leaks like python's ZF (sorry comprehension) :) ]

Oh, ok. Zermelo-Fraenkel came to mind, but I didn't know it's been
used as a name for list comprehensions.
Anyway... My (summary,tentative) conclusion is that python's
comprehensions leak.
The one leak fixed in python3 has not fixed the other (revealed in
this thread)

All this has little to do with lambda (whose scope rules were fixed
around python 2.2 IIRC)

I'd be interested in ur take on this...

I think we agree. I've been saying all along that the perceived
problem is not with the lambda at all but with the way the list
comprehension deals with its variable. (I see that some people
understand the situation correctly but blame the lambda anyway.)

Personally, I like your summary, but I'm enough of an outsider that I
do not want to suggest that there is anything wrong with Python. I
just wish to understand how it works, as with any language that I use.

The issue of this thread seems to turn up repeatedly, but is there a
problem with the language mechanisms, is there a problem with the ways
we talk about the language, or is this just the case where people need
to be educated? I'm not sure.

(Incidentally, I'm so new to Python that I started right with 3.0. I
use some 2.4, 2.5-ish a little because software on a certain server is
not kept updated, but I never knew 2.2.)
 
