Why don't generators execute until first yield?

  • Thread starter Martin Sand Christensen

Martin Sand Christensen

Hi!

First a bit of context.

Yesterday I spent a lot of time debugging the following method in a
rather slim database abstraction layer we've developed:

,----
| def selectColumn(self, table, column, where={}, order_by=[], group_by=[]):
|     """Performs a SQL select query returning a single column
|
|     The column is returned as a list. An exception is thrown if the
|     result is not a single column."""
|     query = build_select(table, [column], where, order_by, group_by)
|     result = DBResult(self.rawQuery(query))
|     if result.colcount != 1:
|         raise QueryError("Query must return exactly one column", query)
|     for row in result.fetchAllRowsAsList():
|         yield row[0]
`----

I'd just rewritten the method as a generator rather than returning a
list of results. The following test then failed:

,----
| def testSelectColumnMultipleColumns(self):
|     res = self.fdb.selectColumn('db3ut1', ['c1', 'c2'],
|                                 {'c1': (1, 2)}, order_by='c1')
|     self.assertRaises(db3.QueryError, self.fdb.selectColumn,
|                       'db3ut1', ['c1', 'c2'], {'c1': (1, 2)}, order_by='c1')
`----

I expected this to raise a QueryError due to the result.colcount != 1
constraint being violated (as was the case before), but that isn't the
case. The constraint is not violated until I get the first result from
the generator.

Now to the main point. When a generator function is run, it immediately
returns a generator, and it does not run any code inside the generator.
Not until generator.next() is called is any code inside the generator
executed, giving it traditional lazy evaluation semantics. Why don't
generators follow the usual eager evaluation semantics of Python and
immediately execute up until right before the first yield instead?
Giving generators special case semantics for no good reason is a really
bad idea, so I'm very curious if there is a good reason for it being
this way. With the current semantics it means that errors can pop up at
unexpected times rather than the code failing fast.
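The behaviour described here is easy to reproduce in isolation. The following is a minimal sketch (not the db3 code, and written in modern Python 3 syntax, whereas the thread uses Python 2's `generator.next()`):

```python
# Sketch: a generator function whose body raises before its first yield.
def checked():
    raise ValueError("query must return exactly one column")
    yield 1  # never reached; its presence makes this a generator function

g = checked()            # returns a generator object; no exception yet
try:
    next(g)              # the body finally runs here, and raises
    raised_on_next = False
except ValueError:
    raised_on_next = True
```

Calling `checked()` succeeds; the validation error only surfaces on the first `next()`, exactly the deferred failure seen in the test above.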

Martin
 

Ian Kelly

> Now to the main point. When a generator function is run, it immediately
> returns a generator, and it does not run any code inside the generator.
> Not until generator.next() is called is any code inside the generator
> executed, giving it traditional lazy evaluation semantics. Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.

Isn't lazy evaluation sort of the whole point of replacing a list with
an iterator? Besides which, running up to the first yield when
instantiated would make the generator's first iteration inconsistent
with the remaining iterations. Consider this somewhat contrived
example:

def printing_iter(stuff):
    for item in stuff:
        print item
        yield item

Clearly, the idea here is to create a generator that wraps another
iterator and prints each item as it yields it. But using your
suggestion, this would instead print the first item at the time the
generator is created, rather than when the first item is actually
iterated over.
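The timing becomes directly observable if the print statement is swapped for a list append (a sketch in modern Python 3 syntax; `printed` is an illustrative name, not from the thread):

```python
printed = []  # stands in for `print item`, so the timing is testable

def printing_iter(stuff):
    for item in stuff:
        printed.append(item)  # "printing" happens only as items are yielded
        yield item

it = printing_iter([1, 2, 3])
before_first_next = list(printed)  # creation has run none of the body
first = next(it)                   # only now is the first item "printed"
```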

If you really want a generator that behaves the way you describe, I
suggest doing something like this:

def myGenerator(args):
    immediate_setup_code()

    def generator():
        for item in actual_generator_loop():
            yield item
    return generator()
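This wrapper pattern generalises naturally into a decorator. What follows is only a sketch of the idea (it is not Duncan's actual @greedy decorator, whose definition did not survive the quoting in this thread; `eager_first` is an illustrative name, and it is written in modern Python 3 syntax, with `itertools.chain` replaying the eagerly fetched value):

```python
import functools
import itertools

def eager_first(genfunc):
    """Run the generator body up to (and including) the first yield at
    call time, then replay that first value once iteration begins."""
    @functools.wraps(genfunc)
    def wrapper(*args, **kwargs):
        gen = genfunc(*args, **kwargs)
        try:
            first = next(gen)          # pre-yield validation runs here
        except StopIteration:
            return iter(())            # an empty generator stays empty
        return itertools.chain([first], gen)
    return wrapper

@eager_first
def bad_query():
    raise ValueError("query must return exactly one column")
    yield None  # unreachable; makes this a generator function

@eager_first
def counted():
    yield 1
    yield 2

values = list(counted())               # order is preserved
try:
    bad_query()                        # now fails fast, at call time
    raised_at_call = False
except ValueError:
    raised_at_call = True
```

One caveat: unlike "pause right before the first yield", this necessarily evaluates the first yielded value eagerly too, which matters for generators with per-item side effects.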
 

Martin Sand Christensen

Ian> Isn't lazy evaluation sort of the whole point of replacing a list
Ian> with an iterator? Besides which, running up to the first yield when
Ian> instantiated would make the generator's first iteration
Ian> inconsistent with the remaining iterations.

That wasn't my idea, although that may not have come across quite
clearly enough. I wanted the generator to immediately run until right
before the first yield so that the first call to next() would start with
the first yield.

My objection is that generators _by default_ have different semantics
than the rest of the language. Lazy evaluation as a concept is great for
all the benefits it can provide, but, as I've illustrated, strictly lazy
evaluation semantics can be somewhat surprising at times and lead to
problems that are hard to debug if you don't constantly bear the
difference in mind. In this respect, it seems to me that my suggestion
would be an improvement. I'm not any kind of expert on languages,
though, and I may very well be missing a part of the bigger picture that
makes it obvious why things should be as they are.

As for code to slightly change the semantics of generators, that doesn't
really address the issue as I see it: if you're going to apply such code
to your generators, you're probably doing it exactly because you're
aware of the difference in semantics, and you're not going to be
surprised by it. You may still want to change the semantics, but for
reasons that are irrelevant to my point.

Martin
 

Martin Sand Christensen

[...]
Duncan> Now try:
Duncan>
Duncan> for command in getCommandsFromUser():
Duncan>     print "the result of that command was", execute(command)
Duncan>
Duncan> where getCommandsFromUser is a greedy generator that reads from stdin,
Duncan> and see why generators don't work that way.

I don't see a problem unless the generator isn't defined where it's
going to be used. In other similar input bound use cases, such as the
generator iterating over a query result set in my original post, I see
even less of a problem. Maybe I'm simply daft and you need to spell it
out for me. :)

Martin
 

Marco Mariani

Duncan said:
It does this:

@greedy
def getCommandsFromUser():
    while True:
        yield raw_input('Command?')

for cmd in getCommandsFromUser():
    print "that was command", cmd


Command?hello
Command?goodbye
that was command hello
Command?wtf
that was command goodbye
Command?


Not here..


In [7]: def getCommandsFromUser():
   ...:     while True:
   ...:         yield raw_input('Command?')
   ...:
   ...:

In [10]: for cmd in getCommandsFromUser(): print "that was command", cmd
   ....:
Command?hi
that was command hi
Command?there
that was command there
Command?wuwuwuw
that was command wuwuwuw
Command?
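The interleaving in these sessions can be checked deterministically by substituting a scripted, logging stand-in for raw_input (a sketch in modern Python 3 syntax; `fake_raw_input`, `inputs`, and `log` are illustrative names):

```python
inputs = iter(['hello', 'goodbye'])  # scripted stand-in for the user
log = []

def fake_raw_input(prompt):
    # stands in for raw_input('Command?'): record the prompt, return a line
    log.append(prompt)
    return next(inputs)

def getCommandsFromUser():
    while True:
        try:
            yield fake_raw_input('Command?')
        except StopIteration:  # scripted input exhausted
            return

for cmd in getCommandsFromUser():
    log.append('that was command ' + cmd)

# Lazily, each prompt is immediately followed by its own result, matching
# Marco's session (and, like both sessions, ending on a dangling prompt).
```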
 

Marco Mariani

Marco said:
Not here..

Oh, sorry, I obviously didn't see the @greedy decorator amongst all the
quoting levels.

Anyway, the idea doesn't make much sense to me :)
 

Marco Mariani

Duncan said:
Perhaps if you'd copied all of my code (including the decorator that was
the whole point of it)...

Sure, I missed the point. Python's > symbols become quoting levels and
mess up messages.

Anyway, I would be loath to start execution of a generator before
starting to iterate through it, especially when generators are passed
around. The current behavior makes perfect sense.
 

castironpi

> Sure, I missed the point. Python's > symbols become quoting levels and
> mess up messages.
>
> Anyway, I would be loath to start execution of a generator before
> starting to iterate through it, especially when generators are passed
> around. The current behavior makes perfect sense.
Question:
... print 0
... while 1:
...     yield 1
...
1

This might fit the bill:
... h.next( )
... return h
...
1

However as dropfirst is dropping a value, both caller -and- callee have
to designate a/the exception. Hold generators are better "first-dropped",
and you hold 'next' inherently causes side effects. @greedy (from
earlier) frees the caller of a responsibility/obligation.

What can follow without a lead?

The definitions may lean harder on the 'generation' as prior to the
'next': generators inherently don't cause side effects.

Or hold, first-dropped is no exception:
... print 0
... yield special
... while 1:
...     yield 1
...
1
 

Diez B. Roggisch

> Now to the main point. When a generator function is run, it immediately
> returns a generator, and it does not run any code inside the generator.
> Not until generator.next() is called is any code inside the generator
> executed, giving it traditional lazy evaluation semantics. Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.

The semantics of a generator are very clear: on .next(), run until the next
yield is reached and then return the yielded value. Plus, of course, the
dealing with StopIteration.

Your scenario would introduce a special case for the first run, making it
necessary to keep additional state around (possibly introducing GC issues
on the way), just for the sake of it. And it would violate the laziness a
generator is all about. Think of a situation like this:

def g():
    while True:
        yield time.time()

Obviously you want to yield the time at the moment of .next() being called,
not something stored from ages ago. If any setup of the generator must be
done immediately, it's easy enough:

def g():
    first_result = time.time()
    def _g():
        yield first_result
        while True:
            yield time.time()
    return _g()
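This point can be made deterministic by swapping time.time() for a counter (a sketch; `itertools.count` stands in for the clock, and `ticks` is an illustrative name):

```python
import itertools

ticks = itertools.count()   # deterministic stand-in for time.time()

def g():
    while True:
        yield next(ticks)   # the "clock" is read when next() resumes us

gen = g()                   # creating the generator reads nothing
elsewhere = next(ticks)     # the clock is read once elsewhere first: 0
first = next(gen)           # the generator sees 1 -- fresh, not stored
```

Had the generator run eagerly at creation, it would have captured 0 and handed back a stale value on the first next().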

Diez
 

Michael Torrie

Martin said:
> Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?

A great example of why this behavior would defeat some of the purpose of
generators can be found in this amazing PDF presentation:

http://www.dabeaz.com/generators/Generators.pdf
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.

Most assuredly they do have good reason. Consider the cases in the PDF
I just mentioned. Building generators that work on the output of other
generators allows assembling entire pipelines of behavior. A very
powerful feature that would be impossible if the generators had the
semantics you describe.

If you want generators to behave as you suggest they should, then a
conventional for x in blah approach is likely the better way to go.

I use a generator anytime I want to be able to iterate across something
that has a potentially expensive cost, in terms of memory or cpu, to do
all at once.
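The pipeline idea mentioned above can be sketched in a few lines, in the spirit of the Beazley slides (the stage names are illustrative, not from the PDF):

```python
# Three lazy stages: each consumes the previous one item at a time.
def numbers(n):
    for i in range(n):
        yield i

def squares(seq):
    for x in seq:
        yield x * x

def evens(seq):
    for x in seq:
        if x % 2 == 0:
            yield x

pipeline = evens(squares(numbers(10)))  # wired together; nothing has run
result = list(pipeline)                 # all stages now run, interleaved
```

Eagerly executing each stage up to its first yield at construction time would force every upstream stage to do work before the pipeline is even fully assembled.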
 

castironpi

> A great example of why this behavior would defeat some of the purpose of
> generators can be found in this amazing PDF presentation:
>
> http://www.dabeaz.com/generators/Generators.pdf
>
> Most assuredly they do have good reason.  Consider the cases in the PDF
> I just mentioned.  Building generators that work on the output of other
> generators allows assembling entire pipelines of behavior.  A very
> powerful feature that would be impossible if the generators had the
> semantics you describe.
>
> If you want generators to behave as you suggest they should, then a
> conventional for x in blah approach is likely the better way to go.
>
> I use a generator anytime I want to be able to iterate across something
> that has a potentially expensive cost, in terms of memory or cpu, to do
> all at once.

The amount of concentration you can write in a program in a sitting
(fixed amount of time) is kind of limited. Sounds like @greedy was
the way to go. The recall implementation may have a short in the
future, but isn't functools kind of full? Has wraptools been
written? Is it any different?

Naming for @greedy also comes into question. My humble opinion muscles
glom on to @early vs. @late; @yieldprior; @dropfirst; @cooperative.
Thesaurus.com adds @ahead vs. @behind.
 
