article on Python 2.5 features

Beliavsky

A Parade of New Features Debuts in Python 2.5
by Gigi Sayfan
"Python 2.5 still has the smell of fresh paint but it's the perfect
time to drill down on the most important new features in this
comprehensive release. Read on for detailed explanations and examples
of exception handling, resource management, conditional expressions,
and more."

The full article is at
http://www.devx.com/webdev/Article/33005/0/page/1 .
 
Michele Simionato

Beliavsky said:
A Parade of New Features Debuts in Python 2.5
by Gigi Sayfan
http://www.devx.com/webdev/Article/33005/0/page/1

The article is nice overall, but I did not like the part about enhanced
generators. For instance, the sentence "The new interactive send()
empowers generators to implement co-routines" could be debated. It also
gives the false impression that you could not interact with Python 2.2
generators ("interact" in the sense of the article). IMO the important
bit about Python 2.5 generators is that you can send exceptions to a
generator (via throw()); all the other features can be emulated even
using previous versions of Python.
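Here is a minimal sketch of that feature (a hypothetical generator, not
code from the article): throw() raises the given exception inside the
generator at the paused yield and returns the next value it yields,
which is something a Python 2.2 generator simply cannot do.

# minimal sketch of the send-an-exception feature (Python 2.5 only)
def counter():
    n = 0
    while True:
        try:
            yield n
            n += 1
        except ValueError:
            n = 0           # reset the counter when told to

c = counter()
print c.next()              # 0
print c.next()              # 1
print c.throw(ValueError)   # the generator catches it and starts over at 0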
For instance, here is an emulation of his tokenizer using Python 2.2
generators and a helper 'read_eval_yield_loop' class:

# morally turns a Python 2.2+ generator into a Python 2.5 generator:
# the wrapped generator reads the values passed to send() from the
# 'read' iterator it receives as its first argument
class read_eval_yield_loop(object):
    def __init__(self, gen, *args, **kw):
        self._arg = None
        self._it = gen(iter(lambda: self._arg, StopIteration), *args, **kw)
    def __iter__(self):
        return self._it
    def send(self, arg):
        # make the value visible to the generator, then advance it
        self._arg = arg
        return self._it.next()
    def next(self):
        return self.send(None)

# copied from the article with very minor changes
def tokenizer3(read, text, sep):
    try:
        while True:
            token = ''
            while text[0] == sep:
                text = text[1:]

            index = 0
            while text[index] != sep:
                token += text[index]
                index += 1
            yield token
            new_sep = read.next()
            if new_sep != None:
                sep = new_sep
            text = text[index:]
    except IndexError:
        if token != '':
            yield token

if __name__ == '__main__':
    # sample input (assumed here; the original post relied on a 'text'
    # variable defined elsewhere, presumably taken from the article)
    text = 'a b comma c,d,e'
    print '--- Smart Python 2.2+ tokenizer ---'
    g = read_eval_yield_loop(tokenizer3, text, ' ')
    for t in g:
        print t
        if t == 'comma':
            g.send(',')
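
For comparison, here is a minimal, self-contained sketch of what the
helper emulates: in Python 2.5 the value passed to send() becomes the
result of the yield expression inside the generator, so no wrapper class
is needed (the generator below is purely illustrative, not the article's
code):

def echo_sep(sep):
    while True:
        new_sep = (yield sep)    # None after next(), or the value given to send()
        if new_sep is not None:
            sep = new_sep

g = echo_sep(' ')
print repr(g.next())     # ' '  -- start the generator
print repr(g.send(','))  # ','  -- send() resumes it with a new value
print repr(g.next())     # ','  -- the change persists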

Michele Simionato
 
