Classes derived from dict and eval

Jeremy Sanders

Hi -

I'm trying to subclass a dict which is used as the globals environment of
an eval expression. For instance:

class Foo(dict):
    def __init__(self):
        self.update(globals())
        self['val'] = 42

    def __getitem__(self, item):
        # this doesn't get called from the eval statement
        print "*", item
        return dict.__getitem__(self, item)

a = Foo()

print a['val']
print eval('val*2+6', a)

The first print statement also prints "* val", but __getitem__ is never
called during the evaluation of the eval expression.

Is this a bug? Does anyone have an idea for a workaround? I'm using
Python 2.3.3.

Thanks

Jeremy
 
Robert Kern

Jeremy said:
> I'm trying to subclass a dict which is used as the globals environment of
> an eval expression. [...]
> Is this a bug? Does anyone have an idea for a workaround? I'm using
> Python 2.3.3.

In [1]: eval?
Type: builtin_function_or_method
Base Class: <type 'builtin_function_or_method'>
String Form: <built-in function eval>
Namespace: Python builtin
Docstring:
eval(source[, globals[, locals]]) -> value

Evaluate the source in the context of globals and locals.
The source may be a string representing a Python expression
or a code object as returned by compile().
The globals must be a dictionary and locals can be any mapping,
defaulting to the current globals and locals.
If only globals is given, locals defaults to it.

globals needs to be a real dictionary. The implementation uses the C
dict API directly, so it never calls the overridden __getitem__. The
locals argument, on the other hand, can apparently be some other kind
of mapping.
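
One way to exploit that is to keep globals a plain dict and pass the
object with the custom __getitem__ as the locals argument instead, so
that name lookups go through the mapping protocol. A rough sketch, not
from the original messages; the Env class is made up, and it assumes an
eval whose locals lookups honour __getitem__ (Python 2.4 appears to,
per the follow-ups below, while 2.3 apparently does not):

class Env(object):
    # hypothetical mapping passed as the locals argument of eval
    def __init__(self, values):
        self.values = values

    def __getitem__(self, item):
        # called for each name the expression looks up
        print "*", item
        return self.values[item]

env = Env({'val': 42})
# globals is a plain dict; the custom mapping goes in as locals
print eval('val*2+6', {}, env)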

--
Robert Kern
(e-mail address removed)

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter
 
Kent Johnson

Jeremy said:
> I'm trying to subclass a dict which is used as the globals environment of
> an eval expression. [...]
> Is this a bug? Does anyone have an idea for a workaround? I'm using
> Python 2.3.3.

Try Python 2.4.1:

Python 2.4.1 (#65, Mar 30 2005, 09:13:57) [MSC v.1310 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> class Foo(dict):
...     def __init__(self):
...         self.update(globals())
...         self['val'] = 42
...     def __getitem__(self, item):
...         # this doesn't get called from the eval statement
...         print "*", item
...         return dict.__getitem__(self, item)
...
>>> a = Foo()
>>>
>>> print a['val']
* val
42
>>> print eval('val*2+6', a)
* val
90

Kent
 
Jeremy Sanders

Robert Kern said:
> globals needs to be a real dictionary. The implementation uses the C
> dict API directly, so it never calls the overridden __getitem__. The
> locals argument, on the other hand, can apparently be some other kind
> of mapping.

It seems that on Python 2.3, then, neither the globals nor the locals
access in eval calls the __getitem__ method of the dict.

Jeremy
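
For what it's worth, a workaround that does not depend on eval calling
__getitem__ at all is to compile the expression first, read the names it
references from the code object's co_names, and resolve them through the
custom mapping yourself, so that eval only ever sees a plain dict. A
rough sketch, not from the messages above (the helper name is made up),
which should also work on 2.3:

def eval_with_mapping(expr, mapping):
    # hypothetical helper: resolve names through the mapping ourselves
    code = compile(expr, '<expr>', 'eval')
    namespace = {}
    for name in code.co_names:
        try:
            namespace[name] = mapping[name]   # the custom __getitem__ runs here
        except KeyError:
            pass    # leave unknown names to eval's normal builtins fallback
    return eval(code, namespace)

a = Foo()    # the dict subclass from the original post
print eval_with_mapping('val*2+6', a)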
 
