pickle: maximum recursion depth exceeded

Simon Burton

Hi,

I am pickling big graphs of data and running into this problem:

[...]
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 414, in save_list
save(element)
File "/usr/lib/python2.2/pickle.py", line 219, in save
self.save_reduce(callable, arg_tup, state)
File "/usr/lib/python2.2/pickle.py", line 249, in save_reduce
save(state)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 447, in save_dict
save(value)
File "/usr/lib/python2.2/pickle.py", line 219, in save
self.save_reduce(callable, arg_tup, state)
File "/usr/lib/python2.2/pickle.py", line 245, in save_reduce
save(arg_tup)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 374, in save_tuple
save(element)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 405, in save_list
write(self.put(memo_len))
RuntimeError: maximum recursion depth exceeded

However, it works when I try smaller examples and use
sys.setrecursionlimit(4000).

This seems like a limitation in the pickling code. Yes?
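
A sketch of that workaround, raising the limit only around the dump and restoring it afterwards (the dump_deep name and the 20000 figure are made up for illustration; a deep enough graph will still blow past any fixed limit):

import pickle
import sys

def dump_deep(obj, path, limit=20000):
    # Temporarily raise the recursion limit so pickle's recursive
    # save() calls can reach the bottom of a deep object graph.
    old_limit = sys.getrecursionlimit()
    sys.setrecursionlimit(max(old_limit, limit))
    f = open(path, "wb")
    try:
        pickle.dump(obj, f)
    finally:
        f.close()
        sys.setrecursionlimit(old_limit)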

The data is perhaps better off in some kind of DB designed for
massively interconnected objects. Any suggestions? ZODB?
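
For what it's worth, something like ZODB only sidesteps the depth problem if each graph node is its own persistent object: every Persistent instance is stored as a separate record, and links between records are saved as shallow object references, so no single pickle has to recurse through the whole graph. A rough sketch of that shape, using a recent spelling of the ZODB API (the Node class and the paths.fs file name are invented for illustration; the 2002-era imports were spelled differently):

import ZODB, ZODB.FileStorage
import transaction
from persistent import Persistent
from persistent.list import PersistentList

class Node(Persistent):
    # Each Node gets its own database record, so links to other Nodes
    # are stored as object references rather than nested pickles.
    def __init__(self):
        self.edges = PersistentList()

db = ZODB.DB(ZODB.FileStorage.FileStorage("paths.fs"))
conn = db.open()
root = conn.root()
root["nodes"] = [Node() for _ in range(512)]
for node in root["nodes"]:
    node.edges.extend(root["nodes"])    # densely interlinked, yet each record stays shallow
transaction.commit()
conn.close()
db.close()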

BTW, the data is path-searching info for a game, and takes 1-2 MB of memory.

Thank you,

Simon Burton.
 
Anthony Briggs

Simon Burton said:
Hi,

I am pickling big graphs of data and running into this problem:

[...]
  File "/usr/lib/python2.2/pickle.py", line 225, in save
    f(self, object)
  File "/usr/lib/python2.2/pickle.py", line 414, in save_list
    save(element)
....
  File "/usr/lib/python2.2/pickle.py", line 225, in save
    f(self, object)
  File "/usr/lib/python2.2/pickle.py", line 405, in save_list
    write(self.put(memo_len))
RuntimeError: maximum recursion depth exceeded

However, it works when I try smaller examples and use
sys.setrecursionlimit(4000).

This seems like a limitation in the pickling code. Yes?

I would suspect that you have a loop in your definitions, e.g. A
imports B and B imports A, particularly since you're trying small
examples and they're still exceeding the recursion depth.

Hope that helps,

Anthony
 
Ulrich Petri

Anthony Briggs said:
I would suspect that you have a loop in your definitions, eg. A
imports B, and B imports A, particularly since you're trying small
examples, and they're still exceeding the recursion depth.

Unlikely, since it works if he sets sys.setrecursionlimit(4000).
As others have put it:
"our new model of Cray is fast enough to finalize an infinite loop in a few
nanoseconds"
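
In fact pickle already copes with reference cycles through its memo table; what it cannot cope with is sheer nesting depth, because its save() recurses once per level of the structure. A quick illustration (the 10000 figure is arbitrary, just comfortably past the default limit of 1000):

import pickle

a = []
a.append(a)                       # a cyclic, self-referential list
b = pickle.loads(pickle.dumps(a))
assert b[0] is b                  # the cycle round-trips via pickle's memo

deep = []
for i in range(10000):            # 10000 levels of plain nesting, no cycle at all
    deep = [deep]
# pickle.dumps(deep) would now fail with "maximum recursion depth exceeded"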

Ciao Ulrich
 
Simon Burton

Here is a simple example that breaks pickle.
(With N=256 all is well.)
This should probably go in the pickle docs:
highly interlinked data cannot be pickled.
I know that in the past we have been told not to
do this because of GC issues, so I understand
if it's considered too pathological for now.

Simon.

#!/usr/bin/env python

#import cPickle as pickle
import pickle
import sys
#sys.setrecursionlimit(4000)

N = 512
print "building..."
# Build N lists, each of which ends up holding a reference to every
# one of the N lists (including itself): a densely interlinked graph.
nest = [ [] for i in range(N) ]
for i in range(N):
    for j in range(N):
        nest[i].append( nest[j] )

print "dumping..."
file = open("nest.pkl", "wb")
try:
    pickle.dump( nest, file )
except RuntimeError, e:
    print e
file.close()
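
One way around the limitation, without touching the recursion limit, is to flatten the graph into an id-indexed table before pickling and rebuild the links on load; what actually gets pickled is then only a couple of levels deep no matter how interlinked the lists are. A sketch of that idea (flatten and unflatten are made-up helper names, and this version only handles graphs whose nodes are plain lists):

import pickle

def flatten(root):
    # Walk the graph iteratively, numbering each list the first time it
    # is seen, then record every list's children as node numbers.
    index = {}                    # id(list) -> node number
    originals = []                # node number -> the list itself
    todo = [root]
    while todo:
        obj = todo.pop()
        if id(obj) in index:
            continue
        index[id(obj)] = len(originals)
        originals.append(obj)
        todo.extend(obj)          # children are themselves lists
    table = [[index[id(child)] for child in obj] for obj in originals]
    return index[id(root)], table

def unflatten(root_number, table):
    # Recreate one empty list per node, then wire up the references.
    lists = [[] for row in table]
    for number in range(len(table)):
        lists[number].extend([lists[c] for c in table[number]])
    return lists[root_number]

root_number, table = flatten(nest)
pickle.dump((root_number, table), open("nest2.pkl", "wb"))   # shallow data, no deep recursion
nest2 = unflatten(*pickle.load(open("nest2.pkl", "rb")))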
 
