maximum recursion depth?


globalrev

I received a "maximum recursion depth exceeded" error when processing large
amounts of data.

I don't know exactly how many recursive calls I made, but I'd assume
50,000 or so.

Is there a definite limit to the number of calls, or is it the memory that
runs out? Is that the RAM? Is there a special amount of memory assigned
to Python, or does it just take and take until Windows runs out of it?
 

alex23

globalrev said:
Is there a definite limit to the number of calls, or is it the memory that
runs out? Is that the RAM? Is there a special amount of memory assigned
to Python, or does it just take and take until Windows runs out of it?

You can alter the recursion limit using sys.setrecursionlimit:

setrecursionlimit(n)

Set the maximum depth of the Python interpreter stack to n. This
limit prevents infinite recursion from causing an overflow of the C
stack and crashing Python. The highest possible limit is
platform-dependent.
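
A minimal sketch of how that plays out in practice (the recursive function
and the depths here are just illustrative, not from the original post):

import sys

def count_down(n):
    # Deliberately recursive, only to demonstrate hitting the limit.
    if n == 0:
        return 0
    return count_down(n - 1)

print(sys.getrecursionlimit())    # 1000 by default on CPython

try:
    count_down(2000)              # deeper than the default limit
except RecursionError:            # RuntimeError on Python versions before 3.5
    print("hit the recursion limit")

sys.setrecursionlimit(3000)       # raise the limit...
count_down(2000)                  # ...and the same call now succeeds

Raising the limit only moves the ceiling: every frame still consumes stack
space, so a value that is far too high can crash the interpreter instead of
raising a catchable exception.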
 

John Nagle

alex23 said:
You can alter the recursion limit using sys.setrecursionlimit:

setrecursionlimit(n)

Set the maximum depth of the Python interpreter stack to n. This
limit prevents infinite recursion from causing an overflow of the C
stack and crashing Python.

The default is rather low. I've actually hit it parsing big HTML
files with BeautifulSoup.
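
A hedged sketch of the workaround that applies there, assuming bs4 is
installed and "deep.html" is a hypothetical, deeply nested input file:

import sys
from bs4 import BeautifulSoup

# Deeply nested markup can exhaust the default recursion limit while
# BeautifulSoup builds or serialises its parse tree.
sys.setrecursionlimit(20000)

with open("deep.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

print(len(soup.find_all(True)))   # number of tags in the document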

John Nagle
 

bearophileHUGS

Dennis Lee Bieber, the ghost:
I'd have to wonder why so many recursive calls?

Why not? Maybe the algorithm is written in a recursive style. A
language is good if it allows you to use that style too.
On modern CPUs, 50,000 levels doesn't seem like that many.

Bye,
bearophile
 

Marc 'BlackJack' Rintsch

Dennis Lee Bieber, the ghost:

Why not?

Because of the recursion limit of course. And function call overhead in
Python is quite high compared to an iterative approach.
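
A small sketch of that overhead, using an arbitrary example (the functions
and numbers are illustrative, not Marc's):

import timeit

def sum_recursive(n):
    # Sums 0..n-1 with one extra stack frame per element.
    if n == 0:
        return 0
    return (n - 1) + sum_recursive(n - 1)

def sum_iterative(n):
    # The same computation as a loop: constant stack usage.
    total = 0
    for i in range(n):
        total += i
    return total

# Both stay well under the default recursion limit; the loop simply avoids
# the per-call overhead.
print(timeit.timeit(lambda: sum_recursive(500), number=1000))
print(timeit.timeit(lambda: sum_iterative(500), number=1000))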

Ciao,
Marc 'BlackJack' Rintsch
 

Sebastian 'lunar' Wiesner


Marc 'BlackJack' Rintsch said:
Dennis Lee Bieber, the ghost:

Why not?

Because of the recursion limit of course. And function call overhead in
Python is quite high compared to an iterative approach.
And limiting the recursion depth is quite reasonable: the Python interpreter
doesn't perform tail call optimisation, so each level of recursion eats
a bit more memory. Without a recursion limit, a Python process might hit
the memory restrictions of the OS kernel, which would cause the kernel
to just silently kill the interpreter process. Now imagine this happening
inside a mission-critical server process ;)
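
A short sketch of the missing optimisation (the factorial functions are just
illustrative):

def factorial_tail(n, acc=1):
    # Tail-recursive in form, but CPython still pushes a new frame per call,
    # so there is no tail call optimisation to save us here.
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)

def factorial_loop(n):
    # The equivalent loop runs at constant stack depth.
    acc = 1
    while n > 1:
        acc *= n
        n -= 1
    return acc

try:
    factorial_tail(10000)
except RecursionError:            # RuntimeError on Python versions before 3.5
    print("no tail call optimisation: the stack still grew")

factorial_loop(10000)             # works fine, no recursion limit involved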


--
Freedom is always the freedom of dissenters.
(Rosa Luxemburg)
 

Jochen Schulz

* Marc 'BlackJack' Rintsch:
Because of the recursion limit of course. And function call overhead in
Python is quite high compared to an iterative approach.

One of my pet projects[1, it's about building and searching trees] made
heavy use of recursion in the beginning. I rewrote parts of it using
iteration because I hit the recursion limit and suspected a performance
hit as well. To my (mild) surprise, the rewrite didn't perform
significantly better. My benchmarks only showed an improvement of a few
percent in runtime. I didn't measure memory usage, though.
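
A generic sketch of that kind of rewrite (the Node class and traversals are
illustrative, not taken from mspace.py): the recursive walk is bounded by the
recursion limit, the explicit-stack version only by available memory.

class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def walk_recursive(node):
    # Depth-first traversal using the call stack.
    yield node.value
    for child in node.children:
        yield from walk_recursive(child)

def walk_iterative(root):
    # The same traversal with an explicit stack instead of recursion.
    stack = [root]
    while stack:
        node = stack.pop()
        yield node.value
        stack.extend(reversed(node.children))   # preserve left-to-right order

tree = Node(1, [Node(2, [Node(4)]), Node(3)])
print(list(walk_recursive(tree)))   # [1, 2, 4, 3]
print(list(walk_iterative(tree)))   # [1, 2, 4, 3]

As Jochen observes, the iterative version mainly buys unbounded depth; the
per-node Python work dominates the runtime either way, so the speed gain can
be modest.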

J.

[1] http://well-adjusted.de/mspace.py/
(Sorry, bearophile, I'll get back to you about this! My SVN working
copy is currently a mess and I need to clean that up first.)
 
