python shutting down sloooooooowly/tuning dictionaries


Till Plewe

Is there a way to speed up killing Python from within a Python program?
Sometimes shutting down takes more than 10 times as long as the actual
running of the program.

The programs are fairly simple (searching/organizing large boardgame
databases) but use a lot of memory (1-6GB). The memory is mostly used
for simple structures like trees or relations. Typically there will be
a few large dictionaries and many small dictionaries/sets/arrays/lists.
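
To make the shape concrete, here is a toy sketch (all names and the exact
layout are made up for illustration; the real structures are built from the
boardgame databases):

positions = {}      # one large dict: position key -> node id
children = {}       # many small dicts, one per node: move -> child node id

def add_position(pos_key, node_id, parent_id=None, move=None):
    positions[pos_key] = node_id
    children[node_id] = {}           # small per-node dict
    if parent_id is not None:
        children[parent_id][move] = node_id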

Since the entire database is too large, I typically run the same
program 10-100 times on smaller parts. The annoying bit is the delay
between one instance of the program finishing and the next instance
starting. I have started using a shell script that checks whether
certain files have been written and then kills the program from the OS
to speed things up. But this solution is way too ugly. There should be
a better way, but I don't seem to be able to find one.
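
Roughly, the Python side of the current workaround looks like this (the
file name and function are made up; the checking and killing happens in a
shell script outside the program):

import os

def finish(marker_path="RESULTS_WRITTEN"):
    # At this point the results are already flushed to disk.  Write a
    # marker file containing the pid so the external shell script can
    # see that the useful work is done and kill the process from the
    # OS, instead of waiting for the interpreter to tear everything
    # down.
    with open(marker_path, "w") as f:
        f.write(str(os.getpid()))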

A second question: is it possible to control the resizing of
dictionaries? It seems that resizing always doubles the size, but for
a dictionary using more than half of the remaining memory this is a
problem. I would rather have a slightly over-full dictionary in main
memory than a sparse dictionary that is partially swapped out.

I would like to be able to restrict the maximum growth of each resize
operation to around 100MB or so.
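
To show the jumps I mean, a rough measurement like this is what I have
been looking at (the "always doubles" above is only my impression from
such runs, not something I have checked in the CPython source):

import sys

d = {}
last = sys.getsizeof(d)
for i in range(5000000):
    d[i] = None
    size = sys.getsizeof(d)
    if size != last:
        # Each line is one resize: the allocated size jumps in big
        # steps instead of growing by a bounded amount.
        print("%9d keys: %12d -> %12d bytes" % (i + 1, last, size))
        last = size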

TIA.

- Till
 
