multiple processes with private working dirs


Tim Arnold

I have a bunch of processes to run and each one needs its own working
directory. I'd also like to know when all of the processes are finished.

(1) My first thought was threads, until I saw that os.chdir is process-global.
(2) Next thought was fork, but I don't know how to signal when each child is
finished.
(3) Current thought is to break the work out of a method into an external
script and call the script from separate threads. This is the only way I can
see to give each process a separate dir (the external process fixes that)
while still finding out when each process is finished (the thread fixes that).

Am I missing something? Is there a better way? I hate to rewrite this method
as a script since I've got a lot of object metadata that I'll have to
regenerate with each call of the script.

thanks for any suggestions,
--Tim Arnold
 

sturlamolden

Am I missing something? Is there a better way?

Use the pyprocessing module (to appear as the standard module
multiprocessing in Python 2.6). It has almost the same interface as
Python's threading and Queue standard modules, except that you are working
with processes, not threads. To wait for a process to finish, just join
it as you would a thread.


http://pyprocessing.berlios.de/
 
