multiple processes, private working directories

Tim Arnold

I have a bunch of processes to run and each one needs its own working
directory. I'd also like to know when all of the processes are
finished.

(1) First thought was threads, until I saw that os.chdir was
process-global.
(2) Next thought was fork, but I don't know how to signal when each
child is finished.
(3) Current thought is to break the process from a method into an
external script and call the script in separate threads. This is the
only way I can see to give each process a separate dir (the external
process fixes that) and still find out when each process is finished
(the thread fixes that).

Am I missing something? Is there a better way? I hate to rewrite this
method as a script, since I've got a lot of object metadata that I'll
have to regenerate with each call of the script.

thanks for any suggestions,
--Tim Arnold
 
r0g

Tim said:
> I have a bunch of processes to run and each one needs its own working
> directory. I'd also like to know when all of the processes are
> finished. [...]

(1) + avoid os.chdir and maintain hard paths to all files/folders? or
(2) + sockets? or
(2) + polling your system's task list?
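
For what it's worth, option (2) doesn't strictly need sockets or
polling: the parent can just reap each child with os.wait. A rough,
untested sketch (Unix only; the directory names are made up):

import os

dirs = ['job1', 'job2', 'job3']      # hypothetical working directories
pids = []
for d in dirs:
    pid = os.fork()
    if pid == 0:                     # child process
        os.chdir(d)                  # chdir is now private to this child
        # ... do the real work here ...
        os._exit(0)                  # leave without parent-side cleanup
    pids.append(pid)                 # parent remembers each child

# parent: block until every child has exited
for pid in pids:
    os.waitpid(pid, 0)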
 
Michael Palmer

Tim Arnold wrote:
> I have a bunch of processes to run and each one needs its own working
> directory. I'd also like to know when all of the processes are
> finished. [...]

1. Does the work in the different directories really have to be done
concurrently? You say you'd like to know when each thread/process is
finished, suggesting that they are not server processes but rather
accomplish some limited task.

2. If the answer to 1. is yes: All that os.chdir gives you is an
implicit global variable. Is that convenience really worth a multi-
process architecture? Would it not be easier to just work with
explicit path names instead? You could store the path of the per-
thread working directory in an instance of threading.local - for
example:
>>> import threading
>>> t = threading.local()
>>> class Worker(object):
...     def __init__(self, path):
...         t.path = path
...

The thread-specific value of t.path would then be available to all
classes and functions running within that thread.
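
A fuller (untested) sketch of the same idea, with each worker thread
stashing its own directory in the thread-local and building explicit
paths from it (the directory names are made up):

import os
import threading

t = threading.local()

def worker(path):
    t.path = path                    # visible only within this thread
    # any function called from this thread can now build paths like:
    outfile = os.path.join(t.path, 'result.txt')
    open(outfile, 'w').close()       # assumes the directory exists

dirs = ['dirA', 'dirB']              # hypothetical directories
threads = [threading.Thread(target=worker, args=(d,)) for d in dirs]
for th in threads:
    th.start()
for th in threads:
    th.join()                        # returns once every thread is done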
 
Karthik Gurusamy

Tim Arnold wrote:
> I have a bunch of processes to run and each one needs its own working
> directory. I'd also like to know when all of the processes are
> finished. [...]

Use subprocess; it supports a cwd argument to provide the given
directory as the child's working directory.

Help on class Popen in module subprocess:

class Popen(__builtin__.object)
 |  Methods defined here:
 |
 |  __del__(self)
 |
 |  __init__(self, args, bufsize=0, executable=None, stdin=None,
 |      stdout=None, stderr=None, preexec_fn=None, close_fds=False,
 |      shell=False, cwd=None, env=None, universal_newlines=False,
 |      startupinfo=None, creationflags=0)
 |      Create new Popen instance.

You want to provide the cwd argument above. Then, once you have
launched all your n processes, run through a loop waiting for each one
to finish.

import subprocess

# cmds is a list of dicts giving the details of each process to run,
# including what its cwd should be
runs = []
for c in cmds:
    run = subprocess.Popen(c['cmd'], cwd=c['cwd'])  # plus any other args
    runs.append(run)

# now wait for all the processes to finish
for run in runs:
    run.wait()

Note that if any of the processes generates a lot of stdout/stderr,
you will get a deadlock in the above loop. Then you may want to go for
threads, or use run.poll and do the reading of the output from your
child processes yourself.
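
A simple way around that deadlock is to not use pipes at all and send
each child's output to a log file instead, e.g. (sketch, reusing c
from the loop above; the log file name is made up):

import os

log = open(os.path.join(c['cwd'], 'build.log'), 'w')
run = subprocess.Popen(c['cmd'], cwd=c['cwd'],
                       stdout=log, stderr=subprocess.STDOUT)
# after run.wait(), close the log with log.close()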

Karthik
 
Carl Banks

Tim Arnold wrote:
> (2) Next thought was fork, but I don't know how to signal when each
> child is finished.

Consider the multiprocessing module, which is available in Python 2.6.
It began its life as a third-party module that acts like the threading
module but uses processes; I think you can still install it as a
third-party package under 2.5.
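
An untested sketch of that approach (the directory names are made up);
because every worker is a separate process, os.chdir is safe inside it:

import multiprocessing
import os

def build(path):
    os.chdir(path)           # private to this worker process
    # ... do the real per-directory work here ...

if __name__ == '__main__':
    dirs = ['chap1', 'chap2', 'chap3']   # hypothetical directories
    procs = [multiprocessing.Process(target=build, args=(d,))
             for d in dirs]
    for p in procs:
        p.start()
    for p in procs:
        p.join()             # returns once every child has finished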


Carl Banks
 
Tim Arnold

Tim Arnold said:
> I have a bunch of processes to run and each one needs its own working
> directory. I'd also like to know when all of the processes are
> finished.

Thanks for the ideas everyone--I now have some new tools in the toolbox.
The task is to use pdflatex to compile a bunch of (>100) chapters and
know when the book is complete (i.e. the book pdf is done and the
separate chapter pdfs are finished). I have to wait for that before I
start some postprocessing and reporting chores.

My original scheme was to use a class to manage the builds with threads,
calling pdflatex within each thread. Since pdflatex really does need to be
in the directory with the source, I had a problem.

I'm reading now about python's multiprocessing capability, but I think
I can use Karthik's suggestion to call pdflatex in subprocess with the
cwd set. That seems like the simple solution at this point, but I'm
going to give Cameron's pipes suggestion a go as well.
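
Something along these lines, I think (untested; the file and directory
names are just placeholders):

import subprocess

chapters = ['ch01', 'ch02', 'ch03']  # hypothetical chapter directories
procs = []
for d in chapters:
    # each pdflatex runs with its own chapter directory as cwd
    p = subprocess.Popen(['pdflatex', 'chapter.tex'], cwd=d)
    procs.append(p)

for p in procs:
    p.wait()    # every chapter pdf is finished once this loop ends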

In any case, it's clear I need to rethink the problem. Thanks to everyone
for helping me get past my brain-lock.

--Tim Arnold
 
Michael Palmer

Tim Arnold wrote:
> The task is to use pdflatex to compile a bunch of (>100) chapters and
> know when the book is complete (i.e. the book pdf is done and the
> separate chapter pdfs are finished). [...]

I still don't see why this should be done concurrently. Do you have
>100 processors available? I also happen to be writing a book in LaTeX
these days. I have one master document and pull in all chapters using
\include, and pdflatex is only ever run on the master document. For a
quick preview of the chapter I'm currently working on, I just use
\includeonly - it compiles in no time at all.
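
In outline, the master document looks something like this (the chapter
names are placeholders):

% master.tex
\documentclass{book}
\includeonly{ch03}   % restrict compilation to ch03 for a quick preview
\begin{document}
\include{ch01}
\include{ch02}
\include{ch03}
\end{document}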

How do you manage to get consistent page numbers and cross-referencing
if you process all chapters separately, and even in _parallel_? That
just doesn't look right to me.
 
