Possible to make subprocess.Popen jobs run serially rather than in parallel?


Chris Seberino

Possible to make subprocess.Popen jobs run serially rather than in
parallel?

In other words, if a computer is low on memory and doesn't mind
waiting.....can Popen be configured to submit to a queue and run jobs
*ONE AT TIME*??

That might be useful and avoid crashes and disk swapping.

cs
 

Stephen Hansen

Possible to make subprocess.Popen jobs run serially rather than in
parallel?

In other words, if a computer is low on memory and doesn't mind
waiting.....can Popen be configured to submit to a queue and run jobs
*ONE AT TIME*??

That might be useful and avoid crashes and disk swapping.

Just call "process.wait()" after you call process = subprocess.Popen(...)

--

Stephen Hansen
... Also: Ixokai
... Mail: me+list/python (AT) ixokai (DOT) io
... Blog: http://meh.ixokai.io/


 

Chris Seberino

Just call "process.wait()" after you call process = subprocess.Popen(....)

I may not have been clear.....
I *don't* want the web app to block on Popen.wait.
I *do* want the Popen process to run in the background while the web app
still runs doing other things.

Rather, I don't want *MANY* Popen processes running in the
background....just one, preferably.

cs
 

Thomas Jollans

I may not have been clear.....
I *don't* want the web app to block on Popen.wait.
I *do* want the Popen process to run in the background while the web app
still runs doing other things.

Rather, I don't want *MANY* Popen processes running in the
background....just one, preferably.

So create a single extra worker process. The main app spawns or connects to
the worker process, and the worker process takes care of getting the work
done, either by forking and waiting (sounds silly to me...) or by just doing
the work it's asked to do.

How about threads? How about using os.fork instead of subprocess? How
about using the new multiprocessing module?
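
If you went the multiprocessing route, a rough sketch might look like this
(untested; the command at the end is just a placeholder):

import multiprocessing
import subprocess

def worker(job_queue):
    # single worker process: runs queued commands one at a time
    while True:
        cmd = job_queue.get()
        if cmd is None:          # sentinel to shut the worker down
            break
        subprocess.call(cmd)     # starts the job and waits for it to finish

job_queue = multiprocessing.Queue()
worker_proc = multiprocessing.Process(target=worker, args=(job_queue,))
worker_proc.daemon = True
worker_proc.start()

# later, from the web app:
job_queue.put(["ls", "-la"])     # placeholder command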
 

Jean-Michel Pichavant

Chris said:
I may not have been clear.....
I *don't* want the web app to block on Popen.wait.
I *do* want the Popen process to run in the background while the web app
still runs doing other things.

Rather, I don't want *MANY* Popen processes running in the
background....just one, preferably.

cs
Then put the process.wait() in a single thread; that thread will chain the
Popen calls one after another in the background.

If you're executing Python code in your Popen call, you'd better go for
the multiprocessing module (Python 2.6+).
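
For the Python-code case, a single-worker multiprocessing.Pool gives you the
same one-job-at-a-time behaviour. A rough, untested sketch (do_work is just a
made-up stand-in):

import multiprocessing

def do_work(job):
    # stand-in for whatever Python code you would otherwise shell out for
    return job * 2

pool = multiprocessing.Pool(processes=1)    # one worker => jobs run serially

# apply_async returns immediately; the job waits its turn in the pool's queue
result = pool.apply_async(do_work, (21,))

# only when (and if) you actually need the answer:
# print result.get()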

JM
 

Stephen Hansen

I may not have been clear.....
I *don't* want the web app to block on Popen.wait.
I *do* want the Popen process to run in the background while the web app
still runs doing other things.

Rather, I don't want *MANY* Popen processes running in the
background....just one, preferably.

The simpliest method that comes to mind then is to have a "Process
Runner" thread that you start when the web app begins. Then create a
Queue.Queue() instance, share it between said thread and your web app.

When you want to run an application, do Queue.put( (argspec,) )

Have Process Runner do a blocking wait with Queue.get().

When you wake it up with Queue.put, have it pass the args off into
subprocess.Popen. Then have it do process.wait() to block on said
process's completion.

Once it's done, our little infinite loop jumps to the top, and it calls
queue.get() again -- if another process request has been put in, it
immediately gets it and goes and runs it, thus your processes are
executing one at a time. If nothing is ready for it, it blocks until you
wake it up.

Something like (written off the top of my head, may have errors):

import threading
import Queue
import subprocess

class ProcessRunner(threading.Thread):
    def __init__(self, queue):
        threading.Thread.__init__(self)   # initialize the Thread base class
        self._queue = queue
        self.setDaemon(True)

    def run(self):
        while True:
            # block until a request is put on the queue
            args, kwargs = self._queue.get()
            process = subprocess.Popen(*args, **kwargs)
            process.wait()   # run this job to completion before the next


# ... And somewhere in our real web-app initialization, we do...

runner_queue = Queue.Queue()
runner_thread = ProcessRunner(runner_queue)
runner_thread.start()

# ... And later, when we want to start a process ...

runner_queue.put(((["ls", "-la"],), {"shell": False}))  # (*) see bottom

--

Stephen Hansen
... Also: Ixokai
... Mail: me+list/python (AT) ixokai (DOT) io
... Blog: http://meh.ixokai.io/

P.S. Passing 'args' and 'kwargs' through the queue is, in my experience,
usually overkill (in addition to being slightly ugly); generally the
subprocesses I want to run are similar in nature or environment, so I just
make the runner thread smart. But the above is the most naive
implementation.
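
For instance, a variant where the queue carries only a command list and the
Popen options are fixed in one place (again untested):

import threading
import Queue
import subprocess

class SimpleRunner(threading.Thread):
    def __init__(self, queue):
        threading.Thread.__init__(self)
        self._queue = queue
        self.setDaemon(True)

    def run(self):
        while True:
            cmd = self._queue.get()                       # e.g. ["ls", "-la"]
            process = subprocess.Popen(cmd, shell=False)  # options fixed here
            process.wait()                                # one job at a time

runner_queue = Queue.Queue()
SimpleRunner(runner_queue).start()
runner_queue.put(["ls", "-la"])    # placeholder command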


 

Chris Seberino

The simpliest method that comes to mind then is to have a "Process
Runner" thread that you start when the web app begins. Then create a
Queue.Queue() instance, share it between said thread and your web app.

When you want to run an application, do Queue.put( (argspec,) )

Have Process Runner do a blocking wait with Queue.get().

When you wake it up with Queue.put, have it pass the args off into
subprocess.Popen. Then have it do process.wait() to block on said
process's completion.

Once it's done, our little infinite loop jumps to the top, and it calls
queue.get() again -- if another process request has been put in, it
immediately gets it and goes and runs it, thus your processes are
executing one at a time. If nothing is ready for it, it blocks until you
wake it up.

Something like (written off of top of head, may have errors):

import threading
import Queue
import subprocess

class ProcessRunner(threading.Thread):
    def __init__(self, queue):
        self._queue = queue
        self.setDaemon(True)

    def run(self):
        while True:
            args, kwargs = self._queue.get()
            process = subprocess.Popen(*args, **kwargs)
            process.wait()

# ... And somewhere in our real web-app initialization, we do...

    runner_queue = Queue.Queue()
    runner_thread = ProcessRunner(runner_queue)
    runner_thread.start()

# ... And later, when we want to start a process ...

    runner_queue.put( (("ls -la",), {"shell": False}) ) # (*) see bottom

--

   Stephen Hansen
   ... Also: Ixokai
   ... Mail: me+list/python (AT) ixokai (DOT) io
   ... Blog:http://meh.ixokai.io/

P.S. Passing in 'args' and 'kwargs' into the queue is usually in my
experience overkill (in addition to being slightly ugly); generally the
subprocesses I want to run are similar in nature or environment, so I
just have the runner-thread smart. But, the above is the most naive
implementation.

 signature.asc
< 1KViewDownload

Thanks all. I must say I implemented the threading + Queue module
suggestion and it is incredibly simple and elegant. I'm still
recovering from the glorious light rays emanating from the Python
code.

cs
 
