correct way of running a sub-process


Daniel Timothy Bentley

What is the (hopefully unique) obvious way of running a sub-process if I
want to get the exit code and output without resorting to multi-threading?

It seems like I should be able to do the following:

import popen2

foo = popen2.Popen3(cmd)
foo.wait()              # hangs once the child fills the pipe
foo.fromchild.read()

But it seems that on my system (a Sun Ultra 30 running Solaris 8) this
hangs if the output happens to be more than about 12k of text. I'm
guessing this is a buffer-size issue, but it would be a mistake to assume
a fixed output size for the child process ahead of time.

Couldn't wait() be modified to buffer the child's output, so that it
really does wait until the process exits?
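
(For context, a minimal sketch of the ordering that avoids the hang described
above, assuming only stdout matters and cmd is the same command as in the
snippet: read the child's output to end-of-file first, then wait.)

import popen2

foo = popen2.Popen3(cmd)
output = foo.fromchild.read()   # drain stdout first, so the pipe never fills
status = foo.wait()             # now the child can exit and be reaped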

While we're at it, is there any way to read or wait with a time-out? I.e.,
read more than 0 bytes unless it takes longer than some time-out value, so I
can deal with sub-processes that might never terminate?

-Dan
 

Jp Calderone

Daniel Timothy Bentley wrote:

What is the (hopefully unique) obvious way of running a sub-process if I
want to get the exit code and output without resorting to multi-threading?
[...]

(Untested)

from twisted.internet import protocol, reactor

class PrintyProtocol(protocol.ProcessProtocol):
    bytes = ''
    errBytes = ''

    def outReceived(self, bytes):
        # collect the child's stdout as it arrives
        self.bytes += bytes

    def errReceived(self, bytes):
        # collect the child's stderr as it arrives
        self.errBytes += bytes

    def processEnded(self, reason):
        print 'Process done. Got:', repr(self.bytes), repr(self.errBytes)
        reactor.stop()

reactor.spawnProcess(PrintyProtocol(), cmd, args=(cmd,))
reactor.run()
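
If the time-out from the original question is also wanted, one possible
extension (again untested; the 30-second value is arbitrary and
TimeoutProtocol is just an illustrative name, not a Twisted class) is to arm
a reactor.callLater that kills the child:

class TimeoutProtocol(PrintyProtocol):
    def connectionMade(self):
        # give up and kill the child if it runs longer than 30 seconds
        self.timer = reactor.callLater(30, self.transport.signalProcess, 'KILL')

    def processEnded(self, reason):
        if self.timer.active():
            self.timer.cancel()
        PrintyProtocol.processEnded(self, reason)

It would be passed to reactor.spawnProcess in place of PrintyProtocol().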

Jp
 

Thomas Guettler

On Thu, 12 Feb 2004 06:10:01 -0800, Daniel Timothy Bentley wrote:
What is the (hopefully unique) obvious way of running a sub-process if I
want to get the exit code and output without resorting to multi-threading?

It seems like I should be able to do the following:

foo = popen2.Popen3(cmd)
foo.wait()
foo.fromchild.read()

I do it like this:

import popen2

def shell_command(cmd):
    # There must not be any output to stdout or stderr,
    # otherwise an exception is raised.
    p = popen2.Popen4(cmd)        # Popen4 merges stdout and stderr
    output = p.fromchild.read()   # read everything before waiting
    ret = p.wait()
    if ret or output:
        raise Exception("Error in shell_command '%s': ret=%s output='%s'" % (
            cmd, ret, output))
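
Usage would look something like this (the commands are just stand-ins):

shell_command('/bin/true')        # quiet success: returns normally
shell_command('ls /no/such/dir')  # output plus a non-zero exit: raises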
 

Daniel Danger Bentley

Correct me if I'm wrong, but couldn't this fail to catch all the output? Or is
read in Python guaranteed to return all the data that can ever be returned
(unlike the C library function)?

-D
 

Donn Cave

Daniel Danger Bentley said:
Correct me if I'm wrong, but couldn't this fail to catch all the output? Or is
read in Python guaranteed to return all the data that can ever be returned
(unlike the C library function)?

You are indeed wrong, misled by the name: the C library function
in question is fread(3), which does read everything. It is not like the
read(2) system call, which is posix.read (a.k.a. os.read).
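
(A small illustration, not from the thread: fd is assumed to be the pipe's
file descriptor, e.g. foo.fromchild.fileno(). At the os.read level you have
to loop, because a single call may return only part of the data.)

import os

chunks = []
while True:
    chunk = os.read(fd, 4096)   # may return fewer than 4096 bytes
    if not chunk:               # empty string means end-of-file
        break
    chunks.append(chunk)
data = ''.join(chunks)          # what a single file.read() would give you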

Back to the original question, I think the simplest way to get
status and output is

fp = os.popen(cmd, 'r')
output = fp.read()
status = fp.close()    # close() returns None when the exit status is 0
if status is not None:
    status = os.WEXITSTATUS(status)

Now if you want a timeout, you'll have to do it yourself, and
of course the file object isn't the way to go because of its
underlying buffered read. You can turn off buffering, but then
you're looking at one system call per byte to implement readline
et al., so it's not a generally attractive solution. Better to
get the file descriptor (fileobject.fileno()), and probably use
select on it. You might also want to do that if you end up
reading from two different pipes for stderr and stdout, because
that can lead to the same buffer deadlock you're running into
now - if I'm reading from stdout while the process writes to
stderr, or vice versa, it can fill the pipe and block waiting
for me to empty it, which I won't do because I'm stuck on stdout.
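
To make the select() route concrete, here is one possible sketch (untested,
in the spirit of the thread; the helper name run_with_timeout, the 30-second
default and the 4096-byte read size are arbitrary choices, not anything from
the posts):

import os
import popen2
import select
import signal

def run_with_timeout(cmd, timeout=30.0):
    # Drain stdout and stderr with select() so neither pipe can fill up
    # and deadlock, and kill the child if it stays silent for `timeout`
    # seconds.  Returns (wait status, stdout text, stderr text).
    p = popen2.Popen3(cmd, capturestderr=True)
    p.tochild.close()
    out_fd = p.fromchild.fileno()
    err_fd = p.childerr.fileno()
    buffers = {out_fd: [], err_fd: []}
    open_fds = [out_fd, err_fd]
    while open_fds:
        ready, _, _ = select.select(open_fds, [], [], timeout)
        if not ready:                     # silent too long: give up
            os.kill(p.pid, signal.SIGKILL)
            break
        for fd in ready:
            chunk = os.read(fd, 4096)     # raw read; '' means end-of-file
            if chunk:
                buffers[fd].append(chunk)
            else:
                open_fds.remove(fd)
    status = p.wait()                     # reap the child (exited or killed)
    return status, ''.join(buffers[out_fd]), ''.join(buffers[err_fd])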

Donn Cave
 
