output to console and to multiple files

nathan.shair

Hello,

I searched on Google and in this Google Group, but did not find any
solution to my problem.

I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

e.g.
stdout/stderr -> screen
stdout -> log.out
stderr -> log.err

and if possible
stdout/stderr -> screen and log.txt

3 files from stdout/stderr
 
Matimus

I took a look around and I couldn't find anything either. I will be
keeping an eye on this thread to see if someone posts a more standard
solution. In the meantime, though, I will offer up a potential
solution. Duck typing is your friend. If you are only using the write
method of your files, it can be pretty simple to implement a fake file
object to do what you want.

Code:
import sys

class TeeFile(object):
    def __init__(self,*files):
        self.files = files
    def write(self,txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    outf = file("log.out","w")
    errf = file("log.err","w")
    allf = file("log.txt","w")
    sys.stdout = TeeFile(sys.__stdout__,outf,allf)
    sys.stderr = TeeFile(sys.__stderr__,errf,allf)

    print "hello world this is stdout"
    print >> sys.stderr , "hello world this is stderr"
 
goodwolf

like this?

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

import sys

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.stdout, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.stderr, file('log.err', 'w'), logfile)
 
Gabriel Genellina

On Wed, 14 Feb 2007 19:28:34 -0300, (e-mail address removed) wrote:
I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

Look at the tee command. If you control the subprocess, and it's written
in Python, using the Python recipes would be easier and perhaps you have
more control.
But if you can't modify the subprocess, you'll have to use tee.
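
In modern Python, that tee pipeline can be sketched with the subprocess module (a sketch assuming a POSIX system with tee on the PATH; the child command here is just a placeholder that prints one line):

```python
import subprocess
import sys

# Run a child command and pipe its stdout through the Unix tee utility,
# which copies the stream both to the screen and to log.out.
producer = subprocess.Popen(
    [sys.executable, "-c", "print('hello from the child')"],
    stdout=subprocess.PIPE,
)
tee = subprocess.Popen(["tee", "log.out"], stdin=producer.stdout)
producer.stdout.close()  # so tee sees EOF when the producer exits
producer.wait()
tee.wait()
```

The same trick extends to stderr by running a second tee process fed from a stderr pipe.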
 
nathan.shair

like this?

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

import sys

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.stdout, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.stderr, file('log.err', 'w'), logfile)


I've tried similar methods to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking to find something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.
 
nathan.shair

On Wed, 14 Feb 2007 19:28:34 -0300, (e-mail address removed) wrote:


Look at the tee command. If you control the subprocess, and it's written
in Python, using the Python recipes would be easier and perhaps you have
more control.
But if you can't modify the subprocess, you'll have to use tee.

Tee, the Unix command? Or is there a tee written in Python?
 
Matimus

I've tried similar methods to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking to find something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.

I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

Code:
class TeeFile(object):
    def __init__(self,*files):
        self.files = files
    def write(self,txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    import sys
    from subprocess import Popen

    command = "whatever you want to run"
    outf = file("log.out","w")
    errf = file("log.err","w")
    allf = file("log.txt","w")
    Popen(
        command,
        stdout = TeeFile(sys.__stdout__,outf,allf),
        stderr = TeeFile(sys.__stderr__,errf,allf)
    )
 
Matimus

I've tried similar methods to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking to find something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.

I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

Code:
class TeeFile(object):
    def __init__(self,*files):
        self.files = files
    def write(self,txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    import sys
    from subprocess import Popen

    command = "whatever you want to run"
    outf = file("log.out","w")
    errf = file("log.err","w")
    allf = file("log.txt","w")
    Popen(
        command,
        stdout = TeeFile(sys.__stdout__,outf,allf),
        stderr = TeeFile(sys.__stderr__,errf,allf)
    )

I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.
 
Gabriel Genellina

I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

Code:
class TeeFile(object):
def __init__(self,*files):
self.files = files
def write(self,txt):
for fp in self.files:
fp.write(txt)

I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.

I don't think any Python-only solution could work. The pipe options
available for subprocess are those of the underlying OS, and the OS knows  
nothing about Python file objects.
 
nathan.shair

On Thu, 15 Feb 2007 19:35:10 -0300, Matimus <[email protected]> wrote:


I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):
Code:
class TeeFile(object):
    def __init__(self,*files):
        self.files = files
    def write(self,txt):
        for fp in self.files:
            fp.write(txt)
I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.

I don't think any Python-only solution could work. The pipe options
available for subprocess are those of the underlying OS, and the OS knows
nothing about Python file objects.

I've tried the subprocess method before without any luck.


Thanks for all your suggestions. I guess it's time to rethink what I
want to do.
 
Bart Ogryczak

Hello,

I searched on Google and in this Google Group, but did not find any
solution to my problem.

I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

e.g.
stdout/stderr -> screen
stdout -> log.out
stderr -> log.err

and if possible
stdout/stderr -> screen and log.txt

3 files from stdout/stderr

I'd derive a class from file, override its write() method to send a
copy to the log, and then assign sys.stdout = newFile(sys.stdout).
Same for stderr.
 
Gabriel Genellina

On Feb 14, 11:28 pm, "(e-mail address removed)" <[email protected]>
wrote:

I'd derive a class from file, override its write() method to send a
copy to the log, and then assign sys.stdout = newFile(sys.stdout).
Same for stderr.

That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.
 
garrickp

That's ok inside the same process, but the OP needs to use it "from a
subprocess or spawn".
You have to use something like tee, working with real file handles.

I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).

Or am I missing something? =)

~G
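
That read-the-pipe-yourself approach can be sketched like this in modern Python (the child command and log name are placeholders):

```python
import subprocess
import sys

def tee_run(cmd, sinks):
    # Start cmd, then copy every line of its stdout to each sink
    # (which can include the real sys.stdout) as the lines arrive.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            universal_newlines=True)
    for line in proc.stdout:
        for sink in sinks:
            sink.write(line)
    proc.stdout.close()
    return proc.wait()

# Placeholder child that prints one line; tee it to the screen and a log.
log = open("log.out", "w")
rc = tee_run([sys.executable, "-c", "print('hello')"], [sys.stdout, log])
log.close()
```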
 
nathan.shair

I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).

Or am I missing something? =)

~G

That works, but it isn't live streaming of stdout/stderr. If you read
both pipes that way, you risk deadlocking the process, or having the
stdout/stderr lines printed in the wrong order.
 
Fuzzyman

That works, but it isn't live streaming of stdout/stderr. If you read
both pipes that way, you risk deadlocking the process, or having the
stdout/stderr lines printed in the wrong order.

Every time I've looked to do something like this (a non-blocking read
on the stdout of a subprocess) I've come back to the conclusion that
threads and queues are the only reasonable way (particularly on
Windows). There may be a better solution using select.

Fuzzyman
http://www.voidspace.org.uk/python/articles.shtml
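
A sketch of that thread-per-pipe approach in modern Python (the child command and the log file names are placeholders chosen to mirror the original post):

```python
import subprocess
import sys
import threading

def pump(pipe, sinks):
    # Drain one pipe line by line, copying each line to every sink.
    # Running one pump per pipe avoids the deadlock that occurs when
    # the child blocks on a full pipe that nobody is reading.
    for line in iter(pipe.readline, ''):
        for sink in sinks:
            sink.write(line)
    pipe.close()

# Placeholder child that writes one line to stdout and one to stderr.
child = [sys.executable, "-c",
         "import sys; sys.stdout.write('out line\\n'); "
         "sys.stderr.write('err line\\n')"]
outf = open("log.out", "w")
errf = open("log.err", "w")
allf = open("log.txt", "w")
proc = subprocess.Popen(child, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, universal_newlines=True)
threads = [
    threading.Thread(target=pump,
                     args=(proc.stdout, [sys.stdout, outf, allf])),
    threading.Thread(target=pump,
                     args=(proc.stderr, [sys.stderr, errf, allf])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
proc.wait()
for f in (outf, errf, allf):
    f.close()
```

Note that the relative order of stdout and stderr lines in log.txt is not guaranteed, since the two pumps run concurrently.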
 
