multiprocessing and accessing server's stdout


Tim Arnold

Hi,
I'm using multiprocessing's BaseManager to create a server on one
machine and a client on another. The client fires a request and the
server does some work, the result of which ends up on a shared file
system that both the client and server can see.

However, I need the client machine to see the stdout of the process
running on the server. Not sure this is doable--I've been unable to
google anything useful on this one.

thanks,
--Tim Arnold
 

Adam Tauno Williams

Hi,
I'm using multiprocessing's BaseManager to create a server on one
machine and a client on another. The client fires a request and the
server does some work, the result of which ends up on a shared file
system that both the client and server can see.
However, I need the client machine to see the stdout of the process
running on the server. Not sure this is doable--I've been unable to
google anything useful on this one.

Nope, it isn't. Don't use stdout; use an IPC mechanism to communicate
between the client and the server if you need feedback.
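One way to provide that feedback, sketched locally with a multiprocessing.Queue (over a network the same queue can be exposed through BaseManager.register). The function names here are illustrative, not from the thread:

```python
# Sketch: instead of printing, the server-side job pushes its progress
# lines onto a queue that the client side drains. Names (run_job, drain)
# are illustrative, not a standard API.
from multiprocessing import Process, Queue

def run_job(feedback):
    # Everything the job would have printed goes on the queue instead.
    for step in ('started', 'compiling', 'done'):
        feedback.put(step)
    feedback.put(None)  # sentinel: no more output

def drain(feedback):
    # Collect lines until the sentinel arrives.
    lines = []
    while True:
        line = feedback.get()
        if line is None:
            return lines
        lines.append(line)

if __name__ == '__main__':
    q = Queue()
    worker = Process(target=run_job, args=(q,))
    worker.start()
    print(drain(q))  # the job's "stdout", delivered over IPC
    worker.join()
```

With BaseManager, registering a callable that returns such a queue lets a remote client call get() on it across the network.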
 

Tim Arnold

Nope, it isn't. Don't use stdout; use an IPC mechanism to communicate
between the client and the server if you need feedback.

Thanks for that info, it saves me some time. This is a new area for me
though: do you redirect stdout on the server to a socket and have the
client listen and somehow pipe the socket's contents to the client's
stdout?

Interestingly, the RPyC package manages it--that is, the client gets
the stdout of the server process, so I'll dig into that code to get an
idea. In the meantime, are there any recipes or other docs that would
be helpful? I've been googling but without much luck.

thanks,
--Tim
 

Adam Tauno Williams

Thanks for that info, it saves me some time. This is a new area for me
though: do you redirect stdout on the server to a socket and have the
client listen and somehow pipe the socket's contents to the client's
stdout?

No, I close stdin, stderr, and stdout on the server processes and attach
them to /dev/null. Just don't use stdout.
Interestingly, the RPyC package manages it--that is, the client gets
the stdout of the server process, so I'll dig into that code to get an
idea. In the meantime, are there any recipes or other docs that would
be helpful? I've been googling but without much luck.

Closing stdout and attaching it to any other file descriptor is pretty
simple.

sys.stdout = open('/dev/null', 'w')

You should be able to point it at any file-like object. But, again,
why?
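One caveat worth noting: sys.stdout = open('/dev/null', 'w') only redirects writes made through the Python-level sys.stdout object; child processes still inherit the original file descriptor 1. Redirecting the descriptor itself takes os.dup2 -- a rough sketch (the helper names are mine, not a standard API):

```python
import os

def redirect_fd_stdout(path):
    # Point OS-level fd 1 at `path`; return a dup of the old fd 1
    # so it can be restored later.
    saved = os.dup(1)
    target = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    os.dup2(target, 1)   # fd 1 now refers to `path`
    os.close(target)
    return saved

def restore_fd_stdout(saved):
    # Put the original fd 1 back.
    os.dup2(saved, 1)
    os.close(saved)
```

After redirect_fd_stdout('/dev/null'), output written to fd 1 by C extensions or by spawned children disappears as well, which reassigning sys.stdout alone does not achieve.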

If you have the data in the process, why send it to stdout and redirect
it? Why not just send the data to the client directly?
 

Martin P. Hellwig

On 05/28/10 13:17, Adam Tauno Williams wrote:
You should be able to point it at any file-like object. But, again,
why?

If you have the data in the process, why send it to stdout and redirect
it? Why not just send the data to the client directly?

Well you might want to multiplex it to more than one client, not saying
that this is the case here, just something I imagine possible.
 

Adam Tauno Williams

Martin P. Hellwig wrote:

Well you might want to multiplex it to more than one client, not saying
that this is the case here, just something I imagine possible.

That still doesn't make sense. Why 'multiplex stdout'? Why not just
multiplex the data into proper IPC channels in the first place?
 

Martin P. Hellwig

That still doesn't make sense. Why 'multiplex stdout'? Why not just
multiplex the data into proper IPC channels in the first place?

I am going out on a limb here; I mostly agree with you, just trying to
illustrate that there could be corner cases where this is sensible.
The current situation could be that there is a client/server program
(binary only perhaps) which is not multi-user safe.

Python can be used as a wrapper around the server to make it
multi-client; by emulating the exact behavior towards the client, the
client program does not have to be changed.
 

Tim Arnold

I am going out on a limb here; I mostly agree with you, just trying to
illustrate that there could be corner cases where this is sensible.
The current situation could be that there is a client/server program
(binary only perhaps) which is not multi-user safe.

Python can be used as a wrapper around the server to make it
multi-client; by emulating the exact behavior towards the client, the
client program does not have to be changed.

Hi, this is the setup I was asking about.
I've got users using a python-written command line client. They're
requesting services from a remote server that fires a LaTeX process. I
want them to see the stdout from the LaTeX process.

I was using multiprocessing to handle the requests, but the stdout
shows up on the server's terminal window where I started the
server.serve_forever process.

I started using RPyC and now the stdout appears on the client terminal
making the request.

I was trying to minimize the number of packages I use, hoping I could
get the same capability from multiprocessing that I get with RPyC.

thanks for the comments. I'm still processing what's been written
here.
--Tim
 

Bryan

Tim said:
Hi, This is the setup I was asking about.
I've got users using a python-written command line client. They're
requesting services from a remote server that fires a LaTeX process. I
want them to see the stdout from the LaTeX process.

So what you really need is to capture the output of a command, in this
case LaTeX, so you can copy it back to the client. You can do that
with the subprocess module in the Python standard library.

If the command generated so much output so fast that you felt the need
to avoid the extra copy, I suppose you could fork(), then hook stdout
directly to a socket connected to the client with dup2(), then exec()
the command. But no need for that just to capture LaTeX's output.
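The plain-capture version of this suggestion can be sketched with subprocess pipes; 'latex' is swapped for a portable command in the usage line, and run_and_capture is an illustrative name, not from the thread:

```python
from subprocess import Popen, PIPE

def run_and_capture(argv):
    # Run the command, collecting stdout and stderr so the text can be
    # sent back to the client over whatever channel is already in use.
    proc = Popen(argv, stdout=PIPE, stderr=PIPE)
    out, err = proc.communicate()
    return proc.returncode, out + err

# e.g. on the server: code, output = run_and_capture(['latex', 'job.tex'])
```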
 

Bryan

I said:
So what you really need is to capture the output of a command, in this
case LaTeX, so you can copy it back to the client. You can do that
with the subprocess module in the Python standard library.

If the command generated so much output so fast that you felt the need
to avoid the extra copy, I suppose you could fork(), then hook stdout
directly to a socket connected to the client with dup2(), then exec()
the command. But no need for that just to capture LaTeX's output.

Upon further reading, I see that the subprocess module makes the
direct-hookup method easy, at least on 'nix systems. Just tell
subprocess.Popen to use the client-connected socket as the
subprocess's stdout.

The question here turns out to make more sense than I had thought upon
reading the first post. The server runs a command at the client's
request, and we want to deliver the output of that command back to the
client. A brilliantly efficient method is to direct the command's
stdout to the client's connection.

Below is a demo server that sends the host's words file to any client
that connects. It assumes Unix.


--Bryan Olson


#!/usr/bin/python

import socket
from thread import start_new_thread
from subprocess import Popen


def demo(sock):
    # Hand the client socket to the subprocess as its stdout.
    subp = Popen(['cat', '/usr/share/dict/words'], stdout=sock)
    subp.wait()
    sock.close()

if __name__ == '__main__':
    listener_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener_sock.bind(('', 54321))
    listener_sock.listen(5)
    while True:
        sock, remote_address = listener_sock.accept()
        start_new_thread(demo, (sock,))
 
