Best way to spawn process on back end computer

Discussion in 'Python' started by sophie_newbie, Oct 16, 2008.

  1. Hi,

    I'm running a python cgi script on a frontend web server and I want it
    to spawn another script (that takes a long time to run) on a backend
    number crunching server that's connected to the same network. What do
    you think is the best way to do this? I have a few ideas but I'm sure
    there is a "best" way to go about this.

    Thanks.
     
    sophie_newbie, Oct 16, 2008
    #1

  2. 2008/10/16 sophie_newbie <>:

    > I'm running a python cgi script on a frontend web server and I want it
    > to spawn another script (that takes a long time to run) on a backend
    > number crunching server that's connected to the same network. What do
    > you think is the best way to do this? I have a few ideas but I'm sure
    > there is a "best" way to go about this.


    http://edit.kamaelia.org/ might be worth a look.

    --
    Cheers,
    Simon B.
     
    Simon Brunning, Oct 16, 2008
    #2

  3. sophie_newbie

    Robin Becker Guest

    sophie_newbie wrote:
    > Hi,
    >
    > I'm running a python cgi script on a frontend web server and I want it
    > to spawn another script (that takes a long time to run) on a backend
    > number crunching server that's connected to the same network. What do
    > you think is the best way to do this? I have a few ideas but I'm sure
    > there is a "best" way to go about this.



    The main problem here is that you'll probably need to detach the job to allow
    the current CGI request to return a response to the client.

    The implication is that the job either has to be anonymous and require no
    further attention, or you need to provide some means of making the job
    responsive to requests about its status, so that the web page can poll it
    periodically. That implies that the job can be identified and that the
    creation response returns the identity. One of the major problems is that the
    normal www user has few privileges and cannot normally write to disk.

    I have done this both with external shell scripts that do the main processing
    and the detaching, and with Python scripts that know how to detach
    themselves. Neither was terribly easy or obvious.
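
    A minimal sketch of the detach-and-poll pattern described above, in modern
    Python (the thread predates Python 3, so this is an updated illustration;
    the job directory layout and "done" marker are invented for the example):

    ```python
    import os
    import subprocess
    import uuid

    def spawn_detached(cmd, job_dir="/tmp/jobs"):
        """Spawn `cmd` detached from the CGI process and return a job id.

        The child runs in its own session, so it survives the CGI request
        returning. It runs inside a per-job directory, where a later request
        can poll for its output. Paths here are illustrative only.
        """
        job_id = uuid.uuid4().hex
        out_dir = os.path.join(job_dir, job_id)
        os.makedirs(out_dir)
        with open(os.path.join(out_dir, "stdout.log"), "wb") as out:
            subprocess.Popen(
                cmd,
                stdout=out,
                stderr=subprocess.STDOUT,
                stdin=subprocess.DEVNULL,
                start_new_session=True,  # detach from the CGI process group
                cwd=out_dir,
            )
        return job_id

    def job_status(job_id, job_dir="/tmp/jobs"):
        """Report whether the job has written its 'done' marker yet."""
        done = os.path.join(job_dir, job_id, "done")
        return "finished" if os.path.exists(done) else "running"
    ```

    The convention here is that the job itself creates a file named `done` in
    its working directory as its last act; a status-polling CGI request then
    only needs the job id.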

    Another alternative, as Simon's Kamaelia link might indicate, is to run a
    job server on the remote host to service the CGI script's requests. I have
    also done this as part of a web application. One advantage is that the job
    server can run as any user and thus gets access to whatever that owner has;
    additionally, by providing a suitable protocol, e.g. XML-RPC, you can test
    the job server without going through the web.
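
    Such a job server can be quite small with the standard library's XML-RPC
    support (`xmlrpc.server` in Python 3; the `SimpleXMLRPCServer` module in the
    Python of this thread's era). A sketch, with invented method names and a
    stand-in computation:

    ```python
    import threading
    import uuid
    from xmlrpc.server import SimpleXMLRPCServer

    jobs = {}  # job id -> result (None while still running)

    def submit(n):
        """Start a hypothetical long-running computation; return a job id."""
        job_id = uuid.uuid4().hex
        jobs[job_id] = None

        def work():
            # Stand-in for the real number crunching.
            jobs[job_id] = sum(i * i for i in range(n))

        threading.Thread(target=work, daemon=True).start()
        return job_id

    def status(job_id):
        return "running" if jobs[job_id] is None else "finished"

    def result(job_id):
        return jobs[job_id]

    def serve(port=8000):
        """Expose the job functions over XML-RPC (blocks forever)."""
        server = SimpleXMLRPCServer(("0.0.0.0", port), allow_none=True)
        for f in (submit, status, result):
            server.register_function(f)
        server.serve_forever()
    ```

    The CGI script on the frontend would then talk to it with
    `xmlrpc.client.ServerProxy("http://backend:8000/")`, and, as noted above,
    the same interface can be exercised from a test script without any web
    server involved.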
    --
    Robin Becker
     
    Robin Becker, Oct 16, 2008
    #3
  4. sophie_newbie

    Paul Boddie Guest

    On 16 Oct, 15:51, Robin Becker <> wrote:
    > sophie_newbie wrote:
    > > I'm running a python cgi script on a frontend web server and I want it
    > > to spawn another script (that takes a long time to run) on a backend
    > > number crunching server that's connected to the same network. What do
    > > you think is the best way to do this? I have a few ideas but I'm sure
    > > there is a "best" way to go about this.

    >
    > The main problem here is that you'll probably need to detach the job to allow
    > the current cgi request to return a response to the client.


    I've added support for background processes to the pprocess library;
    this attempts to address the problems around detaching from worker
    processes and re-attaching to them later in order to collect the
    results:

    http://www.boddie.org.uk/python/pprocess/tutorial.html#BackgroundCallable

    Arguably, this is more complicated than the most basic approach, which
    would involve having separate, spawned processes just write to files
    whose contents would then be passed back to the user or processed in
    the CGI script. But it's the notification that's the most difficult
    part, not the data transfer: the main problem is getting a completion
    event efficiently, rather than polling frequently.

    > The implication of that is that the job either has to be anonymous and requires
    > no further attention or you need to provide some means of making the job
    > responsive to requests about its status so that a periodic request can be made
    > by the web page. That implies that the job can be identified and the creation
    > response returns the identity. One of the major problems is that the normal www
    > user has few privileges and cannot normally write to disk.


    I've used UNIX sockets as the means of communication between creating/
    collecting processes (the CGI script in this case) and the created/
    worker processes. Someone suggested an alternative method of binding
    to kernel-managed namespaces, if I recall the nature of the suggestion
    correctly, but I haven't looked into this yet.
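
    The UNIX-socket notification idea can be illustrated in miniature (this is
    a generic sketch of the technique, not pprocess's actual internals; the
    socket path and message format are invented): the collecting process
    blocks in accept() until a worker connects, instead of polling.

    ```python
    import os
    import socket

    SOCK_PATH = "/tmp/job_notify.sock"  # illustrative path

    def wait_for_notification(timeout=None):
        """Block until a worker connects and announces completion.

        This is the event-driven alternative to polling: the collecting
        process sleeps in accept() until the worker has something to say.
        """
        if os.path.exists(SOCK_PATH):
            os.unlink(SOCK_PATH)
        listener = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        listener.bind(SOCK_PATH)
        listener.listen(1)
        listener.settimeout(timeout)
        conn, _ = listener.accept()
        message = conn.recv(1024).decode()
        conn.close()
        listener.close()
        os.unlink(SOCK_PATH)
        return message

    def notify(message):
        """Called from the worker process when its job is finished."""
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        s.connect(SOCK_PATH)
        s.sendall(message.encode())
        s.close()
    ```

    A real collector would multiplex such sockets with select() or poll()
    alongside its other work rather than blocking on a single accept().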

    More details here:

    http://www.boddie.org.uk/python/pprocess.html

    And for the impatient, a repository is here:

    https://hg.boddie.org.uk/pprocess

    Paul
     
    Paul Boddie, Oct 16, 2008
    #4
