Fork You.. Forking and threading..

Discussion in 'Python' started by rh0dium, Jul 5, 2006.

  1. rh0dium

    rh0dium Guest

    Hi all,

    I have a problem with putting a job in the background. Here is my
    (ugly) script which I am having problems getting to background. There
    are threads about doing

    python script.py &

    and others

    nohup python script.py &

    and yet others

    ( python script.py > /dev/null & ) &

    Regardless, timing these shows they all take roughly the same time, and
    none of them are backgrounded.. So I first threaded my little
    script (which I think is still needed for optimization purposes) and
    then I forked it. But it still does not background it.. Here is what
    I have..

    --------------------------------------------------------------------------------

    # File pushsync.py
    #
    #

    import logging, traceback, os, sys, paramiko, threading
    from RunSSHCmd import RunSSHCmd

    # Sets up a basic logger
    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(name)s %(levelname)-8s %(message)s",
                        datefmt='%d %b %Y %H:%M:%S', stream=sys.stderr)

    # This little mess identifies (hopefully) the current module name
    try:
        module = os.path.basename(traceback.extract_stack(limit=2)[1][0]).split(".py")[0] + "."
    except:
        module = os.path.basename(traceback.extract_stack(limit=2)[0][0]).split(".py")[0] + "."

    # Class declarations
    class syncer(threading.Thread):

        def __init__(self, host, keyfile, passwd):
            threading.Thread.__init__(self)
            self.log = logging.getLogger(module + self.__class__.__name__)
            self.keyfile = keyfile
            self.passwd = passwd
            self.host = host

        def run(self):
            # Import the key..
            key = paramiko.DSSKey.from_private_key_file(self.keyfile,
                                                        password=self.passwd)
            agent_keys = [key]

            print "Updating host %s" % self.host
            results = RunSSHCmd(host=self.host, cmd="p4 sync", timeout=10,
                                keys=agent_keys).run()
            if results is None:
                self.log.error("We had a problem..")
            return results


    # OK let's get busy
    def main(hosts, keyfile, passwd):

        # Fork You!
        #
        if os.fork() == 0:
            os.setsid
            sys.stdout = open("/dev/null", 'w')
            sys.stdin = open("/dev/null", 'r')

        log = logging.getLogger(module + sys._getframe().f_code.co_name)

        for host in hosts:
            log.info("Updating host %s" % host)
            syncer(host, keyfile, passwd).start()

    # General run..
    if __name__ == '__main__':

        # SSH Keyfile
        KEYFILE = os.environ['HOME'] + '/.ssh/id_dsa'
        PASSWD = 'YXV0MG1hdDM=\n'

        # Perforce writable hosts
        HOSTS = "savoy", "phxlfs03"

        main(HOSTS, KEYFILE, PASSWD)


    --------------------------------------------------------------------------------

    Now when I run this from the command line it appears to work. But when
    I call it from my other app (Perforce) it does not background it. I
    can tell because I expect the command to return immediately (because of
    the fork), but it doesn't, and it seems to take a very long time..
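
    For reference, here is a minimal sketch of the fork behaviour being
    relied on (POSIX only): the parent gets the child's PID back and can
    return at once, while the child sees 0 and carries on with the slow
    work.

    import os, sys, time

    pid = os.fork()
    if pid:
        # Parent: this is where the caller should get control back.
        print "parent: started child %d, returning now" % pid
        sys.exit(0)
    # Child: keeps running on its own after the parent has gone.
    time.sleep(30)
    print "child: finished the slow work"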

    Can someone who's very familiar with this help me out, please?
     
    rh0dium, Jul 5, 2006
    #1

  2. On 2006-07-05, rh0dium <> wrote:
    > Hi all,
    >
    > I have a problem with putting a job in the background. Here is my
    > (ugly) script which I am having problems getting to background. There
    > are threads about doing
    >
    > python script.py &
    >
    > and others
    >
    > nohup python script.py &
    >
    > and yet others
    >
    > ( python script.py > /dev/null & ) &
    >
    > Regardless, timing these shows they all take roughly the same time, and
    > none of them are backgrounded.


    Yes they are -- unless you're using a different definition of
    the word "backgrounded" than the rest of the Unix world.

    What do you mean by "backgrounded"?

    What are you trying to accomplish?

    --
    Grant Edwards   (grante at visi.com)
    Yow! Yow!! That's a GOOD IDEA!! Eating a whole FIELD of COUGH MEDICINE
    should make you feel MUCH BETTER!!
     
    Grant Edwards, Jul 5, 2006
    #2

  3. Jon Ribbens

    Jon Ribbens Guest

    In article <>, rh0dium wrote:
    > if os.fork() == 0:
    > os.setsid
    > sys.stdout = open("/dev/null", 'w')
    > sys.stdin = open("/dev/null", 'r')


    I don't know if it's the cause of your problem, but you're not doing
    the backgrounding right; it should be:

    if os.fork():
        os._exit(0)
    os.setsid()
    os.chdir("/")
    fd = os.open("/dev/null", os.O_RDWR)
    os.dup2(fd, 0)
    os.dup2(fd, 1)
    os.dup2(fd, 2)
    if fd > 2:
        os.close(fd)
    # do stuff
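
    If it helps to see it in action, here is a self-contained toy using
    exactly that sequence; it gives the shell its prompt back immediately,
    and a few seconds later the detached child appends a line to
    /tmp/daemon_test.log (a made-up path, use anything you like):

    import os, time

    if os.fork():
        os._exit(0)                  # parent: the caller gets control back here
    os.setsid()
    os.chdir("/")
    fd = os.open("/dev/null", os.O_RDWR)
    os.dup2(fd, 0)
    os.dup2(fd, 1)
    os.dup2(fd, 2)
    if fd > 2:
        os.close(fd)

    # "do stuff": prove the child is still alive after the caller returned
    time.sleep(5)
    open("/tmp/daemon_test.log", "a").write("child %d ran detached\n" % os.getpid())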
     
    Jon Ribbens, Jul 5, 2006
    #3
  4. rh0dium <> wrote:
    > I have a problem with putting a job in the background. Here is my
    > (ugly) script which I am having problems getting to background. There
    > are threads about doing
    >
    > python script.py &
    >
    > and others
    >
    > nohup python script.py &
    >
    > and yet others
    >
    > ( python script.py > /dev/null & ) &
    >
    > Regardless, timing these shows they all take roughly the same time, and
    > none of them are backgrounded..


    I suspect you want the old fork/setsid/fork trick...

    http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66012

    That releases the controlling terminal from your job and it exits
    from the process group.

    You could probably close / redirect stdin/out/err too. Search for
    daemonize.py and you'll find a module which does all this.
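
    Roughly, that recipe boils down to something like the following (a
    minimal sketch only, assuming a POSIX system; see the recipe itself for
    the full details):

    import os, sys

    def daemonize(stdin='/dev/null', stdout='/dev/null', stderr='/dev/null'):
        # First fork: the original process exits so the caller is not
        # left waiting on it.
        if os.fork():
            os._exit(0)
        os.setsid()                  # new session, no controlling terminal
        # Second fork: ensure we can never reacquire a controlling tty.
        if os.fork():
            os._exit(0)
        os.chdir("/")
        os.umask(0)
        # Re-point the standard file descriptors.
        sys.stdout.flush()
        sys.stderr.flush()
        si = open(stdin, 'r')
        so = open(stdout, 'a+')
        se = open(stderr, 'a+', 0)
        os.dup2(si.fileno(), sys.stdin.fileno())
        os.dup2(so.fileno(), sys.stdout.fileno())
        os.dup2(se.fileno(), sys.stderr.fileno())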

    --
    Nick Craig-Wood <> -- http://www.craig-wood.com/nick
     
    Nick Craig-Wood, Jul 5, 2006
    #4
  5. rh0dium

    rh0dium Guest

    Re: Fork You.. Forking and threading..

    Hi Nick!

    This is much better than the kludge job I did - Thanks for the help!!


    Nick Craig-Wood wrote:
    > rh0dium <> wrote:
    > > I have a problem with putting a job in the background. Here is my
    > > (ugly) script which I am having problems getting to background. There
    > > are threads about doing
    > >
    > > python script.py &
    > >
    > > and others
    > >
    > > nohup python script.py &
    > >
    > > and yet others
    > >
    > > ( python script.py > /dev/null & ) &
    > >
    > > Regardless, timing these shows they all take roughly the same time, and
    > > none of them are backgrounded..

    >
    > I suspect you want the old fork/setsid/fork trick...
    >
    > http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66012
    >
    > That releases the controlling terminal from your job and it exits
    > from the process group.
    >
    > You could probably close / redirect stdin/out/err too. Search for
    > daemonize.py and you'll find a module which does all this.
    >
    > --
    > Nick Craig-Wood <> -- http://www.craig-wood.com/nick
     
    rh0dium, Jul 6, 2006
    #5
