Fork You.. Forking and threading..


rh0dium

Hi all,

I have a problem with putting a job in the background. Here is my
(ugly) script which I am having problems getting to background. There
are threads about doing

python script.py &

and others

nohup python script.py &

and yet others

( python script.py > /dev/null & ) &

Regardless of which I try, they all take roughly the same time, and none of
them are backgrounded. So I first threaded my little script (which I
think is still needed for performance) and then forked it, but it still
doesn't go to the background. Here is what I have:

--------------------------------------------------------------------------------

# File pushsync.py
#
#

import logging,traceback,os,sys,paramiko,threading
from RunSSHCmd import RunSSHCmd

# Sets up a basic logger
logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(name)s %(levelname)-8s %(message)s",
                    datefmt='%d %b %Y %H:%M:%S', stream=sys.stderr)

# This little mess identifies (hopefully) the current module name
try:
    module = os.path.basename(traceback.extract_stack(limit=2)[1][0]).split(".py")[0] + "."
except IndexError:
    module = os.path.basename(traceback.extract_stack(limit=2)[0][0]).split(".py")[0] + "."

# Class declarations
class syncer(threading.Thread):

    def __init__(self, host, keyfile, passwd):
        threading.Thread.__init__(self)
        self.log = logging.getLogger(module + self.__class__.__name__)
        self.keyfile = keyfile
        self.passwd = passwd
        self.host = host

    def run(self):
        # Import the key..
        key = paramiko.DSSKey.from_private_key_file(self.keyfile,
                                                    password=self.passwd)
        agent_keys = [key]

        print "Updating host %s" % self.host
        results = RunSSHCmd(host=self.host, cmd="p4 sync", timeout=10,
                            keys=agent_keys).run()
        if results is None:
            self.log.error("We had a problem..")
        return results


# OK let's get busy
def main(hosts, keyfile, passwd):

    # Fork You!
    #
    if os.fork() == 0:
        os.setsid
        sys.stdout = open("/dev/null", 'w')
        sys.stdin = open("/dev/null", 'r')

        log = logging.getLogger(module + sys._getframe().f_code.co_name)

        for host in hosts:
            log.info("Updating host %s" % host)
            syncer(host, keyfile, passwd).start()

# General run..
if __name__ == '__main__':

    # SSH Keyfile
    KEYFILE = os.environ['HOME'] + '/.ssh/id_dsa'
    PASSWD = 'YXV0MG1hdDM=\n'

    # Perforce writable hosts
    HOSTS = "savoy", "phxlfs03"

    main(HOSTS, KEYFILE, PASSWD)


--------------------------------------------------------------------------------

Now when I run this from the command line it appears to work. But when
I call it from my other app (Perforce) it does not go to the background.
I can tell because I expect the command to return immediately (because of
the fork), but it doesn't, and it seems to take a very long time.

Can someone who's very familiar with this help me out please.
 

Grant Edwards

Hi all,

I have a problem with putting a job in the background. Here is my
(ugly) script which I am having problems getting to background. There
are threads about doing

python script.py &

and others

nohup python script.py &

and yet others

( python script.py > /dev/null & ) &

Regardless of which I try, they all take roughly the same time, and none of
them are backgrounded.

Yes they are -- unless you're using a different definition of
the word "backgrounded" than the rest of the Unix world.

What do you mean by "backgrounded"?

What are you trying to accomplish?
 

Jon Ribbens

if os.fork() == 0:
    os.setsid
    sys.stdout = open("/dev/null", 'w')
    sys.stdin = open("/dev/null", 'r')

I don't know if it's the cause of your problem, but you're not doing
the backgrounding right, it should be:

if os.fork():
    os._exit(0)
os.setsid()
os.chdir("/")
fd = os.open("/dev/null", os.O_RDWR)
os.dup2(fd, 0)
os.dup2(fd, 1)
os.dup2(fd, 2)
if fd > 2:
    os.close(fd)
# do stuff
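Those steps can be collected into a small reusable function. Here is a minimal sketch of that idea; note it is a variant, not the exact snippet above: because the original poster's script is called from Perforce and must return control to its caller, this version has the foreground process return instead of `_exit(0)`, and it adds the conventional second fork so the daemon can never reacquire a controlling terminal. The function name `daemonize` is this sketch's own choice.

```python
import os

def daemonize():
    """Detach the rest of the program into a background daemon.

    Returns False in the original (foreground) process and True in the
    detached daemon.  A variant of the classic double-fork recipe: the
    caller keeps running instead of exiting, which suits a trigger
    script that must hand control back to its parent quickly.
    """
    pid = os.fork()
    if pid:                           # foreground process
        os.waitpid(pid, 0)            # reap the short-lived first child
        return False
    os.setsid()                       # new session, no controlling tty
    if os.fork():                     # first child exits at once...
        os._exit(0)                   # ...orphaning the daemon to init
    os.chdir("/")                     # don't pin a mount point
    fd = os.open("/dev/null", os.O_RDWR)
    os.dup2(fd, 0)                    # stdin
    os.dup2(fd, 1)                    # stdout
    os.dup2(fd, 2)                    # stderr: if this stays attached,
    if fd > 2:                        # whoever spawned us blocks reading
        os.close(fd)                  # the pipe until we exit
    return True
```

Typical use: `if daemonize(): do_the_long_sync(); os._exit(0)` -- the foreground path falls through and returns to the caller immediately.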
 

Nick Craig-Wood

rh0dium said:
I have a problem with putting a job in the background. Here is my
(ugly) script which I am having problems getting to background. There
are threads about doing

python script.py &

and others

nohup python script.py &

and yet others

( python script.py > /dev/null & ) &

Regardless of which I try, they all take roughly the same time, and none of
them are backgrounded.

I suspect you want the old fork/setsid/fork trick...

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66012

That releases the controlling terminal from your job and it exits
from the process group.

You could probably close / redirect stdin/out/err too. Search for
daemonize.py and you'll find a module which does all this.
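Closing or redirecting stdin/out/err, as suggested here, is in fact the crucial part for the original problem: the calling application usually isn't waiting on the child process itself but on the pipes it inherited. A reader gets EOF on a pipe only when every process holding the write end has closed it, so one leaked descriptor keeps the caller blocked. A small demonstration of this, assuming a POSIX `sh` on the PATH:

```python
import subprocess
import time

def timed(cmd):
    """Run cmd via sh, capturing its output, and return elapsed seconds."""
    start = time.time()
    subprocess.run(["sh", "-c", cmd], capture_output=True)
    return time.time() - start

# The shell exits immediately in both cases, but in the first one the
# backgrounded sleep inherits the shell's stderr, so the captured pipe
# stays open -- and the reader blocks -- until the sleep finishes.
print(timed("(sleep 2 >/dev/null &)"))       # ~2s: stderr still held open
print(timed("(sleep 2 >/dev/null 2>&1 &)"))  # ~0s: all descriptors released
```

This is exactly why dup2()ing /dev/null over descriptors 0, 1 and 2 in the daemonized child matters: only once every inherited descriptor is closed or redirected can the caller return immediately.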
 
