How to time out a forked command but still see output?

Discussion in 'Perl Misc' started by thecrow, Apr 16, 2007.

  1. thecrow

    thecrow Guest

    The goal... perl script launches the external program, shows all its
    output in realtime. If too much time expires, perl script exits,
    redirects all output of external program to some file. Can someone
    give me a nudge in the right direction?

    I tried a few things involving alarm() and eval but couldn't get them
    to work. I won't waste your time with everything that failed, but the
    following code is as close as I got. It is not acceptable because it
    doesn't show the output of the program.

    I also tried something like redirecting CMD to STDOUT but when I do
    that, the output of the program keeps scrolling to the term even after
    the timeout. I tried to solve this by closing these filehandles and
    redirecting them to /dev/null outside of the eval, but those didn't
    work either.

    Help is appreciated...

    #!/usr/bin/perl
    $command = shift @ARGV;
    $startupWait = shift @ARGV || 60;

    eval {
        local $SIG{'ALRM'} = sub {
            die "\nTimed out command $command after waiting $startupWait seconds\n";
        };
        alarm($startupWait);
        print "Running command: $command\n";
        print "with timeout of $startupWait\n";
        open(CMD, "$command|");
        (@output) = (<CMD>);
        close CMD;
        alarm 0;
        print "Command completed, output is:\n";
        print map { "$_\n" } @output;
    };
    die "$@" if ($@);
    thecrow, Apr 16, 2007
    #1

  2. thecrow

    thecrow Guest

    Never mind, I hit on something simple that worked. Don't know why I
    didn't try this first. Haven't done Perl in years.

    alarm($timeout);
    open(CMD, "$command|");
    while (<CMD>) {
        print $_;
    }
    close CMD;
    alarm(0);

    The output is displayed up until the time of completion or timeout.
    alarm() seems to interrupt this and its children just fine.
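
    For reference, a minimal sketch that combines this loop with the
    eval/ALRM handler from the first post, so the timeout is caught and
    reported inside the script instead of being left to the default
    SIGALRM (a reconstruction, not code posted in the thread; $command
    and $timeout are placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $command = shift @ARGV;
    my $timeout = shift @ARGV || 60;

    eval {
        local $SIG{ALRM} = sub { die "timeout\n" };
        alarm($timeout);
        open(my $cmd, '-|', $command) or die "Cannot fork '$command': $!";
        print while <$cmd>;    # echo each line as the child produces it
        close $cmd;
        alarm(0);
    };
    if ($@) {
        die $@ unless $@ eq "timeout\n";
        print STDERR "Timed out command '$command' after $timeout seconds\n";
    }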

    I'm sure the solution is obvious to everyone, but I'll record it here
    for posterity for whoever else has the same question.

    On Apr 16, 4:47 pm, "thecrow" <> wrote:
    > The goal... perl script launches the external program, shows all its
    > output in realtime. If too much time expires, perl script exits,
    > redirects all output of external program to some file. Can someone
    > give me a nudge in the right direction?
    thecrow, Apr 16, 2007
    #2

  3. Xho

    Guest

    "thecrow" <> wrote:
    > Never mind, I hit on something simple that worked. Don't know why I
    > didn't try this first. Haven't done Perl in years.
    >
    > alarm($timeout);
    > open(CMD, "$command|");
    > while (<CMD>) {
    >     print $_;
    > }
    > close CMD;
    > alarm(0);


    That doesn't seem to do what you said you want. It doesn't redirect
    to a file and let the child finish sending output there.

    The way I do something vaguely like that is just to do:

    $ perl something.pl > foo
    $ tail -f foo

    Then hit Ctrl-C on the tail when I get bored.

    Xho


    >
    > On Apr 16, 4:47 pm, "thecrow" <> wrote:
    > > The goal... perl script launches the external program, shows all its
    > > output in realtime. If too much time expires, perl script exits,
    > > redirects all output of external program to some file. Can someone
    > > give me a nudge in the right direction?


    , Apr 16, 2007
    #3
  4. Xho

    Guest

    wrote:
    > That doesn't seem to do what you said you want. It doesn't redirect
    > to a file and let the child finish sending output there.
    >
    > The way I do something vaguely like that is just to do:
    >
    > $ perl something.pl > foo


    Of course that should be:
    $ perl something.pl > foo &

    > $ tail -f foo
    >
    > Then hit ctrl-c on the tail when I got bored.
    >
    > Xho


    , Apr 16, 2007
    #4
  5. thecrow

    thecrow Guest

    Thanks for the reply. This worked somewhat for me because I wasn't
    looking to redirect to a file within the perl script itself... the
    wrapper can do that.

    I found that I had to add a SIGCHLD handler to break the loop. Also,
    for stuff that sends unbuffered output, I couldn't use while (<CMD>);
    instead I had to use sysread. I just love those helpful programs
    that send "status dots" with no carriage return and no way to disable
    them :(
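
    A minimal sketch of the sysread/SIGCHLD approach described above (a
    reconstruction, not the actual code from this post; $command and
    $timeout are placeholders, and STDOUT is set to autoflush so partial
    lines appear as they arrive):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $command = shift @ARGV;
    my $timeout = shift @ARGV || 60;

    $| = 1;                                  # flush STDOUT so "status dots" show up immediately

    my $child_done = 0;
    $SIG{CHLD} = sub { $child_done = 1 };    # lets the read loop stop once the child exits

    open(my $cmd, '-|', $command) or die "Cannot fork '$command': $!";

    eval {
        local $SIG{ALRM} = sub { die "timeout\n" };
        alarm($timeout);
        until ($child_done) {
            my $n = sysread($cmd, my $buf, 4096);
            last unless $n;                  # 0 = end of file, undef = read interrupted
            print $buf;                      # echo bytes as they arrive, newline or not
        }
        alarm(0);
        close $cmd;                          # reaps the child once it has finished
    };
    if ($@) {
        die $@ unless $@ eq "timeout\n";
        warn "Timed out '$command' after $timeout seconds\n";
        # the child may still be running at this point
    }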

    In my new solution it still doesn't grab the last chunk of output
    until the external prog finishes, so I'm going to look at what you've
    posted and see if there are any improvements to be had there.

    Thanks...

    On Apr 16, 7:14 pm, ""
    <> wrote:
    > On Apr 16, 3:51 pm, wrote:
    >
    > > That doesn't seem to do what you said you want. It doesn't redirect
    > > to a file and let the child finish sending output there.

    >
    > I started to play with the close-on-exec flag and
    > had the signal handler exec to a new process:
    >
    > my $fh;
    > my @output;
    > eval {
    >     local $SIG{'ALRM'} = sub {
    >         close $fh;
    >         exec("finish_at_your_leisure");
    >     };
    >
    >     alarm 15;
    >     local $^F = 3;
    >     open(CMD, "slow_process|")
    >         or die "Could not open pipe: $!";
    >     open $fh, '>', '/tmp/save'
    >         or die "Could not open file: $!";
    >     my $old_fh = select($fh);
    >     $| = 1;
    >     select($old_fh);
    >     while (<CMD>) {
    >         push @output, $_;
    >         print $fh $_;
    >     }
    >     alarm 0;
    >     print "Command completed, output is:\n";
    >     print $_ for @output;
    > };
    >
    > die "$@" if $@;
    >
    > Where finish_at_your_leisure was something like:
    >
    > open my $fh, '>>', '/tmp/save'
    >     or die $!;
    >
    > open(CMD, '<&=3')
    >     or die "fdopen equivalent failed: $!";
    >
    > while (<CMD>) {
    >     print $fh $_;
    > }
    >
    > close $fh;
    >
    > It seems to work but I don't usually
    > work with signal handlers and frankly having
    > anything complex in them gives me an uneasy
    > feeling.
    >
    > --
    > Hope this helps,
    > Steven
    thecrow, Apr 17, 2007
    #5
  6. On Apr 16, 1:47 pm, "thecrow" <> wrote:
    > The goal... perl script launches the external program, shows all its
    > output in realtime. If too much time expires, perl script exits,
    > redirects all output of external program to some file. Can someone
    > give me a nudge in the right direction?



    Another way, if redirecting all of the output -- rather than just
    the output from the point of the interrupt onward -- to a file is
    acceptable:

    open( my $fh, "$command |") or die "fork failed: $!";

    local $SIG{ ALRM } = sub {
        close $fh;
        system("$command >save.txt &");
        exit;
    };

    alarm $startupWait;
    ...
    print while <$fh>;   # e.g.

    --
    Charles DeRykus
    comp.llang.perl.moderated, Apr 18, 2007
    #6
  7. thecrow

    thecrow Guest

    On Apr 18, 8:19 am, "comp.llang.perl.moderated" <c...@blv-
    sam-01.ca.boeing.com> wrote:
    > Another way if redirecting all -- rather than just
    > the ensuing output from the point of the interrupt --
    > to a file is acceptable:
    >
    > open( my $fh, "$command |") or die "fork failed: $!";
    >
    > local $SIG{ ALRM } = sub { close $fh;
    >     system("$command >save.txt &"); exit; };
    >
    > alarm $startupWait;
    > ...
    > print while <$fh>; # e.g.


    I'm confused by this example; it seems that it would run the command
    twice, discarding all the output the first time and saving all the
    output the second time. Definitely that's not what I'm looking for.

    My main challenge was that I wanted to see that output in realtime,
    but the command writes unbuffered output to STDOUT, and is prone to
    hanging.
    thecrow, Apr 21, 2007
    #7
  8. On Apr 21, 8:58 am, thecrow <> wrote:
    > On Apr 18, 8:19 am, "comp.llang.perl.moderated" <c...@blv-sam-01.ca.boeing.com> wrote:
    > > Another way, if redirecting all of the output -- rather than just
    > > the output from the point of the interrupt onward -- to a file is
    > > acceptable: ...
    >
    > I'm confused by this example, it seems that it would run the command
    > twice, discarding all the output the first time, and saving all the
    > output the second time. Definitely that's not what I'm looking for.
    >


    Did you try it? No, it shouldn't. The program will print all output
    to the screen in real time. If, however, your timeout occurs before
    the program completes, then the program is launched in the background
    and its output redirected as you specified. I also assume that your
    code included what you demo'ed in your earlier post:

    alarm 0;
    print "Command completed, output is:\n";
    ...


    > My main challenge was that I wanted to see that output in realtime,
    > but the command writes unbuffered output to STDOUT, and is prone to
    > hanging.


    Again, if the program hangs and there's a timeout, then the handler
    closes the pipe and launches the program in the background before
    exiting itself. And the pipe open is actually a fork behind the
    scenes, so the program is running in a separate child process and
    can't pre-empt a timeout handler in the parent.
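
    For reference, an end-to-end sketch of the approach being described
    (a reconstruction, not code posted in the thread; $command,
    $startupWait and save.txt are the placeholder names used earlier):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $command     = shift @ARGV;
    my $startupWait = shift @ARGV || 60;

    # The pipe open forks a child that runs $command; the parent reads its output.
    open(my $fh, '-|', $command) or die "fork failed: $!";

    local $SIG{ALRM} = sub {
        close $fh;                          # stop reading; note close also waits for the child
        system("$command >save.txt &");     # relaunch in the background, output to a file
        exit;                               # the parent gives up and exits
    };

    alarm $startupWait;
    print while <$fh>;                      # stream the child's output in real time
    alarm 0;

    print "Command completed, output is shown above.\n";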

    HTH,
    --
    Charles DeRykus
    comp.llang.perl.moderated, Apr 22, 2007
    #8
  9. On Apr 21, 10:06 pm, "comp.llang.perl.moderated" <c...@blv-sam-01.ca.boeing.com> wrote:
    > ...
    > Again, if the program hangs and there's a timeout, then the handler
    > closes the pipe and launches the program in the background before
    > exiting itself. And the pipe open is actually a fork behind the
    > scenes, so the program is running in a separate child process and
    > can't pre-empt a timeout handler in the parent.


    If you're concerned about the timed-out program running to completion
    after the background child program starts, be aware the child process
    will terminate with a SIGPIPE as soon as it tries to write to the
    closed pipe in any event.

    However, you might be able to force an even earlier termination:

    my $child = open( my $fh, "$command |") or die "fork failed: $!";

    local $SIG{ ALRM } = sub {
        close $fh;
        kill 'TERM', $child or kill 'KILL', $child;
        system("$command >save.txt &");
        exit;
    };

    ...

    --
    Charles DeRykus
    comp.llang.perl.moderated, Apr 22, 2007
    #9
  10. thecrow

    thecrow Guest

    On Apr 22, 1:06 am, "comp.llang.perl.moderated" <c...@blv-sam-01.ca.boeing.com> wrote:
    > ...
    > Did you try it? No, it shouldn't. The program will print all output
    > to the screen in real time. If, however, your timeout occurs before
    > the program completes, then the program is launched in the background
    > and its output redirected as you specified.


    If the program's output is unbuffered and has no newlines (my "status
    dots" case), the effect of your code will be:

    1) Launch the program.
    2) The program runs, but no output is ever displayed, because
       while (<$fh>) never sees a complete line to print.
    3) The timeout arrives.
    4) A second copy of the program is launched, which redirects to a file.
    5) Presumably the first program hangs.

    So there are two major issues here: first, it doesn't work with that
    kind of output. Second, it only works if you assume a stateless
    program... that nothing happened during the first launch, and that
    running it a second time is an equivalent operation.
    thecrow, May 7, 2007
    #10