perl wrapper to limit stderr to first 1000 lines?

Discussion in 'Perl Misc' started by Mike Hunter, Jun 18, 2004.

  1. Mike Hunter

    Mike Hunter Guest

    Hi,

    I have some cron jobs that can sometimes send out too much noise to stderr,
    which in turn causes sendmail to do bad things :( I'm trying to limit the
    amount of stderr I see from those scripts without changing the scripts
    themselves. I am looking to write a perl wrapper that does something like this:

    my $program = shift @ARGV;
    my $args = join " ", @ARGV;

    open PGMSTDOUT, "$program $args|" or die "blah!";

    ......somehow get the program's stdout into PGMSTDOUT

    while (<PGMSTDOUT>)
    {
    print $_;
    }

    my $n = 0;
    my $error_line = <PGMSTDERR>;
    while (<PGMSTDERR> && ($n < 1000))
    {
    $error_line = <PGMSTDERR>;
    print STDERR $error_line;
    $n++;
    }

    The only similar advice I've seen on the web was here:

    http://perlmonks.thepen.com/730.html

    But I don't want to follow that approach because I don't want to create a file
    on disk with all the STDERR stuff, I want to discard it.

    Any help? How do I *pipe* stderr to something without duping it to stdout?

    Thanks,

    Mike
    Mike Hunter, Jun 18, 2004
    #1

  2. Ben Morrow

    Ben Morrow Guest

    Quoth Mike Hunter:
    >
    > I have some cron jobs that can sometimes send out too much noise to stderr,
    > which in turn causes sendmail to do bad things :( I'm trying to limit the
    > amount of stderr I see from those scripts without changing the scripts
    > themselves. I am looking to write a perl wrapper that does something like this:
    >
    > my $program = shift @ARGV;
    > my $args = join " ", @ARGV;


    What's the point of shifting @ARGV if you're just going to join
    "$program " onto the beginning anyway?

    > open PGMSTDOUT, "$program $args|" or die "blah!";


    Don't do this: use three-arg open.
    Use lexical file-handles.

    open my $PGMSTDOUT, '-|', @ARGV or die "can't run $ARGV[0]: $!";

    > while (<PGMSTDOUT>)
    > {
    > print $_;
    > }
    >
    > my $n = 0;


    Perl provides the special variable $. for this job. See perldoc perlvar.

    > my $error_line = <PGMSTDERR>;
    > while (<PGMSTDERR> && ($n < 1000))


    This is wrong: Perl does special magic with while (<>). What you mean is

    while (<PGMSTDERR>) {
    $. > 1000 and last;

    or

    while (defined($_ = <PGMSTDERR>) and $. <= 1000) {

    which is what perl expands while (<>) into.

    > {
    > $error_line = <PGMSTDERR>;


    Presumably you are reading again because you lost the results when you
    lost the magic while (<>); this will discard every other line, though.
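    Putting those two fixes together, the loop might look like this (a
    self-contained sketch: an in-memory handle stands in for PGMSTDERR,
    and the limit is shrunk to 3 so the behaviour is easy to see):

```perl
use strict;
use warnings;

my $limit = 3;    # stand-in for 1000

# An in-memory handle standing in for the program's stderr.
open my $pgm_stderr, '<', \"e1\ne2\ne3\ne4\ne5\n" or die $!;

my @shown;
while (my $error_line = <$pgm_stderr>) {
    last if $. > $limit;       # $. is the current input line number
    push @shown, $error_line;  # a real wrapper would print STDERR here
}
print @shown;                  # e1, e2, e3 -- the rest is discarded
```

    Each iteration reads exactly once, in the while condition, so no
    lines are lost, and $. replaces the hand-rolled $n counter.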

    > print STDERR $error_line;
    > $n++;
    > }
    >
    > The only similar advice I've seen on the web was here:
    >
    > http://perlmonks.thepen.com/730.html
    >
    > But I don't want to follow that approach because I don't want to create a file
    > on disk with all the STDERR stuff, I want to discard it.
    >
    > Any help? How do I *pipe* stderr to something without duping it to stdout?


    If you simply want to discard all of stderr, use 2>/dev/null in the
    command line. If you want to grab stdout and stderr separately, you
    will need to use IPC::Open3; you will also need to use IO::Select to
    process the bits of each as they arrive, or you'll get deadlocks (you'll
    be waiting for the end of stdout, the program will be blocking trying to
    write something to stderr).
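    A sketch of that IPC::Open3 + IO::Select approach (untested in
    anger; the run_limited name and the 8192-byte reads are my own
    choices, and a line split across two reads gets counted twice):

```perl
use strict;
use warnings;
use IPC::Open3;
use IO::Select;
use Symbol 'gensym';

# Run a command via open3, multiplex its stdout and stderr with
# IO::Select, pass stdout through untouched, and print at most $limit
# lines of stderr. Returns the child's exit status. open3 croaks if
# the command cannot be started.
sub run_limited {
    my ($limit, @cmd) = @_;

    my $err = gensym;    # open3 needs a pre-created glob for stderr
    my $pid = open3(my $in, my $out, $err, @cmd);
    close $in;           # the child gets no stdin

    my $sel = IO::Select->new($out, $err);
    my $err_lines = 0;

    while ($sel->count) {
        for my $fh ($sel->can_read) {
            # sysread, not <>, so buffered data can't hide behind select
            my $n = sysread $fh, my $buf, 8192;
            if (!$n) {               # EOF (or error): stop watching it
                $sel->remove($fh);
                next;
            }
            if (fileno($fh) == fileno($out)) {
                print $buf;
            }
            else {
                for my $line (split /^/, $buf) {
                    print STDERR $line if $err_lines++ < $limit;
                }
            }
        }
    }

    waitpid $pid, 0;
    return $? >> 8;
}

exit run_limited(1000, @ARGV) if @ARGV;
```

    Because the select loop drains whichever handle has data, neither
    side can fill its pipe buffer and deadlock the child.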

    If all you want to do to stderr is grab the first bit, then try this
    shell script (untested):

    #!/bin/bash

    stderr=$(mktemp -t cron.XXXXXXXXXX)
    stdout=$(mktemp -t cron.XXXXXXXXXX)

    "$@" 2>&1 >"$stdout" | head -n1000 >"$stderr"
    err=${PIPESTATUS[0]}   # the command's status; plain $? would give head's

    cat "$stdout"
    cat "$stderr" >&2

    rm -f "$stdout" "$stderr"

    exit $err

    __END__

    Using temporary files makes avoiding deadlock a lot easier.
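    For completeness, the same temp-file idea in Perl (a sketch;
    File::Temp and the run_via_tempfile name are my choices, not
    anything from the thread):

```perl
use strict;
use warnings;
use File::Temp 'tempfile';

# Run a command with its stderr captured to a temp file, then replay
# only the first $limit lines of it on our own stderr. Returns the
# command's exit status.
sub run_via_tempfile {
    my ($limit, @cmd) = @_;
    my ($tmp) = tempfile(UNLINK => 1);   # removed automatically on exit

    open my $saved_err, '>&', \*STDERR or die "can't dup STDERR: $!";
    open STDERR, '>&', $tmp            or die "can't redirect STDERR: $!";
    my $status = system @cmd;            # child inherits the redirection
    open STDERR, '>&', $saved_err      or die "can't restore STDERR: $!";

    seek $tmp, 0, 0;    # the dup shares the file offset, so rewind
    while (<$tmp>) {
        last if $. > $limit;
        print STDERR $_;
    }
    return $status == -1 ? 127 : $status >> 8;
}

exit run_via_tempfile(1000, @ARGV) if @ARGV;
```

    Since the command's whole stderr goes to disk before any of it is
    read back, there is nothing to deadlock on.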

    Ben

    --
    We do not stop playing because we grow old;
    we grow old because we stop playing.
    Ben Morrow, Jun 18, 2004
    #2

  3. Mike Hunter

    Mike Hunter Guest

    On Fri, 18 Jun 2004 01:54:49 +0000 (UTC), Ben Morrow wrote:
    >
    > Quoth Mike Hunter:
    > >
    > > I have some cron jobs that can sometimes send out too much noise to stderr,
    > > which in turn causes sendmail to do bad things :( I'm trying to limit the
    > > amount of stderr I see from those scripts without changing the scripts
    > > themselves. I am looking to write a perl wrapper that does something like this:
    > >
    > > my $program = shift @ARGV;
    > > my $args = join " ", @ARGV;

    >
    > What's the point of shifting @ARGV if you're just going to join
    > "$program " onto the beginning anyway?


    Just thinking ahead :)

    > > open PGMSTDOUT, "$program $args|" or die "blah!";

    >
    > Don't do this: use three-arg open.
    > Use lexical file-handles.
    >
    > open my $PGMSTDOUT, '-|', @ARGV or die "can't run $ARGV[0]: $!";
    >
    > > while (<PGMSTDOUT>)
    > > {
    > > print $_;
    > > }
    > >
    > > my $n = 0;

    >
    > Perl provides the special variable $. for this job. See perldoc perlvar.


    Thanks.

    > > my $error_line = <PGMSTDERR>;
    > > while (<PGMSTDERR> && ($n < 1000))

    >
    > This is wrong: Perl does special magic with while (<>). What you mean is
    >
    > while (<PGMSTDERR>) {
    > $. > 1000 and last;
    >
    > or
    >
    > while (defined($_ = <PGMSTDERR>) and $. <= 1000) {
    >
    > which is what perl expands while (<>) into.
    >
    > > {
    > > $error_line = <PGMSTDERR>;

    >
    > Presumably you are reading again because you lost the results when you
    > lost the magic while (<>); this will discard every other line, though.


    Yeah, my bad.

    > > print STDERR $error_line;
    > > $n++;
    > > }
    > >
    > > The only similar advice I've seen on the web was here:
    > >
    > > http://perlmonks.thepen.com/730.html
    > >
    > > But I don't want to follow that approach because I don't want to create a file
    > > on disk with all the STDERR stuff, I want to discard it.
    > >
    > > Any help? How do I *pipe* stderr to something without duping it to stdout?

    >
    > If you simply want to discard all of stderr, use 2>/dev/null in the
    > command line. If you want to grab stdout and stderr separately, you
    > will need to use IPC::Open3; you will also need to use IO::Select to
    > process the bits of each as they arrive, or you'll get deadlocks (you'll
    > be waiting for the end of stdout, the program will be blocking trying to
    > write something to stderr).


    Thanks, I'll look into those. I knew it wouldn't be easy :)

    Mike
    Mike Hunter, Jun 18, 2004
    #3
