Velocity Reviews > Newsgroups > Programming > Perl > Perl Misc > perl wrapper to limit stderr to first 1000 lines?

perl wrapper to limit stderr to first 1000 lines?

 
 
Mike Hunter
 
      06-18-2004
Hi,

I have some cron jobs that can sometimes send out too much noise to stderr,
which in turn causes sendmail to do bad things. I'm trying to limit the
amount of stderr I see from those scripts without changing the scripts
themselves. I am looking to write a perl wrapper that does something like this:

my $program = shift @ARGV;
my $args = join " ", @ARGV;

open PGMSTDOUT, "$program $args|" or die "blah!";

# ...somehow get the program's stderr into PGMSTDERR

while (<PGMSTDOUT>)
{
    print $_;
}

my $n = 0;
my $error_line = <PGMSTDERR>;
while (<PGMSTDERR> && ($n < 1000))
{
    $error_line = <PGMSTDERR>;
    print STDERR $error_line;
    $n++;
}

The only similar advice I've seen on the web was here:

http://perlmonks.thepen.com/730.html

But I don't want to follow that approach because I don't want to create a file
on disk with all the STDERR stuff, I want to discard it.

Any help? How do I *pipe* stderr to something without duping it to stdout?

Thanks,

Mike
 
 
 
 
 
Ben Morrow
 
      06-18-2004

Quoth Mike Hunter:
>
> I have some cron jobs that can sometimes send out too much noise to stderr,
> which in turn causes sendmail to do bad things I'm trying to limit the
> amount of stderr I see from those scripts without changing the scripts
> themselves. I am looking to write a perl wrapper that does something like this:
>
> my $program = shift @ARGV;
> my $args = join " ", @ARGV;


What's the point of shifting @ARGV if you're just going to join
"$program " onto the beginning anyway?

> open PGMSTDOUT, "$program $args|" or die "blah!";


Don't do this: use three-arg open.
Use lexical file-handles.

open my $PGMSTDOUT, '-|', @ARGV or die "can't run $ARGV[0]: $!";

> while (<PGMSTDOUT>)
> {
> print $_;
> }
>
> my $n = 0;


Perl provides the special variable $. for this job. See perldoc perlvar.
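A quick illustration of $. (the in-memory filehandle here is just a stand-in for real input):

```perl
use strict;
use warnings;

# $. is the input line number of the last filehandle read from,
# so there is no need for a hand-rolled counter like $n.
open my $fh, '<', \"one\ntwo\nthree\nfour\n" or die "open: $!";
while (<$fh>) {
    last if $. > 3;    # stop after the third line
    print;             # prints "one", "two", "three"
}
```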

> my $error_line = <PGMSTDERR>;
> while (<PGMSTDERR> && ($n < 1000))


This is wrong: Perl does special magic with while (<>). What you mean is

while (<PGMSTDERR>) {
$. > 999 and last;

or

while (defined($_ = <PGMSTDERR>) and $. < 1000) {

which is what perl expands while (<>) into.

> {
> $error_line = <PGMSTDERR>;


Presumably you are reading again here because you lost the result when you
lost the magic while (<>); this will discard every other line, though.

> print STDERR $error_line;
> $n++;
> }
>
> The only similar advice I've seen on the web was here:
>
> http://perlmonks.thepen.com/730.html
>
> But I don't want to follow that approach because I don't want to create a file
> on disk with all the STDERR stuff, I want to discard it.
>
> Any help? How do I *pipe* stderr to something without duping it to stdout?


If you simply want to discard all of stderr, use 2>/dev/null in the
command line. If you want to grab stdout and stderr separately, you
will need to use IPC::Open3; you will also need to use IO::Select to
process the bits of each as they arrive, or you'll get deadlocks (you'll
be waiting for the end of stdout, the program will be blocking trying to
write something to stderr).
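Something like this sketch (untested, and the name run_limited plus the
8192-byte buffer size are my own choices; 1000 is just the cutoff from your
post). It reads both handles via IO::Select, using sysread rather than <>
so it never blocks on a handle select didn't report ready:

```perl
use strict;
use warnings;
use IPC::Open3;
use IO::Select;
use Symbol qw(gensym);

# Run a command, pass its stdout straight through, forward at most
# $max_err lines of its stderr, and return the child's exit status.
sub run_limited {
    my ($max_err, @cmd) = @_;
    my $err = gensym;    # open3 needs a pre-made glob for stderr
    my $pid = open3(my $in, my $out, $err, @cmd);
    close $in;           # nothing to feed the child

    my $sel       = IO::Select->new($out, $err);
    my $err_lines = 0;

    while ($sel->count and my @ready = $sel->can_read) {
        for my $fh (@ready) {
            my $n = sysread $fh, my $buf, 8192;
            if (!$n) {               # EOF (or read error)
                $sel->remove($fh);
                next;
            }
            if ($fh == $out) {
                print $buf;
            }
            else {
                # count only complete lines; anything past the
                # cutoff is silently discarded
                for my $line (split /^/m, $buf) {
                    print STDERR $line if $err_lines < $max_err;
                    $err_lines++ if $line =~ /\n\z/;
                }
            }
        }
    }
    waitpid $pid, 0;
    return $? >> 8;
}

# act as a wrapper when given a command to run
exit run_limited(1000, @ARGV) if @ARGV;
```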

If all you want to do to stderr is grab the first bit, then try this
shell script (untested):

#!/bin/sh

stderr=$(mktemp -t cron.XXXXXXXXXX)
stdout=$(mktemp -t cron.XXXXXXXXXX)

"$@" 2>&1 >"$stdout" | head -n1000 >"$stderr"
err=$?

cat "$stdout"
cat "$stderr" >&2

rm -f "$stdout" "$stderr"

exit $err

__END__

Using temporary files makes avoiding deadlock a lot easier.
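For what it's worth, here is one way to check the behaviour (the /tmp paths
and the noisy test command are invented for the demo; the wrapper body is the
script above, inlined so this runs standalone):

```shell
#!/bin/sh
# Save the wrapper from the post as an executable script.
cat > /tmp/limit-stderr <<'EOF'
#!/bin/sh
stderr=$(mktemp -t cron.XXXXXXXXXX)
stdout=$(mktemp -t cron.XXXXXXXXXX)
"$@" 2>&1 >"$stdout" | head -n 1000 >"$stderr"
err=$?
cat "$stdout"
cat "$stderr" >&2
rm -f "$stdout" "$stderr"
exit $err
EOF
chmod +x /tmp/limit-stderr

# A job that writes 5000 lines to stderr and one line to stdout:
# only the first 1000 stderr lines survive the wrapper.
/tmp/limit-stderr sh -c 'seq 1 5000 >&2; echo done' 2>/tmp/err.log
wc -l < /tmp/err.log
```

One caveat worth knowing: $? after the pipeline is head's status, not the
job's, so a job that fails while flooding stderr still exits 0; plain /bin/sh
has no tidy fix for that (bash's PIPESTATUS would, if you can rely on bash).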

Ben

--
We do not stop playing because we grow old;
we grow old because we stop playing.
 
 
 
 
 
Mike Hunter
 
      06-18-2004
On Fri, 18 Jun 2004 01:54:49 +0000 (UTC), Ben Morrow wrote:
>
> What's the point of shifting @ARGV if you're just going to join
> "$program " onto the beginning anyway?

Just thinking ahead.

> Perl provides the special variable $. for this job. See perldoc perlvar.

Thanks.

> Presumably you are reading again because you lost the results when you
> lost the magic while (<>); this will discard every other line, though.

Yeah, my bad.

> If you simply want to discard all of stderr, use 2>/dev/null in the
> command line. If you want to grab stdout and stderr separately, you
> will need to use IPC::Open3; you will also need to use IO::Select to
> process the bits of each as they arrive, or you'll get deadlocks.

Thanks, I'll look into those. I knew it wouldn't be easy.

Mike
 