too many open files? How to know?

 
 
Derrell Durrett
05-23-2005
Howdy-

I have a situation in which a program that runs on Solaris, a RedHat
flavor of Linux (32- and 64-bit), and Windows XP (32- and 64-bit) fails
on 32-bit XP with $! set to the equivalent of "Too many open files."

I am running an external program whose output to stderr I want to
capture, in case it's interesting. The algorithm is (as suggested in
recipe 7.20 in the Perl Cookbook):

1. dup STDERR to a new filehandle (open using ">&").
2. Create a new filehandle on a temporary file (using IO::File->new_tmpfile).
3. Take the file descriptor of the new filehandle (using fileno()).
4. Alias STDERR to the new filehandle (open STDERR using ">&=$fileno").

I then run my external program, and close the filehandles, undef the
temporary variable attached to the temporary file, and reopen stderr to
point back to the original.
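
In code, the sequence looks roughly like this (a condensed sketch of
the steps above, not my actual code):

    use strict;
    use warnings;
    use IO::File;

    # Step 1: dup the real STDERR so it can be restored later.
    open( SAVED_STDERR, '>&STDERR' ) or die "Cannot dup STDERR: $!";

    # Steps 2 and 3: anonymous temporary file and its descriptor.
    my $tmp = IO::File->new_tmpfile or die "Cannot create temp file: $!";
    my $fd  = $tmp->fileno();

    # Step 4: alias STDERR onto the temp file's descriptor.
    open( STDERR, ">&=$fd" ) or die "Cannot alias STDERR: $!";

    # ... run the external program here; its stderr lands in $tmp ...

    # Tear down and restore the original STDERR.
    close STDERR;
    open( STDERR, '>&SAVED_STDERR' ) or die "Cannot restore STDERR: $!";
    close SAVED_STDERR;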

The program fails at the open() that aliases STDERR to the temporary
file's descriptor.

I've done this in the debugger, and when I look at the symbol table for
either main or the package in which the filehandles are created, I
don't see anything unusual (only STDOUT, STDIN, STDERR, and the
duplicate). I did this using the 'x \%main::' and 'x \%<package_name>::'
commands at the debugger command line.

Is there a better way to see what files are open? Is this likely a red
herring?
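
So far the only direct check I know of is on Linux, where /proc lists a
process's open descriptors; something like this quick sketch:

    use strict;
    use warnings;

    # Linux only: enumerate this process's open descriptors via /proc.
    # Each numeric entry is a symlink to the file it points at.
    my $fd_dir = "/proc/$$/fd";
    opendir( my $dh, $fd_dir ) or die "Cannot read $fd_dir: $!";
    for my $fd ( sort { $a <=> $b } grep { /^\d+$/ } readdir $dh ) {
        my $target = readlink( "$fd_dir/$fd" ) || '?';
        print "fd $fd -> $target\n";
    }
    closedir $dh;

That obviously doesn't help on XP, which is where it's failing.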

Thanks,

Derrell

--
Derrell Durrett
Xilinx, Inc. / Software Productivity Tools
Longmont, Colorado / 720.652.3843
***remove bits about .processed meats and .death from e-mail to reply
 
Derrell Durrett
05-23-2005
Derrell Durrett wrote:

> I have a situation in which a program that runs on Solaris, a RedHat
> flavor of Linux (32- and 64-bit), and Windows XP (32- and 64-bit)
> fails on 32-bit XP with $! set to the equivalent of "Too many open
> files."


I can duplicate the problem using the following code:

use strict;
use warnings;
use English;
use IO::File;

my $count = 0;
while (1) {
    my @output;
    $count++;
    runCmd( 'ls', \@output );
}

sub runCmd {
    my ( $cmd, $container ) = @ARG;

    # The following code mimics recipe 7.20 from the Perl Cookbook and is
    # necessary because the commands being run may output to STDERR and we
    # want to capture that.
    unless ( open( ORIGINAL_STDERR, ">&STDERR" ) ) {
        die( "Could not redirect STDERR" );
    }

    my $error_fh;
    unless ( $error_fh = IO::File->new_tmpfile ) {
        die( "Could not open temporary file for STDERR: $OS_ERROR" );
    }
    my $error_fd = $error_fh->fileno();
    unless ( open( STDERR, ">&=$error_fd" ) ) {
        die( "Iteration: $count\nCould not duplicate temporary filehandle for ",
             "STDERR: ", $OS_ERROR );
    }
    STDERR->autoflush( 1 );

    @{ $container } = (`$cmd`);

    # Close the temporary filehandle.
    close $error_fh
        or die( "Couldn't close temporary STDERR. ", $OS_ERROR );

    # Close the redirected handle.
    close STDERR
        or die( "Could not close redirected STDERR" );

    # Clean up after ourselves.
    undef $error_fh;

    # Restore STDERR.
    open( STDERR, ">&ORIGINAL_STDERR" )
        or die( "Could not restore STDERR" );

    # Close the copy to prevent leaks.
    close ORIGINAL_STDERR
        or die( "Could not close copied STDERR" );
}

This gives varying numbers of iterations depending on whether I'm
executing the program locally or via rsh, but the error is the same:

"Could not duplicate temporary filehandle for STDERR: Too many open
files at test_opens.plx line 34"

>
> I've done this in the debugger, and when I look at the symbol table
> for either main or the package in which the filehandles are created, I
> don't see anything unusual (only STDOUT, STDIN, STDERR, and the
> duplicate). I did this using the 'x \%main::' and 'x \%<package_name>::'
> commands at the debugger command line.
>
> Is there a better way to see what files are open? Is this likely a
> red herring?


Since I can reproduce it without the original program, it's clearly this
bit of code that matters. What am I missing? I've tried to be scrupulous
about closing every opened filehandle, but seem to have a leak
nevertheless.
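
One way I can think of to confirm the leak (a rough sketch): the OS
hands out the lowest free descriptor number, so probing it each time
through the loop should show it climbing if something is held open:

    # Inside the while loop, after runCmd() returns: open a throwaway
    # handle and look at its fileno(). If this number climbs along with
    # $count, a descriptor is leaking on every call.
    open( my $probe, '<', $0 ) or die "Cannot open probe file: $!";
    printf "iteration %d: lowest free fd = %d\n", $count, fileno( $probe );
    close $probe;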

Anything helps,

Derrell

--
Derrell Durrett
Xilinx, Inc. / Software Productivity Tools
Longmont, Colorado / 720.652.3843
***remove bits about .processed meats and .death from e-mail to reply
 
Derrell Durrett
05-27-2005
Derrell Durrett wrote:

> Derrell Durrett wrote:
>
>> I have a situation in which a program that runs on Solaris, a RedHat
>> flavor of Linux (32- and 64-bit), and Windows XP (32- and 64-bit)
>> fails on 32-bit XP with $! set to the equivalent of "Too many open
>> files."

>

When I replace the previous code with File::Temp, I see the same
problem. XP complains after about 500 iterations that I've run out of
file descriptors. The error message isn't particularly informative:
bldperl writestderr exited with 16777215 and dumped core on signal 127,
where writestderr is the following:

use strict;
use warnings;

print STDERR q[I'm freaking out! ];
die "Still!\n";

and bldperl is a simple wrapper around perl.

Has anyone else seen this problem? I can work around it by using real
files to capture stdout and stderr, but there are often cases where I'm
doing this on multiple machines within a short span of time in a shared
network directory, which makes unique filenames a concern. IO::File's
new_tmpfile (if I understood the docs correctly) was creating these
files in memory, not as real files. Even if it was using real files, at
least I didn't have to invent naming schemes just to make the files more
likely to be unique. In any case, I preferred that method, and it worked
on NT. It works on SunOS 5.8. It works on RedHat 7.2 (or whatever the
Enterprise equivalent is that we're using now).

I found (but have since lost and cannot find again) a mention that some
file-related module was not available for XP, which sounded potentially
related to this.

If anyone has had a similar experience with XP, where the error "Too
many open files" appears even though you're sure you're closing the
files (if only because they go out of scope), I'd be interested in
comparing notes.

Thanks,

Derrell

--
Derrell Durrett
Xilinx, Inc. / Software Productivity Tools
Longmont, Colorado / 720.652.3843
***remove bits about .processed meats and .death from e-mail to reply
 