At the moment I'm using:
eval {read (STDIN,$request,$length);};
if ($@ ne '') {$response=$error1}
which appears to be working.
Regards
John
That will trap fatal errors from read.
An example is an undefined filehandle.
For some reason, file I/O does not take kindly
to undefinedness when a handle is passed around
internally so much. Theoretically, that may cause
a program crash if subcode does not check for null
pointers. Believe me, there is a lot of C code that
is buggy. So all the entry points are checked for undef
filehandles, and die serves as a warning that your
program is working on borrowed time.
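A minimal sketch of that fatal case (the undef handle is deliberate): under strict, read on an undefined filehandle dies, and eval traps it just like in John's snippet.

```perl
use strict;
use warnings;

# $fh is declared but never opened, so it is undef; under strict,
# read() on it dies ("Can't use an undefined value ..."), and eval
# turns that fatal error into a value in $@.
my ($fh, $buf);
eval { read($fh, $buf, 10) };
print "Trapped fatal read error: $@" if $@;
```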
There are other, less egregious errors: secondary-level,
or non-fatal, errors that do not generate an
exception like die. These return an undef result on
I/O operations. An example is that you opened a filehandle
but did not check whether the open 'failed'. In reality, you
successfully allocated a correct filehandle, but it does not
have a valid file descriptor. This does not cause Perl's
internal C code to generate a fault the way a null pointer would.
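A minimal sketch of that non-fatal case (the file name is made up): the failed open still hands back a defined, but closed, filehandle, so read does not die; it just warns, returns undef, and leaves the error in $!.

```perl
use strict;
use warnings;

# The open fails (no such file), but $fh is still a defined glob
# with no valid file descriptor behind it. read() on it does NOT
# die: it warns "read() on closed filehandle", returns undef, and
# sets $!.
my $buf;
open my $fh, '<', 'no_such_file.txt';   # deliberately unchecked
my $status = read($fh, $buf, 5);        # no exception here
print defined $status ? "read ok\n" : "Non-fatal read error: $!\n";
```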
But how can you trap both kinds of errors, fatal and non-fatal?
In the real world, you don't want to trap fatal errors, though;
you want to let them gracefully stop your program. This gives
you an opportunity to fix your code, which is broken.
I don't recommend trapping fatal errors, but if you want to
trap all errors (fatal and non-fatal), below is one way to do it.
-sln
---------------------------------------------------
use strict;
use warnings;
my ($buf,$length) = ('',5);
# Invoke error #1, a NON-FATAL error on read.
# File doesn't exist; however, $fh is still valid
open my $fh, '<', 'notexists.txt';
# Invoke error #2, FATAL error on read
#my $fh;
open STDERR, '>', 'errors.txt';
{
    local $!;
    my $status = eval { read ($fh, $buf, $length) };
    $@ =~ s/\s+$//;
    if ($@ || (!$status && $!)) {
        print "Error in read: " . ($@ ? $@ : $!) . "\n";
    }
}
print "More code ...\n";
exit;
__END__
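For completeness, here is the non-trapping idiom recommended above: check each I/O call at the call site instead of wrapping it in eval. (The file name is made up, and the script writes the file first so the example is self-contained.)

```perl
use strict;
use warnings;

# Create a small file so the read below has real data to work on.
open my $out, '>', 'demo.txt' or die "Can't write demo.txt: $!";
print $out "hello world";
close $out or die "Can't close demo.txt: $!";

# The usual idiom: check open and read where they happen; fatal
# problems stop the program with a useful message, no eval needed.
open my $fh, '<', 'demo.txt' or die "Can't open demo.txt: $!";
my $buf;
defined( read($fh, $buf, 5) ) or die "Error in read: $!";
close $fh;
print "read '$buf'\n";
unlink 'demo.txt';
```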