file handle problem

newbie

Hi all,

I have the following simple file-reading code. The files being read in are
fairly large (4 MB with 160,000 lines, but each line only contains about 3
strings).

open(FILEHANDLE, "<a.dat") || die "Can't open: $!\n";
chomp(@array1 = <FILEHANDLE>);
close FILEHANDLE;

# do something with @array1

open(FILEHANDLE, "<b.data") || die "Can't open: $!\n";
chomp(@array2 = <FILEHANDLE>);
close FILEHANDLE;

#do something with @array2

My problem is that reading the file the first time works fine, but when I open
the filehandle the second time, the program gets stuck. I think it's something
to do with the large file, because when I try with smaller files, it works
fine. Can anyone help me? Thanks

While we're on the subject of file reading, is there a fast way of reading in
large files?
 

Gunnar Hjalmarsson

newbie said:
I have the following simple file-reading code. The files being read in are
fairly large (4 MB with 160,000 lines, but each line only contains about 3
strings).

open(FILEHANDLE, "<a.dat") || die "Can't open: $!\n";
chomp(@array1 = <FILEHANDLE>);
close FILEHANDLE;

# do something with @array1

open(FILEHANDLE, "<b.data") || die "Can't open: $!\n";
chomp(@array2 = <FILEHANDLE>);
close FILEHANDLE;

#do something with @array2

My problem is that reading the file the first time works fine, but when I open
the filehandle the second time, the program gets stuck. I think it's something
to do with the large file, because when I try with smaller files, it works
fine. Can anyone help me?

The starting-point is that you should avoid reading such large files
into memory if possible, and rather read them line by line:

open(FILEHANDLE, "<a.dat") || die "Can't open: $!\n";
while (<FILEHANDLE>) {
    # do something with the current line
}
close FILEHANDLE;

Why do you think you need to read the files into memory? If you let us
know what you want to do, somebody may be able to advise you further.
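As an aside, the loop above can also be written with the three-argument form of open and a lexical filehandle, which is the form usually recommended in modern Perl. Here is a rough, self-contained sketch; the file name `sample.dat` and its contents are made up for illustration (the original post reads from a.dat):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Made-up sample data so the snippet can run on its own;
# in the original post this would be a.dat.
my $file = 'sample.dat';
open(my $out, '>', $file) or die "Can't write $file: $!\n";
print $out "foo bar baz\n" for 1 .. 3;
close $out;

# Three-argument open with a lexical filehandle: only one line
# is in memory at a time, and the handle can't collide with
# another bareword FILEHANDLE elsewhere in the program.
my $lines = 0;
open(my $in, '<', $file) or die "Can't open $file: $!\n";
while (my $line = <$in>) {
    chomp $line;
    $lines++;    # do something with $line here
}
close $in;
unlink $file;

print "read $lines lines\n";    # prints "read 3 lines"
```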
 

Jürgen Exner

newbie said:
I have the following simple file-reading code. The files being read in
are fairly large (4 MB with 160,000 lines, but each line only
contains about 3 strings).

open(FILEHANDLE, "<a.dat") || die "Can't open: $!\n";
chomp(@array1 = <FILEHANDLE>);
close FILEHANDLE; [...]
While on file reading subject, is there a fast way of reading in large
files?

Well, the standard answer is: do you _really_ need to read the whole file
into memory?
The more Perlish way would be to read the file line by line and process each
line as it is read. Then even huge files are no problem.

Yes, there are cases where you do need the whole file content in memory.
But experience shows that in real-life scenarios you can almost always
process the file line by line.

jue
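That line-by-line style can be sketched as follows. The file name `words.dat` and the field-counting task are invented for illustration (they are not from the thread), and the sample file is written first so the example runs on its own:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Invented sample file so the example is self-contained:
# five lines of three whitespace-separated strings each.
my $file = 'words.dat';
open(my $out, '>', $file) or die "Can't write $file: $!\n";
print $out "alpha beta gamma\n" for 1 .. 5;
close $out;

# Process each line as it is read: only one line is held in
# memory at a time, so even huge files are no problem.
my $total_fields = 0;
open(my $in, '<', $file) or die "Can't open $file: $!\n";
while (my $line = <$in>) {
    chomp $line;
    my @fields = split ' ', $line;    # the ~3 strings per line
    $total_fields += @fields;
}
close $in;
unlink $file;

print "$total_fields fields in total\n";    # prints "15 fields in total"
```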
 
