Hal Vaughan
I'm using Perl 5.6.1 (and in some cases 5.8) on Linux. I've noticed that
when I'm processing files, Perl writes in blocks: it'll process a number
of items, and instead of the file having one line at a time written to
it, a whole block suddenly gets written to the disk at once.
Is there any way to avoid this and force Perl to write each line as I use
a "print" statement to output it? I log (in MySQL) each item as I finish
it, so if power fails or the program is aborted, the system can pick up
right where it left off. Because of the buffering, the log is ahead of
what is written to the file, which means I'd lose the data between
what's written and what's logged.
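(For reference, I believe the usual cure is to enable autoflush on the
output filehandle so every print is pushed to disk immediately. A minimal
sketch of what I mean -- the results.log filename is just a placeholder:)

```perl
use strict;
use warnings;
use IO::Handle;   # bundled with Perl (including 5.6.1); provides autoflush()

open(my $out, '>>', 'results.log') or die "open results.log: $!";

# Enable per-write flushing on this handle, so each print reaches
# the file right away instead of sitting in the buffer until a
# full block accumulates.
$out->autoflush(1);

# The older idiom does the same thing: $| applies to the currently
# selected filehandle, so select it, set $|, and restore:
#   my $old = select($out); $| = 1; select($old);

for my $item (1 .. 3) {
    print $out "item $item done\n";   # flushed immediately
}
close($out);
```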
Thanks!
Hal