How to read a file in non-blocking mode?

sonet

Will the Win2003 server lock the directory when I create and
read difference files at the same time?

my $JobFileLocation='c:/pjob.sp';
open( my $JobFile, $JobFileLocation );
binmode $JobFile;
$contents = join('',<$JobFile>);
close($JobFile);

Sometimes line 4 blocks, and CPU usage goes up to 25% (the
system has 4 CPUs). The process stops at line 4 and the system
hangs. Sometimes the file size is only 300K. There is still
enough memory and the system is not busy at the time. This
happens about 1~2 times a day.

Would rewriting the code like this solve the problem?

$/=undef;
$contents = <$JobFile>;

The second method of reading the file seems faster than the first one.

Do I need to read files in non-blocking mode? And how do I read
a file in non-blocking mode?
================================================
CPU: 4
Memory: 4 GB
HD: 32 GB (SCSI)
ActivePerl v5.8.8 built for MSWin32-x86-multi-thread
Win2003 Server
 
Ben Morrow

Quoth "sonet said:
> Will the Win2003 server lock the directory when I create and
> read difference files at the same time?

If I have understood your question correctly (will Windows prevent you
from reading a file if a different file in the same directory is open
for writing) then the answer is no. Windows does place a lock on any
file that is open, so if another process is holding *the same* file open
for writing, your read (or rather, I would expect, your open) will fail.
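
If that is what is happening to you, one possibility (just a rough,
untested sketch, and the retry count and delay are arbitrary) is to
retry the open a few times before giving up:

use strict;
use warnings;

my $JobFileLocation = 'c:/pjob.sp';
my ($JobFile, $opened);

# A writer may still hold the file open; retry a few times before giving up.
for my $attempt (1 .. 5) {
    $opened = open $JobFile, '<', $JobFileLocation;
    last if $opened;
    warn "attempt $attempt: can't open '$JobFileLocation': $!\n";
    sleep 1;                      # arbitrary delay between attempts
}
$opened or die "giving up on '$JobFileLocation': $!\n";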
> my $JobFileLocation='c:/pjob.sp';

Are you really locating files in the root of the C drive? This is not a
good idea.
> open( my $JobFile, $JobFileLocation );

Use three-arg open (it's safer if the file has an odd name), and
always check whether open succeeded:

open( my $JobFile, '<', $JobFileLocation)
or die "can't open '$JobFileLocation': $!";

You don't need those parens, but if you feel happier with them there
that's fine.
> binmode $JobFile;
> $contents = join('',<$JobFile>);

This is an inefficient way of reading a whole file. Perl will first
split the file into lines and then join them back together again.
Setting $/ to undef is better, and better still would be to use the
File::Slurp module from CPAN.
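
For example (a sketch; the binmode option is as described in
File::Slurp's documentation):

use File::Slurp qw(read_file);

# Slurp the whole file in one call; read_file croaks if the file
# can't be opened, and ':raw' keeps the binmode behaviour.
my $contents = read_file( $JobFileLocation, binmode => ':raw' );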
> close($JobFile);

> Sometimes line 4 blocks, and CPU usage goes up to 25% (the
> system has 4 CPUs). The process stops at line 4 and the system
> hangs.

How do you know it stops at line 4? If that is not the code you are
running, please post your actual code. It prevents people from wasting
time chasing the wrong problem.
> $/=undef;
> $contents = <$JobFile>;
>
> The second method of reading the file seems faster than the first one.

It will be faster, but I doubt it would be enough faster for you to
notice.
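
For reference, the usual way to do that is to localise $/ so the change
doesn't leak into the rest of the program, roughly:

my $contents = do {
    local $/;                    # undef $/ only inside this block
    open my $fh, '<', $JobFileLocation
        or die "can't open '$JobFileLocation': $!";
    binmode $fh;
    <$fh>;                       # reads the whole file in one go
};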
> Do I need to read files in non-blocking mode? And how do I read
> a file in non-blocking mode?

use Fcntl;

my $flags = fcntl $JobFile, F_GETFL, 0
or die "can't get file flags: $!";
fcntl $JobFile, F_SETFL, $flags | O_NONBLOCK
or die "can't set non-blocking mode: $!";

or, if you prefer,

use IO::Handle;

$JobFile->blocking(0)
or die "can't set non-blocking mode: $!";

You don't want to do this, though: as a rule, non-blocking mode doesn't
apply to ordinary files, and in any case you *want* to block until all
the data has been read.

Ben
 
Brian McCauley

> Will the Win2003 server lock the directory when I create and
> read difference files at the same time?

I'm not sure what you mean. What are "difference files"? This looks
like a question about Win2003; this is a Perl newsgroup.

You appear to be asking whether Windows can have multiple files open at
the same time, but I don't really believe that is what you are asking.
> my $JobFileLocation='c:/pjob.sp';
> open( my $JobFile, $JobFileLocation );
> binmode $JobFile;
> $contents = join('',<$JobFile>);
> close($JobFile);

> Sometimes line 4 blocks.

Are you sure?
> And CPU usage goes up to 25% (the system has 4 CPUs).

I think you just said that the Perl interpreter process has maxed out
one CPU (one fully busy CPU out of four is 25% of total capacity).
That is the opposite of being blocked.
> The process stops at line 4 and the system hangs.

If the system is hung, how can you tell what the process is doing? I do
not believe that the system is actually hung.
> Sometimes the file size is only 300K. There is still
> enough memory and the system is not busy at the time.
> This happens about 1~2 times a day.

> Would rewriting the code like this solve the problem?
>
> $/=undef;
> $contents = <$JobFile>;
>
> The second method of reading the file seems faster than the first one.

Yes, if you want to read a lot of data into memory as a single string
then it is more efficient to read it into memory as a single string
rather than read it into memory as lots of smaller strings and then
join them into a single string.

Of course, perhaps you should think about how to avoid wanting to slurp
300k in the first place.
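
For instance, if the job file is line-oriented (I'm guessing here), you
could process it a record at a time instead of slurping it, something
like:

open my $JobFile, '<', $JobFileLocation
    or die "can't open '$JobFileLocation': $!";
while ( my $line = <$JobFile> ) {
    chomp $line;
    # ... handle one job record at a time; memory use stays small ...
}
close $JobFile;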
> Do I need to read files in non-blocking mode?

What makes you think that?
> And how do I read a file in non-blocking mode?

Non-blocking on file IO is rarely needed. I very much doubt there's
significant blocking going on.
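
If you want evidence of where the time actually goes, a quick
(untested) sketch with Time::HiRes would show whether it is the open
or the read that stalls:

use Time::HiRes qw(time);

my $t0 = time;
open my $JobFile, '<', $JobFileLocation
    or die "can't open '$JobFileLocation': $!";
my $t1 = time;
binmode $JobFile;
my $contents = do { local $/; <$JobFile> };   # slurp the whole file
my $t2 = time;
close $JobFile;

printf STDERR "open took %.3fs, read took %.3fs\n", $t1 - $t0, $t2 - $t1;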
 
