CGI script stops executing?


squash

I have a Perl CGI script that appears to randomly stop executing. Is
this a known issue with CGI scripts, or is there some bug in my
programming logic? It seems to happen when the server is slow to
respond to the user's request, so the user gets impatient and starts
double-clicking on the link. Does this result in some requests not
going all the way through and just terminating in the middle of
execution?
I am hoping it is a bug in my program and not in Perl/CGI!

Thx!
 

squash

You are probably right that it has to do with a race condition. Here is
the lock-file subroutine I call when my code reaches the critical section
where only one user should be accessing it. Do you see any error in it?

sub lock_file {

$lock_file=".lock";

my $sec = time();

if (-e $lock_file) {
open(READLOCK, $lock_file) || &die_sub ("Can not open $lock_file");
flock(READLOCK, 2); # lock file
$read_lock_time=<READLOCK>;
close READLOCK;

# check for stale lock_file

my $difference = $sec - $read_lock_time;

if ($difference > 50) { unlink $lock_file;}
else {&die_sub("Another user is currently using the software.
Please try again in a few moments");}

} # end large if

if (!-e $lock_file) {

use Fcntl qw(:DEFAULT :flock);
select undef, undef, undef, 0.5;
sysopen (LOCK, $lock_file , O_WRONLY|O_EXCL|O_CREAT);

open(LOCK, ">$lock_file") || &die_sub ("Can not open $lock_file");
flock(LOCK, 2); # lock file
print LOCK "$sec\n";
close LOCK;
} # end if

} # end sub
 

A. Sinan Unur

(e-mail address removed) wrote in @f14g2000cwb.googlegroups.com:
You are probably right that it has to do with a race condition. Here is
the lock-file subroutine I call when my code reaches the critical section
where only one user should be accessing it. Do you see any error in it?

Quite a few, actually. We just discussed this issue in this group a
couple of days ago.
sub lock_file {

$lock_file=".lock";

my $sec = time();

if (-e $lock_file) {
open(READLOCK, $lock_file) || &die_sub ("Can not open $lock_file");

Race condition!

The file can come into existence between the check and the open.
flock(READLOCK, 2); # lock file

Don't use arbitrary constants like this.

use Fcntl ':flock';

will give you LOCK_EX, LOCK_SH, LOCK_NB and LOCK_UN. For more
information, see

perldoc -f flock.
$read_lock_time=<READLOCK>;
close READLOCK;

Race condition. You closed the locked file and therefore gave up the
lock. Another instance of the script may end up accessing the file
between this and the next locking attempt.
# check for stale lock_file

my $difference = $sec - $read_lock_time;

if ($difference > 50) { unlink $lock_file;}
else {&die_sub("Another user is currently using the software.
Please try again in a few moments");}

You should not use & when calling a subroutine unless you need the side
effects of using it. This is not such a case.

See

perldoc perlsub

for more information.
} # end large if

Properly indented code does not need these kinds of remarks. What is the
difference between a large if and a skinny if?
if (!-e $lock_file) {

Same race condition as the first one.
use Fcntl qw(:DEFAULT :flock);

Why you have suddenly decided to use Fcntl is beyond me, but OK.
select undef, undef, undef, 0.5;
sysopen (LOCK, $lock_file , O_WRONLY|O_EXCL|O_CREAT);

I am very puzzled by this.

Why don't you check if the sysopen succeeded?
open(LOCK, ">$lock_file") || &die_sub ("Can not open $lock_file");

But you just opened it.
flock(LOCK, 2); # lock file
print LOCK "$sec\n";
close LOCK;
} # end if

} # end sub
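[Editor's note: putting the critique above together, one minimal corrected sketch follows. It replaces the test-then-open pattern with a single sysopen plus a non-blocking flock, and holds the lock handle for the whole critical section. The lock-file name is carried over from the original posting; everything else is an illustration, not Sinan's code.]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:DEFAULT :flock);

my $lock_file = '.lock';    # same name the original uses

sub lock_file {
    # Open (creating if necessary) and take an exclusive, non-blocking
    # lock in one step -- no -e test, so no test-then-open race.
    sysopen(my $lock, $lock_file, O_RDWR | O_CREAT)
        or die "Can not open $lock_file: $!";
    flock($lock, LOCK_EX | LOCK_NB)
        or die "Another user is currently using the software.\n";
    # Record when the lock was taken. Note that flock locks are released
    # automatically when the process exits, so no stale-timestamp
    # cleanup is needed with this approach.
    truncate($lock, 0);
    print {$lock} time(), "\n";
    return $lock;    # hold the handle open for the whole critical section
}

my $lock = lock_file();
# ... critical section ...
close $lock;    # releases the lock; the file itself can be left in place
```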

Further reference:

perldoc -q lock

and

http://tinyurl.com/3jd9d

Sinan.
 

xhoster

I have a Perl CGI script that appears to randomly stop executing. Is
this a known issue with CGI scripts, or is there some bug in my
programming logic?

Most likely both.
It seems to happen when the server is slow to
respond to the user's request, so the user gets impatient and starts
double-clicking on the link. Does this result in some requests not
going all the way through and just terminating in the middle of
execution?

Yes. When the user starts clicking on other things, the browser terminates
its original connection to the web server. Once the web server realizes
the browser is no longer interested in the reply, it kills off the
CGI program that was working on that reply.
I am hoping it is a bug in my program and not perl/cgi!

It is not a bug, but a feature. Why labor over creating marvelous HTML that
no longer has anywhere to go? And the feature is not part of Perl, but
of Apache/CGI (or whatever your web server is).

Xho
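[Editor's note: if the server kills the script mid-run as Xho describes, a lock file created with the existence-check style from the original posting can be left behind and go stale. One hedged way to guard against that is an END block plus signal handlers, so the cleanup runs even on early termination. The lock-file name and the choice of signals are assumptions for illustration; a flock-based lock would not need this, since flock releases automatically on process death.]

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $lock_file = '.lock';    # assumed name, matching the thread
my $locked    = 0;

# Turn the signals the web server may send to a dying CGI process into
# a normal exit, so the END block below still gets to run.
$SIG{TERM} = $SIG{PIPE} = sub { exit 1 };

sub take_lock {
    # (actual locking elided; see the flock discussion earlier in the thread)
    $locked = 1;
}

END {
    # Runs on normal exit and on the exit() in the signal handlers,
    # removing a lock file that would otherwise go stale.
    unlink $lock_file if $locked && -e $lock_file;
}

take_lock();
# ... long-running work that the server might cut short ...
```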
 

squash

Thanks for all the replies. I just discovered there was a bug in my
code logic!! So the random termination was not the culprit.
 
