maximum number of concurrent sessions


david

Hi all,

I have a requirement: when the number of concurrent sessions a
certain web site has exceeds a limit, I should send a response
saying that I cannot provide the service right now and that the user
should try again in a couple of minutes.

My questions are:

Is this something I should handle in the Perl code, or more upstream?

If yes, how can I handle it in Perl?

I tried searching Google for this and did not find an answer.

Thanks in advance,
David
 
siva0825

I think you need something like this:

use Win32::Net::Session;

my $SESS = Win32::Net::Session->new($server, $level, [$clientname, $username]);
my $ret  = $SESS->GetSessionInfo();
my $numsessions = $SESS->NumberofSessions();
if ($numsessions == 0) { print "No clients connected\n"; exit; }
print "Number of sessions: $numsessions\n";

Take a look at:

http://search.cpan.org/~rpagitsch/Win32-Net-Session-0.02/Session.pm
 

Tim Greer

I think you need something like this:

use Win32::Net::Session;

We don't know what web server or platform the OP is using. Not that you
didn't guess right (I have no idea).
 

Tim Greer

david said:
I have a requirement: when the number of concurrent sessions a
certain web site has exceeds a limit, I should send a response that
I cannot provide the service right now and that the user should try
again in a couple of minutes.

What web server and platform are you using? For example, you could do
this with Apache. The solution may depend on exactly what you want to
limit: just CGI and/or PHP scripts, or all files (images, text/HTML).
Handling it at the web server level is probably best. For example,
Apache's RLimit* directives let you limit the number of processes a
user can run at once, along with how much memory and CPU they use and
how long they run -- which sounds like what you want -- but this
applies to CGI. So it sounds like you want a web server/Apache group
(depending on your platform and web server software).
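The RLimit* directives mentioned above might look something like this
in httpd.conf (the directive names are Apache's; the actual numbers
here are invented for illustration):

```apache
# Hypothetical fragment: cap resources for processes forked by the
# server (CGI scripts). Each directive takes a soft limit and an
# optional hard (max) limit.
RLimitNPROC 10 20              # max processes per user
RLimitMEM 50000000 100000000   # bytes of memory per process
RLimitCPU 30 60                # CPU seconds per process
```

Note that these cap resource usage per process rather than counting
sessions, so on their own they give a hard failure instead of a polite
"try again in a couple of minutes" page.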
 

xhoster

david said:
I have a requirement: when the number of concurrent sessions a
certain web site has exceeds a limit, I should send a response that
I cannot provide the service right now and that the user should try
again in a couple of minutes.

How is Perl currently being used at this certain web site?

Is this something I should handle in the Perl code, or more upstream?

Probably more upstream. If it is only a particular Perl code (CGI script,
etc.) that needs to be throttled, then maybe do it in Perl.

If yes, how can I handle it in Perl?

To prevent more than two instances of a certain cgi from running at once,
I do something like this:


use Fcntl qw(:flock);   # needed for LOCK_EX and LOCK_NB

sub getLockfile {
  foreach (1..200) {
    foreach my $i ( 1, 2 ) {   # only let 2 run, as each takes 0.7 of 2 Gig
      open my $file, '>', "_memory_lock_$i" or die $!;
      if (flock $file, LOCK_EX|LOCK_NB) {
        return $file;   # return the handle itself; the caller needs to
                        # hold onto it for the length of the program.
      }
    }
    sleep 1;
  }
  return;   # caller must detect failure and do the right thing.
}


I could do more error checking, for example to make sure flock fails due
to EWOULDBLOCK rather than something else. You probably want to remove
the outer loop and have it fail immediately.
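Xho's fail-fast variant -- drop the retry loop and treat any flock
failure other than EWOULDBLOCK as a real error -- might look something
like this (a sketch; the slot count and lock-file names follow the
example above):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
use Errno qw(EWOULDBLOCK);

# Try each lock slot once; return a held filehandle, or undef if all
# $max slots are busy. Any flock error other than EWOULDBLOCK is fatal.
sub get_lockfile {
    my ($max) = @_;
    for my $i (1 .. $max) {
        open my $fh, '>', "_memory_lock_$i" or die "open: $!";
        return $fh if flock $fh, LOCK_EX | LOCK_NB;
        die "flock: $!" unless $! == EWOULDBLOCK;
    }
    return;   # no free slot: caller sends the "try again later" page
}
```

The caller keeps the returned handle alive for the life of the request;
letting it go out of scope closes the file and releases the lock.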


Xho

--
-------------------- http://NewsReader.Com/ --------------------
The costs of publication of this article were defrayed in part by the
payment of page charges. This article must therefore be hereby marked
advertisement in accordance with 18 U.S.C. Section 1734 solely to indicate
this fact.
 
david

Sorry that I did not mention the platform: it is Apache on Linux. I am
running a couple of CGIs on the machine, and the limits for each CGI
may be different. I think Apache can only be configured globally for
all CGIs (but I'll have to do my homework and read the Apache manual).
The Perl solution would solve the problem, but it would mean that I
have to manage the sessions in Perl alone.
Thanks for your comments!
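Since the limits may differ per CGI, Xho's flock trick generalizes by
giving each script its own lock-file prefix and slot count (a sketch;
the script names and limits below are invented):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Per-script concurrency limits (hypothetical names and numbers).
my %limit = ( 'search.cgi' => 2, 'report.cgi' => 5 );

# Return a held lock handle for one of the script's slots, or undef
# when every slot is taken (or the script is unknown).
sub slot_for {
    my ($script) = @_;
    my $max = $limit{$script} or return;
    for my $i (1 .. $max) {
        open my $fh, '>', "/tmp/lock_${script}_$i" or die "open: $!";
        return $fh if flock $fh, LOCK_EX | LOCK_NB;
    }
    return;
}
```

As above, the handle must stay in scope for the duration of the
request; the lock is released automatically when the process exits or
the handle is closed.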
 

Snorik

david said:
Is this something I should handle in the perl code or more upstream?

Isn't that handled in Apache?

# worker MPM
# StartServers: initial number of server processes to start
# MaxClients: maximum number of simultaneous client connections
# MinSpareThreads: minimum number of worker threads which are kept spare
# MaxSpareThreads: maximum number of worker threads which are kept spare
# ThreadsPerChild: constant number of worker threads in each server process
# MaxRequestsPerChild: maximum number of requests a server process serves
<IfModule worker.c>
    StartServers          2
    MaxClients          150
    MinSpareThreads      25
    MaxSpareThreads      75
    ThreadsPerChild      25
    MaxRequestsPerChild   0
</IfModule>
 
