Hello,
I'm not sure that this is the right group for this question - Googling
for similar questions always pointed me to "comp.lang.perl.misc", but if
this is OT here, please let me know.
The problem I'm having is with a CGI script that needs to load and
parse some data from a fairly large file, then do some extra processing
based on user form input and return the result. The slow part is (I
guess) loading and parsing the data from the file. Although this takes
only a fraction of a second, the script is quite often invoked by a
number of customers simultaneously, which is causing problems with my
shared hosting provider (reportedly 25% CPU load at times).
I'm looking for high-level ideas for a solution. What I can come up
with is:
1) Optimizing the file format for easier parsing - not much can be
done here; the parsing is already a pretty straightforward task. The
file is about 200K, and there is simply no way to avoid loading it.
Partial loading (splitting it into pieces and so on) doesn't work
either.
2) Switching to PHP - this means rewriting everything, what a mess...
3) Using something like FastCGI, provided that the hosting provider
has it. Do you think this can help? I don't know much about FastCGI -
can I somehow preload the data into memory and just use it from the
script? (rough sketch of what I have in mind after the list)
4) Using mod_perl? I don't know much about it either. Can I parse the
data from the file once, keep it in memory, and access it each time my
script is invoked? (again, see the sketch after the list) How much
cooperation is required from my provider for this?
5) Do something else?
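
Regarding 3): from the little I've read about FastCGI, the idea seems
to be that the script becomes a long-running process with a request
loop, so anything loaded before the loop stays in memory between
requests. This is the kind of thing I imagine (untested; the file path
and load_and_parse() are just placeholders for my real parsing code):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use FCGI;   # assumes the host actually provides FastCGI + the FCGI module

  # Done once, when the process starts - not on every request.
  my $data = load_and_parse('/path/to/data.txt');

  my $request = FCGI::Request();
  while ($request->Accept() >= 0) {
      # $data is already in memory here; only the per-request work remains.
      print "Content-type: text/html\r\n\r\n";
      # ... use $data plus the form input to build the response ...
  }

  sub load_and_parse {
      my ($file) = @_;
      # ... parse the 200K file into a hash/array structure ...
      return {};
  }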
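Regarding 4): my understanding of mod_perl (Apache::Registry /
ModPerl::Registry) is that the script stays compiled inside each Apache
child, so a package variable initialized on the first request would
survive for later requests in the same child. Again, just a sketch of
what I mean, not working code:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use CGI;

  # Persists for the life of the Apache child process under mod_perl.
  our $DATA;
  $DATA ||= load_and_parse('/path/to/data.txt');   # parse only on the first hit

  my $q = CGI->new;
  print $q->header;
  # ... answer the request using $DATA and $q->param(...) ...

  sub load_and_parse {
      my ($file) = @_;
      # ... same parsing code as the plain-CGI version ...
      return {};
  }

Is that roughly how it works, and does it need anything special from
the provider beyond enabling mod_perl for my account?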
Any help/ideas/suggestions are appreciated.
Thanks,
- Alex