Benchmarking a script that can't be executed from the command line


Erik Tank

I have written a web-based help desk/billing/anything-else-my-employer-wants
system using Perl. It has been in production for over a year now and working
well, but some things are beginning to slow down. I am trying to figure out
where the slowdown is occurring.

With other scripts I have always used DProf/SmallProf or something like
that, but the problem here is that several security precautions make it
(so far) unexecutable from the command line. To clarify: I can execute it,
but I simply get a redirect to the login script.

Any thoughts/suggestions?
 

Ragnar Hafstað

Erik Tank said:
> I have written a web-based help desk/billing/anything-else-my-employer-wants
> system using Perl. It has been in production for over a year now and working
> well, but some things are beginning to slow down. I am trying to figure out
> where the slowdown is occurring.
>
> With other scripts I have always used DProf/SmallProf or something like
> that, but the problem here is that several security precautions make it
> (so far) unexecutable from the command line. To clarify: I can execute it,
> but I simply get a redirect to the login script.
>
> Any thoughts/suggestions?

yes. plain old log files

make a debug() function that prints timestamped messages to a log file, and
pepper calls to it all over your application. this will help you locate
problem areas. to disable it, change the function, change a config variable,
or just:

perl -pi -e 's/debug\(/#debug(/g' src/*.pl
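A minimal sketch of such a debug() helper (the $DEBUG toggle and the log path are placeholders, to be wired to your real config):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Toggle and log path are placeholders; wire them to your real config.
our $DEBUG   = 1;
our $LOGFILE = '/tmp/helpdesk_debug.log';

# Append a timestamped message to the log file when debugging is on.
sub debug {
    my ($msg) = @_;
    return unless $DEBUG;
    open my $fh, '>>', $LOGFILE or return;
    printf $fh "[%s] %s\n", scalar localtime, $msg;
    close $fh;
}

debug('entering ticket lookup');
# ... the suspect section of the application ...
debug('leaving ticket lookup');
```

Reading the resulting log, the gap between consecutive timestamps points at the slow span.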

gnari
 

James Willmore

Erik Tank said:
> I have written a web-based help desk/billing/anything-else-my-employer-wants
> system using Perl. It has been in production for over a year now and working
> well, but some things are beginning to slow down. I am trying to figure out
> where the slowdown is occurring.
>
> With other scripts I have always used DProf/SmallProf or something like
> that, but the problem here is that several security precautions make it
> (so far) unexecutable from the command line. To clarify: I can execute it,
> but I simply get a redirect to the login script.
>
> Any thoughts/suggestions?

There are several logging modules that could aid you in finding out
what's going on. One of them is Log::TraceMessages. Put this together
with Time::HiRes, and you can see where the time is going.
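A quick illustration of the Time::HiRes half of that suggestion (slow_section() here is just a stand-in for whatever code is under suspicion):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Stand-in for a suspect piece of the application.
sub slow_section {
    my $total = 0;
    $total += $_ for 1 .. 100_000;
    return $total;
}

# Bracket the call with high-resolution timestamps and report the
# elapsed wall-clock time; in the real app you would log this instead.
my $t0      = [gettimeofday];
my $result  = slow_section();
my $elapsed = tv_interval($t0);    # seconds, as a floating-point number
printf STDERR "slow_section took %.6f s\n", $elapsed;
```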

Another option is to re-think your code to allow it to be run from the
command line - you may run into this type of situation again.

HTH

--
Jim

Copyright notice: all code written by the author in this post is
released under the GPL. http://www.gnu.org/licenses/gpl.txt
for more information.

a fortune quote ...
A "No" uttered from deepest conviction is better and greater than
a "Yes" merely uttered to please, or what is worse, to avoid
trouble. -- Mahatma Gandhi
 

pkent

Erik Tank said:
> I have written a web-based help desk/billing/anything-else-my-employer-wants
> system using Perl. It has been in production for over a year now and working
> well, but some things are beginning to slow down. I am trying to figure out
> where the slowdown is occurring.

The fact that it started well but has now slowed down makes my first
instinct to suspect the part(s) of your system that have grown over time.
E.g. if you're interacting with a growing database, find out what SQL
queries are running and run them at a SQL*Plus/mysql/etc. prompt, or
maybe do an 'EXPLAIN PLAN' or equivalent. Or are there big text files
that are being sequentially scanned? Of course, the actual problem may be
completely unrelated...

> With other scripts I have always used DProf/SmallProf or something like
> that, but the problem here is that several security precautions make it
> (so far) unexecutable from the command line. To clarify: I can execute it,
> but I simply get a redirect to the login script.

You can easily use DProf to profile web applications by adding the
appropriate switch (-d:DProf) to the #! line, and you can even profile
mod_perl applications with Apache::DProf - we've done that at our place.
Note that you might find it best to start a webserver on a new port to
run these profiling versions, so that you can execute them without the
normal users creating profile data too.

The CGI environment is really not that complex to emulate in a
command-line wrapper script - it pretty much boils down to setting a
load of environment variables, clearing the rest, and running the
program... obviously the actual spec is a bit more detailed :)

There are other ways to get started: another poster suggested timestamps
in a log file which will certainly help you, for example.

Also if you can't profile _any_ of your code then that implies to me
that your code _might_ well benefit from better modularization and
abstraction. E.g. once you have some code abstracted into modules you
can test/profile those modules in isolation which can help you find
bottlenecks.

Also, if you're not running your app under mod_perl (or similar), that
is probably going to be a very big win for relatively little outlay.

P
 
