Converting a big Perl script which is called over and over into a module?

OdedDV

Hello all,

I have a rather long Perl script (which is Tk based). The script needs to
run another instance per user request. It is currently done using `system
myscript param1 param2 param3 &`. This works great in terms of
functionality, but the issue is that myscript gets "compiled/interpreted"
over and over whenever a new instance is required. This is a waste of time,
especially on some of our slow Solaris machines.

I wanted to convert myscript to a module, such that I can write another
short script which will use this module to launch the first instance.
It means that the module itself would have a function which launches another
instance.

The issue I'm facing is regarding the global variables I have in the script.
Naturally every instance should have its own copy of the parameters. It
means I can no longer have these as global parameters in the module, as this
implies that they are shared across all instances created. Trying to move
them to the "main" function I export from the module is no good, as I have
dozens of other functions in this module which need to access those
used-to-be "global" parameters.

Can someone please share with me some ideas / point me to relevant
documentation for the simplest way for me to achieve what I need ?
Maybe there's another solution (not using a module) ?

Thanks in advance,
Oded
 
Tad McClellan

OdedDV said:
I have a rather long Perl script (which is Tk based). The script needs to
run another instance per user request. It is currently done using `system
myscript param1 param2 param3 &`. This works great in terms of
functionality, but the issue is that myscript gets "compiled/interpreted"
over and over whenever a new instance is required. This is a waste of time,
especially on some of our slow Solaris machines.

I wanted to convert myscript to a module, such that I can write another
short script which will use this module to launch the first instance.
It means that the module itself would have a function which launches another
instance.


How will the module launch the new instance?

Using system()?

If so, then making it into a module does not solve your problem.

There will be a compile phase for each instance anyway.

The issue I'm facing is regarding the global variables I have in the script.


I think the issue you're facing is an incomplete mental model of how
processes work, leading to an "XY problem".

If I understand your problem correctly, then module vs. monolithic
does not address your problem.

I think you want fork/exec instead.

Naturally every instance should have its own copy of the parameters.


And if you launch it with system, whether in main or in a module,
they _will_ get their own (command line) parameters.

Can someone please share with me some ideas / point me to relevant
documentation for the simplest way for me to achieve what I need ?
Maybe there's another solution (not using a module) ?


perldoc -q background

How do I start a process in the background?

perldoc -f fork
perldoc -f exec
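
A bare-bones sketch of what those pages describe, in place of the shell-backgrounded system() call (the argument values here are placeholders). Note that exec starts a fresh perl for myscript, so this alone does not avoid the recompilation cost:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholder arguments; in the real program these come from the GUI.
    my ($arg1, $arg2, $arg3) = qw(param1 param2 param3);

    # Fork, then exec in the child: the usual way to start a background
    # process without going through a shell.
    defined(my $pid = fork()) or die "fork failed: $!";

    if ($pid == 0) {
        exec 'myscript', $arg1, $arg2, $arg3
            or die "exec failed: $!";
    }

    # The parent carries on (e.g. back to the Tk event loop) immediately.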
 
OdedDV

Tad McClellan said:
How will the module launch the new instance?

Using system()?

Nope, sorry if I wasn't clear enough. The idea is to convert the system
call to a function call.
Thus I'm calling code which was already "compiled/interpreted".
Am I saying something stupid ? Maybe it's not possible ?!

Is it possible to have some instances of the same function at the same time ?
Maybe I do need to fork/exec, but then again, I want to make sure I don't
ask Perl to recompile/reinterpret the code.

Can this be done ?

Thanks,
Oded
 
Tad McClellan

OdedDV said:
Maybe I do need to fork/exec,

Right.


but then again, I want to make sure I don't
ask Perl to recompile/


Since you added that, I'm quite sure that you don't yet have an
accurate mental model of how processes work.

fork() makes a clone of the current process, with the instruction
counter at the same place in each process.

The cloning includes the memory image to be run, so the compilation
has already been done and is not repeated.
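
A tiny self-contained illustration of that point (heavy_work and its arguments are made-up placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Compiled exactly once, when perl first reads this file.
    sub heavy_work {
        my (@args) = @_;
        print "child $$ working on @args\n";
    }

    defined(my $pid = fork()) or die "fork failed: $!";

    if ($pid == 0) {
        # Child: a copy of the parent's memory image, compiled code
        # included, so heavy_work() runs with no further compilation.
        heavy_work(qw(param1 param2 param3));
        exit 0;
    }

    waitpid($pid, 0);    # parent reaps the child when it is done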
 
xhoster

OdedDV said:
Hello all,

I have a rather long Perl script (which is Tk based). The script needs
to run another instance per user request. It is currently done using
`system myscript param1 param2 param3 &`. This works great in terms of
functionality, but the issue is that myscript gets "compiled/interpreted"
over and over whenever a new instance is required. This is a waste of
time, especially on some of our slow Solaris machines.

Get faster machines.

I wanted to convert myscript to a module, such that I can write another
short script which will use this module to launch the first instance.
It means that the module itself would have a function which launches
another instance.

The issue I'm facing is regarding the global variables I have in the
script. Naturally every instance should have its own copy of the
parameters. It means I can no longer have these as global parameters in
the module, as this implies that they are shared across all instances
created. Trying to move them to the "main" function I export from the
module is no good, as I have dozens of other functions in this module
which need to access those used-to-be "global" parameters.

Then you have to pass those variables to those dozens of functions.
Or pack all those different variables into one hash and pass that
to the functions.
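
A rough sketch of that second option, with all names invented: what used to be globals lives in one per-instance hash, and a reference to it is handed to every sub.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # One hash per instance instead of a pile of globals.
    my %instance = (
        arg1 => 'param1',
        arg2 => 'param2',
        arg3 => 'param3',
    );

    draw_screen(\%instance);
    save_results(\%instance);

    sub draw_screen {
        my ($inst) = @_;
        print "drawing screen for $inst->{arg1}\n";
    }

    sub save_results {
        my ($inst) = @_;
        print "saving results for $inst->{arg2}\n";
    }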
Can someone please share with me some ideas / point me to relevant
documentation for the simplest way for me to achieve what I need ?

In my experience in doing this, all that I can say is "Sucks to be you."
I know of no easy way to pull off this conversion. You might be able to
hack something up by "local"izing all the package variables every time you
start the main function. But I'd probably just roll up my sleeves and get
to work turning everything into lexicals (with a good search-and-replace
function, it isn't all *that* much work), rather than messing around with
local. But first I'd try to convince the guys with the purse-strings that
it would be cheaper to buy new machines to run the old, bad code than it
would be to re-do the code. (And then in the future use lexicals from the
start.)
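
For what it's worth, the local() hack might look roughly like this (the package and variable names are invented). It only resets the globals per call within a single process, which is part of why lexicals are the better long-term fix:

    package MyScript;
    use strict;
    use warnings;

    # Leftover package globals from the old script.
    our ($Arg1, $Arg2, $Arg3);

    sub main {
        # Dynamically scope the globals to this call: every sub called
        # from here sees the new values, and the previous values are
        # restored automatically when main() returns.
        local ($Arg1, $Arg2, $Arg3) = @_;
        _do_work();
    }

    sub _do_work {
        print "working with $Arg1 $Arg2 $Arg3\n";
    }

    1;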
Maybe there's another solution (not using a module) ?

If you find one, please let me know.

Xho
 
xhoster

OdedDV said:
Nope, sorry if I wasn't clear enough. The idea is to convert the system
call to a function call.

I thought that was clear in the original, but then I noticed that in the
original post you ended your "system" command with a "&". If you just change
it into a plain function call, then it will be synchronous, unlike "system"
with "&". Is that OK with you?

Thus I'm calling code which was already "compiled/interpreted".
Am I saying something stupid ? Maybe it's not possible ?!

Now that I think about it, I believe it is possible.

Is it possible to have some instances of the same function at the same
time ?

Don't know what that means.

Maybe I do need to fork/exec, but then again, I want to make sure I don't
ask Perl to recompile/reinterpret the code.

A fork doesn't ask Perl to recompile the code.

So what you want to do is have the parent process include the module, so
the module gets compiled only once, at parent start-up. The parent process
has itself compiled, and also has the "bad" module compiled (bad because it
uses globals in an unsafe way). But as long as the parent never does
anything with that module other than compile it, then all the globals
in that module are "clean" in the parent process.

Now, some functionality in the module needs to be used. The parent process
forks, and then the child process invokes code from the module. The child
dirties up its copy of the unsafe globals, but as long as the child never
creates children itself, this isn't a problem. The parent process's
version of the bad module is still clean, so the next time the parent forks,
that child also gets a clean version.
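
Put as a sketch, it might look like this (MyScript, MyScript::run and get_next_request are placeholder names for whatever the real code provides):

    #!/usr/bin/perl
    use strict;
    use warnings;

    use MyScript;            # compiled once, here, in the parent

    $SIG{CHLD} = 'IGNORE';   # don't accumulate zombie children

    # Stub for illustration: in real life this would block on user requests.
    my @pending = ([qw(a b c)], [qw(d e f)]);
    sub get_next_request { shift @pending }

    while (my $args = get_next_request()) {
        defined(my $pid = fork()) or die "fork failed: $!";

        if ($pid == 0) {
            # Child: the module is already compiled; any globals it
            # dirties exist only in this child's copy of the process.
            MyScript::run(@$args);
            exit 0;
        }
        # Parent: never calls into MyScript, so its globals stay clean.
    }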

Still, I'd work on cleaning up the bad module.

Xho
 
John Smith

OdedDV said:
Nope, sorry if I wasn't clear enough. The idea is to convert the system
call to a function call.
Thus I'm calling code which was already "compiled/interpreted".
Am I saying something stupid ? Maybe it's not possible ?!

Is it possible to have some instances of the same function at the same
time ?
Maybe I do need to fork/exec, but then again, I want to make sure I don't
ask Perl to recompile/reinterpret the code.

Can this be done ?

Thanks,
Oded

How is the "mother process" that calls system() today being started itself?

Is it a CGI script? (handles only one request and then dies)
A mod_perl httpd process? (probably handles more than one request)

What I'm really asking is: is the mother process started once per request
or not?
If it is, you probably wouldn't get much better performance by converting
the child to a module. It could even be worse if the child doesn't get
called each time.

To turn a .pl into a .pm I think you have two alternatives:

Both include finding all the global variables in the .pl.

Alt. 1
For the object-oriented alternative, let the .pl globals become
instance variables, most likely keys in a blessed instance hash.
This requires knowledge of writing object-oriented Perl (a rough
sketch follows at the end of this post).

Alt. 2
For the non-object-oriented alternative, declare all the globals from
the .pl as my $var1; my %hash2; my @arr3; and so on at the top of your
.pm. Then write a: sub clean_up { undef $var1; undef %hash2;
undef @arr3; ... } ...and so on. Call Module::clean_up() as the first
thing wherever the old system() call was.
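
A rough sketch of Alt. 1, with all names invented: the former globals become keys in a blessed hash, and each sub becomes a method that reads them through $self.

    package MyScript;
    use strict;
    use warnings;

    sub new {
        my ($class, %args) = @_;
        # Former globals become instance data.
        my $self = {
            arg1 => $args{arg1},
            arg2 => $args{arg2},
            arg3 => $args{arg3},
        };
        return bless $self, $class;
    }

    sub run {
        my ($self) = @_;
        print "running with $self->{arg1} $self->{arg2} $self->{arg3}\n";
    }

    1;

Each call to MyScript->new then carries its own set of former globals, e.g. MyScript->new(arg1 => 'a', arg2 => 'b', arg3 => 'c')->run;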
 
OdedDV

Thanks to all who tried to explain. It looks like the issue is a little
bit more complicated.
Thanks for your patience; I will try to explain it better and answer the
questions I was asked:

Situation today:

I have a large Perl/Tk script called "myscript".
It is launched from the command line with arguments "arg1 arg2 arg3".
When it is executed, Perl fires up and "compiles" the code.

The code itself has an option (upon user request at a button click) to
launch ANOTHER instance of "myscript". It can be with the exact same
arguments (arg1 arg2 arg3) or with different values for these three
arguments.
Currently the NEW instance is launched using a background system call
(system "myscript .... &").
Of course more instances can be created whenever a user asks to (either from
the "original" instance or from new instances).
It is important that each instance is completely independent of the other
instances. When it is closed or killed, it should NOT affect any of the
other instances.

I would like to improve the performance of "myscript" such that new
instances created will NOT require Perl recompilation.

Can it be done ?

Thanks,
Oded
 
