Prevent multiple instances of a script from launching

Philipp

Hello
I have written a Perl script (Win32) and use it by dragging and dropping
files onto it for execution.
If I drop multiple files at once, they are processed one after the
other. But if I drop one file at a time, each drop starts a new Perl
interpreter and a new copy of the script.

Is there an easy way to prevent these new copies from starting and
instead queue the corresponding files in the already running script, so
they are processed sequentially rather than in parallel?

Thanks for your answers
Phil
 
anno4000

Philipp said:
Hello
I have written a Perl script (Win32) and use it by dragging and dropping
files onto it for execution.
If I drop multiple files at once, they are processed one after the
other. But if I drop one file at a time, each drop starts a new Perl
interpreter and a new copy of the script.

Is there an easy way to prevent these new copies from starting and
instead queue the corresponding files in the already running script, so
they are processed sequentially rather than in parallel?

The standard method is for the program to acquire an exclusive lock
on some (fixed) file before starting to work. That serializes the
processes. I don't know how well file locking works on win32.
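For illustration, a minimal sketch of that locking scheme (the lock-file
path is an assumption; Perl's flock is supported on Win32):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(LOCK_EX);

# Every instance opens the same fixed file and asks for an exclusive
# lock; a second instance blocks here until the first one has exited.
my $lockfile = 'C:/temp/dropscript.lock';    # fixed path (assumption)
open my $lock, '>', $lockfile or die "Cannot open $lockfile: $!";
flock $lock, LOCK_EX or die "Cannot lock $lockfile: $!";

# ... process the dropped files one after the other ...
for my $file (@ARGV) {
    print "processing $file\n";
}
# The lock is released automatically when the script exits.
```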

Anno
 
Philipp

The standard method is for the program to acquire an exclusive lock
on some (fixed) file before starting to work. That serializes the
processes. I don't know how well file locking works on win32.

If I understand this correctly, this will start a second interpreter but
it will wait until the lock of the first is released? This is not
exactly the behavior that I want.
I would like all files to be processed by the same interpreter instance.
This means that a second instance would need to:
- check somehow if a first one is already running (e.g. with a lock?)
- add files to the queue of first instance (how?)

For the second point I can imagine a method like writing a queue to a
file but probably writing to some stack in memory is better/faster. How
could I do this? How can you make two scripts share a piece of memory?

If you could point me to some documentation, or even better some
existing code somewhere, I would be very grateful.

Phil
 
anno4000

Philipp said:
If I understand this correctly, this will start a second interpreter but
it will wait until the lock of the first is released? This is not
exactly the behavior that I want.

Right. That's often good enough.
I would like all files to be processed by the same interpreter instance.
This means that a second instance would need to:
- check somehow if a first one is already running (e.g. with a lock?)
- add files to the queue of first instance (how?)

That amounts to a server-client model. There is a lot of literature
about that (start with perlipc) and lots of CPAN modules that support
that model.

You'll need some form of IPC to transmit requests to the server. The
server could be started on demand and quit when there is nothing to do,
or it could be started once and keep running (other possibilities exist).
Again, file locking is the safest way of keeping multiple instances
of the server from running, but the server doesn't wait for the lock,
it tests the lock non-blocking and just quits when it can't get it.
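A sketch of that non-blocking test (the lock-file path is an assumption;
the IPC channel itself is left open, as the post says several would do):

```perl
use strict;
use warnings;
use Fcntl qw(LOCK_EX LOCK_NB);

my $lockfile = 'C:/temp/dropserver.lock';    # fixed path (assumption)

open my $lock, '>', $lockfile or die "Cannot open $lockfile: $!";
if (flock $lock, LOCK_EX | LOCK_NB) {
    # Lock acquired: we are the only server. Sit in a loop, receive
    # filenames over the chosen IPC channel, process them, and exit
    # when there is nothing left to do.
}
else {
    # A server is already running: send it our filenames over the
    # same IPC channel and quit immediately.
}
```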

Anno
 
Philipp

That amounts to a server-client model. There is a lot of literature
about that (start with perlipc) and lots of CPAN modules that support
that model.

You'll need some form of IPC to transmit requests to the server. The
server could be started on demand and quit when there is nothing to do,
or it could be started once and keep running (other possibilities exist).
Again, file locking is the safest way of keeping multiple instances
of the server from running, but the server doesn't wait for the lock,
it tests the lock non-blocking and just quits when it can't get it.

OK, thanks. I will look into that (although it looks way beyond my
programming skills...)
Phil
 
Dr.Ruud

Philipp wrote:
I have written a Perl script (Win32) and use it by dragging and dropping
files onto it for execution.
If I drop multiple files at once, they are processed one after the
other. But if I drop one file at a time, each drop starts a new Perl
interpreter and a new copy of the script.

Is there an easy way to prevent these new copies from starting and
instead queue the corresponding files in the already running script, so
they are processed sequentially rather than in parallel?

The new copy should check whether there is an older copy running, and
then hand over its job-data to that older copy. You could use POE.
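POE is one option; as a simpler illustration, the job-data hand-off
could also be a shared queue file that the older copy polls (the path
and one-filename-per-line format are assumptions, and whether an older
copy is running can be checked with a non-blocking flock on a sentinel
file):

```perl
use strict;
use warnings;
use Fcntl qw(LOCK_EX);

my $queuefile = 'C:/temp/drop.queue';    # shared queue path (assumption)

# The new copy appends its job-data (the dropped filenames) under an
# exclusive lock, then exits; the older copy reads and truncates the
# file in its processing loop, taking the same lock.
open my $q, '>>', $queuefile or die "Cannot open $queuefile: $!";
flock $q, LOCK_EX or die "Cannot lock $queuefile: $!";
print {$q} "$_\n" for @ARGV;
close $q;    # closing the handle releases the lock
```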
 
Charles DeRykus

Philipp said:
Hello
I have written a Perl script (Win32) and use it by dragging and dropping
files onto it for execution.
If I drop multiple files at once, they are processed one after the
other. But if I drop one file at a time, each drop starts a new Perl
interpreter and a new copy of the script.

Is there an easy way to prevent these new copies from starting and
instead queue the corresponding files in the already running script, so
they are processed sequentially rather than in parallel?

One possible solution would be to write the files to a simple DBM and
delete the DBM entries as the files are processed. The "master" instance
would do the processing after locking a sentinel file to prevent a
possible "master" race condition. Subsequent "helper" instances of the
script would detect the lock and just add their files to the DBM before
exiting. The master could check for running helpers when the DBM emptied
out. Then, if helpers were still running, the master could sleep/loop
checking for DBM additions; if no helpers were running and the DBM was
empty, the master could unlock the sentinel and exit.
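Under those assumptions, the master/helper split might be sketched like
this (file paths and the process() body are placeholders, and the exit
condition is simplified to "queue stays empty"):

```perl
use strict;
use warnings;
use Fcntl qw(:flock O_RDWR O_CREAT);
use SDBM_File;

# Tie a DBM file as the shared queue (path is an assumption).
tie my %queue, 'SDBM_File', 'C:/temp/dropqueue', O_RDWR | O_CREAT, 0666
    or die "Cannot tie DBM: $!";

open my $sentinel, '>', 'C:/temp/master.lock'
    or die "Cannot open sentinel: $!";

if (flock $sentinel, LOCK_EX | LOCK_NB) {
    # Master: enqueue our own files, then drain the queue.
    $queue{$_} = 1 for @ARGV;
    while (my @files = keys %queue) {
        for my $file (@files) {
            process($file);          # placeholder for the real work
            delete $queue{$file};
        }
        sleep 1;    # give late helpers a chance to add more entries
    }
    # Queue stayed empty: exiting releases the sentinel lock.
}
else {
    # Helper: a master holds the lock; just add our files and exit.
    $queue{$_} = 1 for @ARGV;
}

sub process { my ($file) = @_; print "processing $file\n" }
```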


hth,
 
