Monitoring-friendly applications


Imbaud Pierre

I have A LOT of batch applications to monitor, on Linux machines, mostly
written in Python.
I need to know:
- which ones are active at a given moment?
- when did the last run occur, and how long did it last?
- for some daemons: are they stuck, typically waiting for I/O or lost
in some C call?

I could do this (I partly did, btw) through external process tools. But
most of these processes are Python programs, so making slight changes to
them is not a problem. That's why I think a Python library could help me
here; let us call it mfa (for "monitor friendly application"): one call
to mfa.in() at start (or at import?), mfa.out() at exit (maybe even on
exception?), and possibly mfa.loop() for loop-managed daemons. The
library would open a pipe or a stream (to a file, a centralizing
process, a database, a log), and every call would make a dated entry.
Ideally, the application should not be aware of the implementation,
especially whether the repository is a file or a db.
Another aspect of process management could be handled by the library:
sometimes only one instance of the process should be active, so a new
instance should withdraw if one is already running.
Not a lot of code to write, but it needs careful design to keep things
simple and reliable, with only low overhead, etc.
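To make the idea concrete, here is a minimal sketch of what I have in
mind: dated entries appended to a flat log file, and an fcntl lock for
the single-instance case. All names and paths are placeholders, not an
existing library; and since "in" is a reserved word in Python, the
sketch uses start/stop instead of in/out.

    # mfa.py -- minimal sketch, not an existing library.
    # Names (start/stop/loop/single_instance) and paths are placeholders.
    import atexit
    import datetime
    import fcntl
    import os
    import sys

    _LOG_PATH = "/var/tmp/mfa.log"   # could as well be a pipe, a db, ...
    _lock_file = None                # held open for the process lifetime
    _stopped = False

    def _record(event):
        # one dated entry per call: timestamp, pid, program name, event
        stamp = datetime.datetime.now().isoformat()
        name = os.path.basename(sys.argv[0]) or "python"
        with open(_LOG_PATH, "a") as log:
            log.write("%s %d %s %s\n" % (stamp, os.getpid(), name, event))

    def start():
        # the original mfa.in(); 'in' is a keyword, hence 'start'
        _record("start")
        atexit.register(stop)        # record the exit even if forgotten

    def stop():
        # the original mfa.out(); guarded so atexit does not double-log
        global _stopped
        if not _stopped:
            _stopped = True
            _record("stop")

    def loop():
        # heartbeat, called once per iteration of a daemon's main loop
        _record("loop")

    def single_instance(lock_path="/var/tmp/mfa.lock"):
        # return False if another instance already holds the lock,
        # so the caller can withdraw
        global _lock_file
        _lock_file = open(lock_path, "w")
        try:
            fcntl.lockf(_lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except (IOError, OSError):
            return False
        _lock_file.write(str(os.getpid()))
        _lock_file.flush()
        return True

The application would only see these four calls: sys.exit() if
mfa.single_instance() returns False, mfa.start() at the top, mfa.loop()
once per iteration, mfa.stop() at the end. Whether the entries end up in
a file, a pipe to a collector, or a database stays inside the library.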
By any chance, does something like this already exist? Would anyone be
interested in this development?
 

Imbaud Pierre

Thanks a lot. Your suggestions led me to pypi (I knew of it but didn't
remember the exact spelling, and there is no obvious link from
www.python.org), and from there to supervisord, which answers my problem
pretty well.
Thanks again.
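For anyone who lands on this thread with the same problem: the relevant
part of supervisord's configuration is one [program:...] section per
process, roughly like this (program name and paths are invented):

    [program:nightly_batch]
    command=/usr/bin/python /opt/batch/nightly.py
    autostart=true
    autorestart=true
    stdout_logfile=/var/log/nightly_batch.log

"supervisorctl status" then lists each program with its state, pid and
uptime, which covers the "which are active, and since when" part.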
 
