Scanning Multiple Log Files for Patterns Continuously

tambekp

Hello,

I want to continuously scan multiple log files for multiple patterns. I
am limited to the default Perl modules (I can't install any extra
modules). Here is what I am attempting to do. Any guidance or pointers
on how to achieve this are highly appreciated.

- I want to make a hash of patterns and the files they need to be
searched against, with a set threshold for every pattern. If the
threshold is exceeded in any of the files, the script should alert. I
don't want to pass patterns/files as arguments but want to put them in
a config file or hash.

- I want to have the script run in an infinite loop so that each pass
scans the files from the position where the last scan left off.

- I have tried using a hash of pattern => filename, but this fails
(and rightly so) if the same pattern has to be searched in different
files, because the hash key is no longer unique in that case (see the
sketch after this list).
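
For illustration, here is a minimal sketch of the colliding structure I
mean; the file names and patterns are just made-up examples:

#!/usr/bin/perl
use strict;
use warnings;

# Pattern-as-key: assigning the same pattern again silently overwrites
# the earlier entry, so one pattern can only ever point at one file.
my %file_by_pattern = (
    'ERROR'   => '/var/log/app1.log',
    'TIMEOUT' => '/var/log/app1.log',
);
$file_by_pattern{'ERROR'} = '/var/log/app2.log';   # clobbers app1.log

print "$_ => $file_by_pattern{$_}\n" for sort keys %file_by_pattern;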

Any suggestions on how to go about implementing this?

Thanks in advance for all the help
-k
 
tambekp

Thanks for the pointers, Sinan.

The readings you suggested should be a good start for me.

-k
A. Sinan Unur said:
(e-mail address removed) wrote in @s13g2000cwa.googlegroups.com:
> - I want to make a hash of patterns and the files they need to be
> searched against, with a set threshold for every pattern. If the
> threshold is exceeded in any of the files, the script should alert. I
> don't want to pass patterns/files as arguments but want to put them
> in a config file or hash.

> - I want to have the script run in an infinite loop so that each pass
> scans the files from the position where the last scan left off.

The first task is to write this for scanning only one file at a time.
First off, I do not think keeping the file open is a good idea: the file
may actually be deleted and re-created while your script is running (I
don't think this is possible on Windows, but AFAIK it is allowed on
various *nix-flavored OSes). In that case, you would be scanning the
wrong file.

Instead, check out the discussion in

perldoc -f seek
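
Along those lines, here is a rough, untested sketch of the kind of loop
that discussion describes. The file name, sleep interval, and pattern
are made up, and checking the inode is just one way to notice that the
file has been replaced:

#!/usr/bin/perl
use strict;
use warnings;

my $logfile = '/var/log/app1.log';   # example path
my $pos     = 0;
my $inode   = 0;

while (1) {
    if ( open my $fh, '<', $logfile ) {
        my $cur_inode = ( stat $fh )[1];

        # Start over if the file was rotated (new inode) or truncated.
        $pos = 0 if $cur_inode != $inode || $pos > -s $fh;
        $inode = $cur_inode;

        seek $fh, $pos, 0;            # resume where the last pass stopped
        while ( my $line = <$fh> ) {
            print "match: $line" if $line =~ /^ERROR/;
        }
        $pos = tell $fh;              # remember the position for next time
        close $fh;
    }
    sleep 5;
}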
> - I have tried using a hash of pattern => filename, but this fails
> (and rightly so) if the same pattern has to be searched in different
> files, because the hash key is no longer unique in that case.

I am not sure what you are talking about here. Hash keys are always
strings, and cannot be patterns.

The most obvious structure would be:

my %patterns_by_file = (
    log1 => [ qr/^ERROR/, qr/^WARNING/ ],
    log2 => [ qr/^INFO/,  qr/^ALERT/   ],
    # etc.
);
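
Since you also want a per-pattern threshold, one option (just a sketch;
the field names, paths, and counts here are made up) is to store a small
record per pattern instead of a bare regex:

#!/usr/bin/perl
use strict;
use warnings;

my %patterns_by_file = (
    '/var/log/app1.log' => [
        { re => qr/^ERROR/,   threshold => 5  },
        { re => qr/^WARNING/, threshold => 20 },
    ],
    '/var/log/app2.log' => [
        { re => qr/^ERROR/,   threshold => 5  },   # same regex, no key clash
    ],
);

# Counting matches against the thresholds for each file:
for my $file ( keys %patterns_by_file ) {
    my @lines = ();    # fill with the lines read from $file since the last pass
    for my $p ( @{ $patterns_by_file{$file} } ) {
        my $count = grep { $_ =~ $p->{re} } @lines;
        warn "$file: $p->{re} matched $count times (threshold $p->{threshold})\n"
            if $count >= $p->{threshold};
    }
}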
> Any suggestions on how to go about implementing this?

Get the one-file version working first. You might want to use, or refer
to, File::Tail for help:

http://search.cpan.org/~mgrabnar/File-Tail-0.99.3/Tail.pm
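
If installing modules ever becomes an option (you said it is not right
now), a minimal File::Tail loop might look roughly like this; the file
name, interval, and pattern are just examples:

#!/usr/bin/perl
use strict;
use warnings;
use File::Tail;

my $tail = File::Tail->new(
    name        => '/var/log/app1.log',
    interval    => 2,     # seconds between checks
    maxinterval => 10,
);

while ( defined( my $line = $tail->read ) ) {
    print "match: $line" if $line =~ /^ERROR/;
}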

Then, possibly use Parallel::ForkManager to run multiple scanners
simultaneously:

http://search.cpan.org/~dlux/Parallel-ForkManager-0.7.5/ForkManager.pm
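
A fork-per-file layout with Parallel::ForkManager could look something
like the following sketch; the file list, worker count, and the
scan_forever() routine are placeholders for you to fill in:

#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

my @logs = ( '/var/log/app1.log', '/var/log/app2.log' );
my $pm   = Parallel::ForkManager->new( scalar @logs );

for my $log (@logs) {
    $pm->start and next;    # parent: fork a child, then move to the next file
    scan_forever($log);     # child: run one scanner until killed
    $pm->finish;
}
$pm->wait_all_children;

sub scan_forever {
    my ($log) = @_;
    # ... the one-file seek/tail loop sketched above goes here ...
}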

Now, it is your turn to attack the problem, fill in the blanks, and come
up with some code.

In the meantime, please read the posting guidelines (especially the
section on posting code).

Sinan

--
A. Sinan Unur <[email protected]>
(remove .invalid and reverse each component for email address)

comp.lang.perl.misc guidelines on the WWW:
http://augustmail.com/~tadmc/clpmisc/clpmisc_guidelines.html
 
