Two consecutive jobs in one Perl program


yezi

Dear all:

My problem is this: I want eth0 to listen for all packets coming from
www.cnn.com, and then collect those packets into a file such as cnn.dmp.
So I wrote the following Perl code.

#!/usr/bin/perl
use warnings;
use strict;


my $filename;
my $cmdline;
my $cmdline1;


$filename="/home/binye/test_perl/testdownload.txt";
open( FILE, "< $filename" ) or die "Can't open $filename : $!";


while ( <FILE> ) {
    my $cmdline1 = "tcpdump -i eth0 src host $_ -w $_.dmp";
    system($cmdline1);
    print "Processing wget.";
    my $cmdline = "wget http://$_ -q";
    system($cmdline);
    print "done ...w $_..";
}
close FILE;


The code executes the first tcpdump and then just stays there. I expected
that after starting tcpdump, the code would simultaneously execute the
wget, so that the traffic could be dumped to cnn.dmp. I do not know how
to solve this sequencing problem. Any help is appreciated.

Thanks all
 

nobull

yezi said:
My problem is this: I want eth0 to listen for all packets coming from
www.cnn.com, and then collect those packets into a file such as cnn.dmp.
So I wrote the following Perl code.

#!/usr/bin/perl
use warnings;
use strict;

Good! 10 Brownie points.
my $filename;

You are suffering from premature declaration.
my $cmdline;
my $cmdline1;

And the related affliction: redundant declaration. You never use
these variables.

$filename="/home/binye/test_perl/testdownload.txt";
open( FILE, "< $filename" ) or die "Can't open $filename : $!";


while ( <FILE> ) {
    my $cmdline1 = "tcpdump -i eth0 src host $_ -w $_.dmp";
    system($cmdline1);
    print "Processing wget.";
    my $cmdline = "wget http://$_ -q";
    system($cmdline);
    print "done ...w $_..";
}
close FILE;


The code executes the first tcpdump and then just stays there. I expected
that after starting tcpdump, the code would simultaneously execute the
wget, so that the traffic could be dumped to cnn.dmp. I do not know how
to solve this sequencing problem.

The trivial solution is a non-Perl one. If you are using a Unix-like
OS just put a '&' character at the end of $cmdline1.
 

yezi

The & behaves the same as two consecutive commands; I mean it is just
like writing two system commands at the same time. The present problem
is that tcpdump just waits there until it is manually interrupted. Is
there some Linux system command that can make tcpdump finish after a
certain period? Thanks
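
(For what it's worth: tcpdump's own -c option stops a capture after N packets, and on Linux systems that ship GNU coreutils, a `timeout` wrapper can stop any command after N seconds. The sketch below uses `sleep 10` as a harmless stand-in for tcpdump; the `timeout` command and its exit status of 124 are assumptions about the system, not something stated in this thread.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# In the real script this would be: system("timeout 30 tcpdump -i eth0 ...");
# Here `sleep 10` stands in for tcpdump so the sketch runs anywhere.
system("timeout 1 sleep 10");
my $status = $? >> 8;
print "exit status: $status\n";   # 124 means timeout had to kill the command
```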
 

yezi

The program is now the following:
#!/usr/bin/perl
use warnings;
use strict;

my $filename;
my $cmdline;

$filename = "/home/binye/test_perl/testdownload.txt";
open( FILE, "< $filename" ) or die "Can't open $filename : $!";

while ( <FILE> ) {
    print "Processing wget.";
    my $cmdline = "wget -w 50 -q http://$_ & tcpdump -c 20 -i eth0 src host $_ -w $_.dmp ";
    system($cmdline);
    print "done ...w $_..";
}
close FILE;


I tried to run it; when using the &, the error message is:

syntax error near unexpected token `&'
sh: -c: line 2: ` & tcpdump -c 20 -i eth0 src host www.cron.com'


Then I changed the & to \$, but it is still the same problem.

So I do not know how to make the program work now. Thanks for any
comments
 

Ala Qumsieh

yezi said:
The & behaves the same as two consecutive commands; I mean it is just
like writing two system commands at the same time. The present problem
is that tcpdump just waits there until it is manually interrupted. Is
there some Linux system command that can make tcpdump finish after a
certain period? Thanks

That is not correct. The ampersand '&' causes the command to run in the
background. This causes system() to return immediately. So, you still
need two calls to system().
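
(To see this concretely, here is a tiny sketch in which `sleep` stands in for tcpdump. With a trailing '&' the shell puts the command in the background and system() returns at once; without it, system() waits for the command to finish.)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(time);

my $t0 = time;
system("sleep 3 &");    # shell backgrounds it; system() returns immediately
my $backgrounded = time - $t0;

my $t1 = time;
system("sleep 1");      # no '&': system() waits until the command exits
my $waited = time - $t1;

printf "backgrounded: %.2fs  waited: %.2fs\n", $backgrounded, $waited;
```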

--Ala
 

Ilmari Karonen

yezi said:
my $cmdline="wget -w 50 -q http://$_ & tcpdump -c 20 -i eth0 src host $_ -w $_.dmp ";
system($cmdline);

No no no, that's not the way you should use the &. You put it at the
_end_ of the command line, like this:

system("tcpdump -c 20 -i eth0 src host $_ -w $_.dmp &");
system("wget -w 50 -q http://$_");

But this solution has some problems, since you can't tell if the
tcpdump started as expected, and more importantly, you don't know the
pid of the tcpdump process, which means you can't kill it when you're
done with it.

For those reasons, it's probably better to fork the background process
yourself:

# fork child process and start tcpdump
defined(my $pid = fork) or die "Can't fork: $!\n";
exec qw(tcpdump -c 20 -i eth0 src host), $_, '-w', "$_.dmp"
    or exit unless $pid;

# run wget and check return value
system qw(wget -w 50 -q), "http://$_";
warn "wget returned exit code $?\n" if $?;

sleep 1; # just in case tcpdump isn't done dumping yet...

# kill tcpdump, wait for it and check return value
kill INT => $pid;
waitpid $pid, 0;
warn "tcpdump returned exit code $?\n" if $?;

Notice the use of multi-arg system() and exec() to avoid invoking the
shell. I'm not sure it's absolutely necessary here, but I generally
do so on principle -- I don't see any reason to start an extra shell
process when it's not really needed at all. In any case, if you let
exec() go through a shell, $pid will be the pid of the shell, not of
tcpdump itself.
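
(The fork/kill/waitpid skeleton can be tried safely on any machine by substituting a harmless `sleep 60` for the tcpdump command; the stand-in command is the only change from the pattern shown above.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# fork a child and turn it into the background job
defined(my $pid = fork) or die "Can't fork: $!\n";
exec 'sleep', '60' or exit unless $pid;   # child only; the parent skips this

# ... the parent would run wget here ...

kill INT => $pid;      # tell the background job to stop
waitpid $pid, 0;       # reap it so it doesn't linger as a zombie
print "reaped pid $pid\n";
```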

--
Ilmari Karonen
To reply by e-mail, please replace ".invalid" with ".net" in address.

"My father used to claim that he heard god on numerous occasions. He
would ask god questions and keep a record of the answers, and at the end
of the year he would do a chi-square analysis to find out whether god had
been right more often than chance would lead one to expect." -- Pat Bowne
 
