Helmut Blass
Hi,
I've written a Perl script that extracts links from websites.
It works fine, but unfortunately only on the first call after
the program starts. If I call the routine a second time, the
result list stays empty, even though the site to be parsed has
been retrieved. After quitting and restarting the script, the
routine works properly again.
All variables are declared locally.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use HTTP::Request;

my @url_list = ();
####
# Collect the attribute values (e.g. href) of every <a> tag seen.
sub callback {
    my ($tag, %attr) = @_;
    return if $tag ne 'a';
    push(@url_list, values %attr);
}
######
my $ua = LWP::UserAgent->new;
my $p  = HTML::LinkExtor->new(\&callback);
# Request document and parse it as it arrives ($_ holds the URL here)
my $res = $ua->request(HTTP::Request->new(GET => $_),
                       sub { $p->parse($_[0]) });
Thanks, Helmut