Obtaining verbose info for http transfers.

Sisyphus

Hi,

One of the nice things about Net::FTP is that if you run it with debugging
switched on, you get a report of the actual communication that's taking
place between the local box and the remote ftp server.

Is there a module that provides the same sort of report in relation to http
downloads ?

I have a satellite broadband connection, and when surfing the web I find
it's about as slow as my old 28kbps dial-up connection was. (There's no
problem when it comes to http or ftp downloads of large files - it's just
that in general surfing it's fairly slow.) I suspect this has something to
do with the latency involved in the passing of communications via satellite
between my PC and the remote web server - but I don't know how many times
messages are passed back and forth, and it would be informative if there was
a perl module that would display this info (if only in terms of a report
similar to that offered by Net::FTP).

It's not uncommon for my browser to take 15-20 seconds to download a web
page that doesn't contain a lot of data (a few small pictures, an
advertisement or 2, maybe more than one frame, but not much in terms of
overall data to be transferred) - which implies that if this is explained in
terms of the aforementioned latency, then there's quite a few messages going
back and forth. (A ping of a remote site takes about 1.3 seconds.)
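(A rough way to put a number on that per-request overhead, sketched with LWP::UserAgent and the core Time::HiRes module - the URL here is just a placeholder:)

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Time::HiRes qw(time);   # overrides time() with a sub-second version

my $ua  = LWP::UserAgent->new;
my $url = 'http://www.example.com/';   # placeholder - substitute a real site

my $start    = time;
my $response = $ua->get($url);
my $elapsed  = time - $start;

printf "%s -> %s in %.2f seconds\n",
    $url, $response->status_line, $elapsed;
```

Comparing that figure against the 1.3 second ping would show how much of the wait is raw round-trip latency and how much is everything else.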

Cheers,
Rob
 
Simon Taylor

Hello Rob,
One of the nice things about Net::FTP is that if you run it with debugging
switched on, you get a report of the actual communication that's taking
place between the local box and the remote ftp server.

Yes, it's rather slick.
Is there a module that provides the same sort of report in relation to http
downloads ?

I've sometimes used the LWP GET command as follows:

GET -Sdxu http://www.yahoo.com.au

Regards,

Simon Taylor
 
Sisyphus

Simon Taylor said:
Hello Rob,


Yes, it's rather slick.


I've sometimes used the LWP GET command as follows:

GET -Sdxu http://www.yahoo.com.au

That's a *nix command, right? It's probably the sort of thing I'm looking
for ... but I'm on Win32 :)

I half expected that LWP::UserAgent or HTTP::Request/Response might
implement the verbosity I'm after since they obviously know all about the
http protocol, but I can't find anything in their docs that helps in that
regard.
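(One era-appropriate option, sketched here: the LWP::Debug module that shipped with libwww-perl produces a Net::FTP-style trace on STDERR. Note that in current libwww-perl releases LWP::Debug is deprecated and no longer emits anything.)

```perl
use strict;
use warnings;
use LWP::UserAgent;
use LWP::Debug qw(+);   # '+' enables all trace levels (trace, debug, conns)

my $ua = LWP::UserAgent->new;

# With debugging on, the phases of the transfer are reported on STDERR:
my $response = $ua->get('http://www.example.com/');
print $response->status_line, "\n";
```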

I vaguely recall having used some sniffer type (non-perl) program a few
years back .... I might have to google that up again if there's no
ready-made perl solution.

Cheers,
Rob
 
ko

Sisyphus said:
Hi,

One of the nice things about Net::FTP is that if you run it with debugging
switched on, you get a report of the actual communication that's taking
place between the local box and the remote ftp server.

Is there a module that provides the same sort of report in relation to http
downloads ?

I have a satellite broadband connection, and when surfing the web I find
it's about as slow as my old 28kbps dial-up connection was. (There's no
problem when it comes to http or ftp downloads of large files - it's just
that in general surfing it's fairly slow.) I suspect this has something to
do with the latency involved in the passing of communications via satellite
between my PC and the remote web server - but I don't know how many times
messages are passed back and forth, and it would be informative if there was
a perl module that would display this info (if only in terms of a report
similar to that offered by Net::FTP).

It's not uncommon for my browser to take 15-20 seconds to download a web
page that doesn't contain a lot of data (a few small pictures, an
advertisement or 2, maybe more than one frame, but not much in terms of
overall data to be transferred) - which implies that if this is explained in
terms of the aforementioned latency, then there's quite a few messages going
back and forth. (A ping of a remote site takes about 1.3 seconds.)

Cheers,
Rob

For a start, how about something like this:

use strict;
use warnings;
use LWP;

my $ua = LWP::UserAgent->new(
    requests_redirectable => [],
    max_redirect          => 100,
);

verbose_http('http://hotmail.com/');

sub verbose_http {
    push my @urls, shift;
    while (my $url = shift @urls) {
        my $r = $ua->get($url);
        if ($r->is_redirect) {
            print $r->headers->as_string . "\n";
            my $redirect = $r->header('Location');
            push @urls, $redirect;
        } elsif ($r->is_success) {
            print $r->content . "\n";
        }
    }
}
__END__

HTH - keith
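(If the user-agent is instead left with its default redirect handling, the hops it followed can be pulled off the final response afterwards - a sketch, assuming a libwww-perl recent enough to have HTTP::Response's redirects method:)

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;   # follows redirects itself by default
my $response = $ua->get('http://hotmail.com/');

# Each intermediate hop is kept on the final response object:
for my $hop ($response->redirects) {
    print $hop->request->uri, ' -> ', $hop->header('Location'), "\n";
}
print 'final: ', $response->request->uri,
      ' (', $response->status_line, ")\n";
```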
 
Tad McClellan

One of the nice things about Net::FTP is that if you run it with debugging
switched on, you get a report of the actual communication that's taking
place between the local box and the remote ftp server.

Is there a module that provides the same sort of report in relation to http
downloads ?


Web Scraping Proxy

http://www.research.att.com/~hpk/wsp/

I have a satellite broadband connection, and when surfing the web I find
it's about as slow as my old 28kbps dial-up connection was. (There's no
problem when it comes to http or ftp downloads of large files - it's just
that in general surfing it's fairly slow.)


Spyware would provide those symptoms too...
 
ko

Eric said:
What's the point of this? Why not just
sub verbose_http {
    my $url = shift;

<snip rest of code>

-=Eric

Yes, could've used recursion:

sub verbose_http {
    my $url = shift;
    my $r = $ua->get($url);
    if ($r->is_redirect) {
        print $r->headers->as_string . "\n\n";
        my $redirect = $r->header('Location');
        verbose_http($redirect);
    } elsif ($r->is_success) {
        print $r->content . "\n";
    }
}

Just a personal preference not to :)

<offtopic>
If the OP uses Firefox, have a look here:
http://livehttpheaders.mozdev.org/

If I remember correctly, it was a ~60KB or so download. Very nice.
</offtopic>

keith
 
Anno Siegel

ko said:
Yes, could've used recursion:

sub verbose_http {
    my $url = shift;
    my $r = $ua->get($url);
    if ($r->is_redirect) {
        print $r->headers->as_string . "\n\n";
        my $redirect = $r->header('Location');
        verbose_http($redirect);
    } elsif ($r->is_success) {
        print $r->content . "\n";
    }
}

Just a personal preference not to :)

Huh? I don't get it. How does one force you to use recursion and
the other doesn't?

sub foo {
    my @urls = shift;
    while ( my $url = shift @urls ) {
        # do something
    }
}

sub bar {
    my $url = shift;
    # do something
}


These do the same thing, except that foo() is unnecessarily roundabout.
Minor differences may result from "# do something" being enclosed in a
block in foo(), but not in bar(), but how foo() would avoid recursion
needed in bar() is beyond me.

Anno
 
ko

Anno said:
ko said:
Eric Schwartz wrote:
[snip]

Huh? I don't get it. How does one force you to use recursion and
the other doesn't?

sub foo {
    my @urls = shift;
    while ( my $url = shift @urls ) {
        # do something
    }
}

sub bar {
    my $url = shift;
    # do something
}


These do the same thing, except that foo() is unnecessarily roundabout.

I understood that they accomplish the same thing. The roundabout part, I
obviously did not :)
Minor differences may result from "# do something" being enclosed in a
block in foo(), but not in bar(), but how foo() would avoid recursion
needed in bar() is beyond me.

Sorry, mistaken/sloppy terminology. I've always associated the word
recursion with a subroutine *explicitly* calling itself:

sub bar {
    # do stuff and sometime later..
    bar()
}

Rather than the sub's *depth*. For whatever reason when I started
programming it just didn't look right to have a sub call itself, and the
habit has stuck with me...

Thanks - keith
 
