Read limited size of HTTP content


I am trying to read a real-time stock quote from a regular (non-streaming)
HTML page over HTTP. The quote sits in one known part of the page, and I
only want to download that part, not the rest of the content. I am fairly
sure the HTTP protocol can do this, because download managers such as
Net Transport, NetAnts, and FlashGet can all resume broken downloads. I
looked at the HTTP headers those download managers send: basically they
tell the server that this is a partial download, give it the starting
point, and can optionally give it an ending point as well.
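From what I can tell (my own reading of those headers, not checked against a spec), the partial-download request just carries a Range header built from byte offsets, something like:

```perl
use strict;
use warnings;

# Build the kind of Range header the download managers send.
# Byte offsets are inclusive on both ends; the end offset may
# be left off to mean "to the end of the file".
my ($start, $end) = (2048, 4095);
my $closed_range = "bytes=$start-$end";   # a fixed window
my $open_range   = "bytes=$start-";       # from $start to EOF
print "$closed_range\n";   # prints "bytes=2048-4095"
print "$open_range\n";     # prints "bytes=2048-"
```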

I cannot find any Perl module that does this. I think the lowest-level
HTTP module is Net::HTTP, but I still could not find anything there that
meets my need. Here is the example code copied from the Net::HTTP
documentation:

use Net::HTTP;
my $s = Net::HTTP->new(Host => "www.perl.com") || die $@;
$s->write_request(GET => "/", 'User-Agent' => "Mozilla/5.0");
my ($code, $mess, %h) = $s->read_response_headers;
while (1) {
    my $buf;
    my $n = $s->read_entity_body($buf, 1024);
    die "read failed: $!" unless defined $n;
    last unless $n;
    print $buf;
}

It looks like $s->read_entity_body($buf, 1024) is close, but it does not
let me specify the starting and ending points. Another question: how do
I exit the while loop once the maximum size has been read? The example
uses die; will die quit the whole Perl program? Should I use exit
instead here?
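For the loop-exit part, here is a self-contained sketch of the size-limited read I have in mind, using an in-memory filehandle as a stand-in for the Net::HTTP socket so it runs on its own (my assumption is the real code would call read_entity_body the same way). The difference between the keywords: die aborts the whole script, while last only leaves the enclosing loop.

```perl
use strict;
use warnings;

# Stand-in for the response body: an in-memory filehandle.
my $body = "x" x 5000;
open(my $fh, '<', \$body) or die "open failed: $!";

my $maxsize = 2048;   # stop once this many bytes are read
my $total   = 0;
while (1) {
    my $buf;
    my $n = read($fh, $buf, 1024);           # read_entity_body in the real code
    die "read failed: $!" unless defined $n; # die aborts the whole script
    last unless $n;                          # last only exits the loop
    $total += $n;
    last if $total >= $maxsize;              # bail out at the size limit
}
print "read $total bytes\n";   # prints "read 2048 bytes"
```

Separately, the Net::HTTP documentation suggests write_request accepts extra header key/value pairs after the URI, so adding 'Range' => 'bytes=0-1023' there might be the way to set the starting and ending points, though I have not tried it.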