Ted Byers
I am running ActiveState Perl 5.8.8 and 5.10.0 on Windows XP and on Windows
Server 2003.
In the script that is giving me trouble, I am using LWP::RobotUA,
LWP::UserAgent, HTTP::Request, HTTP::Request::Common, and
HTTP::Response. However, I have seen similar issues with
Finance::QuoteHist::Yahoo.
The problem manifests itself as the download terminating and the script
appearing to freeze. This only happens when the download is large (many
megabytes in size). I have a data feed that I can only access by POSTing to a
given URI, with query parameters specifying what data I am trying to retrieve.
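For reference, the request is built roughly like this (the URL and the query
parameters below are invented for illustration; the real ones come from the
feed provider):

use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request::Common qw(POST);

# URL and form fields are placeholders, not the real feed
my $ua  = LWP::UserAgent->new( timeout => 300 );
my $req = POST 'http://datafeed.example.com/query',
    [ series => 'XYZ', start => '2008-01-01', end => '2008-12-31' ];
my $res = $ua->request($req);
die 'Request failed: ', $res->status_line, "\n" unless $res->is_success;
my $data = $res->content;    # the entire response body is held in memory here

Nothing fancier than that: the whole response body ends up in memory via
$res->content.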
This never results in an error: the script just freezes and the download ends.
So far, I have been able to work around this by modifying my scripts to break
the download into smaller pieces, each handled in its own child process. With
the one script, each download is only a few kbytes in size, but there are over
9000 downloads of about the same size, and the total amount of data across all
downloads appears to be the key. Doing each of those 9000 downloads in its own
child process results in a happy, successfully completed job; doing them in a
loop in a single process results in unhappy failure.
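The workaround is essentially this (simplified; @jobs and fetch_one() are
stand-ins for the real list of ~9000 parameter sets and the real download
code):

use strict;
use warnings;

my @jobs = ();    # in the real script this holds the ~9000 parameter sets

foreach my $job (@jobs) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {
        fetch_one($job);    # placeholder: one small POST and save of the result
        exit 0;             # child exits, releasing everything it used
    }
    waitpid( $pid, 0 );     # parent runs one child at a time
}

sub fetch_one {
    my ($job) = @_;
    # the real POST request and file write go here
}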
Does anyone know what I can look at to make these download scripts
either finish successfully or give me a message saying there's too
much data for the script to handle? More importantly, what is the
most likely cause of this misbehaviour and how can it be fixed?
NB: The scripts I'm using work flawlessly when I use parameters that are
guaranteed to restrict the total amount of data handled by the script to a few
dozen kbytes, and that is with diligent checking for the problems I know
about.
Thanks
Ted