Adlene
Hi there,
I wrote a program to download 500,000 HTML files from a website. I have compiled all the links in a file, and my grabber.pl will download all of them.
I have a fast internet connection, so I think it would be better to run multiple downloads at the same time, but $INET = new Win32::Internet() only allows one at a time. What can I do?
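(One approach that might work, sketched with assumptions: the link list lives in a file called links.txt, grabber.pl takes the chunk filename as its first argument, and each grabber creates its own Win32::Internet object. On Win32 Perl, system(1, ...) spawns a process without waiting for it to finish.)

```perl
#!/usr/bin/perl -w
# launcher.pl -- split the link list into N chunks and start one
# grabber per chunk. File names links.txt / chunk$i.txt / grabber.pl
# are assumptions; adjust to your setup.
use strict;

my $workers = 4;    # number of parallel grabbers to run

open my $in, '<', 'links.txt' or die "links.txt: $!";
my @urls = <$in>;
close $in;

for my $i (0 .. $workers - 1) {
    open my $out, '>', "chunk$i.txt" or die "chunk$i.txt: $!";
    # give every $workers-th URL to this worker
    print $out @urls[ grep { $_ % $workers == $i } 0 .. $#urls ];
    close $out;
}

for my $i (0 .. $workers - 1) {
    # on Win32, system(1, LIST) launches the command asynchronously
    system 1, qq(perl grabber.pl chunk$i.txt);
}
```

Each grabber.pl instance then has its own $INET = new Win32::Internet(), so the one-at-a-time limit applies per process, not overall.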
I also found that occasionally the grabber just hangs somewhere. In that situation I need to bypass $INET->FetchURL($url), write the offending URL to an error file, and continue on to the next iteration. How can I do that?
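(A minimal sketch of one way to do this, assuming Win32::Internet's ConnectTimeout/DataReceiveTimeout methods behave as documented and that FetchURL returns undef on failure; the timeout values and the errors.txt file name are guesses to tune.)

```perl
# inside grabber.pl; $INET already created, @urls read from the link file
# timeouts are in milliseconds -- values here are just starting points
$INET->ConnectTimeout(10_000);      # give up connecting after 10 s
$INET->DataReceiveTimeout(30_000);  # give up on a stalled transfer after 30 s

open my $err, '>>', 'errors.txt' or die "errors.txt: $!";
foreach my $url (@urls) {
    my $html = $INET->FetchURL($url);
    unless ( defined $html and length $html ) {
        my ( $errnum, $errtext ) = $INET->Error();
        print $err "$url\t$errnum\t$errtext\n";
        next;    # log the offending URL and move on
    }
    # ... save $html to disk here ...
}
close $err;
```

With the timeouts set, a dead server makes FetchURL fail instead of hanging forever, so the loop can log the URL and continue.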
Best Regards,
Adlene