Scripting Web 2.0 sites to download: howto

shul

It is getting more and more difficult to script downloads from websites
using tools like wget or LWP, because the sites no longer serve static
content.

On many web sites, even after you fill in a field and the site 'returns'
the data (such as links to files, or worse, links to pages with links
to files), the data is not 'visible' in the sense that you cannot save
the page source and find the links in it to script against. You must
highlight each link, copy its location by hand, and only then write a
script to do the downloading.

I see that there are JavaScript tools that can deal with this and be
integrated into the browser, such as Greasemonkey and Chickenfoot.

Are there any similar Perl tools?

Thanks.

C.DeRykus


Perl supports a Selenium client interface:

http://search.cpan.org/~lukec/Test-WWW-Selenium-1.23/lib/WWW/Selenium.pm
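
As a quick illustration, here is a minimal sketch of driving a real browser from Perl with WWW::Selenium and pulling links out of the rendered page. It assumes a Selenium server is already running on localhost:4444; the site URL and the 'q'/'submit' locators are placeholders for whatever site you are scraping, and HTML::LinkExtor does the link extraction:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use WWW::Selenium;
  use HTML::LinkExtor;

  # Drive a real browser so JavaScript-generated content gets rendered.
  # host/port assume a locally running Selenium server; the URL and the
  # 'q'/'submit' locators are placeholders for the site you are scraping.
  my $sel = WWW::Selenium->new(
      host        => "localhost",
      port        => 4444,
      browser     => "*firefox",
      browser_url => "http://www.example.com",
  );

  $sel->start;
  $sel->open("http://www.example.com/search");
  $sel->type("q", "my search terms");   # fill in the form field
  $sel->click("submit");                # submit the form
  $sel->wait_for_page_to_load(30000);   # wait up to 30s for the results

  # get_html_source returns the rendered DOM, including links added by
  # JavaScript that never appear in the raw page source.
  my $html = $sel->get_html_source;
  $sel->stop;

  # Extract the href of every <a> tag from the rendered HTML.
  my $extor = HTML::LinkExtor->new;
  $extor->parse($html);
  for my $link ($extor->links) {
      my ($tag, %attrs) = @$link;
      print "$attrs{href}\n" if $tag eq 'a' && $attrs{href};
  }

The extracted URLs can then be handed to wget or LWP for the actual downloads.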
 
