It is becoming more and more difficult to script downloads from websites
using tools like wget
or LWP, because the sites don't have static content.
On many web sites, even after you fill in a field and the site 'returns'
the data (such as links to files, or even worse, links to pages with
links to files), the data is not 'visible' in the sense that you
cannot save the page source and still access the links
from a script. You must highlight each link, copy
its location by hand, and then write a script to do your downloading.
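To illustrate what I mean: link scraping only works when the links actually appear in the served markup. A minimal pure-Perl sketch (the HTML and URLs here are made up for the example):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical saved page source. With static content the links are
# right there in the markup; with script-generated pages they are not.
my $html = <<'HTML';
<html><body>
<a href="http://example.com/files/data1.zip">data 1</a>
<a href="http://example.com/files/data2.zip">data 2</a>
</body></html>
HTML

# Naive href extraction -- this finds the links only because they exist
# in the static source, which is exactly what dynamic sites break.
my @links = $html =~ /href="([^"]+)"/g;
print "$_\n" for @links;
```

Against a page that builds its links with JavaScript, @links would simply come back empty, since the saved source never contained them.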
I see that there are JavaScript tools for this that can be
integrated into the browser, such as Greasemonkey and Chickenfoot.
Are there any similar Perl tools?
thanks