Remote uploading with hops?


Calvine Chew

I've been trying to write a script that will allow me to simultaneously
update several websites in a company LAN environment, across multiple
divisions, just by updating an initial one.

Basically:

1) I upload an update (say a new dataset in a zip file) to website A.
2) The script on website A saves it to disk, unpacks it nicely, then sends
the zip off to website B before deleting it from disk.
3) Website B does the same and sends it off to website C.
4) This process is repeated until it reaches an end node.

Obviously the scripts on each site have been pre-programmed with the next
server to hop to, so the hopping logic is already done. I basically use LWP
to POST from HTML form to HTML form until the last form, which just saves
the file and unpacks it.
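
Roughly, each hop script looks like the sketch below (the URLs, form field
name and paths are just examples, not my real setup):

#!/usr/bin/perl
# Simplified hop script: receive the zip, save it, unpack it locally,
# forward it to the next hop, then clean up.
use strict;
use warnings;
use CGI;
use LWP::UserAgent;

my $q      = CGI->new;
my $upload = $q->upload('dataset') or die "no file uploaded\n";

# Save the incoming zip to disk.
my $local = '/tmp/update.zip';
open my $out, '>', $local or die "can't write $local: $!";
binmode $out;
binmode $upload;
my $buf;
print {$out} $buf while read $upload, $buf, 8192;
close $out;

# Unpack it for this site.
system('unzip', '-o', $local, '-d', '/var/www/data') == 0
    or warn "unzip failed: $?";

# Forward the same zip to the next hop's form, then delete it.
my $ua  = LWP::UserAgent->new;
my $res = $ua->post(
    'http://server-b/cgi-bin/receive.pl',      # next hop (example URL)
    Content_Type => 'form-data',
    Content      => [ dataset => [$local] ],
);
warn 'forward failed: ', $res->status_line unless $res->is_success;

unlink $local;
print $q->header('text/plain'), "ok\n";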

This works beautifully, but because I use LWP it only works on perl
installations where the LWP/HTML/HTTP modules have been make'd and
install'd properly (please correct me if I'm wrong on that). I can't seem
to get it to work on servers where I did not make and install the required
modules. Carp shows that it can't find the loadable object for
HTML::Parser in the available library paths even though I've already used
"use lib". In fact, the offenders seem to be HTML::Parser and
HTML::Entities.

I suspect this is because HTML::Parser (and I think Entities.pm too) uses
XS, so there are compiled binaries involved besides the .pm files. Does
anyone know how I should go about doing a proper local, non-make/install
installation of LWP/HTML/HTTP on the problematic server(s)? Or is there
another way to look at this problem (perhaps by directly interfacing with
each perl script without using LWP? I'm not sure how to do that, though...
socket/port connections are not available to me).
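
For reference, this is roughly what I'm doing on the problematic servers
at the moment (the path is just an example):

# I copied the .pm files over and pointed @INC at them with "use lib",
# but the compiled XS part is missing, so loading still fails:
use lib '/home/calvine/perllib';
use LWP::UserAgent;   # dies with "Can't locate loadable object for
                      # module HTML::Parser in @INC"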

Any comments or suggestions are appreciated!
 

J. Gleixner

Calvine said:
I've been trying to write a script that will allow me to simultaneously
update several websites in a company LAN environment, across multiple
divisions, just by updating an initial one.

Basically:

1) I upload an update (say a new dataset in a zip file) to website A.
2) The script on website A saves it to disk, unpacks it nicely, then sends
the zip off to website B before deleting it from disk.
3) Website B does the same and sends it off to website C.
4) This process is repeated until it reaches an end node.

Ummm.. why not update all servers from one main server? Or use
something like rsync, which is made for this kind of thing?
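
For example, something like this run from the main server (the hostnames
and paths are only placeholders, and it assumes you have ssh access to
each host):

#!/usr/bin/perl
# Push the unpacked data from one main server to every web server.
use strict;
use warnings;

my @servers = qw(web-a web-b web-c);
for my $host (@servers) {
    system('rsync', '-az', '--delete',
           '/var/www/data/', "$host:/var/www/data/") == 0
        or warn "rsync to $host failed: $?";
}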
 

Calvine Chew

Mmm, I can't use a primary-to-secondaries model because my main server
can't reach all my other servers... not directly, at least... hence the
need to hop...

Also, at each hop I pick up logs to attach to the file transfers, so by
the end of the long chain of hops the final node has picked up all the log
files.

I tried searching CPAN but couldn't find a pure-Perl user agent, which is
probably what I need (to perform the auto-POST).
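
Worst case, I suppose I could hand-roll the multipart POST over plain HTTP
with nothing but core modules (plain HTTP to the next server is the one
channel that does work for me). Something like the rough sketch below,
where the host, path and field name are made up, but I was hoping for
something cleaner:

#!/usr/bin/perl
# Hand-rolled multipart/form-data POST using only core modules.
use strict;
use warnings;
use IO::Socket::INET;

my ($host, $path, $field, $file) =
    ('server-b', '/cgi-bin/receive.pl', 'dataset', '/tmp/update.zip');

# Slurp the zip so we can set Content-Length.
open my $fh, '<', $file or die "can't read $file: $!";
binmode $fh;
my $data = do { local $/; <$fh> };
close $fh;

my $boundary = 'xYzZY' . time();
my $body = "--$boundary\r\n"
         . qq{Content-Disposition: form-data; name="$field"; filename="update.zip"\r\n}
         . "Content-Type: application/zip\r\n\r\n"
         . "$data\r\n"
         . "--$boundary--\r\n";

my $sock = IO::Socket::INET->new(PeerAddr => $host, PeerPort => 80)
    or die "can't connect to $host: $!";
binmode $sock;
print {$sock} "POST $path HTTP/1.0\r\n",
              "Host: $host\r\n",
              "Content-Type: multipart/form-data; boundary=$boundary\r\n",
              'Content-Length: ' . length($body) . "\r\n\r\n",
              $body;
my $response = do { local $/; <$sock> };
close $sock;
print $response;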
 
