Aarek
New to Perl, but not new to C++. I wanted a simple and efficient app
to update some small pages on a remote website. I had used an old Perl
app on Linux called AutoFtp.pl, so I dug it out and made some
modifications. It was working fine on my test workstations (XP Home),
so I figured I had it licked and moved it to the main server (XP Pro).
Now I am getting nowhere, literally: the application runs, but does
absolutely nothing.
Here's the script (a modified AutoFtp.pl):
<------------------------------------------------------->
# Automatically FTP files to the web site.
#
# The movfiles.txt file lists which files should be transfered.
# It compares the last modification time of each file to see if it has been updated.
# Only updated files are transfered.
#
# Changes to the primary site only require their information files to be updated.
# After successfully updating it sleeps until next transfer time.
#
# A logfile is kept of updates done.
# Only Extreme Failures will cause it to DIE.
#-----------------------------------------------------------------------------
use English;
use strict;
use Time::Local;
use Net::FTP;
my $mov_file; # List of Files to transfer
my $log_file; # Log file to store information
my $primary_info; # WebSite connection information
my $now_string; # Current Time for Log File
my $time_period; # Time Period to transfer files
my @ftp_commands; # List of commands to send to FTP.
my @raw_files; # contents of movfiles.txt
my @files; # Final list after removing Dupes to transfer
my $primary_web; # Web Site url
my $primary_directory; # Web Site directory to store in
my $primary_username; # Username to login
my $primary_password; # Password 'Duh'
my $file_time; # File's Last Modified time
my $system; # System time to check last Modification time
$log_file = "logFtp.txt";
$mov_file = "movfiles.txt";
$primary_info = "primary.inf";
$time_period = 1800; # Every 1/2 hr
#-------------------------------------------------------------
# Main body that calls the Transfer routine
# Sleeps for specified period
# Checks for Updated files to transfer
# ONLY a few DIE statements used for Extreme failures will kill it
# I have it SLEEP first as I don't need it to UPDATE on initial startup.
#-------------------------------------------------------------
do {
sleep $time_period; # Sleep until next transfer time
main_transfer(); # Transfer Status files
} while (1); # Forever loop as XP Scheduler SUCKS!!
# ----------------------------------------------------------------
# Primary Transfer routine here
# Called by Do Loop once per PreSet Time to transfer updated files
#
# Parameters:
# None
# Return Value:
# None
#-----------------------------------------------------------------
sub main_transfer {
# Only have something to do if there is a list of new files.
if (-f "$mov_file")
{
my $result;
($primary_web, $primary_directory, $primary_username,
$primary_password) = parse_information_file($primary_info);
open FILES, "$mov_file" or die write_log("Unable to open file $mov_file");
@raw_files = <FILES>;
@files = remove_duplicates(@raw_files);
close(FILES);
$result = put_files($primary_web, $primary_directory,
$primary_username, $primary_password, @files);
if( $result == 0 )
{
write_log("Primary Transfer Failed");
}
elsif ( $result == 1 )
{
write_log("File transfer to primary completed");
}
}
else
{
die write_log("Unable to find $mov_file");
}
}
#----------------------------------------------------------------------------
# Write a file via FTP using the specified user information.
# Parameters:
# hostname - name of the host to send the files to.
# directory - the remote directory to store the files in.
# username - login name
# password - duh
# files - the names of the files to put.
#
# Return value:
# 1 if the FTP was successful, 2 if no files needed transferring, false/undef on failure.
#----------------------------------------------------------------------------
sub put_files {
my $hostname = shift @_;
my $directory = shift @_;
my $username = shift @_;
my $password = shift @_;
my @files = @_;
my $n_files;
my $file;
my $ret;
my $ftp;
my @transfers;
my $dotransfers;
$n_files = @files;
$dotransfers = 0;
clear_ftp();
if ($n_files > 0)
{
my $count;
$count = 0;
foreach $file (@files)
{
# The Job runs at a PreSet time period
$file_time = (stat($file))[9];
$system = time;
$system -= $file_time;
# Has the file been changed within the last time Period?
if ( $system < $time_period )
{
$transfers[$count] = $file;
$count++;
$dotransfers = 1;
}
}
if( !($dotransfers) )
{
# write_log("No updated files to transfer, exiting Ftp.");
$ret = 2;
return $ret;
}
# print "FTP to $hostname - ";
if( !($ftp = Net::FTP->new($hostname, Timeout => 30)) )
{
write_log("Can't connect to $hostname: $ERRNO");
return $ret;
}
if( !($ftp->login($username, $password)) )
{
write_log("Can't login with <$username> <$password>: $ERRNO");
return $ret;
}
if ($directory ne "")
{
if( !($ftp->cwd($directory)) )
{
write_log("Can't cwd to <$directory>: $ERRNO");
return $ret;
}
}
$ftp->type("I"); # binary mode
foreach $file (@transfers)
{
# file updated --> transfer
if( !($ftp->put($file)) )
{
write_log("Can't put $file: $ERRNO");
return $ret;
}
}
if( !($ftp->quit()) )
{
write_log("Couldn't quit FTP: $ERRNO");
}
$ret = 1;
}
return $ret;
} #put_files
#------------------------------------------------------------------------------
# Collect commands to send to FTP.
# Parameters:
# line - a new line to send
# Return value:
# none
#------------------------------------------------------------------------------
sub collect_ftp {
my $line = $_[0];
push @ftp_commands, $line;
} # collect_ftp
#------------------------------------------------------------------------------
# Clear out list of commands to send to FTP.
# Parameters:
# none
# Return value:
# none
#------------------------------------------------------------------------------
sub clear_ftp {
@ftp_commands = ();
} # clear_ftp
#------------------------------------------------------------------------------
# Send commands to FTP.
# Parameters:
# args - list of FTP commands
# Return value:
# none
#------------------------------------------------------------------------------
sub send_ftp {
my $line;
my $command_line;
$command_line = shift(@_);
if ( open(FTP, "$command_line") )
{
for $line (@_)
{
print FTP "$line\n";
}
print FTP "disconnect\n";
print FTP "bye\n";
close(FTP);
}
else
{
write_log("FTP Connection failed");
write_log($command_line);
return;
}
} # send_ftp
#----------------------------------------------------------------------------
# Scan a site information file and return the site, directory, username and
# password entries.
#
# Parameters:
# file - name of the information file.
# Return value:
# list - site, directory, username, password.
#------------------------------------------------------------------------------
sub parse_information_file
{
my $file = $_[0];
my $site = "";
my $directory = "";
my $username = "";
my $password = "";
my $keyword;
my $value;
open INFO, "$file" or die write_log("Unable to open FTP site information file $file\n");
while (<INFO>)
{
($keyword, $value) = split;
if ($keyword eq "site")
{
$site = $value;
}
elsif ($keyword eq "directory")
{
$directory = $value;
}
elsif ($keyword eq "username")
{
$username = $value;
}
elsif ($keyword eq "password")
{
$password = $value;
}
else
{
write_log("Unknown keyword in FTP site information file $file: ");
write_log($keyword);
die;
}
}
return ($site, $directory, $username, $password);
} # parse_information_file
#------------------------------------------------------------------------
# Remove duplicates from a list. A side-effect is that the return values are
# sorted.
#
# Parameters:
# in_list - list which may have duplicate entries.
# Return value:
# out_list - in_list, sorted with duplicates removed.
#------------------------------------------------------------------------------
sub remove_duplicates {
my @unsorted_in_list = @_;
my @in_list;
my @out_list;
my $element;
my $last_element;
@in_list = sort @unsorted_in_list;
# Prime the pump.
$element = shift(@in_list);
chop $element;
@out_list = ($element);
$last_element = $element;
foreach $element (@in_list)
{
chop $element;
if ($element eq $last_element)
{
next;
}
$last_element = $element;
push(@out_list, $element);
}
return @out_list;
} #remove_duplicates
#------------------------------------------------------------------------------
# Write Information to LogFile
# Parameters:
# Info String
# Return value:
# none
#------------------------------------------------------------------------------
sub write_log {
open LOGFILE, ">>", "$log_file" or die "Unable to open file $log_file";
my $log_data = $_[0];
$now_string = localtime;
print LOGFILE "$now_string : $log_data\n";
close LOGFILE;
} # write_log
<----------------------------------------------------->
It's very simple... it shouldn't be doing what it is doing, which is
absolutely nothing at all.
No errors at compile time, no errors while running, and no log writes either.
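To rule the logging itself in or out, here is a minimal standalone sketch I can run on the XP Pro box. It uses the same logFtp.txt name as the script, but the three-argument open with "or die" and the Cwd check are my additions, not part of the original AutoFtp.pl:
<----------------------------------------------->
#!/usr/bin/perl
# Standalone sanity check: can this box append to logFtp.txt at all,
# and which directory does the file actually land in?
use strict;
use warnings;
use Cwd;

print "current directory: ", getcwd(), "\n";

open my $log, ">>", "logFtp.txt"
    or die "Unable to open logFtp.txt for append: $!\n";
print $log scalar(localtime), " : test entry\n";
close $log;

print "test entry appended - check logFtp.txt in the directory above\n";
<----------------------------------------------->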
The machines I first tested it on were XP Home with the latest Perl 5.x;
the XP Pro box has the same version installed from the same download.
No major applications run on the XP Pro box except the primary server
app, and that box is whittled down to as few services as necessary,
since the primary application uses well over 300 MB of RAM and runs 24/7.
I've tried running the Perl app from cmd.exe by hand and from a shortcut
with the appropriate command-line settings to start it. It starts up
either way, but nothing happens.
The Perl app is started in the same directory as the files it accesses,
so I don't need to change directories (it's a specialised application,
so there's no sense having it do extra work). When I start it from an XP
shortcut, I set cmd.exe to the same directory as the Perl app, and I've
also tried starting it manually from cmd.exe after changing to the
correct directory.
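One change I may try, just so the script stops caring about what directory it is launched from, is to chdir to the script's own directory at startup. This is only a sketch of a possible addition near the top of the script, after the existing "use" lines, not something that is in AutoFtp.pl today:
<----------------------------------------------->
# Possible addition after the existing "use" statements:
# make the working directory the directory the script itself lives in,
# so shortcut and cmd.exe starting directories no longer matter.
use FindBin qw($Bin);

chdir $Bin
    or die "Unable to chdir to script directory $Bin: $!\n";
<----------------------------------------------->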
I suspect several possible causes:
File permissions? All the files are created by the same user that runs
the app.
File times? This one I'm not sure about, but I have manually touched the
files so their modification times fall within the update window.
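To test the file-time theory, here is a quick sketch that reads the same movfiles.txt and reports how old each listed file is compared to the 1800-second window put_files uses. It mirrors the stat logic in the script, but the chomp, the readability check, and the output format are my own additions:
<----------------------------------------------->
#!/usr/bin/perl
# Which files listed in movfiles.txt would the script consider
# "updated" (modified within the last 1800 seconds)?
use strict;
use warnings;

my $time_period = 1800;                # same window as the main script

open my $list, "<", "movfiles.txt"
    or die "Unable to open movfiles.txt: $!\n";

while (my $file = <$list>) {
    chomp $file;
    next if $file eq "";
    if (!-e $file) {
        print "$file : MISSING\n";
        next;
    }
    my $age = time - (stat($file))[9]; # seconds since last modification
    printf "%s : readable=%s, modified %d seconds ago -> %s\n",
           $file,
           (-r $file ? "yes" : "NO"),
           $age,
           ($age < $time_period ? "would transfer" : "too old, skipped");
}
close $list;
<----------------------------------------------->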
Some days I really HATE MS. As far as I can figure, this has to be an
issue with XP Pro; I'm aware it has some differences from XP Home.
I've looked through the lists, but this issue hasn't shown up.
Any help would be appreciated.
Thx