Annoying Problem with a Basic Perl app and XP Pro

A

arek

New to perl, but not new to C++. I wanted a simple and efficeint app
to update some small pages to a remote website. I had used a old perl
app in Linux before called AutoFtp.pl, so I looked for it and did some
modifications. It was working fine on my Test workstations, (XP Home),
so I figured I had it licked and moved to the Main Server, (XP Pro).

Now I am getting nowhere, Literally. The application runs, but does
absolutely nothing....

heres the script: (A modified AutoFtp.pl)

<------------------------------------------------------->

# Automatically FTP files to the web site.
#
# The movfiles.txt file lists which files should be transfered.
# It compares the last modification time of each file to see if it has
been updated.
# Only updated files are transfered.
#
# Changes to the primary site only require their information files to
be updated.
# After successfully updating it sleeps until next transfer time.
#
# A logfile is kept of updates done.
# Only Extreme Failures will cause it to DIE.
#-----------------------------------------------------------------------------

use English;
use strict;
use Time::Local;
use Net::FTP;

my $mov_file; # List of Files to transfer
my $log_file; # Log file to store information
my $primary_info; # WebSite connection information

my $now_string; # Current Time for Log File
my $time_period; # Time Period to transfer files

my @ftp_commands; # List of commands to send to FTP.
my @raw_files; # contents of newfiles.txt
my @files; # Final list after removing Dupes to transfer

my $primary_web; # Web Site url
my $primary_directory; # Web Site directory to store in
my $primary_username; # Username to login
my $primary_password; # Password 'Duh'

my $file_time; # File's Last Modified time
my $system; # System time to check last Modification time

$log_file = "logFtp.txt";
$mov_file = "movfiles.txt";
$primary_info = "primary.inf";
$time_period = 1800; # Every 1/2 hr

#-------------------------------------------------------------
# Main body that calls the Transfer routine
# Sleeps for specified period
# Checks for Updated files to transfer
# ONLY a few DIE statements used for Extreme failures will kill it
# I have it SLEEP first as I don't need it to UPDATE on initial
startup.
#-------------------------------------------------------------
do {

sleep $time_period; # Sleep until next transfer time
main_transfer(); # Transfer Status files

} while (1); # Forever loop as XP Scheduler SUCKS!!

# ----------------------------------------------------------------
# Primary Transfer routine here
# Called by Do Loop once per PreSet Time to transfer updated files
#
# Paramters:
# None
# Return Value:
# None
#-----------------------------------------------------------------
sub main_transfer {
# Only have something to do if there is a list of new files.
if (-f "$mov_file")
{
my $result;

($primary_web, $primary_directory, $primary_username,
$primary_password) = parse_information_file $primary_info);

open FILES, "$mov_file" || die write_log("Unable to open file
$mov_file");
@raw_files = <FILES>;
@files = remove_duplicates(@raw_files);
close(FILES);

$result = put_files($primary_web, $primary_directory,
$primary_username, $primary_password, @files);
if( $result == 0 )
{
write_log("Primary Transfer Failed");
}
elsif ( $result == 1 )
{
write_log("File transfer to primary completed");
}
}
else
{
die write_log("Unable to find $mov_file");
}
}

#----------------------------------------------------------------------------
# Write a file via FTP using the specified user information.
# Parameters:
# hostname - name of the host that contains the file.
# directory - the directory that contains the file.
# username - log in name
# password - duh
# files - the name of the files to get.
#
# Return value:
# boolean - true if the FTP was successful, false if not.
#----------------------------------------------------------------------------
sub put_files {
my $hostname = shift @_;
my $directory = shift @_;
my $username = shift @_;
my $password = shift @_;
my @files = @_;
my $n_files;
my $file;
my $ret;
my $ftp;
my @transfers;
my $dotransfers;

$n_files = @files;
$dotransfers = 0;

clear_ftp();
if ($n_files > 0)
{
my $count;
$count = 0;
foreach $file (@files)
{
# The Job runs at a PreSet time period
$file_time = (stat($file))[9];
$system = time;
$system -= $file_time;

# Has the file been changed within the last time Period?
if ( $system < $time_period )
{
$transfers[$count] = $file;
$count++;
$dotransfers = 1;
}
}

if( !($dotransfers) )
{
# write_log("No updated files to transfer, exiting Ftp.");
$ret = 2;
return $ret;
}

# print "FTP to $hostname - ";
if( !($ftp = Net::FTP->new($hostname, Timeout => 30)) )
{
write_log("Can't connect to $hostname: $ERRNO");
return $ret;
}

if( !($ftp->login($username, $password)) )
{
write_log("Can't login with <$username> <$password>: $ERRNO");
return $ret;
}

if ($directory ne "")
{
if( !($ftp->cwd($directory)) )
{
write_log("Can't cwd to <$directory>: $ERRNO");
return $ret;
}
}

$ftp->type("I"); # binary mode

foreach $file (@transfers)
{
# file updated --> transfer
if( !($ftp->put($file)) )
{
write_log("Can't put $file: $ERRNO");
return $ret;
}
}

if( !($ftp->quit()) )
{
write_log("Couldn't quit FTP: $ERRNO");
}

$ret = 1;
}
return $ret;
} #put_files

#------------------------------------------------------------------------------
# Collect commands to send to FTP.
# Parameters:
# line - a new line to send
# Return value:
# none
#------------------------------------------------------------------------------
sub collect_ftp {
my $line = @_[0];
push @ftp_commands, $line;
} # collect_ftp


#------------------------------------------------------------------------------
# Clear out list of commands to send to FTP.
# Parameters:
# none
# Return value:
# none
#------------------------------------------------------------------------------
sub clear_ftp {
@ftp_commands = ();
} # clear_ftp


#------------------------------------------------------------------------------
# Send commands to FTP.
# Parameters:
# args - list of FTP commands
# Return value:
# none
#------------------------------------------------------------------------------
sub send_ftp {
my $line;
my $command_line;

$command_line = shift(@_);
if ( open(FTP, "$command_line") )
{
for $line (@_)
{
print FTP "$line\n";
}
print FTP "disconnect\n";
print FTP "bye\n";
close(FTP);
}
else
{
write_log("FTP Connection failed");
write_log($command_line);
return;
}
} # send_ftp


#----------------------------------------------------------------------------
# Scan a site information file and return the site, directory, username
and
# password entries.
#
# Parameters:
# file - name of the information file.
# Return value:
# list - site, directory, username, password.
#------------------------------------------------------------------------------
sub parse_information_file
{
my $file = $_[0];
my $site = "";
my $directory = "";
my $username = "";
my $password = "";
my $keyword;
my $value;

open INFO, "$file" || die write_log("Unable to open FTP site
information file $file\n");

while (<INFO>)
{
($keyword, $value) = split;
if ($keyword eq "site")
{
$site = $value;
}
elsif ($keyword eq "directory")
{
$directory = $value;
}
elsif ($keyword eq "username")
{
$username = $value;
}
elsif ($keyword eq "password")
{
$password = $value;
}
else
{
write_log("Unknown keyword in FTP site information file $file: ");
write_log($keyword);
die;
}
}

return ($site, $directory, $username, $password);
} # parse_information_file


#------------------------------------------------------------------------
# Remove duplicates from a list. A side-effect is that the return
values are
# sorted.
#
# Parameters:
# in_list - list which may have duplicate entries.
# Return value:
# out_list - in_list, sorted with duplicates removed.
#------------------------------------------------------------------------------
sub remove_duplicates {
my @unsorted_in_list = @_;
my @in_list;
my @out_list;
my $element;
my $last_element;

@in_list = sort @unsorted_in_list;

# Prime the pump.
$element = shift(@in_list);
chop $element;
@out_list = ($element);
$last_element = $element;
foreach $element (@in_list)
{
chop $element;
if ($element eq $last_element)
{
next;
}
$last_element = $element;
push(@out_list, $element);
}
return @out_list;
} #remove_duplicates

#------------------------------------------------------------------------------
# Write Information to LogFile
# Parameters:
# Info String
# Return value:
# none
#------------------------------------------------------------------------------
sub write_log {
open LOGFILE, ">>", "$log_file" || die "Unable to open file
$log_file";
my $log_data = $_[0];
$now_string = localtime;
print LOGFILE "$now_string : $log_data\n";
close LOGFILE;
} # write_log

<----------------------------------------------------->

It's very simple... It shouldn't be doing what it is doing, which is
absolutely nothing at all...

No errors on compile, no errors during running, NO Log writes either.

The machines I first tested it on were XP Home with the latest 5.++
The XP Pro box has the Same version installed from the same download.

No major applications except the Primary Server app running on the XP
Pro box.
The XP Pro box is whittled down to as few services as neccessary as the
primary application uses well over 300MB Ram and runs 24/7.

I've tried running the Perl App from cmd.exe by hand, by Shortcut with
the appropriate command line settings to start it.. all start up, but
nothing.

The Perl App is started in the SAME directory as the Files it accesses
so I don't need to
change directories, (It's a specialised application, so no sense having
do extra work).
I set the cmd.exe to the correct Dir as the Perl App when I have a XP
shortcut to start it.
I also tried manually starting it from cmd.exe after changing to the
correct Dir.

I suspect several possible causes:
File permissions? All files are created by the Same User running the
App.
File Times: This one I am not sure about, but have manually changed
the file so it's time is within update time.

Some days I really HATE MS. As far as I can figure this has to be an
issue with XP Pro. I am aware that it has some differences from XP
Home...

I've looked through the lists, but this issue hasn't shown up...

Any help would be appreciated..

Thx
 
A

A. Sinan Unur

[ lotsa lines of code snipped ]
No errors on compile, no errors during running, NO Log writes either.

D:\Home\asu1\UseNet\clpmisc> perl -c v.pl
Scalar value @_[0] better written as $_[0] at v.pl line 195.
syntax error at v.pl line 62, near "$primary_info)"
syntax error at v.pl line 245, near "and"
v.pl had compilation errors.
Some days I really HATE MS.

Whatever ...

Sinan
 
A

arek

Actually those were caused by Wrapping during post...
It DOES compile without errors if you correct the Wrapping in the Post.

As I said, "It Compiles without Errors AND works on the XP Home boxes"

So a more usefull reply would be appreciated...

as for the @_[0] better as $_[0], I was not sure and had NOT changed
that from
the Original Authors AutoFtp.pl code as it worked fine.

Not trying to be nasty, but those Compile Errors are not Code issues,
just failure to
look at the Code before just cutting and pasting it and running it.. I
am quite used to
watching out for Wrapping problems from Web Based copys...
 
A

arek

Also, it WILL fail if the following Files do NOT exist in the same Dir:

movfiles.txt
primary.inf

movfiles:
a simple txt list of files to find / check modification times /
transfer
Example: status.html (a one line list, file--> status.html
MUST exist)

primary.inf:
A list of Data for AutoFtpCode.pl to parse as Commands to FTP

Example: (Seperate each command/data by a space)
site www.somewhere.com
directory www
username someone
password something

without these it will not run correctly
just create them for an appropriate site, I used Cerebrus FTP and ran
it on the Same Machine
to do the basic testing before having it connect to the Actually Remote
site.
 
A

arek

Yes XP Pro CAN BE more strict with file access..

I setup the XP Pro box with NTFS and used Fat32 on the Home box.
Several reason for that, I'll leave to the "Pros" to figure out.
That in itself will say ALOT about file access...

And since I got so little help from the "Pros", I'll leave it to the
"Pros"
to figure out where the BUG was in how it ran on XP Pro and XP Home...

So to state it simply, I fixed it with the help from ONE person out of
ALL
the other Garbage posts in this thread, being the reminder to use
[ use warnings ] which got me on the right track.

The Troll
 
T

Tad McClellan

arek said:
Can't edit the post to correct it...


So long then!

As far as that goes, did ANYBODy even bother to Read what the Code was
doing?


No, because we could see that it was not the actual code.

Any body halfway decent at coding would have easily seen the need to
change that to a
shorter period to do testing...


Anybody halfway decent at coding...

If people spent more time helping rather then groaning about simple
typos and a


.... would know that a syntax error is not a "simple typo".

SMALL error in posted code...


How are we to know that the small error is not the cause of your problem?

If you were Really good at Coding, that SMALL error shouldn't have even
made you blink!
ONE missing [ ( ] and everyone's bitching...


You expect hundreds of people to fix the mangled code, AND then
debug it for you?

We are volunteers here, if you make it hard to help you, then we're
likely to just move on to helping someone who makes it easy to
be helped.
 
T

Tad McClellan

arek said:
If your [Exhausted] from reading less then 2 pages of code,
How the heck can you debug 10K lines worth of Code?


What you are willing to do is much different when volunteering
in comparison to working for pay.

I work with programs that contain more then 500 seperate documents


We are all impressed, I'm sure.

and your complaing about less then TWO pages Heavily Commented??


You are _working_.

We are _volunteering_.
 
T

Tad McClellan

arek said:
SOOOooo with all your gripes over simple errors in the post Nobody
bothered to look at where it actually
had a problem:


That's because the code was not Perl.

Review the angst and unhelpfulness evident in this thread.

Notice that none of it would have happened if you had posted
Real Perl code.

Maybe you should consider posting Real Code in the future to
avoid such silliness?
 
T

Tad McClellan

arek said:
Never used XP Home eh? then how can you effectivley Answer my
problem/question?


_Nobody_ can help with your problem, because nobody can
see the code that you are running.

So I am a troll eh?


Yes, that is patently obvious.
 
T

Tad McClellan

arek said:
being the reminder to use
[ use warnings ] which got me on the right track.


Have you seen the Posting Guidelines that are posted here frequently?

You could have been on the right track in seconds if you had...
 
J

Jonathan Stowe

arek said:
If your [Exhausted] from reading less then 2 pages of code,
How the heck can you debug 10K lines worth of Code?
I work with programs that contain more then 500 seperate documents
(each a C++ .h or .cc file)

My heart bleeds for you.
and your complaing about less then TWO pages Heavily Commented??

Yes but there is a difference between doing this for a job and doing it
as a favour to someone on a newsgroup.

/J\
 
A

arek

Umm... in what way did I NOT post what I was using?
I did my BEST to make sure I provided sufficient information to ask for
help
with what appeared to be an issue between to versions of Windows with
the same code.

What I got was a bunch of arguments about the Code posted being full of
errors.
What the posting contained was the Exact same code with 3 glitches that
occured
during the initial Posting.

With further browsing of the Thread, you will find several had actually
gotten it to compile,
which means the Code was functional. SO where did I NOT post the code?

Your response falls among the many others that would rather Argue and
NitPick then Help!

Any decent programmer would merely have chuckled at the MINOR glitches
in the Posted
code, fixed them and looked at where I might be having problems. Few
did. No problems
with pointing out Style concerns, but as that was ALL some posted,
those in my Opinion
were wasted bandwidth.

Further posts as yours above merely turn away possible usefull
participants in what should
be a good natured and civil attempt to help each other.
 
A

arek

A favor is exactly all I asked for, not groaning that the Code is
Crap.
If your not interested in helping someone with a problem, you shouldn't
even post.
I help loads of people in different forums. I also refrain from
posting Garbage.

I don't refrain from defending myself tho.

The above posts were directed at someone groaning about something they
didn't really seem to care to Help with, just point out that my Coding
of it was crap.

IF it was so Crappy, it wouldn't have worked in the first place. As
with ANY application
you start with it HEAVILY commented to help you remember WHAT you were
doing
Documentation of Code is a must, and if my documenting is Garbage as
Some have
said, I really would DREAD having to work on any Code they wrote.

Most people that post here for help are NOT going to be extreme coding
Guru's in the
section they are asking for help. Expecting 100% clean and perfect
code is Ridiculous!

Thus if you do nothing but point out those errors and do no actual help
with the code,
your actually defeating the purpose of such groups.
 
A

arek

Hmm... So the Perl Code I posted,
Which I had modified from another Authors Code that
has been heavily involved in Perl for years WAS NOT PERL CODE?

YOU make me Laugh!
 
P

Peter Wyzl

arek said:
One Note to other users of Perl:

XP Pro is far more strict in access to files and other things then XP
Home...
Be careful when writing apps and testing them on one then trying to run
them on another

What may run without problems on XP Home, may not necessarily run on XP
Pro.

The actual problem was in Accessing File Attributes in XP Pro...
XP Pro requires Opening the File Before attempting to read Attributes..

That is simply not true. If you believe it to be so, post a short (10 lines
or so) working program that demonstrates a requirement to open a file before
stat will work.

I use XP Pro with both fat32 and NTFS. I have never needed to open a file
to stat it.

Maybe you have some other problem?

P
 
T

Tad McClellan

arek said:
Hmm... So the Perl Code I posted,
Which I had modified from another Authors Code that
has been heavily involved in Perl for years WAS NOT PERL CODE?


If it has syntax errors then it is not Perl.

YOU make me Laugh!


I'm crushed, since I value your opinion so highly.
 

Ask a Question

Want to reply to this thread or ask your own question?

You'll need to choose a username for the site, which only take a couple of moments. After that, you can post your question and our members will help you out.

Ask a Question

Members online

Forum statistics

Threads
473,734
Messages
2,569,441
Members
44,832
Latest member
GlennSmall

Latest Threads

Top