Why would it appear to my scripts that a server they're connecting to has gone away?


Ted Byers

Here is the interesting part of the log for my script that uses
Net::FTP:

Copying 44 files
Net::FTP=GLOB(0x3e18a04)>>> TYPE I
Net::FTP=GLOB(0x3e18a04)<<< 200 Type set to I
Status: 2

C_MerchantData_ALT_Work.2008-11-12.zip
Net::FTP=GLOB(0x3e18a04)>>> ALLO 5062889
Net::FTP=GLOB(0x3e18a04)<<< 202 No storage allocation neccessary.
Net::FTP=GLOB(0x3e18a04)>>> PORT 10,1,10,124,7,5
Net::FTP=GLOB(0x3e18a04)<<< 200 Port command successful
Net::FTP=GLOB(0x3e18a04)>>> STOR C_MerchantData_ALT_Work.2008-11-12.zip
Net::FTP=GLOB(0x3e18a04)<<< 150 Opening data channel for file transfer.
Net::FTP=GLOB(0x3e18a04): Timeout at C:/Perl/site/lib/Net/FTP/dataconn.pm line 74
Status: 1

C_MerchantData_ALT_Work.2008-11-13.zip
Net::FTP=GLOB(0x3e18a04)>>> ALLO 5299854
Net::FTP: Unexpected EOF on command channel at C:\backup\copy.backups.to.T.O.pl line 48
Status: 5


Qualitatively, it looks like only one zip archive is transferred (and
my colleagues in TO can open it and verify its contents); then the
ftp session seems to go away and there is no more interaction with the
ftp server.

Here is the interesting part of a log created by LWP::DebugFile.

First, the last bit pertaining to the download that occurred just
before the failed download:
LWP::protocol::http::request: ()
# Time now: {1228452248} = Thu Dec 4 23:44:08 2008
LWP::UserAgent::request: Simple response: OK

# LWP::DebugFile logging to lwp.log
# Time now: {1228452249} = Thu Dec 4 23:44:09 2008
LWP::UserAgent::new: ()
LWP::UserAgent::request: ()

Ignoring the URL requested (it doesn't matter, because if I select a
temporal interval involving only a few kilobytes of data, all works
fine), here is what follows:

LWP::protocol::http::request: ()
# Time now: {1228452869} = Thu Dec 4 23:54:29 2008
LWP::UserAgent::request: Simple response: Internal Server Error

# LWP::DebugFile logging to lwp.log
# Time now: {1228452870} = Thu Dec 4 23:54:30 2008
LWP::UserAgent::new: ()
LWP::UserAgent::request: ()

I find it interesting that simply using LWP::DebugFile is sufficient
to prevent the script from freezing or locking up. Instead, an error
like this is logged and the script proceeds to the next request. The
error shown above happened in the middle of the download, and in this
run it was the only download that failed.

Questions:

1) How do I get more information from LWP::DebugFile? Or is there any
more information to be had?
2) I'd also included the following statements:

use Log::Trace;
use Log::Trace 'print';
use Log::Trace log => 'master.generic.download.trace.log';
use Log::Trace warn => { Deep => 1, Everywhere => 1, Verbose => 2 };

But I did not get any log from Log::Trace. I must have missed
something, but what?

3) The information I'm getting from Net::FTP seems fine, except I have
yet to find the documentation for what the status codes (returned by
the status function) mean. (I found curl, but haven't had time to
compile and run it on the same files). Where will I find a more
complete description of these status codes?

4) Most importantly of all, why do these transfers fail and what can
be done about them?

Thanks

Ted
 

Tim Greer

Ted said:
Net::FTP=GLOB(0x3e18a04): Timeout at C:/Perl/site/lib/Net/FTP/dataconn.pm line 74

What does the error log say on the remote site? Did you run any tools
or commands to check the connectivity? How's the route from your ISP
connection to the remote host, especially when this happens? How about
the route back, if you can get that information? Did you test
uploading/downloading to the same site with a normal FTP client, or
with the ftp command on the command line, to verify that it works
without issue every time and only fails when you use the script? Or
are you only testing this with the script?
 

Ted Byers


Thanks Tim,

So far I have tested only with my script. This is a new problem
domain for me, as I had not written code to automate ftp until a week
ago. I am therefore groping in the dark, as it were. Getting the
basic stuff to work isn't so much a problem as making it reliable.

I don't have remote access to the error log on the remote site (but I
have directed my colleague in the office to scrutinize it).

What tools would you recommend to check connectivity between the
sites? Both are actually remote for me (one 100 km away and the other
much further), and both live behind a firewall, so outside access is
severely restricted; I'll have to run whatever tool is available from
the remote desktop, if that matters.

Thanks

Ted
 

Tim Greer


I'd recommend running some ping/traceroutes to begin with, when the
issue transpires, as well as trying a normal FTP program first to see
if it fails at all, or how often. The FTP service could be having
issues; it could have too many connections from the source IP, or any
number of other things. Ask the host's support staff to check their
logs to see what issues are reported on their end. If you determine
it's the script somehow, post the relevant portions of the code once
you've ruled out network/connectivity or service issues.
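
As a concrete illustration of the kind of check suggested here, below
is a minimal sketch that probes the FTP control port and traces the
route from Perl. The host name is a placeholder, and a TCP "ping" only
shows that port 21 answers; it says nothing about whether a transfer
will complete.

use strict;
use warnings;
use Net::Ping;

my $host = 'ftp.example.com';   # placeholder for the real FTP server

# TCP "ping" against the FTP control port; true if the port accepts a
# connection within the timeout (10 seconds here).
my $p = Net::Ping->new('tcp', 10);
$p->port_number(21);
print $p->ping($host) ? "port 21 reachable\n" : "port 21 NOT reachable\n";
$p->close;

# The paths in the log suggest Windows, so trace the route with tracert.
system('tracert', $host);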
 

Ted Byers



Thanks Tim. I appreciate this. As I am working on this with a
colleague who has physical access to the nearest machine, we'll be
able to check the logs Monday when he's back in the office. Like I
said, these machines are behind strict firewalls, so there is limited
connectivity to the outside world. For example, the ftp server is
configured to accept connections only from within the LAN behind the
firewall and from the machine running my script. Otherwise, this
machine has no contact with the outside world at all. I wonder if the
firewall or router may be an issue?

Thanks again

Ted
 

C.DeRykus

....
Net::FTP=GLOB(0x3e18a04): Timeout at C:/Perl/site/lib/Net/FTP/dataconn.pm line 74
Status: 1

....
Questions:

1) How do I get more information from LWP::DebugFile? Or is there any
more information to be had?
...
Net::FTP=GLOB(0x3e18a04): Timeout at C:/Perl/site/lib/Net/FTP/dataconn.pm line 74
....

In case you haven't tried it, you might raise the configurable timeout
(a Net::FTP constructor option):

Timeout - Set a timeout value (defaults to 120)
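
For illustration, here is a minimal sketch of passing a larger Timeout
(and enabling protocol debugging) when the Net::FTP object is
constructed. The host, credentials, and file name are placeholders,
not taken from the thread.

use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new('ftp.example.com',
                        Timeout => 600,   # seconds; the default is 120
                        Debug   => 1)
    or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;
$ftp->put('C_MerchantData_ALT_Work.2008-11-12.zip')
    or die "put failed: ", $ftp->message;
$ftp->quit;

A larger timeout only helps if the data connection is actually making
progress; if it is blocked outright, raising the value just delays the
error.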
 

Peter J. Holzer


FTP is nasty for firewalls. It opens a separate connection for each
file transfer, from and to essentially random ports, and in different
directions (in "active mode" the server connects back to the client;
in "passive mode" the server expects a connection from the client).
Your log file snippet suggests that you are using active mode and that
the server isn't able to connect back to your client. Try passive mode
instead. Are these transfers already being done manually? If so, check
the settings and behaviour of the ftp client. And ask the firewall
admin about the details for ftp access.

hp
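
To illustrate the passive-mode suggestion, here is a minimal sketch of
requesting passive mode in the Net::FTP constructor (the host is a
placeholder; the longer timeout mentioned earlier is kept):

use strict;
use warnings;
use Net::FTP;

# Passive => 1 makes the client open the data connection itself, which
# is usually friendlier to a firewall in front of the client than
# active (PORT) mode, where the server connects back in.
my $ftp = Net::FTP->new('ftp.example.com',
                        Passive => 1,
                        Timeout => 600,
                        Debug   => 1)
    or die "Cannot connect: $@";

Whether this helps depends on what the firewalls on each side allow,
so it is still worth checking with the firewall admin as suggested
above.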
 

Bill H


Ted, not sure if this may be your problem, but I had an issue with
Net::FTP earlier this year. I have a script that runs as a cron job
and uses ftp to transfer files from a client's web server to their
internal server. It had always worked great; then we upgraded the
server and it would never connect, just getting timeout warnings, even
though I could connect from my own machine perfectly fine. It took me
a day or so to determine that the upgraded server was not allowing the
outgoing ftp connection because the destination server was specified
as an IP address. After I set up a DNS entry for it and used a domain
name in the connection, it started working just fine again.

Bill H
 

Ted Byers


Thanks all.

We're on our way to a solution, but not quite there yet. My
colleague (actually the ftp/router/network administrator) is convinced
it is largely a timeout error. He retried my script with a pile of
really small files (a few KB each) and it worked flawlessly. The
files that we need to copy to our head office, though, are typically
a few megabytes in size, with one or two that are larger (e.g. a dump
of the current state of a DB managed using MySQL, which is 200 MB
before compression and about 15 MB after).

I guess, then, the new information here is that this problem exists
only for transfers of large files, and that all works correctly if we
process only tiny files.

But then, this is similar to the problem with our script that uses LWP
to retrieve data from our data feed: if the amount of data is small,
all is fine, but there is a server error in the client log when LWP is
asked to retrieve a few MB from the same server. (In the LWP case, we
don't have direct access to the server, so we're not sure whether the
problem is our script, LWP, or the other guy's server.)

Did I mention I am paranoid about backing up our data and code? ;-)

Does anyone else use Net::FTP or LWP's user agents to move around
megabytes of data, in a single file? If so, are there any tricks of
the trade you'd care to share?

Thanks

Ted
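
One trick that may help on the LWP side, sketched under the assumption
that the feed is plain HTTP, is to stream the response body straight
to disk and to raise the client timeout. The URL and file name below
are placeholders.

use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new(timeout => 600);   # the default is 180 seconds

# ':content_file' writes the response body directly to the named file
# instead of accumulating the whole download in memory.
my $resp = $ua->get('http://datafeed.example.com/export',
                    ':content_file' => 'feed-download.dat');
die "Download failed: ", $resp->status_line unless $resp->is_success;

This only addresses client-side timeouts and memory use; a genuine 500
Internal Server Error still has to be chased on the server side.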
 

Tim Greer

Ted said:
Does anyone else use Net::FTP or LWP's user agents to move around
megabytes of data, in a single file?  If so, are there any tricks of
the trade you'd care to share?

Even if the file is large, as long as the transfer is active, you
shouldn't be having problems with a timeout from inactivity. How is
your connection? It's been a while since I've used Net::FTP for
anything, and for backups I'd just suggest you create an account with a
jailed (chrooted) shell and use sftp, rsync or similar for secure
transmission.
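
As an illustration of that suggestion, a backup push could be handed
to rsync over ssh instead of FTP. Below is a minimal sketch invoked
from Perl, assuming an rsync client is installed on the Windows
machine; the local directory, account, host, and remote path are
placeholders (and the local path form depends on the rsync build,
e.g. cwRsync expects /cygdrive/c/... style paths).

use strict;
use warnings;

my @cmd = ('rsync', '-avz', '--partial', '--timeout=600',
           '-e', 'ssh',
           'C:/backup/',                              # local backup directory
           'backup@backuphost.example.com:/srv/backups/');
system(@cmd) == 0 or die "rsync failed with exit status ", $? >> 8;

rsync's --partial keeps interrupted transfers so a retry can pick up
where it left off, which matters for multi-megabyte files over a flaky
link.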
 
