
FTP Timeout option

Status
Not open for further replies.

howdthattaste

Programmer
Apr 12, 2007
17
US
Hello,

I have a script that runs through a list of filenames and get()s each one from the FTP site. The site is known for being unreliable at times.

I need a way to skip over a file that is taking too long (or that hit an error) and retry it later. I thought that when the timeout limit expired, the script would move on to the next command, but instead it seems to die.

I tried searching for more detailed information about the Timeout option, but could only find the basics:
"in seconds" and
"The Timeout option gives the number of seconds all operations wait before giving up."

What does "giving up" mean here? Does it die?

Here is the part of the code I use:

Code:
#---loop through files and 'get' each one
foreach my $file (@ftp_filenames)
{
	chomp($file);
	unless( $ftp->get($file) )
	{
		print "Error getting $file\n";
		push(@errors,$file);
	}
}

$ftp->quit();

I'm pushing the errant files into an array to retry later (I plan to retry only once), UNLESS... you guys think there's a better way to handle timeouts/errors.
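The retry-once pass I have in mind would be something like this (just a sketch):

Code:
#---retry each failed file once; report anything that fails twice
my @still_failed;
foreach my $file (@errors)
{
	unless( $ftp->get($file) )
	{
		push(@still_failed,$file);
	}
}
print "Gave up on: @still_failed\n" if @still_failed;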

thanks


"Not New York, Kansas
 
1: Where does your script die ?
2: AFAIK, a timeout is treated like an error, and you can interrogate $! to determine if the error is a timeout or not.
 
Unfortunately, it's hard to recreate the timeout. Like I said, it "seems" to die; I'm not really sure whether it actually dies or whether a timeout issues a disconnect or something.

So I got smart: I set my timeout to 1 second and set up my file list to fetch only the largest files (max 1.9 MB). That way it times out on the first file, sometimes. Here is the output, post-login (I've changed the dir/file names):
Code:
...
Net::FTP=GLOB(0x1a56fb0)>>> PASS ....
Net::FTP=GLOB(0x1a56fb0)<<< 230 User anonymous logged in.
Net::FTP=GLOB(0x1a56fb0)>>> CWD some_dir
Net::FTP=GLOB(0x1a56fb0)<<< 250 CWD command successful.
Net::FTP=GLOB(0x1a56fb0)>>> CWD some_sub_dir
Net::FTP=GLOB(0x1a56fb0)<<< 250 CWD command successful.
Net::FTP=GLOB(0x1a56fb0)>>> PASV
Net::FTP=GLOB(0x1a56fb0)<<< 227 Entering Passive Mode (12,24,113,42,143,8)
Net::FTP=GLOB(0x1a56fb0)>>> RETR file1.csv
Net::FTP=GLOB(0x1a56fb0): Timeout at ftp.pl line 42
Error getting file1.csv
Net::FTP=GLOB(0x1a56fb0)>>> PASV
Net::FTP=GLOB(0x1a56fb0)<<< 150 Data transfer starting.
Error getting file2.csv
Net::FTP=GLOB(0x1a56fb0)>>> PASV
Net::FTP=GLOB(0x1a56fb0)<<< 150 Data transfer starting.
Error getting file3.csv
Net::FTP=GLOB(0x1a56fb0)>>> PASV
Net::FTP: Unexpected EOF on command channel at ftp.pl line 42
Error getting file4.csv
Error getting file5.csv

So it actually does try to get the other files, but was unsuccessful. I'm not sure why; I didn't include $! in my print statement. Anyway, how do I handle $! for a timeout so the script moves on to the next file?

thanks again!


"Not New York, Kansas
 
Try something like....

Replace:
Code:
    unless( $ftp->get($file) )
    {
        print "Error getting $file\n";
        push(@errors,$file);
    }
With:
Code:
    my $error=0;
    $ftp->get($file) or $error=1;
    if($error) {
        my $errmsg = $!;
        if(substr($errmsg,0,7) == "Timeout") {
          print "Error getting $file\n";
          push(@errors,$file);
          $error = 0;
        } else {
          die $errmsg;
        }
    }

...which interrogates a copy of $! for "Timeout" if an error occurs. If it is a timeout, it should do your processing and clear the error flag. If not, it should die with your saved error message.
 
Thanks for the suggestion, but I don't yet see how this is much different from using unless(). Also, a few things:

Code:
if(substr($errmsg,0,7) == "Timeout")
"Timeout" isn't numeric, so that should be 'eq' instead, but I get the idea. (It's probably better to use 'if($errmsg =~ /Timeout/)' if I want to search the whole error message for the string.) Also, I don't see the need to reset $error=0 when it's already reset on each iteration, just prior to issuing the get($file), and I don't see what resetting it to zero accomplishes.
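To illustrate why == is wrong there: numeric comparison converts both strings to numbers (0 and 0 here), so the test is always true no matter what the message says:

Code:
# both non-numeric strings become 0 in numeric context,
# so this prints "equal" regardless of the message
print "equal\n" if "Anything" == "Timeout";   # always true

# string comparison is what was intended
print "equal\n" if substr("Timeout at ftp.pl line 42",0,7) eq "Timeout";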

So, what happens differently in your code from mine when an error with "Timeout" in it occurs? In my code, I log the filename in question in an array. In yours, you additionally reset a seemingly arbitrary variable, $error.

I ran it anyway, and again, halfway through the file list a timeout occurred and the script did not try to get() any more files.

(Again, it's hard to replicate a timeout; I will post my code and log as soon as I can capture a good one.)


----------------------------------
"Not New York..., Kansas
 
If you can manually block port 21 with a firewall or router, you should be able to simulate a command channel timeout. Blocking port 20 would cause a data channel timeout.

Also, try checking $ftp->message instead of $!

I know I've written something in the past to handle problems like this, but I can't find it at the moment.
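In the meantime, here's a sketch of what I mean by checking $ftp->message (it assumes the $ftp, $file, and @errors from your loop; the variable names are just placeholders):

Code:
unless( $ftp->get($file) )
{
	my $os_err  = $!;              # OS-level error, e.g. "Timeout"
	my $srv_msg = $ftp->message;   # last response from the FTP server
	print "Error getting $file: OS=[$os_err] server=[$srv_msg]\n";
	push(@errors,$file);
}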
 
Had a very similar problem in a .ksh script and used the following trick (taught to me by one of our Unix gurus):

Code:
 TDUR=900 ftp -n ...

As he explained it, TDUR sets the ftp timeout on our AIX 5.2 box to 900 seconds (15 minutes) for this job only; any other ftp connection goes back to the default.

Might be worth a shot to see if something similar can be used in Perl ...
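For what it's worth, the Perl-side equivalent would be the Timeout argument to the Net::FTP constructor, which likewise applies only to that one connection (the hostname here is a placeholder):

Code:
use Net::FTP;

# 900-second timeout for this connection only, matching the TDUR value
my $ftp = Net::FTP->new('ftp.example.com', Timeout => 900)
	or die "Cannot connect: $@";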

BTW, this works like a charm - my daily FTP job that failed due to file size (18 MB and 16 MB) is no longer an issue.

Good luck!

Tom

"My mind is like a steel whatchamacallit ...
 