howdthattaste
Programmer
Hello,
I have a script that runs through a list of filenames and get()s each one from an FTP site. The site is known for being unreliable at times.
I need a way to skip over a file that is taking too long (or that hit an error) and retry it later. I thought that when the timeout limit expired the script would move on to the next command, but instead it seems to die.
I tried searching for more detailed information about the Timeout option, but could only find the basics:
"in seconds" and
"The Timeout option gives the number of seconds all operations wait before giving up."
What does "giving up" actually mean? Does it die()?
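For context, here's roughly how I'm opening the connection (the host, credentials, and Timeout value below are placeholders, not my real settings):
Code:
use Net::FTP;

# host/credentials/Timeout are placeholders
my $ftp = Net::FTP->new('ftp.example.com', Timeout => 30)
    or die "Cannot connect: $@";
$ftp->login('anonymous', 'me@example.com')
    or die "Login failed: " . $ftp->message;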
Here is the part of the code I use:
Code:
#--- loop through the files and get() each one
# (foreach replaces while(<@ftp_filenames>), which globs the
#  array's contents instead of iterating over it)
foreach my $file (@ftp_filenames)
{
    chomp($file);
    unless ( $ftp->get($file) )
    {
        # $ftp->message holds the server's last response
        print "Error getting $file: ", $ftp->message, "\n";
        push(@errors, $file);
    }
}
$ftp->quit();
I'm pushing the failed files onto an array to retry later (I plan to retry only once), UNLESS... you guys think there's a better way to handle timeouts/errors.
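Here's roughly what I had in mind for the retry pass (just a sketch: I'm assuming eval{} will catch whatever is making the script die, and that $ftp and @errors are still around from the first pass):
Code:
#--- one retry pass over the files that failed the first time
foreach my $file (@errors) {
    # eval{} keeps a die inside get() from killing the whole script
    my $ok = eval { $ftp->get($file) };
    if ( $@ or not $ok ) {
        # if the control connection itself timed out, a reconnect
        # may be needed here before retrying
        print "Retry failed for $file\n";
    }
}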
thanks
"Not New York, Kansas