chitownclone
Programmer
My program was just moved from a shared server to a dedicated one. The default timeout on the dedicated server is 3 minutes, so when my spider comes across a URL that is not active or doesn't return a response, my spider tries for exactly 3 minutes, when I need it to give up after 30 seconds.
Here's the code:
---------------------------------------------
#!/usr/bin/perl -Tw
require "/home/
&ReadParse(*in);
print &PrintHeader;
use LWP::Simple;
require LWP::RobotUA;
require LWP::UserAgent;
$ua = new LWP::UserAgent;
$ua->timeout([1]);
open (READ, "/home/
@url=<READ>;
close(READ);
foreach $url (@url){
# Process Code
}
---------------------------------------------
Anyone see what I am doing wrong, or have any suggestions for resetting my timeout from 3 minutes to 30 seconds?
Thanks
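For reference, `LWP::UserAgent`'s `timeout` method takes a plain number of seconds (the default is 180, i.e. the 3 minutes described above), so `$ua->timeout([1])` passes an array reference rather than a value. Also, `LWP::Simple`'s `get()` uses its own internal user agent, so a timeout set on `$ua` would not affect it. A minimal sketch of the fetch loop with a 30-second timeout, where `urls.txt` is a stand-in for the truncated path in the post:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Set the per-request inactivity timeout to 30 seconds.
# timeout() wants a plain number of seconds, not an array reference.
my $ua = LWP::UserAgent->new;
$ua->timeout(30);   # default is 180 seconds (3 minutes)

# "urls.txt" is hypothetical; substitute the real URL-list path.
open my $fh, '<', 'urls.txt' or die "cannot open url list: $!";
my @urls = <$fh>;
close $fh;

foreach my $url (@urls) {
    chomp $url;
    my $res = $ua->get($url);      # request made through $ua, so the
                                   # 30-second timeout applies here
    next unless $res->is_success;  # skip dead/unresponsive URLs
    # Process Code
}
```

With this, each dead URL should cost at most roughly 30 seconds instead of 3 minutes. Note that `use LWP::Simple` was dropped in the sketch: its `get()` would bypass `$ua` and its timeout entirely.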