
PHP Timeout


tweenerz (Programmer)
I am having trouble with a script timing out at about 60 seconds every time.

The script performs a few cURL operations that hit various APIs and do screen-grabs, and it is in the middle of those operations at the moment it times out.

I have a progress-checking mechanism that allows me to check on the script while it runs, and I know it is definitely not done doing what it needs to do; it is simply a time-intensive operation. I am certain it is not "hanging" on any particular operation, since it tells me what page it is on (page 1 of x, etc.). 60 seconds doesn't seem like a long time to allow for execution; it feels like a default setting somewhere that I cannot override.
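(The mechanism is just a db write per step; something like this, with made-up table and column names:)

Code:
// hypothetical progress logger; table/column names are illustrative
function log_progress($db, $onPage, $totalPages) {
    $sql = sprintf("UPDATE scrape_progress SET on_page = %d, total_pages = %d",
                   $onPage, $totalPages);
    mysql_query($sql, $db);
}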

Here is what I have tried and currently have in the script:

Code:
set_time_limit(0);
ini_set('display_errors',1);
ini_set('max_execution_time', 300);
ini_set('default_socket_timeout', 300);
error_reporting(E_ALL);

There are no warnings or errors, just one basic notice that gets spit out once it times out. There is nothing else in the output.

It simply times out at 60 seconds every time.

Any help is greatly appreciated.
 
You'll most likely have to change a line or two in your php.ini file. Here is what I found in mine:
Code:
max_execution_time = 30     ; Maximum execution time of each script, in seconds
max_input_time = 60     ; Maximum amount of time each script may spend parsing request data

 
max_execution_time in php.ini can be controlled at runtime through set_time_limit(). it can also be changed via ini_set(), but the two do the same thing.
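for instance, these two lines are interchangeable:
Code:
// two equivalent ways to allow 300 seconds of execution time
set_time_limit(300);
ini_set('max_execution_time', 300);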

i suspect something is happening at the webserver end, or something is going wrong in your cURL code. can you share that with us?
 
set_time_limit() has no effect if running in safe mode. Can you verify if you are running in safe mode?
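A quick way to check from within the script itself:
Code:
// an empty string or "0" here means safe mode is off
var_dump(ini_get('safe_mode'));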

 
jaxtell,

I already set max_execution_time to 300 seconds, and I included max_input_time at 300 too; it didn't help. safe_mode is set to OFF in the phpinfo() screen.

jpadie,

The curl operations do a series of API requests and screen scrapes. As I alluded to before, I am recording the progress of the script in the db as it runs and checking it manually while it is processing. Everything is working as it should, and it never gets hung up in any particular spot. It simply needs more time to execute, yet it ceases at 60 seconds every time and gives me a blank screen (even with errors turned on and enabled, so it's not a fatal error issue; in fact, I did get one NOTICE).

It is also worth noting that this is an internal script, not a public one. So I am not worried about losing visitors or anything. This script actually saves about an hour or more of manual work, so relatively speaking it's pretty fast. I just need to find a way to keep it from halting after 60 seconds.
 
that's why i wanted to take a look at it. there are some curl options that you might want to set - take a look at CURLOPT_TIMEOUT.

are you getting any cURL errors on timeout? and is your script handling failed connection attempts gracefully?
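something like this after the curl_exec() call would surface them:
Code:
$result = curl_exec($ch);
if ($result === false) {
    // errno 28 is CURLE_OPERATION_TIMEDOUT
    echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch);
}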
 
Ok. Here is some of the code for the curl. There is a lot of information I don't want to display for proprietary reasons, so I'll do my best.

Code:
	$url = "[URL unfurl="true"]http://blahblahblah.com";[/URL]
	$ch = curl_init();
	curl_setopt($ch, CURLOPT_HEADER, 0);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	curl_setopt($ch, CURLOPT_TIMEOUT, 300);
	curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 300);
	$onPage = 1;
	$myArr = array();
	while($onPage <= $totalPages || empty($totalPages)){
          curl_setopt($ch, CURLOPT_URL, $url);
          $result = curl_exec($ch);
          $mainArr = explode("\n",$result);
          foreach ($mainArr as $i => $line){
            // do some regex and string manipulation here
            // also log progress in the db here as it runs
            // no problems with db functionality.
            // $totalPages is set here as well and verified in the progress
          }
          $onPage++;
          // $url is rebuilt here for the next page (omitted)
	}
	curl_close($ch);
	ksort($myArr);

Question: if it is something wrong with the code, why would it time out at seemingly exactly 60 seconds each time, and at different points in the process?

And to answer your question about errors, I set error_reporting to E_ALL and all I get is one NOTICE for an undeclared variable. No other errors. Just a blank page.
 
i wasn't implying that there was something syntactically wrong with your code. i assumed you were calling multiple pages and the one you hit after sixty seconds was being mishandled by cURL.

and i'm not referring to php errors but cURL errors. but if you are getting a php timeout then i'm not entirely sure how you would capture the curl error anyway.

i cannot see anything obviously incorrect in your code, other than that the if statement has its conditions the wrong way around (hence the notice you are receiving). i assume that $url changes each iteration? so the next step of debugging is to analyse what is going on inside the foreach loop.

one other thought. instead of setting the time limit to 0, why not set it to something meaningful in the loop itself, say 10 seconds for each iteration.

i would also suggest testing the value of $result for FALSE to capture any errors in curl_exec(). perhaps that would be a good time to examine curl_error() too.
 
oh. a thought. no one has asked you what OS, webserver and php version (number and flavour: CGI/CLI/SAPI) you are running. this could be critical! IIS, for example, has a bunch of timeout options that override php (for CGI/FastCGI etc). but then you would be getting IIS errors and not PHP timeout errors. or is that what you are getting?
 
Thanks jpadie,

I am using a LAMP system of some sort. Not sure what exact flavor of Unix it is (Linux, I think).

I am actually not getting any errors at all, just a blank page.

When you said wrong order for the if statement, were you referring to the while statement? Because there is no if statement. And that's not the notice I am getting; it's a simple undeclared-variable notice.

The $url variable does change with each iteration since I construct it dynamically for the next page in the web site.

I put in a piece of code that exits the loop completely after 55 seconds to avoid the timeout. It works every time now, but I am obviously missing out on some pages. With that guard in place, I sometimes get as many as 20 pages back, sometimes as few as 7. It all depends on how responsive the website ($url) is at the time I run the script.
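The guard is roughly this (a sketch, not my exact code):
Code:
$start = time();
while($onPage <= $totalPages || empty($totalPages)){
    if (time() - $start > 55) {
        break; // bail out before whatever kills the script at 60 seconds does
    }
    // ... curl fetch and parsing as before ...
}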

Regarding debugging what is going on in the foreach loop: I am storing the current step in the database and reviewing it manually while the script is running, and nothing seems to be wrong; no infinite looping, no hanging, etc.

I tried testing the $result variable for false, and it was never false, and there were never any cURL errors.

This one really has me stumped; I can usually figure this stuff out after a while.
 
i meant the while, yes. and you would get a notice because the conditions are in the wrong order, as $totalPages may not be instantiated at the first test.
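i.e. test empty() first, so the comparison is short-circuited until $totalPages exists:
Code:
// empty() raises no notice on an undefined variable, so put it first
while(empty($totalPages) || $onPage <= $totalPages){
    // ...
}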

can you let us know what type of LAMP setup you are using: which version of apache, which version of php, and what type of php connector you are using.

i would also try placing all the cURL code into the loop and expressly closing it at the end of each cycle.
 
oh. and can you cut and paste the full error message you receive. or take a screenshot or something?
 
jpadie,

OK. I decided to simplify things quite a bit in an attempt to isolate the problem. I created a page with the following, including all the configurations I used.

I am not getting any error messages, just a blank screen. Nothing in the error logs either. The fact that the NOTICE showed up earlier tells me I would be getting errors if there were any.

I would give you a screenshot, but I assume you don't really want a screenshot of a blank page.

Code:
<?php
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors',1);
ini_set('max_execution_time', 300);
ini_set('max_input_time', 300);
ini_set('default_socket_timeout', 300);
sleep(65);
print 'made it here';
exit;
?>

When I run the script with the code above (nothing else in the script), I get a blank screen after 60 seconds. I don't see 'made it here'. So there is definitely a setting I am missing somewhere; the code can't get much simpler than that. When I change it to sleep(59);, I get the message.

Environment
PHP Version 5.2.5
Apache/1.3.
Linux (not sure what version)
 
in apache 1.x there is a Timeout directive in the conf file. you might want to check it.
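it looks like this in httpd.conf (300 is just an example value):
Code:
# maximum time apache will wait on a request, in seconds
Timeout 300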

could you try this code as it should avoid the above issue too:
Code:
<?php
ini_set('default_socket_timeout', 69);
ini_set('display_errors', true);

$url = "[URL unfurl="true"]http://blahblahblah.com";[/URL]
$onPage = 1;
$myArr = array();
error_reporting(E_ALL);
echo "<ul>";
while(empty($totalPages) || $onPage <= $totalPages){
	set_time_limit(70);
	
	//start cURL
	$ch = curl_init();
	
	//set cURL options
	curl_setopt($ch, CURLOPT_HEADER, 0);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	curl_setopt($ch, CURLOPT_TIMEOUT, 300);
	curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 300);
	curl_setopt($ch, CURLOPT_URL, $url);
	
	//get the page
	$result = curl_exec($ch);
	echo "<li>Processing page $onPage.&nbsp;";
	
	if(!$result){
		echo "Error returned: " . curl_error($ch);
	} else {
		echo "No Errors returned";
	}
	
	//attribute to $mainArr
	$mainArr[] = explode("\n",$result);
	$onPage++;
	curl_close($ch);
	
	//change the $url and set $totalPages as needed
	
}
//close the output list
echo "</ul>"; 

//check the output
set_time_limit(0);
foreach ($mainArr as $page){
	echo "<pre>";
	echo htmlspecialchars(implode("\n", $page)); // each $page is an array of lines
	echo "</pre>";
	echo "<hr/>";
}
?>
 
OK. I think I have found the problem. I don't know exactly what part of the process is actually timing out, but I think it has to do with the fact that there is no output (even though I ran the flush() command).

So if there hasn't been any output in 60 seconds, something in the chain apparently decides there will never be any output and simply stops the script.

On that note, one thing I have never been able to figure out is how and when the flush() command actually works. Sometimes when I use it, it works; other times it seems to be ignored.

In any case, I am gonna do some more testing and will report back.
 
I'm not. I just tried to call the flush() command (not ob_flush()) to manually/explicitly flush the content. But it doesn't work.

I don't know why it is not printing the output as it goes.

Buffering can happen automatically at every conceivable level: php, apache, server, and browser.

So I just need to figure out how to do this so that it spits the content out to the browser periodically instead of waiting until the end.
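From what I can tell, the usual combination is something like this, though none of it is guaranteed since apache or the browser can still buffer:
Code:
while (ob_get_level() > 0) {
    ob_end_flush();      // close any open php output buffers
}
ob_implicit_flush(true); // flush automatically after every echo/print
echo str_pad('', 1024);  // some browsers wait for ~1KB before rendering
flush();                 // push php's buffer out to the web server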

In fact, when I changed the snippet of code above to have several sleep(25) statements and printed something out in between:
Code:
sleep(25);
print 'step 1';
sleep(25);
print 'step 2';
sleep(25);
print 'step 3';
sleep(25);
print 'step 4';
sleep(25);
print 'step 5';
sleep(25);
print 'step 6';
sleep(25);
print 'made it here';
exit;

it printed out a statement every 25 seconds and finished, even though it took longer than 60 seconds. So something doesn't like seeing no output for 60 seconds; it just shuts off.
 
Assuming the issue isn't with php, can you ssh into your server and run php directly? That should avoid the problem.
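Something like this (the path is just an example):
Code:
# the CLI binary defaults max_execution_time to 0 (unlimited)
php /path/to/script.php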

 