
Write to Debug file globally


sen5241b (IS-IT--Management), Sep 27, 2007
I want to write info to my debug or 'dump' file from anywhere in the scope of my app. I have a settings.php, a main.php and a LibOfFuncs.php file. Is there a way to create a 'global' file handle I can write to?
 
Sure.
Code:
$GLOBALS['fh'] = fopen('myfile.txt', 'w');
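For instance, any function in LibOfFuncs.php could then write through it (a quick sketch; the function name is just illustrative):
Code:
function debugOut($msg){
    fwrite($GLOBALS['fh'], $msg."\r\n"); // uses the handle opened in settings.php
}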

But why not just do this from anywhere instead? Then you don't have to hold a file handle open.

Code:
file_put_contents('myfile.txt', $myData, FILE_APPEND);
 
If one of those files (settings.php or LibOfFuncs.php) is included by all the other files, then just create a function in there. Then any time you want to write to the log, just call the function.

Code:
function WriteLog($msg){        // named WriteLog: "Log" would collide with PHP's built-in log()
   $fp = fopen("log.txt","a+"); // open (or create) the log file for appending
   fwrite($fp, $msg."\r\n");
   fclose($fp);
}
Code:
Log("Page Loaded");
Code:
include "LibOfFuncs.php";

Log("someotherpage loaded");
The resulting log.txt:
Code:
Page Loaded
someotherpage loaded
Or you can make a class; I personally like classes for this type of thing because they are easy to move from one project to another. A minimal sketch follows.
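A minimal sketch of such a class (the names here are just illustrative):
Code:
class DebugLog {
    private $file;

    public function __construct($file = "log.txt"){
        $this->file = $file;
    }

    public function write($msg){
        // same open-append-close pattern as the WriteLog function above
        $fp = fopen($this->file, "a+");
        fwrite($fp, $msg."\r\n");
        fclose($fp);
    }
}

$log = new DebugLog();
$log->write("Page Loaded");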
 
Is one way faster than the other or are the differences negligible?
 
file_put_contents tends to be slightly faster than fopen + fwrite + fclose, mainly because it is a single call instead of three. In other respects the manual reports that calling file_put_contents is equivalent to calling fopen, fwrite and fclose in succession.

A global debugging/logging script is often useful, and its usefulness will outweigh the negligible parsing impact of having the extra class/function in scope.
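If you want to measure the difference on your own setup, here is a rough timing sketch (the file names and iteration count are arbitrary):
Code:
$n = 1000;

$t = microtime(true);
for ($i = 0; $i < $n; $i++){
    file_put_contents('a.txt', "line\r\n", FILE_APPEND);
}
echo "file_put_contents: ".(microtime(true) - $t)." sec\n";

$t = microtime(true);
for ($i = 0; $i < $n; $i++){
    $fp = fopen('b.txt', 'a+');
    fwrite($fp, "line\r\n");
    fclose($fp);
}
echo "fopen/fwrite/fclose: ".(microtime(true) - $t)." sec\n";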
 
This is what I do.

I use a class to handle all debugging.
When I'm developing, changing something, or hunting for a problem, I turn debugging on. But normally it is turned off.

$debug = true; or $debug = false;
in a header file.

This allows me to save processing time by turning it off.
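The check itself is just a guard at the top of the logging function; a quick sketch (it anticipates the buffer variable used in the full example below):
Code:
$debug = true;  // in the header file; set to false in production

function AddToLog($msg){
    global $debug, $_custom_log;
    if (!$debug) return;          // debugging off: near-zero cost
    $_custom_log .= $msg."\r\n";  // otherwise buffer the message for later output
}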

The second thing I do is not write to the debug file until the very last line of the script.
I use header and footer files so the last line in my footer.php is $debugger->Output();

Which again, if debugging is turned off, does nothing.
This way, even if debugging is turned on, the log file only gets opened once.

A simple example (I use classes so I can use it in all my programs, but this will work as well):
Code:
<?php
$_custom_log = "";
function AddToLog($msg){
  global $_custom_log;
  $_custom_log .= $msg."\r\n";
}
function SaveLog(){
  global $_custom_log;
  
  $contents  = "==============================\r\n";
  $contents .= date("Y-m-d h:i:s a")."\r\n";
  $contents .= $_custom_log;
  // update file
  for ($r=0;$r<=60;$r++){ // Try up to 60 times to get write access to the file
    if ($r > 0) sleep(1); // if NOT the first attempt, sleep 1 second
    $fp = @fopen("log.php","a+");
    if ($fp) break; // Got access, break loop
  }
  if ($fp){ // If access to the file was gained
    fputs($fp,$contents); // Inject contents
    fclose($fp); // Close file
  }
}
?>
Then later in your script you do this:
Code:
AddToLog("whatever you want");
AddToLog("however many times you want");
Code:
SaveLog();

Just a warning: this will create some large log files quickly, but if you build it right, it can be extremely helpful for debugging a website without current users noticing anything different.

Also something to note: I haven't used file_put_contents before, but if you use fopen the way I did, it will silently try 60 times to gain access to the log file. You need to do this because only one script/program can write to a file at a time; if you try to open the log file while another script is using it, the open will fail (unless you code it like shown above).

Also, if you have a bunch of different files, you might want to consider logging to different log files: main.php.log.txt, index.php.log.txt, or something like that. Your logs can get hard to read if you're not careful; a sketch of that idea follows.
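Something along these lines would do it (the naming scheme is just an example):
Code:
// derive a per-script log name, e.g. "main.php.log.txt"
$logfile = basename($_SERVER['SCRIPT_FILENAME']).".log.txt";
file_put_contents($logfile, "page loaded\r\n", FILE_APPEND);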

If you want to see an example using classes then let me know.
 
Very good info, ruggie, but I'm concerned the $contents var could get huge. How big can a var get?
 
There is no intrinsic limit on variable size in PHP. The constraints are imposed (artificially) by memory_limit in php.ini and, ultimately, by the amount of memory available on the machine.

As for ruggie's code, if your variables are very large, their memory requirements will roughly double as the code takes copies of the variables. If this is genuinely a concern, you can work around it with references (see the sketch below). However, if your debug output is genuinely so large as to be a concern, I find it difficult to see how the information would be helpful.
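For instance, a sketch of handing the buffer over by reference rather than copying it into a second variable (the function name is hypothetical):
Code:
// &$log is a reference, so no copy of the (possibly large) buffer is taken
function SaveLogByRef(&$log){
  file_put_contents("log.txt",
    "==============================\r\n".date("Y-m-d h:i:s a")."\r\n".$log,
    FILE_APPEND);
  $log = ""; // empty the buffer once it has been written out
}

SaveLogByRef($_custom_log);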

ruggie's point on getting a write lock is important. However, I don't think his code will actually give you a write lock; that depends on the OS. Conversely, you can force a write lock using file_put_contents:
Code:
file_put_contents($filename, $data, FILE_APPEND|LOCK_EX)
I'm not sure (I haven't checked the source) whether PHP will wait for a write lock or fail if it cannot obtain one. Just in case, borrowing from ruggie, you might try this:
Code:
while (!file_put_contents($filename, $data, FILE_APPEND|LOCK_EX)){
 continue;
}
 
I'm used to working on Windows servers, where the lock happens automatically (it might have been a setting I changed; I can't remember). One of my major scripts writes to a lot of database 'files'; not my first choice, but it was the only way to solve a few cross-network issues, since our data center blocks a ton of common ports for security.

I wouldn't want to use a while loop, however, because if the lock takes 5 minutes (because something extensive or unexpected is happening), your user will not get the page, or any part of it, until that code completes (depending on how, or whether, you're using buffering).

This is why I use a for loop with a limited count. Also, if the file is in use, there is no need to hammer the server trying to get access to it several times a second; that will only slow things down. This is why I use the sleep call.
If you want to force a lock, you can use flock() with the code I provided above; see the sketch below. The most it will ever slow down a page is 60 seconds, and you can adjust that just by lowering the number in the for loop.
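Put together, that might look like this sketch (same 60-attempt cap as above; LOCK_NB makes flock() fail fast instead of blocking):
Code:
$contents = "whatever you accumulated\r\n"; // illustrative payload
$fp = false;
for ($r = 0; $r <= 60; $r++){
  if ($r > 0) sleep(1);                             // wait a second between attempts
  $fp = @fopen("log.txt", "a+");
  if ($fp && flock($fp, LOCK_EX | LOCK_NB)) break;  // got the handle and the lock
  if ($fp){ fclose($fp); $fp = false; }             // handle but no lock: retry
}
if ($fp){
  fputs($fp, $contents);
  flock($fp, LOCK_UN); // release the lock
  fclose($fp);
}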

And don't worry about the size of $contents or the copy time; if you were transferring half a meg or something, that would be bad, but the most you will be transferring is a few lines of text.

Another way to save memory, if you're really picky, is:
Code:
function SaveLog(){
  global $_custom_log;
  
 // update file
  for ($r=0;$r<=60;$r++){ // Try up to 60 times to get write access to the file
    if ($r > 0) sleep(1); // if NOT the first attempt, sleep 1 second
    $fp = @fopen("log.php","a+");
    if ($fp) break; // Got access, break loop
  }
  if ($fp){ // If access to the file was gained
    fputs($fp,"==============================\r\n".date("Y-m-d h:i:s a")."\r\n".$_custom_log); // Inject contents
    fclose($fp); // Close file
  }
}

Check out a file generator I made. The variable $output expands to megs, and fwrite outputs the file very quickly; I ran the script on my local machine and it pumps out megs of data very fast. If you're using an older version of PHP, you may need to increase your memory_limit inside php.ini, but on 5.2.1 and up it's 128 MB.
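If you do hit the limit, it can also be raised per-script rather than in php.ini (the value here is just an example):
Code:
ini_set('memory_limit', '256M'); // for this script only
echo ini_get('memory_limit');    // confirm the new value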
 
The while loop would time out in accordance with your timeout settings in php.ini (which one assumes is what the site admin wants!).
 
Yes, it would time out, but if that happens, your page and any following code time out as well!
I don't know about others, but I would rather have the file write dropped and serve the user the rest of the page before the time limit kicks in.
 