I have a particularly memory-intensive function, and I'd like to raise the allowed memory just while that function is running so it can complete.
Is it poor practice to use ini_set('memory_limit', '1024M') within a PHP function, and will the setting return to its default value once the function has completed?
I know it's a high value to use. It's a dedicated server and has plenty of juice.
Example would be:
function run_Cron_Processes() {
    ini_set('memory_limit', '1024M');
    memoryIntensive1();
    memoryIntensive2();
    memoryIntensive3();
    // return to default memory limit
}
$oldLimit = ini_get( 'memory_limit' );
ini_set( 'memory_limit', '1024M' );
(...)
ini_set( 'memory_limit', $oldLimit );
But I think it is unnecessary: at the end of script execution, the memory limit is reset to its default value.
The value will return to the default after your script finishes executing.
You can also set your limit to -1 in order to tell PHP that your script can use all the necessary memory.
I wouldn't call it poor practice; you do what you have to do. If you need more memory, that's the way to do it. You might also need to increase max_execution_time.
That being said, you might want to check your 'memoryIntensive' functions and optimize things there.
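If you do want to restore the previous limits afterwards, here is a minimal sketch reusing the function from the question (restoring is optional, since the limits reset when the script ends anyway):
function run_Cron_Processes() {
    $oldMemory = ini_get('memory_limit');
    $oldTime   = ini_get('max_execution_time');
    ini_set('memory_limit', '1024M');
    set_time_limit(0);                       // remove the time limit while the heavy work runs
    memoryIntensive1();
    memoryIntensive2();
    memoryIntensive3();
    ini_set('memory_limit', $oldMemory);     // restore the previous limits
    ini_set('max_execution_time', $oldTime);
}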
ini_set('memory_limit', '-1');
With this, the script will use as much memory as it requires and will not raise any memory-related error or notice.
I've encountered the dreaded error message: possibly after much painstaking effort, PHP has run out of memory:
Allowed memory size of #### bytes exhausted (tried to allocate #### bytes) in file.php on line 123
Increasing the limit
If you know what you're doing and want to increase the limit, see memory_limit:
ini_set('memory_limit', '16M');
ini_set('memory_limit', -1); // no limit
Beware! You may only be solving the symptom and not the problem!
Diagnosing the leak:
The error message points to a line within a loop that I believe to be leaking, or needlessly accumulating, memory. I've printed memory_get_usage() at the end of each iteration and can see the number slowly grow until it reaches the limit:
foreach ($users as $user) {
    $task = new Task;
    $task->run($user);
    unset($task); // Free the variable in an attempt to recover memory
    print memory_get_usage(true); // increases over time
}
For the purposes of this question, let's assume the worst spaghetti code imaginable is hiding in global scope somewhere in $user or Task.
What tools, PHP tricks, or debugging voodoo can help me find and fix the problem?
PHP before 5.3 doesn't have a cycle-collecting garbage collector; it relies on reference counting to manage memory. Thus, the most common sources of memory leaks are cyclic references and global variables. If you use a framework, you'll have a lot of code to trawl through to find the leak, I'm afraid. The simplest instrument is to selectively place calls to memory_get_usage and narrow down where the code leaks. You can also use xdebug to create a trace of the code: run it with execution traces and show_mem_delta enabled.
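For example, a rough sketch of that "selective memory_get_usage" approach, applied to the loop from the question (the checkpoint labels are arbitrary):
function mem_checkpoint($label) {
    static $last = 0;
    $now = memory_get_usage(true);
    printf("%-12s %10d bytes (%+d since last)\n", $label, $now, $now - $last);
    $last = $now;
}

foreach ($users as $user) {
    mem_checkpoint('loop start');
    $task = new Task;
    $task->run($user);
    mem_checkpoint('after run');
    unset($task);
    mem_checkpoint('after unset');
}
Whichever interval keeps growing from one iteration to the next is where to dig deeper.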
Here's a trick we've used to identify which scripts are using the most memory on our server.
Save the following snippet in a file at, e.g., /usr/local/lib/php/strangecode_log_memory_usage.inc.php:
<?php
function strangecode_log_memory_usage()
{
    $site = '' == getenv('SERVER_NAME') ? getenv('SCRIPT_FILENAME') : getenv('SERVER_NAME');
    $url = $_SERVER['PHP_SELF'];
    $current = memory_get_usage();
    $peak = memory_get_peak_usage();
    error_log("$site current: $current peak: $peak $url\n", 3, '/var/log/httpd/php_memory_log');
}
register_shutdown_function('strangecode_log_memory_usage');
Employ it by adding the following to httpd.conf:
php_admin_value auto_prepend_file /usr/local/lib/php/strangecode_log_memory_usage.inc.php
Then analyze the log file at /var/log/httpd/php_memory_log
You might need to touch /var/log/httpd/php_memory_log && chmod 666 /var/log/httpd/php_memory_log before your web user can write to the log file.
I noticed one time in an old script that PHP would keep the "as" variable in scope even after my foreach loop. For example,
foreach ($users as $user) {
    $user->doSomething();
}
var_dump($user); // would output the data from the last $user
I'm not sure whether later PHP versions have changed this since I saw it. If this is the case, you could unset($user) after the doSomething() line to clear it from memory. YMMV.
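In other words, something like this; whether it actually recovers memory depends on what else still references the object:
foreach ($users as $user) {
    $user->doSomething();
    unset($user); // drop the loop variable's reference as soon as we are done with it
}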
There are several possible sources of memory leaks in PHP:
PHP itself
a PHP extension
a PHP library you use
your own PHP code
It is quite hard to find and fix the first three without deep reverse engineering or knowledge of the PHP source code. For the last one, you can binary-search for the leaking code with memory_get_usage.
I recently ran into this problem on an application, under what I gather to be similar circumstances: a script that runs in PHP's CLI and loops over many iterations. My script depends on several underlying libraries. I suspect a particular library is the cause, and I spent several hours in vain trying to add appropriate destruct methods to its classes, to no avail. Faced with a lengthy conversion to a different library (which could turn out to have the same problems), I came up with a crude workaround for the problem in my case.
In my situation, on a Linux CLI, I was looping over a bunch of user records and, for each one, creating new instances of several classes I had written. I decided to try creating the new instances using PHP's exec function so that those processes would run in a "new thread". Here is a really basic sample of what I am referring to:
foreach ($ids as $id) {
    $lines = array();
    exec("php ./path/to/my/classes.php $id", $lines);
    foreach ($lines as $line) { echo $line . "\n"; } // display some output
}
Obviously this approach has limitations, and one needs to be aware of its dangers, since it would be easy to create a runaway "rabbit job" that keeps multiplying; however, in some rare cases it might help you get over a tough spot until a better fix can be found, as it did in my case.
I came across the same problem, and my solution was to replace foreach with a regular for loop. I'm not sure about the specifics, but it seems like foreach creates a copy of (or somehow a new reference to) the object; using a regular for loop, you access the item directly.
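A minimal sketch of that substitution, assuming $users is a plain indexed array (doSomething() is borrowed from the earlier example):
$count = count($users);
for ($i = 0; $i < $count; $i++) {
    $users[$i]->doSomething(); // access the element directly instead of through a foreach copy
}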
I would suggest you check the PHP manual or add the gc_enable() function to collect the garbage, so that memory leaks don't affect how your code runs.
PS: PHP has a garbage collector; gc_enable() takes no arguments.
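For example, a minimal sketch (PHP 5.3+, where the cycle collector exists); forcing a collection every iteration is slow, but it can confirm whether circular references are the culprit:
gc_enable();             // make sure the circular-reference collector is on
foreach ($users as $user) {
    $task = new Task;
    $task->run($user);
    unset($task);
    gc_collect_cycles(); // explicitly collect any reference cycles left behind
}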
I recently noticed that PHP 5.3 lambda functions leave extra memory in use even after they are removed.
for ($i = 0; $i < 1000; $i++)
{
    //$log = new Log;
    $log = function() { return new Log; };
    //unset($log);
}
I'm not sure why, but each lambda seems to keep an extra 250 bytes, even after the function is removed.
I didn't see it explicitly mentioned, but xdebug does a great job of profiling time and memory (as of 2.6). You can take the information it generates and pass it to the GUI front end of your choice: webgrind (time only), kcachegrind, qcachegrind, or others; it produces very useful call trees and graphs that let you find the sources of your various woes.
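For reference, a minimal php.ini sketch for Xdebug 2.6+ as I understand its settings (Xdebug 3 renamed these to xdebug.mode=profile and xdebug.output_dir):
; write a cachegrind.out.* file per request, to be opened in webgrind/kcachegrind/qcachegrind
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp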
If what you say about PHP only doing GC after a function returns is true, you could wrap the loop's contents inside a function as a workaround/experiment.
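Applied to the loop in the question, that experiment might look like this (run_task() is just a name I made up):
foreach ($users as $user) {
    run_task($user); // everything allocated inside goes out of scope on each iteration
}

function run_task($user) {
    $task = new Task;
    $task->run($user);
} // $task's refcount can drop to zero here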
One huge problem I had was with create_function. Like lambda functions, it leaves the generated temporary name in memory.
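A small sketch of the create_function() problem (note that create_function() is deprecated since PHP 7.2 and removed in PHP 8):
for ($i = 0; $i < 1000; $i++) {
    $f = create_function('', 'return 1;'); // registers a new "\0lambda_N" function on every call
    unset($f);                             // unsetting the variable does not remove that function
}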
Another cause of memory leaks (in the case of Zend Framework) is Zend_Db_Profiler.
Make sure it is disabled if you run scripts under Zend Framework.
For example, I had the following in my application.ini:
resources.db.profiler.enabled = true
resources.db.profiler.class = Zend_Db_Profiler_Firebug
Running approximately 25,000 queries, plus loads of processing before that, brought the memory to a nice 128 MB (my max memory limit).
Simply setting:
resources.db.profiler.enabled = false
was enough to keep memory usage under 20 MB.
This script was running in the CLI, but it was instantiating Zend_Application and running the Bootstrap, so it used the "development" config.
Running the script with Xdebug profiling really helped, too.
I'm a little late to this conversation but I'll share something pertinent to Zend Framework.
I had a memory leak problem after installing PHP 5.3.8 (using phpfarm) to work with a ZF app that was developed with PHP 5.2.9. I discovered that the memory leak was being triggered by Apache's httpd.conf file, in my virtual host definition, where it says SetEnv APPLICATION_ENV "development". After commenting this line out, the memory leaks stopped. I'm now trying to come up with an inline workaround in my PHP script (mainly by defining the value manually in the main index.php file).
I didn't see it mentioned here, but one thing that might be helpful is using xdebug and xdebug_debug_zval('variableName') to see the refcount.
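For example (requires xdebug; the exact output format varies by version):
$user = $users[0];
$copy = $user;
xdebug_debug_zval('user'); // prints something like: user: (refcount=2, is_ref=0)=...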
I can also provide an example of a PHP extension getting in the way: Zend Server's Z-Ray. If data collection is enabled, memory use will balloon on each iteration, just as if garbage collection were off.
Is there any way to make PHP wait until a function returns before continuing?
This is my code:
<?php
set_time_limit(0);

function waitforchange($nof) {
    $lfilemod = filemtime($nof);
    while (filemtime($nof) == $lfilemod) {
        clearstatcache();
        usleep(10000);
    }
}

waitforchange('./blahblah.txt')
sleep(5);
echo 'done';
?>
It is supposed to wait until blahblah.txt changes, wait another five seconds after that, and then print "done"; however, it prints "done" after five seconds regardless of whether the file actually changed.
PHP is single-threaded, which means that it executes one instruction at a time before moving on. In other words, PHP naturally waits for a function to finish executing before it continues with the next statement.
I tried executing your code on my machine, and it properly waited and waited until I modified the file before completing the function and displaying the message.
I can't see anything in your code that might be failing. Since the creation of $lfilemod is performed in the same way as the check in the while loop, the loop's condition would evaluate to TRUE and execute even if there was a problem with the file (filemtime would return FALSE in both places, so the condition would read FALSE == FALSE, which is obviously TRUE).
Does your PHP script modify the file at all before running that loop? If it does, then the initial value returned by filemtime might be the modification time from when the script originally started; when you call clearstatcache() in the loop, you'll pick up the new modification time caused by your changes earlier in the script.
My advice:
Try running clearstatcache() before setting the value of $lfilemod, so you know the value is clean and you're comparing apples to apples with what is being checked in the loop.
Make sure the file really isn't being modified. Try placing a couple of debugging lines at the start and end of your code that print out the file's last modification time, then compare the two yourself to see whether PHP reports a change in modification time.
This should go without saying, but make sure PHP is configured to display all errors during development, so you are shown immediately when and how things go wrong. Make sure the display_errors setting is turned On in your php.ini file (or use ini_set() if you can't modify the file itself), and that error_reporting() is set to E_ALL | E_STRICT for PHP <= 5.3, or E_ALL for PHP 5.4 (E_STRICT is part of E_ALL as of that version). A simpler way is to set your error reporting to -1, which effectively turns on all error reporting regardless of PHP version.
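That last point in code form, as a minimal sketch to put at the very top of the script while debugging:
error_reporting(-1);            // report everything, regardless of PHP version
ini_set('display_errors', '1'); // show errors on screen (development only)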
Try running your code again with these modifications. If you see that the file really is being modified, then you know that your code works. If the file isn't being modified, you should at least have an error that you can look up, or ask us here.
Try assigning the function call to a variable. I think it doesn't wait for the return otherwise.
The code looks good, save for a missing semicolon (;) after the waitforchange line. I tested putting the semicolon in and the script behaved as intended. Perhaps that is the culprit? I am at a loss, though, to explain how you got your code to execute at all with that error in there.
Say you have a large PHP project and suddenly, when attempting to run it, you just end up with a blank page. The script terminates and you want to find exactly where that is with as little effort as possible.
Is there a tool/program/command/IDE that can, on PHP script termination, tell you the location of a script exit?
Note: I can't mark my own post as "accepted answer" so look at the bottom to see my solution. If you come up with a better solution I will mark your post as the answer.
I use the following code and need no special debugging environment. Note that this might take really long; you can set the tick count higher, which makes it faster but less precise.
function shutdown_find_exit()
{
    var_dump($GLOBALS['dbg_stack']);
}
register_shutdown_function('shutdown_find_exit');

function write_dbg_stack()
{
    $GLOBALS['dbg_stack'] = debug_backtrace();
}
register_tick_function('write_dbg_stack');
declare(ticks=1);
With some inspiration from the non-working but still right-direction answer from RoBorg, I used the following code at the beginning:
function shutdown() {
    global $dbg_stack_a;
    print_r($dbg_stack_a);
}
register_shutdown_function('shutdown');
And then I made a global conditional breakpoint (global = the breakpoint is evaluated on every line), exploiting the fact that it can run code through eval(), with the following "condition":
eval('
    global $dbg_stack_a, $dbg_stack_b, $dbg_stack_c;
    $dbg_stack_a = $dbg_stack_b;
    $dbg_stack_b = $dbg_stack_c;
    $dbg_stack_c = debug_backtrace();
    return false;
')
It's probably not fast, but it does the trick! Using this, I was able to determine the exact file and line that called die(). (This example works in NuSphere.)
grep -n die filename
Don't forget to grep for "exit" too.
Add this to the top of the file:
function shutdown()
{
    print_r(debug_backtrace());
}
register_shutdown_function('shutdown');
See register_shutdown_function()
You can use an interactive debugger to step through the code until you reach the exit point. Other than that, I think you're down to grep'ing the code for exit|die.
xdebug has a nice trace feature that will let you see the entire trace of your PHP app's execution, and it should give you a clue as to where your exit is.
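If you want to try that, these are the Xdebug 2-era settings as I recall them (Xdebug 3 uses xdebug.mode=trace and xdebug.output_dir instead):
; trace every request into a human-readable file, including per-call memory deltas
xdebug.auto_trace = 1
xdebug.trace_output_dir = /tmp
xdebug.show_mem_delta = 1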
But for a quick and dirty solution, a grep/find as mentioned above will do nicely.
Also check the error logs for "memory_limit" errors in the Apache error_log.
Make sure memory_limit is set to 10M or larger in your php.ini file.
In my experience, scripts suddenly end without warning or notice when this happens.
Make sure that errors are displayed in your development environment (not production).