Symfony task unexpected memory usage - PHP

I am using a Symfony task with PHP 5.2. Here is a portion of my code:
// array of images
foreach ($array as $k => $v) {
    abc(); // function call which copies images from one server to another
           // by reading them into a PHP variable with file_get_contents and using an API (WSO).
    echo memory_get_usage();
}
The problem is that memory_get_usage() always returns the same value, but when I use the top command the memory usage increases nonstop.
Is this a bug in the Symfony task, PHP 5.2, or WSO?

Have you tried memory_get_usage(true)?
Also, have you considered that it might not be PHP that is using the memory, but some other library you are using?
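For what it's worth, here is a minimal sketch of logging both readings side by side, reusing the loop from the question: memory_get_usage(false) reports what PHP's own allocator has handed out, while memory_get_usage(true) reports the real amount grabbed from the system. Memory allocated by a C extension outside the Zend allocator shows up in neither, which is one way top can keep climbing while both numbers stay flat.
foreach ($array as $k => $v) {
    abc();
    printf("internal: %d bytes, real: %d bytes\n",
        memory_get_usage(false), // memory used through PHP's allocator
        memory_get_usage(true)   // real memory allocated from the system
    );
}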

Related

Why does using `eval` not release memory later, even when using `unset` on the anonymous function?

The very simple PHP code below works perfectly; it does nothing, and at the same time it uses almost no system memory.
<?php
set_time_limit(0);
ini_set('memory_limit', '-1');

for ($i = 0; $i < 100000000; $i++) {
    $bbb = function() {
        return true;
    };
    unset($bbb);
}
?>
The code below is very similar, BUT if you run it, after a few minutes it will crash your system because it will consume all your RAM.
<?php
set_time_limit(0);
ini_set('memory_limit', '-1');

for ($i = 0; $i < 100000000; $i++) {
    eval('
        $bbb = function() {
            return true;
        };
    ');
    unset($bbb);
}
?>
My actual $bbb function is much more complex, and it is the result of my AI's output, so I really do need to use eval in this case. I am fully aware of the security implications.
I just want to know how to solve this problem. How can I make PHP/Apache release the memory (the GC is probably the culprit) in the second case?
NOTE: I am using Windows 10 with PHP 7.4
EDIT
As per the suggestion of @Nigel Ren (which looks very good), I tried this:
<?php
set_time_limit(0);
ini_set('memory_limit', '-1');

for ($i = 0; $i < 100000000; $i++) {
    require("test.php");
    unset($bbb);
}
?>
And the "test.php" file has this:
<?php
$bbb = function() {
    return true;
};
?>
But this suggestion still consumes lots of memory; it's not getting cleared! Even if I use gc_mem_caches(), the GC is still not clearing memory. Maybe I misunderstood something in your suggestion, @Nigel Ren?
See the last comment by Chris on this bug report.
For posterity's sake I'll copy it here:
I became interested in the eval() behavior recently, and I can confirm that the memory usage gets higher over the course of execution.
However it is not really a bug, but the way eval() is fundamentally written:
it creates a temp file with the code inside (sometimes it is stored in memory)
it then includes that temp file using a normal include() function
As you can see, the more eval() you do, the more include() it will trigger, resulting in a memory usage that cannot be freed...
So the reason your memory usage increases when using eval is that it is including temporary files. There's no way to un-include files, so the more you use eval, the more memory you're going to end up using.
Disclaimer: the following is pure speculation; I have not verified that this will work.
As a potential workaround, since you say you need to use eval, you could look into using the pcntl extension to fork a process.
It's probably going to be slower, but if you can fork the eval-ing piece of your code off to a separate process which can then terminate gracefully once finished, the "include" performed in that child process should be cleared up. That might be one way to limit the memory usage in your application.
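A minimal sketch of that idea, assuming a POSIX system with the pcntl extension (note it is not available on Windows, which the asker is using, so this only applies where the script can run on Linux): the eval() happens in a child process, and whatever its hidden include consumes is released when the child exits.
for ($i = 0; $i < 1000; $i++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child process: eval here; its memory dies with the process.
        eval('$bbb = function() { return true; };');
        $bbb();
        exit(0);
    }
    // Parent: reap the child so zombies don't accumulate.
    pcntl_waitpid($pid, $status);
}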
After lots of researching I discovered that runkit7 is the way to go! It allows clearing RAM even from functions that were included/required/evaluated! You can use include/require/eval and it will wisely clean up the memory footprint. It's a pretty awesome extension, and I was amazed as soon as I tested it: my RAM dropped 90% from the previous run I did without runkit7.
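I have not verified the exact mechanics, but here is a hedged sketch of what this looks like for a named function, assuming the extension is installed (e.g. via pecl install runkit7): runkit7_function_remove() lets you drop a definition again, so repeated eval/include cycles don't keep accumulating declarations.
for ($i = 0; $i < 100000; $i++) {
    eval('function bbb() { return true; }'); // define...
    bbb();
    runkit7_function_remove('bbb');          // ...then remove it between iterations
}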
As a test, this code starts with a template for the source code in your first listing, minus the definition but with a placeholder where the code will be inserted. The script just reads this file, replaces the marker with the AI script, and writes a new file (abc1.php), then uses require to run the script. This removes any need for eval and the overheads associated with it...
So abc.php is
set_time_limit(0);
ini_set('memory_limit', '-1');

for ($i = 0; $i < 10; $i++) {
    //#
    unset($bbb);
}
The bootstrap script is as follows, where $aiSrc is the code you generate (note this also runs the function)...
$src = file_get_contents("abc.php");
$aiSrc = '$bbb = function() {
    echo "Hello";
};
$bbb();';
$newSrc = str_replace("//#", $aiSrc, $src);
file_put_contents("abc1.php", $newSrc);
require "abc1.php";
After this the script abc1.php contains...
set_time_limit(0);
ini_set('memory_limit', '-1');

for ($i = 0; $i < 10; $i++) {
    $bbb = function() {
        echo "Hello";
    };
    $bbb();
    unset($bbb);
}

Emulating PHP Include Statement Using Code From HTTP Requests

I have a primary "driver" script written in PHP, and based on certain criteria, I'd like this script to pull code from one or more supporting servers via HTTP in order to load functions into memory as though I had done so using the PHP include statement.
Is it possible to use HTTP requests in PHP to pull data that is actually interpreted as PHP code when it is returned?
For example, suppose I used cURL or a web service to return the following text that was stored in a variable, say $URLResponse:
function userKeyGet() {
    return (isset($_SESSION['user_key']) ? $_SESSION['user_key'] : '-1');
}
If I then used something such as eval($URLResponse), would that create the function for use during the current execution of the calling PHP script? I've used cURL and Buzz to return JSON or similarly structured data that I've converted into an array, but not a function or class. Is this possible? Thanks.
You can load remote PHP code with the include(), include_once(), require() and require_once() functions; this requires allow_url_include to be enabled in php.ini.
require_once("http://www.yourserver.com/function.php");
The included file should contain plain code that is not interpreted by the remote server as executable, so if the remote server also runs PHP, you may need to give the remote file another extension.
eval() will work too. If you use eval/include and declare the same function twice, it will raise a fatal error since the function is already declared. You can use an object or anonymous functions to override the function.
$code="function userKeyGet() {
return (isset(\$_SESSION['user_key']) ? \$_SESSION['user_key'] : '-1');
}";
eval($code);
eval($code); # Second time using eval on $code with "Fatal error: Cannot redeclare"
# This will work
echo userKeyGet();
# An example for anonymous function way
$code="\$userKeyGet = function() {
return (isset(\$_SESSION['user_key']) ? \$_SESSION['user_key'] : '-1');
};";
eval($code); # this wont raise redeclare error
echo $userKeyGet();
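And here is a hedged sketch of the cURL route described in the question itself: fetch the code body over HTTP and eval() it. The URL is a placeholder, and the server must return the raw PHP source (without an opening <?php tag, since eval() expects bare code, and without the remote server executing it first).
$ch = curl_init('http://www.yourserver.com/function.php.txt'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$URLResponse = curl_exec($ch);
curl_close($ch);

eval($URLResponse); // defines userKeyGet() for the current execution
echo userKeyGet();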

PHP foreach stack - is it possible that functions called in a foreach loop are still running when the next iteration starts?

I am having problems with cURL not being able to connect to a server that returns an XML feed, and I am not sure if my code is stacking up and causing the problem. Is it possible that the final function called in this foreach loop is still running when the next loop iteration comes around?
Is it possible to make sure all functions in the loop complete before the next iteration begins, or does foreach do this by default anyway? I tried returning true from process_xml() and running a test in the loop: if ($this->process_xml($xml_array)) continue;
but it didn't seem to have any effect, and it seems like a bad idea anyway.
foreach ($arrayOfUrls as $url) {
    // retrieve xml from the url as a string.
    if ($url_xml_string = $this->getFeedStringUsing_cURL($url)) {
        $xml_object = simplexml_load_string($url_xml_string);
        $xml_array = $this->feedStringToArray($xml_object);
        // process the xml.
        $this->process_xml($xml_array);
    }
}
No, this is not possible. Each statement is executed and finished before the next statement is run.
"and am not sure if my code is stacking up"
Not sure? If it's important to you, why don't you find out? Without knowing what OS you are running on, it's rather hard to advise how you'd go about that, but netstat might be a good starting point.
"Is it possible the final function called in this foreach loop is still running"
It's highly improbable: PHP scripts run in a single thread of execution unless you tell them not to. That said, the curl extension allows you to define callbacks into your PHP code which run before the operation completes, and the curl_multi_ family of functions also lets you run PHP code while requests are in progress.
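A hedged sketch of the curl_multi_ approach (the URLs are placeholders): several transfers run concurrently while your PHP code keeps executing between curl_multi_exec() calls.
$urls = array('http://example.com/feed1.xml', 'http://example.com/feed2.xml');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {
    curl_multi_exec($mh, $running); // drive all transfers a step forward
    curl_multi_select($mh);         // wait for socket activity instead of busy-looping
    // other PHP work could happen here while requests are in flight
} while ($running > 0);
foreach ($handles as $ch) {
    $xml = curl_multi_getcontent($ch); // the response body for this handle
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);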

Diagnosing Memory Leaks - Allowed memory size of # bytes exhausted

I've encountered the dreaded error message; possibly through painstaking effort, PHP has run out of memory:
Allowed memory size of #### bytes exhausted (tried to allocate #### bytes) in file.php on line 123
Increasing the limit
If you know what you're doing and want to increase the limit see memory_limit:
ini_set('memory_limit', '16M');
ini_set('memory_limit', -1); // no limit
Beware! You may only be solving the symptom and not the problem!
Diagnosing the leak:
The error message points to a line within a loop that I believe to be leaking, or needlessly accumulating, memory. I've printed memory_get_usage() statements at the end of each iteration and can see the number slowly grow until it reaches the limit:
foreach ($users as $user) {
    $task = new Task;
    $task->run($user);
    unset($task); // Free the variable in an attempt to recover memory
    print memory_get_usage(true); // increases over time
}
For the purposes of this question let's assume the worst spaghetti code imaginable is hiding in global-scope somewhere in $user or Task.
What tools, PHP tricks, or debugging voodoo can help me find and fix the problem?
PHP (before 5.3) doesn't have a cycle-collecting garbage collector; it uses reference counting to manage memory. Thus, the most common sources of memory leaks are cyclic references and global variables. If you use a framework, you'll have a lot of code to trawl through to find the leak, I'm afraid. The simplest instrument is to selectively place calls to memory_get_usage and narrow down where the code leaks. You can also use Xdebug to create a trace of the code: run it with execution traces and show_mem_delta.
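To make the cyclic-reference case concrete, here is a minimal sketch (the Node class is hypothetical). Each pair of objects references the other, so refcounts never reach zero; on PHP 5.3+ the cycle collector, triggered here explicitly with gc_collect_cycles(), reclaims them.
class Node {
    public $other;
}

for ($i = 0; $i < 100000; $i++) {
    $a = new Node;
    $b = new Node;
    $a->other = $b; // $a and $b now reference each other
    $b->other = $a;
    unset($a, $b);  // refcounts stay at 1, so the pair leaks under pure refcounting
    if ($i % 10000 === 0) {
        gc_collect_cycles(); // PHP 5.3+: reclaim the leaked cycles
        echo memory_get_usage(true), "\n";
    }
}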
Here's a trick we've used to identify which scripts are using the most memory on our server.
Save the following snippet in a file at, e.g., /usr/local/lib/php/strangecode_log_memory_usage.inc.php:
<?php
function strangecode_log_memory_usage()
{
    $site = '' == getenv('SERVER_NAME') ? getenv('SCRIPT_FILENAME') : getenv('SERVER_NAME');
    $url = $_SERVER['PHP_SELF'];
    $current = memory_get_usage();
    $peak = memory_get_peak_usage();
    error_log("$site current: $current peak: $peak $url\n", 3, '/var/log/httpd/php_memory_log');
}
register_shutdown_function('strangecode_log_memory_usage');
Employ it by adding the following to httpd.conf:
php_admin_value auto_prepend_file /usr/local/lib/php/strangecode_log_memory_usage.inc.php
Then analyze the log file at /var/log/httpd/php_memory_log
You might need to touch /var/log/httpd/php_memory_log && chmod 666 /var/log/httpd/php_memory_log before your web user can write to the log file.
I noticed one time in an old script that PHP would keep the "as" variable in scope even after my foreach loop. For example,
foreach ($users as $user) {
    $user->doSomething();
}
var_dump($user); // would output the data from the last $user
I'm not sure if later PHP versions have changed this since I saw it (keeping the loop variable around after the loop appears to be intended behavior). If this is the case, you could unset($user) after the doSomething() line to clear it from memory. YMMV.
There are several possible points of memory leaking in php:
php itself
php extension
php library you use
your php code
It is quite hard to find and fix the first three without deep reverse engineering or knowledge of the PHP source code. For the last one, you can binary-search for the leaking code with memory_get_usage, as sketched below.
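A sketch of that binary search, using a hypothetical checkpoint() helper: log the memory delta at several points in the loop, then keep halving the region whose delta grows each iteration.
function checkpoint($label) {
    static $last = 0;
    $now = memory_get_usage(true);
    printf("%-12s %10d bytes (delta %+d)\n", $label, $now, $now - $last);
    $last = $now;
}

foreach ($users as $user) {
    checkpoint('loop start');
    $task = new Task;
    checkpoint('constructed');
    $task->run($user);
    checkpoint('ran task'); // if this delta keeps growing, bisect inside run()
    unset($task);
    checkpoint('after unset');
}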
I recently ran into this problem on an application, under what I gather to be similar circumstances: a script that runs in PHP's CLI and loops over many iterations. My script depends on several underlying libraries. I suspected a particular library was the cause, and I spent several hours in vain trying to add appropriate destruct methods to its classes, to no avail. Faced with a lengthy conversion to a different library (which could turn out to have the same problems), I came up with a crude workaround for the problem in my case.
In my situation, on a Linux CLI, I was looping over a bunch of user records, and for each one of them I was creating new instances of several classes I had written. I decided to try creating the new instances with PHP's exec function so that those processes would run in a "new thread". Here is a really basic sample of what I am referring to:
foreach ($ids as $id) {
    $lines = array();
    exec("php ./path/to/my/classes.php $id", $lines);
    foreach ($lines as $line) { echo $line . "\n"; } // display some output
}
Obviously this approach has its limitations, and one needs to be aware of its dangers, as it would be easy to create a rabbit job; however, in some rare cases it might help you get over a tough spot until a better fix can be found, as it did in my case.
I came across the same problem, and my solution was to replace foreach with a regular for. I'm not sure about the specifics, but it seems like foreach creates a copy of (or somehow a new reference to) the object; with a regular for loop, you access the item directly.
I would suggest you check the PHP manual or add the gc_enable() function to collect the garbage, so that the memory leaks don't affect how your code runs.
PS: PHP does have a garbage collector; gc_enable() takes no arguments.
I recently noticed that PHP 5.3 lambda functions leave extra memory in use even after they are removed.
for ($i = 0; $i < 1000; $i++)
{
    //$log = new Log;
    $log = function() { return new Log; };
    //unset($log);
}
I'm not sure why, but it seems to take an extra 250 bytes per lambda, even after the function is removed.
I didn't see it explicitly mentioned, but Xdebug does a great job of profiling time and memory (as of 2.6). You can take the information it generates and pass it to a GUI front end of your choice: webgrind (time only), KCachegrind, QCacheGrind, or others. It generates very useful call trees and graphs that let you find the sources of your various woes.
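For reference, a hedged php.ini sketch for enabling the Xdebug 2.x profiler (the output directory is a placeholder); each request then writes a cachegrind.out.* file that webgrind, KCachegrind, or QCacheGrind can open:
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp/profiles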
If what you say about PHP only doing GC after a function returns is true, you could wrap the loop's contents inside a function as a workaround/experiment.
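A minimal sketch of that experiment, reusing the loop from the question: every local created per iteration goes out of scope when the function returns.
function processUser($user) {
    $task = new Task; // everything allocated here is function-local
    $task->run($user);
}                     // locals are destroyed when the function returns

foreach ($users as $user) {
    processUser($user);
    print memory_get_usage(true) . "\n";
}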
One huge problem I had was caused by create_function. As with lambda functions, it leaves the generated temporary name in memory.
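To illustrate (note that create_function was deprecated in PHP 7.2 and removed in 8.0): each call registers a new function under a generated name, and that definition is never freed.
for ($i = 0; $i < 1000; $i++) {
    $f = create_function('', 'return true;');
    unset($f); // the variable goes away, but the generated function does not
}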
Another cause of memory leaks (in case of Zend Framework) is the Zend_Db_Profiler.
Make sure that is disabled if you run scripts under Zend Framework.
For example, I had the following in my application.ini:
resources.db.profiler.enabled = true
resources.db.profiler.class = Zend_Db_Profiler_Firebug
Running approximately 25,000 queries, plus loads of processing before that, brought the memory to a nice 128 MB (my max memory limit).
Just setting:
resources.db.profiler.enabled = false
was enough to keep it under 20 MB.
This script was running in the CLI, but it was instantiating the Zend_Application and running the Bootstrap, so it used the "development" config.
Running the script with Xdebug profiling really helped.
I'm a little late to this conversation, but I'll share something pertinent to Zend Framework.
I had a memory leak problem after installing PHP 5.3.8 (using phpfarm) to work with a ZF app that was developed with PHP 5.2.9. I discovered that the memory leak was being triggered by Apache's httpd.conf file, in my virtual host definition, where it says SetEnv APPLICATION_ENV "development". After commenting this line out, the memory leaks stopped. I'm still trying to come up with an inline workaround in my PHP script (mainly by defining it manually in the main index.php file).
I didn't see it mentioned here, but one thing that might be helpful is using Xdebug and xdebug_debug_zval('variableName') to see the refcount.
I can also provide an example of a PHP extension getting in the way: Zend Server's Z-Ray. If data collection is enabled, its memory use will balloon on each iteration, just as if garbage collection were off.
