My site's admin section has a number of very slow report-generating scripts that echo output line by line as it is generated. To have this output flushed to the browser immediately, instead of the user having to wait minutes before seeing any response, we have output_buffering disabled and we call ob_implicit_flush() at the beginning of such scripts.
For convenience, I was considering just turning on the implicit_flush setting in php.ini instead of adding ob_implicit_flush() calls to every script that would benefit from it.
However, the documentation contains the following scary but unexplained remark:
implicit_flush
...
When using PHP within an web environment, turning this option on has serious performance implications and is generally recommended for debugging purposes only.
What are these "serious performance implications", and do they justify the manual's recommendation?
It may or may not be what the manual is hinting at, but one context in which either turning on implicit_flush or calling ob_implicit_flush() has serious performance implications is when using PHP with Apache through mod_php with mod_deflate enabled.
In this context, flush() calls are able to push output all the way through mod_deflate to the browser. If you have any scripts that echo large amounts of data in small chunks, flushing every chunk will cripple mod_deflate's ability to compress your output, quite possibly resulting in a 'compressed' form that is larger than the original content.
As an extreme example, consider this simple script which echoes out a million random numbers:
<?php
header('Content-Type: text/plain');
for ($i = 0; $i < 1000000; $i++) {
    echo rand();
    echo "\n";
}
?>
With output_buffering off and implicit_flush also off (for now), let's hit this in Chrome with the dev tools open:
Note the Size/Content column; the decompressed output is 10.0MB in size, but thanks to mod_deflate's gzip compression, the entire response was compressed down to 4.8MB, roughly halving it in size.
Now hitting exactly the same script with implicit_flush set to On:
Once again, the 'decompressed' output is 10.0MB in size. This time, though, the size of the HTTP response was 28.6MB - mod_deflate's 'compression' has actually trebled the size of the response.
This, for me, is more than enough reason to heed the PHP manual's advice of leaving the implicit_flush config option off, and only using ob_implicit_flush() (or manual flush() calls) in contexts where doing so actually serves a purpose.
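For completeness, here is a minimal sketch (my own illustration, assuming no other output handlers are installed) of turning flushing on in just the one slow script that needs it, rather than site-wide:

<?php
// Sketch: enable flushing for this slow report script only, instead of
// turning on implicit_flush globally in php.ini.
header('Content-Type: text/plain');

// Drop any output buffers that would otherwise hold the output back.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true); // flush after every echo/print from here on

for ($section = 1; $section <= 5; $section++) {
    echo "Generating report section $section...\n";
    sleep(2); // stand-in for the slow report-generating work
}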
I have a weird memory problem in PHP. I think something is only allowing an array to be a maximum of 0.25M. It appears the script is only using up to around 6M before it crashes.
Here's the output from xdebug:
Here's the function it is calling. The result of the sql query is about 800 rows of text.
public function getOptions() {
    $sql = "select Opt,
                   Code,
                   Description
            from PCAOptions";
    $result = sqlsrv_query($this->conn, $sql);
    $arrayResult = array();
    echo ini_get('memory_limit'); // this confirms that my memory limit is high enough
    while ($orderObject = sqlsrv_fetch_object($result, 'PCA_Option')) {
        array_push($arrayResult, $orderObject);
    }
    return $arrayResult;
}
So, I don't know how or why this worked, but I fixed the problem by commenting out these two lines from my .htaccess file:
# php_value memory_limit 1024M
# php_value max_execution_time 18000
I did this because I noticed phpinfo() was returning different values for these settings in the two columns "master" and "local".
My php.ini had memory_limit=512M and max_execution_time=3000, whereas my .htaccess file had the above values. I thought .htaccess would just override whatever was in php.ini, but I guess it caused a conflict. Could this be a possible bug in PHP?
There are a number of steps that some PHP distributions, security packages, and web hosts take to prevent users from raising the actual PHP memory limit at runtime via ini_set(). The most likely candidate is the Suhosin security package.
This older Stack Overflow question has a number of other suggestions as well.
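If you want to check whether runtime changes are being silently refused, a quick check along these lines (my own sketch) will show it:

// Sketch: ini_set() returns the old value on success and false on failure,
// so a hardened setup that blocks the change is easy to spot.
$result = ini_set('memory_limit', '1024M');
var_dump($result);                      // false means the change was refused
echo ini_get('memory_limit'), PHP_EOL;  // the value actually in effect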
When working with 'big' scripts which I know will exceed my server's PHP memory limit, I always set this value at the beginning of the script. It works every time.
ini_set("memory_limit","32M"); //Script should use up to 32MB of memory
I have a PHP script that needs at least 1000 seconds of execution time to complete.
It terminates after around 265 seconds each time, with no errors. Since I am using loops, I tested whether the failure depends on the number of iterations; it does not, which further rules out an error occurring inside the loop.
I have set max_execution_time to 10800 in php.ini, and changing memory_limit doesn't affect the results either.
Please help! I have scratched my head thoroughly!
Did you check your log file? If you see error 6 or a segmentation fault there, then your script is actually crashing PHP without showing any errors in the browser (assuming you are running it in a browser and not from the CLI).
If you are using Apache on Unix, you should find this log at /var/log/apache2/error.log.
Otherwise, you can define the path of the log file in .htaccess by adding this line:
php_value error_log "/path/to/somewhere/convenient/php_errors.log"
Change the path to somewhere where your httpd has write permission and where you have read permission.
Add:
set_time_limit(0);
at the beginning of the script, so that the code execution will not time out until it has completed.
This same error happened to me. One of my PHP functions died without any errors sent to stderr, stdout, or any other log file.
What happened was that I was using a helper PHP script written by another developer, which set the memory limit to 512 MB halfway through the operation of my program. The sub-module also poisoned the well by setting the error log settings to silent at some point midway through the processing of my script.
You can prove whether this is happening to you by printing out the PHP system settings available to your script on EVERY iteration of the loop, as in the sketch below. In my case, once the subscript had done the dirty deed, the PHP engine threw a fit after the garbage collector ran at a random point later on, then died immediately without error. It appears to be a bug in the PHP garbage collector, triggered when sub-modules mess with system settings while it is doing its work.
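A rough sketch of that diagnostic (the $records array and process() call are placeholders for your own loop body):

// Sketch: log the relevant settings on every pass so you can see exactly
// when a helper script changes them behind your back.
foreach ($records as $i => $record) {
    error_log(sprintf(
        'iteration %d: memory_limit=%s error_reporting=%s log_errors=%s peak=%d',
        $i,
        ini_get('memory_limit'),
        ini_get('error_reporting'),
        ini_get('log_errors'),
        memory_get_peak_usage(true)
    ));
    process($record); // your own work, including the helper sub-module calls
}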
Solution: edit the PHP helper sub-modules and make sure they don't tweak the system settings while the garbage collector is doing its work. Otherwise the PHP interpreter can freak out and die without any errors or output, at a random interval after the PHP system variables have been poisoned.
In PHP, the only way to hide errors from the logs, even when display_errors and error_reporting are properly set, is to use the error control operator.
For example, the following code will raise a fatal error, but a "hidden" one; it will appear neither in the logs nor in the browser.
@echo "forgetting the last quote;
So set the display_errors value to On, set error_reporting according to the PHP version you are using (see the differences here), and check whether you are using an "@" symbol in your code.
I have a bunch of client point of sale (POS) systems that periodically send new sales data to one centralized database, which stores the data into one big database for report generation.
The client POS is based on PHPPOS, and I have implemented a module that uses the standard XML-RPC library to send sales data to the service. The server system is built on CodeIgniter, and uses the XML-RPC and XML-RPCS libraries for the webservice component. Whenever I send a lot of sales data (as little as 50 rows from the sales table, and individual rows from sales_items pertaining to each item within the sale) I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
128M is the default value in php.ini, but I assume that should be a huge amount of memory to exhaust. In fact, I have even tried setting this value to 1024M, and all it does is take longer to error out.
As for steps I've taken, I've tried disabling all processing on the server-side, and have rigged it to return a canned response regardless of the input. However, I believe the problem lies in the actual sending of the data. I've even tried disabling the maximum script execution time for PHP, and it still errors out.
Changing the memory_limit by ini_set('memory_limit', '-1'); is not a proper solution. Please don't do that.
Your PHP code may have a memory leak somewhere and you are telling the server to just use all the memory that it wants. You wouldn't have fixed the problem at all. If you monitor your server, you will see that it is now probably using up most of the RAM and even swapping to disk.
You should try to track down the offending code and fix it.
ini_set('memory_limit', '-1'); overrides the default PHP memory limit.
The correct way is to edit your php.ini file.
Edit memory_limit to your desired value.
From your question, 128M (which is the default limit) has been exceeded, so something is seriously wrong with your code, as it should not take that much.
If you know why it takes that much and you want to allow it, set memory_limit = 512M or higher, and you should be good.
The memory allocation for PHP can be adjusted permanently, or temporarily.
Permanently
You can permanently change the PHP memory allocation two ways.
If you have access to your php.ini file, you can edit the value for memory_limit to your desired value.
If you do not have access to your php.ini file (and your webhost allows it), you can override the memory allocation through your .htaccess file. Add php_value memory_limit 128M (or whatever your desired allocation is).
Temporarily
You can adjust the memory allocation on the fly from within a PHP file. You simply have the code ini_set('memory_limit', '128M'); (or whatever your desired allocation is). You can remove the memory limit (although machine or instance limits may still apply) by setting the value to "-1".
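A small illustration of the temporary route (the 256M figure is only an example):

// Sketch: per-request adjustment. ini_set() returns false if the change is
// refused (for example by a hardened or shared-hosting setup), so check it.
if (ini_set('memory_limit', '256M') === false) {
    error_log('Could not raise memory_limit at runtime');
}
// ... memory-hungry work ...
ini_restore('memory_limit'); // back to the php.ini / .htaccess value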
It's very easy to get memory leaks in a PHP script - especially if you use abstraction, such as an ORM. Try using Xdebug to profile your script and find out where all that memory went.
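If a full Xdebug profile feels like overkill as a first pass, a cruder sketch using PHP's built-in counters can at least narrow down which stage is eating the memory (loadOrdersThroughOrm() is a made-up placeholder for the code under suspicion):

// Sketch: bracket suspect sections with the built-in memory counters.
$before = memory_get_usage(true);
$rows = loadOrdersThroughOrm(); // placeholder for the suspect call
printf(
    "loadOrdersThroughOrm: +%.1f MB (peak %.1f MB)\n",
    (memory_get_usage(true) - $before) / 1048576,
    memory_get_peak_usage(true) / 1048576
);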
When adding 22.5 million records into an array with array_push, I kept getting "memory exhausted" fatal errors at around 20 million records, using 4G as the memory limit in the php.ini file. To fix this, I added the statement
$old = ini_set('memory_limit', '8192M');
at the top of the file. Now everything is working fine. I do not know if PHP has a memory leak. That is not my job, nor do I care. I just have to get my job done, and this worked.
The program is very simple:
$fh = fopen($myfile, 'r'); // $file is assumed to be initialised as an array beforehand
while (!feof($fh)) {
    array_push($file, stripslashes(fgets($fh)));
}
fclose($fh);
The fatal error pointed to line 3 (the array_push call) until I boosted the memory limit, which eliminated the error.
I kept getting this error, even with memory_limit set in php.ini, and the value reading out correctly with phpinfo().
I fixed it by changing it from this:
memory_limit=4G
to this:
memory_limit=4096M
This rectified the problem in PHP 7.
You can properly fix this by changing memory_limit on fastcgi/fpm:
$ vim /etc/php5/fpm/php.ini
Change memory_limit, for example from 128M to 512M as shown below, and then restart the PHP-FPM service so the change takes effect:
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 128M
to
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 512M
When you see the above error, especially if the (tried to allocate __ bytes) value is low, it could be an indicator of an infinite loop, like a function that calls itself with no way out:
function exhaustYourBytes()
{
    return exhaustYourBytes();
}
In a script in your site's root directory, add:
ini_set('memory_limit', '1024M');
After uncommenting these two settings in php.ini, it started working:
; Determines the size of the realpath cache to be used by PHP. This value should
; be increased on systems where PHP opens many files to reflect the quantity of
; the file operations performed.
; http://php.net/realpath-cache-size
realpath_cache_size = 16k
; Duration of time, in seconds for which to cache realpath information for a given
; file or directory. For systems with rarely changing files, consider increasing this
; value.
; http://php.net/realpath-cache-ttl
realpath_cache_ttl = 120
Rather than changing the memory_limit value in your php.ini file, if there's a part of your code that could use a lot of memory, you can remove the memory_limit before that section runs and then restore it afterwards:
$limit = ini_get('memory_limit');
ini_set('memory_limit', -1);
// ... do heavy stuff
ini_set('memory_limit', $limit);
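A small variation on the same idea (PHP 5.5+) makes sure the old limit comes back even if the heavy section throws:

$limit = ini_get('memory_limit');
ini_set('memory_limit', -1);
try {
    // ... do heavy stuff
} finally {
    ini_set('memory_limit', $limit); // restored even if an exception is thrown
}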
In Drupal 7, you can modify the memory limit in the settings.php file located in your sites/default folder. Around line 260, you'll see this:
ini_set('memory_limit', '128M');
Even if your php.ini settings are high enough, you won't be able to consume more than 128 MB if this isn't set in your Drupal settings.php file.
Change the memory limit in the php.ini file and restart Apache. After the restart, run the phpinfo() function from any PHP file to confirm the memory_limit change.
memory_limit = -1
A memory_limit of -1 means that no memory limit is enforced; scripts can use as much memory as the machine allows.
Alternatively, just add an ini_set('memory_limit', '-1'); line at the top of your script.
You can set whatever value you need in place of -1, such as 16M, and so on.
For Drupal users, Chris Lane's answer of:
ini_set('memory_limit', '-1');
works, but you need to put it just after the opening
<?php
tag in the index.php file in your site's root directory.
PHP 5.3+ allows you to change the memory limit by placing a .user.ini file in the public_html folder (this works when PHP runs as CGI/FastCGI).
Simply create the above file and type the following line in it:
memory_limit = 64M
Some cPanel hosts only accept this method.
Seeing a crashed page?
(This often happens when MySQL has to return large result sets. By default, memory_limit is set quite small, which is safer for the hardware.)
You can check your system's existing memory status before increasing the limit in php.ini:
# free -m
                 total       used       free     shared    buffers     cached
Mem:             64457      63791        666          0       1118      18273
-/+ buffers/cache:          44398      20058
Swap:             1021          0       1021
Here I increased it as follows, and then ran service httpd restart to fix the crashed-page issue.
# grep memory_limit /etc/php.ini
memory_limit = 512M
For those who are scratching their heads wondering why on earth this little function should cause a memory leak: sometimes, through a small mistake, a function starts calling itself recursively forever.
For example, a proxy class that has a method with the same name as a method of the object it proxies.
class Proxy {
    private $actualObject;
    public function doSomething() {
        return $this->actualObject->doSomething();
    }
}
Sometimes you may forget to include that little actualObject-> part, so the call becomes $this->doSomething(); and because the proxy itself really does have a doSomething() method, PHP won't give you any error. In a large class, it can stay hidden from the eyes for quite a while before you find out why it is leaking memory.
I got the error below while running on a dataset smaller than ones that had worked previously.
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4096 bytes) in C:\workspace\image_management.php on line 173
As the search for the fault brought me here, I thought I'd mention that it isn't always one of the technical solutions in previous answers, but something simpler. In my case it was Firefox: before I ran the program it was already using 1,157 MB.
It turns out that I'd been watching a 50 minute video a bit at a time over a period of days and that messed things up. It's the sort of fix that experts correct without even thinking about it, but for the likes of me it's worth bearing in mind.
In my case on macOS (Catalina, XAMPP) there was no loaded php.ini file, so I had to do this first:
sudo cp /etc/php.ini.default /etc/php.ini
sudo nano /etc/php.ini
Then change memory_limit to 512M.
Then restart Apache and check whether the file is loaded:
php -i | grep php.ini
The result was:
Configuration File (php.ini) Path => /etc
Loaded Configuration File => /etc/php.ini
Finally, check the value:
php -r "echo ini_get('memory_limit').PHP_EOL;"
Using yield might be a solution as well. See Generator syntax.
Instead of changing the php.ini file to allow a bigger memory allocation, sometimes implementing yield inside a loop can fix the issue. What yield does is, instead of building up all the data at once, read it one item at a time, which saves a lot of memory.
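As a rough sketch of the difference (the file name is a made-up placeholder), compare reading a file eagerly into an array with streaming it through a generator; the generator version holds only one line in memory at a time:

// Eager: the whole file ends up in one array, which is what exhausts memory.
function readAllLines(string $path): array {
    $lines = [];
    $fh = fopen($path, 'r');
    while (($line = fgets($fh)) !== false) {
        $lines[] = $line;
    }
    fclose($fh);
    return $lines;
}

// Lazy: yield hands back one line at a time, so memory use stays flat.
function readLines(string $path): Generator {
    $fh = fopen($path, 'r');
    while (($line = fgets($fh)) !== false) {
        yield $line;
    }
    fclose($fh);
}

foreach (readLines('big-export.csv') as $line) {
    // process one line, then let it go
}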
The reason for this error is that your server configuration has a very low memory limit. Try adding this to wp-config.php (put it after <?php in this file):
define('WP_MEMORY_LIMIT', '96M');
Please note that this limit is OK for the theme and the plugins that come with the theme. If you want to enable other plugins you may need to increase the limit further.
define('WP_MEMORY_LIMIT', '256M');
Running the script like this (in a cron job, for example): php5 /pathToScript/info.php produces the same error.
The correct way: php5-cli /pathToScript/info.php
If you're running a WHM-powered VPS (virtual private server) you may find that you do not have permissions to edit PHP.INI directly; the system must do it. In the WHM host control panel, go to Service Configuration → PHP Configuration Editor and modify memory_limit:
I find it useful to include or require _dbconnection.php_ and _functions.php_ in the files that are actually processed, rather than in the header, which is itself included.
So if your header and footer are included, simply include all your functional files before the header is included.
Greetings. This is a very common problem: if you have very little memory allocated to PHP and your website is growing, it will require more resources.
I came across a site where modifying just a few products returned a 500 error; the problem was that very heavy images had been used for those specific products. The solution:
1.- Increase memory_limit in php.ini.
2.- Reduce the size of the images.
3.- Set memory_limit back down to an acceptable value ("512M" was more than enough for me).
Now it is important to verify that the changes are actually taking effect, because PHP can have several versions and several types of installation on the same server; you might modify one php.ini and see no change, because it is not the file actually in use.
How do you verify that you are modifying the correct file?
In the PrestaShop dashboard, go to Advanced Settings / Information; there you can see "Memory limit".
Always remember that after making a change to the php.ini file it is advisable to restart Apache or Nginx.
Ubuntu: sudo service apache2 restart
IMPORTANT NOTE: Never set memory_limit = -1, as many people suggest here. The problem is that if there is a bug in a file or module, you can end up in a continuous loop consuming all of the server's memory and CPU. A simple example: a module has an error and keeps calling a function until some condition is met; this creates an infinite loop, and it will never stop, because PHP no longer has a limit.
I hope it helps colleagues who have this problem.
The most common cause of this error message for me is omitting the "++" operator from a PHP "for" statement. This causes the loop to continue forever, no matter how much memory you allow to be used. It is a simple mistake, yet one that is difficult for the compiler or runtime system to detect. It is easy for us to correct if we think to look for it!
But suppose you want a general procedure for stopping such a loop early and reporting the error? You can simply instrument each of your loops (or at least the innermost loops) as discussed below.
In some cases such as recursion inside exceptions, set_time_limit fails, and the browser keeps trying to load the PHP output, either with an infinite loop or with the fatal error message which is the topic of this question.
By reducing the allowed allocation size near the beginning of your code you might be able to prevent the fatal error, as discussed in the other answers.
Then you may be left with a program that terminates, but is still difficult to debug.
Whether or not your program terminates, instrument your code by inserting BreakLoop() calls inside your program to gain control and find out what loop or recursion in your program is causing the problem.
The definition of BreakLoop is as follows:
function BreakLoop($MaxRepetitions = 500, $LoopSite = "unspecified")
{
    static $Sites = [];
    if (!@$Sites[$LoopSite] || !$MaxRepetitions)
        $Sites[$LoopSite] = ['n' => 0, 'if' => 0];
    if (!$MaxRepetitions)
        return;
    if (++$Sites[$LoopSite]['n'] >= $MaxRepetitions)
    {
        $S = debug_backtrace(); // array_reverse
        $info = $S[0];
        $File = $info['file'];
        $Line = $info['line'];
        exit("*** Loop for site $LoopSite was interrupted after $MaxRepetitions repetitions. In file $File at line $Line.");
    }
} // BreakLoop
The $LoopSite argument can be the name of a function in your code. It isn't really necessary, since the error message you will get will point you to the line containing the BreakLoop() call.
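For example, a hypothetical loop guarded this way (the limit of 1000 and the site name are arbitrary):

// Sketch: if the $i++ below were accidentally omitted, BreakLoop() would abort
// after 1000 passes and report the file and line, instead of running forever.
$i = 0;
while ($i < 100) {
    BreakLoop(1000, 'countingLoop');
    // ... real work ...
    $i++;
}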
In my case it was a brief issue with the way a function was written. A memory leak can be caused by assigning a new value to a function's input variable, e.g.:
/**
 * Memory leak function that illustrates unintentional bad code
 * @param $variable - input variable that will be assigned a new value
 * @return null
 **/
function doSomething($variable) {
    $variable = 'set value';
    // Or
    $variable .= 'set value';
}
Increasing the memory_limit fixed the problem. However, I had trouble finding where the memory limit was set. I am working on my project directly on the live server, so if you're doing the same: in cPanel you can find the memory_limit under Software - MultiPHP INI Editor after selecting the location. I increased mine from 256M to 512M. You can also find instructions here.