In a PHP script intended to work on generic shared LAMP hosting, I am using require() as a function to read data from a file. The data file looks like this:
<?php
# storage.php
return [
// Rotating secrets
'last_rand' => '532e89355b78aafdb85f5f01f0eed20440d6bd9e0a2d6ae1bd17be4e1d7d21c7bb7a822a2077e3f4',
];
And I read it like this:
$data = require($root . '/storage.php');
$lastRand = $data['last_rand'];
In my script, I will read information from a configuration file using this approach. In some cases, the operation will re-write a new config file (using file_put_contents()) and then re-read it, again using require().
This has worked solidly so far, on a variety of LAMP hosts, but today I found a web host where the require() does not seem to notice changes in the file. This host is using PHP 5.6, whereas all the others I have tested are using 7+.
What's extremely odd is that require() gets the old contents of the file even though file_get_contents() can see the new version, as if require() is doing its own internal caching.
Failed fix attempts
I have even tried waiting for require() to catch up, in case some underlying cache needed time to expire:
$ok = true;
$t = microtime(true);
while ($oldRandom != $this->getFileService()->requirePhp($this->getStoragePath())['last_rand'])
{
    $elapsedTime = microtime(true) - $t;
    if ($elapsedTime > 4) // Wait 4 seconds
    {
        $ok = false;
        break;
    }
    usleep(1000);
}
That did not work either (the timeout just expires with no change in the results brought back from require()).
However, upon the next run of the PHP script, require() suddenly sees the new file contents.
I've also tried to delete the storage.php file before re-reading it, and that has no effect either! I was really expecting that to help.
Future actions
So, I don't understand this behaviour at all. To work around it I could:
- rather than modifying a config, saving it to file and then reloading it from file, refactor my code so that I keep the new config in memory, thus avoiding having to reload it
- refactor so that I use file_get_contents() rather than require() (for rather dull reasons it's not as convenient, but I'll prefer whatever works reliably).
However, these both feel like I am ignoring a bug, and that I should investigate the cause. I happen to know also that this host (a free one) is nearly always under heavy CPU load. It's a 64 bit server running Linux on the rather old 2.6 kernel, and phpinfo() indicates the CGI/FastCGI server API is in force.
Can anything be done to mitigate this behaviour?
More attempts
Some helpful commenters suggest this problem could be related to opcaching, which would fit, since the opcode cache applies to files pulled in via require(). I've added this code after the file_put_contents() that re-writes the config:
$reset = opcache_reset(); // Returns true
$invalid = opcache_invalidate($this->getStoragePath()); // Returns false
However, it makes no difference - require() stubbornly reads the same value. I have confirmed this by doing a require() before and after the file write, and also a file_get_contents() to get the true contents.
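For reference, a minimal sketch of the forced invalidation variant (the true second argument to opcache_invalidate() forces invalidation even when timestamp checks would say the file is fresh; $path and $data are placeholders for my storage path and config array):
<?php
// Sketch: rewrite the config, then push every cache layer I know of
// to drop its copy before re-reading.
file_put_contents($path, '<?php return ' . var_export($data, true) . ';');
if (function_exists('opcache_invalidate')) {
    opcache_invalidate($path, true); // true = force, ignore timestamp checks
}
clearstatcache(true, $path);         // clear PHP's stat and realpath caches
$data = require $path;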
When I originally encountered this error, I was hesitant to work around it, since it felt like a critical bug that ought not be ignored. However, now that the culprit is likely to be opcaching (and thus related to require() specifically), I think working around it by keeping values in memory is not a bad solution. I shall have to remember that require() is only good for one call per HTTP request, even if that does not hold true for all PHP installations.
I am conscious also that if I did try to work around this, I would have to contend with a number of opcache invalidation mechanisms, which is more complexity than I am comfortable with.
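As a sketch of that keep-it-in-memory workaround (the class and method names are my own invention, not from any library):
<?php
// Sketch: cache the parsed config for the rest of the request, so
// require() only ever runs once per HTTP request.
class ConfigStore
{
    private $path;
    private $configCache = null;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function getConfig()
    {
        if ($this->configCache === null) {
            $this->configCache = require $this->path;
        }
        return $this->configCache;
    }

    public function saveConfig(array $config)
    {
        file_put_contents($this->path, '<?php return ' . var_export($config, true) . ';');
        $this->configCache = $config; // trust memory, not a re-require
    }
}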
I found the same problem with a config file; the only valid solution for now is to change the name of the config file with each modification:
1 - load the config file
// get the current config file name
$config_file = glob('config/config*.php')[0] ?? null;
// load the settings
if ($config_file) $settings = require_once($config_file);
2 - save the config file when needed
...
$data = ['var' => 'value', 'other-var' => 'other value'];
$vars = var_export($data, TRUE);
// new file name
$file_name = 'config/config-' . time() . '.php';
file_put_contents($file_name, "<?php\n return $vars;");
// delete the old config file
if ($config_file) @unlink($config_file);
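Pulling the two halves together, a hedged sketch of the whole rename-on-write scheme as a pair of helpers (the function names are mine; note that time() has one-second resolution, so two saves within the same second would collide):
<?php
// Hypothetical helpers wrapping the rename-on-write approach above.
function loadConfig()
{
    $file = glob('config/config*.php')[0] ?? null;
    return $file ? require $file : [];
}

function saveConfig(array $data)
{
    $old = glob('config/config*.php')[0] ?? null;
    $new = 'config/config-' . time() . '.php';
    file_put_contents($new, '<?php return ' . var_export($data, true) . ';');
    if ($old) {
        @unlink($old); // remove the stale copy
    }
}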
I have an old code base that makes extensive use of GD (no, switching to ImageMagick is not an option). It's been running for several years, through several versions. However, when I run it in my current development environment, I'm running into mysterious "gd-png error: cannot allocate image data" errors when calling imagecreatefrompng(). The PNG file I'm using is the same one I've been using, so I know it works.
Current setup:
Ansible-provisioned vagrant box
Ubuntu 14.04
PHP 5.5.9
GD 2.1.1
libPNG 1.2.50
PHP's memory limit at the time the script is run is 650M, though it's ultimately the kernel itself that ends up killing the script, and changing PHP's memory limit doesn't seem to have an effect.
The image dimensions are 7200x6600 and are about 500KiB on disk. This hasn't been a problem in my other environments and is only newly-occurring in my development environment. Unfortunately, I don't have access to the other environments anymore to do a comparison, though the setup was similar in the last working one -- Ubuntu 14.04, PHP 5.5, sufficient memory allocations.
What could be happening in this setup that wasn't happening in my previous setups? How can I fix this?
I was browsing a bit through the PHP 5.5.9 source to try and find your specific error string "cannot allocate image data", but the closest I could find is "gd-png error: cannot allocate gdImage struct". The bundled version of GD for PHP 5.5.9 (all the way up to 7.0.0) is version 2.0.35, so it seems you're looking at a custom build(?). Bugging the PHP guys might not help here.
Looking at the GD source (2.1.2, can't seem to find 2.1.1), the only place this error occurs is:
https://github.com/libgd/libgd/blob/master/src/gd_png.c#L435
image_data = (png_bytep) gdMalloc (rowbytes * height);
if (!image_data) {
    gd_error("gd-png error: cannot allocate image data\n");
    png_destroy_read_struct (&png_ptr, &info_ptr, NULL);
    if (im) {
        gdImageDestroy(im);
    }
    if (palette_allocated) {
        gdFree (palette);
    }
    return NULL;
}
Where gdMalloc is:
https://github.com/libgd/libgd/blob/GD-2.1/src/gdhelpers.c#L73
void *
gdMalloc (size_t size)
{
    return malloc (size);
}
I'm afraid this is as far as my detective work goes. As far as I can tell, the bundled 2.0.35 GD versions in PHP also just use malloc, so at first glance there's no real difference there. I also tried to find the equivalent piece of code in the bundled versions, and the closest I can find is here:
https://github.com/php/php-src/blob/PHP-5.5.9/ext/gd/libgd/gd_png.c#L323
image_data = (png_bytep) safe_emalloc(rowbytes, height, 0);
I can't seem to find safe_emalloc, but it seems the old PHP bundled versions did use a different memory allocation here than the version your environment uses. Perhaps your best bet is to check with the GD devs? To avoid a wild goose chase, I think trying another environment is your best bet, after confirming they are indeed using a non-standard GD version.
After some more PHP source safari, it seems safe_emalloc is used throughout all extensions, so I guess (a guess, mind you) that this is the preferred way to allocate memory for PHP extensions (it looks like it). If your environment is in fact using an unaltered GD 2.1.1, it's likely that it is ignoring any PHP memory settings/limits. You might be able to find some other way of specifying a ('stand-alone') GD memory limit?
Edit: Looking at the official libgd FAQ, near the bottom it states that the PHP extension should indeed respect the memory limit, and it specifies that an 8000x8000 pixel image would take about 256MB. So your 650MB limit should really be enough (unless you're working with multiple copies in the same run?), and the fact that it's not working corroborates that something fishy is going on with the memory allocation.
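To put numbers on that, a quick sketch that estimates the decode size before calling imagecreatefrompng(), assuming the ~4 bytes per pixel implied by the FAQ's 8000x8000 = 256MB figure (input.png is a placeholder):
<?php
// Rough pre-flight estimate of GD's in-memory size for a PNG.
list($w, $h) = getimagesize('input.png');
$estimate = $w * $h * 4;                  // ~4 bytes per truecolor pixel
printf("%dx%d => about %.0f MiB to decode\n", $w, $h, $estimate / 1048576);
For the 7200x6600 image in question that works out to roughly 181 MiB, comfortably under a 650M limit.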
If I were you, I would do the following:
1. Send messages to the php-devel list and see if it is a known issue. You'll also want to search their bug tracker. http://php.net/mailing-lists.php and https://bugs.php.net/
2. Use the GD command line utilities to see if you can process the same image with the same GD version outside PHP. This is to determine whether the problem lies with GD and the image, with the PHP-gd lib, or with some combination.
2.a. Create a PHP CLI program to resize the image and see if it works (see the sketch at the end of this answer).
3. Figure out exactly what happens when you call the PHP function in the underlying (probably C) code. This will involve downloading the source to the php-gd module and looking through it.
4. Code a minimal C application that does the same thing as the underlying PHP-gd library and see if you can reproduce the error.
Something in there should tell you exactly what is going on, though it will take a bit of work. You should be able to get all the tools/sources for your environment. The beauty of open source, right?
The other option would be to try different versions of these applications in your environment and see if you find one that works. This is the "shotgun" approach and not the best, IMO.
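As a starting point for step 2a, a minimal sketch of such a CLI repro (the file names are placeholders):
<?php
// Minimal CLI repro: decode and resize the problem PNG outside the
// web stack. Usage: php repro.php input.png
$src = imagecreatefrompng($argv[1]);
if (!$src) {
    fwrite(STDERR, "decode failed\n");
    exit(1);
}
$dst = imagescale($src, (int) (imagesx($src) / 2)); // imagescale() is PHP >= 5.5
imagepng($dst, 'out.png');
echo "OK\n";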
From my experience:
Try a file with a different row size. I had issues with another image library that could not read images wider than 3000px per row; it was otherwise not restricted in absolute size.
You have quite an image there: if it sits in memory as RGBA, you end up with about 180M of uncompressed image data, 140M as RGB, and still 45M as 8-bit. Are you sure your process limits will not be exceeded when this is loaded?
You used the same image with the same libraries, which means libgd and libpng have no restriction on the memory size (or at least none that bites at the size you're using). The problem may be PHP, but you say the memory limit is 650M (if your PHP installation uses Suhosin, you may also want to check suhosin.memory_limit).
You say the kernel is killing the script (though I can't see what you base that on; is there a dmesg log entry?), but the kernel only kills a process when the system is out of memory.
One problem could be memory fragmentation / contiguous allocation. Nowadays it should be more of a kernel-space problem than a user-space one, but the size your script is allocating is not an everyday size.
Fragmentation could be a problem if your script runs on a server with a long uptime and a lot of processes allocating and freeing memory; if it's a virtual machine that you just started up and the script fails at the first launch, then the problem isn't fragmentation.
You can do a simple test with the command and code below. It uses the gcc compiler to create (in the directory where you launch the command) an a.out executable which allocates and zeroes a memory chunk of size 191 x 1024 x 1024 (about ~190M). printf is used so the \n escapes become real newlines, which the preprocessor directives need, and the malloc result is checked before memset so a failed allocation returns 1 instead of crashing:
printf '#include <stdlib.h>\n#include <string.h>\n#define SIZ 191*1024*1024\nint main() {void *p=malloc(SIZ);if(!p)return 1;memset(p,0,SIZ);return 0;}' \
| gcc -O2 -xc -
run the executable with:
./a.out
If the problem is the system memory, the kernel should kill the executable (I'm not 100% sure; the kernel has policies about which process to kill when memory runs out, to be checked).
You should see the kill message, though, with dmesg perhaps.
(I think this shouldn't happen; instead the malloc should just fail.)
If the malloc succeeds, the executable was able to allocate enough memory and it should return 0; if it fails, it should return 1.
To verify the return code, just use:
./a.out
echo $?
(don't execute any other command in the middle)
If the malloc fails, you probably just need to add physical or virtual memory to the system.
Keep in mind that ~190M is just the memory for one uncompressed image; depending on how PHP, libgd and libpng work, the memory for the whole process may be twice that (or more).
I did a simple test and profiled the memory usage: the peak with a 7600x2200 image (1.5M on disk) on my system is about ~342M. Here's the test:
<?php
$im = imagecreatefrompng("input.png");
$string = "Sinker sucker socks pants";
$orange = imagecolorallocate($im, 220, 210, 60);
$px = (imagesx($im) - 7.5 * strlen($string)) / 2;
imagestring($im, 3, $px, 9, $string, $orange);
imagepng($im);
imagedestroy($im);
I tried xhprof at first, but it returned values that were too low.
So I tried a simple script, memusg:
memusg /usr/bin/php -f test.php >output.png
(the test works, by the way)
I met this problem yesterday and fixed it today.
Yesterday's env:
php-7.0.12
libpng-1.6.26
libgd-2.1.1
This suite crashed whenever I resized a PNG image.
After checking memory usage, I thought it might be a bug in the latest PHP or libpng.
So I changed the env to this:
php-5.6.27
libpng-1.2.56
libgd-2.1.1
I changed PHP and libpng to mature versions that have been in long use.
I recompiled and re-installed these, and now it works well for PNG and JPEG.
Is there a way to view the PHP error logs or Apache error logs in a web browser?
I find it inconvenient to ssh into multiple servers and run a "tail" command to follow the error logs. Is there some tool (preferably open source) that shows me the error logs online (streaming or non-streaming)?
Thanks
Simple PHP code to read the log and print it:
<?php
exec('tail /var/log/apache2/error.log', $error_logs);
foreach ($error_logs as $error_log) {
    echo "<br />" . $error_log;
}
?>
You can embed the $error_log PHP variable in your HTML as per your requirement. The best part is that tail only loads the latest errors, which won't put too much load on your server.
You can change the tail invocation to give the output you want.
Ex. tail -n 100 myfile.txt // it will give the last 100 lines
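One caution about the PHP snippet above: error-log lines can contain user-supplied strings (request paths, referrers and so on), so it may be worth escaping them before echoing. A variant:
<?php
// Same idea, but with the log lines HTML-escaped before rendering.
exec('tail -n 100 /var/log/apache2/error.log', $error_logs);
foreach ($error_logs as $error_log) {
    echo htmlspecialchars($error_log, ENT_QUOTES, 'UTF-8') . "<br />";
}
?>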
See What commercial and open source competitors are there to Splunk? and I would recommend https://github.com/tobi/clarity
Simple and easy tool.
Since everyone is suggesting clarity, I would also like to mention tailon. I wrote tailon as a more modern and secure alternative to clarity. It's still in its early stages of development, but the functionality you need is there. You may also use wtee, if you're only interested in following a single log file.
You could make a script that reads the error logs from apache2:
$apache_errorlog = file_get_contents('/var/log/apache2/error.log');
If that's not working, try to get it with the PHP functions exec or shell_exec and the command 'cat /var/log/apache2/error.log'.
EDIT: If you have multiple servers (I guess with web servers on them), you can create such a file on each machine; when you make a request to that script (hashed connection), you get the logs from that server.
I recommend LogHappens: https://loghappens.com, it allows you to view the error log in web, and this is what it looks like:
LogHappens supports all kinds of web server log formats; it comes with parsers for Apache and CakePHP, and you can write your own.
You can find it here: https://github.com/qijianjun/logHappens
It's open source and free. I forked it and did some work to make it work better in a dev env or in a public env, that is:
Support a token for security: one can't access the site without the token in config.php
Support IP whitelists for security and privacy
Support configuring the interval between AJAX requests
Support loading static files from local disk (for a local dev env)
I've found this solution https://code.google.com/p/php-tail/
It's working perfectly. I only needed to change the maximum file size check, because I was getting an error at first:
if($maxLength > $this->maxSizeToLoad) {
    $maxLength = $this->maxSizeToLoad;
    // return json_encode(array("size" => $fsize, "data" => array("ERROR: PHPTail attempted to load more (".round(($maxLength / 1048576), 2)."MB) then the maximum size (".round(($this->maxSizeToLoad / 1048576), 2) ."MB) of bytes into memory. You should lower the defaultUpdateTime to prevent this from happening. ")));
}
And I've added a default size (using ?: so a falsy filesize() falls back to 1000), though it's not needed:
lastSize = <?php echo filesize($this->log) ?: 1000; ?>;
I know this question is a bit old, but (along with the lack of good choices) it gave me the idea to create this tiny (open source) web app. https://github.com/ToX82/logHappens. It can be used online, but I'd use an .htpasswd as a basic login system. I hope it helps.
I have a bunch of client point of sale (POS) systems that periodically send new sales data to one centralized database, which stores the data into one big database for report generation.
The client POS is based on PHPPOS, and I have implemented a module that uses the standard XML-RPC library to send sales data to the service. The server system is built on CodeIgniter, and uses the XML-RPC and XML-RPCS libraries for the webservice component. Whenever I send a lot of sales data (as little as 50 rows from the sales table, and individual rows from sales_items pertaining to each item within the sale) I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
128M is the default value in php.ini, but I assume that is a huge amount to break through. In fact, I have even tried setting this value to 1024M, and all it does is take a longer time to error out.
As for steps I've taken, I've tried disabling all processing on the server-side, and have rigged it to return a canned response regardless of the input. However, I believe the problem lies in the actual sending of the data. I've even tried disabling the maximum script execution time for PHP, and it still errors out.
Changing the memory_limit by ini_set('memory_limit', '-1'); is not a proper solution. Please don't do that.
Your PHP code may have a memory leak somewhere and you are telling the server to just use all the memory that it wants. You wouldn't have fixed the problem at all. If you monitor your server, you will see that it is now probably using up most of the RAM and even swapping to disk.
You should probably try to track down the offending code in your code and fix it.
ini_set('memory_limit', '-1'); overrides the default PHP memory limit.
The correct way is to edit your php.ini file.
Edit memory_limit to your desired value.
As your question shows, 128M (which is the default limit) has been exceeded, so there is something seriously wrong with your code, as it should not take that much.
If you know why it takes that much and you want to allow it, set memory_limit = 512M or higher and you should be good.
The memory allocation for PHP can be adjusted permanently, or temporarily.
Permanently
You can permanently change the PHP memory allocation two ways.
If you have access to your php.ini file, you can edit the value for memory_limit to your desired value.
If you do not have access to your php.ini file (and your webhost allows it), you can override the memory allocation through your .htaccess file. Add php_value memory_limit 128M (or whatever your desired allocation is).
Temporarily
You can adjust the memory allocation on the fly from within a PHP file. You simply have the code ini_set('memory_limit', '128M'); (or whatever your desired allocation is). You can remove the memory limit (although machine or instance limits may still apply) by setting the value to "-1".
It's very easy to get memory leaks in a PHP script - especially if you use abstraction, such as an ORM. Try using Xdebug to profile your script and find out where all that memory went.
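For example, with Xdebug 2.x you can turn on function traces that record a memory delta for every call, which makes allocation hot spots easy to spot (the settings below are the Xdebug 2 names; check your version's documentation):
; php.ini sketch for Xdebug 2.x: trace every request with memory deltas.
xdebug.auto_trace = 1
xdebug.show_mem_delta = 1
xdebug.trace_output_dir = /tmp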
When adding 22.5 million records into an array with array_push, I kept getting "memory exhausted" fatal errors at around 20M records using 4G as the memory limit in php.ini. To fix this, I added the statement
$old = ini_set('memory_limit', '8192M');
at the top of the file. Now everything is working fine. I do not know if PHP has a memory leak. That is not my job, nor do I care. I just have to get my job done, and this worked.
The program is very simple:
$fh = fopen($myfile, 'r');
while (!feof($fh)) {
    array_push($file, stripslashes(fgets($fh)));
}
fclose($fh);
The fatal error pointed to line 3 until I boosted the memory limit, which eliminated the error.
I kept getting this error, even with memory_limit set in php.ini, and the value reading out correctly with phpinfo().
By changing it from this:
memory_limit=4G
To this:
memory_limit=4096M
This rectified the problem in PHP 7.
You can properly fix this by changing memory_limit on FastCGI/FPM:
$ vim /etc/php5/fpm/php.ini
Change the memory limit, for example from 128M to 512M, as below:
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 128M
to
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 512M
When you see the above error - especially if the (tried to allocate __ bytes) is a low value, that could be an indicator of an infinite loop, like a function that calls itself with no way out:
function exhaustYourBytes()
{
    return exhaustYourBytes();
}
In a PHP file in your site's root directory (for example, in index.php):
ini_set('memory_limit', '1024M');
After enabling these two lines, it started working:
; Determines the size of the realpath cache to be used by PHP. This value should
; be increased on systems where PHP opens many files to reflect the quantity of
; the file operations performed.
; http://php.net/realpath-cache-size
realpath_cache_size = 16k
; Duration of time, in seconds for which to cache realpath information for a given
; file or directory. For systems with rarely changing files, consider increasing this
; value.
; http://php.net/realpath-cache-ttl
realpath_cache_ttl = 120
Rather than changing the memory_limit value in your php.ini file, if there's a part of your code that could use a lot of memory, you could remove the memory limit before that section runs and then restore it afterwards:
$limit = ini_get('memory_limit');
ini_set('memory_limit', -1);
// ... do heavy stuff
ini_set('memory_limit', $limit);
In Drupal 7, you can modify the memory limit in the settings.php file located in your sites/default folder. Around line 260, you'll see this:
ini_set('memory_limit', '128M');
Even if your php.ini settings are high enough, you won't be able to consume more than 128 MB if this isn't set in your Drupal settings.php file.
Change the memory limit in the php.ini file and restart Apache. After the restart, run the phpinfo(); function from any PHP file for a memory_limit change confirmation.
memory_limit = -1
Memory limit -1 means there is no memory limit set. It's now at the maximum.
Just add an ini_set('memory_limit', '-1'); line at the top of your web page.
And you can set the memory as per your need in place of -1: 16M, 512M, etc.
For Drupal users, this Chris Lane's answer of:
ini_set('memory_limit', '-1');
works but we need to put it just after the opening
<?php
tag in the index.php file in your site's root directory.
PHP 5.3+ allows you to change the memory limit by placing a .user.ini file in the public_html folder.
Simply create the above file and type the following line in it:
memory_limit = 64M
Some cPanel hosts only accept this method.
Crash page?
(It happens when MySQL has to query large rows. By default, memory_limit is set to a small value, which is safer for the hardware.)
You can check your system's existing memory status before increasing the limit in php.ini:
# free -m
             total       used       free     shared    buffers     cached
Mem:         64457      63791        666          0       1118      18273
-/+ buffers/cache:      44398      20058
Swap:         1021          0       1021
Here I have increased it as follows, and then run service httpd restart to fix the crash page issue.
# grep memory_limit /etc/php.ini
memory_limit = 512M
For those who are scratching their heads over why on earth this little function should cause a memory leak: sometimes, by a little mistake, a function starts calling itself recursively forever.
For example, a proxy class that has the same name for a function as the object it is going to proxy:
class Proxy {
    private $actualObject;
    public function doSomething() {
        return $this->actualObjec->doSomething();
    }
}
Sometimes you may not notice that little actualObjec typo, and because the proxy actually has a doSomething method of its own, PHP wouldn't give you any error; in a large class it could stay hidden from the eyes for a couple of minutes while you work out why it is leaking memory.
I had the error below while running on a dataset smaller than had worked previously.
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4096 bytes) in C:\workspace\image_management.php on line 173
As the search for the fault brought me here, I thought I'd mention that it's not always one of the technical solutions in previous answers, but something more simple. In my case, it was Firefox: before I ran the program, it was already using 1,157 MB.
It turns out that I'd been watching a 50 minute video a bit at a time over a period of days and that messed things up. It's the sort of fix that experts correct without even thinking about it, but for the likes of me it's worth bearing in mind.
In my case on a Mac (Catalina, XAMPP) there was no loaded php.ini file, so I had to do this first:
sudo cp /etc/php.ini.default /etc/php.ini
sudo nano /etc/php.ini
Then change memory_limit = 512M.
Then restart Apache and check whether the file is loaded:
php -i | grep php.ini
The result was:
Configuration File (php.ini) Path => /etc
Loaded Configuration File => /etc/php.ini
Finally, check:
php -r "echo ini_get('memory_limit').PHP_EOL;"
Using yield might be a solution as well. See generator syntax.
Instead of changing the php.ini file to allow a bigger memory allocation, implementing yield inside a loop can sometimes fix the issue. Rather than building up all the data at once, a generator produces results one at a time, saving a lot of memory.
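For instance, a sketch of a generator-based line reader (contrast with the array_push example earlier, which held every line in memory at once; the file name is a placeholder):
<?php
// Stream a large file line by line instead of building a giant array.
function readLines($path)
{
    $fh = fopen($path, 'r');
    while (($line = fgets($fh)) !== false) {
        yield rtrim($line, "\r\n");
    }
    fclose($fh);
}

foreach (readLines('huge.csv') as $line) {
    // process one line at a time; memory use stays flat
}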
The reason for this error is that your server configuration has a very low memory limit. Try adding this to wp-config.php (put it after <?php in this file):
define('WP_MEMORY_LIMIT', '96M');
Please note that this limit is OK for the theme and the plugins that come with the theme. If you want to enable other plugins you may need to increase the limit further.
define('WP_MEMORY_LIMIT', '256M');
Running the script like this (cron case for example): php5 /pathToScript/info.php produces the same error.
The correct way: php5 -cli /pathToScript/info.php
If you're running a WHM-powered VPS (virtual private server), you may find that you do not have permission to edit php.ini directly; the system must do it. In the WHM host control panel, go to Service Configuration → PHP Configuration Editor and modify memory_limit.
I find it useful to include or require dbconnection.php and functions.php directly in the files that are actually processed, rather than in the header, which is itself included everywhere.
So if your header and footer are included on every page, simply include all your functional files before the header is included.
Greetings. This is a very common problem, because if you have very little memory allocated to PHP and your website is growing, it will require more resources.
I found myself with a site that gave error 500 when modifying only certain products; the problem was that very heavy images had been used in those specific products. The solution:
1. Increase memory_limit in php.ini.
2. Reduce the size of the images.
3. Adapt memory_limit again to an acceptable value: "512M", at least for me, is more than enough.
Now it is important that you verify that the changes are taking effect, because PHP has several versions and several types of installation on the server; you may modify one file and see no change, because you are not modifying the correct php.ini file.
How do you verify that you are modifying the correct file?
In the PrestaShop dashboard, go to Advanced Settings / Information; there you can see "Memory limit".
Always remember that after making a change in the php.ini file it is advisable to restart Apache or Nginx.
Ubuntu: sudo service apache2 restart
IMPORTANT NOTE: Never set memory_limit = -1, as many people mention here. The problem is that if you have a problem with a file or module, you could end up in a continuous loop consuming all of the server's memory and processor. A simple example: a module has an error and makes a call to a function, and until the result is positive it keeps calling; this creates an infinite loop, and it will never stop because PHP has no limit.
I hope it helps colleagues who have this problem.
The most common cause of this error message for me is omitting the "++" operator from a PHP "for" statement. This causes the loop to continue forever, no matter how much memory you allow to be used. It is a simple logic error, yet one that is difficult for the compiler or runtime system to detect. It is easy for us to correct if we think to look for it!
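For illustration, the shape of the bug:
// The increment is missing, so $i never reaches 10 and $rows grows
// until memory is exhausted.
for ($i = 0; $i < 10; ) {
    $rows[] = $i;
}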
But suppose you want a general procedure for stopping such a loop early and reporting the error? You can simply instrument each of your loops (or at least the innermost loops) as discussed below.
In some cases such as recursion inside exceptions, set_time_limit fails, and the browser keeps trying to load the PHP output, either with an infinite loop or with the fatal error message which is the topic of this question.
By reducing the allowed allocation size near the beginning of your code you might be able to prevent the fatal error, as discussed in the other answers.
Then you may be left with a program that terminates, but is still difficult to debug.
Whether or not your program terminates, instrument your code by inserting BreakLoop() calls inside your program to gain control and find out what loop or recursion in your program is causing the problem.
The definition of BreakLoop is as follows:
function BreakLoop($MaxRepetitions=500, $LoopSite="unspecified")
{
    static $Sites=[];
    if (!@$Sites[$LoopSite] || !$MaxRepetitions)
        $Sites[$LoopSite] = ['n'=>0, 'if'=>0];
    if (!$MaxRepetitions)
        return;
    if (++$Sites[$LoopSite]['n'] >= $MaxRepetitions)
    {
        $S = debug_backtrace(); // array_reverse
        $info = $S[0];
        $File = $info['file'];
        $Line = $info['line'];
        exit("*** Loop for site $LoopSite was interrupted after $MaxRepetitions repetitions. In file $File at line $Line.");
    }
} // BreakLoop
The $LoopSite argument can be the name of a function in your code. It isn't really necessary, since the error message you will get will point you to the line containing the BreakLoop() call.
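For instance, a sketch of instrumenting a suspect loop (fetchNextRow() and processRow() are hypothetical stand-ins for the real loop body):
while ($row = fetchNextRow()) {
    BreakLoop(10000, 'row-loop'); // abort with a report after 10,000 passes
    processRow($row);
}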
In my case it was an issue with the way a function was written. A memory leak can be caused by assigning a new value to a function's input variable, e.g.:
/**
 * Memory leak function that illustrates unintentional bad code
 * @param $variable - input that will be assigned a new value
 * @return null
 **/
function doSomething($variable) {
    $variable = 'set value';
    // Or
    $variable .= 'set value';
}
Increasing the memory_limit fixed the problem. However, I had trouble finding where to set the memory limit. I am working on my project directly on a live server, so if you're doing the same: on cPanel, you can find memory_limit if you go to Software → MultiPHP INI Editor and select the location. I increased mine from 256M to 512M. You can also find instructions here.