My website was constantly running out of memory in random spots of code when the memory limit was set to 256M, so I changed it to 1024M to see if it was an issue of space or some bad loop in the code... The website still ran out of memory after a while. What are some things I could do to keep the memory from overflowing?
I saw suggestions about limiting requests, but I think that does not solve the root of the problem. I will do that if it's my last option, but I want to know the best ways of troubleshooting this.
PHP Version: 7.2.30
Apache Version: 2.4.41
WordPress Version: 5.4.1
This is an image of the error shown on the website when the memory overflows:
This is an example of the error. Keep in mind there are about 100 of these in the log file in a single day, and the location of the error varies: sometimes it's in a PHP file in the plugins folder, sometimes in the themes folder:
[16-May-2020 19:16:22 UTC] PHP Fatal error: Out of memory (allocated 21233664) (tried to allocate 4718592 bytes) in /var/www/html/wp-content/plugins/woocommerce/includes/log-handlers/class-wc-log-handler-file.php on line 21
EDIT: The logs also said that I did not have the XML service installed. I installed it but am not sure if that is the root of the problem.
WordPress should never consume that much memory on a single load. You say it's a fairly standard setup. Do you get the same levels of memory usage with a vanilla installation, without themes and plugins? And then, do things spike after a given plugin is enabled? Build it up from scratch and see if you can find the culprit (e.g. a buggy plugin, a plugin conflict, or a faulty configuration).
If you want to dig in a bit deeper under the hood, short of using more involved PHP debugging tools like Xdebug, the built-in memory_get_usage() function is your friend. You can for example use a logging function like this (off the top of my head, briefly tested):
function log_mem($file, $line, $save = true) {
    static $iter = 0; // running call counter, to show processing order
    $iter++;
    $usage = round(memory_get_usage() / 1048576, 2) . ' MB';
    $log = "[" . time() . "] {$iter}, {$file}#{$line}, {$usage}\n";
    $save && file_put_contents('tmp/php_memory.log', $log, FILE_APPEND);
    return $log;
}
log_mem(__FILE__, __LINE__); // to save into log file
echo log_mem(__FILE__, __LINE__, false); // to output only
Pop the log_mem() command into suspect locations. It will log the timestamp, the iteration number (i.e. processing order), file name, line number, and the current memory usage into a file. Like so:
[1589662734] 1, C:\server\home\more\dev\mem_log.php#14, 0.78 MB
[1589662734] 2, C:\server\home\more\dev\mem_log.php#18, 34.78 MB
[1589662734] 3, C:\server\home\more\dev\mem_log.php#22, 68.78 MB
You can then see where the spikes are triggered, and begin to fix your code.
If you don't want to add and remove the log commands over and over (the repeated filesystem access obviously causes some processing overhead), you can make the call conditional with a constant boolean switch, placed in any site-wide included file:
const MEM_LOG = true;
...
MEM_LOG && log_mem(__FILE__, __LINE__);
Remember to follow up and let us know what caused the memory leak. Good luck! ^_^
People think WordPress is easy. Even if you never touch the code, it is a very difficult system to manage and keep secure, and its complexity makes code customization very, very hard.
Your logs already show you where the system is running out of memory.
This is an image of the error shown on the website when the memory overflows
What you said here illustrates that you are very inexperienced in operating PHP websites. Your installation should not be writing errors to the browser. This makes me think that you are way out of your depth in trying to resolve this.
You have posted this on a programming forum, implying you may be writing custom code, in which case the correct approach to resolving the error is to use a profiler to track where the memory is being used up.
However, if there is in fact no custom code on the site, then you need to start by tracking memory usage (register a shutdown function for this, as sketched below) and disabling themes and plugins until you find the problem.
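A minimal sketch of the shutdown-function approach, assuming you can drop it into a site-wide bootstrap file such as wp-config.php:
register_shutdown_function(function () {
    // Log the request URI and the peak memory the request used.
    $peak = round(memory_get_peak_usage(true) / 1048576, 2);
    $uri = $_SERVER['REQUEST_URI'] ?? 'cli';
    error_log("Peak memory for {$uri}: {$peak} MB");
});
Comparing the logged peaks across requests, with plugins toggled on and off, shows which combination drives usage up.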
I have a relatively small store, about 20k SKUs, all simple products. I'm using Magento 1.7.2 but had the same problem with all the older versions. I simply cannot export my products to a CSV: out of memory when running directly from the dataflow profiles in the Magento backend, and the same error when running it from the shell.
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 71 bytes) in /home/public_html/app/code/core/Mage/Eav/Model/Entity/Attribute/Source/Table.php on line 62
I've increased the memory limits and execution times in Magento's .htaccess to 512M, in Magento's php.ini to 512M, and in my VPS PHP configuration to 512M. It still burns through it in about 4 minutes and runs out of memory.
I'm so confused; my entire database (zipped) is only 28 MB! What am I missing to make Magento's export-all-products function work?
Magento dataflow does have a tendency to use massive amounts of memory, making exports on large stores difficult. For stores with large product catalogues it is often a lot quicker and easier to write a script that exports directly from the database rather than going through dataflow.
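A rough sketch of that approach, assuming a PDO connection and using illustrative table and column names (Magento's real catalog is EAV-based and needs joins):
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
// Stream results instead of buffering the whole result set in memory.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$out = fopen('products.csv', 'w');
fputcsv($out, ['sku', 'name', 'price']); // header row
$stmt = $pdo->query('SELECT sku, name, price FROM products');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row); // only one row is held in memory at a time
}
fclose($out);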
This might be a problem with your .htaccess file(s) not overriding the memory_limit setting globally set in php.ini.
Another option is to set memory_limit to unlimited in your index.php for testing. Then you will find out whether the changes in .htaccess are taking effect or not.
I solved this problem by exporting 500, 1000, or as many products as I wanted at a time (with a custom export script).
I made a file that received $start and $productsToExport as parameters. The file took the collection of products and then used
LIMIT ($start-1)*$productsToExport, $productsToExport
This script only returned the number of products exported.
I made a second, master script that made a recursive AJAX call to the first file, with the parameters $start = 1 and $productsToExport = 500. When the AJAX call finished, it did the same with $start = 2, and so on, until no products were left.
The advantage of this is that it doesn't overload the server (one AJAX call runs only after the previous one has finished), and if an error occurs, the script continues. Also, the memory_limit and max_execution_time stay safe, because each request only handles a small batch.
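A sketch of what the worker script might look like; the file name, table, and column names are illustrative, not the poster's actual code:
// export_chunk.php - exports one batch and prints how many rows it wrote
$start = max(1, (int) ($_GET['start'] ?? 1));
$perPage = (int) ($_GET['productsToExport'] ?? 500);
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$stmt = $pdo->prepare('SELECT sku, name, price FROM products LIMIT ?, ?');
$stmt->bindValue(1, ($start - 1) * $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->execute();
$out = fopen('export.csv', 'a'); // the file grows across successive AJAX calls
$exported = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);
    $exported++;
}
fclose($out);
echo $exported; // the master script stops when this reaches 0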
If by 20k SKUs you mean 20,000, then this is totally possible. The export is very memory hungry, unfortunately. I always increase memory_limit to 2000M in this case; it takes a while to create the file, but it succeeds in the end.
I need to read a large file to find some labels and create a dynamic form. I cannot use file() or file_get_contents() because of the file size.
If I read the file line by line with the following code
set_time_limit(0);
$handle = fopen($file, 'r');
if ($handle) {
    while (!feof($handle)) {
        $line = fgets($handle);
        if ($line) {
            // do something.
        }
    }
}
echo 'Read complete';
I get the following error in Chrome:
Error 101 (net::ERR_CONNECTION_RESET)
This error occurs after several minutes, so I don't think the max_input_time setting is the problem (it is set to 60).
What server software do you use: Apache, nginx? You should set the maximum accepted file upload somewhere higher than 500 MB. Furthermore, the maximum upload size in php.ini should be bigger than 500 MB too, and I think PHP must be allowed to spawn processes larger than 500 MB (check this in your PHP config).
Set the memory limit with ini_set("memory_limit", "600M"); you also need to set the timeout limit:
set_time_limit(0);
Generally long running processes should not be done while the users waits for them to complete.
I'd recommend using a background job oriented tool that can handle this type of work and can be queried about the status of the job (running/finished/error).
My first guess is that something in the middle breaks the connection because of a timeout. Whether it's a timeout in the web server (which PHP cannot know about) or some firewall doesn't really matter; PHP gets a signal to close the connection and the script stops running. You can circumvent this behaviour by using ignore_user_abort(true), which along with set_time_limit(0) should do the trick.
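A minimal sketch of that combination, placed at the top of the long-running script:
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // remove the execution time limit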
The caveat is that whatever caused the connection abort will still do it, though the script will still finish its job. One very annoying side effect is that the script could possibly be executed multiple times in parallel, with none of them ever completing.
Again, I recommend using some background task to do it and an interface for the end-user (browser) to verify the status of that task. You could also implement a basic one yourself via cron jobs and database/text files that hold the status.
I have a bunch of client point of sale (POS) systems that periodically send new sales data to one centralized database, which stores the data into one big database for report generation.
The client POS is based on PHPPOS, and I have implemented a module that uses the standard XML-RPC library to send sales data to the service. The server system is built on CodeIgniter, and uses the XML-RPC and XML-RPCS libraries for the webservice component. Whenever I send a lot of sales data (as little as 50 rows from the sales table, and individual rows from sales_items pertaining to each item within the sale) I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
128M is the default value in php.ini, but I assume that is a huge amount to exhaust. In fact, I have even tried setting this value to 1024M, and all it does is take longer to error out.
As for steps I've taken, I've tried disabling all processing on the server-side, and have rigged it to return a canned response regardless of the input. However, I believe the problem lies in the actual sending of the data. I've even tried disabling the maximum script execution time for PHP, and it still errors out.
Changing the memory_limit by ini_set('memory_limit', '-1'); is not a proper solution. Please don't do that.
Your PHP code may have a memory leak somewhere and you are telling the server to just use all the memory that it wants. You wouldn't have fixed the problem at all. If you monitor your server, you will see that it is now probably using up most of the RAM and even swapping to disk.
You should probably try to track down the offending code in your code and fix it.
ini_set('memory_limit', '-1'); overrides the default PHP memory limit.
The correct way is to edit your php.ini file.
Edit memory_limit to your desired value.
As your question says, the 128M default limit has been exceeded, so there is something seriously wrong with your code, as it should not take that much.
If you know why it takes that much and you want to allow it, set memory_limit = 512M or higher and you should be good.
The memory allocation for PHP can be adjusted permanently, or temporarily.
Permanently
You can permanently change the PHP memory allocation in two ways.
If you have access to your php.ini file, you can edit the value for memory_limit to your desired value.
If you do not have access to your php.ini file (and your webhost allows it), you can override the memory allocation through your .htaccess file. Add php_value memory_limit 128M (or whatever your desired allocation is).
Temporarily
You can adjust the memory allocation on the fly from within a PHP file. You simply have the code ini_set('memory_limit', '128M'); (or whatever your desired allocation is). You can remove the memory limit (although machine or instance limits may still apply) by setting the value to "-1".
It's very easy to get memory leaks in a PHP script - especially if you use abstraction, such as an ORM. Try using Xdebug to profile your script and find out where all that memory went.
When adding 22.5 million records into an array with array_push, I kept getting "memory exhausted" fatal errors at around 20M records, using 4G as the memory limit in the php.ini file. To fix this, I added the statement
$old = ini_set('memory_limit', '8192M');
at the top of the file. Now everything is working fine. I do not know if PHP has a memory leak. That is not my job, nor do I care. I just have to get my job done, and this worked.
The program is very simple:
$fh = fopen($myfile, 'r');
while (!feof($fh)) {
    array_push($file, stripslashes(fgets($fh)));
}
fclose($fh);
The fatal error pointed to line 3 (the array_push) until I boosted the memory limit, which eliminated the error.
I kept getting this error, even with memory_limit set in php.ini and the value reading out correctly with phpinfo().
Changing it from this:
memory_limit=4G
to this:
memory_limit=4096M
rectified the problem in PHP 7.
You can properly fix this by changing memory_limit for FastCGI/FPM:
$ vim /etc/php5/fpm/php.ini
Change the memory limit, for example from 128M to 512M, as below:
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 128M
to
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 512M
When you see the above error - especially if the (tried to allocate __ bytes) is a low value, that could be an indicator of an infinite loop, like a function that calls itself with no way out:
function exhaustYourBytes()
{
    return exhaustYourBytes();
}
In your site's root directory (e.g. at the top of index.php):
ini_set('memory_limit', '1024M');
After enabling these two settings, it started working:
; Determines the size of the realpath cache to be used by PHP. This value should
; be increased on systems where PHP opens many files to reflect the quantity of
; the file operations performed.
; http://php.net/realpath-cache-size
realpath_cache_size = 16k
; Duration of time, in seconds for which to cache realpath information for a given
; file or directory. For systems with rarely changing files, consider increasing this
; value.
; http://php.net/realpath-cache-ttl
realpath_cache_ttl = 120
Rather than changing the memory_limit value in your php.ini file, if there's a part of your code that could use a lot of memory, you can remove the memory_limit before that section runs, and then restore it after:
$limit = ini_get('memory_limit');
ini_set('memory_limit', -1);
// ... do heavy stuff
ini_set('memory_limit', $limit);
In Drupal 7, you can modify the memory limit in the settings.php file located in your sites/default folder. Around line 260, you'll see this:
ini_set('memory_limit', '128M');
Even if your php.ini settings are high enough, you won't be able to consume more than 128 MB if this isn't set in your Drupal settings.php file.
Change the memory limit in the php.ini file and restart Apache. After the restart, run the phpinfo(); function from any PHP file for a memory_limit change confirmation.
memory_limit = -1
A memory limit of -1 means no memory limit is set; the script may use as much memory as the machine allows.
Just add an ini_set('memory_limit', '-1'); line at the top of your web page.
You can also set the memory to whatever you need in place of -1: 16M, etc.
For Drupal users, Chris Lane's answer of:
ini_set('memory_limit', '-1');
works, but we need to put it just after the opening <?php tag in the index.php file in your site's root directory.
PHP 5.3+ allows you to change the memory limit by placing a .user.ini file in the public_html folder.
Simply create the above file and type the following line in it:
memory_limit = 64M
Some cPanel hosts only accept this method.
Crash page?
(It happens when MySQL has to query large rows. By default, memory_limit is set low, which is safer for the hardware.)
You can check your system's existing memory status before increasing php.ini:
# free -m
total used free shared buffers cached
Mem: 64457 63791 666 0 1118 18273
-/+ buffers/cache: 44398 20058
Swap: 1021 0 1021
Here I have increased it as follows; then run service httpd restart to fix the crash-page issue.
# grep memory_limit /etc/php.ini
memory_limit = 512M
For those who are scratching their heads to find out why on earth this little function should cause a memory leak: sometimes, by a small mistake, a function starts recursively calling itself forever.
For example, a proxy class that has a method with the same name as the object it proxies.
class Proxy {
    private $actualObject;
    public function doSomething() {
        // note the typo: "actualObjec" instead of "actualObject"
        return $this->actualObjec->doSomething();
    }
}
Sometimes you may forget to bring in that little actualObjec member, and because the proxy itself has a doSomething method, PHP wouldn't give you any error. For a large class, it could stay hidden from view for quite a while before you find out why it is leaking memory.
I had the error below while running on a dataset smaller than had worked previously.
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4096 bytes) in C:\workspace\image_management.php on line 173
As the search for the fault brought me here, I thought I'd mention that the fix isn't always one of the technical solutions in the previous answers, but something more simple. In my case it was Firefox: before I ran the program, it was already using 1,157 MB.
It turns out that I'd been watching a 50-minute video a bit at a time over a period of days, and that messed things up. It's the sort of thing experts fix without even thinking about it, but for the likes of me it's worth bearing in mind.
In my case, on Mac (Catalina, XAMPP), no php.ini file was loaded, so I had to do this first:
sudo cp /etc/php.ini.default /etc/php.ini
sudo nano /etc/php.ini
Then change memory_limit = 512M.
Then restart Apache and check whether the file is loaded:
php -i | grep php.ini
The result was:
Configuration File (php.ini) Path => /etc
Loaded Configuration File => /etc/php.ini
Finally, check:
php -r "echo ini_get('memory_limit').PHP_EOL;"
Using yield might be a solution as well. See generator syntax.
Instead of changing the php.ini file for a bigger memory allowance, implementing yield inside a loop might fix the issue. Instead of building all the data at once, a generator produces one item at a time, saving a lot of memory.
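As a minimal sketch (the file name is illustrative), a generator that reads a file line by line:
function readLines($path) {
    $handle = fopen($path, 'r');
    while (($line = fgets($handle)) !== false) {
        yield $line; // hand back one line at a time instead of an array of all lines
    }
    fclose($handle);
}
foreach (readLines('large_file.txt') as $line) {
    // Process one line; only the current line is held in memory.
}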
The reason for this error is that your server configuration has a very low memory limit. Try adding this to wp-config.php (put it after <?php in this file):
define('WP_MEMORY_LIMIT', '96M');
Please note that this limit is OK for the theme and the plugins that come with the theme. If you want to enable other plugins you may need to increase the limit further.
define('WP_MEMORY_LIMIT', '256M');
Running the script like this (a cron case, for example): php5 /pathToScript/info.php produced the same error.
The correct way: php5-cli /pathToScript/info.php
If you're running a WHM-powered VPS (virtual private server) you may find that you do not have permissions to edit PHP.INI directly; the system must do it. In the WHM host control panel, go to Service Configuration → PHP Configuration Editor and modify memory_limit:
I found it helps to include or require _dbconnection.php_ and _functions.php_ in the files that are actually processed, rather than including them in the header (which is itself included everywhere).
So if your header and footer are included on every page, simply include all your functional files before the header is included.
Greetings. This is a very common problem, because if you have very little memory allocated to PHP and your website is growing, it will require more resources.
I found myself with a site that gave error 500 when modifying only some products; the problem was that very heavy images had been used in those specific products. Solution:
1.- Increase "memory_limit" in php.ini
2.- Lower the weight of the images.
3.- Set "memory_limit" back to an acceptable value: "512M" is, at least for me, more than enough.
Now, it is important to verify that the changes take effect, because PHP has several versions and several types of installation on the server; you might modify one php.ini and nothing changes, because you are not modifying the correct file.
How do you verify that you are modifying the correct file?
In the PrestaShop dashboard, go to Advanced Settings / Information, where you can see "Memory limit".
Always remember that after making a change in the php.ini file, it is advisable to restart Apache or Nginx.
Ubuntu: sudo service apache2 restart
IMPORTANT NOTE: Never set memory_limit = -1, as many people suggest here. The problem is that if you have a bug in a file or module, you could end up in a continuous loop consuming all of the server's memory and CPU. A simple example: a module has an error and keeps calling a function until it gets a positive result; this creates an infinite loop, and it will never stop, because PHP has no limit to enforce.
I hope it helps colleagues who have this problem.
The most common cause of this error message for me is omitting the "++" operator from a PHP "for" statement. This causes the loop to continue forever, no matter how much memory you allow to be used. It is a simple syntax error, yet it is difficult for the compiler or runtime system to detect. It is easy for us to correct if we think to look for it!
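For example, a deliberately broken sketch like this will exhaust any limit:
for ($i = 0; $i < 10; $i) { // should be $i++, so the condition never becomes false
    $data[] = str_repeat('x', 1024); // the array grows until memory runs out
}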
But suppose you want a general procedure for stopping such a loop early and reporting the error? You can simply instrument each of your loops (or at least the innermost loops) as discussed below.
In some cases such as recursion inside exceptions, set_time_limit fails, and the browser keeps trying to load the PHP output, either with an infinite loop or with the fatal error message which is the topic of this question.
By reducing the allowed allocation size near the beginning of your code you might be able to prevent the fatal error, as discussed in the other answers.
Then you may be left with a program that terminates, but is still difficult to debug.
Whether or not your program terminates, instrument your code by inserting BreakLoop() calls inside your program to gain control and find out what loop or recursion in your program is causing the problem.
The definition of BreakLoop is as follows:
function BreakLoop($MaxRepetitions = 500, $LoopSite = "unspecified")
{
    static $Sites = [];
    if (!@$Sites[$LoopSite] || !$MaxRepetitions)
        $Sites[$LoopSite] = ['n' => 0, 'if' => 0];
    if (!$MaxRepetitions)
        return;
    if (++$Sites[$LoopSite]['n'] >= $MaxRepetitions)
    {
        $S = debug_backtrace(); // array_reverse
        $info = $S[0];
        $File = $info['file'];
        $Line = $info['line'];
        exit("*** Loop for site $LoopSite was interrupted after $MaxRepetitions repetitions. In file $File at line $Line.");
    }
} // BreakLoop
The $LoopSite argument can be the name of a function in your code. It isn't really necessary, since the error message you will get will point you to the line containing the BreakLoop() call.
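For instance, an instrumented loop might look like this; next_row() and process() are hypothetical stand-ins for your own code:
while ($row = next_row()) {
    BreakLoop(10000, 'row-loop'); // aborts with a report after 10,000 iterations
    process($row);
}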
In my case it was an issue with the way a function was written. A memory leak can be caused by assigning a new value to a function's input variable, e.g.:
/**
 * Memory leak function that illustrates unintentional bad code
 * @param $variable - input variable that will be assigned a new value
 * @return null
 **/
function doSomething($variable) {
    $variable = 'set value';
    // Or
    $variable .= 'set value';
}
Increasing the memory_limit fixed the problem. However, I had trouble finding the memory limit setting. I am working on my project directly on the live server, so if you're doing the same, in cPanel you can find memory_limit under Software - MultiPHP INI Editor by selecting the location. I increased mine from 256M to 512M. You can also find instructions here.
This error message is being presented, any suggestions?
Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php
If your script is expected to allocate that much memory, then you can increase the memory limit by adding this line to your PHP file:
ini_set('memory_limit', '44M');
where 44M is the amount you expect to be consumed.
However, most of the time this error message means that the script is doing something wrong, and increasing the memory limit will just result in the same error message with different numbers.
Therefore, instead of increasing the memory limit, you must rewrite the code so it won't allocate that much memory: for example, process large amounts of data in smaller chunks and unset variables that hold large values but are no longer needed, as sketched below.
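A sketch of both techniques; fetchBatch() and handle() are hypothetical stand-ins for your own data source and processing:
$offset = 0;
$batchSize = 500;
while ($rows = fetchBatch($offset, $batchSize)) { // returns [] when no rows are left
    foreach ($rows as $row) {
        handle($row);
    }
    unset($rows); // release the batch before fetching the next one
    $offset += $batchSize;
}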
Here are two simple methods to increase the limit on shared hosting:
If you have access to your PHP.ini file, change the line in PHP.ini
If your line shows 32M try 64M:
memory_limit = 64M ; Maximum amount of memory a script may consume (64MB)
If you don't have access to PHP.ini try adding this to an .htaccess file:
php_value memory_limit 64M
Your script is using too much memory. This can often happen in PHP if you have a loop that has run out of control and you are creating objects or adding to arrays on each pass of the loop.
Check for infinite loops.
If that isn't the problem, try to help PHP out by destroying objects that you are finished with, by setting them to null, e.g. $OldVar = null;
Check the code where the error actually happens as well. Would you expect that line to be allocating a massive amount of memory? If not, try to figure out what has gone wrong...
Doing:
ini_set('memory_limit', '-1');
is never good. If you want to read a very large file, it is best practice to copy it bit by bit. Try the following code for best practice.
$path = 'path_to_file_.txt';
$file = fopen($path, 'r');
$len = 1024; // 1 KB at a time; you can choose anything, but don't make it too big
while (!feof($file)) {
    $output = fread($file, $len);
    // Process or output each chunk here instead of accumulating it,
    // so memory usage stays constant.
    echo $output;
}
fclose($file);
I faced the same problem in PHP 7.2 with Laravel 5.6. I just increased memory_limit in php.ini to what my application demands; it might be 256M / 512M / 1024M... Now it works fine.
It is unfortunately easy to program in PHP in a way that consumes memory faster than you realise. Copying strings, arrays and objects instead of using references will do it, though PHP 5 is supposed to do this more automatically than in PHP 4. But dealing with your data set in entirety over several steps is also wasteful compared to processing the smallest logical unit at a time. The classic example is working with large resultsets from a database: most programmers fetch the entire resultset into an array and then loop over it one or more times with foreach(). It is much more memory efficient to use a while() loop to fetch and process one row at a time. The same thing applies to processing a file.
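A sketch of the row-at-a-time pattern, using PDO as an example driver; processRow() is a hypothetical handler:
$stmt = $pdo->query('SELECT * FROM sales');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    processRow($row); // only the current row is held in memory
}
// ...as opposed to $stmt->fetchAll(), which loads every row at once.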
If you want to read large files, you should read them bit by bit instead of reading them at once.
It's simple math: if you read a 1 MB file at once, then at least 1 MB of memory is needed at the same time to hold the data.
So you should read them bit by bit using fopen & fread.
I was also having the same problem, and looked for phpinfo.ini, php.ini or .htaccess files to no avail. Finally I looked at some PHP files, opened them, and checked the code inside for memory settings. This solution is what I came up with, and it worked for me. I was using WordPress, so this solution might only work for WordPress memory-limit problems.
My solution: open the default-constants.php file in the /public_html/wp-includes folder with a code editor, and find the memory settings under the wp_initial_constants scope, or just Ctrl+F for the word "memory". There you will come across WP_MEMORY_LIMIT and WP_MAX_MEMORY_LIMIT. Just increase them; it was 64 MB in my case, and I increased it to 128 MB and then to 200 MB.
// Define memory limits.
if ( ! defined( 'WP_MEMORY_LIMIT' ) ) {
if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
define( 'WP_MEMORY_LIMIT', $current_limit );
} elseif ( is_multisite() ) {
define( 'WP_MEMORY_LIMIT', '200M' );
} else {
define( 'WP_MEMORY_LIMIT', '128M' );
}
}
if ( ! defined( 'WP_MAX_MEMORY_LIMIT' ) ) {
if ( false === wp_is_ini_value_changeable( 'memory_limit' ) ) {
define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
} elseif ( -1 === $current_limit_int || $current_limit_int > 268435456 /* = 256M */ ) {
define( 'WP_MAX_MEMORY_LIMIT', $current_limit );
} else {
define( 'WP_MAX_MEMORY_LIMIT', '256M' );
}
}
By the way, please don't use the following, because it's bad practice:
ini_set('memory_limit', '-1');
I notice many answers just try to increase the amount of memory given to the script, which has its place, but more often than not it means that something is being too liberal with memory due to an unforeseen amount of volume or size. Obviously, if you're not the author of a script, you're at the mercy of the author, unless you're feeling ambitious :) The PHP docs even say memory issues are due to "poorly written scripts".
It should be mentioned that ini_set('memory_limit', '-1'); (no limit) can cause server instability, as 0 bytes free = bad things. Instead, find a reasonable balance between what your script is trying to do and the amount of available memory on the machine.
A better approach: if you are the author of the script (or ambitious), you can debug such memory issues with Xdebug. The latest version (2.6.0, released 2018-01-29) brought back memory profiling, which shows you what function calls are consuming large amounts of memory. It exposes issues in the script that are otherwise hard to find. Usually the inefficiencies are in a loop that isn't expecting the volume it's receiving, but each case will be left as an exercise to the reader :)
The xdebug documentation is helpful, but it boils down to 3 steps:
Install it - available through apt-get, yum, etc.
Configure it - xdebug.ini: xdebug.profiler_enable = 1, xdebug.profiler_output_dir = /where/ever/ (see the example after this list)
View the profiles in a tool like QCacheGrind or KCacheGrind
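For step 2, an xdebug.ini for Xdebug 2.x might look like this (the extension path and output directory are illustrative):
zend_extension = xdebug.so
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp/xdebug-profiles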
You can increase the memory allowed to a PHP script by executing the following line above all other code in the script:
ini_set('memory_limit', '-1'); // enables all the memory available
Also, deallocate unwanted variables in the script (e.g. with unset()).
ini_set('memory_limit', '-1');
If you are trying to read a file, that will take up memory in PHP. For instance, if you are trying to open and read an MP3 file (like, say, $data = file("http://mydomain.com/path/sample.mp3")), it is going to pull it all into memory.
As Nelson suggests, you can work to increase your maximum memory limit if you actually need to be using this much memory.
We had a similar situation, and we tried the suggestion given at the top of the answers:
ini_set('memory_limit', '-1');
Everything worked fine, and it compressed image files greater than 1 MB down to KBs.
Write
ini_set('memory_limit', '-1');
at the top of your index.php, right after the opening PHP tag.
I was receiving the same error message after switching to a new theme in WordPress. PHP was running version 5.3, so I switched to 7.2. That fixed the issue.
If you are using shared hosting, you will not be able to enforce an increase in the PHP memory limit yourself.
Just go to your cPanel and upgrade your PHP version to 7.1 or above, and you should be good to go.
I want to share my experience on this issue!
Suppose you have a class A and class B.
class A {
    protected $userB;
    public function __construct() {
        $this->userB = new B(); // constructing A constructs B...
    }
}
class B {
    protected $userA;
    public function __construct() {
        $this->userA = new A(); // ...which constructs A again, forever
    }
}
Instantiating either class starts an endless chain of object constructions, which can create exactly this kind of issue!
I ran the command composer update, and now it's working. If that doesn't solve your problem, increase memory_limit in the php.ini file.
WordPress users, add the line:
@ini_set('memory_limit', '-1');
to wp-settings.php, which you can find in the WordPress installation root folder.
I hadn't renewed my hosting and the database was read-only. Joomla needed to write the session and couldn't do it.
I had the same issue when running PHP on the command line. Recently, I had changed the php.ini file and made a mistake while editing it.
This was for PHP 7.0.
Path to the php.ini where I made the mistake: /etc/php/7.0/cli/php.ini
I had set memory_limit = 256 (which means 256 bytes) instead of memory_limit = 256M (which means 256 megabytes).
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M
Once I corrected it, my process started running fine.
For Composer commands that exhaust memory, you can lift Composer's own limit by prefixing the command with the COMPOSER_MEMORY_LIMIT environment variable:
COMPOSER_MEMORY_LIMIT=-1 composer require mageplaza/module-core
As an alternative to the wp-settings.php line above, WordPress users can set memory_limit to -1 in cPanel.
If you are using Laravel, then use this approach:
public function getClientsListApi(Request $request) {
    print_r($request->all()); // print all request input
    print_r($request->name);  // print just the name field
}
instead of:
public function getClientsListApi(Request $request) {
    print_r($request); // dumping the whole Request object triggers the error mentioned above
}