I am using the MysqliDb class from here:
https://github.com/ajillion/PHP-MySQLi-Database-Class/blob/master/MysqliDb.php
On my local PC it works without any problem, but I bought hosting yesterday, uploaded my files about 5 minutes ago, and now it doesn't work. I checked my host, looked at the error_log file it created, and found this:
PHP Fatal error: Allowed memory size of 75497472 bytes exhausted (tried to allocate 4294967296 bytes) in /home/(..)/MysqliDb.php on line 417
What is causing this?
I added this line to my config file, but it made no difference:
ini_set('memory_limit', '192M');
I think the cause is that you are using a LONGTEXT column in your database.
Please read this bug report: https://bugs.php.net/bug.php?id=51386
When a mysqli prepared statement binds a LONGTEXT result, PHP allocates a buffer sized for the column's maximum possible length, which is 4 GiB and matches the 4294967296 bytes in your error. So try calling mysqli::store_result before bind_result; once the result is stored, the actual column lengths are known and the huge allocation is avoided.
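A minimal sketch of that order of calls, assuming a hypothetical `articles` table with a LONGTEXT `body` column and placeholder connection credentials:

```php
<?php
// Hedged sketch: hypothetical table, column, and connection details.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$stmt = $mysqli->prepare('SELECT id, body FROM articles WHERE id = ?');
$id = 1;
$stmt->bind_param('i', $id);
$stmt->execute();

// Buffer the result set first. After store_result() the real column
// lengths are known, so bind_result() no longer allocates a buffer
// sized for LONGTEXT's theoretical 4 GiB maximum.
$stmt->store_result();

$stmt->bind_result($rowId, $body);
while ($stmt->fetch()) {
    echo strlen($body), "\n";
}
$stmt->close();
```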
ini_set calls that raise server resource limits tend to be blocked on shared hosting (you may only be able to change settings related to showing errors and the like).
Bear in mind that you are trying to allocate 4294967296 bytes, i.e. 4 GiB, which is a huge amount of data for a website; being exactly 2^32, it often points at a maximum-length allocation or an overflow rather than at real data.
Recommendations:
Check whether you are creating an infinite loop that tries to load that much data (not only in that class, but in your own code before the call).
Use a lighter mysqli wrapper (did you try the mysqli extension that ships with PHP 5 directly?).
Check whether your problem can be solved in another, lighter way (maybe asynchronously? that amount of memory is unreasonable for a web request), possibly with another language like C or C++ to take the load off Apache.
Talk to your hosting provider and try to convince them to let you load 4 GiB of data.
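To illustrate the "lighter way" point: plain mysqli can stream rows one at a time with an unbuffered query instead of holding the whole result set in memory. A hedged sketch, with hypothetical connection details, table name, and handler:

```php
<?php
// Hedged sketch: process rows one at a time with an unbuffered query,
// so peak memory stays at roughly one row rather than the full result.
// Connection details and table name are hypothetical.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

// MYSQLI_USE_RESULT streams rows from the server instead of buffering them.
$result = $mysqli->query('SELECT id, body FROM articles', MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) {
    handleRow($row); // hypothetical per-row handler
}
$result->free();
```

The trade-off is that the connection stays busy until every row is fetched, so this suits batch processing more than typical page rendering.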
Related
I've got to get some (potentially) very large files uploaded to my S3 bucket on a Laravel Job I am building out. I am getting the dreaded "Allowed memory size of ### bytes exhausted" error, and I have no interest in increasing the memory limit in php.ini (simply because I don't know how large some of these files will go, and at some point I need to quit running away from these large files by increasing memory_limit to ridiculous levels).
The question is: Does Laravel make chunking this thing easy? Is there a function I am not seeing that I can use?
I know the answer is probably no, but Laravel makes SO many things easy for me, I figured I might ask to see if I was missing something in my Googling.
If this does not exist in Laravel, what should I do? I know that I need to take the file into memory a chunk at a time, but I have no idea where to start on that.
Thanks!
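One approach worth noting: Laravel's Storage facade (Flysystem underneath) accepts a stream resource, so the file can be sent to S3 without ever being read fully into memory. A hedged sketch, assuming an `s3` disk is configured and using hypothetical paths:

```php
<?php
// Hedged sketch inside a Laravel job: stream the file to S3 instead of
// loading it into memory. Assumes the 's3' disk is configured in
// config/filesystems.php; the paths below are hypothetical.
use Illuminate\Support\Facades\Storage;

$localPath = storage_path('app/uploads/huge-export.csv');

$stream = fopen($localPath, 'r');

// put() accepts a resource; Flysystem then writes it out in chunks,
// so memory usage stays flat regardless of file size.
Storage::disk('s3')->put('exports/huge-export.csv', $stream);

if (is_resource($stream)) {
    fclose($stream);
}
```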
I have a CakePHP 2.2.3 application that's running perfectly fine on our dev server, a Debian Squeeze LAMP box from Turnkey Linux. We're using InMotion Hosting for our production server, and moving our code over to this server has been DISASTROUS.
While testing out AJAX functionality on one page, we were getting the terribly unhelpful:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 389245600 bytes) in Unknown on line 0
tl;dr: I am looking for suggestions on how we can debug this issue
My first course of action was to strip down all the code within the controller functions to the bare minimum. The index() action of one of my controllers contains ONE line of code, and still somehow manages to exceed 256mb of memory per execution:
$this->autoRender = false;
To take the above point to the extreme, I commented out EVERY line of the model and controller that is generating this error. Still running out of memory. Several other pages that make MySQL database requests also display this "memory exhausted" error despite the fact that they load completely. On other pages, the memory error is more of a show-stopper and completely prevents execution.
I have tried raising the memory limit from 256 MB to 512 MB or even 1024 MB; all this does is suppress the error message itself. The page does not route/render or do anything, it just silently fails.
At the suggestion of another SO post, I tried turning Debug from 2 down to 0, which does not help the issue at all either.
We do not have XDebug installed on our production server, so I am at a loss as to how I'm supposed to track down the issue for our web host to fix the problem.
The VPS we are using is a CentOS 5.8 server running Apache 2.2.23, MySQL 5.3.18, and CakePHP 2.2.3
Our webhost can't or won't provide any further information on the subject. They suggested we "ask the Cake devs if they've seen anything like this before", which I feel is a very cowardly way to kick the can down the road. I'm hoping that someone here on SO has seen something like this issue before and might be able to help.
I've seen this problem before, and it may be because you're not using the Containable behavior.
It happened to me many times before I learned to set $recursive = -1 on AppModel (or on whichever model you're using).
Unless you're knowingly managing tons of data per page, you should restrict the data retrieved. It's important to keep model retrieval to the minimum, using a combination of the Containable behavior and $recursive.
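A minimal sketch of that combination for CakePHP 2.x; the model and field names in the commented find() call are hypothetical:

```php
<?php
// Hedged sketch for CakePHP 2.x: stop CakePHP from recursively fetching
// every associated model, then opt in explicitly per query.

// app/Model/AppModel.php
class AppModel extends Model {
    // Never fetch associated models implicitly.
    public $recursive = -1;

    // Enable explicit containment instead.
    public $actsAs = array('Containable');
}

// In a controller, fetch only what the page actually needs
// (model and field names here are hypothetical):
// $posts = $this->Post->find('all', array(
//     'contain' => array('Author' => array('fields' => array('id', 'name'))),
// ));
```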
Just a tip: it can also be a session problem. If you store too much in $_SESSION, session_start() can trigger exactly this, because it has to read back everything you've stored. Just try this:
$_SESSION = array();
If this helps, you'll find out the rest.
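To check whether the session is the culprit before wiping it, you can estimate how much data it holds; serialize() gives a rough byte count of what session_start() must read back. A small sketch with hypothetical session contents:

```php
<?php
// Hedged sketch: rough byte size of the session payload, since
// session_start() has to deserialize all of it on every request.
function sessionSizeBytes(array $session): int
{
    return strlen(serialize($session));
}

// Hypothetical session contents for illustration.
$example = array(
    'user' => 'alice',
    'cart' => array_fill(0, 1000, 'item'),
);
echo sessionSizeBytes($example), "\n";
```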
At my workplace, we are running a Magento 1.3 storefront, and we are having a problem with a job that is being run by Magento's internal cron service. Troubleshooting is nearly at a halt because we can't identify which job is causing the problem. The only feedback that we're getting is that every night at 00:05, cron coughs up the following, notoriously unhelpful PHP error while executing /usr/bin/php-cgi -f /path/to/app/magento/html/cron.php.
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 50 bytes) in /chroot/magento/html/lib/Zend/Db/Statement/Pdo.php on line 294
Increasing PHP's memory limit is clearly not the answer: at 512 MB, the problem is almost certainly that an algorithm is doing something grievously wrong, not that we've underestimated the requirements of the problem. Our database is fairly modest in size, a plaintext dump of the entire thing is less than 512 MB, so a query would have to be pretty pathological to eat more than that. Our best guess is that something is probably using Zend's fetchAll() incorrectly, but that method isn't being called directly in anything we can find.
How can we get Magento to give us a stack trace or some other indication of its internal state at the time of the problem? Is there a way to get more transparency into exactly what PHP is trying to execute when it hits the memory wall?
Ideally, we would like to do this without modifying third-party code - sometimes plugin developers use bullfeathers measures like Zend Guard, or have a license that does not permit us to modify their broken code, or other measures that basically make me want to go find Mr. Stallman and give him a warm, grateful hug.
Please note that the problem is not "how do we solve the out-of-memory error?" That's been asked many times before, and answered with varying excellence. The problem is "how can we tell which PHP file is causing the out-of-memory error?" It's a question about Magento internals, not about PHP qua PHP.
I would suggest using Mage::log() to log the beginning and end of your multiple jobs so you can narrow it down to the task. After that just create a controller that will execute it manually so you can start debugging it to nail down the problem.
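A hedged sketch of that bracketing approach for Magento 1.x; the observer class, method name, and log file name are hypothetical, and the fourth argument to Mage::log() forces logging even when developer logging is off:

```php
<?php
// Hedged sketch for Magento 1.x: bracket each suspected cron job with
// log lines, so the last "start" entry without a matching "end" in
// var/log/cron_trace.log points at the job that exhausted memory.
// Class, method, and log file names are hypothetical.
class My_Module_Model_Observer
{
    public function rebuildFeeds()
    {
        Mage::log('rebuildFeeds: start', Zend_Log::INFO, 'cron_trace.log', true);

        // ... the actual job body ...

        Mage::log('rebuildFeeds: end', Zend_Log::INFO, 'cron_trace.log', true);
    }
}
```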
If possible, I would run the cron job with the Xdebug module enabled, using Xdebug's function trace (or its stack traces, which Xdebug displays/logs automatically on a fatal error when it is enabled).
For the function trace, you should configure the following, either in php.ini or on the command line, e.g.
php -d xdebug.auto_trace=1 ... cron.php
xdebug.auto_trace=1
xdebug.trace_output_dir=/some/temp/path/
Also check other interesting settings like xdebug.collect_params.
Good luck!
NB: be careful where you output the traces, since they probably contain sensitive data.
I get this weird error when I try to generate either the filters or the forms on my production server:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 20 bytes) in /var/www/project/lib/vendor/symfony/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Core.php on line 669
I don't know how to get rid of this error. I tried:
Increasing PHP's memory limit to 512 MB.
Downloading the entire /lib/ folder and building the forms and filters locally: that went fine, I got no errors.
So which files, apart from /lib/, does the generation of forms and filters depend on? (Otherwise I would have hit this error on my local computer too, but that's not the case.)
Thanks
You shouldn't be generating your forms and filters, or fiddling with much else, on your production server. Build the site locally, and then upload it to the production server. You should only really be clearing the cache and fixing permissions on the production server, depending on your sfPlugin choices.
The generators are quite a large part of symfony given the complexity of the form modelling it does, so it's quite a large group to identify. You really shouldn't need to worry about it unless you have some heavily locked-down production hosting restrictions.
I increased the memory limit of the PHP CLI (which uses its own php.ini, separate from the web server's) and that fixed the problem.
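For reference, the CLI memory limit can also be raised per invocation without editing the CLI php.ini. A hedged sketch; the task names assume symfony 1.x with sfDoctrinePlugin:

```shell
# Override memory_limit for this run only, instead of editing the CLI php.ini.
# Task names assume symfony 1.x with sfDoctrinePlugin.
php -d memory_limit=512M symfony doctrine:build-forms
php -d memory_limit=512M symfony doctrine:build-filters
```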