What can I do to prevent running out of memory when working with large datasets (around 11,000 records)?
The problem
Using mPDF with PHP, I am trying to create PDF files from large datasets (around 11,000 records), which results in errors like this (the memory values fluctuate):
Fatal error: Out of memory (allocated 1197211648) (tried to allocate 44 bytes) in projectfolder\mpdf\mpdf.php on line 24132
What I tried
It works fine on smaller datasets. I have tried searching Stack Overflow and other Google results, which led me to make the following changes to my php.ini file:
memory_limit=-1
max_execution_time=0
post_max_size=0
My laptop specifications (where the script is being run):
8GB RAM
i7 processor
64-bit OS
XAMPP
mPDF is, sadly, not optimized for large datasets that produce large HTML documents.
If you can, I would recommend creating multiple smaller PDF documents and concatenating them afterwards with an external tool such as Ghostscript; see the sketch after the next tip.
Also, add this right after including the autoloader:
require_once APPPATH.'libraries/Mpdf/autoload.php';
ini_set('memory_limit', '-1'); // lift PHP's memory limit for this script only
Related
I have a huge file that I would like to read so I can fill my MySQL database. I tried to use the PHPExcel library, but I get an error when I load my file:
Fatal error: Allowed memory size of 1610612736 bytes exhausted (tried to allocate 22 bytes) in C:\Wamp\www\Classes\PHPExcel\Worksheet.php on line 964
I have already increased the value of memory_limit in the php.ini file, but it's still not enough. My Excel file is more than 60 MB (5 109 719 cells).
Does anybody have an idea how to solve this problem?
Reading 5 million cells using PHPExcel is a tricky problem. You can try using the cell caching or chunked loading techniques that Mark mentioned. Take a look at the documentation here: https://github.com/PHPOffice/PHPExcel/blob/develop/Documentation/markdown/Overview/04-Configuration-Settings.md
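For reference, a minimal sketch of the chunked-loading technique from that documentation: a read filter admits only one block of rows per load() call, so the whole sheet is never in memory at once (the file name and row bounds are placeholders):

// Read filter that only lets one chunk of rows through at a time
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // always keep the header row, plus the rows of the current chunk
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$objReader = PHPExcel_IOFactory::createReader('Excel2007');
$chunkFilter = new ChunkReadFilter();
$objReader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= 1000000; $startRow += 10000) { // adjust to your row count
    $chunkFilter->setRows($startRow, 10000);
    $objPHPExcel = $objReader->load('bigfile.xlsx');
    // ... process this chunk of rows, e.g. insert into MySQL ...
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel); // free the chunk before loading the next one
}

Note that every iteration re-reads and re-parses the file from the start, which is exactly the performance problem described in the next answer.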
I also tried reading millions of cells with PHPExcel using these techniques, but it then becomes a performance problem instead of a memory problem: reading that many cells takes forever. Alternatively, you can try Spout: https://github.com/box/spout. Reading 5 million cells should only require a few MB of memory and take 20 to 40 minutes.
If you don't need any special functions, it's much better to export your file as CSV and import it directly into MySQL. Five million cells are a lot, and that would take too long with PHP.
Importing a csv into mysql via command line
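For example, a minimal sketch of that direct import through mysqli; the credentials, table name, and CSV layout are placeholders, and LOCAL INFILE must be enabled on both the client and the server:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// A nowdoc keeps the SQL literal, so MySQL itself interprets the \n escape.
$sql = <<<'SQL'
LOAD DATA LOCAL INFILE '/path/to/export.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
SQL;

$mysqli->query($sql);

MySQL streams the file itself, so PHP's memory use stays flat no matter how many rows the CSV contains.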
I'm not a big Java fan, but for this you get a good library.
https://poi.apache.org/
POI performs well on big files and works better than PHP here. If you can't convert your file to CSV and read it that way, you should take a look at POI.
I'm trying to read an Excel file larger than 100 MB using PHPExcel, but it crashes while loading the file. I don't need any styling. I tried using:
$objReader->setReadDataOnly(true);
but it still crashes.
Is there any efficient way to read an Excel file of this size in PHP?
Try Spout: https://github.com/box/spout.
This is a PHP library that was created to solve your problem (reading/writing large files). Here is why it works:
Other libraries keep a representation of the whole spreadsheet in memory, which makes them subject to out-of-memory errors. Caching strategies help with those errors but hurt performance pretty badly.
Spout, on the other hand, uses streams to read and write data. This means that only one row is kept in memory at any time, and every row that has been read or written is freed. This allows fast reads and writes of datasets of any size! Give it a try :)
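A minimal reading sketch, assuming the v2 API of box/spout (the file name is a placeholder):

require_once 'vendor/autoload.php';

use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

$reader = ReaderFactory::create(Type::XLSX);
$reader->open('huge-file.xlsx');

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // $row is a plain array of cell values; only this one row is in memory
    }
}

$reader->close();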
Spout just saved my day! I couldn't read a large file with PhpOffice/PhpSpreadsheet without hitting fatal memory-size errors, but with Spout it works like a charm.
When I use this function:
$objPHPExcel = PHPExcel_IOFactory::load($fileName);
on smaller Excel files, it takes a while but eventually produces an extensive array in $objPHPExcel... Unfortunately, when I try it on a slightly larger, more complex Excel file, I get:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
The file is an xlsm file and is 1.7 MB... Does this sound right, or is something fishy going on?
I think this is all the relevant code. I'm running on a default WAMP setup locally at the moment.
I had the same issue. At our company we need to import huge xls(x) files into our database. We had been using the PEAR Spreadsheet Excel Reader, but it is no longer supported and we ran into many bugs with newer files, so we tried switching to PHPExcel.
Unfortunately, we did not manage to overcome the memory limit issue, and we spent a lot of time trying.
There is no way you can load 100K rows with columns up to 'BB' with PHPExcel.
It is just not the right tool for the job: PHPExcel builds the whole spreadsheet in memory as objects, and there is no way around that.
What you need is a tool that can read the file row by row, cell by cell.
Our solution was to use Java and the Apache POI classes with the event model, which only does read operations but is very memory- and CPU-efficient.
If you only need to support the XML-based "Office Open XML" format (xlsx) and not the old OLE-based one, you can process it as XML yourself. The format is not that complicated once you get into it: just unzip the file and look at the XMLs. There is one file holding the shared string table, and one file with the rows and cells for each sheet.
You should parse the string table first, and then the sheet data, with a streaming parser such as XMLReader (not the DOM).
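A minimal sketch of that approach, using the zip:// stream wrapper and XMLReader that ship with PHP; the file name is a placeholder, and the shared-string handling is simplified (rich-text entries can contain several text runs per <si>):

// 1. Stream the shared string table; cell text is stored there by index.
$strings = [];
$xml = new XMLReader();
$xml->open('zip:///path/to/large.xlsx#xl/sharedStrings.xml');
while ($xml->read()) {
    if ($xml->nodeType === XMLReader::ELEMENT && $xml->name === 'si') {
        $strings[] = $xml->readString(); // simplified: concatenates all text runs
    }
}
$xml->close();

// 2. Stream the first sheet cell by cell.
$xml = new XMLReader();
$xml->open('zip:///path/to/large.xlsx#xl/worksheets/sheet1.xml');
while ($xml->read()) {
    if ($xml->nodeType === XMLReader::ELEMENT && $xml->name === 'c') {
        $isShared = $xml->getAttribute('t') === 's'; // 's' marks a shared-string cell
        $raw = $xml->readString();                   // the <v> content
        $value = $isShared ? $strings[(int) $raw] : $raw;
        // ... handle $value ...
    }
}
$xml->close();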
As far as I know, there is no PHP library that can import large Excel files out of the box, as of October 2013.
Good luck.
In my experience, PHPExcel runs out of memory during a lot of operations.
You can try upping the memory limit for the script.
ini_set('memory_limit', '256M');
I am using the MysqliDb class from here:
https://github.com/ajillion/PHP-MySQLi-Database-Class/blob/master/MysqliDb.php
When I used it on my local PC, I didn't have any problems. But I bought hosting yesterday, uploaded my files about 5 minutes ago, and it doesn't work. I checked my host and found this in the newly created error_log file:
PHP Fatal error: Allowed memory size of 75497472 bytes exhausted (tried to allocate 4294967296 bytes) in /home/(..)/MysqliDb.php on line 417
What is causing this problem?
I used this code in my config file, but it didn't help either:
ini_set('memory_limit', '192M');
I think the problem is that you are using a LONGTEXT column in your database.
Please read this message: https://bugs.php.net/bug.php?id=51386
So try calling mysqli_stmt::store_result before bind_result.
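A minimal sketch of the workaround, with hypothetical table and column names. Without store_result(), bind_result() on a prepared statement allocates a buffer as large as the column's maximum possible size, which for LONGTEXT is 4 GB, exactly the 4294967296 bytes in your error:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

$stmt = $mysqli->prepare('SELECT id, body FROM posts WHERE id = ?');
$id = 1;
$stmt->bind_param('i', $id);
$stmt->execute();
$stmt->store_result();             // buffer the actual result set first...
$stmt->bind_result($rowId, $body); // ...so the bind uses real lengths, not LONGTEXT's 4 GB maximum
$stmt->fetch();
$stmt->close();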
ini_set calls that affect server limits tend to be blocked on shared hosting (so you may only be able to use the ones related to displaying errors and the like).
Bear in mind that you are trying to allocate roughly 4.3 GB, which is a fairly big amount of memory for a website.
Recommendations:
Check whether you are creating an infinite loop that tries to load that huge amount of data (not only in that class, but also in your own code before that call).
Use a lighter mysqli wrapper (did you try the plain mysqli extension that ships with PHP 5?).
Check whether your problem can be solved in another, lighter way (perhaps asynchronously? that amount of memory is insane for a website), possibly with another language like C or C++ to move the load off Apache.
Talk to your hosting provider and try to convince them to let you allocate that much memory.
I get this weird error when I try to generate either the filters or the forms on my production server.
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 20 bytes) in /var/www/project/lib/vendor/symfony/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Core.php on line 669
I don't know how to get rid of this error. I tried:
Increasing PHP's memory limit to 512 MB.
Downloading the entire /lib/ folder and building the forms and filters locally: that went fine, I got no errors.
So which files does the generation of forms or filters depend on (apart from /lib/, otherwise I would have gotten this error on my local machine too, which is not the case)?
Thanks
You shouldn't be generating your forms and filters, or fiddling with much else, on your production server. Build the site locally, and then upload it to the production server. You should only really be clearing the cache and fixing permissions on the production server, depending on your sfPlugin choices.
The generators are quite a large part of symfony given the complexity of the form modelling it does, so it's quite a large group to identify. You really shouldn't need to worry about it unless you have some heavily locked-down production hosting restrictions.
I increased the memory limit of the PHP CLI and it fixed the problem.
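For reference: the CLI often reads a different php.ini than Apache does, so a limit raised for the web server does not apply to symfony tasks. You can either raise memory_limit in the CLI's php.ini or pass it for a single run, for example (assuming the sfDoctrinePlugin task names; adjust to whichever task you are running):

php -d memory_limit=512M symfony doctrine:build-forms
php -d memory_limit=512M symfony doctrine:build-filters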