I have a huge file that I would like to read so I can fill my MySQL database. I tried to use the PHPExcel library, but I get an error when I try to load my file:
Fatal error: Allowed memory size of 1610612736 bytes exhausted (tried
to allocate 22 bytes) in C:\Wamp\www\Classes\PHPExcel\Worksheet.php on
line 964
I have already increased the memory_limit value in the php.ini file, but it's still not enough. My Excel file is more than 60 MB (5,109,719 cells).
Does anybody have an idea how to solve this problem?
Reading 5 million cells using PHPExcel is a tricky problem. You can try using the cell caching or chunked loading techniques that Mark mentioned. Take a look at the documentation here: https://github.com/PHPOffice/PHPExcel/blob/develop/Documentation/markdown/Overview/04-Configuration-Settings.md
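For instance, cell caching can be switched on before the file is loaded, following the settings described in that documentation (a sketch; the include path and $inputFileName are placeholders for your setup):

// Cache cells to php://temp so they spill over to a temporary stream once
// the in-memory cache exceeds 8 MB; this must be set BEFORE loading the file.
require_once 'Classes/PHPExcel.php'; // adjust to your install path
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = PHPExcel_IOFactory::load($inputFileName); // your 60 MB file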
I also tried reading millions of cells with PHPExcel using these techniques, but it then becomes a performance problem instead of a memory problem: reading that many cells takes forever. So alternatively, you can try using Spout: https://github.com/box/spout. Reading 5 million cells should only require a few MB of memory and take 20 to 40 minutes.
If you don't need any special functions, it's much better to export your file as CSV and import it directly into MySQL. 5 million cells are a lot, and processing them would take too long with PHP.
Importing a csv into mysql via command line
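For example, a minimal sketch of driving that import from PHP rather than the shell; the DSN, credentials, table name, and CSV path are placeholders, and LOCAL INFILE must be enabled on the MySQL server:

$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true) // required for LOCAL INFILE
);
$pdo->exec("
    LOAD DATA LOCAL INFILE 'C:/Wamp/tmp/export.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");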
I'm not a big Java fan, but for this task there is a good library:
https://poi.apache.org/
POI performs well on big files and handles them better than PHP. If you can't convert your file to CSV and read it that way, you should take a look at POI.
Related
While uploading Excel File in Laravel, throws error Allowed memory size of 536870912 bytes exhausted (tried to allocate 33554432 bytes) in folderpath/vendor/phpoffice/phpexcel/Classes/PHPExcel/Worksheet.php:1213
I'm using PHP 7.1.
My php.ini file has memory_limit set to 2048M.
Even if I upload a small Excel file, I get the same error.
That specific line of PHPExcel relates to the instantiation of a cell instance, so I guess your Excel file might have a lot of cells, or due to a bug it ends up in a loop, but that's hard to say without seeing it.
You may consider switching to the successor of the PHPExcel package, called PhpSpreadsheet, and see if it helps. It contains a lot of optimizations, although the structure is different. It also supports PHP 7.1, so it could be an option for you; why don't you give it a try?
PhpSpreadsheet is the next version of PHPExcel. It breaks compatibility to dramatically improve the code base quality (namespaces, PSR compliance, use of latest PHP language features, etc.). Because all efforts have shifted to PhpSpreadsheet, PHPExcel will no longer be maintained. All contributions for PHPExcel, patches and new features, should target PhpSpreadsheet master branch.
https://phpspreadsheet.readthedocs.io/en/latest/
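To give an idea of the API change, a minimal sketch with PhpSpreadsheet installed via Composer (the file name is a placeholder):

require 'vendor/autoload.php';

$inputFile = 'large.xlsx';
$reader = \PhpOffice\PhpSpreadsheet\IOFactory::createReaderForFile($inputFile);
$reader->setReadDataOnly(true); // skip styling and formatting to save memory
$spreadsheet = $reader->load($inputFile);
$rows = $spreadsheet->getActiveSheet()->toArray();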
What can I do to prevent running out of memory when working with large datasets (around 11,000 records)?
The problem
Using mPDF with PHP, I am trying to create PDF files from large datasets (around 11,000 records), which results in errors like the following (memory values fluctuate):
Fatal error: Out of memory (allocated 1197211648) (tried to allocate 44 bytes) in projectfolder\mpdf\mpdf.php on line 24132
What I tried
It works fine on smaller datasets, and I have tried searching Stack Overflow and other Google results, which led me to make the following changes to my php.ini file:
memory_limit=-1
max_execution_time=0
post_max_size=0
My laptop specifications (where the script is being run):
8GB RAM
i7 processor
64bit OS
XAMPP
mPDF is, sadly, not optimized for large datasets that produce large HTML.
If you can, I would recommend creating multiple smaller PDF documents and later concatenating them with an external tool such as Ghostscript.
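A rough sketch of that approach, assuming mPDF 7+ installed via Composer; fetchRecordsHtml() and $totalRecords are hypothetical stand-ins for your own data access code:

require 'vendor/autoload.php';

$chunkSize = 1000;
for ($offset = 0, $part = 0; $offset < $totalRecords; $offset += $chunkSize, $part++) {
    $mpdf = new \Mpdf\Mpdf();                         // fresh instance per chunk
    $mpdf->WriteHTML(fetchRecordsHtml($offset, $chunkSize));
    $mpdf->Output("report_part{$part}.pdf", \Mpdf\Output\Destination::FILE);
    unset($mpdf);                                     // free memory before the next chunk
}
// Then concatenate, e.g.:
// gs -dBATCH -dNOPAUSE -sDEVICE=pdfwrite -sOutputFile=report.pdf report_part*.pdf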
Use this right after including the autoloader:
require_once APPPATH.'libraries/Mpdf/autoload.php';
ini_set("memory_limit","-1");
I'm trying to read an Excel file larger than 100 MB using PHPExcel, but it crashes while loading the file. I don't need any styling. I tried using:
$objReader->setReadDataOnly(true);
but it still crashes.
Is there any efficient way to read this size of Excel file in PHP?
Try Spout: https://github.com/box/spout.
This is a PHP library that was created to solve your problem (reading/writing large files). Here is why it works:
Other libraries keep a representation of the entire spreadsheet in memory, which makes them subject to out-of-memory errors. Caching strategies will help with these errors but will hurt performance pretty badly.
Spout, on the other hand, uses streams to read and write data. This means that only one row is kept in memory at any time, with every row freed from memory once it has been read or written. This allows fast reads and writes of datasets of any size! Give it a try :)
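A minimal sketch, assuming Spout 2.x, where each row comes back as a plain array (the file name and the insert call are placeholders):

use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

$reader = ReaderFactory::create(Type::XLSX);
$reader->open('huge.xlsx');
foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // only this row is held in memory, e.g. insertRow($row);
    }
}
$reader->close();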
Spout just saved me a lot of time! I couldn't read a large file with PhpOffice/PhpSpreadsheet (I kept hitting fatal memory-size errors), but with Spout it works like a charm.
When I use this function:
$objPHPExcel = PHPExcel_IOFactory::load($fileName);
on smaller Excel files, it takes a while but finally gets an extensive array out into $objPHPExcel... Unfortunately, when I try it on a slightly larger, more complex Excel file, I get:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
The file is an .xlsm file and is 1.7 MB... Does this sound right, or is something fishy going on?
I think this is all the code you need. I'm running off a default WAMP setup locally at the moment.
I had the same issue. At our company we need to import huge xls(x) files into our database. We had been using the PEAR Spreadsheet Excel Reader, but it is no longer supported and we encountered many bugs with newer files, so we tried switching to PHPExcel.
Unfortunately, we did not manage to overcome the memory limit issue, and we spent a lot of time trying to.
There is no way you can load 100K rows with columns up to 'BB' with PHPExcel.
It is just not the right tool for the job: PHPExcel builds the whole spreadsheet in memory as objects, and there is no way around this.
What you need is a tool that can read the file row by row, cell by cell.
Our solution was to use Java and the Apache POI classes with the event model, which only supports read operations but is very memory- and CPU-efficient.
If you only need to support the XML-based "Office Open XML" format (xlsx) and not the old OLE-based one, you can process it as XML yourself. The format is not that complicated once you get into it. Just unzip a file and look at the XMLs: there is one file with the shared string table, and one file with rows and cells for each sheet.
You should parse the string table first, and then the sheet data with an XML reader (not the DOM).
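A rough sketch of that streaming approach, assuming a file named data.xlsx with its first sheet at the usual zip path (requires the zip extension):

$reader = new XMLReader();
$reader->open('zip://data.xlsx#xl/worksheets/sheet1.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'row') {
        $rowXml = $reader->readOuterXml(); // one <row> element at a time
        // parse the cells out of $rowXml (mind the spreadsheetml namespace);
        // shared strings come from xl/sharedStrings.xml, parsed the same way first
    }
}
$reader->close();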
As far as I know, there is no PHP library that can import large Excel files out of the box as of October 2013.
Good luck.
In my experience, PHPExcel runs out of memory during a lot of operations.
You can try upping the memory limit for the script.
ini_set('memory_limit','256M');
I'm having trouble converting an .xls (Excel) file to CSV with PHPExcel.
Everything works fine until a big file comes along: my PHP script exceeds the memory limit and blows up. I cannot use more than 64 MB because of the specifics of the machine. I'm running Apache on it.
We need to find a solution.
I think I have to tell PHPExcel to load just a few rows of the Excel file, convert them to a small CSV, save it, free the used memory, and so on with the rest of the file until it's done...
What do you think? Can we find a more accurate way of doing it?
You have a few options for saving memory with PHPExcel. The main two are:

Cell caching, described in section 4.2.1 of the developer documentation. This allows you to reduce the memory overhead for each cell that is read from the file.

Chunking, described in section 4.3 of the User Documentation for Readers. This allows you to read small ranges of rows and columns from a file, rather than the whole worksheet (see the sketch below).
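This is roughly the chunking pattern from the PHPExcel documentation: a read filter that admits only one band of rows per load. A sketch for an .xls input; $inputFileName and the row counts are placeholders:

class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // always keep the header row, plus the current band of rows
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$chunkSize   = 2048;
$chunkFilter = new ChunkReadFilter();
$reader      = PHPExcel_IOFactory::createReader('Excel5'); // 'Excel5' = .xls
$reader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= 65536; $startRow += $chunkSize) { // 65536 = .xls row limit
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($inputFileName);
    // append this chunk to the CSV here, then free the memory:
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
}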