I am using the PHPExcel framework to try to write out a very large Excel document from a MySQL query.
Everything works fine until I hit the 5000-row mark (or thereabouts), where the page gets flagged with the error:
Fatal error: Allowed memory size of xxx bytes exhausted (tried to allocate yyy bytes) in zzz on line aaa
I know this is documented, and I have adjusted the memory allocation on the server, but even so I am still hitting the ceiling. I have also tried turning off formatting, but in reality I will need it on.
So is there a way to write in small chunks, or to append to the Excel document, so I don't exhaust the memory allocation? I am thinking along the lines of the page writing, say, 1000 rows, then redirecting to itself and processing the next 1000, using a GET parameter to keep track. For example:
index.php?p=0
then redirect to
index.php?p=1000
But I can't find a way to append to an existing document without opening up the whole thing.
There is no way of writing in chunks, although a common mistake is for people to load their MySQL data into an array and then loop through the array setting the Excel cell data. It's more memory efficient to set the cell data as you loop through the MySQL query resultset.
If you need to keep memory usage to a minimum, what cell caching method are you using? Cell caching is slower, but can save significant amounts of memory.
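As a rough, untested sketch of both points, assuming a mysqli connection in $db and a hypothetical query; cache_to_phpTemp is just one of the cell caching methods PHPExcel offers:

<?php
// Enable cell caching *before* creating the workbook; cache_to_phpTemp keeps
// cell data in php://temp instead of holding every cell object in memory.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = new PHPExcel();
$sheet = $objPHPExcel->getActiveSheet();

// Write each row straight from the resultset instead of buffering the whole
// query result in a PHP array first.
$rowNumber = 1;
$result = $db->query('SELECT col_a, col_b, col_c FROM my_table');  // placeholder query
while ($row = $result->fetch_row()) {
    $sheet->fromArray($row, null, 'A' . $rowNumber++);
}

$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('export.xlsx');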
I am working with TCPDF to generate a report in PDF. When I generate the report for one month it gives me the following error and only generates 15 pages of the PDF, not more than that.
Fatal error: Out of memory (allocated 85721088) (tried to allocate 262142 bytes) in C:\xampp\htdocs\Apps\tcpdf\tcpdf.php on line 20732
How do I resolve this issue of TCPDF not creating more than 15 pages? Please help me resolve it, I am stuck on it. I want more pages to be generated so that all the data is shown.
Thanks in advance.
There are a number of issues that you need to address when dealing with TCPDF.
The first is setting the maximum amount of memory you feel is appropriate and required, using something like this:
ini_set('memory_limit','VALUE HERE');
Then you need to ensure that you have sufficient processing time:
ini_set('max_execution_time', VALUE HERE);
It is then a very good idea to check all your code, looking for ways to limit the amount of resources required to create the PDF. In my case, I had lots of DB results and multiple arrays storing huge volumes of data which remained in memory after they had been used. Once I unset these after use, my memory usage dropped considerably.
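Putting those pieces together, a minimal sketch might look like the following; the limit values, the $reportRows array, and the buildChunkHtml() helper are placeholders for whatever your report actually needs:

<?php
// Raise the limits before building the document (tune the values to your server).
ini_set('memory_limit', '256M');
ini_set('max_execution_time', 300);

require_once 'tcpdf/tcpdf.php';   // adjust the path to your TCPDF install

$pdf = new TCPDF();
$pdf->AddPage();

// Build the report in batches and release each batch as soon as it has been
// written, instead of keeping every DB result array alive until the end.
while ($chunk = array_shift($reportRows)) {   // $reportRows: placeholder array of row batches
    $html = buildChunkHtml($chunk);           // hypothetical helper that renders one batch
    $pdf->writeHTML($html);
    unset($html, $chunk);                     // free what is no longer needed
}

$pdf->Output('report.pdf', 'I');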
I have a PHP script that exports a MySQL database to CSV format. The strange CSV format is a requirement for a 3rd party software application. It worked initially, but I am now running out of memory at line 743. My hosting service has a limit of 66MB.
My complete code is listed here. http://pastebin.com/fi049z4n
Are there alternatives to using array_splice? Can anyone please assist me in reducing memory usage. What functions can be changed? How do I free up memory?
You have to change your CSV creation strategy. Instead of reading the whole result from the database into an array in memory, you must work sequentially:
Read a row from the database
Write that row into the CSV file (or maybe buffer)
Free the row from memory
Do this in a loop...
This way the memory consumption of your script does not rise proportionally with the number of rows in the result, but stays (more or less) constant.
This might serve as an untested example to sketch the idea:
while (FALSE !== ($row = mysql_fetch_row($res))) {
    fwrite($csv_file, implode(',', $row) . "\n");
}
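For reference, the same row-at-a-time idea using mysqli and fputcsv (which also handles quoting for you); the connection details and table name are placeholders:

<?php
// Placeholder connection details.
$db  = new mysqli('localhost', 'user', 'password', 'database');
$csv = fopen('export.csv', 'w');

// MYSQLI_USE_RESULT streams rows from the server instead of buffering the
// whole resultset in PHP memory.
$result = $db->query('SELECT * FROM my_table', MYSQLI_USE_RESULT);
while ($row = $result->fetch_row()) {
    fputcsv($csv, $row);   // write one row, then let it go out of scope
}

$result->close();
fclose($csv);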
Increase the memory limit and maximum execution time
For example:
// Change the values to suit you
ini_set("memory_limit","15000M");
ini_set("max-execution_time", "5000");
I've tried to get over a million rows from a MySQL table into my PHP application.
The steps to get the data are as below.
Send a query to MySQL with mysql_query() or mysqli_query().
Put the result into an array by looping over each row with mysql_fetch_array(), or build the array with mysqli_fetch_all().
I got the messages 'Out of memory' or 'Allowed memory size of * bytes exhausted'.
I might be able to solve them if I change 'memory_limit'.
But I want to know why those messages were shown.
I've changed 'memory_limit' from 128M -> 512M -> 1024M.
When I set 128M, sending the query failed with 'allowed memory size of * bytes exhausted' and I couldn't add the result to the array.
Then, setting 512M, the query finished successfully but I couldn't add the result to the array, again with 'allowed memory size of * bytes exhausted'.
Finally, setting 1024M, both sending the query and adding the result to the array finished, but sometimes mysqli_fetch_all() failed with 'Out of memory'.
One thing I can't understand is why the 'Out of memory' error doesn't happen consistently.
One more thing I want to know is how to get over a million rows from a table into PHP without so much memory tuning. After getting all the rows I want to let my users access them through browser operations, including downloading some types of files such as CSV.
You will get "allowed memory size of * bytes exhausted’" when the amount of memory needed exceeds the memory limit specified in php.ini file.
You will get "Out of memory" when the server itself (or Apache or MySQL) runs out of memory.
The best option for retrieving a huge number of rows is to use LIMIT and OFFSET in your query and loop through the batches.
This will prevent the memory issues that you are facing.
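A rough sketch of that approach, assuming a mysqli connection in $db and a table called my_table (both placeholders):

<?php
$batchSize = 10000;
$offset    = 0;

do {
    $result = $db->query(
        "SELECT * FROM my_table LIMIT $batchSize OFFSET $offset"
    );

    while ($row = $result->fetch_assoc()) {
        // Process (or stream to the browser / a CSV file) one row at a time
        // instead of collecting everything into one giant array.
    }

    $rowCount = $result->num_rows;
    $result->free();
    $offset += $batchSize;
} while ($rowCount === $batchSize);   // stop when the last batch comes back short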
Much has been documented regarding PHPExcel memory issues, so my question presumes that reading in "chunks", as described by Mark Baker, has been applied. My specific situation is this:
- read a "raw" spreadsheet uploaded to a database record (working fine)
- retrieve said spreadsheet, open with PHPExcel, and audit the cells
- Read rows and columns and look for conditions.
- If a cell has an [informational|warning|error] condition, apply conditional formatting and insert a cell note (working fine)
- Do this for all rows and columns. Runs out of memory before it completes.
- Save the formatted spreadsheet into a server directory so the user can download it and look at all the embedded errors, warnings, or informational items; this also works just fine if I limit the number of rows read.
The spreadsheet is not especially big: 130 or so rows and 60 columns.
This is what I have tried:
- Reading in "chunks" ala Mark Baker. Problem is the filter only returns the subset of rows. Those associated with the chunk rule. So when I go to do a save at the end of a chunk read and processing, only those rows are saved, not the whole spreadsheet.
- Using cell caching (
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_discISAM;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);
) allows me to read and update every row and the associated cells, but when I go to do the $objWriter->save(str_replace('.php', '.xlsx', $theDIRandFileName)) I get a memory problem in XMLWriter.php:
Fatal error: Allowed memory size of 18874368 bytes exhausted (tried to allocate 238592 bytes) in ... /PHPExcel/Shared/XMLWriter.php on line 100
Argh!
Any assistance would be greatly appreciated. Right now it appears the only thing left for me to try is to open two copies, use the chunk approach, and figure out how to read the unedited version while updating a chunked version. There has to be a better way.
Can anyone please tell me why PHPExcel does not allow more than 5000 rows?
I am using the open-source PHPExcel library for report generation in my projects and I could not write more than 5000 rows of data from the MySQL DB. My result set fetches 7230 records when the query is executed. How do I fix this?
Almost certainly this is a timeout or a memory issue. The only PHPExcel limit for worksheets size is 65,536 rows and 256 (IV) columns (when using the Excel5 Writer); or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).
Ensure that your error logging is always enabled, and use try/catch blocks to trap any PHPExcel exceptions. And read the PHPExcel site discussion threads on memory and performance.
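For the try/catch part, a bare-bones sketch; the file name and writer type are just examples:

<?php
try {
    $objPHPExcel = new PHPExcel();
    $sheet = $objPHPExcel->getActiveSheet();
    // ... fill the sheet from your resultset here ...

    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
    $objWriter->save('report.xlsx');
} catch (PHPExcel_Exception $e) {
    // Surface the real failure instead of ending up with a silently truncated file.
    error_log('PHPExcel error: ' . $e->getMessage());
} catch (Exception $e) {
    error_log('Unexpected error: ' . $e->getMessage());
}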
I had the same problem. You will need to allocate enough time and a sufficient memory limit.
I have tested my solution on 3 different servers; here is the result:
About 5000 records (12 columns)
Reading file:
09:48:22 Peak memory usage: 1.25 MB
Reading data:
09:48:31 Peak memory usage: 54.5 MB
After indexing data into an array:
09:48:35 Peak memory usage: 68.5 MB
Records: 4504
I increased the memory and time to read 22,000 records; after indexing, usage went up to 370.00 MB.
Here is the solution (given that everything else in the code sequence is correct),
where you call PHPExcel in your program/function:
ini_set("max_execution_time", 'time_limit'); //see manual
Do all initialization here so that all objects are ready, then allocate memory for reading the file and indexing data into the program's internal structure:
ini_set('memory_limit', '???'); //your memory limit as string
$excel = $excelReader->load($filePath);
"Memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB"
//do the rest of the structure!
A good idea is to manage all this by splitting the data into categories so you don't run into 400 MB of usage, which is prone to errors.
Almost certainly this is a timeout or a memory issue. The only PHPExcel limit for worksheets size is 65,536 rows and 256 (IV) columns (when using the Excel5 Writer); or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).
You can change this line
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
as
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
That then allows you to write more than 65,536 rows.
Without having your code or the class's code it is quite difficult to say, I believe... Do you mean you can't write more than 5K rows in an XLS file, or inside a worksheet? Otherwise, an ugly workaround could be writing 5K rows in the first sheet and the rest in the second (so 5K rows per sheet, if the DB gets bigger).
I don't think XLS has a 5K-row limitation, so there is probably something wrong or misconfigured in your script... Have you tried several times? Does it always print 5K rows, or could it be due to timeouts (of your script or connection)?
This type of issue is more than likely a server memory issue. What type of server are you on, and are you sure it has enough memory and resources available to process large data files? If you can't tell either way, the best workaround is to read in a few thousand records at a time, process them, and then move to the next chunk. I myself would prefer to break the large data file into manageable pieces (files), and then process each of those pieces to get my desired outcome. Once all pieces are processed, they can be merged together to make a new large data file.
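A minimal sketch of that "split into manageable pieces" idea for a large CSV file; the file names and chunk size are placeholders:

<?php
$chunkSize = 5000;             // rows per piece (placeholder)
$source    = fopen('large_data.csv', 'r');
$pieceNo   = 0;
$lineNo    = 0;
$piece     = null;

while (($row = fgetcsv($source)) !== false) {
    if ($lineNo % $chunkSize === 0) {        // start a new piece every $chunkSize rows
        if ($piece) {
            fclose($piece);
        }
        $piece = fopen(sprintf('piece_%03d.csv', $pieceNo++), 'w');
    }
    fputcsv($piece, $row);
    $lineNo++;
}

if ($piece) {
    fclose($piece);
}
fclose($source);

// Each piece_*.csv can now be processed on its own, and the results merged
// back together afterwards.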