Can anyone please tell me why PHPExcel does not allow more than 5000 rows?
I am using the open-source PHPExcel library for report generation in my projects, and I cannot write more than 5000 rows of data from a MySQL database. My result set returns 7230 records when the query is executed. How do I fix it?
Almost certainly this is a timeout or a memory issue. The only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns (when using the Excel5 Writer), or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).
Ensure that your error logging is always enabled, use try/catch blocks to trap any PHPExcel exceptions, and read the PHPExcel site discussion threads on memory and performance.
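For example, enabling error display and trapping PHPExcel exceptions might look like this (a minimal sketch; the file names are placeholders, not from the question):

ini_set('display_errors', 1); // make errors visible while debugging
error_reporting(E_ALL);

try {
    $objPHPExcel = PHPExcel_IOFactory::load('source.xlsx'); // hypothetical input file
    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
    $objWriter->save('output.xlsx'); // hypothetical output file
} catch (PHPExcel_Exception $e) {
    // PHPExcel throws PHPExcel_Exception on most failures
    error_log('PHPExcel error: ' . $e->getMessage());
}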
I had the same problem. You will need to allocate enough execution time and a high enough memory limit.
I have tested my solution on 3 different servers; here is the result:
About 5000 records (12 columns)
Reading file:
09:48:22 Peak memory usage: 1.25 MB
Reading data:
09:48:31 Peak memory usage: 54.5 MB
After indexing data into an array:
09:48:35 Peak memory usage: 68.5 MB
Records: 4504
I increased the memory and time limits to read 22,000 records; after indexing, peak memory usage went up to 370 MB.
Here is the solution (assuming everything else in the code sequence is correct).
Where you call PHPExcel in your program/function:
ini_set("max_execution_time", 'time_limit'); //see manual
Do all initialization here so that all objects are ready, then allocate memory for reading the file and indexing the data into your program's internal structure:
ini_set('memory_limit', '512M'); // your memory limit as a string - adjust to your needs
$excel = $excelReader->load($filePath);
echo "Peak memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB";
//do the rest of the structure!
A good idea is to process the data in manageable categories (batches) so you don't run up to 400 MB of usage, which is prone to errors.
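Putting the fragments above together, a minimal end-to-end sketch might look like this (the limit values and $filePath are assumptions to adjust for your environment):

ini_set('max_execution_time', '300'); // seconds
ini_set('memory_limit', '512M');

$excelReader = PHPExcel_IOFactory::createReaderForFile($filePath); // $filePath assumed set
$excel = $excelReader->load($filePath);
echo "Peak memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB";

// ...index the data into your own structure here, then report memory again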
You can change this line
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
to
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
Then it allows you to write more than 65,536 rows (up to 1,048,576).
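Note that the Excel2007 writer produces an .xlsx (OOXML) file, so the save call should use a matching extension; a brief sketch (the file name is an assumption):

$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('report.xlsx'); // .xlsx to match the Excel2007 format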
Without seeing your code (or the class's code) it is quite difficult to say... Do you mean you can't write more than 5K rows in an XLS file, or within a single worksheet? Otherwise, an ugly workaround could be writing 5K rows to the first sheet and the rest to a second one (5K rows per sheet if the DB gets bigger); a rough sketch follows below.
I don't think XLS has a 5K-row limitation, so something is probably wrong or misconfigured in your script. Have you tried several times? Does it always stop at 5K rows, or could it be due to timeouts (of your script or connection)?
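If you did want to try the multiple-sheet workaround, a rough sketch (assuming an open MySQL result in $result and 5,000 rows per sheet; both are assumptions) could be:

$rowsPerSheet = 5000;
$rowNumber = 1;
$sheet = $objPHPExcel->setActiveSheetIndex(0);

while ($row = mysql_fetch_row($result)) {
    if ($rowNumber > $rowsPerSheet) {
        // start a new worksheet once the current one reaches the limit
        $sheet = $objPHPExcel->createSheet();
        $rowNumber = 1;
    }
    $col = 0; // PHPExcel column indexes are 0-based
    foreach ($row as $value) {
        $sheet->setCellValueByColumnAndRow($col++, $rowNumber, $value);
    }
    $rowNumber++;
}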
This type of issue is more than likely a server memory issue. What type of server are you on, and are you sure it has enough memory and resources available to process large data files? If you can't tell either way, the best workaround is to read in a few thousand records at a time, process them, and then move on to the next chunk. I myself would prefer to break the large data file into manageable pieces (files), process each of those pieces to get the desired outcome, and then merge the results back together into a new large data file.
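One way to sketch that chunked approach for database data (using the old mysql_* functions seen elsewhere in this thread; the table name and chunk size are assumptions) is to page through the query with LIMIT/OFFSET:

$chunkSize = 2000;
$offset = 0;

do {
    $res = mysql_query("SELECT * FROM report_data LIMIT $chunkSize OFFSET $offset"); // hypothetical table
    $rowsInChunk = 0;
    while ($row = mysql_fetch_assoc($res)) {
        // ...process/write this row (e.g. append it to a chunk file)...
        $rowsInChunk++;
    }
    mysql_free_result($res);
    $offset += $chunkSize;
} while ($rowsInChunk === $chunkSize); // a partial (or empty) chunk means we are done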
Related
I am trying to export more than 11,000 rows with PHPExcel. It works great up to 11,000 rows; beyond that, the exported rows get mixed up.
Almost certainly this is a timeout or a memory issue. The only PHPExcel limit for worksheets size is 65,536 rows and 256 (IV) columns (when using the Excel5 Writer); or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).
Ensure that your error logging is always enabled... use try/catch blocks to trap for any PHPExcel Exceptions. And read the PHPExcel site discussion threads on memory and performance.
I am using the PHP library from
http://phpexcel.codeplex.com/
for reading .xls, .xlsx and .csv files.
I also have to work with large files, and it runs into out-of-memory issues. I have already increased the memory limit in php.ini.
I am not sure, but as far as I have learnt, the memory consumed depends on the number of cells in the workbook.
In my Excel files the number of rows is very large, but I only have to read 5-6 columns. So I just want to read those specified columns, so that the blank columns do not consume memory in the script.
So my question is: how can I read only the specified columns using this library?
e.g. if I want to read only the first five columns.
Thanks
I use PHPExcel with a 10 MB file and it works fine. However, if you want to specify a range, you can check the API documentation at http://www.contao-docs.org/docs/PHPExcel/html/index.html. I think you need
PHPExcel_Worksheet::rangeToArray
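A minimal sketch of that, assuming the workbook loads successfully and you want only columns A to E:

$objPHPExcel = PHPExcel_IOFactory::load($inputFileName); // $inputFileName assumed set
$sheet = $objPHPExcel->getActiveSheet();
$highestRow = $sheet->getHighestRow();
// pull only the first five columns into an array
$data = $sheet->rangeToArray('A1:E' . $highestRow, null, true, false);

Note that rangeToArray() only restricts what ends up in your array; the reader still loads the whole file, so for very large files you would also want a read filter (PHPExcel_Reader_IReadFilter) to skip unwanted cells at load time.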
I have a PHP script that exports a MySQL database to CSV format. The strange CSV format is a requirement for a 3rd-party software application. It worked initially, but I am now running out of memory at line 743. My hosting service has a limit of 66 MB.
My complete code is listed here: http://pastebin.com/fi049z4n
Are there alternatives to using array_splice? Can anyone please assist me in reducing memory usage? What functions can be changed? How do I free up memory?
You have to change your CSV creation strategy. Instead of reading the whole result from the database into an array in memory, you must work sequentially:
Read a row from the database
Write that row into the CSV file (or maybe buffer)
Free the row from memory
Do this in a loop...
This way the memory consumption of your script does not rise proportionally with the number of rows in the result, but stays (more or less) constant.
This might serve as an untested example to sketch the idea:
while (FALSE !== ($row = mysql_fetch_row($res))) {
    // write one CSV line per result row, then let the row go out of scope
    fwrite($csv_file, implode(',', $row) . "\n");
}
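If your target format tolerates standard CSV quoting, PHP's built-in fputcsv() handles the escaping for you; a variant of the same loop (the output path is a placeholder):

$csv_file = fopen('export.csv', 'w'); // hypothetical output file handle
while (FALSE !== ($row = mysql_fetch_row($res))) {
    fputcsv($csv_file, $row); // writes one quoted, escaped CSV line
}
fclose($csv_file);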
Increase the memory limit and maximum execution time
For example:
// Change the values to suit you
ini_set("memory_limit","15000M");
ini_set("max-execution_time", "5000");
Much has been documented regarding PHPExcel memory issues, so my question presumes that reading in "chunks" - as described by Mark Baker - has been applied. My specific situation is this:
- read a "raw" spreadsheet uploaded to a database record (working fine)
- retrieve said spreadsheet, open with PHPExcel, and audit the cells
- Read rows and columns and look for conditions.
- If a cell has an [informational|warning|error] condition, apply conditional formatting and insert a cell note (working fine)
- Do this for all rows and columns. Runs out of memory before it completes.
- Save the formatted spreadsheet into a server directory so user can download and look at all the embedded errors warnings, or informational items - also works just fine if I limit the number of rows read.
The spreadsheet is not especially big: 130 or so rows and 60 columns.
This is what I have tried:
- Reading in "chunks" ala Mark Baker. Problem is the filter only returns the subset of rows. Those associated with the chunk rule. So when I go to do a save at the end of a chunk read and processing, only those rows are saved, not the whole spreadsheet.
- Using cell caching:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_discISAM;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);
This allows me to read and update every row and the associated cells, but when I go to do the $objWriter->save(str_replace('.php', '.xlsx', $theDIRandFileName)) I get a memory problem in XMLWriter.php:
Fatal error: Allowed memory size of 18874368 bytes exhausted (tried to allocate 238592 bytes) in ... /PHPExcel/Shared/XMLWriter.php on line 100
Argh!
Any assistance would be greatly appreciated. Right now it appears the only thing left to try is to open two copies, use the chunk approach, and figure out how to read from the unedited version while updating the chunked version. There has to be a better way.
I am using the PHPExcel framework to try to write out a very large Excel document from a MySQL query.
Everything works fine until I hit the 5000-row mark (or thereabouts), where the page gets flagged with the error:
Fatal error: Allowed memory size of xxx bytes exhausted (tried to allocate yyy bytes) in zzz on line aaa
I know this is documented, and I have adjusted the memory allocation on the server but even so I am still hitting the ceiling. I have also tried to turn off formatting, but in reality I will need it on.
So is there a way to write in small chunks, or to append to the Excel document, so I don't exhaust the memory allocation? I am thinking along the lines of the page writing, say, 1000 lines, then redirecting to itself and processing the next 1000, using a GET parameter to keep track. For example:
index.php?p=0
then redirect to
index.php?p=1000,
But I can't find a way to append to an existing document without opening up the whole thing.
There is no way of writing in chunks, although a common mistake is for people to load their MySQL data into an array, then loop through the array setting the Excel cell data. It is more memory efficient to set the cell data as you loop through the MySQL query resultset.
If you need to keep memory usage to a minimum, what cell caching method are you using? Cell caching is slower, but can save significant amounts of memory.
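A minimal sketch of that approach (cell caching plus setting cells directly from the resultset; the cache settings, $result, and output file name are assumptions):

// enable cell caching before creating the workbook - slower, but lighter on memory
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, array('memoryCacheSize' => '8MB'));

$objPHPExcel = new PHPExcel();
$sheet = $objPHPExcel->getActiveSheet();

$rowNumber = 1;
while ($row = mysql_fetch_row($result)) {
    // write each database row straight into the sheet - no intermediate array
    $col = 0;
    foreach ($row as $value) {
        $sheet->setCellValueByColumnAndRow($col++, $rowNumber, $value);
    }
    $rowNumber++;
}

$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('report.xlsx'); // hypothetical output file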