PHPExcel does not export more than 11,000 rows

I am trying to export more than 11,000 rows with PHPExcel. It works great up to 11,000 rows, but beyond that the exported rows get mixed up.

Almost certainly this is a timeout or a memory issue. The only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns when using the Excel5 Writer, or 1,048,576 rows and 16,384 (XFD) columns when using the Excel2007 Writer.
Ensure that your error logging is always enabled, use try/catch blocks to trap any PHPExcel exceptions, and read the PHPExcel site discussion threads on memory and performance.
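For example, a minimal sketch of trapping those exceptions with logging enabled (the file name and the choice of the Excel2007 writer are illustrative, not taken from the question):

error_reporting(E_ALL);
ini_set('log_errors', '1');

try {
    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
    $objWriter->save('export.xlsx'); // placeholder file name
} catch (PHPExcel_Exception $e) {
    error_log('PHPExcel error: ' . $e->getMessage());
} catch (Exception $e) {
    error_log('General error: ' . $e->getMessage());
}

Note that a fatal out-of-memory error cannot be caught with try/catch in PHP 5, so the web server's error log is still the place to look for those.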

Related

How to read only specified columns with PHPExcel (CodePlex)

I am using the PHP library from
http://phpexcel.codeplex.com/
to read .xls, .xlsx and .csv files.
I also have to work with large files, and I get out-of-memory errors even though I have already increased the memory limit in php.ini.
As I understand it, the memory consumed depends on the number of cells in the spreadsheet.
My Excel files have a very large number of rows, but I only need to read 5-6 columns. So I want to read only those specified columns, so that the blank columns do not consume memory in the script.
So my question is: how can I read only specified columns using this library?
E.g., if I want to read only the first five columns.
Thanks
I use PHPExcel with a 10 MB file and it works fine. However, if you want to specify a range, see the API documentation at http://www.contao-docs.org/docs/PHPExcel/html/index.html. I think you need
PHPExcel_Worksheet::rangeToArray
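For the column-limiting part of the question, PHPExcel also supports read filters, which can be combined with rangeToArray. A minimal sketch, assuming only the first five columns are wanted and 'bigfile.xlsx' as a placeholder path:

// Read filter that loads only columns A-E; all other cells are skipped,
// so the unneeded columns never consume memory.
class FirstFiveColumnsFilter implements PHPExcel_Reader_IReadFilter
{
    public function readCell($column, $row, $worksheetName = '')
    {
        return in_array($column, array('A', 'B', 'C', 'D', 'E'));
    }
}

$reader = PHPExcel_IOFactory::createReader('Excel2007'); // or 'Excel5' / 'CSV'
$reader->setReadFilter(new FirstFiveColumnsFilter());
$objPHPExcel = $reader->load('bigfile.xlsx'); // placeholder path
$rows = $objPHPExcel->getActiveSheet()->rangeToArray('A1:E100');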

PHPExcel: read a spreadsheet, audit and apply formats, save the audited spreadsheet. Memory problems

Much has been documented regarding PHPExcel memory issues, so my question presumes that reading in "chunks", as described by Mark Baker, has been applied. My specific situation is this:
- Read a "raw" spreadsheet uploaded to a database record (working fine).
- Retrieve said spreadsheet, open it with PHPExcel, and audit the cells: read the rows and columns and look for conditions.
- If a cell has an informational/warning/error condition, apply conditional formatting and insert a cell note (working fine).
- Do this for all rows and columns. Runs out of memory before it completes.
- Save the formatted spreadsheet into a server directory so the user can download it and look at all the embedded errors, warnings, and informational items. This also works fine if I limit the number of rows read.
The spreadsheet is not especially big: 130 or so rows and 60 columns.
This is what I have tried:
- Reading in "chunks" à la Mark Baker (the standard filter is sketched below). The problem is that the read filter only returns the subset of rows associated with the chunk rule, so when I save at the end of a chunk's read and processing, only those rows are saved, not the whole spreadsheet.
- Using cell caching:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_discISAM;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);
This allows me to read and update every row and the associated cells, but when I call $objWriter->save(str_replace('.php', '.xlsx', $theDIRandFileName)) I get a memory problem in XMLWriter.php:
Fatal error: Allowed memory size of 18874368 bytes exhausted (tried to allocate 238592 bytes) in ... /PHPExcel/Shared/XMLWriter.php on line 100
Argh!
Any assistance would be greatly appreciated. Right now it appears the only thing left to try is to open two copies, use the chunk approach, and somehow read the unedited version while updating the chunked version. There has to be a better way.
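For reference, the chunked read filter referred to above is Mark Baker's standard pattern from the PHPExcel documentation; a sketch of it (not the asker's actual code) looks like this:

// Standard chunked read filter: load only $chunkSize rows at a time,
// plus row 1 for headings; the workbook is re-loaded for each chunk.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row; otherwise only rows in this chunk
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

As the question observes, each chunked load contains only that chunk's rows, so saving after processing a chunk writes out a partial workbook; the filter solves the reading side, not whole-file rewriting.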

PHPExcel large data sets with multiple tabs - memory exhausted

Using PHPExcel I can run each tab separately and get the results I want, but if I add them all into one Excel file it just stops, with no error or anything.
Each tab consists of about 60 to 80 thousand records, and I have about 15 to 20 tabs. That is about 1,600,000 records split across multiple tabs (and this number will probably grow).
I have also worked around the 65,000-row limitation of .xls by using the .xlsx extension instead, with no problems, as long as I run each tab in its own Excel file.
Pseudo code (a rough PHPExcel sketch follows below):
read data from db
start the PHPExcel process
parse out data for each page (some styling/formatting but not much)
(each numeric column gets summed up in a totals row at the bottom of the sheet using the SUM formula)
save excel (xlsx format)
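In PHPExcel terms, that pseudo code corresponds roughly to the following sketch; $tabs (an array of tab name => database rows) and the summed column B are assumptions for illustration:

$objPHPExcel = new PHPExcel();
$sheetIndex = 0;
foreach ($tabs as $tabName => $rows) {
    // First tab reuses the default sheet, the rest are created
    $sheet = ($sheetIndex == 0)
        ? $objPHPExcel->getActiveSheet()
        : $objPHPExcel->createSheet();
    $sheet->setTitle($tabName);
    $rowNum = 1;
    foreach ($rows as $row) {
        $sheet->fromArray($row, null, 'A' . $rowNum++); // one record per row
    }
    // Totals via a SUM formula at the bottom, as described above
    $sheet->setCellValue('B' . $rowNum, '=SUM(B1:B' . ($rowNum - 1) . ')');
    $sheetIndex++;
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('report.xlsx'); // placeholder file name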
I have 3 GB of RAM, so that is not an issue, and the script is set to execute with no timeout.
I have used PHPExcel in a number of projects with great results, but such a large data set seems to be a problem.
Has anyone had this problem? Workarounds? Tips? Etc...
UPDATE:
The error log says: memory exhausted.
Besides adding more RAM to the box, is there anything else I can do?
Has anyone ever saved the current state and then edited the Excel file with new data?
I had the exact same problem, and googling around did not turn up a workable solution.
As PHPExcel builds objects and stores all data in memory before finally generating the document file, which is itself also built in memory, raising PHP's memory limit will never entirely solve this problem; that approach does not scale.
To really solve the problem, you need to generate the XLS file "on the fly". That is what I did, and now I can be sure that "download SQL result set as XLS" works no matter how many (millions of) rows are returned by the database.
The pity is that I could not find any library that offers "on-the-fly" XLS(X) generation.
I found this article on IBM developerWorks which gives an example of how to generate the XLS XML "on the fly":
http://www.ibm.com/developerworks/opensource/library/os-phpexcel/#N101FC
It works pretty well for me: I have multiple sheets with lots of data and did not even come close to the PHP memory limit. It scales very well.
Note that this example uses the Excel plain XML format (file extension ".xml"), so you can send your uncompressed data directly to the browser.
http://en.wikipedia.org/wiki/Microsoft_Office_XML_formats#Excel_XML_Spreadsheet_example
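A minimal sketch of that streaming approach (the PDO handle and query are assumed placeholders, not the article's exact code):

// Stream the Excel 2003 SpreadsheetML format row by row; nothing is
// buffered, so memory use stays flat however many rows are returned.
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment; filename="report.xml"');

echo '<?xml version="1.0"?>';
echo '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"';
echo ' xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">';
echo '<Worksheet ss:Name="Sheet1"><Table>';

$stmt = $pdo->query('SELECT * FROM big_table'); // placeholder query
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    echo '<Row>';
    foreach ($row as $value) {
        echo '<Cell><Data ss:Type="String">'
           . htmlspecialchars($value, ENT_QUOTES) . '</Data></Cell>';
    }
    echo '</Row>';
}

echo '</Table></Worksheet></Workbook>';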
If you really need to generate an XLSX, things get more complicated. XLSX is a compressed archive containing multiple XML files, so you must write all your data to disk (or memory, the same problem as with PHPExcel) and then create the archive from that data.
http://en.wikipedia.org/wiki/Office_Open_XML
It may also be possible to generate compressed archives "on the fly", but that approach seems really complicated.

Why does PHPExcel not allow writing more than 5000 rows?

Can anyone please tell me why PHPExcel does not allow more than 5,000 rows?
I am using the open-source PHPExcel library for report generation in my projects, and I cannot write more than 5,000 rows of data from a MySQL database. My result set contains 7,230 records when the query is executed. How do I fix this?
Almost certainly this is a timeout or a memory issue. The only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns when using the Excel5 Writer, or 1,048,576 rows and 16,384 (XFD) columns when using the Excel2007 Writer.
Ensure that your error logging is always enabled, use try/catch blocks to trap any PHPExcel exceptions, and read the PHPExcel site discussion threads on memory and performance.
I had the same problem. You will need to allow enough execution time and a high enough memory limit.
I have tested my solution on 3 different servers; here is the result for about 5,000 records (12 columns):
Reading file: 09:48:22, peak memory usage: 1.25 MB
Reading data: 09:48:31, peak memory usage: 54.5 MB
After indexing data into an array: 09:48:35, peak memory usage: 68.5 MB
Records: 4,504
When I increased the memory and time limits to read 22,000 records, peak usage after indexing went up to 370 MB.
Here is the solution (given that everything else in the code sequence is correct). Where you call PHPExcel in your program/function, set the execution time first:
ini_set('max_execution_time', '300'); // seconds; adjust to your data, see the manual
Do all initialization here so that all objects are ready, then allocate memory for reading the file and indexing the data into the program's internal structure:
ini_set('memory_limit', '512M'); // your memory limit, as a string; adjust as needed
$excel = $excelReader->load($filePath);
echo "Memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB";
//do the rest of the structure!
A good idea is to manage all this in categories of data, so you do not run into a 400 MB allocation, which is prone to errors!
As noted above, the only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns with the Excel5 Writer, or 1,048,576 rows and 16,384 (XFD) columns with the Excel2007 Writer. You can change this line
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
to
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
and it will then allow writing more than 65,536 rows.
Without your code, or the class's code, it is quite difficult to say, I believe... Do you mean you can't write more than 5K rows to an XLS file, or to a single worksheet? If the latter, an ugly workaround could be writing 5K rows to the first sheet and the rest to a second (so 5K rows per sheet as the DB gets bigger).
I don't think XLS has a 5K-row limitation, so something is probably wrong or misconfigured in your script. Have you tried several times? Does it always stop at 5K rows, or could it be due to timeouts (of your script or connection)?
This type of issue is more than likely a server memory issue. What type of server are you on, and are you sure it has enough memory and resources available to process large data files? If you can't tell either way, the best workaround is to read in a few thousand records at a time, process them, and then move on to the next chunk. I would prefer to break the large data file into manageable pieces (files), process each of those pieces to get the desired outcome, and then merge them back together into a new large data file once all pieces are processed.
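A sketch of that chunked approach on the database side (table name, chunk size, and the PDO handle are illustrative assumptions):

// Process the result set a few thousand records at a time instead of
// holding everything in memory at once.
$chunkSize = 2000;
$offset = 0;
do {
    $stmt = $pdo->prepare('SELECT * FROM report_data LIMIT :lim OFFSET :off');
    $stmt->bindValue(':lim', $chunkSize, PDO::PARAM_INT);
    $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        // ... process/write this row, then let it go out of scope
    }
    $offset += $chunkSize;
} while (count($rows) == $chunkSize);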

Is there any PHP performance issue in exporting to CSV or Excel?

I am exporting huge amounts of data to Excel using PHP.
I changed the format to CSV and plain text,
and I see no difference in file size.
So does PHP performance have anything to do with the file format,
even when the rows and columns are the same?
Consider 60 columns and 100,000 rows.
Is there any optimization technique we should take care of,
other than raising the memory limit and execution time in php.ini?
As far as I know, the various Excel libraries for PHP build the spreadsheet in memory, which can cause problems for very large data sets. CSV/txt, on the other hand, can be written out to disk or to the client one row at a time, so memory usage is minimal.
Performance-wise, the Excel libraries will always have larger overhead. There are all kinds of Excel-specific binary bits in the file which need special handling in PHP, whereas CSV is just plain text. PHP's core purpose is to spit out large amounts of text very quickly, so generating CSV/txt is always going to be faster. And of course there is function-call overhead. In pseudo code, consider the difference between:
CSV:
echo "$column1, $column2, $column3, $column4";
versus Excel:
$workbook->write('A1', $column1);
$workbook->write('B1', $column2);
$workbook->write('C1', $column3);
$workbook->write('D1', $column4);
etc...
On the plus side for Excel, particularly with XLSX, there is some compression, so the same amount of data takes up less space. This can be matched somewhat by enabling compression in the web server, or by feeding the CSV/txt output into a zip library server-side.
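For the CSV side, a minimal streaming sketch; fputcsv writes each row straight to the client, so with 60 columns and 100,000 rows memory use stays flat (the query is a placeholder):

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');
$stmt = $pdo->query('SELECT * FROM export_table'); // placeholder query
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($out, $row); // quotes/escapes fields as needed
}
fclose($out);

Compression can then be left to the web server (e.g. gzip output compression) rather than done in PHP.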
