I am using PHPExcel for Excel file generation.
foreach ($rows as $ex_row => $row) {
    foreach ($row as $col => $value) {
        $objPHPExcel->getActiveSheet()->setCellValueByColumnAndRow($col, $ex_row, $value);
    }
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
I have a huge result set of more than 60,000 records with 60 columns.
I think PHPExcel holds every cell value in an object array and only writes to the file at the very end. Because the data set is so large, I am getting a request timeout error.
To avoid that, I am planning to write row by row. Is it possible to write row by row to the Excel file and save it at the end?
If the real problem isn't the speed of creation but the fact that PHP is giving you a timeout error, you could always place this at the top of your script:
set_time_limit(0);
Passing 0 will allow the script to run and run and run...
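Besides lifting the time limit, memory pressure from a 60,000-row workbook can be reduced with PHPExcel's cell-caching feature, which keeps cell objects in php://temp instead of PHP memory. This is a minimal sketch; the 32MB threshold is an arbitrary example value, not something from the question:

```php
<?php
require_once 'PHPExcel.php';

// Enable cell caching BEFORE creating the workbook: cell objects are kept
// in php://temp once the in-memory cache exceeds the configured size.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '32MB'); // example threshold
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

set_time_limit(0); // then lift the execution time limit as suggested above

$objPHPExcel = new PHPExcel();
// ... populate cells as in the question, then write with the Excel2007 writer
```

Caching trades save speed for memory, so it helps most when the script dies from memory exhaustion rather than pure CPU time.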
Related
I am trying to read an Excel file in my CodeIgniter application. The call getActiveSheet()->toArray(null,true,true,true); works fine for an Excel file with 14442 x 17 cells, but it does not work for an Excel file with 17590 x 17 cells. In this second case, the browser ends up with a blank page and I am not getting any error. What could be the issue?
Code:
$objPHPExcel = PHPExcel_IOFactory::load($file_path);
$allDataInSheet = $objPHPExcel->getActiveSheet()->toArray(null,true,true,true);
Probably out of memory. It is a common issue with large Excel files.
If you only need to read the data, you can use something like:
$objReader = PHPExcel_IOFactory::createReaderForFile($file);
$objReader->setReadDataOnly(true);
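If read-only mode is still not enough, PHPExcel also supports chunked reading via a read filter, so toArray() never has to hold the whole sheet at once. This is a sketch of that documented technique; the class name, chunk size, and row range are illustrative, not from the question:

```php
<?php
require_once 'PHPExcel/IOFactory.php';

// A read filter that only admits a window of rows (plus the header row).
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow;
    private $endRow;

    public function __construct($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    // Called for every cell; return true only for rows inside the window.
    public function readCell($column, $row, $worksheetName = '')
    {
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$objReader = PHPExcel_IOFactory::createReaderForFile($file);
$objReader->setReadDataOnly(true);
$objReader->setReadFilter(new ChunkReadFilter(2, 1000)); // rows 2..1001
$objPHPExcel = $objReader->load($file);
```

You would then loop, advancing the start row by the chunk size and reloading, processing one window of the 17590 rows at a time.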
I am trying to open an existing Excel file, modify some cells, and save it. I am using the Excel2007 reader and writer.
The input file is about 1 MB and contains a few formulas, protected data, and hidden rows, columns, and worksheets, none of which I modify.
I am able to load the file and read and write some values into it, which I verify with various var_dump calls in the code.
The problem is saving it. It throws fatal errors from timeouts, and when it does write the file, the file size balloons to 9.2 MB, which would be okay if I could open it.
Code snippet - nothing fancy:
$objReader = PHPExcel_IOFactory::createReader('Excel2007');
$objPHPExcel = $objReader->load($inputFile);
$objPHPExcel->setActiveSheetIndex(2);
$activeSheet = $objPHPExcel->getActiveSheet();
$currCell = $activeSheet->getCell("O3");
$cellValidation = $currCell->getDataValidation();
$values = array();
if ($cellValidation->getShowDropDown() == true)
{
$values = $cellValidation->getFormula1();
$valArray = explode(",", $values);
$currCell->setValue($valArray[0]);
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
$objWriter->save($outputFile);
I use MS Excel 2010 to open the resulting file, but it just takes forever and has never opened it even once.
Please help me troubleshoot this by giving me pointers as to where I should be looking.
Any help is greatly appreciated.
Instead of saving it to a file, save it to php://output:
$objWriter->save('php://output');
This will send it AS-IS to the browser.
You want to send some headers first, as is common with file downloads, so the browser knows the file type and what it should be named (the filename):
// We'll be outputting an Excel 2007 (.xlsx) file
header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
// It will be called file.xlsx
header('Content-Disposition: attachment; filename="file.xlsx"');
// Write file to the browser
$objWriter->save('php://output');
First send the headers, then save. For the Excel headers, see also the following question: Setting mime type for excel document.
So the final code would contain the lines below:
// Save Excel 2007 file
#echo date('H:i:s') . " Write to Excel2007 format\n";
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
ob_end_clean();
// We'll be outputting an Excel 2007 (.xlsx) file
header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
// It will be called file.xlsx
header('Content-Disposition: attachment; filename="file.xlsx"');
$objWriter->save('php://output');
I think this line:
ob_end_clean();
should solve your problem.
Thanks!
There are many reasons for that "bloat", and it very much depends on the actual data in the worksheet; MS Excel itself uses a lot of different techniques to keep the file size small, whereas PHPExcel writes a simple version of the OfficeOpenXML format.
For example, MS Excel looks at the string content of all cells, and stores the individual strings in a string table. If a string is used by two or more cells, there will only be a single entry in the string table. However, there's a performance overhead in checking if a string already exists in the string table, so PHPExcel doesn't perform that check but will duplicate entries in the string table. This means that it will create a large file because of the duplication, but keeps the save speed as fast as possible.
Similarly, MS Excel looks at all formulae, and if the formula is similar to an existing formula (with only a row/column offset difference) it will store it as a shared formula rather than a cell formula, so the actual formula data is only stored once. Again, PHPExcel won't perform this check, because it is a big performance overhead in the save, so it stores every formula as a cell formula rather than a shared formula.
And no, I can't explain why the file doesn't load in MS Excel 2010, nor will I be able to explain it without being able to run the whole thing through a debugger.
Actually, this is about saving huge data to an Excel file, but Excel has a limit on rows per sheet, so I want to create several sheets according to the number of data rows. Here is the problem: say I have 1000 rows in the database and I want to split them into chunks of 500, putting the first 500 into the first sheet. How do I make that happen? I have tried the following code:
$this->load->library('excel');
$this->excel->setActiveSheetIndex(0);
$this->excel->getActiveSheet()->setCellValue('A1','NAME');
for ($i = 1; $i <= 1000; $i++) {
    if ($i >= 500) {
        // save first 500 rows to first sheet and then continue for the rest to 2nd sheet
    }
    $this->excel->getActiveSheet()
        ->setCellValue("A$i", $i);
}
I want each sheet to hold 500 rows. Is it possible to make that happen?
Basically something like this:
$inserted = 0;
for (...) {
    $inserted++;
    if ($inserted % 500 == 0) {
        // start a new sheet
    }
}
Keep a counter. Whenever the counter reaches a multiple of 500, start a new sheet and start inserting into that.
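Concretely, with PHPExcel's sheet API, the counter approach might look like the sketch below: every 500 rows, create a new worksheet and reset the row pointer. The sheet titles and the 1000-row loop are illustrative:

```php
<?php
// Assumes the CodeIgniter PHPExcel wrapper from the question is loaded.
$this->load->library('excel');

$rowsPerSheet = 500;
$sheetIndex   = 0;
$sheet = $this->excel->setActiveSheetIndex(0); // start on the first sheet
$sheet->setTitle('Sheet-1');

$row = 1; // row pointer within the current sheet
for ($i = 1; $i <= 1000; $i++) {
    if ($row > $rowsPerSheet) {
        // Current sheet is full: add a new worksheet and switch to it.
        $sheetIndex++;
        $sheet = $this->excel->createSheet($sheetIndex);
        $sheet->setTitle('Sheet-' . ($sheetIndex + 1));
        $row = 1; // restart numbering on the new sheet
    }
    $sheet->setCellValue("A$row", $i);
    $row++;
}
```

Tracking a per-sheet row pointer separately from the data index keeps cell addresses starting at A1 on every sheet, which the modulo-only version would not do.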
Although my application is built using the Yii framework this is more of a general PHP issue (I think).
I have some code which takes a Yii CActiveDataProvider, loops over it and builds a CSV export. This works great and correctly builds and sends the CSV.
I encounter a problem when trying to export a larger dataset. I have successfully output ~2500 records without any problem, but when I run the exact same code for a larger set of data (~5000 records) the script appears to run okay but sends a zero-length/blank CSV. I can't figure out why... it seems to run for a while and then sends the CSV, with no errors or warnings in the logs. Could it be that the output is being flushed or similar before it's ready?
Code is as follows (added a couple of inline comments here for clarity):
<?php
header('Content-type: text/csv');
header('Content-Disposition: attachment; filename="vacancies.csv"');
set_time_limit(240); // I know this is arbitrarily long, it's just to avoid any timeout
$outstream = fopen("php://output", 'w');
$headings = array(
$vacancy->getAttributeLabel('vacancy_id'), // this is a Yii method that returns the active record attribute as a string
...
);
fputcsv($outstream, $headings, ',', '"');
foreach($dp->getData() as $vacancy){ // the getData() method pulls the next active record model out of the Yii dataprovider and the values for various attributes are set below
$row = array(
$vacancy->vacancy_id,
...
);
fputcsv($outstream, $row, ',', '"');
}
fclose($outstream);
?>
Any thoughts on why this is working ok up to a certain number of records?
Update
After re-checking the logs as suggested below I've found I am in fact running out of memory, doh!
I can write out to the filesystem and that gets me up to about 3000 records but then runs out of memory. Any idea of the best way to alter my code to avoid running out of memory?
Thanks very much for the suggestions to check the error logs, I had somehow missed an out of memory error I was getting.
The problem was in fact caused by the way I was using the CActiveDataProvider from the Yii framework. Reading straight from the DataProvider as I was doing in my question loaded each row into memory; as the script ran on, this meant I eventually ran out of memory available to PHP.
There are a couple of ways to fix this, one is to set pagination on the dataprovider to a smaller number of records and to manually iterate over the data, loading only the pagesize into memory each iteration.
The option I went for is to use a CDataProviderIterator to handle this for me: $iterator = new CDataProviderIterator($dp); this prevents memory filling up with the records I'm retrieving.
Note that I also had to add an ob_flush(); call to prevent the output buffer from filling up with the CSV contents itself.
For reference I ended up with the following:
<?php
header('Content-type: text/csv');
header('Content-Disposition: attachment; filename="vacancies.csv"');
set_time_limit(240);
$outstream = fopen("php://output", 'w');
$headings = array(
$vacancy->getAttributeLabel('vacancy_id'),
...
);
fputcsv($outstream, $headings, ',', '"');
$iterator = new CDataProviderIterator($dp); // create an iterator instead of just using the dataprovider
foreach($iterator as $vacancy){ // use the new iterator here
$row = array(
$vacancy->vacancy_id,
...
);
fputcsv($outstream, $row, ',', '"');
ob_flush(); // explicitly call a flush to avoid filling the buffer
}
fclose($outstream);
?>
Would not have thought to go back and look at the logs again without the suggestion so many thanks :)
Please help me with a problem exporting large data to Excel xlsx format.
I am exporting almost 45,000 records at a time to a single file and it times out without saving the file.
The MySQL SELECT query alone takes 21 seconds to execute for that much data. Below is my code to export the data into an Excel file using the PHPExcel library.
$sql2 = "SELECT * FROM Surveys";
$result2 = mysql_query($sql2);
$rowscounter = 0; // was never initialised in the original
while($row2 = mysql_fetch_assoc($result2))
{
$j=$rowscounter+2;
$sheet->setCellValue("A$j",$row2['brandname']);
$sheet->setCellValue("B$j",$row2['productname']);
$sheet->setCellValue("C$j",$row2['sname']);
$sheet->setCellValue("D$j",$row2['smobile']);
$sheet->setCellValue("E$j",$row2['semail']);
$sheet->setCellValue("F$j",$row2['country']);
$sheet->setCellValue("G$j",$row2['city']);
$sheet->setCellValue("H$j",$row2['sdob']);
$sheet->setCellValue("I$j",$row2['age']);
$sheet->setCellValue("J$j",$row2['comment']);
$sheet->setCellValue("K$j",$row2['outletcode']);
$sheet->setCellValue("L$j",$row2['username']);
$sheet->setCellValue("M$j",$row2['datetime']);
$sheet->setCellValue("N$j",$row2['duration']);
$rowscounter++;
}
// Rename worksheet
$sheet->setTitle('Survey-Report');
$objPHPExcel->setActiveSheetIndex(0);
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
if (file_exists("Survey-Report.xlsx")) unlink("Survey-Report.xlsx"); // avoid a warning on first run
$objWriter->save('Survey-Report.xlsx');
echo "ok";
UPDATE:
I forgot to mention that I already tried set_time_limit etc. and wrote the code below in my PHP file.
set_time_limit(0);
ini_set('memory_limit','2500M');
You could add this at the top of your script:
set_time_limit (0);
That will disable PHP's default 30-second timeout.
Or you could pass a custom number of seconds; see set_time_limit().
I had the same issue where I kept getting a 504 Gateway Timeout when exporting records using PHPExcel. I also tried set_time_limit(0) with no success. What I ended up doing was changing the Apache timeout.
You can find this timeout in your httpd.conf file.
Hope it helps!
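For reference, the relevant directive is Timeout, which caps how long Apache waits for the script to produce output. The value below is only an example; pick one longer than your worst-case export time:

```apacheconf
# httpd.conf - maximum time (seconds) Apache waits on the request
# 600 here is an example value, not a recommendation
Timeout 600
```

Remember to restart Apache after editing httpd.conf for the change to take effect.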