PHPExcel timeout occurs - php

Please help me with the problem of exporting large data to Excel (xlsx) format.
I am exporting almost 45000 records at a time to a single file, and the script times out without saving the file.
The MySQL SELECT query alone takes 21 seconds to execute for that much data. Below is my code to export the data into an Excel file using the PHPExcel library.
$sql2 = "SELECT * FROM Surveys";
$result2 = mysql_query($sql2);
while ($row2 = mysql_fetch_assoc($result2))
{
    $j = $rowscounter + 2;
    $sheet->setCellValue("A$j", $row2['brandname']);
    $sheet->setCellValue("B$j", $row2['productname']);
    $sheet->setCellValue("C$j", $row2['sname']);
    $sheet->setCellValue("D$j", $row2['smobile']);
    $sheet->setCellValue("E$j", $row2['semail']);
    $sheet->setCellValue("F$j", $row2['country']);
    $sheet->setCellValue("G$j", $row2['city']);
    $sheet->setCellValue("H$j", $row2['sdob']);
    $sheet->setCellValue("I$j", $row2['age']);
    $sheet->setCellValue("J$j", $row2['comment']);
    $sheet->setCellValue("K$j", $row2['outletcode']);
    $sheet->setCellValue("L$j", $row2['username']);
    $sheet->setCellValue("M$j", $row2['datetime']);
    $sheet->setCellValue("N$j", $row2['duration']);
    $rowscounter++;
}
// Rename worksheet
$sheet->setTitle('Survey-Report');
$objPHPExcel->setActiveSheetIndex(0);
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
unlink("Survey-Report.xlsx");
$objWriter->save('Survey-Report.xlsx');
echo "ok";
UPDATE:
I forgot to mention that I have already tried set_time_limit() etc. and added the lines below to my PHP file.
set_time_limit(0);
ini_set('memory_limit','2500M');

You could add this at the top of your script:
set_time_limit(0);
That will disable the default 30-second PHP timeout.
Or you could pass a custom number of seconds; see set_time_limit()

I had the same issue where I kept getting a 504 Gateway Timeout when exporting records using PHPExcel. I also tried set_time_limit(0) with no success. What I ended up doing was changing the Apache timeout.
You can find this timeout in your httpd.conf file.
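For reference, the relevant directive in httpd.conf looks like the line below; the value is in seconds, and 300 here is only an example, not a recommendation:
Timeout 300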
Hope it helps!

Related

Update table values by CSV files which is huge PHP

I need to update a table that contains more than 10k records through a CSV file on the server. The problem is that it shows "Server Timeout" or an error after a few minutes.
I have added these lines
ini_set('memory_limit', '512M');
ini_set('max_execution_time', '180');
before
$this->db->where('part_no',$insert_csv['part_no']);
$this->db->update('mst_parts', $data4);
I am getting this error:
"This page isn’t working
'xxxxxxxx.com' took too long to respond.
HTTP ERROR 504"
You can read the CSV source in chunks, for example reading only 1000 rows at a time, like:
$startLine = 1;
$file = new SplFileObject('source.csv');
$file->seek($startLine-1);
echo $file->current();
And on the next run you would increment $startLine.
Documentation: SplFileObject::seek()
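A slightly fuller sketch of that idea, assuming a chunk size of 1000 rows and that $startLine is persisted between runs (for example in the session or a progress table):
$startLine = 1;                           // 1-based line to resume from; persist between runs
$chunkSize = 1000;                        // rows to process per run

$file = new SplFileObject('source.csv');
$file->setFlags(SplFileObject::READ_CSV); // current() now returns the parsed CSV row
$file->seek($startLine - 1);              // SplFileObject line numbers are 0-based

for ($i = 0; $i < $chunkSize && !$file->eof(); $i++) {
    $row = $file->current();              // array of fields for this CSV line
    // ... run the database update for $row here ...
    $file->next();
}

$startLine += $chunkSize;                 // remember where to continue on the next run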

how to increase phpExcel row limit to above 24000?

I have used the code below to generate a report in Excel:
require_once "phpexcel/class.writeexcel_workbook.inc.php";
require_once "phpexcel/class.writeexcel_worksheet.inc.php";
$fname = tempnam("/tmp", "simple.xls");
$workbook = &new writeexcel_workbook($fname);
$gen =& $workbook->addformat();
$gen->set_align('left');
$gen->set_num_format('General');
$worksheet = &$workbook->addworksheet("records");
$worksheet->write_string(0,0,'Customer');
$worksheet->write_string(0,1,'ID');
$worksheet->set_column(0,0,30,$gen);
$worksheet->set_column(0,1,10,$gen);
$j=1;
while ($res = mysql_fetch_array($qry))
{
    $worksheet->write($j, 0, $res['cust']);
    $worksheet->write($j, 1, $res['id']);
    $j++;
}
$workbook->close();
header("Content-Type: application/x-msexcel; name=records.xls");
header("Content-Disposition: inline; filename=records.xls");
When I run this file it generates the output correctly; the only problem is that when more than 24000 records come back it shows an error message:
if I set 1 to 24000 records, it works
if I set the records above 24000, it works
if I set all, it does not work (at that point about 26000 records come back)
You may try this official example:
https://github.com/PHPOffice/PHPExcel/blob/7d1c140974a4988f8a9739d167335d24fb59955e/Examples/06largescale-xls.php
Also, consider increasing the PHP memory limit:
PHPExcel holds an "in memory" representation of a spreadsheet, so it is susceptible to PHP's memory limitations. The memory made available to PHP can be increased by editing the value of the memory_limit directive in your php.ini file, or by using ini_set('memory_limit', '128M') within your code (ISP permitting).
Here is a similar question
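If memory is the limiting factor, PHPExcel also supports cell caching, which some of its large-scale examples demonstrate. A minimal sketch, assuming the PHPExcel 1.8 class names; it must be configured before the workbook is created or loaded:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '64MB'); // keep up to 64MB of cells in memory, spill the rest to php://temp
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = new PHPExcel(); // caching must be set up before this point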

Issue with max_execution_time during file import in php 5.4.22

PHP version: 5.4.22
JS: ExtJS
I'm facing the following problem:
I have a CSV import in PHP, and after about 30 to 30.5 seconds the XHR request is cancelled; the browser (Chrome, Firefox) then hangs on and eventually finishes the process.
All the data is imported into the database. OK, that's good, but:
At the beginning of the import function I write a log entry ("Start") and at the end another one ("End: XX imported files.").
The problem is that there are two start and two end log entries in the database.
It looks like those entries are generated during a second start of the function.
I've tried:
ini_set('max_execution_time', 0);
ini_set('memory_limit', '128M');
ini_set('upload_max_filesize', '12M');
But without any result.
I've changed the following settings:
remoteApi.maxRetries = 0;
remoteApi.timeout = 300000;
With retries disabled and a five-minute client-side timeout, the timed-out XHR is no longer re-sent, so the import runs only once. Now it works perfectly!

PHPExcel unable to open the saved file

I am trying to open an existing Excel file, modify some cells and save it. I am using Excel2007 for both the reader and the writer.
The input file is about 1 MB and has a few formulas, protected data, and hidden rows, columns and worksheets which I do not modify.
I am able to load the data and read and write some values into it, which I check with various var_dumps in the code.
The problem is saving it. It throws fatal errors on timeouts, and when it does write the file, the size is bloated to 9.2 MB, which would be okay if I could open it.
Code snippet - nothing fancy:
$objReader = PHPExcel_IOFactory::createReader('Excel2007');
$objPHPExcel = $objReader->load($inputFile);
$objPHPExcel->setActiveSheetIndex(2);
$activeSheet = $objPHPExcel->getActiveSheet();
$currCell = $activeSheet->getCell("O3");
$cellValidation = $currCell->getDataValidation("O3");
$values = array();
if ($cellValidation->getShowDropDown() == true)
{
    $values = $cellValidation->getFormula1();
    $valArray = explode(",", $values);
    $currCell->setValue($valArray[0]);
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
$objWriter->save($outputFile);
I use MS Excel 2010 to open the resulting file, but it just takes forever and has never opened it even once.
Please help me troubleshoot this by giving me pointers as to where I should be looking.
Any help is greatly appreciated.
Instead of saving it to a file, save it to php://output:
$objWriter->save('php://output');
This will send it as-is to the browser.
You will want to add some headers first, as is common with file downloads, so the browser knows which type the file is and how it should be named (the filename):
// We'll be outputting an excel file
header('Content-type: application/vnd.ms-excel');
// It will be called file.xls
header('Content-Disposition: attachment; filename="file.xls"');
// Write file to the browser
$objWriter->save('php://output');
First send the headers, then save. For the Excel headers, see also the following question: Setting mime type for excel document.
So the final code would have the lines below:
// Save Excel 2007 file
#echo date('H:i:s') . " Write to Excel2007 format\n";
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
ob_end_clean();
// We'll be outputting an excel file
header('Content-type: application/vnd.ms-excel');
// It will be called file.xls
header('Content-Disposition: attachment; filename="file.xlsx"');
$objWriter->save('php://output');
I think this line:
ob_end_clean();
should solve your problem.
Thanks!
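Note that the Excel2007 writer actually produces an .xlsx file, so a more accurate Content-Type for it would be the one below (the filename is just an example):
header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
header('Content-Disposition: attachment; filename="file.xlsx"');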
There's a whole lot of reasons for that "bloat" and it very much depends on the actual data in the worksheet, but MS Excel itself uses a lot of different techniques to keep the filesize small, whereas PHPExcel writes a simple version of the OfficeOpenXML format.
For example, MS Excel looks at the string content of all cells, and stores the individual strings in a string table. If a string is used by two or more cells, there will only be a single entry in the string table. However, there's a performance overhead in checking if a string already exists in the string table, so PHPExcel doesn't perform that check but will duplicate entries in the string table. This means that it will create a large file because of the duplication, but keeps the save speed as fast as possible.
Similarly, MS Excel looks at all formulae, and if the formula is similar to an existing formula (with only a row/column offset difference) it will store it as a shared formula rather than a cell formula, so the actual formula data is only stored once. Again, PHPExcel won't perform this check, because it is a big performance overhead in the save, so it stores every formula as a cell formula rather than a shared formula.
And no, I can't explain why the file doesn't load in MS Excel 2010, nor will I be able to explain it without being able to run the whole thing through a debugger.

PHPExcel - Can it write row by row while fetching from the DB

I am using PHPExcel for Excel generation.
// pseudocode: one iteration per row fetched from the database
while ($row = mysql_fetch_assoc($result))
{
    $objPHPExcel->getActiveSheet()->setCellValueByColumnAndRow($col, $ex_row, $value);
}
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
I have a huge result set of more than 60000 records with 60 columns.
I think PHPExcel sets the values, keeps everything in an object in memory, and only writes to the file at the end. Since PHP is not good with such large arrays and the data is huge, I am getting a request timeout error.
To avoid that, I am planning to write row by row. Is it possible to write row by row to the Excel file and save it at the end?
If the real problem isn't the speed of the creation itself but the fact that PHP is giving you a timeout error, you could always place this at the top of your script:
set_time_limit(0);
The 0 will allow the script to run and run and run...
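On the row-by-row part of the question: with PHPExcel you do set cells row by row as you fetch, but the whole workbook stays in memory and the file is written only once, when save() is called. A minimal sketch of that pattern, assuming mysql_* as in the question and a $result resource from the query:
set_time_limit(0);                 // no execution-time limit for this export
ini_set('memory_limit', '1024M');  // the whole sheet still lives in memory until save()

$sheet = $objPHPExcel->getActiveSheet();
$ex_row = 1;
while ($row = mysql_fetch_assoc($result)) {
    $col = 0;                      // setCellValueByColumnAndRow() uses 0-based columns in PHPExcel
    foreach ($row as $value) {
        $sheet->setCellValueByColumnAndRow($col++, $ex_row, $value);
    }
    $ex_row++;
}

$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('report.xlsx');   // the file is written once, here at the end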
