I have used the code below to generate a report in Excel:
require_once "phpexcel/class.writeexcel_workbook.inc.php";
require_once "phpexcel/class.writeexcel_worksheet.inc.php";
$fname = tempnam("/tmp", "simple.xls");
$workbook = &new writeexcel_workbook($fname);
$gen =& $workbook->addformat();
$gen->set_align('left');
$gen->set_num_format('General');
$worksheet = &$workbook->addworksheet("records");
$worksheet->write_string(0,0,'Customer');
$worksheet->write_string(0,1,'ID');
$worksheet->set_column(0,0,30,$gen);
$worksheet->set_column(0,1,10,$gen);
$j = 1;
while ($res = mysql_fetch_array($qry)) {
    $worksheet->write($j, 0, $res['cust']);
    $worksheet->write($j, 1, $res['id']);
    $j++;
}
$workbook->close();
header("Content-Type: application/x-msexcel; name=records.xls");
header("Content-Disposition: inline; filename=records.xls");
When I run this file it generates the output correctly, but if more than 24,000 records come back it shows an error message:
if I fetch records 1 to 24,000, it works
if I fetch only the records above 24,000 on their own, it also works
if I fetch all records at once (at the moment about 26,000), it does not work
You may try this official example:
https://github.com/PHPOffice/PHPExcel/blob/7d1c140974a4988f8a9739d167335d24fb59955e/Examples/06largescale-xls.php
Also, consider increasing the PHP memory limit.
PHPExcel holds an "in memory" representation of a spreadsheet, so it is susceptible to PHP's memory limitations. The memory made available to PHP can be increased by editing the value of the memory_limit directive in your php.ini file, or by using ini_set('memory_limit', '128M') within your code (ISP permitting).
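As a minimal sketch, the run-time variant looks like this (128M is just an example value; use whatever your host allows):

```php
<?php
// Raise the memory limit for this script only; whether the change is
// honoured depends on the host allowing ini_set() for memory_limit.
ini_set('memory_limit', '128M');

// ini_get() reports the value now in effect for this request.
echo ini_get('memory_limit'); // e.g. "128M" if the change was allowed
```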
Here is a similar question
So I have this CodeIgniter project that converts data from a database into CSV, deployed on a server I access over SSH.
I try to load all the data (over 2,000,000 rows) and convert it to CSV.
On my first try I filtered for rows that have an email address (which gives me about 66,000 rows), and it successfully exported the data to CSV (it took a while).
But when I try to export all the data, after I click "Convert to CSV" the page loads for a long time and the browser gives this error:
This page isn’t working
<server-ip-address> didn’t send any data.
ERR_EMPTY_RESPONSE
Does this have something to do with the server?
I tried changing these settings in /etc/php.ini:
max_execution_time = 259200
max_input_time = 259200
memory_limit = 300M
session.gc_maxlifetime = 1440
But it still gives me the same error.
How can I resolve this? Please help.
UPDATE: here is my code for the CSV download:
public function convcsv()
{
    ini_set('memory_limit', '-1');
    set_time_limit(0);
    $prefKey = $this->session->flashdata('prefKey');
    $searchKey = $this->session->flashdata('searchKey');
    $withEmail = $this->session->flashdata('withEmail');
    log_message('debug', 'CONVCSV prefKey = ' . $prefKey);
    log_message('debug', 'CONVCSV searchKey = ' . $searchKey);
    $list = $this->user_model->get_users($prefKey, $searchKey, $withEmail, "", "");
    log_message('debug', 'Fetched data');
    $headerArray = array("id", "prefecture_id", "industry_id", "offset", "name", "email");
    // Header
    $header = str_replace(",", "", $headerArray);
    $datas = implode(',', $header) . "\r\n";
    // Body
    foreach ($list as $body) {
        // Join the array contents separated by ","
        $orig_email = $body['email'];
        $mstring = preg_replace("/^([^a-zA-Z0-9])*/", ',', $orig_email);
        preg_match_all("/[\._a-zA-Z0-9-]+@[\._a-zA-Z0-9-]+/i", $mstring, $matches);
        $email = implode($matches[0]);
        //$email = $matches[0];
        $datas .= $body["id"] . "," . $body["prefecture_id"] . "," . $body["industry_id"] . "," . $body["offset"] . "," . preg_replace('/[,]/', ' ', $body["name"]) . "," . $email . "\r\n";
    }
    // Convert the character encoding
    $datas = mb_convert_encoding($datas, "SJIS-win", "UTF-8");
    // Start the download
    $csvFileName = "phpList_" . date('Ymd_His') . ".csv";
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $csvFileName);
    header('Content-Transfer-Encoding: binary');
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    ob_start();
    print trim($datas);
    ob_flush();
    ob_end_clean();
    exit;
}
OK, I will try to explain this as best I can with the little information you gave. I will assume you can pull the data from the database; if not, you can use unbuffered queries in PDO (I have only used PDO for the last 4-5 years):
http://php.net/manual/en/mysqlinfo.concepts.buffering.php
As a side note, I've pulled 110 million rows from MySQL using unbuffered queries, though that was on a server with 56GB of RAM (Azure Standard_A8, it's pretty l33t).
For outputting the data: typically when the browser loads a page, the server "builds" it all server side and then dumps it on the browser in one go (generally speaking). In your case this is too much data. So,
(Pseudo-ish code)
set_time_limit(-1); // disable the time limit
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=data.csv');
$f = fopen('php://output', 'w');
while (false !== ($row = $stmt->fetch(PDO::FETCH_ASSOC))) {
    fputcsv($f, $row);
    flush();
}
The disadvantage is that there is no real way to tell the download's file size beforehand. We are basically sending the download headers, then dumping each line into the output stream and flushing it to the browser one line at a time.
Overall this is slower than doing it in one push, because there is more network traffic, but it manages memory better. Sometimes it is what it is.
You can see some examples of streaming output on this page:
http://php.net/manual/en/function.flush.php
And if it doesn't work, you might first have to use some settings like these:
#apache_setenv('no-gzip', 1);
ini_set('zlib.output_compression', 0);
ini_set('implicit_flush', 1);
The download should start pretty much instantly, though there is no guarantee that the complete file will be output, as an error partway through could prematurely end the script.
You may also have issues with memory_limit = 300M; I'm spoiled and have 2GB as the default and up to 6GB at run time: ini_set('memory_limit', '300M'); // set at runtime
Lastly, I feel compelled to say: don't set the time limit globally, but instead do it at run time with set_time_limit(-1);. That way it only affects this script and not the server as a whole. However, you may run into timeouts in Apache itself. It's a very tricky thing, because there are a lot of moving parts between the server and the browser, many of which can depend on the server, the server's OS, the browser, etc. (the environment).
Ideally you would do this through an FTP download, but this is probably the best solution (at least in concept); it's just a matter of sending easily digestible chunks.
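To make the shape of that loop concrete, here is a small self-contained sketch of the same streaming idea. `stream_csv()` is a hypothetical helper (not a CodeIgniter or PDO API); in the real controller `$rows` would be an unbuffered PDO statement and `$stream` would be `php://output`:

```php
<?php
// Hypothetical helper: write a header row and then each data row to a
// stream one line at a time with fputcsv(), so memory use stays flat
// regardless of how many rows the source yields.
function stream_csv($stream, array $header, $rows)
{
    fputcsv($stream, $header);
    foreach ($rows as $row) {
        fputcsv($stream, $row); // one line per row; the caller may flush()
    }
}

// Usage sketch for a browser download (table/column names are assumptions):
//   header('Content-Type: text/csv; charset=utf-8');
//   header('Content-Disposition: attachment; filename=data.csv');
//   $out  = fopen('php://output', 'w');
//   $stmt = $pdo->query('SELECT id, name, email FROM users'); // unbuffered
//   stream_csv($out, array('id', 'name', 'email'), $stmt);
```

Because the helper takes any stream, it can be exercised against `php://memory` without a database.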
I'm trying to create an Excel file: about 10 worksheets of 3 columns and roughly 30 rows each. I'm trying to highlight some groups of cells on each worksheet by setting some style properties (in addition to a couple of cell merges and column resizes). I'm finding that the document drops the styles after about the 4th worksheet.
My question is: can I do something to increase the number of styles I can apply to my document? Could it be that I am neglecting to do some cleanup? Some setting I'm missing?
I noticed some memory-related questions on SO that seemed relevant, so I checked the memory limits and tried caching. As far as I can tell, that doesn't seem to be the issue (please refute me if I'm wrong, though!).
I created a toy example to demonstrate the problem. On my test server, the styles give out on the 3rd worksheet (after about 50 applications).
Toy example (EDIT: due to this answer I changed the example a little so that the styles are clearly in disjoint regions):
EDIT: I tried the same thing on a different server (with perhaps a slightly newer version of PHPExcel) and all the styles appear to be preserved in the Excel5-formatted output, even after increasing the complexity and size.
<?php
require_once 'classes/PHPExcel.php';

ini_set('memory_limit', '64M'); // The default memory_limit in php.ini is at least this as well

$cacheMethod = \PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '32MB');
\PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$o = new \PHPExcel();

$style1 = array(
    'fill' => array(
        'type'  => \PHPExcel_Style_Fill::FILL_SOLID,
        'color' => array('rgb' => 'CCFFCC'),
    ),
    'font' => array(
        'size' => 17,
        'name' => 'Calibri Light',
        'bold' => false,
    ),
);
$style2 = array(
    'fill' => array(
        'type'  => \PHPExcel_Style_Fill::FILL_SOLID,
        'color' => array('rgb' => 'FFCCCC'),
    ),
    'font' => array(
        'size' => 17,
        'name' => 'Calibri Light',
        'bold' => false,
    ),
);

$maxws = 10;
$maxrow = 40;
for ($ws = 0; $ws < $maxws; $ws++) {
    $o->setActiveSheetIndex($ws);
    $o->getActiveSheet()->setTitle("TEST $ws");
    for ($row = 1; $row < $maxrow; $row++) {
        if ($row % 2) {
            $o->getActiveSheet()->getCell("A$row")->setValue("Styled!");
            $o->getActiveSheet()->getStyle("A$row")->applyFromArray($style1);
        } else {
            $o->getActiveSheet()->getCell("A$row")->setValue("Default");
        }
        $o->getActiveSheet()->getCell("B$row")->setValue("Default");
        if (!($row % 2)) {
            $o->getActiveSheet()->getCell("C$row")->setValue("Other style!");
            $o->getActiveSheet()->getStyle("C$row")->applyFromArray($style2);
        } else {
            $o->getActiveSheet()->getCell("C$row")->setValue("Default");
        }
    }
    if ($ws + 1 < $maxws) {
        $o->createSheet($ws + 1);
    }
}

//echo "Peak memory usage ". (memory_get_peak_usage(true)/1024/1024) . " MB\r\n"; die();

$filename = 'export_test.xls';
header("Content-Type: application/vnd.ms-excel");
header("Content-Disposition: attachment; filename=\"$filename.xls\"");
header("Cache-Control: max-age=0");

$objWriter = \PHPExcel_IOFactory::createWriter($o, "Excel5");
$objWriter->save("php://output");
exit;
Style limitations for OfficeOpenXML .xlsx files (Excel2007 Writer):
Unique cell formats/cell styles: 64,000
Fill styles: 256
Line weight and styles: 256
Unique font types: 1,024 global fonts available for use; 512 per workbook
Number formats in a workbook: Between 200 and 250, depending on the language version of Excel that you have installed
Style limitations for BIFF .xls files (Excel5 Writer):
Colours in a workbook: 56
Cell styles in a workbook: 4,000
Custom number formats: Between 200 and 250, depending on the language version of Excel you have installed.
Where possible, try to set styles in PHPExcel for a range of cells rather than for individual cells; so rather than doing
$o->getActiveSheet()->getStyle("A$row")->applyFromArray($style);
in your for loop, do
$o->getActiveSheet()->getStyle("A1:A$maxrow")->applyFromArray($style);
after the for loop has finished.
This is not a complete answer, but it solves the issue for me.
The problem was the export format "Excel5" in the createWriter factory method. Switching to "Excel2007" fixes the issue (the styles I expect to see appear correctly). This works for both my real application and the toy example.
In code:
header("Content-Disposition: attachment; filename=\"$filename.xlsx\"");
header("Cache-Control: max-age=0");
$objWriter = \PHPExcel_IOFactory::createWriter($o, "Excel2007");
Please help me with a problem exporting large data to Excel (xlsx format).
I am exporting almost 45,000 records at a time to a single file, and it times out without saving the file.
The MySQL SELECT query takes 21 seconds to execute for that much data. Below is my code to export the data into an Excel file using the PHPExcel library.
$sql2 = "SELECT * FROM Surveys";
$result2 = mysql_query($sql2);
while ($row2 = mysql_fetch_assoc($result2)) {
    $j = $rowscounter + 2;
    $sheet->setCellValue("A$j", $row2['brandname']);
    $sheet->setCellValue("B$j", $row2['productname']);
    $sheet->setCellValue("C$j", $row2['sname']);
    $sheet->setCellValue("D$j", $row2['smobile']);
    $sheet->setCellValue("E$j", $row2['semail']);
    $sheet->setCellValue("F$j", $row2['country']);
    $sheet->setCellValue("G$j", $row2['city']);
    $sheet->setCellValue("H$j", $row2['sdob']);
    $sheet->setCellValue("I$j", $row2['age']);
    $sheet->setCellValue("J$j", $row2['comment']);
    $sheet->setCellValue("K$j", $row2['outletcode']);
    $sheet->setCellValue("L$j", $row2['username']);
    $sheet->setCellValue("M$j", $row2['datetime']);
    $sheet->setCellValue("N$j", $row2['duration']);
    $rowscounter++;
}
// Rename worksheet
$sheet->setTitle('Survey-Report');
$objPHPExcel->setActiveSheetIndex(0);
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
unlink("Survey-Report.xlsx");
$objWriter->save('Survey-Report.xlsx');
echo "ok";
UPDATE:
I forgot to mention that I already tried set_time_limit() etc. and put the code below in my PHP file:
set_time_limit(0);
ini_set('memory_limit','2500M');
You could add this at the top of your script:
set_time_limit(0);
That will disable the default 30-second PHP timeout.
Or you can pass a custom number of seconds; see set_time_limit().
I had the same issue, where I kept getting a 504 Gateway Timeout when exporting records using PHPExcel. I also tried set_time_limit(0) with no success. What I ended up doing was changing the Apache timeout.
You can find this timeout in your httpd.conf file.
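For reference, the relevant directive is Apache's own `Timeout` (the value is in seconds; 600 here is just an example, and the file location varies by distribution):

```
# httpd.conf — Apache's request timeout, independent of PHP's
# max_execution_time; the default is 300 on many builds.
Timeout 600
```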
Hope it helps!
I am facing a problem.
I need to read an .xls file of about 10MB. I wrote PHP code that works fine when I read a small .xls file, but when I try to read a large file the browser shows: "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1032 bytes) in C:\wamp\www\student\ExcelRes\PHPExcel\Cell.php on line 1126"
Here is my code.
<?php
ini_set('memory_limit', '128M');
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';
$inputFileName = 'ru_unit_H_all.xls';
$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
$sheetData = $objPHPExcel->getActiveSheet()->toArray(null, true, true, true);
echo $sheetData['20007']['K']; // [row][column]
?>
The error message should be self-explanatory:
"Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1032 bytes) in C:\wamp\www\student\ExcelRes\PHPExcel\Cell.php on line 1126"
You simply ran out of the memory reserved for the execution of one script.
You may increase your memory_limit using ini_set() to solve this issue.
Note: setting 128MB isn't enough, because 134217728 bytes = 128MB is exactly the limit that was exhausted. Try 512MB.
There is no memory-efficient implementation of an Excel reader/writer for PHP that I know of.
The memory available to the script has been exhausted. By default I believe each script has a limit of 8MB of memory allocated to it. You are attempting to read a 10MB file, and as such there is not enough memory to process the request, so it fails.
You can try increasing the amount of memory available by using the memory_limit setting.
This can be done globally for all scripts in the php.ini settings file, or on a per-script basis using
ini_set('memory_limit', '16M');
where 16M is 16 megabytes of memory.
In addition to possibly increasing memory, you should also be looking at the cell caching options provided by PHPExcel for precisely this purpose, as described in the section of the developer documentation entitled "cell caching" (section 4.2.1)
EDIT
And your use of toArray() is going to build the array you're requesting in PHP memory as well, adding extra overhead. Consider iterating over the worksheet a row at a time rather than loading it twice into memory (once as a PHPExcel object and once as your array).
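A sketch of that row-at-a-time approach, assuming the same file and include path as the question (the iterator API is the one described in the PHPExcel developer documentation):

```php
<?php
// Sketch: walk the sheet with PHPExcel's row/cell iterators instead of
// toArray(), so the data is never duplicated into a second PHP array.
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';

$objPHPExcel = PHPExcel_IOFactory::load('ru_unit_H_all.xls');
$worksheet = $objPHPExcel->getActiveSheet();

foreach ($worksheet->getRowIterator() as $row) {
    $cellIterator = $row->getCellIterator();
    $cellIterator->setIterateOnlyExistingCells(true); // skip empty cells
    foreach ($cellIterator as $cell) {
        $value = $cell->getValue(); // process one cell at a time here
    }
}
```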
Finally, I solved my problem by using the code from Example 12:
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';

$inputFileType = 'Excel5';
$inputFileName = 'ru_unit_H_all.xls';

class chunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $_startRow = 0;
    private $_endRow = 0;

    public function setRows($startRow)
    {
        $this->_startRow = $startRow;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the header row, plus everything from $_startRow on
        if (($row == 1) || ($row >= $this->_startRow)) {
            return true;
        }
        return false;
    }
}

$objReader = PHPExcel_IOFactory::createReader($inputFileType);
$chunkFilter = new chunkReadFilter();
$objReader->setReadFilter($chunkFilter);
$chunkFilter->setRows(22000);
$objPHPExcel = $objReader->load($inputFileName);

echo $objPHPExcel->getActiveSheet()->getCell('K22000')->getValue();
?>
I just recently asked and solved a question pertaining to uploading .PDF files greater than 2MB into a MySQL database as BLOBs. I had to change some settings in my php.ini file and MySQL's maximum packet setting. However, fixing this issue has led me to discover a new issue with my script.
Now that I can upload files to the BLOB column, I attempted to download a file for testing purposes. Much to my dismay, when I went to open the .PDF file I received the following error: Failed to load document (error 3) 'file:///tmp/test-13.pdf'. Upon further investigation I found that the file being downloaded, test.pdf, was only 1MB, a little less than half of its supposed size in the database of a little more than 2MB. This is obviously the reason for the error.
The following piece of code is the part of my script I am using for downloading files from the database. It is at the very top of the script and works flawlessly for files that are smaller than 1MB.
foreach ($_REQUEST as $key => $value) {
    if ($value == 'Open') {
        header();
        session_start();
        $dbh = new PDO('mysql:host=' . $_SESSION['OpsDBServer'] . '.ops.tns.its.psu.edu;dbname=' . $_SESSION['OpsDB'],
            $_SESSION['yoM'], $_SESSION['aMa']);
        $id = $key;
        $sqlDownload = "SELECT name, type, content, size FROM upload WHERE id='" . $id . "'";
        $result = $dbh->query($sqlDownload);
        $download = $result->fetchAll();
        $type = $download[0]['type'];
        $size = $download[0]['size'];
        $name = $download[0]['name'];
        $content = $download[0]['content'];
        header("Content-type: $type");
        header("Content-Disposition: inline; filename=$name");
        header("Content-length: $size");
        header("Cache-Control: maxage=1");
        header("Pragma: public");
        echo $content;
        exit;
    }
}
I am thinking that maybe I have some header statements wrong? I am very confused about what to do. I have searched through php.ini and found no settings that I think need to be changed, and my maximum packet setting for MySQL is 4MB, so a 2MB file should download.
Thanks for any help.
According to (http://dev.mysql.com/doc/refman/5.0/en/blob.html):
The maximum size of a BLOB or TEXT object is determined by its type, but the largest value you actually can transmit between the client and server is determined by the amount of available memory and the size of the communications buffers. You can change the message buffer size by changing the value of the max_allowed_packet variable, but you must do so for both the server and your client program.
According to (http://dev.mysql.com/doc/refman/5.0/en/server-parameters.html), the default value for max_allowed_packet is 1048576 (1MiB).
I actually fixed the issue. I changed all of the values that were recommended here in php.ini and my.cnf, but I also needed to change a setting for PDO.
I changed:
PDO::MYSQL_ATTR_MAX_BUFFER_SIZE (integer)
Maximum buffer size. Defaults to 1 MiB.
This has to be set when the PDO object is created for it to work, though. All is good now.
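A sketch of where that attribute goes: it must be passed in the driver-options array of the PDO constructor, since it cannot be changed afterwards. The DSN and credentials here are placeholders, and note that per the PHP manual this constant is only defined when PDO_MYSQL is built against libmysqlclient, not mysqlnd:

```php
<?php
// Placeholder host/database/credentials; the important part is the
// fourth constructor argument, the driver-options array.
$dbh = new PDO(
    'mysql:host=example.host;dbname=exampledb',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_MAX_BUFFER_SIZE => 4 * 1024 * 1024) // 4 MiB
);
```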