PHP: Reading a huge Excel file causes a fatal error

I am facing a problem.
I need to read an .xls file of about 10MB. I wrote PHP code that works fine when reading a small .xls file, but when I try to read a large file the browser shows "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1032 bytes) in C:\wamp\www\student\ExcelRes\PHPExcel\Cell.php on line 1126".
Here is my code.
<?php
ini_set('memory_limit', '128M');
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';
$inputFileName = 'ru_unit_H_all.xls';
$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
$sheetData = $objPHPExcel->getActiveSheet()->toArray(null,true,true,true);
echo $sheetData['20007']['K']; // row 20007, column K
?>

The error message should be self-explanatory:
"Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1032 bytes) in C:\wamp\www\student\ExcelRes\PHPExcel\Cell.php on line 1126"
You simply ran out of the memory reserved for the execution of a single script.
You may increase memory_limit using ini_set() to solve this issue.
Note: setting it to 128MB isn't enough, because 134217728 bytes is exactly 128MB, so that value still triggers the error. Try 512MB.
There's no memory-efficient implementation of an Excel reader/writer for PHP that I know of.
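For example, a minimal sketch (512M is a guess; size it to your workbook):
<?php
// Raise the per-script limit before PHPExcel starts allocating cells
ini_set('memory_limit', '512M');

set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';
$objPHPExcel = PHPExcel_IOFactory::load('ru_unit_H_all.xls');
?>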

The memory available to the script has been exhausted. By default, each script has a limit on the memory it may allocate (134217728 bytes = 128MB in your case). You are attempting to read a 10MB file, which expands to far more than that once parsed, so there is not enough memory to process the request and it fails.
You can try increasing the amount of memory available by using the memory_limit setting.
This can be done globally for all scripts in the php.ini settings file, or on a per-script basis using
ini_set('memory_limit', '16M');
where 16M means 16 megabytes of memory.

In addition to possibly increasing memory, you should also look at the cell caching options provided by PHPExcel for precisely this purpose, as described in the section of the developer documentation entitled "cell caching" (section 4.2.1).
EDIT
Your use of toArray() also builds the array you're requesting in PHP memory, adding extra overhead - consider iterating over the worksheet a row at a time rather than holding the data in memory twice (once as a PHPExcel object and once as your array), as in the sketch below.
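A minimal sketch combining both suggestions; the cache size and file name are assumptions:
<?php
// Cache cell data to php://temp before loading the workbook, then
// iterate row by row instead of materialising the whole sheet with toArray()
require_once 'ExcelRes/PHPExcel.php';

$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '32MB'); // spill to php://temp beyond this
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = PHPExcel_IOFactory::load('ru_unit_H_all.xls');
foreach ($objPHPExcel->getActiveSheet()->getRowIterator() as $row) {
    $cellIterator = $row->getCellIterator();
    $cellIterator->setIterateOnlyExistingCells(false); // include empty cells
    foreach ($cellIterator as $cell) {
        $value = $cell->getValue(); // process one cell at a time here
    }
}
?>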

I finally solved my problem using the Example 12 approach (a chunked read filter).
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';

$inputFileType = 'Excel5';
$inputFileName = 'ru_unit_H_all.xls';

// Read filter: only load the heading row plus rows from $_startRow onwards
class chunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $_startRow = 0;

    public function setRows($startRow)
    {
        $this->_startRow = $startRow;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        return ($row == 1) || ($row >= $this->_startRow);
    }
}

$objReader   = PHPExcel_IOFactory::createReader($inputFileType);
$chunkFilter = new chunkReadFilter();
$objReader->setReadFilter($chunkFilter);
$chunkFilter->setRows(22000);

$objPHPExcel = $objReader->load($inputFileName);
echo $objPHPExcel->getActiveSheet()->getCell('K22000')->getValue();
?>
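For completeness, PHPExcel's Example 12 goes a step further and processes the file in fixed-size chunks, loading and discarding one slice at a time. A sketch of that pattern (the chunk size and row count are assumptions; the filter gains an end-row bound):
<?php
set_include_path(get_include_path() . PATH_SEPARATOR . 'ExcelRes/');
include 'PHPExcel/IOFactory.php';

// Variant of the filter above that also bounds the end of each chunk
class chunkReadFilter2 implements PHPExcel_Reader_IReadFilter
{
    private $_startRow = 0;
    private $_endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->_startRow = $startRow;
        $this->_endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row, plus the rows inside the current chunk
        return ($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow);
    }
}

$objReader   = PHPExcel_IOFactory::createReader('Excel5');
$chunkFilter = new chunkReadFilter2();
$objReader->setReadFilter($chunkFilter);

$chunkSize = 2000; // rows kept in memory per pass; tune to your limit
for ($startRow = 2; $startRow <= 24000; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $objReader->load('ru_unit_H_all.xls');
    // ... process the rows of this chunk here ...
    $objPHPExcel->disconnectWorksheets(); // free the loaded chunk
    unset($objPHPExcel);
}
?>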


PHPExcel Fatal error: Allowed memory size

I use PHPExcel to open an .xlsx file (on an OVH shared server) and encountered problems that I solved.
I now have a new problem when saving the modified file:
"Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 49 bytes) in /home/observatvu/www/libraries/phpexcel/library/PHPExcel/Cell.php on line 870"
I read many questions and answers on the internet and tried some solutions, such as:
memory_limit in .htaccess => problem on the server, it doesn't work
ini_set('memory_limit','512M') => I get the message above... with other ini_set values I get memory errors of other sizes, but the file is never saved.
I can't modify php.ini.
I tried calling setPreCalculateFormulas(false) when saving the file, but the problem remains.
Could someone please help me find a working solution?
Thank you
If you tried
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array( 'memoryCacheSize' => '1024MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);
then it wouldn't work.
Your PHP memory limit is 536,870,912 bytes (512MB).
The line
$cacheSettings = array( 'memoryCacheSize' => '1024MB');
tells PHPExcel to use 1024MB of PHP memory before switching to php://temp for caching.... that's what the memory element of the argument name memoryCacheSize means.
Use a lower value for memoryCacheSize than your PHP memory limit:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array( 'memoryCacheSize' => '256MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);
If you can't make it work with PHPExcel's caching system, you can give Spout a try: https://github.com/box/spout.
It was designed to work with files of any size without causing memory or time limit issues.
All you need is 10MB of memory available and you can read all the XLSX files you want :)
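For reference, a minimal Spout sketch (using the 2.x ReaderFactory API; the file name and Composer setup are assumptions):
<?php
use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

require_once 'vendor/autoload.php'; // assumes Spout was installed via Composer

$reader = ReaderFactory::create(Type::XLSX);
$reader->open('ru_unit_H_all.xlsx');

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // $row is an array of cell values; only one row is in memory at a time
    }
}

$reader->close();
?>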
This type of error can also occur when passing an invalid cell reference to a PHPExcel function such as:
$objPHPExcel->getActiveSheet()->setCellValue($cell, $value);
Be sure not to increment column letters like this:
chr(ord($col) + 1);
because past column Z this produces characters like '[' instead of 'AA'. It's best to use a custom increment function like:
// $start = 'A'
private function _incrementCol($start, $offset)
{
    $result = $start;
    for ($i = 1; $i <= $offset; $i++) {
        $result++; // PHP string increment rolls over correctly: 'Z'++ gives 'AA'
    }
    return $result;
}
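Alternatively, PHPExcel itself ships a helper for this conversion; a minimal sketch:
// PHPExcel's built-in conversion: the index is 0-based, so 0 => 'A',
// 25 => 'Z', 26 => 'AA'. Safe past column Z, unlike chr(ord($col) + 1).
$colLetter = PHPExcel_Cell::stringFromColumnIndex(27); // 'AB'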

PHPExcel_IOFactory::load Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) [duplicate]

I am using PHPExcel (found here: https://github.com/PHPOffice/PHPExcel). If I try to read more than approximately 2000 rows, it shows a memory error as follows:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried
to allocate 71 bytes) in
/home/sample/PHPExcelReader/Classes/PHPExcel/worksheet.php on line 89
My Excel data range is A1:X2000.
Below is the code I use to read the Excel file.
ini_set('memory_limit', '-1');
/** Include path **/
set_include_path(get_include_path() . PATH_SEPARATOR . 'Classes/');
/** PHPExcel_IOFactory */
include $unsecured_param['home_dir'].'APIs/PHPExcelReader/Classes/PHPExcel/IOFactory.php';
$inputFileName = $target; // File to read
//echo 'Loading file ',pathinfo($inputFileName,PATHINFO_BASENAME),' using IOFactory to identify the format<br />';
try {
$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
} catch(Exception $e) {
die('Error loading file "'.pathinfo($inputFileName,PATHINFO_BASENAME).'": '.$e->getMessage());
}
$sheetData = $objPHPExcel->getActiveSheet()->rangeToArray('A1:X2000', null, true, true, true);
// store data into array
$i = 0; $j = 0; $max_rows = 0; $max_columns = 0;
foreach ($sheetData as $rec)
{
    foreach ($rec as $part)
    { //echo "items[$j][$i]=" ; echo $part; echo "<br>";
        $items[$j][$i] = $part;
        $i = $i + 1;
        if ($j == 0) { $max_columns = $i; }
    }
    $j = $j + 1;
    $i = 0;
}
$max_rows = $j;
Could anyone please let me know how to overcome this issue?
Consider using cell caching to reduce the memory required to hold the workbook in memory, as described in section 4.2.1 of the developer documentation.
And consider not using rangeToArray() to build one array and then copying it into another array in memory.... doing this really uses a lot of memory to hold duplicated data, when you could simply loop through the rows and columns of the worksheet to do what you need, as in the sketch below.
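A sketch of that suggestion, walking the A1:X2000 range straight into $items with no intermediate $sheetData (assumes a PHPExcel version whose iterators accept range bounds, e.g. 1.8):
$worksheet = $objPHPExcel->getActiveSheet();
$items = array();
foreach ($worksheet->getRowIterator(1, 2000) as $row) {
    $rowData = array();
    $cellIterator = $row->getCellIterator('A', 'X');
    $cellIterator->setIterateOnlyExistingCells(false); // keep empty cells
    foreach ($cellIterator as $cell) {
        $rowData[] = $cell->getValue();
    }
    $items[] = $rowData;
}
$max_rows    = count($items);
$max_columns = $max_rows ? count($items[0]) : 0;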
This error means that the PHP script you are running has exceeded the memory allowed to it by PHP on your server. You can edit your php.ini file to allow your PHP scripts to allocate more memory, which may help, but note that if you are running a 32-bit Linux OS on your server for whatever reason, there is a hard cap of about 3.5GB that the process can address, so even if you allocate more than that, it will still fail and cause a similar issue.
In cases such as this, it really comes down to the fact that the amount of data you are trying to pull is too large and you need to scale it back somehow. It isn't necessarily an issue with the code, but rather how much data you are actually attempting to show/process.
Using Google, I found that the amount of memory you're noting (134217728 bytes) matches the 128MB default that php.ini uses for memory_limit. Changing the value in the ini file will resolve this issue. If you are unable to do that, then you need to somehow limit the amount of data that you pull at one time.
Information:
http://ca1.php.net/manual/en/ini.core.php#ini.memory-limit

Fatal error from PHPMailer class

I have made a PHP page that sends an email message with multiple attachments.
The loop I use to attach multiple attachments and to check their total size is:
$totalsize = 0;
foreach (array_keys($_FILES['attach']['name']) as $key)
{
    $filesize  = $_FILES['attach']['size'][$key];
    $extention = pathinfo($_FILES['attach']['name'][$key], PATHINFO_EXTENSION);
    $name      = $_FILES['attach']['name'][$key];
    $totalsize = $totalsize + $filesize;
    if ($totalsize > 10000000) // 10MB = 10000000 bytes
    {
        $err = "<font color=#990000 size=1>File exceeded maximum allowed limit of 10 Mb</font>";
    }
    else
    {
        $source   = $_FILES['attach']['tmp_name'][$key];
        $filename = $_FILES['attach']['name'][$key];
        $mail->AddAttachment($source, $filename);
    }
} // end foreach loop
But when I try to attach a large file, I get this error from the PHPMailer class:
Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate
7355049 bytes) in /var/www/dev01/maiarn/Email/class.phpmailer.php on line 1677
Can anybody guide me, please?
You might want to increase the PHP memory limit. If you're working on your development machine, find the php.ini file and modify memory_limit (which often defaults to 16M). Change that to e.g. 128M and restart your web server.
If you want to verify the change, you can use the following line to show the configuration currently in use:
<?php phpinfo(); ?>
Alternatively, let PHP use more memory for this script only with the following PHP code:
ini_set('memory_limit', '64M');

Allowed memory size exhausted error exporting from mongodb

I am trying to export some documents from MongoDB to .csv. For some large lists the files would be around 40MB, and I get memory-limit errors:
Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 44992513 bytes) in
/usr/share/php/Zend/Controller/Response/Abstract.php on line 586
I wonder why this error happens. What consumes such an amount of memory? How do I avoid this error without changing memory_limit, which is currently set to 128M?
I use something like this:
public static function exportList($listId, $state = self::SUBSCRIBED)
{
$list = new Model_List();
$fieldsInfo = $list->getDescriptionsOfFields($listId);
$headers = array();
$params['list_id'] = $listId;
$mongodbCursor = self::getCursor($params, $fieldsInfo, $headers);
$mongodbCursor->timeout(0);
$fp = fopen('php://output', 'w');
foreach ($mongodbCursor as $subscriber) {
foreach ($fieldsInfo as $fieldInfo) {
$field = ($fieldInfo['constant']) ? $fieldInfo['field_tag'] : $fieldInfo['field_id'];
if (!isset($subscriber->$field)) {
$row[$field] = '';
} elseif (Model_CustomField::isMultivaluedType($fieldInfo['type'])) {
$row[$field] = array();
foreach ($subscriber->$field as $value) {
$row[$field][] = $value;
}
$row[$field] = implode(self::MULTIVALUED_DELEMITOR, $row[$field]);
} else {
$row[$field] = $subscriber->$field;
}
}
fputcsv($fp, $row);
}
}
Then in my controller I call it like this:
public function exportAction()
{
set_time_limit(300);
$this->_helper->layout->disableLayout();
$this->_helper->viewRenderer->setNoRender();
$fileName = $list->list_name . '.csv';
$this->getResponse()->setHeader('Content-Type', 'text/csv; charset=utf-8')
->setHeader('Content-Disposition', 'attachment; filename="'. $fileName . '"');
Model_Subscriber1::exportList($listId);
echo 'Peak memory usage: ', memory_get_peak_usage()/1024, ' Memory usage: ', memory_get_usage()/1024;
}
So I'm at the end of the file where I export data. It's rather strange that a list of about 1M documents exports successfully and displays:
> Peak memory usage: 50034.921875 Kb Memory usage: 45902.546875 Kb
But when I try to export 1.3M documents, then after several minutes I only get this in the export file:
Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 44992513 bytes) in
/usr/share/php/Zend/Controller/Response/Abstract.php on line 586.
The sizes of the documents I export are approximately the same.
I increased memory_limit to 256M and tried to export the 1.3M list; this is what it showed:
Peak memory usage: 60330.4609375 Kb Memory usage: 56894.421875 Kb.
This seems very confusing to me. Is this data inaccurate? Otherwise, why does it cause a memory-exhausted error with memory_limit set to 128M?
While the size of the documents may be about the same, the memory PHP allocates to process them isn't directly proportional to the document size or the number of documents. This is because different types require different memory allocation in PHP. You may be able to free some memory as you go, but I don't see any place in your code where you can.
The best answer is probably to just increase the memory limit.
One thing you could do is offload the processing to an external script and call that from PHP; many languages do this sort of processing in a more memory-efficient way than PHP. See the sketch below.
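A hedged sketch of that offloading idea (the worker script path and its command-line interface are assumptions, not an existing tool):
// Run the export outside the web request: the worker writes the CSV with
// its own memory_limit, and the web script only serves the finished file.
$outFile = '/tmp/export_' . (int) $listId . '.csv';
exec('php /path/to/export_worker.php ' . (int) $listId . ' > ' . escapeshellarg($outFile), $output, $rc);
if ($rc !== 0) {
    throw new RuntimeException('Export worker failed');
}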
I've also noticed that memory_get_peak_usage() isn't always accurate. I would try an experiment: increase memory_limit to, say, 256M and run it on the larger data set (the 1.3 million documents). You are likely to find that it reports below the 128M limit as well.
I could reproduce this issue in a similar case of exporting a CSV file, where my system should have had enough memory, as shown by memory_get_usage(), but ended up with the same fatal error:
Fatal error: Allowed memory size.
I circumvented the issue by writing the CSV contents to a physical temporary file, which I eventually zipped before reading it back out.
I wrote the file in a loop, so that each iteration wrote only a limited chunk of data; that way I never exceeded the memory limit.
After zipping, the compression ratio was such that I could handle raw files of over 10 times the size at which I initially hit the wall. All up, it was a success.
Hint: when creating your archive, don't unlink() the archive's component files before invoking $zip->close(), as this call seems to be the one doing the business. Otherwise you'll end up with an empty archive!
Code sample:
<?php
$zip = new ZipArchive;
if ($zip->open($full_zip_path, ZipArchive::CREATE) === TRUE) {
    $zip->addFile($full_csv_path, $csv_name);
    $zip->close();
    $Response->setHeader("Content-type", "application/zip; charset=utf-8");
    $Response->setHeader("Content-disposition", "attachment; filename=" . $zip_name);
    $Response->setBody(file_get_contents($full_zip_path));
}
else {
    var_dump(error_get_last());
    echo utf8_decode("Couldn't create zip archive '$full_zip_path'."), "\r\n";
}
unset($zip);
?>
Attention: when adding items to the zip archive, don't prepend a leading slash to the item's name if you are on a Windows-based OS.
Discussion of the original issue:
The line quoted in the error sits in the outputBody() method of the Zend_Controller_Response_Abstract class:
public function outputBody()
{
    $body = implode('', $this->_body);
    echo $body;
}
It looks like, however you produce output, through echo, or print, or readfile, it is always captured and stuck into the response body, even if you turn the response-return feature off before the dispatch.
I even tried to use the clearBody() class method within the echo loop, with the idea that each $response->sendResponse() followed by $response->clearBody() would release memory, but it failed.
The way Zend handles the sending of the response is such that I always got a memory allocation of the full size of the raw CSV file.
It is yet to be determined how to tell Zend not to "capture" the output buffer.
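One untested way around it (a sketch in ZF1 style; getRows() below is a hypothetical row source standing in for your cursor): skip the Zend response object for this action, send the headers yourself, stream straight to php://output, and exit before the dispatcher assembles its response.
public function exportAction()
{
    $this->_helper->layout->disableLayout();
    $this->_helper->viewRenderer->setNoRender();

    // Bypass Zend_Controller_Response so nothing accumulates in $this->_body
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="export.csv"');

    $fp = fopen('php://output', 'w');
    foreach ($this->getRows() as $row) { // hypothetical row source
        fputcsv($fp, $row);
    }
    fclose($fp);
    exit; // stop before Zend echoes its own (buffered) response body
}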
