Zend PDF Memory Management - php

I'm trying to create a large PDF file using Zend\PDF, but since Zend keeps the data in an object, at some point it shows a "memory exhausted" error message.
Does anybody know how to manage memory while creating large PDF files?

You can try to increase the memory limit temporarily:
$memory_before = ini_get('memory_limit'); // current limit (usually set in php.ini)
// New limit
ini_set('memory_limit', '256M'); // e.g. 128M, 256M, 512M, 1024M, ...
// ...
// your code (creating the large PDF)
// ...
// Restore the old limit
ini_set('memory_limit', $memory_before);
Edit:
As stated by pgampe, you can pass -1 instead of '256M' in my example to remove the memory limit entirely.
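A minimal sketch of that pattern, with the generation wrapped in try/finally so the old limit is restored even if an exception is thrown (requires PHP 5.5+ for finally; buildLargePdf() is a hypothetical placeholder, not a Zend\PDF function):

$memory_before = ini_get('memory_limit');
ini_set('memory_limit', '256M');

try {
    // ... build the large PDF here (placeholder for your Zend\PDF code) ...
    $pdfContent = buildLargePdf(); // hypothetical helper
} finally {
    // Restore the previous limit even on failure
    ini_set('memory_limit', $memory_before);
}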

Related

Error generating PDF with DomPDF with 5000 records

My controller, with DomPDF
public function exportPdf()
{
    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));
    return $pdf->stream('facturas.pdf');
}
error:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 2097160 bytes)
Any idea how to solve it? With small files of around 10 sheets it is slow but does generate the PDF. Thanks in advance for any help you can give me.
Regards
This happens when your application requires more memory than PHP is configured to have.
You could change this globally in your php.ini file, but when only one piece of code, such as these PDF generators, needs more memory, you can call ini_set() in your code instead. That way, code that isn't expected to use a lot of memory is still kept in check by the normal limit.
In this case, you would use ini_set('memory_limit','256M'); before your PDF code generation. You'll have to work out the right value to set based on available memory on your server, traffic you expect, etc.
Here are the relevant PHP docs:
https://www.php.net/manual/en/function.ini-set.php
https://www.php.net/manual/en/ini.core.php#ini.memory-limit
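As a minimal sketch of that advice applied to the controller from the question (the 256M value is only an example; Vale, the pdf.facturas view and the PDF facade are taken from the question itself):

public function exportPdf()
{
    // Raise the limit only for this request; the rest of the app keeps the default.
    ini_set('memory_limit', '256M');

    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));

    return $pdf->stream('facturas.pdf');
}

If the dataset keeps growing, it can also help to select only the columns the view actually uses instead of loading every column with Vale::all(), so less data is held in memory before the HTML is rendered.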

PHPExcel_IOFactory::load Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) [duplicate]

I am using PHPExcel (found here: https://github.com/PHPOffice/PHPExcel). If I try to read more than approximately 2,000 rows, it shows a memory error as follows:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried
to allocate 71 bytes) in
/home/sample/PHPExcelReader/Classes/PHPExcel/worksheet.php on line 89
My Excel data range is A1:X2000
Below is the code I use to read the Excel file:
ini_set('memory_limit', '-1');

/** Include path **/
set_include_path(get_include_path() . PATH_SEPARATOR . 'Classes/');

/** PHPExcel_IOFactory */
include $unsecured_param['home_dir'].'APIs/PHPExcelReader/Classes/PHPExcel/IOFactory.php';

$inputFileName = $target; // File to read
//echo 'Loading file ',pathinfo($inputFileName,PATHINFO_BASENAME),' using IOFactory to identify the format<br />';

try {
    $objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
} catch (Exception $e) {
    die('Error loading file "'.pathinfo($inputFileName,PATHINFO_BASENAME).'": '.$e->getMessage());
}

$sheetData = $objPHPExcel->getActiveSheet()->rangeToArray('A1:X2000', null, true, true, true);

// Store the data into an array
$items = array();
$i = 0; $j = 0; $max_rows = 0; $max_columns = 0;
foreach ($sheetData as $rec) {
    foreach ($rec as $part) {
        //echo "items[$j][$i]=" ; echo $part; echo "<br>";
        $items[$j][$i] = $part;
        $i = $i + 1;
        if ($j == 0) { $max_columns = $i; }
    }
    $j = $j + 1;
    $i = 0;
}
$max_rows = $j;
Could anyone please let me know how to overcome this issue?
Consider using cell caching to reduce the memory required to hold the workbook in memory, as described in section 4.2.1 of the developer documentation.
And consider not using rangeToArray() and then copying its result into yet another array: that holds the same data twice in memory, when you could simply loop through the rows and columns of the worksheet to do what you need.
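A minimal sketch combining both suggestions, assuming the phpTemp cache method from the PHPExcel developer docs (the 8MB value is illustrative, and $inputFileName is the variable from the question):

// Cell caching must be configured *before* the workbook is loaded
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
$worksheet   = $objPHPExcel->getActiveSheet();

// Walk the worksheet directly instead of duplicating it with rangeToArray()
$items = array();
foreach ($worksheet->getRowIterator() as $row) {
    $cells        = array();
    $cellIterator = $row->getCellIterator();
    $cellIterator->setIterateOnlyExistingCells(false); // include empty cells
    foreach ($cellIterator as $cell) {
        $cells[] = $cell->getValue();
    }
    $items[] = $cells;
}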
This error means that the PHP script you are running has exceeded the memory allowed for PHP on your server. You can edit your php.ini file to let your scripts allocate more memory, which may help. At the same time, if you are running a 32-bit Linux OS on your server for whatever reason, there is a hard cap of roughly 3.5 GB per process, so allocating more than that will still fail and cause a similar issue.
In cases like this, it often comes down to the fact that the amount of data you are trying to pull is too large and you need to scale it back somehow. It isn't necessarily an issue with the code, but rather how much data you are actually attempting to show/process.
The amount of memory you are noting (134217728 bytes) matches the 128M default that php.ini uses for memory_limit. Raising that value in the ini will resolve this particular error. If you are unable to do that, then you need to somehow limit the amount of data you pull at one time.
Information:
http://ca1.php.net/manual/en/ini.core.php#ini.memory-limit
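If raising the limit is not an option, PHPExcel's developer docs also describe chunked reading with a read filter, so only part of the sheet is in memory at a time. A sketch of that approach, with an illustrative chunk size of 500 rows and the A1:X2000 range from the question:

/** Read filter that accepts only one chunk of rows at a time */
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row, plus the rows of the current chunk
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFileType = PHPExcel_IOFactory::identify($inputFileName);
$objReader     = PHPExcel_IOFactory::createReader($inputFileType);
$chunkFilter   = new ChunkReadFilter();
$objReader->setReadFilter($chunkFilter);

$chunkSize = 500; // illustrative; tune to your memory budget
for ($startRow = 2; $startRow <= 2000; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $objReader->load($inputFileName);

    // ... process the rows of this chunk ...

    // Free the chunk before loading the next one
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
}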

Scraping much data from different URLs with simple_html_dom.php

I basically want to do exactly something like this: Simple Html DOM Caching.
I got everything to work so far, but now I'm getting the following error because I scrape many sites (6 at the moment, and I want up to 25 sites):
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)
I'm a PHP newbie =/ ... so, how can I "serialize" the scraping process step by step so that my memory doesn't give up? :-)
code sample:
// Include the library
include('simple_html_dom.php');
// retrieve and find contents
$html0 = file_get_html('http://www.site.com/');
foreach ($html0->find('#id') as $aktuelle_spiele) {
    file_put_contents("cache/cache0.html", $aktuelle_spiele);
}
thank you very much in advance for your help!
In your php.ini, change this line:
memory_limit = 32M
to this:
memory_limit = 256M ; or another, greater value
Or add this line at the start of every PHP script that uses simple_html_dom:
ini_set('memory_limit', '128M'); // or a greater value
You can raise the memory limit at the start of your script, like this:
ini_set('memory_limit', '128M');
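Besides raising the limit, it also helps to free each parsed page before loading the next one, since simple_html_dom objects hold large DOM trees and internal references. A sketch assuming the library's clear() method and an illustrative list of URLs:

// Include the library
include('simple_html_dom.php');

// Illustrative list of sites to scrape one after another
$sites = array(
    'http://www.site1.com/',
    'http://www.site2.com/',
);

foreach ($sites as $index => $url) {
    $html = file_get_html($url);

    foreach ($html->find('#id') as $aktuelle_spiele) {
        file_put_contents("cache/cache{$index}.html", $aktuelle_spiele);
    }

    // Release the DOM tree before moving on, otherwise every parsed page stays in memory
    $html->clear();
    unset($html);
}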

Why is my image uploader crashing over a resolution threshold of 2900 x 2176?

Image size is not the problem here, because my image is 800 Kb.
My image upload works flawlessly at any resolution below 2900 x 2176. Over that threshold, it doesn't work. No image is uploaded. Why is that happening?
I'll include some code from my upload handler, just in case, but I'm not sure it's relevant.
The error is:
PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 8884 bytes) in /path/imageResizer.php on line 34
which refers to...
if ($this->image_type == IMAGETYPE_JPEG) {
    $this->image = imagecreatefromjpeg($filename);
}
What kind of break occurs? Any errors being thrown?
Possibilities:
1. You're running out of memory. My guess is that the imageResizer object and its methods are to blame. What is your current memory limit for PHP?
EDIT: As @deceze said, you can use this function to temporarily raise the memory limit:
ini_set('memory_limit', '64M');
2. imageResizer can't handle images of that resolution. Did you write that class yourself, or is it a library? I'm not familiar with it. Check its specs.
Image size is the problem. The file may only be 800 KB, but the image needs to be expanded into memory if you want to work with it. So you need roughly
2900 × 2176 × color depth × no. of channels
bytes of memory to store each individual pixel in memory to do anything with the image. This may easily surpass the regular PHP memory limit. Set a higher limit using, for example:
ini_set('memory_limit', '500M');
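As a rough illustration of that estimate (the 1.8 overhead factor and the 16 MB of headroom are assumptions, and $filename is the variable from the snippet in the question):

// getimagesize() only reads the header, so it is cheap even for large files
$info     = getimagesize($filename);
$width    = $info[0];
$height   = $info[1];
$bits     = isset($info['bits']) ? $info['bits'] : 8;         // bits per channel
$channels = isset($info['channels']) ? $info['channels'] : 3; // e.g. 3 for an RGB JPEG

// width x height x bytes per channel x channels, plus overhead for GD's internal structures
$neededBytes = (int) ceil($width * $height * ($bits / 8) * $channels * 1.8);
$neededMb    = (int) ceil($neededBytes / (1024 * 1024)) + 16; // extra headroom

// Note: this blindly overrides the configured limit; in real code you would
// only raise it when the current limit is lower than the estimate.
ini_set('memory_limit', $neededMb . 'M');

$image = imagecreatefromjpeg($filename);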
You might have reached the memory_limit.
You should have an error message telling you that; I strongly recommend displaying errors during development.
In the meantime, you can give your imageResizer more memory this way (if you are allowed to use the ini_set function on your server):
ini_set('memory_limit', '256M');
