Error generating PDF with DomPDF with 5000 records - php

My controller, using DomPDF:
public function exportPdf()
{
    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));
    return $pdf->stream('facturas.pdf');
}
Error:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 2097160 bytes)
Any idea how to solve it? With small files of around 10 pages it is slow, but it does generate the PDF. Thanks in advance for any help you can give me.
Regards

This happens when your application requires more memory than PHP is configured to allow.
You could change this globally in your php.ini file, but when only one routine, such as this PDF generation, needs more memory, you can use ini_set() in your code instead. That way the higher limit applies only to that request, and the rest of your application keeps the default.
In this case, you would call ini_set('memory_limit', '256M'); before your PDF generation code. You'll have to work out the right value based on the memory available on your server, the traffic you expect, and so on.
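Applied to the controller from the question, that would look something like this (256M is an illustrative value, not a recommendation; pick one that fits your server):

```php
<?php
// Sketch of the controller method with a per-request memory bump.
public function exportPdf()
{
    // Raise the limit only for this request; the rest of the
    // application keeps the php.ini default.
    ini_set('memory_limit', '256M');

    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));

    return $pdf->stream('facturas.pdf');
}
```

Note that if 5000 records still exhaust the new limit, the underlying issue is the amount of HTML DomPDF has to render, and you may need to paginate or split the export.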
Here are the relevant PHP docs:
https://www.php.net/manual/en/function.ini-set.php
https://www.php.net/manual/en/ini.core.php#ini.memory-limit

Related

Laravel file_get_contents allowed memory exhausted

I am using a console command to download some data locally and then dispatch an update job from that data. The issue I'm having is that the downloaded data is around 65MB for now. The line Storage::disk('local')->put($name, $content); throws a PHP fatal error: allowed memory size of 134217728 bytes exhausted, since I assume the put method creates a copy of $content, pushing usage beyond 128MB.
Is there a way around this other than setting the memory limit to say 256MB?
Can I store this data in chunks maybe? I am not interested in working on the chunks themselves. Is there some Laravel method that takes the reference &$contents to store the data?
I would prefer a "Laravel" solution if possible.
$name = basename(config('helper.db_url'));
$content = file_get_contents(config('helper.db_url'));
Storage::disk('local')->put($name, $content);
UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
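One possible approach, sketched below: Laravel's put() also accepts a stream resource, so instead of buffering the whole download into a string with file_get_contents(), you can hand it an fopen() handle and let the filesystem layer copy the data in chunks (assuming config('helper.db_url') as in the question, and that the URL is readable via PHP's stream wrappers):

```php
<?php
// Sketch: stream the download to disk instead of buffering it in $content.
// put() detects that it received a resource and writes it chunk by chunk,
// so the 65MB payload never has to fit in memory as a single string.
$name = basename(config('helper.db_url'));

$stream = fopen(config('helper.db_url'), 'r');
if ($stream !== false) {
    Storage::disk('local')->put($name, $stream);
    fclose($stream);
}

UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
```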

PHPExcel_IOFactory::load Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) [duplicate]

I am using PHPExcel (found here: https://github.com/PHPOffice/PHPExcel). If I try to read more than approximately 2000 rows, it shows a memory error as follows.
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried
to allocate 71 bytes) in
/home/sample/PHPExcelReader/Classes/PHPExcel/worksheet.php on line 89
My Excel data range is A1:X2000
Below is the code I use to read the Excel file.
ini_set('memory_limit', '-1');
/** Include path **/
set_include_path(get_include_path() . PATH_SEPARATOR . 'Classes/');
/** PHPExcel_IOFactory */
include $unsecured_param['home_dir'].'APIs/PHPExcelReader/Classes/PHPExcel/IOFactory.php';
$inputFileName = $target; // File to read
//echo 'Loading file ',pathinfo($inputFileName,PATHINFO_BASENAME),' using IOFactory to identify the format<br />';
try {
$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
} catch(Exception $e) {
die('Error loading file "'.pathinfo($inputFileName,PATHINFO_BASENAME).'": '.$e->getMessage());
}
$sheetData = $objPHPExcel->getActiveSheet()->rangeToArray('A1:X2000', null, true, true, true);
// store data into array..
$i = 0; $j = 0; $max_rows = 0; $max_columns = 0;
foreach ($sheetData as $rec) {
    foreach ($rec as $part) {
        $items[$j][$i] = $part;
        $i = $i + 1;
        if ($j == 0) { $max_columns = $i; }
    }
    $j = $j + 1;
    $i = 0;
}
$max_rows = $j;
Could anyone please let me know how to overcome this issue?
Consider using cell caching to reduce the memory required to hold the workbook in memory, as described in section 4.2.1 of the developer documentation.
Also consider not using rangeToArray() and then using that result to build yet another array in memory. Doing this uses a lot of memory to hold duplicated data, when you could simply loop through the rows and columns of the worksheet to do what you need.
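A sketch of both suggestions together, assuming the PHPExcel cell-caching API described in the documentation referenced above (cache_in_memory_gzip is just one of the available cache strategies):

```php
<?php
// Sketch: enable cell caching before loading the workbook, then iterate
// rows and cells directly instead of copying everything into arrays.
require_once 'Classes/PHPExcel.php';

// Cell caching must be configured before the workbook is loaded.
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);

$objPHPExcel = PHPExcel_IOFactory::load($inputFileName);
$worksheet = $objPHPExcel->getActiveSheet();

foreach ($worksheet->getRowIterator() as $row) {
    $cellIterator = $row->getCellIterator();
    $cellIterator->setIterateOnlyExistingCells(false);
    foreach ($cellIterator as $cell) {
        // Process $cell->getValue() here directly, rather than
        // duplicating it into a second 2000-row array.
    }
}
```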
This error means that the PHP script you are running has exceeded the memory allowed for PHP on your server. You can edit your php.ini file to allow your PHP scripts to allocate more memory, which may help, but if you are running a 32-bit Linux OS on your server for whatever reason, there is a hard cap of around 3.5GB that the process can use, so allocating more than that will still fail and cause a similar issue.
In cases such as this, it really comes down to the fact that the amount of data you are trying to pull is too large and you need to scale it back somehow. It isn't necessarily an issue with the code, but rather how much data you are actually attempting to show/process.
The amount of memory you are noting (134217728 bytes) matches the 128MB default that php.ini uses for memory_limit. Changing the value in the ini file will resolve this issue. If you can't do that, then you need to somehow limit the amount of data you pull at one time.
Information:
http://ca1.php.net/manual/en/ini.core.php#ini.memory-limit

Scraping much data from different URLs with simple_html_dom.php

I basically want to do exactly something like this: Simple Html DOM Caching
I got everything to work so far, but now I'm getting the following error because I scrape many sites (6 at the moment, and I want up to 25 sites):
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)
I'm a PHP newbie =/... so, how can I "serialize" the scraping process step by step so that my memory doesn't give up? :-)
code sample:
// Include the library
include('simple_html_dom.php');
// retrieve and find contents
$html0 = file_get_html('http://www.site.com/');
foreach($html0->find('#id') as $aktuelle_spiele);
file_put_contents("cache/cache0.html",$aktuelle_spiele);
thank you very much in advance for your help!
In your php.ini, change this line:
memory_limit = 32M
to this:
memory_limit = 256M ; or another, greater value
Or add this line at the start of every PHP script that uses simple_html_dom:
ini_set('memory_limit', '128M'); // or a greater value
You can increase the memory limit at the start of your script, like this:
ini_set('memory_limit', '128M');
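Beyond raising the limit, simple_html_dom objects are known to hold memory through internal circular references, and the library provides a clear() method to break them. A sketch of processing the sites one at a time and freeing each DOM before loading the next (the site list and the '#id' selector are placeholders from the question):

```php
<?php
// Sketch: scrape sites sequentially and release each DOM before
// the next one is loaded, so memory use stays roughly constant.
include('simple_html_dom.php');

ini_set('memory_limit', '128M');

$sites = ['http://www.site.com/']; // up to 25 URLs
foreach ($sites as $n => $url) {
    $html = file_get_html($url);
    if ($html === false) {
        continue; // skip unreachable sites
    }

    $content = '';
    foreach ($html->find('#id') as $element) {
        $content .= $element;
    }
    file_put_contents("cache/cache$n.html", $content);

    $html->clear(); // break simple_html_dom's circular references
    unset($html);   // so the memory is actually freed
}
```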

NuSOAP varDump PHP Fatal error: Allowed memory size of 134217728 bytes exhausted

Sorry for my english :)
I have NuSOAP version 0.9.5, and I got a PHP error when I tried to retrieve a large amount of data:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 27255652 bytes)
The stack trace showed that the problem was in the varDump method.
My solution is:
I changed the varDump method (in nusoap.php) to:
function varDump($data) {
    $ret_val = "";
    if ($this->debugLevel > 0) {
        ob_start();
        var_dump($data);
        $ret_val = ob_get_contents();
        ob_end_clean();
    }
    return $ret_val;
}
and then reset
$GLOBALS['_transient']['static']['nusoap_base']['globalDebugLevel']
to 0 (from 9) in both class.nusoap_base.php and nusoap.php.
This helped me.
Does anyone have any comments on this? Or maybe better solution?
Many thanks and respect to Aaron Mingle for the real solution to the NuSOAP out-of-memory problem. The solution can be found here:
https://sourceforge.net/p/nusoap/discussion/193578/thread/12965595/
I implemented and immediately tested it, and I am happy now because it works perfectly. In my case I had a SOAP message of approximately 45 MB (including ~30 base64-encoded PDF files), and even 2 GB of memory for PHP did not help before. Then I tried Aaron Mingle's solution, and it worked with only 384 MB of memory granted to PHP.
+1 to Alexey Choporov as well, because his suggestion is also required. Both modifications are must-have patches for NuSOAP to work properly with larger messages.