I basically want to do exactly what is described in Simple HTML DOM Caching.
I got everything working so far, but now I'm getting the following error because I scrape many sites (6 at the moment, and I want to scrape up to 25):
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 39 bytes)
I'm a PHP newbie, so: how can I break the scraping process into steps so that my script doesn't run out of memory? :-)
Code sample:
// Include the library
include('simple_html_dom.php');
// retrieve and find contents
$html0 = file_get_html('http://www.site.com/');
foreach($html0->find('#id') as $aktuelle_spiele) {
    file_put_contents("cache/cache0.html", $aktuelle_spiele);
}
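One way to keep memory usage down is to scrape the sites one at a time and explicitly free each DOM before loading the next. simple_html_dom objects hold circular references, so calling the library's clear() method and unsetting the variable matters. A minimal sketch, assuming a hypothetical list of URLs (replace them with your own):

```php
<?php
include('simple_html_dom.php');

// Hypothetical list of sites to scrape; substitute your real URLs.
$sites = array(
    'http://www.site.com/',
    'http://www.example.com/',
);

foreach ($sites as $i => $url) {
    $html = file_get_html($url);

    foreach ($html->find('#id') as $element) {
        file_put_contents("cache/cache{$i}.html", $element);
    }

    // Break simple_html_dom's internal circular references and
    // release the DOM before loading the next site.
    $html->clear();
    unset($html);
}
```

This way only one parsed document is held in memory at any moment, instead of all 25 at once.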
Thank you very much in advance for your help!
In your php.ini, change this line:
memory_limit = 32M
To this:
memory_limit = 256M ; or another, greater value
Or add this piece of code at the start of every PHP script that uses simple_html_dom:
ini_set('memory_limit', '128M'); // or a greater value
You can raise the memory limit at the start of your script, like this:
ini_set('memory_limit', '128M');
Related
I am using a console command to download some data locally and then dispatch an update job from that data. The issue I'm having is that the downloaded data is around 65 MB for now. The line Storage::disk('local')->put($name, $content); throws a PHP fatal error: allowed memory size of 134217728 bytes exhausted, since I assume the put method creates a copy of $content, going beyond 128 MB.
Is there a way around this other than setting the memory limit to, say, 256 MB?
Can I store this data in chunks, maybe? I am not interested in working on the chunks themselves. Is there some Laravel method that takes the reference &$contents to store the data?
I would prefer a "Laravel" solution if possible.
$name = basename(config('helper.db_url'));
$content = file_get_contents(config('helper.db_url'));
Storage::disk('local')->put($name, $content);
UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
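Laravel's Storage::put() accepts a stream resource as well as a string; when given a resource, the underlying Flysystem adapter copies the data in small chunks instead of holding the whole file in memory. A sketch of the command under that assumption, keeping the config('helper.db_url') key from the original code:

```php
<?php
// Open the remote file as a read stream instead of loading it
// into a string with file_get_contents().
$name   = basename(config('helper.db_url'));
$stream = fopen(config('helper.db_url'), 'r');

// put() accepts a resource; the data is copied chunk by chunk,
// so memory usage stays roughly constant regardless of file size.
Storage::disk('local')->put($name, $stream);

if (is_resource($stream)) {
    fclose($stream);
}

UpdatePostsTable::dispatch();
Log::info("Downloaded $name");
```

The plain-PHP equivalent of this pattern is stream_copy_to_stream() between two fopen() handles; either way, no 65 MB string ever exists in PHP's memory.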
My controller, using DomPDF:
public function exportPdf() {
    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));
    return $pdf->stream('facturas.pdf');
}
error:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 2097160 bytes)
Any idea how to solve it? With small files of about 10 sheets it takes a while but generates the PDF. Thanks in advance for any help you can give me.
Regards
This happens when your application requires more memory than PHP is configured to have.
You could change this globally in your php.ini file, but in cases like these PDF generators, where only one function needs more memory, you can include an ini_set() call in your code. That way, code that isn't expected to use lots of memory still runs under the stricter limit.
In this case, you would use ini_set('memory_limit','256M'); before your PDF code generation. You'll have to work out the right value to set based on available memory on your server, traffic you expect, etc.
Here are the relevant PHP docs:
https://www.php.net/manual/en/function.ini-set.php
https://www.php.net/manual/en/ini.core.php#ini.memory-limit
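Applied to the controller above, that could look like the following sketch (256M is an assumption; tune it to your server's available memory and expected traffic):

```php
public function exportPdf() {
    // Raise the limit for this request only; the php.ini value
    // still applies to every other script on the server.
    ini_set('memory_limit', '256M');

    $facturas = Vale::all();
    $pdf = PDF::loadView('pdf.facturas', compact('facturas'));
    return $pdf->stream('facturas.pdf');
}
```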
I'm trying to create a large PDF file using Zend\Pdf. But since Zend keeps the data in an object, at some point it shows a "memory exhausted" error message.
Does anybody know how to manage memory while creating large PDF files?
You can temporarily increase the memory limit:
$memory_before = ini_get('memory_limit'); // remember the current limit
// New limit
ini_set('memory_limit', '256M'); // 128M, 256M, 512M, 1024M, ...
...
your code (creating the large PDF)
...
// Restore the old limit
ini_set('memory_limit', $memory_before);
Edit:
As pgampe pointed out, you can pass -1 instead of '256M' in my example to remove the memory limit entirely.
Sorry for my english :)
I have NuSOAP version 0.9.5, and I got a PHP error when trying to fetch a large amount of data:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 27255652 bytes)
The stack trace shows that the problem was in the varDump method.
My solution: I changed the varDump method (in nusoap.php) to:
function varDump($data) {
    $ret_val = "";
    if ($this->debugLevel > 0) {
        ob_start();
        var_dump($data);
        $ret_val = ob_get_contents();
        ob_end_clean();
    }
    return $ret_val;
}
and then reset
$GLOBALS['_transient']['static']['nusoap_base']['globalDebugLevel']
to 0 (from 9), in both class.nusoap_base.php and nusoap.php.
This helped me.
Does anyone have any comments on this? Or maybe a better solution?
Many thanks and respect to Aaron Mingle for the real solution to the NuSOAP out-of-memory problem. The solution can be found here:
https://sourceforge.net/p/nusoap/discussion/193578/thread/12965595/
I implemented and immediately tested it, and I am happy now because it works perfectly. In my case the SOAP message was approx. 45 MB (including ~30 PDF files, base64 encoded), and even granting 2 GB of memory to PHP did not help before. Then I tried Aaron Mingle's solution, and it worked with only 384 MB of memory granted to PHP.
+1 to Alexey Choporov as well, because his suggestion is also required. So both modifications are must-have patches for NuSOAP to work properly with larger messages.
Image size is not the problem here, because my image is only 800 KB.
My image upload works flawlessly at any resolution below 2900 x 2176. Over that threshold it doesn't work: no image is uploaded. Why is that happening?
I'll include some code from my upload handler, just in case, though I'm not sure it's relevant.
The error is:
PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 8884 bytes) in /path/imageResizer.php on line 34
which refers to...
if ($this->image_type == IMAGETYPE_JPEG) {
    $this->image = imagecreatefromjpeg($filename);
}
What kind of break occurs? Are any errors being thrown?
Possibilities:
You're running out of memory. My guess is the imageResizer object and its methods are to blame. What is your current memory limit for PHP?
EDIT: As @deceze said, you can use this function to temporarily raise the allocated memory (note the shorthand is '64M', not '64MB' — PHP does not recognize the 'MB' suffix):
ini_set('memory_limit', '64M');
imageResizer can't handle images of that resolution. Did you write that class yourself, or is it a library? I'm not familiar with it. Check its specs.
Image size is the problem. The file may only be 800 KB on disk, but to work with the image it has to be decompressed into memory. So you need roughly
2900 × 2176 × color depth × no. of channels
bytes of memory to store each individual pixel in memory to do anything with the image. This may easily surpass the regular PHP memory limit. Set a higher limit using, for example:
ini_set('memory_limit', '500M');
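As a rough back-of-the-envelope check, assuming 1 byte per channel and 4 channels (red, green, blue, alpha, as GD typically uses for truecolor images):

```php
<?php
// Estimated memory needed to hold a decompressed 2900 x 2176 image.
$width    = 2900;
$height   = 2176;
$channels = 4; // red, green, blue, alpha at 1 byte each

$bytes = $width * $height * $channels;

printf("%.1f MB\n", $bytes / (1024 * 1024)); // prints "24.1 MB"
```

That is about 24 MB for the raw pixel data alone, before GD's own per-pixel overhead, which already approaches the 32 MB limit reported in the error message.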
You might have reached the memory_limit.
You should get an error message telling you that; I strongly recommend displaying errors during development.
In the meantime, you can give your imageResizer more memory this way (if you are allowed to use the ini_set function on your server):
ini_set('memory_limit', '256M');