I've got to upload some (potentially) very large files to my S3 bucket from a Laravel job I am building out. I am getting the dreaded "Allowed memory size of ### bytes exhausted" error, and I have no interest in raising the memory limit in php.ini, simply because I don't know how large some of these files will get, and at some point I need to stop running away from large files by raising memory_limit to ridiculous levels.
The question is: Does Laravel make chunking this thing easy? Is there a function I am not seeing that I can use?
I know the answer is probably no, but Laravel makes SO many things easy for me that I figured I might as well ask in case I was missing something in my Googling.
If this does not exist in Laravel, what should I do? I know that I need to read the file into memory a chunk at a time, but I have no idea where to start on that.
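To make it concrete, the kind of thing I'm imagining is handing the Storage facade a stream resource instead of the file contents, something like this untested sketch (the disk name, paths, and object key are placeholders):

use Illuminate\Support\Facades\Storage;

// Placeholder paths; assumes an 's3' disk is configured in config/filesystems.php.
$localPath = storage_path('app/uploads/huge-file.mp4');
$objectKey = 'videos/huge-file.mp4';

// fopen() gives a read-only stream; nothing is loaded into memory yet.
$stream = fopen($localPath, 'r');

if ($stream === false) {
    throw new RuntimeException("Could not open {$localPath} for reading.");
}

// Passing a resource rather than a string should let the underlying
// Flysystem adapter stream the upload instead of buffering the whole file.
Storage::disk('s3')->put($objectKey, $stream);

if (is_resource($stream)) {
    fclose($stream);
}

But I have no idea whether that actually avoids the memory spike, or whether there is a more idiomatic Laravel way.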
Thanks!
I've noticed that loading an image into Imagick ($im = new Imagick($sFilename);) in PHP takes about 0.6 seconds for an 8 MB image. This seemed a bit slow to me, so I tried a test and read the file in using file_get_contents instead: about 0.005 seconds. Better. A bit too good, to be honest; I guess there's some caching going on there?
But I can load the same file a dozen times into imagick and it's always ~0.6 seconds.
Can I tell file_get_contents to bypass the system cache somehow, to give me a better idea of the raw speed with which an 8MB file can be retrieved from my hard drives?
Is there anything that can be done to speed up imagick? Or is 0.6 seconds for this operation completely normal?
The server has two 7200 RPM HP SATA drives in RAID 1.
Thanks.
"Is there anything that can be done to speed up imagick?"
Buy a faster CPU.
"Or is 0.6 seconds for this operation completely normal?"
Yes.
"This seems a bit slow to me ... but it seems a long time for that ... I guess there's some caching going on there?"
You're just guessing that something should be faster, and you're comparing it to a completely different operation. file_get_contents just reads the bytes in the file off the disk. Creating an image from a JPG means the computer has to read the bytes off the disk and then decode them from the compressed data into the actual image data.
If you want to see how much work has to be done during that decompression, you can see it easily by writing the image out in an uncompressed format, e.g.
$imagick = new Imagick("./testImage.jpg"); // reads AND decodes the JPEG
$imagick->setImageFormat('BMP');           // switch to an uncompressed format
$imagick->writeImage("./output.bmp");      // note how much larger the BMP is
And yes, this is longer than it is reasonable for an HTTP request to spend processing, which is just one more reason why you shouldn't run Imagick inside a web server, but as a background task instead.
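If you want to put a number on the difference yourself, a crude timing comparison like the one below will do. The path is a placeholder, and the OS page cache will still affect both cases, so only compare the two figures against each other rather than treating either as raw disk speed.

// Crude comparison: raw read vs. full JPEG decode.
$path = './testImage.jpg'; // placeholder

$start = microtime(true);
$bytes = file_get_contents($path);  // just reads the bytes
$readTime = microtime(true) - $start;

$start = microtime(true);
$im = new Imagick($path);           // reads the bytes AND decodes the JPEG
$decodeTime = microtime(true) - $start;

printf("read: %.4fs  decode: %.4fs\n", $readTime, $decodeTime);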
I originally posted this over in the Drupal exchange but it was suggested I try here since it seems to be more related to my server, not Drupal.
When uploading files through PHP, the upload speed can slow down drastically. For example, I'll be uploading a 400 MB video and getting 10 Mbps, then suddenly it drops to less than 100 kbps. After some time it speeds up again, then it slows down again, and then it repeats. There is no consistency as to when this happens. I can reproduce it with large files and small files, but since it is erratic, it's harder to observe on small files. I have not observed this when uploading through SCP, so I assume it's not a network issue.
Here is what I know.
It's not my internet connection. I've tried multiple people from different places with the same results.
Tried on several browsers with the same result.
Eventually the upload will complete but in a lot of instances, an upload that should take 4 minutes ends up taking 30 or more.
PHP is set to allow 2GB file uploads. And PECL uploadprogress is installed.
Though I use Drupal, I've also tried uploading files through a straightforward PHP upload form with the same results, so it's not a Drupal issue.
What I think is happening: at some point during the upload process, some sort of buffer is being hit. I really don't even know where to begin looking for that at the server or OS level.
We have plenty of disk space (more than 1 TB) and plenty of RAM (24 GB).
I'm hoping someone here has experienced similar or can suggest where to begin looking. Thanks for reading!
I am in the process of rewriting some scripts that parse machine-generated logs from Perl to PHP.
The files range from 20 MB to 400 MB.
I am trying to decide whether to use file() or the fopen()+fgets() combo to go through the files, for better performance.
Here is the basic run-through:
I check the file size before opening it, and if the file is larger than 100 MB (a pretty rare case, but it does happen from time to time) I go the fopen()+fgets() route, since I only bumped the memory limit for the script to 384 MB and any file larger than 100 MB has a chance of causing a fatal error. Otherwise, I use file().
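Roughly, the split looks like this (a simplified sketch; the path is a placeholder and parseLine() stands in for the actual parsing):

$logFile = '/path/to/machine.log';  // placeholder
$threshold = 100 * 1024 * 1024;     // 100 MB

if (filesize($logFile) > $threshold) {
    // Large file: read line by line to stay under the memory limit.
    $fh = fopen($logFile, 'r');
    while (($line = fgets($fh)) !== false) {
        parseLine($line);
    }
    fclose($fh);
} else {
    // Small file: slurp the whole thing at once.
    foreach (file($logFile, FILE_SKIP_EMPTY_LINES) as $line) {
        parseLine($line);
    }
}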
I am only going through the file once, from beginning to end, line by line, in both methods.
Here is the question: is it worth keeping the file() part of the code to deal with the small files? I don't know exactly how file() works in PHP (I use the FILE_SKIP_EMPTY_LINES option as well): does it map the file into memory directly, or does it read it into memory line by line as it goes through it? I ran some benchmarks on it; performance is pretty close, the average difference is about 0.1 s on a 40 MB file, and file() has the advantage over fopen()+fgets() about 80% of the time (out of 200 tests on the same file set).
Dropping the file() part could certainly save me some memory, and considering I have three instances of the same script running at the same time, it could save 1 GB of memory on a 12 GB system that's also hosting the database and other crap. But I don't want to drag the performance of the script down either, since there are something like 10k of these logs coming in per day, and that 0.1 s difference adds up.
Any suggestion would help and TIA!
I would suggest sticking with one mechanism, like foreach (new \SplFileObject('file.log') as $line). Split your input files and process them in parallel, 2-3 processes per CPU core (bonus: run them at lower priority than the database on the same system). In PHP, this means spawning off N copies of the script at once, where each copy has its own file list or directory. Since you're talking about a rewrite and IO performance is an issue, consider other platforms with more capabilities here, e.g. Java 7 NIO, Node.js asynchronous IO, C# TPL.
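A minimal sketch of that single mechanism (the path is a placeholder and parseLine() stands in for your parsing; the flags are standard SplFileObject constants):

// One reader regardless of file size; SplFileObject streams line by line,
// so memory use stays flat no matter how big the log is.
$file = new \SplFileObject('/path/to/file.log');   // placeholder path
$file->setFlags(
    \SplFileObject::READ_AHEAD |
    \SplFileObject::SKIP_EMPTY |
    \SplFileObject::DROP_NEW_LINE
);

foreach ($file as $line) {
    parseLine($line);   // placeholder for the real parsing
}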
I have an image GD script which currently uses around 9 MB of memory.
I get a lot of traffic, so at times it ends up using a hell of a lot of RAM on my server.
Is there any way to reduce memory usage for image GD?
Or at least make the script run faster, so that it frees the memory it is using sooner.
I have tried changing the image quality; it had no effect.
I also tried changing the image pixel size; it reduced the memory usage, but not by much.
Thanks.
It's impossible to tell without seeing the code, but unless it contains major mistakes, the answer is probably no.
What you might be able to do is use the external ImageMagick binary instead, since it runs outside PHP's script memory limit, but that is an entirely different technology and would require you to rewrite your code.
I assume you are already caching GD's results so not every request causes it to run?
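If not, the idea is roughly this (a sketch; the file names and target size are placeholders):

// Serve a previously generated image if it exists, so GD only runs once
// per source image rather than on every request.
$source = 'photos/original.jpg';         // placeholder
$cached = 'cache/original_200x200.jpg';  // placeholder

if (!file_exists($cached)) {
    $src = imagecreatefromjpeg($source);
    $dst = imagescale($src, 200, 200);   // resize once
    imagejpeg($dst, $cached, 85);
    imagedestroy($src);                  // free GD memory explicitly
    imagedestroy($dst);
}

header('Content-Type: image/jpeg');
readfile($cached);                       // cheap on every later request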
Try to avoid running image GD on the fly if you are concerned about memory limits.
It's hard to solve the problem without seeing code, but I can make a suggestion.
Have a different process handle the images. For example, if you want to resize images, don't resize them every time a user accesses a page; instead, run a cron job or scheduler that periodically resizes all the images that need resizing and saves them, so there is less overhead per request.
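A rough sketch of that cron-driven approach, with the directory names and target size as placeholders:

// Run from cron: resize anything in the incoming directory and write the
// result to the public directory. Paths and size are placeholders.
$incoming = '/var/www/uploads/incoming';
$resized  = '/var/www/uploads/resized';

foreach (glob($incoming . '/*.jpg') as $path) {
    $src = imagecreatefromjpeg($path);
    $dst = imagescale($src, 800, -1);    // 800px wide, keep aspect ratio
    imagejpeg($dst, $resized . '/' . basename($path), 85);
    imagedestroy($src);
    imagedestroy($dst);
    unlink($path);                       // done with the original
}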
If you provide more code, you will get better help.
Welcome,
Does anyone have any resources about creating large zip files (a single file, not a folder) in PHP?
Without using shell access to the zip application.
Low memory usage (I can't use gzcompress), because the file is too large to do this in RAM.
I have an unzip function that works perfectly on a low-memory system.
Here is the unzip:
http://paste-it.net/public/wdb61dc/
Regards
If the deflate algorithm is too memory-intensive for you, then there isn't really a good way to do what you want. You can try turning down the compression.
Can you give us an idea of how large the file is and how much memory you want to work with?
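For what it's worth, if the host has the zip extension, ZipArchive keeps memory fairly low because addFile() only records a reference and the data is read from disk when the archive is written out on close(); compression can also be reduced or switched off per entry. A sketch, with paths as placeholders (setCompressionName() needs PHP 7.0+):

$zip = new ZipArchive();

if ($zip->open('/tmp/archive.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create archive');
}

// addFile() defers reading the source file until the archive is closed.
$zip->addFile('/path/to/huge-file.dat', 'huge-file.dat');

// Optional: store without compressing (or use setCompressionIndex() to pick
// a lower deflate level) to cut CPU and memory further.
$zip->setCompressionName('huge-file.dat', ZipArchive::CM_STORE);

$zip->close();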