I have an upload form which allows multiple files to be selected. The original files are quite heavy - approx. 10MB each.
Each uploaded file goes through the same resizing and saving process:
<?php
$file = $request->file('qqfile'); // Huge JPG file in the request
$image = Image::make($file); // Building an Intervention Image object

$sizes = array(2560, 1980, 1366, 1024, 768, 480, 320);

foreach ($sizes as $size) {
    $image = $image->widen($size); // $sizes holds plain integers
    Storage::put('public/image_'.$size.'.jpg', $image->encode('jpg', 90));
}
Now, my understanding is that in the foreach loop it's the 10MB original image object that gets reused: in each iteration it is resized down from the original dimensions - e.g. from 4200x2460 to 320x187 in the last iteration - and then saved to disk.
If I'm correct (am I?), that's not very efficient: GD library operations are quite expensive, and the server CPU gets hit harder than it should.
Is there any way to optimize the resizing process, e.g. by using the already-resized image object in each iteration instead of the huge original one?
I would avoid resize - save - read from disk - resize again, though, as that hits disk I/O as well. What I have in mind is: resize the original - resize that to the next size - ... - resize to the last size, and then save each result. This means I need a way to store a resized Intervention object in a variable without saving it yet (everything gets saved in the last step).
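For illustration, here is a minimal sketch of that chained approach (assuming Intervention Image 2.x, where widen() resizes the instance in place, and the Laravel Storage facade):
<?php
$image = Image::make($request->file('qqfile'));
$sizes = array(2560, 1980, 1366, 1024, 768, 480, 320); // widest first

$encoded = array();

foreach ($sizes as $size) {
    // widen() mutates this instance, so each pass starts from the
    // previously resized (smaller) image, not from the 10MB original.
    $image->widen($size);
    $encoded[$size] = (string) $image->encode('jpg', 90);
}

// Save everything in one final pass.
foreach ($encoded as $size => $jpeg) {
    Storage::put('public/image_'.$size.'.jpg', $jpeg);
}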
Does this make sense? Any ideas?
Thank you.
Is it possible to resize an image in PHP, to a specified size in Kilobytes, with large images?
Example: IMGonline.com.ua
You can absolutely resize to a certain pixel size using a resize call, so if you know that a given number of pixels comes in under your target file size, you can set a fixed pixel size. I've used 100x100 for reference.
// ResizeImage here is a custom helper class, not a PHP built-in
$resize = new ResizeImage('image.png');
$resize->resizeTo(100, 100, 'exact');
$resize->saveImage('/resized/image.png');
Following the same theory, you could take a similar approach: check the file size, and if the image is at the threshold or below, resize it.
if ($file_size < 5000) {
    $resize = new ResizeImage('image.png');
    $resize->resizeTo(100, 100, 'exact');
    $resize->saveImage('/resized/image.png');
} else {
    // Error
}
So you could run it like this
// Upload Image
// Resize image
$resize = new ResizeImage('image.png');
$resize->resizeTo(100, 100, 'exact');
$resize->saveImage('/resized/image.png');
// Save image
// Open saved image
// Check image size
// Confirm or resize progressively smaller.
Hopefully the resize approach is of some use. You've provided no code in your question, but I think the best approach would be: check the size, and save if it's fine; if not, resize and check again, repeating until the file is at the correct size. Obviously you'd have to loop until it fits, so execution could take longer. I can't think of a direct way to hit a target file size with a single PHP function, unless you compressed the images instead of resizing them.
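Here is a rough GD-only sketch of that "resize progressively smaller" loop (the function name is illustrative, and this assumes JPEG input; the ResizeImage class used above is a third-party helper, not part of PHP):
<?php
// Shrink a JPEG by 10% per pass until it fits under $maxBytes.
function shrinkUntilUnderLimit(string $path, int $maxBytes): void
{
    while (filesize($path) > $maxBytes) {
        $src = imagecreatefromjpeg($path);
        $dst = imagescale($src, (int) (imagesx($src) * 0.9)); // height follows aspect ratio
        imagejpeg($dst, $path, 90);
        imagedestroy($src);
        imagedestroy($dst);
        clearstatcache(true, $path); // filesize() results are cached otherwise
    }
}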
I've been trying to optimize the UX for our admins when uploading images on our platform. The images are rather heavy (7-10MB) and are uploaded in batches of 3 at a time, asynchronously (using FineUploader). Each image is stored in multiple sizes, using Intervention Image.
Original logic
<?php
$file = $request->file('qqfile'); // A big JPG file in the request
$image = Image::make($file); // Building an Intervention Image object

$sizes = array(2560, 1980, 1366, 1024, 768, 480, 320);

foreach ($sizes as $size) {
    $image = $image->widen($size); // $sizes holds plain integers
    Storage::put('public/image_'.$size.'.jpg', $image->encode('jpg', 90));
    // image_2560.jpg, image_1980.jpg, etc.
}
Our current server is a small DigitalOcean droplet (512MB RAM, 1 vCPU). All things considered, the upload process was quite time-consuming (e.g. uploading 3 images took approx. 2 minutes).
Optimized version
To speed the process up for the admins, I decided to store only the biggest size during the upload request and then use Laravel queues (Redis) to delegate storing of the remaining sizes:
<?php
$file = $request->file('qqfile'); // A big JPG file in the request
$image = Image::make($file); // Building an Intervention Image object

$sizes = array(2560, 1980, 1366, 1024, 768, 480, 320);

foreach ($sizes as $size) {
    if ($size == 2560) {
        // Store the biggest size right away: image_2560.jpg
        $image = $image->widen($size);
        Storage::put('public/image_'.$size.'.jpg', $image->encode('jpg', 90));
    } else {
        // Delegate the job to store a resized image, 15 minutes from now
        ProcessImageStoring::dispatch($size)->delay(Carbon::now()->addMinutes(15));
    }
}
The ProcessImageStoring job works fine. It finds the already-stored 2560px image and uses it to store a resized version.
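For context, here is a simplified, illustrative sketch of what such a queued job might look like (the actual ProcessImageStoring class isn't shown in the question; this assumes Laravel 5.5+ and re-reads the stored 2560px file rather than the original upload):
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image;

class ProcessImageStoring implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $size;

    public function __construct(int $size)
    {
        $this->size = $size;
    }

    public function handle()
    {
        // Start from the already-stored largest version, not the original upload.
        $image = Image::make(Storage::get('public/image_2560.jpg'));
        $image->widen($this->size);
        Storage::put('public/image_'.$this->size.'.jpg', $image->encode('jpg', 90));
    }
}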
The problem is that, even with the job delayed by 15 minutes, the upload process did not get any faster, and I don't get why. A single upload request now needs to resize and store 1 image instead of 7; the other 6 should be processed 15 minutes later.
Any ideas? Could it just be the droplet power in general? Any PHP/Nginx settings that might be limiting the performance?
First of all, make sure you are really using Redis as your queue driver: check that it's set in .env and that a stale value isn't cached in your config (run php artisan config:clear or php artisan config:cache). Also, if you have a queue worker running, make sure you have restarted it.
If that's not the case, you should measure how much time the upload itself takes versus how much the image resizing takes. Are you sure uploading doesn't take 13 minutes and resizing only 2 minutes?
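One crude way to check is to time the resize-and-store step on its own (names taken from the question's code; Log is Laravel's logger):
<?php
$start = microtime(true);

$image = Image::make($request->file('qqfile'));
$image->widen(2560);
Storage::put('public/image_2560.jpg', $image->encode('jpg', 90));

\Log::info(sprintf('Resize + store took %.2fs', microtime(true) - $start));
// Whatever remains of the request time is network transfer and framework overhead.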
I'm working on an application that allows users to upload images. We are setting a maximum file size of 1MB. I know there are a few options out there for compressing JPEGs (and we are dealing with JPEGs here).
From what I've seen, all of these functions just let you define a compression ratio and scale the quality down. I'm wondering if there is a function that lets you define a maximum file size and have it calculate the compression ratio necessary to meet that size.
If not, I was thinking my best approach would be a while loop that checks the size and keeps re-encoding the image with imagejpeg() at 10% quality decrements until the file is below the pre-defined maximum size.
Am I on the right track here?
It depends on the data, but with images you can take small samples; downsampling would change the result. Here is an example: PHP - Compress Image to Meet File Size Limit.
I completed this task using the following code:
$quality = 90;

while (filesize($full_path) > 1048576 && $quality > 20) {
    $img = imagecreatefromjpeg($full_path);
    imagejpeg($img, $full_path, $quality);
    imagedestroy($img); // free the image resource each pass
    $quality = $quality - 10;
    clearstatcache(); // force filesize() to re-read from disk
}

if (filesize($full_path) > 1048576) {
    echo "<p>File too large</p>";
    unlink($full_path);
    exit;
}
The $quality > 20 part of the condition keeps the script from reducing the quality to a point I would consider unreasonable. I'm sure there is more to be done here - I could add a resolution-resize step as well - but this works for my current needs, with a max file size of 1MB. If the file is still too large after the quality scale bottoms out, it returns a "file too large" error and deletes the image from the server.
Take note that clearstatcache() is very important here. Without it, PHP caches the file size and will not notice the change after each re-encode.
This script only applies to JPEGs, but there are other PHP functions for GIFs, PNGs, etc.
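For reference, a small sketch of how the same loop idea could pick the right GD encoder per format (the function name is illustrative, and mime_content_type() assumes the fileinfo extension is enabled):
<?php
// Re-encode an image in place using the GD function matching its type.
function recompress(string $path): void
{
    switch (mime_content_type($path)) {
        case 'image/jpeg':
            $img = imagecreatefromjpeg($path);
            imagejpeg($img, $path, 80); // quality 0-100
            break;
        case 'image/png':
            $img = imagecreatefrompng($path);
            imagepng($img, $path, 9); // compression level 0-9
            break;
        case 'image/gif':
            $img = imagecreatefromgif($path);
            imagegif($img, $path);
            break;
        default:
            return; // unsupported type, leave untouched
    }
    imagedestroy($img);
}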
I have a PHP script where users can upload images.
I want to make the script lower the image quality (jpeg) if the file size is bigger than 'X' kbytes.
Something like this:
if ($_FILES['uploaded_img']['size'] > $file_size_limit) {
    // code that lowers the quality of the uploaded image
    // but keeps the image width and height
}
What is the best approach for this?
PS: I don't want to change the image width and height.
Sure you can. Do something like this.
$upload = $_FILES['uploaded_img'];
$uploadPath = 'new/path/for/upload/';
$uploadName = pathinfo($upload['name'], PATHINFO_FILENAME);
$restrainedQuality = 75; // 0 = lowest, 100 = highest. ~75 = default
$sizeLimit = 2000; // in bytes ($_FILES sizes are reported in bytes)

if ($upload['size'] > $sizeLimit) {
    // open a stream for the uploaded image
    $streamHandle = @fopen($upload['tmp_name'], 'r');
    // create an image resource from the contents of the uploaded image
    $resource = imagecreatefromstring(stream_get_contents($streamHandle));
    if (!$resource) {
        die('Something wrong with the upload!');
    }
    // close our file stream
    @fclose($streamHandle);
    // re-save the uploaded file with a lower quality
    imagejpeg($resource, $uploadPath . $uploadName . '.jpg', $restrainedQuality);
    // delete the temporary upload
    @unlink($upload['tmp_name']);
} else {
    // the file size is under the limit; just move the temp file into its appropriate directory
    move_uploaded_file($upload['tmp_name'], $uploadPath . $upload['name']);
}
This will accept any image format supported by PHP GD (assuming it's installed on your server, which it most likely is). If the image is under the limit, it will just upload the original image to the path you specify.
Your basic approach (which is implemented in Austin's answer) will work some of the time, but it's important to keep in mind that quality != file size. While they are generally correlated, it is perfectly possible (even common) that reducing the quality of a JPEG file will actually result in a LARGER file.
This is because any JPEG uploaded to your system has already been run through the JPEG compression formula (often with a quality of 79 or 80). Depending on the original image, this process creates artifacts and alters the resulting image. When you run this already-optimized image through the JPEG compression algorithm a second time, it doesn't "know" what the original image looked like, so it treats the incoming JPEG as if it were a brand-new lossless file and tries to copy it as closely as possible, including any artifacts created in the first pass. Couple this with the fact that the original compression already took advantage of most of the "easy" compression tricks, and it is quite likely that compressing a second time yields a worse-looking image (the copy-of-a-copy problem) but not a smaller file.
I did a few tests to see where the cutoff was. Unsurprisingly, if the original image had a low compression ratio (q=99), a lot of space was saved by re-compressing to q=75. If the original was compressed at q=75 (pretty common for graphics program defaults), then the secondary q=75 compression looked worse but resulted in virtually the same file size as the original. If the original had a lower quality (q=50), then the secondary q=75 compression resulted in a significantly larger file. (For these tests I used three complex photos; obviously images with specific palettes/compositions will behave differently.) Note: I used Fireworks CS4 for this test, and I realize these quality indicators have no standardization between platforms.
As noted in the comments below, moving from file formats like PNG to JPEG will usually end up significantly smaller (though without any transparency), but JPEG -> JPEG (or GIF -> JPEG, especially for simple or small-palette images) will often not help.
Regardless, you can still try the compression method described by Austin, but make sure you compare the file sizes of the two images when you're done. If there is only a small incremental gain, or the new file is larger, then fall back to the original image.
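A minimal sketch of that "compare and fall back" safeguard (the file names are illustrative):
<?php
$original  = 'upload.jpg';
$candidate = 'upload_recompressed.jpg';

$img = imagecreatefromjpeg($original);
imagejpeg($img, $candidate, 75);
imagedestroy($img);

clearstatcache();
if (filesize($candidate) >= filesize($original)) {
    // Recompression didn't help; keep the original file.
    unlink($candidate);
}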
I was using the Firebug Page Speed utility, and one of the suggestions given was to compress the images, so I wrote the following code to compress the image:
$filename = "http://localhost.com/snapshots/picture.png";
$img = imagecreatefrompng($filename);
$this->_response->setHeader('Content-Type', 'image/png');
imagepng($img, null, 9);
imagedestroy($img);
Now the actual image size is 154K
So I experimented with different quality levels, and here is what I found:
imagepng($img,null,0); --> Size = 225K
imagepng($img,null,1); --> Size = 85.9K
imagepng($img,null,2); --> Size = 83.7K
imagepng($img,null,3); --> Size = 80.9K
imagepng($img,null,4); --> Size = 74.6K
imagepng($img,null,5); --> Size = 73.8K
imagepng($img,null,6); --> Size = 73K
imagepng($img,null,7); --> Size = 72.4K
imagepng($img,null,8); --> Size = 71K
imagepng($img,null,9); --> Size = 70.6K
Do these results look accurate? I'm not sure why, at level 0, the output ends up larger than the original file.
Secondly, is this the best way in PHP to compress images before rendering them in the browser to improve performance?
Based on the suggestions that it's better to compress the image once, at the time of saving, I dug up the code that is called by the Flash program to generate the snapshot:
$video = $this->_getParam('video');
$imgContent = base64_decode($this->_getParam('snapshot'));
file_put_contents("snapshots/" . $video . ".png", $imgContent);
EDITED
Based on Alvaro's suggestion, I have made the following modification to the code, which generates a much smaller JPG file:
$video = $this->_getParam('video');
$imgContent = base64_decode($this->_getParam('snapshot'));
file_put_contents("snapshots/" . $video . ".png", $imgContent);

$filename = "snapshots/" . $video . ".png";
$img = imagecreatefrompng($filename);
imagejpeg($img, 'test.jpg', 75);
So this is now a 3-step process:
1. Create the initial image using file_put_contents
2. Use imagecreatefrompng and imagejpeg to re-encode it as a smaller JPEG
3. Delete the original image
Is this the optimal way to go about it?
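For reference, an illustrative sketch of the full three-step flow, including the cleanup (parameter names follow the question's code; the JPEG path is an assumption):
<?php
$video = $this->_getParam('video');
$imgContent = base64_decode($this->_getParam('snapshot'));
$pngPath = 'snapshots/' . $video . '.png';

file_put_contents($pngPath, $imgContent);            // 1. write the raw PNG
$img = imagecreatefrompng($pngPath);
imagejpeg($img, 'snapshots/' . $video . '.jpg', 75); // 2. re-encode as JPEG
imagedestroy($img);
unlink($pngPath);                                    // 3. delete the original PNG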
Since PNG uses lossless data compression, the only way to achieve decent compression in a PNG image (edge cases apart) is to save it as palette (rather than true colour) and reduce the number of colours. You appear to be processing some sort of screenshots. You may obtain smaller file sizes if you use lossy compression, i.e., save as JPEG. In either case, you reduce both file size and picture quality. You could also try the GIF format, which tends to be smaller for small graphics.
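A rough sketch of the palette idea: converting a truecolor PNG to a 256-colour palette before saving, which is where most PNG savings come from (the file names are illustrative):
<?php
$img = imagecreatefrompng('snapshot.png');
imagetruecolortopalette($img, true, 256); // dither, at most 256 colours
imagepng($img, 'snapshot_palette.png', 9);
imagedestroy($img);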
Last but not least, you should compress images once (typically when they get uploaded), not every time they're served. I suppose your code is just a quick test, but I mention it just in case.
Answer to updated question:
I'm not familiar with PHP image functions but you should probably use a combination of imagecreatefrompng() and imagejpeg(). Also, consider whether you need to keep the original PNG for future reference or you can discard it.
You have misunderstood the last parameter: it's not quality, it's the compression level, so increasing it will decrease the image size. Anyway, I've used that method before to compress PNG images and it works well, so I think you should continue to use it.
1- The result seems accurate, since 0 means no compression. From the manual:
quality: Compression level: from 0 (no compression) to 9.
It's normal for the level-0 file to be larger than the original (which may already be slightly compressed). You need to understand how file compression and the PHP GD image constructor work.
2- IMHO, the wisest choice would be to compress your PNG files before uploading them to your server (of course, this only applies if you have the choice: a static site with few files). Some resources for that:
http://www.webreference.com/dev/graphics/compress.html
http://www.punypng.com/
http://omaralzabir.com/reduce_website_download_time_by_heavily_compressing_png_and_jpeg/
If it needs to be dynamic, PHP is the way to go.