Image optimization in a Laravel application - PHP

I have a Laravel web application with more than a thousand uploaded images in it. I have not optimized my images before, and now I have to do the optimization.
Now I have two questions about image optimization.
First, I have a problem with the image optimization itself. I have tried several packages that are supposed to do the image optimization, but after running my optimization code no changes are applied to the images in my source control. Why does the package not change the images?
Second, I am using the same images at different sizes in my application. For example, an image named "image_1.png" is used at 120*60 px on the home page and at 300*150 px on another page. What should I do to optimize these images that are used at different sizes?
The packages that I have used are these:
https://github.com/spatie/laravel-image-optimizer.git
https://github.com/psliwa/image-optimizer.git
I am using code like this for my images in the database:
foreach ($files as $file) {
    $filePath = // the image path
    ImageOptimizer::optimize($filePath);
}
The above code runs successfully and no errors occur while it runs, but the images do not change.

Image optimization is not only about image resizing; there are several elements that impact image file size:
Image quality - In most cases you can lower the quality considerably without a noticeable visual impact. Use the lowest quality that is acceptable for the image content, audience, and purpose.
Image format - Make sure each image is delivered in the right format for its content. Take advantage of CDN services or other tools/logic to check which browser is making the request and deliver different formats to different browsers, for example using the <picture> element or the srcset attribute.
Image metadata - By default, images contain a lot of metadata stored by cameras and graphics applications, but this data is completely unnecessary in delivered images (a sketch of stripping it follows this list).
Image sizing and resizing - Even when resizing on the server-side, keep in mind that you can crop to focus on important content, and not just scale down your images.
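For the metadata point, here is a minimal plain-PHP sketch, assuming the Imagick extension is installed (the file paths are just examples):

// Strip EXIF/IPTC/XMP metadata that cameras and editors embed in the file
$image = new Imagick('uploads/photo.jpg');
$image->stripImage();
$image->writeImage('uploads/photo-stripped.jpg'); // write to a copy, or overwrite the original
$image->clear();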
In my web design projects I use Cloudinary for image optimization.
Basically, it can find the best compression level and optimal encoding settings based on the image content and the requesting browser, and automate the trade-off decision between file size and quality.
Here is how I'm doing it:
optimize image quality with the "quality = auto" parameter:
cl_image_tag("sample.jpg", array("quality"=>"auto"))
This will reduce the file size without any noticeable degradation to the human eye and without the need to individually analyze every image.
optimize image format with the "fetch_format = auto" parameter:
cl_image_tag("sample.jpg", array("width"=>500, "fetch_format"=>"auto", "crop"=>"scale"))
This will scale the image down to a width of 500 pixels and deliver it in the most appropriate format for the requesting browser (for example WebP for Chrome, falling back to JPEG).
optimize image size by cropping:
cl_image_tag("sample.jpg", array("transformation"=>array(
array("aspect_ratio"=>"4:3", "crop"=>"fill"),
array("width"=>"auto", "dpr"=>"auto", "crop"=>"scale")
)))

From your question I understand that you don't need optimization, you need a "resize on the fly" service. I will show you how I am doing it.
Download and install Intervention Image in your Laravel project.
Download and install Intervention Image Cache in your Laravel project.
After downloading the cache component, don't read its documentation for now; just install it with Composer.
Open routes/web.php and create a new route like this (you can rename it and modify it to fit your needs):
Route::get('resizer/{photo}/{ext}/{width}/{height}', function ($photo, $ext, $width, $height) {
    $image = new \Intervention\Image\ImageManager();
    $url = public_path('images/'.$photo.'.'.$ext); // If your photo is in the public folder

    $res = $image->cache(function ($image) use ($url, $width, $height) {
        return $image->make($url)->resize($width, $height);
    }, 9999, true);

    return $res->response();
})->name('resizer');
Then in your Blade template you can use it like this:
<img src="{{ route('resizer', [$photo, $ext, $width, $height]) }}"/>
The process is pretty simple: you fetch the photo, pass it to Intervention Image, cache the result for faster responses, and return it as a response with the correct headers. If you have any questions, ask! Wish you luck.
P.S. You can run optimization on the image before building the response if you want.
P.P.S. You have to add an additional check that the photo exists before processing it; a sketch of that check follows. This is just a proof of concept of how you can do it.
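For that existence check, a minimal sketch that would sit at the top of the route closure above (returning a 404 is just one option):

// Inside the route closure, before resizing
$url = public_path('images/'.$photo.'.'.$ext);

if (! file_exists($url)) {
    abort(404); // stop early if the source image is missing
}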

Related

How to reduce the image size by reducing the quality when uploading images in Laravel?

I'm developing a Laravel website for ads. Users can upload very large images, and I would like to resize, compress, and reduce the quality of these images before storing them in the DB.
I have
$img = $request->file("images");
Where do I go from here?
It seems strange to me how PHP actually works. I was expecting some kind of byte[]-like array, as in Java, that represents the image; then I would do some calculation over it and be done.
I checked a couple of posts like this and this one here.
However, all the code snippets deal with local disk images. Is there a way I can use:
$img = $request->file("images");
Or do I not understand how files work in PHP?
You can use the spatie package, or any other package, for image manipulation. With spatie/image you can simply do:
use Spatie\Image\Image;

Image::load($request->file("images")->getPathname()) // or any other path to the image
    ->width(100)
    ->height(100)
    ->save($pathToNewImage);
It also supports reducing image size by decreasing the quality.
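For the quality part, a minimal sketch assuming spatie/image is installed ($pathToNewImage is an example variable and 70 is an arbitrary quality value):

// Shrink the file by lowering the JPEG quality while keeping the dimensions
Image::load($request->file("images")->getPathname())
    ->quality(70)
    ->save($pathToNewImage);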

Strategy for watermarking images

Currently when users are uploading images to my website, I only store watermarked images. Should I also save the plain images? Or should I possibly save the plain images and then, on request, display the image with a dynamically generated (PHP) watermark?
What are your preferences?
And if I generate the images on the fly with PHP, should I be concerned about performance of those images?
Many of the current photography CMS packages use GD and also ImageMagick.
You can upload a high or medium resolution image and have a script using a library like GD generate the low-res web version, including the watermark (a GD sketch follows at the end of this answer).
Unless you have a very large number of very high-res images, performance should not be a concern on current hardware.
(If you would like help with the PHP/GD part, you could ask another question, as this one is not related to that.)
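As a rough illustration of the GD approach (the paths, sizes and margins are assumptions for the sketch, not a finished implementation):

// Generate an 800px-wide web version of an upload and stamp a PNG watermark on it
$src   = imagecreatefromjpeg('uploads/original.jpg');
$stamp = imagecreatefrompng('assets/watermark.png');
$web   = imagescale($src, 800); // height follows the aspect ratio

// Place the watermark in the bottom-right corner with a 10px margin
$x = imagesx($web) - imagesx($stamp) - 10;
$y = imagesy($web) - imagesy($stamp) - 10;
imagecopy($web, $stamp, $x, $y, 0, 0, imagesx($stamp), imagesy($stamp));

imagejpeg($web, 'web/original_web.jpg', 80); // 80 = JPEG quality
imagedestroy($src);
imagedestroy($stamp);
imagedestroy($web);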

Optimize image on page load

Google PageSpeed Insights suggests that I optimize the images on a webpage I'm currently working on. The images are uploaded to the server. I want to display optimized images on the page, but I don't want the original images on the server to change. Is there any way to do this in PHP?
You should reduce the image size at upload time rather than on every page load; the PHP code that handles the upload is the right place to do it.
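A minimal sketch of that idea, assuming Intervention Image 2 and its Laravel facade are installed (the field name, target width and storage path are illustrative):

use Intervention\Image\Facades\Image;

// Resize the upload before storing it, so only the smaller file is ever saved
$uploaded = $request->file('photo');

Image::make($uploaded->getPathname())
    ->resize(1200, null, function ($constraint) {
        $constraint->aspectRatio(); // keep proportions
        $constraint->upsize();      // never enlarge smaller originals
    })
    ->save(storage_path('app/public/photos/'.$uploaded->hashName()), 75); // 75 = quality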
If you don't want the original image to change, then you'll find that any optimisations you perform to bring the file size down, or to serve a more appropriately sized image, will be redundant due to the overhead of "optimising" the image on the fly (unless your original image is just that big).
As I see it, you have 3 options:
1. Unless it has a massive impact (or you can envisage it will have a massive impact), just ignore Google PageSpeed for now.
2. You can use lossless compression, which will reduce the file size without reducing the quality of the image. This is something you can do on your server with various apps (just google what type of server you have followed by "lossless image compression").
3. Just create a copy of the original image upon upload (or whenever) and serve that image to your users. The benefits are that you can have different sized images rendered for different devices, e.g. a small image for mobile, and again you can use lossless compression on all of them (see the sketch after this list). The downside is that you'll obviously be using more server space.
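For the last two options, the spatie optimizer mentioned earlier in this thread can write the optimized copy to a separate path so the original file stays untouched (a sketch; the paths are illustrative):

use Spatie\ImageOptimizer\OptimizerChainFactory;

$optimizerChain = OptimizerChainFactory::create();

// Optimize into a copy so the original image is left unchanged
$optimizerChain->optimize(
    public_path('images/original/photo.jpg'),
    public_path('images/optimized/photo.jpg')
);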
Hope this helps!

Resizing an image on a website for efficiency

I currently have a website that aggregates images. The problem is that those images are very large, and when I display them at smaller sizes they still eat up a lot of resources.
I'm curious as to how to actually force reduced size/quality without just smooshing it into a <div> element.
Here is my site in question, you can see how 'laggy' it gets when you produce images:
http://newgameplus.nikuai.net/TEST/index.html
I was using timthumb.php to resize the images, but the host doesn't allow that script for some reason.
The best way to do this is to use some sort of image re-factoring service.
I have written my own one that uses ffmpeg and ImageMagick to resize images on the fly and to generate arbitrarily sized thumbnails from videos. I memcache the results to make subsequent requests super snappy, and have some interesting additions such as automatic point-of-interest detection using face detection and image entropy, with the aim being "nice thumbnails, no matter the size".
An example of such a service is src.sencha.io - the documentation for this service is here but I have included the important bits below.
Specify Image Size
<img src='http://src.sencha.io/320/200/http://yourdomain.com/path/to/image.jpg'
alt='My constrained image'
width='320'
height='200' />
This will take your image (http://yourdomain.com/path/to/image.jpg) and run it through the resizing service, returning a 320x200 version of your image. You cannot set the gravity/point-of-interest using this service, though (as far as I can tell).
You can also use this service to change formats, resize dataurls, do percentage resizes and use the resizing service to detect the width/height of the user agent requesting the image.
There are many such services available on the web.
I agree with slash: it depends on how the images are being resized. One thing I do for a site is use Photoshop (or GIMP) to resize the image to the exact dimensions I need for the page I'm using the image on. Then I also include the same dimensions in the width/height attributes on the image itself.
Additionally, you can use your photo editing software to check the size of your image if you were to save it with a different file extension, and (specifically with JPEG and PNG files) Photoshop will let you reduce the quality, which lowers the file size and speeds up page loading.

Caching and resizing images without shrinking or stretching?

How does Facebook and other image intensive sites maintain a thumbnail size of the full image without shrinking or distorting the thumbnail?
Are these thumbs cropped versions of the original, stored so that when the thumb is clicked they reference the full-size image?
My images are stretched or shrunk if I simply try to confine them to a preset size in my img tag.
Ideally I would like to crop each image to fit a preset size without distorting the aspect ratio. If this can be done on the fly, is it an efficient way to handle images in high volumes?
It is considered bad practice to resize images with your HTML markup or CSS styles. Scaling them up means bad quality, scaling them down means that your users have to download a larger file than necessary, which hurts speed.
There are libraries built for image resizing and caching for almost any language or framework. They mostly feature cropping as well, in order to maintain a standard aspect ratio. For PHP, have a look at phpThumb, which is based on GD/ImageMagick.
The resulting resized versions of your images are saved in a cache folder, so they don't need to be regenerated every time the image is requested. This way, the performance is almost as good as serving static files. The only overhead is a small check if the resized version exists in the cache.
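Conceptually, the cache check amounts to something like this plain-PHP sketch (phpThumb does this for you; the paths and dimensions are illustrative):

$original = 'images/photo.jpg';
$cached   = 'cache/photo_200x200.jpg';

if (! file_exists($cached)) {
    // Only regenerate when the cached thumbnail is missing
    $src   = imagecreatefromjpeg($original);
    $thumb = imagescale($src, 200, 200);
    imagejpeg($thumb, $cached, 85);
    imagedestroy($src);
    imagedestroy($thumb);
}

header('Content-Type: image/jpeg');
readfile($cached); // from here on it is served like a static file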
I can't speak directly for Facebook, but most sites upload a large image and then automatically recreate smaller, preset sizes (usually with the scripting language and some kind of library, like PHP/GD), saving them with a similar file name pattern so that you use as little bandwidth as possible, improve loading times, and avoid manipulating images with CSS. A sketch of that pattern follows.
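A sketch of that preset-size pattern at upload time (the sizes, names and paths are assumptions for illustration):

// Recreate a few preset sizes with a predictable file name pattern
$presets = ['thumb' => 150, 'medium' => 600, 'large' => 1200];

foreach ($presets as $name => $width) {
    $src     = imagecreatefromjpeg('uploads/photo.jpg');
    $resized = imagescale($src, $width); // height follows the aspect ratio
    imagejpeg($resized, "uploads/photo_{$name}.jpg", 82);
    imagedestroy($src);
    imagedestroy($resized);
}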
