Strategy for watermarking images - PHP

Currently, when users upload images to my website, I only store watermarked versions. Should I also save the plain images? Or should I save only the plain images and then, on request, serve them with a dynamically generated (PHP) watermark?
What are your preferences?
And if I generate the watermarked images on the fly with PHP, should I be concerned about performance?

Many current photography CMS packages use GD and/or ImageMagick.
You can upload a high- or medium-resolution image and have a script (e.g. using GD) generate the low-resolution web version, including the watermark.
Unless you have a very large number of very high-resolution images, performance should not be a concern on current hardware.
(If you would like help with the PHP/GD part, you could ask another question, as this one is not about that.)
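For illustration, a minimal GD watermarking sketch might look like the following; the file paths, the corner placement, and the JPEG quality are assumptions:

<?php
// Minimal sketch: overlay a PNG watermark (with transparency) on an
// uploaded JPEG using GD. Paths and names here are hypothetical.
$photo     = imagecreatefromjpeg('uploads/original.jpg');
$watermark = imagecreatefrompng('assets/watermark.png');

$wmWidth  = imagesx($watermark);
$wmHeight = imagesy($watermark);

// Place the watermark in the bottom-right corner, 10px from the edges.
$destX = imagesx($photo) - $wmWidth - 10;
$destY = imagesy($photo) - $wmHeight - 10;

imagecopy($photo, $watermark, $destX, $destY, 0, 0, $wmWidth, $wmHeight);

imagejpeg($photo, 'public/watermarked.jpg', 85); // 85 = JPEG quality
imagedestroy($photo);
imagedestroy($watermark);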

Related

Optimize image on page load

Google PageSpeed Insights suggests I optimize the images on a webpage I'm currently working on. The images are stored on the server. I want to display optimized images on the page but don't want the original images on the server to change. Is there any way to do this in PHP?
You should reduce the image size at upload time, i.e. resize the image in the PHP code that handles the upload.
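A minimal sketch of resizing at upload time with GD; the form field name ('image'), the 1024px maximum width, the JPEG-only assumption, and the target path are all assumptions:

<?php
// Minimal sketch: resize an uploaded JPEG during upload (GD).
// The field name, max width and target path are assumptions.
$maxWidth = 1024;

$src = imagecreatefromjpeg($_FILES['image']['tmp_name']);
$w   = imagesx($src);
$h   = imagesy($src);

if ($w > $maxWidth) {
    $newW = $maxWidth;
    $newH = (int) round($h * $maxWidth / $w); // keep aspect ratio
    $dst  = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, 'uploads/resized.jpg', 85);
    imagedestroy($dst);
} else {
    // Small enough already: store it unchanged.
    move_uploaded_file($_FILES['image']['tmp_name'], 'uploads/resized.jpg');
}
imagedestroy($src);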
If you don't want the original image to change, then any optimisations you perform to bring the file size down / serve a more appropriately sized image will be redundant, due to the overhead of "optimising" the image on the fly (unless your original image really is that big).
As I see it, you have three options:
1. Unless it has a massive impact (or you can envisage it will have a massive impact), just ignore Google PageSpeed for now.
2. Use lossless compression, which will reduce the file size without reducing the quality of the image. This is something you can do on your server with various apps (just google your server type followed by "lossless image compression").
3. Create a copy of the original image upon upload (or whenever) and serve that copy to your users, as sketched below. The benefit is that you can keep differently sized images to render on different devices, e.g. a small image for mobile, and again you can apply lossless compression to all of them. The downside is that you'll obviously use more server space.
Hope this helps!
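For option 3, a minimal sketch of generating several differently sized copies at upload time with GD; the list of target widths and the file-naming scheme are assumptions:

<?php
// Minimal sketch: create several resized copies of an uploaded image
// (GD). The widths and the "_label" naming scheme are assumptions.
$sizes  = array('small' => 320, 'medium' => 800, 'large' => 1600);
$source = imagecreatefromjpeg('uploads/original.jpg');
$w      = imagesx($source);
$h      = imagesy($source);

foreach ($sizes as $label => $targetW) {
    $targetH = (int) round($h * $targetW / $w); // keep aspect ratio
    $copy = imagecreatetruecolor($targetW, $targetH);
    imagecopyresampled($copy, $source, 0, 0, 0, 0, $targetW, $targetH, $w, $h);
    imagejpeg($copy, "uploads/original_{$label}.jpg", 85);
    imagedestroy($copy);
}
imagedestroy($source);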

PHP image loading time consumption

I have a products website with around 100 high-quality images. Each image is around 6-7 MB in size.
In my database I store the path of each image along with its name. The images are saved in a folder /images/product_name/, but when I display these images on a web page, the page takes forever to load. All I do is send the id to the table, get the image paths, and display them on the products page.
It would be very helpful to get any sort of advice on how to optimize this process.
The images you send to the client are most likely way too big. 7 MB sounds very large for a product picture, so 100 x 7 MB = 700 MB of data transferred if you display all the product images.
If you only need small images, scale them down to a few kilobytes (thumbnails) and use those in your listing.
NOTE: you can just prepend a prefix like "tmb_" or "tmb_200x200_" to the original filename, and you won't have to touch the paths in the database.
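A minimal sketch of that naming convention, assuming the thumbnails live next to the originals:

<?php
// Minimal sketch: derive a thumbnail path from the original path stored
// in the database, using a "tmb_200x200_" filename prefix (assumption).
function thumbnailPath($originalPath) {
    $dir  = dirname($originalPath);
    $file = basename($originalPath);
    return $dir . '/tmb_200x200_' . $file;
}

// e.g. /images/product_name/photo.jpg -> /images/product_name/tmb_200x200_photo.jpg
echo thumbnailPath('/images/product_name/photo.jpg');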
From reading your comments, I think you are looking for an automated process to optimize the files and serve them at a more reasonable file size.
You should take a look at ImageMagick or the GD library, which let you resize images (among many other things - http://php.net/manual/en/book.imagick.php) and optimize them. This can be combined with something like the YUI Image Cropper to let you choose which part of the image to show in the thumbnail.
This is best done at the point of upload, so the server doesn't unnecessarily regenerate the images each time they are requested, with the result stored under a thumbnail column in the database.
If you must show bigger images, I suggest you use a lightbox (see an example here - http://leandrovieira.com/projects/jquery/lightbox/) or a similar technique that only loads the larger image when the client requests it.
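A minimal sketch of generating a thumbnail at upload time with the Imagick extension; the paths and the 200x200 bounding box are assumptions:

<?php
// Minimal sketch: generate a thumbnail with the Imagick extension.
// Paths and the 200x200 bounding box are assumptions.
$image = new Imagick('/images/product_name/photo.jpg');

// Fit the image inside a 200x200 box, preserving the aspect ratio
// (third argument "bestfit" = true).
$image->thumbnailImage(200, 200, true);
$image->setImageCompressionQuality(85);
$image->writeImage('/images/product_name/tmb_200x200_photo.jpg');
$image->clear();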
Use http://luci.criosweb.ro/riot/ to compress images.
It is a simple, free tool recommended by Google for compressing images.
Also, as mentioned before, make sure the thumbnails you send to the listing pages are around 100px, and only if the user clicks to zoom in should you show the large versions, one by one. Even in the zoom view you should display at most a 1000px image, and that should be no more than 200-500 KB depending on format and quality.
Have you optimized your images (in terms of file size)? For example with Smush.it: http://www.smushit.com/ysmush.it/
You should really optimize your images. You can use a commercial product like Photoshop or use a free one like the GIMP to optimize the files.
Loading about 700 MB of images would take me a whole day!
Reduce the images to thumbnails, and when the user clicks on a thumbnail, load the original file. That will improve performance.

Resizing images on a website for efficiency

I currently have a website that aggregates images. The problem is that those images are very large, and even when I display them at smaller sizes, they still eat up a lot of computation.
I'm curious how to actually force a reduced size/quality without just smooshing the image into a <div> element.
Here is my site in question, you can see how 'laggy' it gets when you produce images:
http://newgameplus.nikuai.net/TEST/index.html
I was using timthumb.php to resize the images, but the host doesn't allow that script for some reason.
The best way to do this is to use some sort of image-refactoring service.
I have written my own, which uses ffmpeg and ImageMagick to resize images on the fly and to generate arbitrarily sized thumbnails from videos. I memcache the results to make subsequent requests super snappy, and have added some interesting extras such as automatic point-of-interest detection using face detection and image entropy, the aim being "nice thumbnails, no matter the size".
An example of such a service is src.sencha.io - the documentation for this service is here but I have included the important bits below.
Specify Image Size
<img src='http://src.sencha.io/320/200/http://yourdomain.com/path/to/image.jpg'
alt='My constrained image'
width='320'
height='200' />
This will take your image (http://yourdomain.com/path/to/image.jpg) and run it through the resizing service, returning a 320x200 version of your image. You cannot set the gravity/point of interest using this service, though (as far as I can tell).
You can also use this service to change formats, resize dataurls, do percentage resizes and use the resizing service to detect the width/height of the user agent requesting the image.
There are many such services available on the web.
I agree with slash: it depends on how the images are being resized. One thing I do for a site is use Photoshop (or GIMP) to resize the image to the exact dimensions I need for the page I'm using it on. Then I also include the same dimensions in the width/height attributes on the image itself.
Additionally, you can use your photo-editing software to check what size your image would be if you saved it with a different file extension, and (specifically with JPEG and PNG files) Photoshop will let you reduce the quality, which lowers the file size and speeds up page loading.

caching and resizing images without shrinking or stretching?

How do Facebook and other image-intensive sites maintain a thumbnail version of the full image without shrinking or distorting it?
Are these thumbs cropped versions of the originals, stored so that when a thumb is clicked it references the full-size image?
My images are stretched or shrunk if I simply try to confine them to a preset size in my img tag.
Ideally, I would like to crop each image to fit a preset size without distorting the aspect ratio. If this can be done on the fly, is it an efficient way to handle images in high volumes?
It is considered bad practice to resize images with your HTML markup or CSS styles. Scaling them up means bad quality; scaling them down means your users have to download a larger file than necessary, which hurts speed.
There are libraries built for image resizing and caching for almost any language or framework. They mostly feature cropping as well, in order to maintain a standard aspect ratio. For PHP, have a look at phpThumb, which is based on GD/ImageMagick.
The resulting resized versions of your images are saved in a cache folder, so they don't need to be regenerated every time the image is requested. This way, the performance is almost as good as serving static files. The only overhead is a small check if the resized version exists in the cache.
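A minimal sketch of that cache check, combined with an aspect-ratio-preserving centre crop in GD; the cache directory, filenames and the 200x200 thumbnail size are assumptions:

<?php
// Minimal sketch: serve a cached 200x200 thumbnail, generating it on
// first request with a centre crop (GD). Paths and size are assumptions.
$source = 'images/photo.jpg';
$cached = 'cache/thumb_200x200_photo.jpg';
$size   = 200;

if (!file_exists($cached)) {
    $src = imagecreatefromjpeg($source);
    $w   = imagesx($src);
    $h   = imagesy($src);

    // Crop the largest centred square, then scale it to 200x200,
    // so the aspect ratio is never distorted.
    $side = min($w, $h);
    $x    = (int) (($w - $side) / 2);
    $y    = (int) (($h - $side) / 2);

    $dst = imagecreatetruecolor($size, $size);
    imagecopyresampled($dst, $src, 0, 0, $x, $y, $size, $size, $side, $side);
    imagejpeg($dst, $cached, 85);
    imagedestroy($src);
    imagedestroy($dst);
}

header('Content-Type: image/jpeg');
readfile($cached);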
I can't speak directly for Facebook, but most sites upload a large image and then automatically recreate smaller, preset sizes (usually with a scripting language and some kind of library, like PHP/GD), saved with a similar filename pattern, so that you use as little bandwidth as possible, improve loading times, and avoid manipulating images with CSS.

Resize images vs thumbnails

I've read many articles and forum discussions, and I still can't figure out the difference between resizing a large image with PHP (not with HTML) and creating thumbnails. Apart from the fact that most people suggest phpThumb and ImageMagick for thumbnails, I don't understand why I should or shouldn't prefer those over PHP functions like imagecreatefromgif()/imagecreatefrompng()/imagecreatefromjpeg() and imagecopyresampled(), which can apparently do the same job. What I'm trying to do is create many galleries, so I want the images the user uploads to be resized to a smaller size if they are too large. Moreover, I need these images resized to smaller pictures that are shown in the gallery and get their real size when the user clicks on them. So far, the user can upload an image that is saved in a folder. I resize the images for the galleries with HTML (which I know is wrong), and I use fancybox for the clicking-and-enlarging part, which I'm satisfied with. Please help me understand the difference and give me your advice on the best thing to do. Thanks.
Take this scenario.
I'm a highly equipped professional photographer, and your photo application will store every image I take. I shoot in RAW, so my images are massive. I also want to upload the original hi-res JPEGs.
Now, if you do not resize these images to thumbnails (by whatever mechanism you choose), then when I try to view a particular album I might have to load ten 20 MB images, only to have them resized down to 80x60 pixels (and become totally noisy and pixelated).
So what can you do?
Resize the images on your server, using something like ImageMagick as you suggest, to generate multiple versions of each image.
This way you can deliver an optimized quality for every occasion.
You should always aim to display an image at its native resolution/dimensions to avoid scaling artifacts or resizing noise.
And you also want to save the different versions of the images.
imagecreatefromgif() and the other imagecreatefrom*() functions are part of the GD library.
Some reasons why one might prefer ImageMagick are detailed here and here.
That said, if you're happy with what you've got already, and there's no realistic chance of running into any GD dealbreakers later on, you may reasonably decide to stay with what you have.
