My website shows the same image twice, one normal and one blurred, and I'm wondering which method is better in terms of speed: create two images at upload time (one normal image and one blurred copy saved to the server), or upload only one image and blur the second one on the fly using GD?
If you're using GD, I would do it at upload and save them as flat files.
Apache and other web servers can serve flat files remarkably fast.
However, I would look into using http://www.graphicsmagick.org/ for the image manipulation. It's much faster and more efficient than ImageMagick, and most certainly than PHP's GD.
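If you do go the at-upload route with GD, a minimal sketch could look like this (paths are placeholders, and JPEG input is assumed):

    <?php
    // Generate the blurred copy once, at upload time, so both versions
    // can then be served as plain static files.
    $img = imagecreatefromjpeg('/uploads/photo.jpg'); // placeholder path

    // GD's Gaussian blur is subtle, so apply it several times.
    for ($i = 0; $i < 10; $i++) {
        imagefilter($img, IMG_FILTER_GAUSSIAN_BLUR);
    }

    imagejpeg($img, '/uploads/photo-blurred.jpg', 85);
    imagedestroy($img);

After this runs once per upload, the web server only ever hands out static files, which is the fast path.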
I have a PHP web application which has a gallery.
This gallery uses a justified JavaScript layout.
It then uses timthumb.php to resize the images without saving the resized copies on the server.
I would like to know which one would be better:
loading all the images through timthumb.php, or
saving the resized images in a server cache folder and loading all the images from that cache folder.
I have tried both methods. Strangely, the second one is slower than the first on the first load.
Thank you for all the help.
Lynn
Timthumb tends to have security issues, and either way image processing requires a great deal of RAM, so having cache folders is the best option. Notice that I said folders, not a cache folder. On IIS or any Windows-based server you will run into slowness accessing folders that hold more than a few thousand files. Linux has the same problem, but not until a folder holds a few hundred thousand files. Either way, if you're dealing with millions of images it is best to categorize them into separate folders so you don't end up with slowdowns from the OS trying to locate each file.
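For illustration, one way to shard a cache across folders is by the first characters of a hash of the file name (the cache root is a placeholder):

    <?php
    // Derive a two-level folder shard from the file name so no single
    // directory accumulates more than a few thousand files.
    function cachePath($fileName, $cacheRoot = '/var/cache/images')
    {
        $hash = md5($fileName);
        $dir  = $cacheRoot . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true); // create the shard folders on demand
        }
        return $dir . '/' . $fileName;
    }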
Honestly, I do not have much experience with timthumb.php.
Saving the photos in a server cache folder seems to be the better idea, though. You can save the path of each image in your datasource (normally a relational database), and then when retrieving the photos, serve them from the cache folder.
It's possible that your cache is getting rebuilt again and again, and that's why the first load takes some time.
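For illustration, storing and reading back the cached path might look like this (table and column names are made up):

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'pass'); // placeholder credentials

    // At upload time, after resizing: remember where the cached copy lives.
    $insert = $pdo->prepare('INSERT INTO images (original, cached_path) VALUES (?, ?)');
    $insert->execute(array('photo.jpg', '/cache/photo_800.jpg'));

    // At display time: read the path back instead of resizing again.
    $select = $pdo->prepare('SELECT cached_path FROM images WHERE original = ?');
    $select->execute(array('photo.jpg'));
    $row = $select->fetch(PDO::FETCH_ASSOC);
    echo '<img src="' . htmlspecialchars($row['cached_path']) . '" alt="">';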
I'm using a Symfony2 bundle from Gregwar to resize up to 12 images at a time from a user upload: https://github.com/Gregwar/ImageBundle
I'm resizing them to four different sizes, as these are needed for mobile, desktop, thumbnails, etc. Each resize takes time, of course, and with 12 x 4 resizes it can easily take over 30 seconds. That is beyond the default PHP timeout and isn't an acceptable wait for an end user.
I want the resizing to be done at upload so the new sizes are available immediately to the user. Rather than later as a batch process.
This seems like it may be a common problem. So what can I do to improve my situation?
Should I use a different library?
Or reduce my image sizes from four, down to perhaps two to improve processing speed but sacrifice user experience?
Is this normal? Could it be a hardware issue? On my local machine it's even slower.
PHP memory is set to 256MB. I use a ServerGrove VPS, with PHP5.3.
I have moved away from trying to solve this on my own server and now use a dedicated Amazon EC2 instance to resize images.
Similar to this approach http://sumitbirla.com/2011/11/how-to-build-a-scalable-caching-resizing-image-server/
If the GD extension is installed on your server, you can use the imagecopyresized function.
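A small usage sketch (paths and target width are placeholders):

    <?php
    // Resize a JPEG to a fixed width with GD's imagecopyresized.
    $src = imagecreatefromjpeg('/uploads/original.jpg'); // placeholder path
    $w   = imagesx($src);
    $h   = imagesy($src);

    $newW = 400;                      // target width (example value)
    $newH = (int) ($h * $newW / $w);  // keep the aspect ratio

    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresized($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, '/uploads/resized.jpg', 85);

    imagedestroy($src);
    imagedestroy($dst);

If quality matters more than raw speed, imagecopyresampled takes the same arguments and produces smoother results.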
I use the Imagine library and I would like to know if it is possible to compress images with it.
Another question:
Do you think it's a good solution to compress a lot of images (around 30) with the library from a command-line script?
You can save the images at a lower quality to compress them:
To save at 50% quality:
$imagine->open('/path/to/image.jpg')->save('/path/to/image.jpg', array('quality' => 50));
And as for whether or not to do the compression from the command line: you can, but I don't recommend it. Image manipulation takes a lot of CPU and RAM, so I suggest you either download the images and manipulate them on your own computer (not on the production server), or do it in a PHP script but limit the number of images compressed per run.
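If you do run it as a script, a rough sketch of such a limited batch with Imagine's GD driver might look like this (the directory and the limit are arbitrary):

    <?php
    require 'vendor/autoload.php';

    $imagine = new Imagine\Gd\Imagine();
    $limit   = 30; // cap the batch so one run doesn't exhaust CPU/RAM

    foreach (glob('/path/to/images/*.jpg') as $i => $path) { // placeholder directory
        if ($i >= $limit) {
            break;
        }
        // Re-save in place at 50% quality, as in the snippet above.
        $imagine->open($path)->save($path, array('quality' => 50));
    }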
I process uploaded images in PHP and save them (after resizing) with imagejpeg. As far as I can tell, imagejpeg is the best PHP function for compressing JPEG images to reduce file size. However, when I check my website with Google Page Speed, it says all of my images could be compressed a further 4-10%.
What is the common method to compress images to meet the Google standard?
Google's "standard" is the maximum compression Google expects to be possible. You need to use highly optimized image compressors that do nothing but image compression to reach that maximum.
You can, for example, open your JPEG file in an image editor like Adobe Photoshop and work out the best possible compression yourself while keeping visual control. Highly recommended.
The GD library provides a standards-conformant JPEG compression which should match a library user's expectations, but which might not satisfy a graphic designer (and/or Google, by 4-10%).
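One common way to close that last 4-10% is to run a dedicated optimizer over the files GD writes, for example jpegoptim; a sketch, assuming the jpegoptim binary is available on the server:

    <?php
    $path  = '/uploads/photo.jpg'; // placeholder path
    $image = imagecreatefromjpeg($path);

    // Write the image with GD as usual...
    imagejpeg($image, $path, 85);
    imagedestroy($image);

    // ...then let jpegoptim strip metadata and losslessly re-optimize
    // the Huffman tables, which GD's encoder does not do.
    exec('jpegoptim --strip-all --max=85 ' . escapeshellarg($path));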
I am the tech intern for an online independent newspaper, and the writers on the staff are not tech-savvy. They don't quite understand how web pages work, and often they upload and include images straight from their digital cameras, or scanned from original media. These images become a burden when there are 10 of them on the front page at 3.5 MB each.
We are trying to work out some sort of training method for teaching them how to resize and optimize the images they want to include in their articles, but like I said, they are not very tech savvy, and any method we attempt to employ may go way over their heads.
So, I wanted to know if it is beyond reason to attempt to resample and cache the images included in articles using a PHP function and the GD library, in order to streamline the amount of data that has to be transferred per article.
I think it's possible, I'm just trying to figure out if it would be worth it to just take the time and effort to train the writers, or if creating an automated process would be better.
You'd be better off doing the GD image processing during the upload process. GD can take up quite a bit of resources, so processing each image on every request would not be a preferable solution. If you can't do it during the upload process, you should cache all the resampled images and use those if/when available.
It's certainly possible, and I'd be very surprised if Joomla! doesn't already have modules that do just that.
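As a sketch of that resample-and-cache idea with GD (the helper name and paths are invented for illustration):

    <?php
    // Return the path of a resampled copy, generating it only once.
    function cachedResize($srcPath, $maxWidth, $cacheDir = '/cache')
    {
        $cached = $cacheDir . '/' . $maxWidth . '_' . basename($srcPath);
        if (file_exists($cached)) {
            return $cached; // serve the previously generated copy
        }

        $src = imagecreatefromjpeg($srcPath);
        $w   = imagesx($src);
        $h   = imagesy($src);

        $newW = min($maxWidth, $w);          // never upscale
        $newH = (int) ($h * $newW / $w);     // keep the aspect ratio

        $dst = imagecreatetruecolor($newW, $newH);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
        imagejpeg($dst, $cached, 85);

        imagedestroy($src);
        imagedestroy($dst);
        return $cached;
    }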
With the current web site I'm working on I needed to answer a similar question. I've opted for the Joomla add-on Easy Gallery. The two stand-out features for me are the automated thumbnail creation and the image resize feature. The sizes are configurable on the component's configuration page. You'll have a thumbnail, a resized image and the original image with each upload.
This is a Joomla 1.0 component, so if you're running a Joomla 1.5 install you will need to turn on legacy mode. Work seems to be in progress on a native 1.5 version.
I also found a couple of places where the generated thumbnails weren't being displayed. I raised the question on the Easy Gallery forum and managed to work out the answer myself.