Image rendering speed optimization - php

I've got an application I'm building with PHP which pulls photos from a database according to however many images there are for a given plant species. The script resizes the otherwise large photos to 100x100 on the fly. This process REALLY takes a bite out of page load time, and my computer's CPU hits 100% and works quite hard.
I think it's because all the images are loading at once... Is there a way to have them load only when the previous one has finished? Or is there a more efficient way of rendering images like this? Here is the snippet that loads 'em:
$imagesArray = explode(", ", $images);
unset($imagesArray[count($imagesArray) - 1]); // get rid of the last array key, which is blank
echo '<tr><td>Images:</td><td>';
foreach ($imagesArray as $imgloc) {
    echo '<a target="_blank" href="plant_images/'.$imgloc.'"><img src="plant_images/'.$imgloc.'" width="100" height="100" alt="'.$row[2].'" title="'.$row[2].'" /></a> ';
}
Here is a screenshot of a partially loaded image on the page (this is a lot better than what happens other times! Seriously, some species have 10-12 images, and my computer takes about 15 seconds to load the page, painfully):
http://www.captainscall.site11.com/temp_stuff/slow-img.png
I found this already, mildly helpful.
Thank you kindly,
Khanahk

The user's browser will usually cache the images, so the user will only experience slow loading the first time he/she visits the page.
However, you should consider having thumbnails of all the images being displayed. You don't need to make these thumbnails yourself; in 2013 we have computers to do that.
A quick Google search will probably give you some software that can make resized copies of all your pictures in no time, or if you know some coding you can write your own script to do this. You can, for instance, use Imagick in PHP (see http://php.net/manual/en/imagick.scaleimage.php). Then just store two sizes of each image on your server: one set of thumbnails and one set at higher resolution.
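For instance, here is a minimal sketch of such a script using Imagick's scaleImage (the directory names are just assumptions for illustration):
foreach (glob('plant_images/*.jpg') as $file) {
    $thumbFile = 'plant_images/thumbs/' . basename($file);
    if (is_file($thumbFile)) {
        continue; // thumbnail already exists
    }
    $img = new Imagick($file);
    $img->scaleImage(100, 100, true); // bestfit = true keeps the aspect ratio within 100x100
    $img->writeImage($thumbFile);
    $img->destroy();
}
Run it once offline (and again whenever new photos are added), then point your <img> tags at the thumbs directory.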
One advantage of doing this is that you will decrease the outgoing traffic from your server (if your server has a lot of unique visitors, there will be a lot of traffic due to the size of the images). Your users will also experience shorter loading times (as you said, waiting for images to load is boring).
You could probably tell the browser with JavaScript in what order to load the images, but that wouldn't solve your problem, which mainly is that your images are too big.

I think you should:
Server side: cache an image once it has been generated, so that the next time you can serve the cached version. You can write the generated thumbnail to a physical image file for later use.
Client side: you can use a library to lazy-load your images, for example:
http://www.appelsiini.net/projects/lazyload
You can also implement a JavaScript function that loads each image after a specific delay.
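For example, a rough sketch of what the markup could look like with that plugin (the placeholder image and paths are hypothetical; check the plugin's docs for the exact API):
echo '<script src="jquery.js"></script>';
echo '<script src="jquery.lazyload.js"></script>';
foreach ($imagesArray as $imgloc) {
    // the plugin copies data-original into src once the image scrolls into view
    echo '<img class="lazy" src="grey.gif" data-original="plant_images/'.$imgloc.'" width="100" height="100" />';
}
echo '<script>$(function () { $("img.lazy").lazyload(); });</script>';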

Related

How can I restrict a user to see only the slideshow and not download it

I am developing a CakePHP application in which I want to show a PowerPoint slideshow to the end user, but the condition is that the user should only be able to see the show, not download it.
Can anyone please suggest the best way to do this?
If the slideshow is based on images, you can split each image into 9, 16, or more squares and display the tiled image. That way, if the user decides to 'Save as' an image, he will get only 1/9th or 1/16th of the real image. If the slideshow is quite big, it will be a pain to put all the pieces together, which will discourage users from trying to save the slides.
You can see such an implementation here - http://whatismycar.com/info/16540/ - the four images below the header are in a fancybox, and if you try to 'Save as' one of them you will save only a small tile of the original image.
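If you want to generate the tiles server-side, a rough sketch with PHP's GD functions might look like this (a 3x3 grid; the file names are hypothetical):
$src  = imagecreatefromjpeg('slide.jpg');
$cols = 3;
$rows = 3;
$tw = (int) (imagesx($src) / $cols); // tile width
$th = (int) (imagesy($src) / $rows); // tile height
for ($r = 0; $r < $rows; $r++) {
    for ($c = 0; $c < $cols; $c++) {
        $tile = imagecreatetruecolor($tw, $th);
        imagecopy($tile, $src, 0, 0, $c * $tw, $r * $th, $tw, $th);
        imagejpeg($tile, "tiles/slide_{$r}_{$c}.jpg", 85);
        imagedestroy($tile);
    }
}
imagedestroy($src);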
Hope this helps.
It is impossible to prevent images from being downloaded from the internet, but you can make it hard for users with this approach. You can also hide the source HTML image path with PHP; check it here.
While I am no expert on this subject, something worth noting is what YouTube seems to be doing.
Ever notice how the whole video never loads if you pause it?
If you monitor the network tab during a video, you will see that they are actually making hundreds or even thousands of requests for video segments from their server, most likely using JS to clear the cache of parts you've watched.
^^ This is why going back to an earlier point in the video causes it to stall for a bit while it re-downloads the segment you wish to see.
At the end of the day, PrtScn will trump all of your efforts, because the web browser does not have the privilege to control the keyboard outside of its own environment.

Is there a way to display externally linked images at a consistent resolution?

I'm working on a site that displays images from a variety of other sites by having image elements that link directly to those sites. Some of the images are high resolution and some of them are not. When many high-res images are displayed, the site's performance goes down. Is there any way to display the externally linked images at a consistent (reasonable) resolution so that less bandwidth is consumed and site performance does not falter?
No, not really. If these are images out of your control and you're simply linking to them, there is not much you can do.
One option is to create some kind of pre-loader with a progress bar. You'd have to download all the images, process them into smaller images, then display them. However, you're not really saving any time doing this, since you'll still be downloading the images in the first place.
You just need a class name on the elements that the images are going into. You probably have a bunch of divs with different-sized images in them. You need <div class="mySize">...</div> and then .mySize {width:100px; height:100px;}.

Load speed on an image-intensive webpage

I'm currently working on a portfolio site wherein I would have to load a large quantity of photos on the same page.
These images are loaded dynamically through PHP, and at the moment I have opted to save thumbnail versions of these images beforehand and load these.
My concern, however, is that this might not be an optimal solution should the site have a large number of users: I would basically be storing a duplicate of every image users have uploaded.
My question for you is whether there are better solutions to this problem: a way to load the page as fast as possible without sacrificing too much space?
Thanks a ton.
Loading a webpage with lots of images will be slow, simply because of the bandwidth needed to transfer them.
You have a couple of options:
Load full images in tiled mode. These are the full images, just resized by the browser to fit a "thumbnail" view. The advantage is that you only store one copy of each image; the downside is that the full-sized image still has to be transferred, so it will take a long time to load.
Load the thumbnails as you said you are doing. The advantage to this is performance, but you need to store two copies of each image. Also, depending on how you tackle the thumbnail creation, you may require users to upload two copies of the image to provide their own thumbnail... which could stink.
Load thumbnails, but dynamically generate them on upload. You are essentially keeping two copies of each image on disk, but you create the thumbnail automatically through some PHP image-modification API. This will load faster, but still eats up disk space. It also minimizes user/administrative requirements to provide a thumbnail.
Load thumbnails on-demand, as the page is requested. This approach would take some testing, as I've never tried it. Basically, you would invoke the PHP image-modification API (or better yet, out-source to a native solution!) to create a one-time-use (or cached) thumbnail to be used. You might say "OMG, that'll take so long!". I think this approach might actually be usable if you apply an appropriate caching mechanism, so you aren't constantly recreating the same thumbnails. It will keep down on bandwidth, and since the limiting factor here is the network connection, it might be faster than just sending the full images (since the limiting factor in creating the thumbnails would be CPU/memory/hard disk).
I think #4 is an interesting concept, and might be worth exploring.
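As a minimal sketch of #4 using GD (all file and directory names here are hypothetical, and it assumes JPEGs), a thumb.php requested as <img src="thumb.php?img=foo.jpg" /> could look like:
$name  = basename($_GET['img']);  // basename() blocks directory traversal
$src   = 'images/' . $name;
$cache = 'cache/' . $name;
if (!is_file($cache)) {           // only resize on a cache miss
    $full  = imagecreatefromjpeg($src);
    $thumb = imagecreatetruecolor(100, 100);
    imagecopyresampled($thumb, $full, 0, 0, 0, 0, 100, 100, imagesx($full), imagesy($full));
    imagejpeg($thumb, $cache, 85);
    imagedestroy($full);
    imagedestroy($thumb);
}
header('Content-Type: image/jpeg');
readfile($cache);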
What I think is:
A. Take advantage of caching (system cache, cache headers, HTTP cache... all the caches).
B. Don't generate the thumbnail every time.
C. Use a job-queuing system such as Gearman or beanstalkd to generate thumbs, so that you don't have to do it instantly.
D. Use Imagick; it is more efficient.
E. Paginate.
F. For example, only generate the thumbnail when the original file has been modified:
$file = "a.jpg" ;
$thumbFile = "a.thumb.jpg" ;
$createThumb = true;
if(is_file($thumbFile))
{
if((filemtime($file) - 10) < filemtime($thumbFile));
{
$createThumb = false;
}
}
if($createThumb === true)
{
$thumb = new Imagick();
$thumb->readImage($file);
$thumb->thumbnailImage(50, null);
$thumb->writeImage($thumbFile);
$thumb->destroy();
}
Consider using a sprite or a collage of all the images, so that only one larger image is loaded, saving bandwidth and decreasing page load time.
Also, as suggested already, pagination and async loading can improve things somewhat.
References:
http://css-tricks.com/css-sprites/
http://en.wikipedia.org/wiki/Sprite_(computer_graphics)#Sprites_by_CSS
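If you go the collage route, here is a rough GD sketch that stitches existing 100x100 thumbnails into one horizontal strip (the paths are just placeholders):
$files = glob('thumbs/*.jpg');
$sheet = imagecreatetruecolor(100 * count($files), 100);
foreach ($files as $i => $file) {
    $thumb = imagecreatefromjpeg($file);
    imagecopy($sheet, $thumb, $i * 100, 0, 0, 0, 100, 100); // place each thumb side by side
    imagedestroy($thumb);
}
imagejpeg($sheet, 'sprite.jpg', 85);
imagedestroy($sheet);
Each thumbnail is then displayed by setting the sprite as a background image and shifting background-position, as described in the css-tricks link above.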
Yes, caching the thumbnails is a good idea, and will work out fine when done right. This kind of thing is a great place for horizontal scaling.
You should look into using a CDN, or possibly implementing something similar yourself. The caching system will generate images of commonly requested sizes, and groom those off if they become infrequently requested at a later time. You could also scale horizontally and transplant this service to another server or vserver in your network or cloud.
It is a disk-intensive and bandwidth-intensive thing, image delivery. Not as bad as video!
Generate thumbnails of large images on the fly (some hint).
Load images asynchronously (try JAIL).
Paginate.
Caching is also an option, though it only works from the second time the site is loaded.

How to load external images faster?

I have a database that stores items and a link to an image for every item. The images can't be stored locally (there wouldn't be enough space, among other reasons), so I link to images on other servers.
When I want to display the items (with their images), I run a while loop with mysql_fetch_array like this (simplified):
$result = mysql_query("SELECT * FROM items");
while ($row = mysql_fetch_array($result)) {
    echo "<h1>".$row['name']."</h1>";
    echo "<img width='200' src='".$row['img']."' />";
}
But the images take forever to load, because some of the external images are large, while I only need them 200px wide.
Is there a way to shrink the images 'on the go', or something else to speed up the loading?
(Using PHP/HTML.)
Thanks, Mike.
Anything you do "on the go" will be even slower, since for you to resize the image you have to first download it.
You can resize the image and save a smaller version, doing so is pretty straightforward. There are plenty of examples in the documentation and comments here.
But you only get a benefit out of it if you save the smaller images and serve them from your server (or from somewhere else you can put them, like Amazon S3) on future requests.
You can write a quick script that makes a smaller version of every image your database references, run it offline, and then start serving the smaller copies instead of writing <img> tags pointing to the remote versions.
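A rough sketch of such a script, using GD and the same mysql_* style as the question (it assumes the items table has an id column, and local_images/ is a made-up destination):
$result = mysql_query("SELECT id, img FROM items");
while ($row = mysql_fetch_array($result)) {
    $data = @file_get_contents($row['img']); // download the remote image
    if ($data === false) {
        continue; // skip unreachable images
    }
    $full = imagecreatefromstring($data);
    $w = imagesx($full);
    $h = imagesy($full);
    $nh = (int) (200 * $h / $w); // scale the height to match a 200px width
    $small = imagecreatetruecolor(200, $nh);
    imagecopyresampled($small, $full, 0, 0, 0, 0, 200, $nh, $w, $h);
    imagejpeg($small, 'local_images/' . $row['id'] . '.jpg', 85);
    imagedestroy($full);
    imagedestroy($small);
}
Afterwards your <img> tags can point at local_images/ID.jpg instead of the remote URLs.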
No.
The image is a certain (file) size, and it is stored somewhere that you don't control.
In order to reduce the size, you have to transfer the data to somewhere you do control, and that takes more or less the same time as it would take to transfer it to the browser (after which you have to spend time editing the image, and then time transferring it to the browser).
There is no direct way of resizing the images on the fly, since it would require a lot of CPU computation; it is simpler to serve the full-res image and let the browser resize it.
If you decide you want to resize the images on the server and then serve lower-res versions, you can use this PHP class. It is pretty simple to use and works pretty fast.
Hope this helps.
If you own the external location of the images and have a lot of them, I would create a scale function, e.g.:
"domain.com/yourimage?w=200"
where "w" is the desired width, telling the server to resize the image before you retrieve it.

PHP/JS - Create thumbnails on the fly or store as files

For an image hosting web application:
For my stored images, is it feasible to create thumbnails on the fly using PHP (or whatever), or should I save 1 or more different sized thumbnails to disk and just load those?
Any help is appreciated.
Save thumbnails to disk. Image processing takes a lot of resources and, depending on the size of the image, might exceed the default allowed memory limit for PHP. It is less of a concern if you have your own server with only your application running, but it still takes a lot of CPU power and memory to resize images. If you're considering creating thumbnails on the fly anyway, you don't have to change much: upon the first request, create the thumbnail from the source file, save it to disk, and on subsequent requests just read it off the disk.
I use phpThumb, as it's the best of both worlds. You can create thumbnails on the fly, but it automatically caches the images to speed up future requests. It creates a nice wrapper around the GD and ImageMagick libraries. Worth a look!
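For instance, the typical way to use it is through its query-string API, something like this (the exact parameters depend on your setup; see the phpThumb docs):
echo '<img src="/phpThumb.php?src=/images/photo.jpg&w=100" alt="thumbnail" />';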
It would be much better to cache the thumbnails. Generating them on the fly would be very taxing on the system.
It depends on the usage pattern of the site, but, basically, how many times do you expect each image to be viewed?
In the case of thumbnails, they're most likely to be around for quite a while (the image is uploaded once and never changed, so the thumbnail doesn't change either), so it's generally worthwhile to generate when the full image is uploaded and store them for later. Unless the site is completely dead, they'll be viewed many (hundreds or thousands of) times over their lifetime and disk is a lot cheaper than latency these days. This also becomes more significant as load on the server increases, of course.
Conversely, for something like stock charts that get updated every hour (if not more frequently), that would be a situation where you'd do better to create them on the fly, so as to avoid wasting CPU time on constantly generating images which no user will ever see.
Or, if you want to get fancy, you can optimize to handle either access pattern by generating the images on the fly the first time they're needed and then showing the pre-generated one afterwards, up until the data it's generated from changes, at which point you delete it so that it will be regenerated the next time it's needed. But that would be overkill for something as static as thumbnails, IMO.
Check out the GD library and ImageMagick.
