Optimization for a web page with hundreds of images - PHP

I have a web platform that creates single-page sites with lots of images and backgrounds (100-400 per page).
As you can guess, it loads terribly slowly (if the browser cache is disabled). The pages are mainly promo materials, so in general they will only be opened once (meaning the browser cache does not play a role).
I don't have much access to the front end, and it is not a single page but a platform for generating content, so I need some sort of server-side solution.
Lazy loading is not possible; we already have some custom postponed loading for half of the images.
We use a CDN server in production, but it does not help much (at least for a single user).
Is there some general solution or configuration that could help in this case?
For example, an Apache optimization that gives priority to images, or a way to bundle all the images into one (or a few) requests.

If (further) lazy loading is not possible, you could bundle rows of images into a single image to use as a sprite sheet, and display each one using the CSS background and background-position properties.
This would reduce the number of downloads required, but it requires more processing and storage space on the server side.
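Since the platform generates content server-side, the stitching could be done in PHP with the GD extension. A minimal sketch, assuming equal-sized PNG tiles laid out in a single row; the file names and paths are made up for illustration:

<?php
// build-sprite.php: stitch several equal-sized PNG tiles into one sprite sheet.
$tiles  = ['bg1.png', 'bg2.png', 'bg3.png'];   // hypothetical source images
$images = array_map('imagecreatefrompng', $tiles);

$tileW  = imagesx($images[0]);
$tileH  = imagesy($images[0]);
$sprite = imagecreatetruecolor($tileW * count($images), $tileH);

// preserve PNG transparency in the combined image
imagealphablending($sprite, false);
imagesavealpha($sprite, true);

foreach ($images as $i => $img) {
    imagecopy($sprite, $img, $i * $tileW, 0, 0, 0, $tileW, $tileH);
}
imagepng($sprite, 'sprite.png');

Each tile is then shown with background: url(sprite.png) no-repeat and a negative background-position offset (e.g. -200px 0 for the second tile if tiles are 200px wide), so a whole row of images costs a single request.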

Concept: Why not use data:image for all src besides feeds/API/utility? Is there a PHP script to do it?

Theorizing here about how to get lightning-fast media and prevent hotlinking: <img src="data:image-kj134332k4" /> comes to mind, among other ideas. Scrapers don't need our src, and real clients need instant loading (especially on cellular networks). Considering Google's recent HTTPS-everywhere move, this would drastically decrease handshakes as well.
What disadvantages are there to crafting lists such as e-commerce
categories/widgets/slideshows using data:image?
Are there any implications to the extra KB of actual source code, versus serving a vastly larger total page size?
Do y'all prefer one PHP data:image generation script over another for encoding images as data at certain controller levels (leaving standard src images in other areas)?
Are there caching/CDN concerns? Would the encoding break caching somehow? It seems not, but I'm no cache expert.
Any guidance or thoughts on this case are much appreciated. Thank you!
Generally, the idea is worth considering, but in most cases the problems outweigh the benefits.
It is true that these images won't be cached on the client side anymore, and Expires-based caching in particular saves you tons of bandwidth.
As a rule of thumb I'd say: if these are small images that change frequently, embedding is a good idea. If the images are larger and clients load the same image more than once in subsequent requests, by all means deliver the images separately and put some effort into caching.
As for the other points:
Most browsers support this; however, some old versions of IE don't, so think of a fallback solution or be ready to get bug reports (this may be negligible, depending on your user base).
The number of SSL handshakes is negligible if you're using HTTP keep-alive, which is standard. Follow-up requests do indeed require a new handshake, but if you cache properly (see the next point) and maybe put static files on a CDN, this is no problem.
Read about caching, especially the Expires/Cache-Control headers and their friends.
If you decide to embed, you don't really need a generator script; embedded images are just base64-encoded image files, and producing them shouldn't take more than 3 lines of code.
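For illustration, those few lines might look like this (the file path is hypothetical):

<?php
// Embed an image as a data: URI — essentially the "3 lines" mentioned above.
$path = 'images/logo.png'; // hypothetical file
$uri  = 'data:' . mime_content_type($path) . ';base64,' . base64_encode(file_get_contents($path));
echo '<img src="' . $uri . '" alt="logo">';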
However, if you process/convert your images in PHP, there's yet another disadvantage: instead of being served statically (maybe even from a different machine or CDN), the images have to live on the same machine and pass through the PHP engine, increasing the memory use of every process that serves a page containing them.

Create thumbnails on the fly with cache or on upload?

I'm currently rewriting a website that needs a lot of different sizes of each image. In the past I did this by creating thumbnails at all sizes on upload. But now I have doubts about its performance, because I have to change my design and half of my images are no longer the right size. So I'm considering 2 solutions:
Keep doing this and add a button in the backend to regenerate all the images. The problem is that I always need to know every size needed by every part of the site.
Only upload the full-size image and, when displaying it, point the src attribute at something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120". Then create the thumbnail and display it. My script would also check whether the thumbnail already exists; if it does, it doesn't need to be recreated and is just displayed. Every 5 days, a cron task would run a script to delete all the thumbnails (to be sure only the useful ones are kept).
I think the second solution is better, but I'm a little concerned by the fact that I need to call PHP every time an image is shown; even if the thumbnail has already been created, it's PHP that serves it...
Thanks for your advice.
Based on the original question and subsequent comments, it sounds like on-demand generation would suit you, as it doesn't sound like you have a demanding environment where download time to the end client must be absolutely minimized.
You already seem to have a grasp of the option of giving your <img> tags a src value that is a PHP script, with that script either serving a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it, so let me give you another option.
Generally speaking, using PHP to serve static resources is not a great idea as you begin to scale your site, for two reasons:
It adds the overhead of invoking PHP for requests that a basic web server like Apache or Nginx is far better optimized to handle. Your site will be able to handle less traffic per server, because it is spending extra memory, CPU, etc. on serving this static content.
It makes it hard to move those static resources into a single repository outside of the web server (such as a CDN). You would have to duplicate the files on each and every web server powering the site.
As such, my suggestion is to still serve the images as static files via the web server, but to generate missing thumbnails on the fly. To achieve this, create a custom redirect rule or 404 handler on the web server, so that requests in your thumbnail directory which do not match an existing thumbnail image are redirected to a PHP script that generates the thumbnail, caches it, and serves the image (without the browser even knowing). Future requests for that thumbnail are then served as a static image.
This scales quite nicely as, if in the future you have the need to move your static images to a single server (or CDN), you can just use an origin-pull mechanism to try to get the content from your main servers, which will auto-generate them via the same mechanism I just mentioned.
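As a rough sketch of that redirect approach, assuming Apache with mod_rewrite, the GD extension, JPEG sources, and a /thumbs/<width>x<height>/ layout (all illustrative choices, not a definitive implementation):

<?php
// make-thumb.php — invoked only when a thumbnail is missing. The matching
// Apache rule (illustrative) would be something like:
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^thumbs/(\d+)x(\d+)/(.+)$ /make-thumb.php?w=$1&h=$2&img=$3 [L]
$w   = (int)($_GET['w'] ?? 0);
$h   = (int)($_GET['h'] ?? 0);
$img = basename($_GET['img'] ?? '');          // basename() blocks ../ traversal
$src = __DIR__ . '/images/' . $img;

if ($w < 1 || $h < 1 || !is_file($src)) {
    http_response_code(404);
    exit;
}

$original = imagecreatefromjpeg($src);        // assumes JPEG sources
$thumb    = imagecreatetruecolor($w, $h);
imagecopyresampled($thumb, $original, 0, 0, 0, 0, $w, $h, imagesx($original), imagesy($original));

// write it to disk so every later request is served statically by the web server
$dir = __DIR__ . "/thumbs/{$w}x{$h}";
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}
imagejpeg($thumb, "$dir/$img", 85);

// and serve this first request directly
header('Content-Type: image/jpeg');
imagejpeg($thumb, null, 85);

From the browser's point of view the URL never changes, which is what lets you later put a CDN or origin-pull mechanism in front without touching the markup.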
Use the second option if you don't have much storage to spare, and the first if you don't have much CPU.
Or you can combine them: generate and store the image on the first request to the PHP thumbnail generator, and on subsequent requests just return the cached image.
With this solution you'll have only the necessary images, and if you want you can occasionally delete the older ones.

PHP gallery: file I/O or database I/O? Displaying a large volume of images on a website

I've been on a project for the past few days and hit a problem displaying large quantities of images (20+ GB total, ~1-2 GB per directory) in a gallery in one area of the site. The site is built on the Bootstrap framework. I've been trying to make massive carousels, which ultimately do not function fluidly due to the combined size of /images. Question A: in this situation, should I do I/O from a database and store the images there? Is this faster than an /images folder on the front end?
And B: in my PHP script I need to assign directories to variables, iterate through them, and display the images inside <li> elements, but how do I go about putting controls on memory usage so as to not overload the browser? Any additions, suggestions, or alternatives would be greatly appreciated. I'm looking for the most direct means to the end here.
Though the question is a little generic, here are some thoughts regarding your two questions:
A) No, performance pulling images from a database would most likely be worse than pulling straight from the file system. In general, it is not a good idea to store images or other binary data in databases unless you absolutely have to, because databases can't do much with this information and you are just adding an extra layer on top of the file system that doesn't need to be there. You would, however, want to store paths to images in your database, potentially along with other characteristics such as image dimensions, thumbnail paths, keywords, etc. Then your application would read the entries for the images to return the correct paths to the images.
B) You will almost certainly want to implement some sort of paging if you are displaying many hundreds or thousands of photos. If the final display must be a carousel, you will want to investigate the Javascript that drives it to determine how you could hook in a function that retrieves more results from your PHP application via an AJAX call when it reaches the end or near end of the current listing of images. If you are having problems with the browser crashing due to too many images, you will also want to remove images from the first part of the list of <li>s when you load new ones so that it keeps the DOM under control.
A) It's a bad idea to store that much binary data in a database. Even if the DB allows it, you shouldn't do it: it also consumes much more memory, because all your data is held in the database's memory space and then copied into PHP's memory space for you to handle, which eats up twice the memory, on top of the overhead of running a database server, querying it, and so on. So no, it's slower to use a database; accessing the filesystem directly is faster. If you also use Varnish or another front-end caching system, you'll be able to serve content much faster still.
What I would do is store the files on the filesystem. The best servers for handling static serving like that are G-WAN or NGINX, but do your reading and decide for yourself what suits you best. The point is: stay away from Apache, and preferably host all those static files on a separate server running a lightweight HTTP server.
Pro tip: save multiple copies of the same image at scaled-down sizes, for example one version at 50% and another at 25% of the original image size. This way you can send the thumbnails first for quick browsing, then when a user decides to view an image you serve the 50% or 100% size, depending on their screen size. You save yourself bandwidth and memory, and you also save mobile users a big 3G bill.
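A compact sketch of that idea with GD, assuming JPEG originals (the function and path names are made up):

<?php
// Write 50% and 25% copies next to the original, e.g. photo.jpg -> photo-50.jpg.
function makeScaledCopies(string $path): void {
    $src = imagecreatefromjpeg($path);
    $w = imagesx($src);
    $h = imagesy($src);
    foreach ([50, 25] as $pct) {
        $nw = max(1, (int)($w * $pct / 100));
        $nh = max(1, (int)($h * $pct / 100));
        $dst = imagecreatetruecolor($nw, $nh);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $nw, $nh, $w, $h);
        imagejpeg($dst, preg_replace('/\.jpe?g$/i', "-$pct.jpg", $path), 85);
    }
}

makeScaledCopies('photos/beach.jpg'); // hypothetical upload path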
B) This is where it makes some sense to use a database: you can index all the directories into a database and use it to store the location of each image in the filesystem, perhaps along with some tags, and maybe even the number of views, etc.
In the frontend, implement a script that fetches, for example, 50 thumbnails per page; the user can scroll around using some fancy jQuery, and when more are needed you simply fetch a new result set with 50 more thumbnails, and so on.
This way you save memory and bandwidth, and your users will thank you for such a lightweight browsing experience!
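A minimal sketch of such a paging endpoint, assuming a hypothetical images(id, thumb_path) table and PDO; the frontend would call it via AJAX and append the returned thumbnails:

<?php
// thumbs.php?page=0 — return one page of 50 thumbnail paths as JSON.
$perPage = 50;
$page    = max(0, (int)($_GET['page'] ?? 0));
$offset  = $page * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'pass'); // illustrative DSN
// $offset and $perPage are already cast to int, so inlining them here is safe
$rows = $pdo->query("SELECT id, thumb_path FROM images ORDER BY id LIMIT $offset, $perPage")
            ->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows);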
Another tip:
If you want to be able to handle bigger traffic, you might want to consider using a CDN; there are many CDN services that aren't as expensive as Amazon S3, and a simple search will give you tons of resources!
Happy hacking!

How to decrease the loading time of web pages?

I made a PHP website. It has 100 web pages, but when I open it, it takes a long time to load. It is a static website, not dynamic, but the content size of the pages is large, so it takes a long time to load in the browser.
What can I do to decrease the loading time? Please give me a solution.
There is a very useful tool for monitoring what you're asking about, named YSlow.
Have a look at it.
There are a whole variety of methods here:
If you are accessing a database look at optimising your queries, for example specify only the fields that you need in a SELECT query rather than using SELECT *
Employ some form of server-side caching. There are a number of solutions for PHP - see this site for more details http://www.sitepoint.com/caching-php-performance/
Use client-side (browser) caching by setting appropriate Cache HTTP headers (see http://www.mnot.net/cache_docs/ for more details)
Without further information about your site it's difficult to provide a more specific answer.
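To illustrate the browser-caching point above: for a response you're willing to let clients cache, the headers can be set from PHP before any output is sent (the one-week lifetime is just an example to tune):

<?php
// Tell browsers (and intermediaries) they may reuse this response for a week.
$maxAge = 7 * 24 * 3600;
header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');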
Test your site in Chrome.
It has a great feature which shows how long each element takes to load
(Ctrl+Shift+I, Timeline tab).
Short steps for full optimization:
1) Backend
Analyse and reduce the data-fetching time by using indexes and by reducing subqueries, temp tables, etc.
2) Frontend
Reduce large JS libraries,
image sizes,
and looping in PHP scripts (check page loading using a browser plugin).
Reduce the HTML size as well.
3) It may sound funny, but you also need to check your broadband and network capacity...
Once you have done all this, the pages will load well.
You should optimize your queries and database operations; always prefer to complete things in the minimum number of loops where possible.
Loading time is also affected by the page content; you should eliminate unnecessary images from your pages.
It is also affected by the server's speed, if you are running on a server.
Simple answer, if there's too much content, then reduce the content on the page! Install YSlow and follow its advice.
To be more specific, you need to apply some rules and show some self-control to keep loading times down. There's also stuff you can do on the PHP side, but we'll get to that later. On the client side, the following tips will help.
Remove any markup that isn't necessary. For example,
<div class="class 1>
<div class="class 2">
<div class="class 3">
<p> Hello, I'm the content</p>
</div>
</div>
</div>
With judicious use of CSS you can in most cases replace this with
<div class="class1 class2 class3">
<p>Hello, I'm the content!</p>
</div>
You could even ditch the div altogether, if it's only ever going to contain a single child.
<p class="class1 class2 class3">Hello, I'm the content!</p>
Images: a rule of thumb is that no image on a web page should exceed 100 KB in size. While there are exceptions, this is a good rule to stick to. If you have many or large images on your page, try optimizing them. Replace lossless formats with lossy ones (truecolour PNG with JPEG), replace older file formats with modern ones that compress better (GIF with non-truecolour PNG), lower the quality settings for JPEG, reduce the number of colours in PNG, and so on.
NEVER use BMP images on a web page!
You can speed up page loads by reducing the number of HTTP requests being made. Every asset on your page (image, stylesheet, JavaScript file, etc.) represents an HTTP request, and the HTTP/1.1 spec suggests browsers keep only 2 requests open per host at any one time (modern browsers allow a few more). Any additional requests are queued until the first ones have cleared.
You can reduce the number of requests by, for example, having a single stylesheet for your page instead of multiple ones (though be sensible here; some stuff is better kept in a separate sheet, such as IE fixes), using image sprites, combining JavaScript files together (again, be sensible), and so on.
One thing that won't speed up page load times, but will make pages feel more responsive, is to put all your JavaScript at the bottom of the page (just before the </body> tag), as loading JavaScript in the head or higher up in the body forces the browser to wait until the JS has been evaluated before rendering what comes after it.
On the server side, turn compression on, and make sure files are sent with suitable caching headers so the browser can cache images, stylesheets, JavaScript, etc.
Next, in PHP, optimize your code so that it generates output more quickly. The server can't start sending content to the client until the PHP script has generated it. This usually means optimizing SQL queries to execute faster.
Finally, if the pages don't change that much, have PHP cache a copy of the output to disk, and send the cached version on subsequent page loads. When the page content changes, have the PHP script delete the cached version. The fastest query is the one you don't have to run :)
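A minimal sketch of that disk-cache idea, with an hour's lifetime and a cache/ directory as illustrative choices:

<?php
// Serve a cached copy of the whole page if it's fresh enough; otherwise
// generate the page, store it, and send it.
$cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$ttl       = 3600; // seconds

if (is_file($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
    readfile($cacheFile); // cache hit: no queries run at all
    exit;
}

ob_start(); // capture everything the page prints

// ... the normal, expensive page generation goes here ...
echo '<p>Page content</p>';

if (!is_dir(__DIR__ . '/cache')) {
    mkdir(__DIR__ . '/cache', 0755, true);
}
file_put_contents($cacheFile, ob_get_contents()); // save for the next visitor
ob_end_flush(); // send the output to this visitor

Calling unlink($cacheFile) whenever the underlying content changes implements the delete-on-change step described above.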
To speed up the website, try the following:
Avoid extra HTTP requests: rather than requesting CSS, JS and other assets from elsewhere, include them in your project itself.
Avoid bulk requests to the database (to be more specific, e.g. SELECT *); include only the relevant data fields from the database table.
Optimize images.

How can I speed up image load time in my web site?

I am currently developing a web site with PHP + MySQL and jQuery. So far I have been doing it on my local machine. I notice that when I view the page, the images on it take some time to load (a short time, but it's visible). All the images are small (PNGs of less than 3 KB each). When I load the page, there are some database connections happening in order to fetch some data to display.
I am not sure whether this loading-time issue has to do with the number of images or with the time the PHP script and the DB connections take to execute. (I have very little data in my database, so I wouldn't assume the latter.)
My question is: is it a good approach to preload all the images at the beginning of each page? I tried it with jQuery and it works fine. I'm just not sure what disadvantages it brings. For example, to do so I need to include the jQuery library at the beginning of the page, which I thought was bad practice.
If these PNGs are stored in the database as BLOBs — not clear from your question — don't do that. Serving images from a DB through PHP is not as efficient as letting the web server serve them straight from the filesystem. If the images are tied to particular records, just name the PNG after the row ID, so you can find it in a directory dedicated to storing those images. The PHP code then just generates the URL that points to the PNG file on disk, so the web server can send them statically.
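For example (the column and directory names here are hypothetical), the PHP never touches the image bytes at all:

<?php
// Render <img> tags that point at static files named after the row ID;
// Apache/Nginx then serves the PNGs directly from disk.
foreach ($rows as $row) {
    $url = '/images/records/' . (int)$row['id'] . '.png';
    echo '<img src="' . $url . '" alt="">' . "\n";
}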
I don't think preloading the images within the same page is going to buy you anything. If anything, it might slow the apparent overall page load time because the browser can only retrieve a fixed number of resources concurrently, typically 2-4. Loading images at the top of the <body> means there are other things at the top of the page "above the fold" that have to wait for some HTTP connection slots to free up. Better to let the images load in their natural order.
Preloading makes sense in two situations:
The image isn't shown by default, but is expected to be needed as the user interacts with the page. Good examples of this are the hover and click state images for rollovers.
The image isn't used on this page, but will be needed on the next. Good examples of this are any site where there is a clear progression from one page to the next, like in a shopping cart.
Either way, do the preload at the very bottom of the <body>, so everything else loads first.
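A sketch of that second case in PHP, emitting the preload markup just before </body>; the image list is made up, and a CSS class that hides the images off-screen would work as well as the inline style:

<?php
// Warm the cache for assets the *next* page will need, without affecting layout.
$nextPageImages = ['/images/checkout-header.png', '/images/checkout-bg.png'];
foreach ($nextPageImages as $src) {
    echo '<img src="' . htmlspecialchars($src) . '" alt="" width="1" height="1"'
       . ' style="position:absolute;left:-9999px">' . "\n";
}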
Having addressed those two issues, run YSlow on your site. It started out as a plugin for Firebug, which in turn is a plugin for Firefox, but it's since been ported to all major browsers except IE.
The beauty of YSlow is that it detects common slowdowns automatically, just by loading the page while the extension is active. It then gives you a clear grade for the page, so you can judge when you're "done" optimizing. If you're below an A, you're not done yet. :) It's not uncommon to see sites rating a D or worse, because the default configuration for web servers is conservative to avoid causing problems. Fixing YSlow warnings is generally pretty easy, but you have to be careful to avoid creating caching and other problems, which is why the default server config doesn't do these things.
Another answer recommended the Google PageSpeed offering. It's available as a plugin for Chrome and Firefox, as a server-side Apache module, and as a Google-hosted service. I have no idea how it compares to YSlow, but it looks interesting.
Also consider using the browser's debugger to get a waterfall graph of resource load times:
In Firebug you get this in the Net tab.
In Safari, you get to it via the Develop menu, which is normally disabled in Preferences. Turn it on if needed, then say Develop > Start Timeline Recording. That puts you into the Network Requests instrument. You can also get to it through Develop > Show Web Inspector.
In Chrome, say View > Developer > Developer Tools, then go to the Network tab.
IE has a very weak form of this, via Tools > Developer Tools > Profiler. It just gives a table of numbers, rather than a waterfall graph, so the information is there, but you can't just visually scan for long bars to find the slowest resources.
You should use the PageSpeed plugin from Google to check which data takes the most time to load. It will show separate load times for images as well.
If you're using lots of small PNGs, I suggest combining them into one image and controlling the display via the CSS background properties, since they are part of the styling and not the information. In that case, instead of a few images, only one will be loaded and reused through all the elements, and even preloading that one image is really easy.
Have you considered using CSS Sprites to combine all of your images into a single download? There are a number of tools online to help you do this, and it will significantly reduce the number of HTTP requests required by your page.
Make sure you have set the correct Expires header on your images to allow them to be cached.
Finally, take a look at YSlow, which can provide you with further optimisation tips.
