Today I read some articles about combining multiple JS and CSS files into one in order to lower the bandwidth and the number of HTTP requests. Is there an analogous situation with PHP files? Is it better to create master pages that include a few PHP files (header.php, footer.php, etc.) in order to have well-formatted, readable code than to have just one large index.php?
PHP is run on the server, and as such, only the resulting HTML is sent to the client. Keep your files separated for clarity; it doesn't make a difference to the HTTP requests or bandwidth.
Like Xeon06 mentioned, PHP is a server-side script and doesn't affect bandwidth, which is the size of the content transferred over HTTP. Bandwidth use is determined by HTML, CSS, JS, graphics, Flash and other client-side files. Because browsers cache these kinds of files, bandwidth is reduced: the browser gets the file content from its cache instead of downloading it from the server again.
If I were in your shoes, I would still use includes for headers, footers or whatever is common to most pages. It just makes it easier to maintain content and code. For example, if you have a feedback form on all pages and you want to edit a field name, link or something else, it's easier to edit one file than to "Find and Replace" through every file you have, which doesn't work well anyway.
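For illustration, a minimal sketch of that layout (the file names here are just conventional examples, not anything required):

<?php
// index.php - the shared parts of every page live in their own files
include 'header.php'; // doctype, <head>, site navigation
?>
<p>Page-specific content goes here.</p>
<?php
include 'footer.php'; // closing markup, shared scripts
?>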
Related
I know that serving multiple small files is much slower than serving one larger file, which is why it's good to use a single CSS document as well as sprite sheets. For a while now I've also tried to combine my JavaScript into as few files as possible, to avoid multiple requests from the viewer, but having a variety of clearly different tasks in the same document gets confusing and messy.
I've been wondering if using PHP to combine a large number of JavaScript files into a single file, and then serving that with the content-type set to application/x-javascript, would get around this problem.
I'm assuming that because the server manages retrieving those files, the viewer will only request a single file. However, I have minimal knowledge of how the server deals with that, and whether it's going to end up being the same issue the other way around (and just as slow). I have a feeling that because the JavaScript is all hosted in the same place as the PHP, it shouldn't be.
Will I receive the same benefit of only having a single JavaScript file if I actually have multiple files and serve them as a single document via PHP?
You get the benefit of a single HTTP request to retrieve the JS file, so the browser experience will be faster (at low traffic levels, at least), but your server will be working much harder to execute the PHP code instead of just serving up static files. That may be fine if you don't get much traffic, but for best results you should combine this technique with a cache layer in front of the PHP.
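As a rough sketch of the combining script (the file list is hypothetical, and in practice you'd put the cache layer the answer mentions in front of it):

<?php
// combine.php - concatenate several JS files into one response
header('Content-Type: application/x-javascript');
header('Cache-Control: public, max-age=86400'); // let the browser keep it for a day

$files = array('jquery.js', 'validate.js', 'site.js'); // hypothetical file list
foreach ($files as $file) {
    readfile('js/' . $file);
    echo ";\n"; // guard against files that lack a trailing semicolon
}
?>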
I made a PHP website. It has 100 web pages, but it takes a long time to load. This is a static website, not a dynamic one, but the content on the pages is large, so it takes a long time to load in the browser.
What can I do to decrease the loading time? Please give me a solution.
There is a very useful tool for monitoring exactly what you're asking about, called YSlow.
Have a look at it.
There are a whole variety of methods here:
If you are accessing a database, look at optimising your queries; for example, specify only the fields that you need in a SELECT query rather than using SELECT *
Employ some form of server-side caching. There are a number of solutions for PHP - see this site for more details http://www.sitepoint.com/caching-php-performance/
Use client-side (browser) caching by setting appropriate Cache HTTP headers (see http://www.mnot.net/cache_docs/ for more details); a short sketch follows this list
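For the browser-caching point, a minimal PHP sketch (the one-week lifetime is only an example value):

<?php
// Tell the browser it may reuse this response for a week
header('Cache-Control: public, max-age=604800');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');
?>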
Without further information about your site it's difficult to provide a more specific answer.
Test your site in Chrome.
It has a great feature which shows how long elements take to load.
(Ctrl+Shift+I, Timeline)
Short steps for full optimization are:
1) Backend
Analyse and reduce the data-fetching time by using indexes, and by reducing subqueries, temporary tables, etc. (see the timing sketch after this list).
2) Frontend
Reduce large JS library scripts.
Reduce image sizes.
Check the looping in your PHP scripts (inspect page loading using a browser plugin).
Reduce the HTML size as well.
3) It may sound funny, but it also needs checking: look at your broadband and network capacity...
Once you have done all of those things, the pages will load well.
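If you want to see where the back end spends its time, a crude timing sketch (assuming an existing PDO connection in $pdo; the query itself is made up):

<?php
$start = microtime(true);
$rows = $pdo->query('SELECT id, title FROM pages')->fetchAll(); // hypothetical query
error_log(sprintf('Query took %.4f seconds', microtime(true) - $start));
?>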
You should optimize your queries and database operations. Always prefer to complete things in the minimum number of loops, if possible.
Loading time is also affected by the page content: eliminate unnecessary images from your forms.
It is also affected by the speed of the server you are running on.
Simple answer, if there's too much content, then reduce the content on the page! Install YSlow and follow its advice.
To be more specific, you need to apply some rules and show some self-control to keep loading times down. There's also stuff you can do on the PHP side, but we'll get to that later. On the client side, the following tips will help.
Remove any markup that isn't necessary. For example,
<div class="class1">
    <div class="class2">
        <div class="class3">
            <p>Hello, I'm the content</p>
        </div>
    </div>
</div>
With judicious use of CSS you can in most cases replace this with
<div class="class1 class2 class3">
<p>Hello, I'm the content!</p>
</div>
You could even ditch the div altogether, if it's only ever going to contain a single child.
<p class="class1 class2 class3">Hello, I'm the content!</p>
Images: the rule of thumb is that no image on a web page should exceed 100K in size. While there are exceptions, this is a good rule to stick to. If you have many or large images on your page, try optimizing them: replace lossless formats with lossy ones (truecolour PNG with JPEG), replace older file formats with modern ones with better compression (GIF with non-truecolour PNG), lower image quality settings for JPEG, reduce the number of colours in PNG, and so on.
NEVER use BMP images on a web page!
You can speed up page loads by reducing the number of HTTP requests being made. Every asset on your page (image, stylesheet, javascript file, etc.) represents an HTTP request, and the HTTP/1.1 spec recommends that a browser keep no more than 2 requests open to a host at any one time. Any additional requests will be queued up until the first ones have cleared.
You can reduce the number of requests by, for example, having a single stylesheet for your page instead of multiple ones (though be sensible here; some stuff is better kept in a separate sheet, such as IE fixes), using image sprites, combining javascript files together (again, be sensible), and so on.
One thing that won't speed up page load times, but will make them more responsive is to put all your javascript at the bottom of the page (just before the </body> tag) as loading javascript in the head or higher up in the body will force the browser to wait until the JS has been evaluated before rendering what comes after it.
On the server side, turn compression on. Make sure files are sent with suitable caching headers so the browser can cache images, stylesheets, javascript, etc.
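If you can't turn compression on at the web-server level (mod_deflate or its equivalent is usually preferable), PHP can compress its own output; a minimal sketch:

<?php
// Compress this script's output if the browser supports gzip
if (!ob_start('ob_gzhandler')) {
    ob_start(); // fall back to plain buffering
}
?>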
Finally in PHP, optimize your code so that it generates output more quickly. The server can't start sending content to the client until the PHP script has generated it. This usually means optimizing SQL queries to execute faster.
Finally, if the pages don't change that much, have PHP cache a copy of the output to disc, and send the cached version on subsequent page loads. When the page content is changed, have the PHP script delete the cached version. The fastest query is the one you don't have to run :)
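A minimal sketch of that disk-caching pattern (the cache path and lifetime are placeholders):

<?php
// Serve a cached copy of the page if a fresh one exists
$cacheFile = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
if (file_exists($cacheFile) && time() - filemtime($cacheFile) < 3600) {
    readfile($cacheFile);
    exit;
}

ob_start();
// ... generate the page as usual ...
file_put_contents($cacheFile, ob_get_flush()); // send to the browser and keep a copy
?>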
To speed up the website, try the following:
Avoid unnecessary HTTP requests: rather than letting the browser request the CSS, JS and other assets separately, combine them within the project itself.
Avoid bulk requests from the database (e.g. SELECT *);
include only the relevant data fields from the database table (see the sketch below).
Optimize images.
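For the SELECT point, the difference is just this (the table and column names are invented, and $pdo is an assumed database connection):

<?php
// Fetch only the columns the page actually uses...
$stmt = $pdo->query('SELECT id, title, summary FROM articles');
// ...instead of SELECT * FROM articles
?>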
If JavaScript and CSS files were included inside of pages, it would cut down the number of HTTP requests and therefore make the page load faster. I feel like I am missing something, because it seems like any organization interested in lightning-quick pages would do this. However, as I look at source code, I don't recall any sites having tons of CSS and JavaScript in their pages.
Questions:
What errors are in my statements above?
What are the drawbacks of this approach (shown in the title via pseudocode)?
If the data is in an external file it can be cached and reused on other pages (or the same page, revisited) without having to fetch it over the network again.
You get a minor performance penalty on the first page in exchange for a major performance enhancement on subsequent pages.
Modularity is a major concern:
I can pick and choose which javascript and css files I want per page; otherwise I'd have a ton of css and javascript files covering all the different configurations (which is just messy).
I can also cache a file and hand it to someone else faster
Where you will find an example of this happening is when sites chuck their images together into one png file and then use css to slice up the bits they want for buttons etc.
Another aspect, and not only for inline css and jscript: when I write code, I hate to repeat myself. Repetition leads to errors, is difficult to maintain (update/edit), and is a waste of time and space. Putting CSS or jscript once in a file that gets downloaded once is less error-prone, easier to maintain, and less wasteful of time and space.
Would it make sense to improve pageload speed by serving smaller images from the database rather than make multiple HTTP requests given that the website is PHP driven?
I'm thinking of smaller page design elements, buttons, thumbnails for galleries etc.
No. Since:
A browser only communicates with the server via HTTP so you would have to pull them from a database, put them in HTTP, then return them to the browser
It is more expensive to pull large chunks of binary data from a database than it is to pull them from the filesystem.
If you want to make fewer HTTP requests, you can sprite the images, but don't do that with content images (which should have proper <img> elements with alt text).
You can also serve the images from multiple subdomains, giving you more concurrent HTTP requests, which could help speed things up.
No.
The user isn't directly connected to the database and you can't (well you can but it's so ugly I'm ignoring it) output the image data inside the HTML. They have to be loaded on separate requests.
If you store them in a database, you need something to access the database and then stream the data out. It's actually seriously worse than just letting your httpd serve it. If the web server serves a file, only the core server and the filesystem are touched. If it's in a database, it's the core server, the connector to the language (e.g. mod_php), the language (e.g. php), the database connection and the filesystem (which the database is written on).
Keep it simple. Keep it as a file.
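For illustration, this is roughly what the database route forces you to write for every single image, versus letting the web server just send a file (the table, the columns and the $pdo connection are all invented for the example):

<?php
// image.php?id=42 - each image now costs a PHP run plus a DB query
$stmt = $pdo->prepare('SELECT mime, data FROM images WHERE id = ?');
$stmt->execute(array((int) $_GET['id']));
$img = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: ' . $img['mime']);
echo $img['data'];
?>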
If you're drowning in requests:
If you're on Apache, consider using a server like lighttpd or nginx instead. They are massively more efficient in mixed static/dynamic environments. You can keep Apache alongside, or dump it altogether.
Shift your images off to a CDN like S3, Akamai, etc. There are plenty of providers, and it usually only works out a little more expensive than hosting them yourself (this is assuming you've got quite a lot of traffic).
It is possible; you can embed images in HTML using the Data URI scheme. But I doubt it will pay off: you will decrease the number of HTTP requests, but separately-requested images can be cached on the client while embedded ones cannot, so you will greatly increase the size of each response.
In any case, it will be faster to load those files directly from disk than from the DB.
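A sketch of the Data URI approach in PHP (the file path is just an example):

<?php
// Embed a small image directly into the markup as a data URI
$data = base64_encode(file_get_contents('img/icon.png'));
echo '<img src="data:image/png;base64,' . $data . '" alt="icon">';
?>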
The number of HTTP requests remains the same whether the browser loads images from a script that loads image data from a database or regular files. In fact, loading image data from a database rather than static files would probably introduce additional overhead.
If you're looking to reduce the amount of HTTP requests a browser has to make to load your documents, you should look into CSS Sprites.
You would save the HTTP overhead, but how would you insert the images into the HTML? Otherwise you still have to make an HTTP request to get each image.
If you serve the images as byte streams from the DB, you don't let browsers cache the content. If you use one HTTP request per image, you let them cache the content, but you pay the price of more requests. You also have to consider the time spent fetching the images from the DB and processing them!
I think your best option in this case is to put all the small images into just one file (a sprite), and then use CSS to display them. That's what high-load sites do. This way you make just one request and get all the images; the browser will cache the file and it will improve your performance. The price you pay is that you need to write more CSS, but that's just plain text and the same number of files. It's a win-win situation :)
There are various ways to improve image performance in a website
Use an alternate domain just for static content. This has two benefits: cookies from your main domain are not sent with each request, and a separate domain gets its own allocation of connections
Combine images into sprites
Configure caching correctly. Set far-future expiry headers, so that the image is not downloaded again between visits to the site. When an image is requested, the ETag can also be checked, and if it matches, a 304 Not Modified response is returned and the content is not downloaded again (a sketch of this follows).
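A hand-rolled sketch of that conditional-GET handshake in PHP (web servers do this automatically for static files; the path is an example):

<?php
$path = 'img/logo.png'; // example file
$etag = '"' . md5_file($path) . '"';

header('ETag: ' . $etag);
header('Cache-Control: public, max-age=31536000'); // far-future: one year

// If the browser already has this exact version, send no body at all
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: image/png');
readfile($path);
?>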
I don't see why streaming images from a database is going to be better than from the file system. I suspect your performance numbers are subjective because of caching.
I'm creating a website from scratch. I was really into this in the late 90's, but the web has changed a lot since then! And I'm more of a designer, so when I started putting this site together, I basically used a system of PHP includes to make the site more "dynamic".
When you first visit the site, you'll be presented with a logon screen if you're not already logged on (cookies). If you're not logged on, a page called access.php is introduced.
I thought I'd preload the heaviest images at this point, so that when the user is done logging on, the images are already cached. And this is working as I want. But I still notice that the biggest image isn't rendered immediately anyway, so it seems kind of pointless.
All of this has made me rethink how the site is structured and how scripts and css files are loaded. Using FireBug and YSlow with Firefox I see a few pointers like expires headers and reducing the size of each script. But is this really the culprit?
For example, would this be really really stupid in the main index.php? The entire site is basically structured like this
<?php
require("dbconnect.php");
include("head.php");
?>
And below this is basically just the body and the content of the site.
Head.php, however, consists of the doctype, the head portion, links to two CSS stylesheets, the jQuery library, the jQuery validation engine, Cufon with a Cufon font file, and then the small Cufon.Replace snippet.
The rest of the body comes with the index.php file, but at the bottom of this again is an include of a file called "footer.php" which basically consists of loading of a couple of jsLoader scripts and a slidepanel and then a js function.
All of this makes the final page source look like a typical complete web page, but I'm wondering if any of you can see immediately that "this is really really stupid" and "don't do that, do this instead", etc. :) Are includes a bad way to go?
This site is also pretty image intensive and I can probably do a little more optimization.
But I don't think that's its the primary culprit. YSlow gives me a report of what takes up the most space:
doc(1) - 5.8K
js(5) - 198.7K
css(2) - 5.6K
cssimage(8) - 634.7K
image(6) - 110.8K
I know it looks like it's cssimage(8) that weighs the most, but I've already preloaded these images from before and it doesn't really affect the rendering.
To speed things up a little, you could assemble all your images into the same image sprite, so that only one request downloads all the images. But that requires you to fine-tune your CSS to display just the relevant part of the sprite each time.
For a better explanation, check out: http://css-tricks.com/css-sprites/
Another answer that could seem a little stupid, but I like to think of it when I make a website: just keep it simple. I mean: does all your JS add real value, are all these images needed, could you display less and make a lighter design? I'm not criticizing your work at all, just a suggestion...
I used the following approach on an extranet project:
Using jQuery and an array of file names, I ajax in all the images, .js and .css files so that they are preloaded in the cache. As I iterate through the array, I update a progress bar on the screen that indicates that the site is loading - much like a Flash loader.
It worked well.
What I would do is show a loading page built with pure CSS and HTML by default, then wait for jQuery to load and preload the images with ImageLoader. Once done, redirect to the normal website; since the images will already be in the cache, they won't be loaded again.
Another optimization you can do is minify all the JS files and combine all of them except jquery.js. Put jquery.js first in your HTML so it loads first. Also put your SCRIPT tags at the bottom of the HTML.
It sounds like you have pretty much nailed preloading: if you have loaded a resource once, and the expiry header is set correctly, you have preloaded it, no matter what kind of content it is.
File combination can be key to a quick website; each extra file adds load time. In the worst cases of network and server lag, it might add up to an extra second per separate file; more commonly it will be around 100-200 milliseconds per file.
If they are not already minified, minify the scripts and put them in the same file; just remember to keep the order. I have no idea why Ivo Sabev wouldn't include jQuery.
Same thing with the CSS files.
How much have you done about testing image compression? There can be a real gain from trying out different compression settings and comparing size vs. quality. For PNG images, IrfanView with PNGOUT can often make files 25% smaller than other programs; on top of that, a very big size reduction can be achieved by reducing the image to 8-bit colour, and with a lot of graphic elements you simply can't tell the difference. Right here on Stack Overflow there is a great example of well-compressed, stacked images in the editor control buttons: http://sstatic.net/so/Img/wmd-buttons.png