I made a PHP website. It has about 100 pages, but it takes a long time to load when I open it. It is a static site, not a dynamic one, but the content on each page is fairly large, so the pages take a long time to load in the browser.
What can I do to decrease the loading time? Please give me a solution.
There is a very useful tool for exactly this kind of monitoring, called YSlow.
Have a look at it.
There are a whole variety of methods here:
If you are accessing a database, look at optimising your queries; for example, specify only the fields that you need in a SELECT query rather than using SELECT * (see the sketch after this list)
Employ some form of server-side caching. There are a number of solutions for PHP - see this site for more details http://www.sitepoint.com/caching-php-performance/
Use client-side (browser) caching by setting appropriate Cache HTTP headers (see http://www.mnot.net/cache_docs/ for more details)
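As a rough sketch of the first point (the table, column and connection details here are made up for illustration), selecting only the columns the page actually displays looks like this:
<?php
// Minimal sketch: fetch only the columns the page displays, not SELECT *
$pdo  = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id, title, summary FROM articles WHERE published = 1');
$stmt->execute();
$articles = $stmt->fetchAll(PDO::FETCH_ASSOC);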
Without further information about your site it's difficult to provide a more specific answer.
Test your site in Chrome.
It has a great feature which shows how long each element takes to load
(Ctrl+Shift+I, Timeline tab).
The short steps for full optimisation are:
1) Backend
Analyse and reduce the data-fetching time: add indexes, reduce subqueries, avoid temporary tables, etc.
2) Frontend
Reduce large JavaScript libraries.
Reduce image sizes.
Check PHP script loops (measure page loading with a browser plugin).
Reduce the HTML size as well.
3) It may sound funny, but it also needs checking: check your broadband and network capacity.
Once you have done all these things, the pages should load well.
You should optimise your queries and database operations, and always try to do the work in the minimum number of loops where possible.
Loading time is also affected by the page content, so eliminate unnecessary images from your forms.
It is also affected by the server's speed if you are running on a remote server.
Simple answer, if there's too much content, then reduce the content on the page! Install YSlow and follow its advice.
To be more specific, you need to apply some rules and show some self-control to keep loading times down. There's also stuff you can do on the PHP side, but we'll get to that later. On the client side, the following tips will help.
Remove any markup that isn't necessary. For example,
<div class="class1">
    <div class="class2">
        <div class="class3">
            <p>Hello, I'm the content</p>
        </div>
    </div>
</div>
With judicious use of CSS you can in most cases replace this with
<div class="class1 class2 class3">
<p>Hello, I'm the content!</p>
</div>
You could even ditch the div altogether, if it's only ever going to contain a single child.
<p class="class1 class2 class3">Hello, I'm the content!</p>
Images: the rule of thumb is that no image on a web page should exceed 100K in size. While there are exceptions, this is a good rule to stick to. If you have many or large images on your page, try optimizing them: replace lossless formats with lossy ones (truecolour PNG with JPEG), replace older file formats with modern ones that compress better (GIF with non-truecolour PNG), lower the quality setting for JPEGs, reduce the number of colours in PNGs, and so on.
NEVER use BMP images on a web page!
You can speed up page loads by reducing the number of HTTP requests being made. Every asset on your page (image, stylesheet, JavaScript file, etc.) represents an HTTP request, and browsers only open a small number of connections to a host at once (the HTTP/1.1 spec historically suggested two), so any additional requests are queued until the earlier ones complete.
You can reduce the number of requests by, for example, having a single stylesheet for your page instead of multiple ones (though be sensible here; some stuff is better kept in a separate sheet, such as IE fixes), using image sprites, combining JavaScript files together (again, be sensible), and so on.
One thing that won't speed up page load times, but will make them feel more responsive, is to put all your JavaScript at the bottom of the page (just before the </body> tag), as loading JavaScript in the head or higher up in the body forces the browser to wait until the JS has been evaluated before rendering what comes after it.
On the server side, turn compression on. Make sure files are sent with suitable caching headers so the browser can cache images, stylesheets, JavaScript, etc.
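A minimal sketch of doing this from PHP (assuming the zlib extension is available and nothing has been output yet; for static assets you would normally configure compression and expiry in the web server instead):
<?php
// Gzip the generated output if the client supports it
ob_start('ob_gzhandler');
// Tell browsers they may cache this response for a day
header('Cache-Control: public, max-age=86400');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 86400) . ' GMT');
?>
<!-- page markup follows -->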
Finally, in PHP, optimize your code so that it generates its output more quickly. The server can't start sending content to the client until the PHP script has generated it; this usually means optimizing SQL queries to execute faster.
If the pages don't change that much, you can also have PHP cache a copy of the output to disk and send the cached version on subsequent page loads. When the page content changes, have the PHP script delete the cached version. The fastest query is the one you don't have to run :)
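A minimal sketch of that disk-caching idea (the cache directory, lifetime and key scheme here are assumptions, not a drop-in solution):
<?php
$cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';

// Serve the cached copy if it is less than 10 minutes old
if (is_file($cacheFile) && filemtime($cacheFile) > time() - 600) {
    readfile($cacheFile);
    exit;
}

ob_start();
// ... normal page generation (queries, templates, etc.) goes here ...
$html = ob_get_clean();
file_put_contents($cacheFile, $html);
echo $html;
When the content changes, deleting the cached file (unlink($cacheFile)) is enough to force regeneration on the next request.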
To speed up the website, try the following:
Avoid unnecessary HTTP requests: instead of separate requests for the CSS, JS and
other assets, include them in the page itself.
Avoid bulk requests to the database (e.g. SELECT *);
include only the relevant data fields from the database table.
Optimize images (a sketch follows below).
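For the image point, a rough sketch of recompressing an oversized photo with GD (the file names and the quality value are assumptions):
<?php
// Re-save a large JPEG at a lower quality setting; 75 is usually a good size/quality trade-off
$img = imagecreatefromjpeg('uploads/photo-original.jpg');
imagejpeg($img, 'uploads/photo-web.jpg', 75);
imagedestroy($img);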
Goal: minimizing the loading time of my website
I did lots of scans and speed tests.
They all indicate that a lot of time is spent uselessly, waiting for an HTTP response.
The time spent actually sending data is very small compared to that.
Screenshot of a speed test (image omitted).
I started loading all the scripts and styles in PHP and just echoing them:
<!-- loading a style -->
<style><?php echo file_get_contents("style.css"); ?></style>
<!-- loading a script (similar) -->
<script type="text/javascript"><?php echo file_get_contents("script.js"); ?></script>
That does work; however, the images still slow things down a little. How do I use PHP to read them and echo them as an <img> tag?
(The mentioned website is https://www.finnmglas.com/)
Everyone seeking answers to this:
You should not echo back images with PHP into the main file:
It messes up the loading time (makes it way longer)
And it delays the first render of your website. Even if you make fewer requests, you will want a quick first render (users will notice the delay)
Echoing images from a separate PHP file and including an <img> tag pointing at that PHP script should not be a problem ^^
Yes, you can put the data into the src attribute instead of a URL:
<img src="data:image/jpeg;base64,<?php echo base64_encode($data); ?>" />
but I only recommend this for small images. A better way to handle many small images is to combine them into sprites: https://www.toptal.com/developers/css/sprite-generator/
and for CSS/JavaScript, use a minifying preprocessor: gulp with gulp-clean-css and gulp-uglify-es.
I noticed the same kind of wait and lag...
There are many factors: the location of the server, the location of the visitor (in the case of the screenshot, the location of the server that ran the test), the routers between you and the internet provider, etc.
I discovered that using a CDN is much better: the browser connects to the CDN, which has many servers around the world; Cloudflare then asks your server and caches the HTML, JS, CSS, images, videos... so delivery to the end point becomes much faster.
you can try https://www.cloudflare.com/
Theorizing here on how to get lightning-fast media and prevent hotlinking; <img src="data:image-kj134332k4" /> comes to mind, among other things. Scrapers don't need our src, and real clients need instant loading (especially on mobile networks). Considering Google's recent HTTPS-everywhere move, this would drastically decrease handshakes as well.
What disadvantages are there to crafting lists such as e-commerce
categories/widgets/slideshows using data:image?
Are there any implications to the extra KB of actual source code, versus serving a vastly larger total page size?
Do y'all prefer any particular PHP data:image generation script for parsing images as data at certain controller levels (leaving standard src images in other areas)?
Are there caching/CDN concerns? Would the inline data somehow break caching? It seems not, but I'm no cache expert.
Any guidance or case thoughts are much appreciated. Thank you!
Generally, the idea is worth considering, but in most cases the problems outweigh the benefits.
It is true that these images won't be cached on the client side anymore. Especially Expires-based caching saves you tons of bandwidth.
As a rule of thumb I'd say: If these are small images that change frequently, embedding is a good idea. If images are larger and clients load the same image more than once in subsequent request, do by all means deliver images separately and put some effort into caching.
As for the other points:
Most browsers support this; however, some old IEs don't … so think of a fallback solution or be ready to get bug reports (this may be negligible, depending on your user base).
The number of SSL handshakes is negligible if you're using HTTP keep-alive, which is standard. Follow-up requests do indeed require a new handshake, but if you cache properly (see the next point) and maybe put static files on a CDN, this is no problem.
Read about caching, especially the Expires/Cache-Control headers and their friends.
If you decide to embed, you don't really need a generator script; embedded images are just base64-encoded image files, and this shouldn't take more than three lines of code (see the sketch below).
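For example, a sketch of those three lines (assuming a local logo.png and the fileinfo extension for MIME detection):
<?php
$mime = mime_content_type('logo.png');
$data = base64_encode(file_get_contents('logo.png'));
echo '<img src="data:' . $mime . ';base64,' . $data . '" alt="logo">';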
However, if you process/convert your images in PHP, there's even another disadvantage: Instead of statically serving them (maybe even from a different machine or CDN), images have to be on the same machine and go through the PHP engine, thus increasing the used memory of each process that serves a page with these images.
I'm creating a website from scratch. I was really into this in the late '90s, but the web has changed a lot since then! I'm more of a designer, so when I started putting this site together I basically used a system of PHP includes to make the site more "dynamic".
When you first visit the site, you'll be presented with a logon screen if you're not already logged on (cookies). If you're not logged on, a page called access.php is introduced.
I thought I'd preload the heaviest images at this point, so that when the user has finished logging on the images are already cached. This works as I want, but I still notice that the biggest image isn't rendered immediately anyway, so it seems kind of pointless.
All of this has made me rethink how the site is structured and how scripts and css files are loaded. Using FireBug and YSlow with Firefox I see a few pointers like expires headers and reducing the size of each script. But is this really the culprit?
For example, would this be really really stupid in the main index.php? The entire site is basically structured like this
<?php
require("dbconnect.php");
?>
<?php
include ("head.php");
?>
And below this is basically just the body and the content of the site.
Head.php, however, consists of the doctype, the head portion, links to two CSS stylesheets, the jQuery library, the jQuery validation engine, Cufon and the Cufon font file, and then the small Cufon.replace snippet.
The rest of the body comes with the index.php file, but at the bottom of that is an include of a file called "footer.php", which basically consists of a couple of jsLoader scripts, a slide panel and then a JS function.
All of this makes the final page source look like a typical complete webpage, but I'm wondering if any of you can see immediately that "this is really really stupid" and "don't do that, do this instead", etc. :) Are includes a bad way to go?
This site is also pretty image intensive and I can probably do a little more optimization.
But I don't think that's its the primary culprit. YSlow gives me a report of what takes up the most space:
doc(1) - 5.8K
js(5) - 198.7K
css(2) - 5.6K
cssimage(8) - 634.7K
image(6) - 110.8K
I know it looks like it's cssimage(8) that weighs the most, but I've already preloaded these images from before and it doesn't really affect the rendering.
To speed things up a little, you could assemble all your images into a single image sprite, so that only one request downloads all the images. But that requires you to fine-tune your CSS to display just the relevant portion of the image.
To have a better explanation, check out : http://css-tricks.com/css-sprites/
Another answer that could seem a little stupid, but I like to think of this when I make a website: just keep it simple. I mean, does all your JS add real value, are all those images really needed, could you display less and make a lighter design? I'm not criticizing your work at all, just a suggestion...
I used the following approach on an extranet project:
Using jQuery and an array of file names, I AJAX in all the images, .js and .css files so that they are preloaded in the cache. As I iterate through the array, I update a progress bar on the screen that indicates that the site is loading, much like a Flash loader.
It worked well.
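A rough sketch of that idea (not the actual project code; the asset list and the #progress element are made up, and whether an XHR-fetched image lands in the same cache an <img> tag later uses depends on the browser and the caching headers):
<div id="progress">Loading 0%</div>
<script type="text/javascript">
var assets = ['img/header.png', 'img/hero.jpg', 'css/extra.css', 'js/widgets.js'];
var loaded = 0;
$.each(assets, function (i, url) {
    // $.get fetches the file so the browser can cache it; the response itself is discarded
    $.get(url).always(function () {
        loaded++;
        $('#progress').text('Loading ' + Math.round(loaded / assets.length * 100) + '%');
    });
});
</script>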
What I would do is show, by default, a loading page built with pure CSS and HTML, then wait for jQuery to load and preload the images with ImageLoader. Once that's done, redirect to the normal website; since the images will already be in the cache, they won't be loaded again.
Another optimization you can do is to minify all JS files and combine all of them except jquery.js. Put jquery.js first in your HTML so it loads first. Also put your SCRIPT tags at the bottom of the HTML.
It sounds like you have pretty much nailed preloading: if you have loaded it once and the expiry header is set correctly, you have preloaded it, no matter what kind of content it is.
File combination can be key to a quick website; each extra file adds load time. In the worst cases of network and server lag you might add up to a second extra for each separate file; more commonly it will be around 100-200 milliseconds per file.
If not already minified, minify the scripts, and put them in the same file, just remember to keep the order. I have no idea why Ivo Sabev wouldn't include jQuery.
Same thing with the CSS files.
How much have you done about testing image compression? There can be a real gain from trying out different compression settings and comparing size vs. quality. For PNG images, IrfanView with PNGOUT can often make files 25% smaller than other programs; on top of that, a very big reduction in size can be achieved by reducing the image to 8-bit colour, and with a lot of graphic elements you simply can't tell the difference. Right here on Stack Overflow there is a great example of well-compressed and stacked images in the editor control buttons: http://sstatic.net/so/Img/wmd-buttons.png
I am currently developing a web site with PHP + MySQL and jQuery. So far I have been doing it on my local machine. I notice that when I view a page, the images on it take some time to load (it's brief, but visible). All the images are small (PNGs of less than 3 KB). Now, when I load the page, some database connections happen in order to get data that I will display.
I am not sure if this loading-time issue has something to do with the number of images, or with the time that the PHP script and the DB connections take to execute. (I have very little data in my database, so I wouldn't assume the latter.)
My question is: is it a good approach to preload all the images at the beginning of each page? I tried it with jQuery and it works fine. I'm just not sure what disadvantages it might have. For example, to do so I need to include the jQuery library at the beginning of the page, and I thought that was bad practice.
If these PNGs are stored in the database as BLOBs — not clear from your question — don't do that. Serving images from a DB through PHP is not as efficient as letting the web server serve them straight from the filesystem. If the images are tied to particular records, just name the PNG after the row ID, so you can find it in a directory dedicated to storing those images. The PHP code then just generates the URL that points to the PNG file on disk, so the web server can send them statically.
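A minimal sketch of the "name the file after the row ID" idea (the directory, the column names and the $rows variable are assumptions for illustration):
<?php
foreach ($rows as $row) {
    // The PNG was saved as /images/products/<id>.png when the record was created,
    // so PHP only builds the URL and the web server serves the file statically
    $src = '/images/products/' . (int) $row['id'] . '.png';
    echo '<img src="' . $src . '" alt="' . htmlspecialchars($row['name']) . '">';
}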
I don't think preloading the images within the same page is going to buy you anything. If anything, it might slow the apparent overall page load time because the browser can only retrieve a fixed number of resources concurrently, typically 2-4. Loading images at the top of the <body> means there are other things at the top of the page "above the fold" that have to wait for some HTTP connection slots to free up. Better to let the images load in their natural order.
Preloading makes sense in two situations:
The image isn't shown by default, but is expected to be needed as the user interacts with the page. Good examples of this are the hover and click state images for rollovers.
The image isn't used on this page, but will be needed on the next. Good examples of this are any site where there is a clear progression from one page to the next, like in a shopping cart.
Either way, do the preload at the very bottom of the <body>, so everything else loads first.
Having addressed those two issues, run YSlow on your site. It started out as a plugin for Firebug, which in turn is a plugin for Firefox, but it's since been ported to all major browsers except IE.
The beauty of YSlow is that it detects common slowdowns automatically, just by loading the page while the extension is active. It then gives you a clear grade for the page, so you can judge when you're "done" optimizing. If you're below an A, you're not done yet. :) It's not uncommon to see sites rating a D or worse, because the default configuration for web servers is conservative to avoid causing problems. Fixing YSlow warnings is generally pretty easy, but you have to be careful to avoid creating caching and other problems, which is why the default server config doesn't do these things.
Another answer recommended the Google PageSpeed offering. It's available as a plugin for Chrome and Firefox, as a server-side Apache module, and as a Google-hosted service. I have no idea how it compares to YSlow, but it looks interesting.
Also consider using the browser's debugger to get a waterfall graph of resource load times:
In Firebug you get this in the Net tab.
In Safari, you get to it via the Develop menu, which is normally disabled in Preferences. Turn it on if needed, then say Develop > Start Timeline Recording. That puts you into the Network Requests instrument. You can also get to it through Develop > Show Web Inspector.
In Chrome, say View > Developer > Developer Tools, then go to the Network tab.
IE has a very weak form of this, via Tools > Developer Tools > Profiler. It just gives a table of numbers, rather than a waterfall graph, so the information is there, but you can't just visually scan for long bars to find the slowest resources.
You should use the PageSpeed plugin from Google to check which resources take most of the time to load. It will show separate load times for images as well.
If you're using lots of small PNGs, I suggest combining them into one image and controlling what is shown via the CSS background property, since they are part of the styling and not the information. In that case, instead of several images only one will be loaded and reused by all the elements, and preloading that single image is really easy.
Have you considered using CSS Sprites to combine all of your images into a single download? There are a number of tools online to help you do this, and it will significantly reduce the number of HTTP requests required by your page.
Make sure you have set the correct Expires header on your images to allow them to be cached.
Finally, take a look at YSlow, which can provide you with further optimisation tips.
I need to write a text file viewer (not the directory tree, but the actual file contents) for use in a browser. It will be used to view large files. I want to give the user the ability to actually, umm, browse the file, i.e. prev-page and next-page buttons, while each page shows only a portion of the file.
Two questions:
Is there any way to pass the file descriptor through POST (or something) so that on each page I can keep reading from an already open file rather than starting all over again (again, these are huge files)?
Is there a way to read the file backwards? That would be very useful for browsing back in a file.
Any other implementation ideas are very welcome. Thanks
Keeping the file open between requests is not a good idea - you don't have to "start all over again" - just maintain an offset and use fseek() to jump to that offset. That way, you can also implement the "backwards jumping".
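A minimal sketch of that offset-based approach (assuming the viewer is called as viewer.php?offset=N; the file path and page size are made up):
<?php
$file     = '/path/to/huge.txt';
$pageSize = 64 * 1024; // 64 KB per "page"
$offset   = isset($_GET['offset']) ? max(0, (int) $_GET['offset']) : 0;
$size     = filesize($file);

$fh = fopen($file, 'rb');
fseek($fh, $offset);            // jump straight to the requested position
$chunk = fread($fh, $pageSize);
fclose($fh);

echo '<pre>' . htmlspecialchars($chunk) . '</pre>';

// "Backwards" browsing is just arithmetic on the offset
$prev = max(0, $offset - $pageSize);
echo '<a href="?offset=' . $prev . '">prev</a> ';
if ($offset + $pageSize < $size) {
    echo '<a href="?offset=' . ($offset + $pageSize) . '">next</a>';
}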
Cut your huge files into smaller files once, and then serve the small files to the user.
You should consider pagination. If you're concerned about the user being frustrated by needing to click "next" too often, you could make each chunk reasonably large (so a normal reader pages every 20min).
Another option is chunked transfer encoding: Wikipedia Entry. This would allow your server to respond quickly and give the user something to read while it streams the rest of the file over the network (rather than the server needing to read in the whole file and send it all at once). This could dramatically improve the perceived performance compared to serving the files normally, but it still consumes a lot of bandwidth on your server.
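A rough sketch of streaming the file in pieces so the browser gets something to render early (the path is hypothetical; if output buffering is enabled you may also need ob_flush()):
<?php
header('Content-Type: text/plain; charset=utf-8');
$fh = fopen('/path/to/huge.txt', 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192); // send 8 KB at a time
    flush();               // push what we have to the client instead of waiting for the whole file
}
fclose($fh);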
You might be able to simulate a large document with Javascript and AJAX, but only send pieces at a time for better performance.
Consider sending a few pages' worth of your document and attaching listeners to the browser's scroll event. Over time, or as the user scrolls down, you AJAX in more chunks. This creates a few annoying UX edge cases, like:
Scroll bar indicates a much smaller document than there actually is
You might be able to avoid this by filling in the bottom of your document with many page breaks, but it'll be difficult to make the length perfect.
Scrolling past the point of currently-available content will show a blank page.
You could detect this using JavaScript and display a "loading" icon to let the user know what's going on.
Built-in "find" feature doesn't work
Hard to avoid this without the user downloading the entire document, but you could provide your own search feature for them to use instead (not as good but perhaps adequate).
Really though, you're probably best off with pagination and medium-sized pages. It's a very well understood design pattern that's relatively easy (compared to the other options, at least) to implement and make fast.
Hope that helps!