How to reduce PHP page loading time? - php

I have created web pages for a magazine in PHP. I include header and footer files on each page; apart from those, the pages contain only HTML. Even so, my page loading time is too long. Can anyone help me solve this problem? I am very new to PHP, so I don't know the tricks.

Use a good web server with an opcode cache such as OPcache. Reduce the file sizes of images, JavaScript and CSS. Avoid unneeded loops.
Those are just some examples; it depends on what you do.
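For reference, enabling an opcode cache is usually a php.ini change. A minimal sketch (the directive values are illustrative, not tuned recommendations; OPcache ships bundled with PHP 5.5+, older setups used APC instead):

```ini
; Load and enable the opcode cache
zend_extension=opcache.so
opcache.enable=1
; Memory for compiled scripts, in megabytes
opcache.memory_consumption=128
; How often (in seconds) to check scripts on disk for changes
opcache.revalidate_freq=60
```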

You can try this page: http://www.webpagetest.org/ ; it measures the performance of your generated output (HTML).
Everything else, like slow PHP execution times, cannot be measured properly by observing the frontend. You will have to pick up some skills and tools to profile your PHP application and see what is actually slow. The standard procedure is googling for "optimizing PHP scripts" or "profiling a PHP script", etc. You will probably need to invest money in proper tools.
PHP may not even be the culprit here. Slow server response, bad MySQL queries, etc. can also affect execution and delivery speed.

Related

Check Internet connection before displaying web page content

In web development, is it possible to test the bandwidth/internet connection speed of the client device and, based on that test, filter the items to be displayed?
For instance, is it possible to avoid showing social media buttons when the user's connection is slow?
I am using PHP on the server, and HTML/jQuery on the client side.
There is no easy way to do this; there are no built-in functions or anything. Using a combination of PHP and JS there are workarounds, but honestly, if your user has a slow connection, all that extra JavaScript to download will slow things down even more. You're really better off just making your website as compact as possible: use style sheets, avoid excessive images, etc.
You'll have to use JavaScript to download files of varying sizes and time how long each takes; then, based on the timings, you can handle things differently. Take a look here: Bandwidth utility using javascript

How important is caching for a site's speed with PHP?

I've just made a user-content orientated website.
It is done in PHP, MySQL and jQuery's AJAX. At the moment there are only a dozen or so submissions, and already I can feel it lagging slightly when it goes to a new page (and therefore runs a new MySQL query).
Is it more important for me to optimise my MySQL queries (via prepared statements), or is it worth looking at CDNs (Amazon S3) and caching static HTML files (much like the WordPress plugin WP Super Cache) when no new content has been submitted?
Which route is the most beneficial for me, as a developer, to take, i.e. where am I better off concentrating my efforts to speed up the site?
Premature optimization is the root of all evil
-Donald Knuth
Optimize when you see issues, don't jump to conclusions and waste time optimizing what you think might be the issue.
Besides, I think you have more important things to work out on the site (like being able to cast multiple votes on the same question) before worrying about a caching layer.
Its done in PHP, MySQL and jQuery's AJAX, at the moment there is only a dozen or so submissions and already i can feel it lagging slightly when it goes to a new page (therefore running a new mysql query)
"Can feel it lagging slightly" – Don't feel it, know it. Run benchmarks and time your queries. Are you running queries effectively? Is the database set up with the right indexes and keys?
That being said...
CDN's
A CDN works great for serving static content. CSS, JavaScript, images, etc. This can speed up the loading of the page by minimizing the time it takes to request all the resources. It will not fix bad query practice.
Content Caching
The easiest way to implement content caching is with something like Varnish. It basically sits in front of your site and re-serves content that hasn't been updated. Minimally intrusive and easy to set up, while being amazingly effective.
Database
Is it most important for me to try and optimise my MySQL queries (by prepared statements)
Why the hell aren't you already using prepared statements? If you're doing raw SQL queries, always use prepared statements unless you absolutely trust the content going into them. Given a user-content-based site, I don't think you can safely say that. If you notice query times running high, take a look at the database schema, the queries you run per page, and the amount of content you have. With a few dozen entries you should not notice any issues, even with the worst queries.
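A minimal prepared-statement sketch with PDO (an in-memory SQLite database stands in for MySQL here, and the `posts` table is made up for illustration; only the `prepare`/`execute` shape matters):

```php
<?php
// Illustrative only: SQLite in memory stands in for a real MySQL DB.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)');
$pdo->exec("INSERT INTO posts (title) VALUES ('Hello')");

// User input is bound as a parameter, never concatenated into the SQL,
// so it cannot change the meaning of the query.
$stmt = $pdo->prepare('SELECT title FROM posts WHERE id = :id');
$stmt->execute([':id' => 1]);
echo $stmt->fetchColumn(); // prints "Hello"
```

The same code shape works against MySQL by changing only the DSN passed to the PDO constructor.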
I checked out your site and it seems a bit sluggish to me as well, although it's not 100% clear it's the database.
A good first step here is to start on the outside and work your way in. So use something like Firebug (for Firefox), which, like similar plug-ins of its type, will allow you to break down where the time goes when loading a page.
http://getfirebug.com/
Second, per your comment above, do start using prepared statements where applicable; it can make a big difference.
Third, make sure your DB work is minimally complete: that means having indexes in the right places. It can be useful here to run the types of queries your site actually issues and see where the time goes. EXPLAIN plans
http://dev.mysql.com/doc/refman/5.0/en/explain.html
and MySQL driver logging (if your driver supports it) can be helpful here.
If the site is still slow and you've narrowed it to use of the database, my suggestion is to do a simple optimization at first. Caching DB data, if feasible, is likely to give you a pretty big bang for the buck here. One very simple solution towards that end, especially given the stack you mention above, is to use Memcached:
http://memcached.org/
After injecting that into your stack, measure your performance + scalability and only pursue more advanced technologies if you really need to. I think you'll find that simple load balancing, caching, and a few instances of your service will go pretty far in addressing basic performance + scalability goals.
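The caching pattern is the same whatever the backend. A sketch of the get-or-compute idea (the array store stands in for a real backend such as Memcached, whose get()/set() calls follow the same shape; names here are made up):

```php
<?php
// Get-or-compute caching: return the stored value if present,
// otherwise compute it once, store it, and return it.
function cache_get_or_set(array &$store, $key, callable $compute)
{
    if (array_key_exists($key, $store)) {
        return $store[$key];
    }
    $store[$key] = $compute();
    return $store[$key];
}

$cache = [];
$hits  = 0;
$build = function () use (&$hits) {
    $hits++;  // expensive work (DB queries etc.) would go here
    return '<html>front page</html>';
};

cache_get_or_set($cache, 'front_page', $build); // computes and stores
cache_get_or_set($cache, 'front_page', $build); // served from cache
echo $hits; // prints 1
```

With Memcached the array lookup becomes a `get()` call and the store a `set()` with a TTL, but the control flow is identical.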
In parallel, I suggest coming up with a methodology to measure this more regularly and accurately. For example, decide how you will actually do automated latency measures and load testing, etc.
For me, optimising the DB comes first, because with any caching layer, when you do find a problem you need to rebuild the whole cache.
There are several areas that can be optimized.
Server
CSS/JS/Images
PHP Code/Setup
mySQL Code/Setup
1st, I would use Firefox with the YSlow add-on to evaluate your website's performance; it will give server-based suggestions.
Another solution, I have used is this addon.
http://aciddrop.com/php-speedy/
"PHP Speedy is a script that you can install on your web server to automatically speed up the download time of your web pages."
2nd, I would create a static subdomain like static.yourdomainname.com, pointed at a different folder, and move all your images, CSS and JS there. Then point all your code to that domain, and tweak your web server settings to cache all those files.
3rd, I would look at articles/techniques like this, http://www.catswhocode.com/blog/3-ways-to-compress-css-files-using-php , to help compress/optimize your static files like CSS/JS.
4th, review all your images, and their sizes, and make sure they are fully optimized. Or, convert to using css sprites.
http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
http://css-tricks.com/css-sprites/
Basically, move all your main site images into one CSS sprite, then change your CSS to refer to the different spots on that sprite for each image needed.
5th, review your content pages: note which change frequently and which rarely do, and turn the rarely-changing ones into static HTML pages. For those that change frequently, either leave them as PHP pages, or create a cron/scheduled task that uses the PHP command line to regenerate static HTML versions.
6th, for MySQL, I recommend turning on the slow query log to help identify slow queries. Review your table structure and make sure your tables are well designed. Use views and stored procedures to move heavy SQL logic from PHP into MySQL.
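The static-generation idea in the 5th point can be sketched with output buffering: capture what the dynamic page prints and write it to a static HTML file (the render closure below stands in for including your real PHP page; paths are placeholders):

```php
<?php
// Capture a dynamic page's output and save it as static HTML.
function publish_static($htmlFile, callable $render)
{
    ob_start();
    $render();                 // would normally be: include 'page.php';
    $html = ob_get_clean();    // grab everything that was echoed
    file_put_contents($htmlFile, $html);
    return $html;
}

$file = sys_get_temp_dir() . '/index.html';
publish_static($file, function () {
    echo '<h1>News</h1>';      // placeholder for real page output
});
echo file_get_contents($file); // prints "<h1>News</h1>"
```

Run from a cron job via the PHP CLI, this regenerates the static copy on a schedule instead of on every request.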
I know this is a lot, but I hope it's useful.
It depends where your slowdowns really lie. You have a lot of twitter and facebook stuff on there that could easily slow your page down significantly.
Use Firebug to see if anything is being downloaded during your perceived slow loading times. You can also download the YSlow Firefox plugin, which gives tips on speeding up page loads.
A significant portion of perceived slowness can be due to the javascript on the page rather than your back-end. With such a small site you should not see any performance issues on the back end until you have thousands of submissions.
Is it most important for me to try and optimise my MySQL queries (by prepared statements)
Sure.
But prepared statements have nothing to do with optimization.
Nearly 99% of sites run with no cache at all, so I don't think you really need one.
If your site is running slow, you have to profile it first, and then optimise the specific places proven to be bottlenecks.

Should I only be loading Javascript files on pages I need them?

So I am starting to get interested in optimization, and I was wondering whether it is worth including the JavaScript files only on the pages that need them, and excluding them from the rest using PHP?
Thanks
Loading only the scripts you need will definitely help.
Another huge improvement can be made by combining all JavaScript into one file and (if possible) minifying and caching the output. The fewer requests a browser has to make, the faster your page will load.
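A sketch of the combining idea in PHP (the file names and the combine.php endpoint are hypothetical; a real setup would also minify and cache the result):

```php
<?php
// Concatenate several JS files into one string so the browser
// makes a single request instead of many.
function combine_js(array $files)
{
    $out = '';
    foreach ($files as $file) {
        // Trailing ';' guards against a file missing its final semicolon.
        $out .= file_get_contents($file) . ";\n";
    }
    return $out;
}

// Usage, e.g. in a combine.php endpoint:
// header('Content-Type: application/javascript');
// header('Cache-Control: public, max-age=604800');
// echo combine_js(['jquery.js', 'plugins.js', 'app.js']);
```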
Ideally, yes.
Each page should not load more than it needs. Minimising traffic is always good.
It is a good idea to have a framework for generating pages, and only add the resources you need for each page.
If you include all or most of the JavaScript that your site needs on your main page (in a minified and compressed form, like what YUI Compressor produces), it will increase the load time of your main page but decrease the load time of every other page, because the JavaScript will already be in the cache. A lot of web applications use this technique so that the user experience after the initial load is smoother.
Either technique has its merits; it really just depends on the type of application/page you are building. But like dirkbonhomme said, you should be minifying, compressing, and caching the JavaScript (and CSS) no matter which way you go.

Improve performance of website

I have designed a new website and hosted it online. I want it to perform as well as possible and load pages fast.
This website is built in PHP 5.0+ using CodeIgniter, with MySQL as the DB. It has images on it, and I am using the Nitobi grid for displaying sets of records on a page. The rest is all normal page controls.
As I am not very experienced with website performance factors, I would like suggestions and details on the factors that can improve a website's performance. Please let me know how I can improve mine.
Also, please let me know whether there are ways to measure a website's performance, and any websites or tools that help test it.
To start, get Firefox and Firebug, then install YSlow. YSlow gives great information about the client-side performance of the website in question. Here's a User Guide.
For the server-side performance, have a look at Apache JMeter.
Have you looked into opcode caching, APC, memcache, etc.? As another answer has said, you need to time the loading of your pages and try to find potential SQL bottlenecks and/or scripts that can be refactored. You may also want to look at getting something like webgrind installed, so you can see what happens on a page load and how long each step takes.
You can see loading times of the main page and the components it contains with the Net tab in the already mentioned Firebug addon for Firefox. There you can see if a page is slow due to having a lot of external content (like user added images or so) or because of itself.
In the first case there is not much you can do except remove the content that takes the most time; in the second case you will need to take a look at your PHP code, bearing in mind that most of the time, performance issues in PHP applications come down to imperfect database interaction (badly written queries, repeated queries where one would suffice, etc.).
Profiling is the key word in the world of performance optimization.
To profile your site you have to measure two different areas: PHP script running time, and whole-page load time (including pictures, JavaScript, style sheets, etc.). Measuring PHP scripts is quite easy. The easiest way is to place this line at the top of your page:
$TIMER['start'] = microtime(true);
and this line at the bottom:
echo "Time taken: " . round(microtime(true) - $TIMER['start'], 3);
If it stays below 0.1 sec, it's OK. Now to the whole page loading. Dunno if there is some HTTP sniffer with response time recording.
Edit: Looks like Firebug's Net tab mentioned above is the right tool for this
Like Kevin said, I suggest trying opcode caching with PHP. I'm not sure which option is currently best, but when I looked it up a year ago I decided to go with eAccelerator (http://en.wikipedia.org/wiki/EAccelerator) and it works great. I've also used APC on another server, but I prefer eAccelerator.
You should probably also follow Col. Shrapnel's advice and do some profiling.
From the server perspective:
as others wrote, use a PHP accelerator (I use APC, which is supposed to become standard in PHP)
take care of the database; the number of queries, the complexity of queries, the data in the result set, etc. can have a big impact
cache dynamic pages
And from the browser perspective:
minimize the number of JS and CSS files (one of each is ideal); put CSS in the head, JS at the bottom
avoid calling 3rd-party JavaScript (analytics, widgets, ...)
check the size of your images (I use smush.it)
The impact of these can be huge, cf. tests I ran on my (WordPress-based) site.
If you have time to play, try HipHop, developed and used by Facebook.
Page generated in 0.0074 secs.
DB runtime 0.0006 secs (7.87 %) using 1 DB queries, 7 DB cache fetches, 3 RSS cache fetches and 61.88 K memory.
http://i42.tinypic.com/2m31frp.jpg
ouch !!
don't bump - this is his benchmark ;)
This site will measure an integrated performance mark for your site and give you some relevant advice. All you have to do is type in the URL.
I would suggest giving Clicktale a try. I've been using it for 2 months, and it is neat to watch what your users do; I learned a lot.

Creating a two-pass PHP cache system with mutable items

I want to implement a two-pass cache system:
The first pass generates a PHP file with all of the common stuff (e.g. news items) hardcoded. The database then has a cache table linking these to pages (e.g. "index.php page=1 style=default"); it also stores an up-to-date flag which, when false, causes the first pass to rerun the next time the page is viewed.
The second pass fills in the minor details, such as how long ago something(?) was, and mutable items like "You are logged in as...".
However, I'm not sure of an efficient implementation that supports both cached and non-cached (e.g., search) pages without a lot of code and several queries.
Right now, each time the page is loaded, the PHP script runs and regenerates the page. For pages like search this is fine, because most searches are different; but other pages, such as the index, are virtually the same for each hit, yet generate a large number of queries and run quite a long script.
The problem is that some parts of the page change on a per-user basis, such as the "You are logged in as..." section, so simply saving the generated pages would still result in tens of thousands of nearly identical pages.
The main concern is reducing the load on the server, since I'm on shared hosting and at this point can't afford to upgrade, but the site is using a sizeable portion of the server's CPU and putting a fair load on the MySQL server.
So minimising how much has to be done for each page request, and not regenerating things like the news items on the index all the time, seems a good start, compared to, say, search, which is a far less static page.
I actually considered hard-coding the news items as plain HTML, but that would mean maintaining them in several places (since they may be used in searches, and the comments live on a page dedicated to each news item (i.e. news.php), etc.).
I second Ken's recommendation of PEAR's Cache_Lite library; you can use it to easily cache either parts of pages or entire pages.
If you're running your own server(s), I'd strongly recommend memcached instead. It's much faster since it runs entirely in memory and is used extensively by a lot of high-volume sites. It's a very easy, stable, trouble-free daemon to run. In terms of your PHP code, you'd use it much the same way as Cache_Lite, to cache various page sections or full pages (or other arbitrary blobs of data), and it's very easy to use since PHP has a memcache interface built in.
For super high-traffic full-page caching, take a look at doing Varnish or Squid as a caching reverse proxy server. (Pages that get served by Varnish are going to come out easily 100x faster than anything that hits the PHP interpreter.)
Keep in mind with caching, you really only need to cache things that are being frequently accessed. Sometimes it can be a trap to develop a really sophisticated caching strategy when you don't really need it. For a page like your home page that's getting hit several times a second, you definitely want to optimize it for speed; for a page that gets maybe a few hits an hour, like a month-old blog post, it's a bad idea to cache it, you only waste your time and make things more complicated and bug-prone.
I recommend not reinventing the wheel... there are some template engines that support caching, like Smarty.
For server-side caching, use something like Cache_Lite (and let someone else worry about file locking, expiry dates and file corruption).
You want to save the results to a file and use logic like this to pull them back out (a sketch; generate_results() stands in for your existing page code):
if (file_exists($cacheFile)) {
    readfile($cacheFile);                  // serve the saved copy
} else {
    ob_start();
    generate_results();                    // render to HTML as usual
    $html = ob_get_clean();
    file_put_contents($cacheFile, $html);  // write to file
    echo $html;                            // output the string
}
To be clear, you don't need two passes because you can save parts of the page and leave the rest dynamic.
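One way to keep the per-user bits dynamic inside an otherwise cached page is to cache the shared HTML with placeholder tokens and substitute them at serve time. A sketch (the {{TOKEN}} syntax is made up for illustration):

```php
<?php
// The expensive, shared part of the page is generated and cached once,
// with tokens where per-user content goes.
$cachedHtml = '<p>Latest news...</p><p>You are logged in as {{USERNAME}}.</p>';

// At serve time, only this cheap substitution runs per request.
function render_cached($html, array $vars)
{
    foreach ($vars as $token => $value) {
        $html = str_replace('{{' . $token . '}}', $value, $html);
    }
    return $html;
}

echo render_cached($cachedHtml, ['USERNAME' => 'alice']);
// prints "<p>Latest news...</p><p>You are logged in as alice.</p>"
```

This keeps a single cached copy per page rather than one per user, while the logged-in section stays correct for everyone.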
As always with this type of question, my response is:
Why do you need the caching?
Is your application consuming too much IO on your database?
What metrics have you run?
You are talking about adding an extra level of complexity to your app, so you need to be very sure that you actually need it.
You might actually benefit from the built-in MySQL query cache, if the database is the contention point in your system. The other option is to use Memcache.
I would recommend using an existing caching mechanism. Depending on what you really need, you might be looking at APC, memcached, or various template caching libs... It is easier/faster to tune written and tested code to your needs than to write everything from scratch (usually; although there may be situations where you don't have a choice).
