I am rewriting an HTML site in PHP. There are various menus that change, and I want to turn them into PHP includes:
<?php include("header.php");?>
With the footer and sidebars, I have 4 or 5 PHP includes on each page.
How much am I slowing the page load with 5 PHP includes? If I want a fast load, is it worth sacrificing sitewide editability and calling fewer PHP pages? Or is it only a few milliseconds?
Is there any difference, speed-wise, between calling 2 CSS files and 2 PHP files?
( What is a good caching system to use for such simple php calls? )
For static files, like CSS files, merging them will decrease page loading time, because these files are not server-side files: the client sends a separate request to download each one, and that affects loading time. PHP files, however, are server-side files; including them won't affect loading time much (unless the files themselves are very complicated).
That is a server-side include and the browser doesn't have to make a separate request for it, so it should be only a few milliseconds to process each include.
including files costs ~nothing
The act of including a file in php is negligible, less than 1ms. Splitting a file into several chunks and including the component files will have no noticeable difference in performance compared to including one file with the equivalent markup/php logic in it.
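If you want to verify that on your own hardware, a rough micro-benchmark is easy to knock together (the parts/ files below are hypothetical stand-ins for your own header/footer/sidebar includes):

<?php
// rough micro-benchmark: time a batch of includes
$start = microtime(true);

for ($i = 1; $i <= 20; $i++) {
    include "parts/part{$i}.php";
}

$elapsed = (microtime(true) - $start) * 1000;
printf("20 includes took %.3f ms\n", $elapsed);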
static files are always faster than php
Serving a css file with a webserver (apache) will always be faster and more efficient than making a request to a php file - as a webserver can handle serving static files (and appropriate headers) without involving php at all. In simplistic terms: Less processes/logic means faster performance.
As #Jordan Denison said, it is a server-side include and so should not take much time. One thing more: if you include a page from another domain, it will cause performance issues, because PHP then has to go out over the internet, do the DNS lookup and all that; but if it is on the same domain, or under the same root, it should not take much time.
Related
Background: I'm working on a single-page web app that loads everything through AJAX, so I started learning PHP a few days ago. I immediately thought of putting everything (HTML, CSS, JavaScript) in PHP files so that there is only one HTML file and one sprite request. For instance, external JavaScript could be stored in:
main.js.php (adding the .js for organizational purposes only) which would look like:
<script>
...
</script>
or
<style>
...
</style>
Question: Would storing everything in PHP be a bad idea? I have OCD and like to have related functions in separate files (and actually, folders too), so let's just say my project uses 100+ includes. These only get loaded exactly once, when the user visits (AJAX site). I want to reduce the number of HTTP requests to just 1 HTML file and 1 sprite (my app actually uses a custom font for images). My app will also be run on mobile devices (using a different design with far fewer includes, but a similar method).
Research: Here's what I know:
You can have Apache handle JS/CSS as PHP, but it's not something I'm interested in (dangerous) - link
This site gave me the idea, although I don't quite understand it - 3 Ways to Compress CSS
Caching with something like APC (not sure how it works, but out of the scope of this question) would improve php speeds (?)
Whilst reducing the number of HTTP requests is a good thing, it has problems too. If you combine CSS and Javascript into the HTML file, you increase the page size. You also make it impossible for the browser to cache the CSS and Javascript files too - so those assets are being redownloaded over and over, instead of just once.
Also - serving static files is much faster than serving via PHP. PHP is slow.
Combining all images, however, is a great idea. This is what Google uses: http://www.google.co.uk/images/nav_logo91.png
Overall, I say don't bother. Keep CSS and Javascript separate, combine images where possible and - yes - definitely use APC if you can. Remember though - don't optimise your site prematurely. Get the content in there first, develop the features, etc. Making your site fast can come at the expense of reduced maintainability - I know that from experience.
some points to consider:
1. code-to-text ratio:
your content pages are read by google. when google ranks your pages, one of the parameters is the ratio of code versus textual content. if you put your css/js code together with the content, you lower that ratio. (btw, one of the arguments for using divs instead of tables is that tables normally take more html code and lower the ratio).
EDIT: this is a theory, not an established fact. what matters is that the html code is syntactically correct, so it is easier for search engine parsers to parse. some say that google ignores content after the first 100kb, so that's also something to consider.
2. nginx
i have nginx installed in front of apache, as a reverse proxy, with apache handling the php.
nginx is an http server that knows how to handle static pages. apache's design is thread-per-client, while nginx uses the reactor pattern, meaning nginx can handle much more traffic than apache as a web server (about 50 times the number of requests).
the drawback is that nginx doesn't handle the php requests; for that, apache is installed too - nginx sends all the php calls to apache, which handles them and returns the response back to nginx, and back to the client.
if, in that setup (which is quite common), you serve your css/js through php, you lose the advantage of nginx: instead of handling the static js/css files on its own, it will send them to apache, since it will treat them as php pages.
3. cache
caching files is one of the most common mechanisms for improving website performance while reducing traffic. if you mix static content with dynamic content, you lose the advantage you get from caching static files.
when working in a web environment, it's best (as a habit) to keep as much static content as you can separated from the dynamic content. this will give you the best results when caching static data.
of course, there are no rules for what should and what shouldn't. i have many dynamic js contents, but the main functions are normally extracted to static files.
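if you do end up serving semi-static content through php, you can at least make it cacheable by sending the cache headers yourself. a minimal sketch (the one-week lifetime is just an example value):

// allow browsers and proxies to cache this response for a week
$lifetime = 7 * 24 * 60 * 60; // example value
header('Cache-Control: public, max-age=' . $lifetime);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');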
4. CSS sprites
css sprites (as #Muu mentioned) are a great improvement to performance and should definitely be adopted.
another recommendation more specific to your case - if you want your content indexed properly - since you mentioned that most data will be loaded using ajax, i'd recommend to have it present also without ajax. for example: www.domain.com/ will have a link to #contact, that will show the form (loaded using ajax). you should also have www.domain.com/contact for indexing. also make sure that if a user enters www.domain.com/#contact - he will be redirected to the contact page (or content will be loaded dynamically).
use your browser's web dev tools to see what requests are being made, and see where you can lower the number of requests; also pay attention to file sizes, and see which files are cached and which are requested from the server. define cache attributes in your server config and htaccess.
hope that helps ;)
PS: another tip - if you get water spilled all over your keyboard - don't try to dry it with a hair dryer - it could melt down your keys...
I would keep CSS and JS out of the PHP unless it is dynamically generated. Static pages are usually faster and easier to serve by the webserver. Keeping the files separate also allows for the client browser to cache some data used repeatedly (e.g. the CSS file).
Still, make sure you use compression on the individual files. Some useful links in the Google "Page Speed" documentation.
Also, bytecode caching improves speed for sure: PHP doesn't have to compile every file on every request.
I am doing some tests (LAMP):
Basically I have 2 versions of my custom framework.
A normal version, which includes ~20 files.
A lite version, which has everything inside one single big file.
Using the lite version more and more, I am seeing a decrease in load time: from 0.01s with the normal version to 0.005s with the lite version.
Let's consider just the "include" part. I always thought PHP would store the included .php files in memory so the file system doesn't have to retrieve them at every request.
Do you think condensing all the classes/functions into one big file is worth the "chaos"?
Or is there a setting to tell PHP to keep the required PHP files in memory?
Thanks
(php5.3.x, apache2.x, debian 6 on a dedicated server)
Don't cripple your development by mushing everything up in one file.
A speed up of 5ms is nothing compared to the pain you will feel maintaining such a beast.
To put it another way, a single incorrect index in your database can give you orders of magnitude more slowdown.
Your page would load faster using the "normal" version and omitting one 2kb image.
Don't do it, really just don't.
Or you can do this:
Leave the code as it is (located in many different files)
Combine them into one file when you are ready to upload to the production server
Here's what I use:
cat js/* > all.js
yuicompressor all.js -o all.min.js
First I combine them into a single file, and then I minify it with the YUI Compressor.
I'm currently using PHP to combine multiple CSS (or JS) files into a single file (as well as compress the content using GZIP).
E.g. the HTML page calls resources like this...
<link rel="stylesheet" href="Concat.php?filetype=css&files=stylesheet1,stylesheet2,stylesheet3">
<script src="Concat.php?filetype=js&files=script1,script2,script3"></script>
Example of my Concat.php file can be found here: http://dl.dropbox.com/u/3687270/Concat.php (feel free to comment on any problems with the code)
But instead of having to open up my command prompt and run YUI Compressor manually on my CSS/JS files, I want the Concat.php file to handle this, at least for the CSS side of things (I say CSS only because I appreciate that YUI Compressor does variable minification and other optimisations, so it isn't feasible to replicate in PHP - but that is part 2 of my question).
I know this can be done with some regex magic, and I don't have a problem doing that.
So, my question has 2 parts, which are:
1.) What are the performance implications of having the server minify with preg_replace on a CSS file (or a set of CSS files that could each have a few hundred lines of code - normally it would be a lot less, but I'm thinking that if the server compresses the file then I wouldn't have to worry too much about extra whitespace in my CSS)?
2.) And how can I get the JavaScript files that are concatenated via my Concat.php file run through YUI Compressor? Maybe run it on the server (I have direct access to the server, so I could install YUI Compressor there if necessary) - but would this be a good idea? Surely optimising on the server every time a page is requested will be slow and bad for the server, increase bandwidth, etc.
The reason this has come up is that I'm constantly having to go back and make changes to existing 'compressed/minified' JS/CSS files which is a real pain because I need to grab the original source files, make changes then re-minify and upload. When really I'd rather just have to edit my files and let the server handle the minification.
Hope someone can help with this.
If your webserver is Apache, you should use mod_concat and let Apache take care of compression using gzip:
http://code.google.com/p/modconcat/
You should minify the JS just once and save the minified version on servers.
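If you ever want to trigger that one-off minification from PHP instead of by hand, shelling out to the jar works. A rough sketch (the jar path and file names are hypothetical):

// one-off build step - not something to run on every request
$cmd = 'java -jar /path/to/yuicompressor.jar'
     . ' -o ' . escapeshellarg('all.min.js')
     . ' '    . escapeshellarg('all.js');
exec($cmd, $output, $status);
if ($status !== 0) {
    echo "Minification failed:\n" . implode("\n", $output) . "\n";
}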
As suggested in the comments you could use one of the pre-built scripts for that. They make use of YUI compressor as well as other solutions even if you can't run Java on the server.
The first one was probably PHP Speedy, which still works but has been abandoned.
A new one is Minify, which offers a lot of features including general caching solution depending on the server's capabilities (APC, Memcached, File cache).
Another advantage of these projects is that your URLs won't have query strings in them (contrary to your current method), which cause trouble in a lot of browsers when it comes to caching. They also take care of gzipping and handling Expires headers for your content.
So I definitely recommend that you try out one of these projects as they offer immediate positive effects, with some simple steps of configuration.
Here's how I recommend you do it:
Turn on GZIP for that specific folder (Web server level)
Use one of the tools to strip out whitespace and concat the files. This will serve as a backup for search engines/proxy users who don't have gzip enabled. You'd then cache the output of this - so the expensive regex calls aren't hit again.
The above won't be very expensive CPU-wise if you configure your server correctly. The PHP overhead won't really be much, as you'll have a cached version of the CSS, i.e.:
-- css.php --
<?php
if (!isset($_GET['f'])) {
    exit();
}

// the cache key is the md5 of the requested file list
$cacheFile = '/path/to/cached/css/' . md5($_GET['f']);

if (file_exists($cacheFile)) {
    // just serve the cached file...
    readfile($cacheFile);
    exit();
}

// NB: real code should whitelist these names to prevent path traversal
$files = explode(',', $_GET['f']);

ob_start();
foreach ($files as $file) {
    readfile($file);
}
$css = ob_get_clean();

// set headers (content type, etags, future expiration dates, ...)
header('Content-Type: text/css');

// remove whitespace etc., then output
echo $css;

// write a new cached file
file_put_contents($cacheFile, $css);
exit();
You can then do href="css.php?f=style.css,something.css,other.css" and the script will create a cache file named after the md5 of the included file list.
The above example isn't complete.. it's more pseudo really.
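For the "remove whitespace etc." step, a few conservative preg_replace passes go a long way. A sketch only - naive regexes like these can mangle edge cases such as whitespace inside url() values or content strings:

function minify_css($css)
{
    // strip /* ... */ comments
    $css = preg_replace('!/\*.*?\*/!s', '', $css);
    // collapse runs of whitespace into single spaces
    $css = preg_replace('/\s+/', ' ', $css);
    // drop insignificant spaces around punctuation
    $css = preg_replace('/\s*([{};:,>])\s*/', '$1', $css);
    // remove the final semicolon before each closing brace
    $css = str_replace(';}', '}', $css);
    return trim($css);
}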
Each page on my website is rendered using PHP.
Each PHP file uses around 10 includes. So for every page that is displayed, the server needs to fetch 10 files, in addition to the rest of its functions (MySQL, etc).
Should I combine them into a single include file? Will that make ANY difference to the real-world speed? It's not a trivial task as there would be a spaghetti of variable scope to sort out.
Include files are processed on the server, so they're not "fetched" by the browser. The performance difference of using includes vs. copying and pasting the code or consolidating the files is so negligible (I'm guessing in the 10 ms to 100 ms range, at the absolute most) that it isn't at all worth it.
Feel free to include and require to your heart's content. Clean code is substantially more important than shaving less than 100 ms off a page load. If you're building something where timing is that critical, you shouldn't be using PHP anyway.
What takes time is figuring out where the files are actually located via the include path. If you have multiple locations in your include path, PHP will search each location until it either finds the file or fails (in which case it throws an error). That's why you should put the location where most of the included files are found at the top of the include path.
If you use absolute paths in your include path, PHP will cache the path in the realpath cache, but note that this gets stale very quickly. So yes, including ten files is potentially slower than including one large file, simply because PHP has to check the include path more often. However, unless your webserver is a really weak machine, ten files are not enough to make an impact. This only gets interesting when you include hundreds of files or have many locations to search, in which case you should use an opcode cache anyway.
Also note that when including files, it is not good practice to include each and every file right at the beginning, because you might be including files that are never called by your application for a specific request.
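To illustrate the ordering point (the paths here are made up): put the directory holding most of your includes first, and prefer absolute paths so the realpath cache can help.

// PHP searches these locations left to right
set_include_path(
    '/var/www/app/includes' . PATH_SEPARATOR .  // most includes live here
    '/var/www/app/lib' . PATH_SEPARATOR .
    get_include_path()                          // keep the defaults as fallback
);

require_once 'database.php'; // resolved against the path list above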
Reference
http://de2.php.net/manual/en/ini.core.php#ini.include-path
http://de2.php.net/manual/en/ini.core.php#ini.sect.performance
http://en.wikipedia.org/wiki/List_of_PHP_accelerators
Although disk I/O operations are among the biggest performance-eaters, a regular site won't notice any sensible number of includes.
Besides, by the time includes become a problem, you would probably already have an opcode cache in place that eliminates it.
include and require only open files on the server side, but that might be time-consuming depending on the hardware, filesystem, etc.
Anyway, if you can, use an autoloader. That way only the files that are needed get loaded.
Then, if you think the included files are a source of slowdown (and I think there are a lot of other places to look for improvement first), you can try to automatically merge the files, as sketched below. You still have one file per class when developing, but you build a file that contains every class definition so there is only one include (something like cat <all your included file>.php > to_include.php).
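If you'd rather not depend on the shell, the same merge can be scripted in PHP. A rough sketch - it assumes a hypothetical classes/ directory where every file starts with a plain <?php tag and has no closing tag:

// build to_include.php by concatenating every class file
$merged = "<?php\n";
foreach (glob(__DIR__ . '/classes/*.php') as $file) {
    // strip each file's opening tag so the merged file stays valid PHP
    $merged .= preg_replace('/^<\?php\s*/', '', file_get_contents($file)) . "\n";
}
file_put_contents(__DIR__ . '/to_include.php', $merged);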
The question might prompt some people to say a definitive YES or NO almost immediately, but please read on...
I have a simple website where there are 30 php pages (each has some php server side code + HTML/CSS etc...). No complicated hierarchy, nothing. Just 30 pages.
I also have a set of purely back-end php files - the ones that have code for saving stuff to database, doing authentication, sending emails, processing orders and the like. These will be reused by those 30 content-pages.
I have a master php file to which I send a parameter. This specifies which one of those 30 files is needed and it includes the appropriate content-page. But each one of those may require a variable number of back-end files to be included. For example one content page may require nothing from back-end, while another might need the database code, while something else might need the emailer, database and the authentication code etc...
I guess whatever back-end page is required can be included in the appropriate content page, but then one small change in a path and I have to edit tens of files. On the other hand, checking which content page is requested (switch-case type of thing) and including the appropriate back-end files in the master php file would be too cumbersome. Again, I'd have to make many changes if a single path changed.
Being lazy, I included ALL the back-end files in the master file so that no content page can request something that is not included.
First question - is this a good practice? Is it even done by anyone at all?
Second, will there be a performance problem or any kind of problem due to me including all the back-end files regardless of whether they are needed?
EDIT
The website gets anywhere between 3000 - 4000 visits a day.
You should benchmark. Time the execution of the same page with different includes. But I guess it won't make much difference with 30 files.
But you can save yourself the time and just enable APC in the php.ini (it is a PECL extension, so you need to install it). It will cache the parsed content of your files, which will speed things up significantly.
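If you're unsure whether APC is actually active after installing it, a quick check from a test script (assuming the standard extension name):

// both should report true / "1" once APC is installed and enabled
var_dump(extension_loaded('apc'), ini_get('apc.enabled'));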
BTW: There is nothing wrong with laziness, it's even a virtue ;)
If your site is object-oriented I'd recommend using auto-loading (http://php.net/manual/en/language.oop5.autoload.php).
This uses a magic method (__autoload) to look for a class when needed (it's lazy, just like you!), so if a particular page doesn't need all the classes, it doesn't have to get them!
Again, though, this depends on if it is object-oriented or not...
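For illustration, a minimal sketch using spl_autoload_register (the more flexible successor to a plain __autoload function); the classes/ layout and the Database class are made up:

// register a lazy loader: classes/Foo.php is only read when Foo is first used
spl_autoload_register(function ($class) {
    $file = __DIR__ . '/classes/' . $class . '.php';
    if (is_file($file)) {
        require $file;
    }
});

$db = new Database(); // triggers a single include of classes/Database.php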
It will slow down your site, though probably not by a noticeable amount. It doesn't seem like a healthy way to organize your application, though; I'd rethink it. Try to separate the application logic (e.g. most of the server-side code) from the presentation layer (e.g. the HTML/CSS).
it's not a bad practice if the files are small and contain just definitions and settings.
if they actually run code, or are extremely large, it will cause a performance issue.
now - if your site gets 3 visitors an hour, who cares; if you get 30,000... that's another issue, and you will need to work harder to minimize the overhead.
You can mitigate some of the disadvantages of PHP code-compiling by using XCache. This PHP module caches the PHP opcode, which reduces compile time and improves performance.
Considering the size of your website; if you haven't noticed a slowdown, why try to fix it?
When it comes to larger sites, the first thing you should do is install APC. Even though your current method of including files might not benefit as much from APC as it could, APC will still do an amazing job speeding stuff up.
If response-speed is still problematic, you should consider including all your files. APC will keep a cached version of your sourcefiles in memory, but can only do this well if there are no conditional includes.
Only when your PHP application is at a size where memory exhaustion is a big risk (note that for most large-scale websites Memory is not the bottleneck) you might want to conditionally include parts of your application.
Rasmus Lerdorf (the man behind PHP) agrees: http://pooteeweet.org/blog/538
As others have said, it shouldn't slow things down much, but it's not 'ideal'.
If the main issue is that you're too lazy to go changing the paths for all the included files (if the path ever needs to be updated in the future), then you can use a constant to define the path in your main file, and use that constant any time you need to include/require a file.
define('PATH_TO_FILES', '/var/www/html/mysite/includes/go/in/here/');
require_once PATH_TO_FILES.'database.php';
require_once PATH_TO_FILES.'sessions.php';
require_once PATH_TO_FILES.'otherstuff.php';
That way if the path changes, you only need to modify one line of code.
It will indeed slow down your website, mostly because of the relatively slow loading and processing of PHP. The more code you include, the slower the application will get.
I live by "include as little as possible, as much as necessary", so I usually just include my config and session handling for everything, and then each page includes just what it needs using an include path defined in the config include - that way, for path changes you still only need to change one file.
If you include everything, the slowdown won't be noticeable until you get a lot of page hits (several per second), so in your case just including everything might be OK.