Background: I'm working on a single page web app that loads everything through AJAX and so started learning php a few days ago. I immediately thought of putting everything (html, css, javascript) in php files so that there is only one html file and one sprite request. For instance, external javascript could be stored in:
main.js.php (adding the .js for organizational purposes only) which would look like:
<script>
...
</script>
or
<style>
...
</style>
Question: Would storing everything in PHP be a bad idea? I have OCD and like to have related functions in separate files (and actually, folders too), so let's just say my project uses 100+ includes. These only get loaded exactly once, when the user visits (AJAX site). I want to reduce the number of HTTP requests to just 1 HTML file and 1 sprite (my app actually uses a custom font for images). My app will also be run on mobile devices (using a different design with far fewer includes, but a similar method).
Research: Here's what I know:
You can have Apache handle js/css as php, but it's not something I'm interested in (dangerous) - link
This site gave me the idea, although I don't quite understand it - 3 Ways to Compress CSS
Caching with something like APC (not sure how it works, but out of the scope of this question) would improve php speeds (?)
Whilst reducing the number of HTTP requests is a good thing, it has problems too. If you combine CSS and Javascript into the HTML file, you increase the page size. You also make it impossible for the browser to cache the CSS and Javascript separately - so those assets get redownloaded over and over, instead of just once.
Also - serving static files is much faster than serving via PHP. PHP is slow.
Combining all images, however, is a great idea. This is what Google uses: http://www.google.co.uk/images/nav_logo91.png
Overall, I say don't bother. Keep CSS and Javascript separate, combine images where possible and - yes - definitely use APC if you can. Remember though - don't optimise your site prematurely. Get the content in there first, develop the features, etc. Making your site fast can come at the expense of reduced maintainability - I know that from experience.
some points to consider:
1. code - text ratio:
your content pages are read by google. when google ranks your pages, one of the parameters is the ratio of code to textual content. if you put your css/js code together with the content, you lower that ratio. (btw, one of the arguments for using divs instead of tables is that tables normally take more html code and lower the ratio).
EDIT: this is a theory, not an established fact. it's important that the html code is syntactically correct, so it's easier for search engine parsers to process. some say that google ignores content after the first 100kb, so that's also something to consider.
2. nginX
i have nginx installed with apache as a reverse proxy to handle php.
nginx is an http server that is very good at serving static pages. apache's design is thread-per-client, while nginx uses the reactor pattern, meaning nginx can handle much more traffic than apache as a web server (about 50 times the number of requests).
the drawback is that nginx doesn't handle php requests itself, which is why apache is installed too - nginx forwards all the php calls to apache, which handles them and returns the response to nginx, and back to the client.
if in that setup (which is quite common) you serve css/js files through php, you lose the advantage of nginx: instead of handling the static js/css files on its own, it will forward them to apache, because it treats them as php pages.
3. cache
caching files is one of the most common mechanisms for improving website performance while reducing traffic. if you mix static content with dynamic content, you lose the advantage you get from caching the static files.
when working in a web environment, it's best (as a habit) to keep as much static content as you can separated from the dynamic content. this will give you the best results when caching static data.
of course, there are no hard rules about what should and shouldn't be static. i have a lot of dynamic js content, but the main functions are normally extracted to static files.
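if js/css really must come through php, the dynamic response can at least be made cacheable. a minimal sketch (the one-day ttl and the helper name are just illustrative):

```php
<?php
// emit cache headers so the browser doesn't re-download a
// php-served asset on every visit (ttl value is an example only)
function cache_headers(int $ttl): array
{
    return [
        'Cache-Control: public, max-age=' . $ttl,
        'Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT',
    ];
}

header('Content-Type: application/javascript');
foreach (cache_headers(86400) as $h) {
    header($h);
}
```

this doesn't make php as fast as a static file, but it stops the repeated downloads.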
4. CSS sprites
css sprites (as @Muu mentioned) are a great performance improvement and should definitely be adopted.
another recommendation, more specific to your case: since you mentioned that most data will be loaded using ajax, if you want your content indexed properly i'd recommend having it available without ajax as well. for example: www.domain.com/ will have a link to #contact, which shows the form (loaded using ajax), but you should also have www.domain.com/contact for indexing. also make sure that if a user enters www.domain.com/#contact, he is redirected to the contact page (or the content is loaded dynamically).
use your browser's web dev tools to see what requests are being made and where you can lower their number; also pay attention to file sizes, and see which files are cached and which are requested from the server. define cache attributes in your server config and .htaccess.
hope that helps ;)
PS: another tip - if you get water spilled all over your keyboard - don't try to dry it with a hair dryer - it could melt down your keys...
I would keep CSS and JS out of the PHP unless they are dynamically generated. Static pages are usually faster and easier for the webserver to serve. Keeping the files separate also allows the client browser to cache data that is used repeatedly (e.g. the CSS file).
Still, make sure you use compression on the individual files. There are some useful links in Google's "Page Speed" documentation.
Also, bytecode caching definitely improves speed - PHP doesn't have to compile the scripts on every request.
Related
I am rewriting an HTML site into PHP. There are various changing menus, which I want to pull in with PHP calls:
<?php include("header.php");?>
With the footer and sidebars, I have 4 or 5 PHP includes on each page.
How much am I slowing the page load with 5 PHP includes? If I want a fast load, is it worth sacrificing site-wide editability and calling fewer PHP pages? Or is it only a few milliseconds?
Is there any difference, speed-wise, between calling 2 CSS files and calling 2 PHP files?
( What is a good caching system to use for such simple php calls? )
For static files, like CSS files, merging them will decrease page loading time, because the client sends a separate request to download each one, and that affects loading time. PHP includes, on the other hand, are handled server-side, so they won't affect loading time much (unless the included files are very complicated).
That is a server-side include, and the browser doesn't have to make a separate request for it, so it should only take a few milliseconds to process each include.
including files costs ~nothing
The act of including a file in PHP is negligible - less than 1 ms. Splitting a file into several chunks and including the component files will make no noticeable difference in performance compared to including one file with the equivalent markup/PHP logic in it.
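A quick way to convince yourself of this is to time a batch of includes (a throwaway stub file; the exact numbers will of course vary by machine):

```php
<?php
// rough micro-benchmark: time 1000 include() calls of a trivial file
$stub = tempnam(sys_get_temp_dir(), 'inc') . '.php';
file_put_contents($stub, "<?php // trivial include\n");

$start = microtime(true);
for ($i = 0; $i < 1000; $i++) {
    include $stub;
}
$elapsed = (microtime(true) - $start) * 1000; // milliseconds

printf("1000 includes took %.2f ms\n", $elapsed);
unlink($stub);
```

On any reasonable setup this finishes in a handful of milliseconds total, i.e. microseconds per include.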
static files are always faster than php
Serving a CSS file directly with a webserver (Apache) will always be faster and more efficient than requesting a PHP file, as the webserver can serve static files (with the appropriate headers) without involving PHP at all. In simplistic terms: less processing/logic means faster performance.
As @Jordan Denison said, it is a server-side include, so it should not take much time. One thing more: if you include a page from another domain, it will cause performance issues, because PHP then has to go out over the network (DNS lookup, HTTP request and so on). But if the file is on the same domain, or within the same document root, it should not take much time.
I've searched for this, but can't find anything - maybe I'm describing it badly!?
Anyway - I have a website (on an IIS 6 server) whose pages load 2 or 3 CSS files; these CSS files are actually ASP files with the response headers set accordingly.
The ASP in the files simply reads the query string to set colours of various css rules based on user preferences.
I've noticed that sometimes pages load very slowly, and using the dev tools in Chrome (although this seems to apply in all browsers) I can see the page load stalling on these CSS files - the reported latency can be up to 2 minutes, while everything else takes only a few milliseconds.
I've tried using PHP files instead of ASP files, but this makes no difference!
The website is behind a password so I can't easily demo it - although I could try to set something up if it would help. It isn't consistent: it seems to happen most of the time, but sometimes it's quite fast!
Any ideas what I could try?
Thanks
A couple of things you can try are using Google's page speed or Yahoo's YSlow - both will generate suggestions for you to help improve performance.
I keep seeing all these cool new frameworks for web dev, but I'm very confused, because 95% of the info I read is just hype. How do they work?
Is it as simple as providing a link in your html to a server that hosts the framework? Or do you have to download the framework, and install it on your own server?
Do web frameworks work with Winhost.com (Windows-based hosting with PHP), or the many other Windows-based hosting providers? Sorry if this is a stupid question, but the pages I've visited are very confusing!
Most frameworks require you to download them and re-upload them to your hosting.
Since heavy requirements would hurt a framework's popularity, most of the popular ones tend to require as little as possible - i.e. you don't need specific PHP extensions or PHP settings, so it's possible to use them on almost any PHP5 hosting (ZF, Symfony and others don't play well with PHP4).
In terms of what a framework brings you: you can see a framework as a big code base that you can use to make your development faster - you don't have to reinvent the wheel. Plus, a framework forces you to code more cleanly.
Generally speaking and in a nutshell, they allow you to generate HTML (with code) instead of providing static pages to the users. This also means you get to code less and don't repeat yourself.
Ruby on Rails is an example of a web framework (PHP itself is a language, though many frameworks are written in it). You have to get them installed on a server.
Here's how it works.
A static HTML page is the oldest type of webpage. You write some HTML code, and when the server receives a request from the browser, it parses the URL and determines which HTML file corresponds to it.
A dynamic page is similar to a static HTML page, but instead of writing HTML code directly, you write PHP/ASP/Python/CGI/etc. code that generates the HTML.
As it happens, a lot of dynamic websites share a large chunk of similar PHP/ASP/Python/CGI/etc. code. A web framework is a set of code someone else has already written; instead of writing all of it yourself, you offload half of the code-writing to the framework's authors.
Different frameworks have different requirements. The simplest are just a few PHP files you can include() in your own code (installing them is a matter of copying the files into the same directory as your code). The more complex ones reverse the roles: they take control of how the page is processed (installation is more involved, and they may need tweaks to the server's configuration).
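To make the "just include() it" end of the spectrum concrete, here is a toy sketch (all names hypothetical): a single reusable helper that every page includes, which is essentially what the simplest frameworks amount to.

```php
<?php
// framework.php - a one-function "framework": a template renderer
// that every page would include() and reuse.
function render(string $template, array $vars): string
{
    extract($vars);        // expose $vars as local variables to the template
    ob_start();
    include $template;     // the template is plain PHP/HTML
    return ob_get_clean(); // capture what the template printed
}

// usage: create a tiny template on the fly and render it
$tpl = tempnam(sys_get_temp_dir(), 'tpl');
file_put_contents($tpl, 'Hello, <?= $name ?>!');
echo render($tpl, ['name' => 'world']); // prints "Hello, world!"
```

A real framework bundles many such helpers (routing, database access, form handling), but the delivery mechanism for the simple ones really is just include().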
I'm currently using PHP to include multiple css (or js) files into a single file (as well as compress the content using GZIP).
E.g. the HTML page calls resources like this...
<link rel="stylesheet" href="Concat.php?filetype=css&files=stylesheet1,stylesheet2,stylesheet3">
<script src="Concat.php?filetype=js&files=script1,script2,script3"></script>
Example of my Concat.php file can be found here: http://dl.dropbox.com/u/3687270/Concat.php (feel free to comment on any problems with the code)
But instead of having to open my command prompt and run YUI Compressor manually on my CSS/JS files, I want Concat.php to handle this, at least for the CSS side of things (I say CSS only because YUI Compressor also does variable renaming and other optimisations that aren't feasible to replicate in PHP - but that is part 2 of my question).
I know this can be done with some Regex magic and I haven't a problem doing that.
So, my question has 2 parts, which are:
1.) What are the performance implications of having the server minify CSS with preg_replace (on a set of CSS files that could have a few hundred lines each - normally it would be a lot less, but if the server compresses the files then I wouldn't have to worry so much about extra whitespace in my CSS)?
2.) How can I get the JavaScript files concatenated by Concat.php run through YUI Compressor? Maybe on the server (I have direct access, so I could install YUI Compressor there if necessary) - but would that be a good idea? Surely optimising on the server every time a page is requested would be slow, bad for the server, increase bandwidth, etc.
The reason this has come up is that I'm constantly going back to make changes to existing 'compressed/minified' JS/CSS files, which is a real pain: I need to grab the original source files, make changes, then re-minify and upload. I'd rather just edit my files and let the server handle the minification.
Hope someone can help with this.
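For part 1, the kind of regex minification in question can be sketched like this (a rough approximation only - it handles far fewer cases than YUI Compressor, and the patterns are illustrative, so edge cases like content strings or `calc()` would need care):

```php
<?php
// naive regex-based CSS minifier: strips comments, collapses
// whitespace, and tightens up punctuation.
function minify_css(string $css): string
{
    $css = preg_replace('!/\*.*?\*/!s', '', $css);          // strip /* comments */
    $css = preg_replace('/\s+/', ' ', $css);                // collapse whitespace
    $css = preg_replace('/\s*([{}:;,>])\s*/', '$1', $css);  // trim around punctuation
    $css = str_replace(';}', '}', $css);                    // drop last semicolon
    return trim($css);
}

echo minify_css("body {\n  color: red;\n}\n"); // prints body{color:red}
```

For a file of a few hundred lines, a handful of preg_replace passes like this costs very little; the real answer to the performance worry is to run it once and cache the result, as the answers below suggest.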
If your webserver is Apache, you could use mod_concat and let Apache take care of compression using gzip:
http://code.google.com/p/modconcat/
You should minify the JS just once and save the minified version on the server.
As suggested in the comments you could use one of the pre-built scripts for that. They make use of YUI compressor as well as other solutions even if you can't run Java on the server.
The first one was probably PHP Speedy, which still works but has been abandoned.
A new one is Minify, which offers a lot of features including general caching solution depending on the server's capabilities (APC, Memcached, File cache).
Another advantage of these projects is that your URLs won't have query strings in them (contrary to your current method), which can cause caching problems in some browsers and proxies. They also take care of gzipping and Expires headers for your content.
So I definitely recommend that you try out one of these projects as they offer immediate positive effects, with some simple steps of configuration.
Here's how I recommend you do it:
Turn on GZIP for that specific folder (Web server level)
Use one of the tools to strip out whitespace and concatenate the files. This also serves as a fallback for search engines/proxy users who don't have gzip enabled. You'd then cache the output, so the expensive regex calls aren't hit again.
The above won't be very expensive CPU-wise if you configure your server correctly. The PHP overhead won't really be much, as you'll have a cached version of the CSS, i.e.:
-- css.php --
<?php
if (!isset($_GET['f'])) {
    exit();
}

$cacheFile = '/path/to/cached/css/' . md5($_GET['f']);

// serve the cached version if we already have one
if (file_exists($cacheFile)) {
    header('Content-Type: text/css');
    readfile($cacheFile);
    exit();
}

// concatenate the requested files (validate the names in real code!)
$files = explode(',', $_GET['f']);
ob_start();
foreach ($files as $file) {
    readfile($file);
}

// set headers (including the etags, future expiration dates, ...)
header('Content-Type: text/css');

$css = ob_get_clean(); // remove whitespace etc. here
echo $css;

// write a new cached file
file_put_contents($cacheFile, $css);
exit();
You can then use href="css.php?f=style.css,something.css,other.css", and the script will create a cache file named with the md5 of the requested file list.
The above is only a sketch - you'd still want real minification, input validation, and proper ETag/Expires headers.
This concept is a new one for me -- I first came across it at the YUI dependency configurator. Basically, instead of having multiple requests for many files, the files are chained into one http request to cut down on page load time.
Anyone know how to implement this on a LAMP stack? (I saw a similar question was asked already, but it seems to be ASP-specific.)
Thanks!
Update: Both answers are helpful...(my rep isn't high enough to comment yet so I'm adding some parting thoughts here). I also came across another blog post with PHP-specific examples that might be useful. David's build answer, though, is making me consider a different approach. Thanks, David!
There are various ways, the two most obvious would be:
Build a tool like YUI's, which creates a bespoke, unique version based on the components you tick as required, so that you can still serve the file as static. MooTools and jQuery UI both provide package builders like this when you download their packages, to give you the most streamlined and efficient library possible. I'm sure a generic all-purpose tool exists out there.
Create a simple Perl/PHP/Python/Ruby script that serves a bunch of JavaScript files based on the request. So "onerequest.js?load=ui&load=effects" would go to a PHP script that loads the files and serves them with the correct content type. There are many examples of this, but personally I'm not a fan.
I prefer not to serve static files through any sort of script, but I also like to develop my code as 10 or so separate small class files without the cost of 10 HTTP requests. So I came up with a custom build process that combines all the most common classes and functions and then minifies them into a single file, like project.min.js, with a condition in all my views/templates that includes this file in production.
Edit - The "custom build process" is actually an extremely simple Perl script. It reads each of the files passed as arguments and writes them to a new file, optionally passing the whole thing through JSMIN (available in all your favourite languages) automatically.
At the command line it looks like:
perl build-project-master.pl core.js class1.js etc.js /path/to/live/js/file.js
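The same build step could be written in PHP if Perl isn't to hand - a hypothetical equivalent (a minification pass, e.g. JSMin over the combined output, would be an extra step before writing):

```php
<?php
// concatenate source files into one target file, in the order given
function build_bundle(array $sources, string $target): void
{
    $out = '';
    foreach ($sources as $src) {
        $out .= file_get_contents($src) . "\n"; // newline guards against
                                                // files missing a trailing one
    }
    file_put_contents($target, $out);
}

// e.g. build_bundle(['core.js', 'class1.js', 'etc.js'], '/path/to/live/js/file.js');
```

Run once at deploy time, this keeps production serving a single static file while development stays split across many small ones.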
There is a good blog post on this at http://www.hunlock.com/blogs/Supercharged_Javascript.
What you want is Minify. I just wrote a walkthrough for setting it up.
Capistrano is a fairly popular Ruby-based web deployment tool. If you're considering it or already using it, there's a great gem that will figure out CSS and Javascript dependencies, merge, and minify the files.
gem install juicer
From the Juicer GitHub page: it can figure out which files depend on each other and merge them together, reducing the number of HTTP requests per page view and thus improving performance.