If JavaScript and CSS were written inline inside pages, it would cut down the number of HTTP requests and therefore make the page load faster. I feel like I am missing something, because it seems like any organization interested in lightning-quick pages would do this. However, when I look at the source code of various sites, I don't see tons of CSS and JavaScript embedded in their pages.
Questions:
What errors are in my statements above?
What are the drawbacks of this approach (shown in the title via pseudocode)?
If the data is in an external file it can be cached and reused on other pages (or the same page, revisited) without having to fetch it over the network again.
You get a minor performance penalty on the first page in exchange for a major performance enhancement on subsequent pages.
Modularity is a major concern:
I can pick and choose which JavaScript and CSS files I want per page; otherwise I'd need a ton of combined CSS and JavaScript files covering every different configuration, which is just messy.
A cached file can also be served to another visitor faster than one that has to be fetched in full again.
Where you will find an example of this happening is when sites pack their images together into one PNG file (a CSS sprite) and then use CSS to slice out the bits they want for buttons and so on.
Another aspect, and not only for inline CSS and JavaScript: when I write code, I hate repeating myself. Repetition leads to errors, is difficult to maintain (update/edit), and wastes time and space. Writing CSS or JavaScript once, in a file that gets downloaded once, is less error-prone, easier to maintain, and less wasteful.
Related
I'm using CakePHP to build my site (if that matters). I have a TON of elements/modules each having their own file and fairly complicated CSS (in some cases).
Currently the CSS is in a massive single CSS file, but for sanity's sake (and the details mentioned below), I would like to keep each module's CSS in its own respective file, i.e. css/modules/rotator.css. But with plain CSS, that would mean calling a TON of CSS files.
So, I started looking into SASS and LESS per recommendation. But it seems these are supposed to be compiled and then uploaded. In my case, though, each page is editable via the CMS, so a page might have 10 modules one minute, then after a CMS change it could have 20 or 5, etc. And I don't want to have to compile in the CSS for every module if the page isn't going to use it.
Is there a way I can have a ton of CSS files that all compile on the fly?
Side note: I'd also like to allow the user to edit their own CSS for a page and/or module, which would then load after the default CSS. Is this possible with SASS and/or LESS?
I don't need a complete walkthrough (though that would be awesome), but so far my searches have returned either things that are over my head relating to Ruby on Rails (which I've never used) or generic tutorials on each respective CSS language.
Any other recommendations welcome. I'm a complete SASS/LESS noob.
Clarified question:
How do I dynamically (server-side) combine multiple CSS files using LESS? (even a link to a resource that would get me on the right track is plenty!)
If you want to reduce the number of CSS files and you have one huge CSS file containing all the component CSS, just link to it on all pages and make sure you set cache headers properly.
Browsers will load the file once and reuse it everywhere. The one pitfall is initial page-load time; if that's not an issue, go with this solution. If it is an issue, consider breaking your compiled CSS into a few main chunks (e.g. default.css, authoring.css, components.css).
Don't bother trying to build a custom CSS file for each collection of components; you would actually be shooting yourself in the foot by forcing users to re-download the same CSS reorganized in different ways.
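As a rough sketch of what "set cache headers properly" can look like when the stylesheet is served through PHP (the file name is a placeholder), send a far-future Cache-Control header and rename/version the file whenever its contents change:

<?php
// Sketch: serve the compiled stylesheet with long-lived cache headers.
// Assumes all.min.css gets a new name whenever the CSS changes, so a
// one-year lifetime is safe.
header("Content-type: text/css");
header("Cache-Control: public, max-age=31536000");
readfile("all.min.css");
?>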
Check out lessphp (http://leafo.net/lessphp/). It's a PHP implementation of LESS and can recompile changed files by comparing timestamps.
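A minimal sketch of that timestamp-based recompile, using the ccompile() helper described in the lessphp documentation (file paths are placeholders):

<?php
require "lessc.inc.php";
// ccompile() recompiles only when the .less source is newer than the
// compiled .css output, so unchanged files cost almost nothing.
lessc::ccompile("css/modules/rotator.less", "css/modules/rotator.css");
?>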
Assuming that 'on the fly' means 'on pageload', that would likely be even slower than sending multiple files. What I would recommend is recompiling the stylesheets whenever a module is saved.
The issue of including only the necessary modules should be solved by the CMS. It has nothing to do with SASS or LESS.
If your CMS is aware of which modules the current page has, do not run a SASS/LESS compilation on the fly (it will be painfully slow unless you implement caching, which is not a trivial task). Instead, adjust your CMS's logic so that it includes each module's CSS file, as in the sketch below.
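A minimal sketch of that idea, assuming a hypothetical $page->getActiveModules() method that returns the names of the modules used on the current page:

<?php
// Hypothetical CMS hook: emit one <link> per module actually on this page.
// getActiveModules() is an assumed method; adapt it to your CMS's real API.
foreach ($page->getActiveModules() as $module) {
    $href = "/css/modules/" . rawurlencode($module) . ".css";
    echo '<link rel="stylesheet" type="text/css" href="' . $href . '">' . "\n";
}
?>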
Advanced CMSs like Drupal not only automatically fetch only necessary CSS files, but also assemble them into a single file and compress it.
And if your CMS is not aware of which modules the current page has (e.g. "modules" are simply HTML code saved into the post body), then you can't really do anything.
UPD: As sequoia mcdowell says in his answer, making users download one large CSS file once is better than making them download a number of smaller CSS files that contain duplicate code. The cumulative size of all those smaller files will turn out to be larger than the size of the single full CSS file.
What are the benefits of using an external CSS stylesheet over a PHP file whose Content-type is set to text/css? If you place this header at the top of a PHP file, I feel like you have so much more potential:
<?php
header("Content-type: text/css");
// "==" compares ("=" would assign), and raw CSS can't sit inside a PHP
// block -- it has to be echoed or placed outside the PHP tags.
if ($_SERVER["QUERY_STRING"] == "contact_us") {
    echo "#nav { /* page-specific rules */ }";
}
?>
(^ that's a .php file). http://shapeshed.com/using_php_to_enhance_css/
If there are no downfalls (and I checked how both are cached in Chrome's Network panel and I believe it's the same), isn't it kind of like choosing between .html and .php?
Thanks for any feedback.
Here are some differences:
PHP will create extra CPU / memory overhead compared with serving purely static CSS. Probably not all that much, but still a consideration. The more processing you do the bigger a deal it is, obviously.
You can't push PHP files to a static CDN; they have to execute on your own server.
SASS and LESS have been developed to cover most dynamic features you might need, so PHP isn't likely to be necessary.
Sounds like you're worried about serving more CSS than is needed for certain pages. Realistically it's not an issue since browsers will cache the CSS after the first download.
Additional thoughts:
I wrote a UI template engine that isolates both JS and CSS code to only the specific views on which they are used. If a CSS or JS file is used more than once, it is pushed up to the "kitchen sink" level and included globally. This limits selector conflicts and also best balances the number of HTTP requests against the total download size per request. Also, keeping relevant code (i.e. button event listeners or page- and element-specific styles) close together helps with more rapid programming, especially for non-expert teams and developers.
That would have been nice before CSS preprocessors such as SASS and LESS came along.
Dynamic CSS isn't even as useful as dynamic JavaScript, much less dynamic HTML. Having one large file with all the rules in it is more effective than a file whose rules change, since the client can cache the former much more easily than the latter.
The reality is that there isn't much benefit to this... dynamic CSS doesn't have many valid use cases, and the one you're describing certainly isn't one of them. If you just combine and minify all of your CSS into a single file, it'll be cached on the client and downloaded again only when you bust the cache.
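As a sketch of that combine-and-minify step (file names are placeholders, and a real minifier such as the YUI Compressor mentioned elsewhere in this thread handles edge cases this one does not):

<?php
// Naive build step: concatenate the stylesheets, strip comments,
// collapse whitespace, and write out a single cacheable file.
$files = array("reset.css", "layout.css", "modules.css");
$css = "";
foreach ($files as $file) {
    $css .= file_get_contents($file) . "\n";
}
$css = preg_replace('!/\*.*?\*/!s', '', $css); // remove /* ... */ comments
$css = preg_replace('/\s+/', ' ', $css);       // collapse runs of whitespace
file_put_contents("all.min.css", trim($css));
?>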
In a modular CMS, this could be useful. As long as your application can generate a .php URL that consistently returns the exact same CSS for caching purposes, you can dramatically reduce the amount of CSS to download; see the sketch below. For example, a page that uses 1 theme and 5 modules (each providing CSS) could receive the CSS for exactly that combination, instead of the CSS for 1 theme and 50 modules. That could be the difference between 50KB of CSS and 500KB, which is a huge savings on slower connections.
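A sketch of such an endpoint, assuming module stylesheets live under css/modules/ as in the question above. The page generator should sort the module list when building the URL so that the same combination always maps to the same URL (and thus the same cache entry):

<?php
// Hypothetical combiner: /css.php?m=rotator,slider
header("Content-type: text/css");
header("Cache-Control: public, max-age=86400");
$modules = isset($_GET["m"]) ? explode(",", $_GET["m"]) : array();
sort($modules); // deterministic output order for a given set
foreach ($modules as $module) {
    $path = "css/modules/" . basename($module) . ".css"; // basename() blocks ../ tricks
    if (is_file($path)) {
        readfile($path);
        echo "\n";
    }
}
?>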
If your website is hand-made, as in a website that has specific goals known ahead of time, then there's really not a good reason to do this as others have answered.
Today I read some articles about combining multiple JS and CSS files into one in order to lower bandwidth and the number of HTTP requests. Is there an analogous situation with PHP files? Is it better to create master pages that include a few PHP files (header.php, footer.php, etc.) in order to have well-formatted, readable code than to have just one large index.php?
PHP is run on the server, and as such, only the resulting HTML is sent to the client. Keep your files separated for clarity; it makes no difference to the number of HTTP requests or to bandwidth.
Like Xeon06 mentioned, PHP is a server-side script and doesn't affect bandwidth, which is the size of the content transferred over HTTP. Bandwidth depends on the HTML, CSS, JS, graphics, Flash, and other client-side files. Because browsers cache these kinds of files, bandwidth is reduced: the browser gets the file content from its cache instead of downloading it from the server again.
If I were in your shoes, I would still use includes for headers, footers, or whatever is common to most pages. It just makes it easier to maintain content and code. For example, if you have a feedback form on all pages and you want to edit a field name, link, or something else, it's easier to edit one file than to "Find and Replace" through every file you have, which doesn't work well anyway.
I run a PHP-based landing page with a big header graphic and, below it, a div with the common JavaScript sharing buttons like Twitter, StumbleUpon, and Facebook. These buttons are slowing down the loading of everything below them.
So I'd like the important parts of the website to show up first and the less important parts to load at the end.
How do I achieve that?
Thanks
Michael
The simplest way is to move all the JavaScript code to the bottom of the document. It may require some modification (i.e. using DOM functions instead of document.write) or restyling, but it will make the site usable before these gadgets are fully loaded. Setting the async and defer attributes, as sketched below, is an elegant, but also more complicated, alternative.
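For example, the widget scripts could be marked so they never block rendering (the script names here are placeholders):

<!-- Runs in order, but only after the document has been parsed: -->
<script src="sharing-buttons.js" defer></script>
<!-- Runs whenever it arrives, in no guaranteed order: -->
<script src="analytics.js" async></script>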
Concatenating multiple JavaScript documents helps too, especially with older browsers and their low number of concurrent connections. You can also combine graphics (mostly icons, logos, etc.) with CSS sprites. On the cutting edge, data: URLs allow embedding images directly into the source of the HTML document. Read more about these techniques in the Yahoo Best Practices.
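A sketch of the data: URL technique from PHP (the icon path is a placeholder); note that an inlined image can't be cached separately from the page, so it suits small, rarely reused graphics best:

<?php
// Embed a small image directly in the HTML, saving one HTTP request.
$data = base64_encode(file_get_contents("img/icon.png"));
echo '<img src="data:image/png;base64,' . $data . '" alt="icon">';
?>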
Additional speedup can be gained by gzipping HTML and CSS documents. JavaScript files can be compressed too, but minification (for example with the YUI compressor) tends to achieve even greater gains. You should also specify caching directives for static resources.
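A minimal sketch of both ideas for PHP-generated pages (server-level compression such as mod_deflate is usually the cleaner place for this):

<?php
// Compress the generated HTML on the fly and let clients cache it briefly.
ob_start("ob_gzhandler"); // falls back to plain output if the client lacks gzip
header("Cache-Control: public, max-age=3600"); // one hour; tune per resource
?>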
On the server side, you should really be using a PHP bytecode cache like APC. Google has some additional tips on PHP best practices.
General resources:
Google: Let's make the web faster
Yahoo: Best Practices for Speeding Up Your Web Site
Great question, and one which better authors than I have written mountains about. In fact, give this article by Yahoo! staff a shot - it's the definitive document on the subject, and pretty easy to follow:
http://developer.yahoo.com/performance/rules.html
Another answer suggests moving your JavaScript to the bottom of the page. This is likely to help, but it won't solve all your problems and won't do much to help your images load. Following Yahoo!'s guide, you would do well to trade numerous smaller images for a single CSS sprite sheet to cut down on HTTP request overhead, and to make sure caching is enabled for all your content.
You can also (advanced!) do clever, tricky stuff like putting only the important content in the document itself and having JavaScript (located at the bottom of the page) dynamically load the "extras" via AJAX after the initial load is complete; see the sketch below. Spiffy!
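A rough sketch of that trick with jQuery (the #extras container and the extras.php endpoint are hypothetical):

<script>
// After the initial page load, pull in the non-critical markup.
$(window).load(function () {
    $("#extras").load("extras.php"); // injects the response into #extras
});
</script>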
Like others are saying, you need to put JavaScript at the bottom. Maybe the head.js library makes this task a little bit easier for you?
I'm creating a website from scratch. I was really into this in the late 90s, but the web has changed a lot since then! I'm also more of a designer, so when I started putting this site together, I basically built a system of PHP includes to make the site more "dynamic".
When you first visit the site, you'll be presented with a logon screen if you're not already logged on (cookies). If you're not logged on, a page called access.php is introduced.
I thought I'd preload the heaviest images at this point, so that when the user is done logging on, the images are already cached. And this is working as I want. But I still notice that the biggest image isn't rendered immediately anyway, so it seems kind of pointless.
All of this has made me rethink how the site is structured and how scripts and CSS files are loaded. Using Firebug and YSlow with Firefox, I see a few pointers like expires headers and reducing the size of each script. But is this really the culprit?
For example, would this be really really stupid in the main index.php? The entire site is basically structured like this:
<?php
require("dbconnect.php");
?>
<?php
include ("head.php");
?>
And below this is basically just the body and the content of the site.
head.php, however, consists of the doctype, the head portion, links to two CSS stylesheets, the jQuery library, the jQuery validation engine, Cufon with its font file, and then the small Cufon.replace snippet.
The rest of the body comes with the index.php file, but at the bottom of it is an include of a file called footer.php, which basically consists of a couple of jsLoader scripts, a slide panel, and a JS function.
All of this makes the final page source look like a typical complete webpage, but I'm wondering if any of you can see immediately that "this is really really stupid" and "don't do that, do this instead", etc. :) Are includes a bad way to go?
This site is also pretty image intensive and I can probably do a little more optimization.
But I don't think it's the primary culprit. YSlow gives me a report of what takes up the most space:
doc(1) - 5.8K
js(5) - 198.7K
css(2) - 5.6K
cssimage(8) - 634.7K
image(6) - 110.8K
I know it looks like cssimage(8) weighs the most, but I've already preloaded those images, and it doesn't really affect the rendering.
To speed things up a little, you could assemble all your images into one sprite image, so that a single request downloads all of them. But that requires you to fine-tune your CSS to display just the relevant subset of the image.
For a better explanation, check out: http://css-tricks.com/css-sprites/
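The fine-tuning mentioned above boils down to shifting the background position per element; a minimal sketch (the file name and offsets are placeholders):

/* One image, many icons: each rule shows a different 16x16 slice. */
.icon          { background: url(sprites.png) no-repeat; width: 16px; height: 16px; display: inline-block; }
.icon-home     { background-position: 0 0; }
.icon-search   { background-position: -16px 0; }
.icon-settings { background-position: -32px 0; }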
Another suggestion that could seem a little stupid, but I like to think of it when I make a website: Just Keep It Simple. I mean, does all your JS add real value? Are all those images really needed, could you display less, make a lighter design? I'm not criticizing your work at all, just a suggestion...
I used the following approach on an extranet project:
Using jQuery and an array of file names, I ajax in all the images, .js, and .css files so that they are preloaded into the cache. As I iterate through the array, I update a progress bar on the screen indicating that the site is loading, much like a Flash preloader.
It worked well.
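A rough sketch of that pattern (the file list and #progress element are hypothetical). For images, new Image() warms the cache more reliably than a plain AJAX fetch:

<script>
var assets = ["img/header-big.png", "img/background.png"]; // assumed preload list
var loaded = 0;
$.each(assets, function (i, src) {
    var img = new Image();
    img.onload = img.onerror = function () {
        loaded++;
        $("#progress").css("width", (loaded / assets.length * 100) + "%");
        if (loaded === assets.length) $("#progress").fadeOut();
    };
    img.src = src; // the browser fetches and caches the file
});
</script>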
What I would do is show the loading page, built with pure CSS and HTML, by default, then wait for jQuery to load and preload the images with ImageLoader. Once that's done, redirect to the normal website; since the images are already in the cache, they won't be loaded again.
Another optimization you can do is to minify all JS files and combine all of them except jquery.js. Put jquery.js first in your HTML so it loads first. Also, put your SCRIPT tags at the bottom of the HTML.
It sounds like you have pretty much nailed preloading: if you have loaded a resource once and the expiry header is set correctly, it is preloaded, no matter what kind of content it is.
File combination can be key to a quick website, as each extra file adds load time. In the worst cases of network and server lag, each separate file can add up to a second; more commonly it adds around 100-200 milliseconds per file.
If they're not already minified, minify the scripts and put them in the same file, just remember to keep the order. I have no idea why Ivo Sabev wouldn't include jQuery.
Same thing with the CSS files.
How much have you done about testing image compression? There can be a real gain from trying out different compression settings and comparing size vs. quality. For PNG images, IrfanView with PNGOUT can often make files 25% smaller than other programs. On top of that, a very big reduction can be achieved by reducing the image to 8-bit colour; with a lot of graphic elements you simply can't tell the difference. Right here on Stack Overflow there is a great example of well-compressed, sprited images in the editor control buttons: http://sstatic.net/so/Img/wmd-buttons.png