Caching fonts on a simple website to prevent FOIT? - php

My website is incredibly slow, or at least the fonts are, and I am getting massive amounts of FOIT. I have solved this by putting a spinner on the front page, which does not display the website until the fonts are loaded. However, when I then visit other pages, I still get FOITs - how come? I thought the font was already loaded and would not have to be loaded again.
If this is not the case, can I somehow cache it and save it once it's loaded?
I am using normal HTML, PHP, CSS.

See this post, which describes how to use a font face loader and monitor to avoid "Flashes of Invisible Text" (FOIT).
To answer specifically regarding caching of fonts once loaded: this should happen automatically, so one of the following issues may be occurring:
Your server is not sending the appropriate cache-control headers for the fonts, resulting in them being re-requested on subsequent pages (see the sketch below).
Subsequent pages are referencing different font files, or font files at different URLs, than your main page.
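For instance, if the fonts happen to be served through a PHP script rather than as static files, a minimal sketch of long-lived cache headers might look like this (the file name, path, and one-year lifetime are all assumptions, not taken from the question):
<?php
// font.php - hypothetical script serving a font file with long-lived cache headers
$font = 'fonts/myfont.woff';                        // assumed path to the font file
header('Content-Type: application/font-woff');      // adjust to the actual font format
header('Cache-Control: public, max-age=31536000');  // allow caching for one year
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
readfile($font);
For statically served fonts, the same headers would normally be configured in the web server instead.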

Related

Images in cache not getting used

I'm working on this web app for which I'd like to cache some static content to lower loading times on a slow connection. Other parts are generated through PHP and AJAX and can therefore not be cached.
I'm successfully using the cache.manifest to cache the static content, such as images, but I notice that the dynamic pages aren't using them. Instead they are downloading the images again, even though they have been cached, which makes the caching useless.
Because my dynamic content is generated based on GET requests, the dynamic pages are in the NETWORK section of the manifest (e.g. "user?*"). Might that have something to do with it? Is there a way to maybe force using the cached images?
Note: the exact full reference to the dynamic pages (e.g. "user?id=22") is of course not in the manifest. Does that make the cache unreachable for those pages?
Seems I've come up with my own solution.
Since caching the dynamic pages (e.g. "user.php?id=22") broke my application, I tried inserting an iframe containing a reference to the manifest instead of referencing it on the page itself. That did not seem to do it for me.
I now include an explicit reference with a wildcard to the dynamic pages in the CACHE section of the manifest. That does seem to work. The pages themselves are now not cached entirely, with parameters et al. (so the application remains functional), but they do use the cached images. It might not be ideal, but it seems to be a pretty simple solution.
I have not yet tried removing those iframes but my guess is that they can be removed.
CACHE MANIFEST
CACHE:
# The wildcard lets the dynamic pages (e.g. user.php?id=22) use the cached resources
./user.php
./user.php?*
etc...
NETWORK:
# Always fetched from the network, never cached
./db_connect.php
etc...

SASS and/or LESS - can I create dynamic CSS files on the fly?

I'm using CakePHP to build my site (if that matters). I have a TON of elements/modules each having their own file and fairly complicated CSS (in some cases).
Currently the CSS is in a massive single CSS file, but for sanity's sake (and the below-mentioned details), I would like to be able to keep the CSS in its own respective file - i.e. css/modules/rotator.css. But with normal CSS, that would mean requesting a TON of CSS files.
So, I started looking into SASS or LESS per recommendation. But it seems these are supposed to be compiled and then uploaded. In my case, though, each page is editable via the CMS, so a page might have 10 modules one minute, then after a CMS change it could have 20, or 5, etc. And I don't want to have to compile the CSS for every module if the page isn't going to use it.
Is there a way I can have a ton of CSS files that all compile on the fly?
Side note: I'd also like to allow the user to edit their own CSS for a page and/or module, which would then load after the default CSS. Is this possible with SASS and/or LESS?
I don't need a complete walkthrough (though that would be awesome), but so far my searches have returned either things that are over my head related to Ruby on Rails (never used) or generic tutorials on each respective CSS language.
Any other recommendations welcome. I'm a complete SASS/LESS noob.
Clarified question:
How do I dynamically (server-side) combine multiple CSS files using LESS? (even a link to a resource that would get me on the right track is plenty!)
If you want to reduce the number of CSS files and you have one huge CSS file containing all the component CSS, just link to it on all pages and make sure you set the cache headers properly.
Browsers will load the file once and use it everywhere. The one pitfall is initial page-load time; if that's not an issue, go with this solution. If it is an issue, consider breaking your compiled CSS down into a few main chunks (e.g. default.css, authoring.css, components.css).
Don't bother trying to make a custom CSS file for each collection of components; you will actually be shooting yourself in the foot by forcing users to re-download the same CSS reorganized in different ways.
Check out lessphp (http://leafo.net/lessphp/). It's a PHP implementation of LESS and can recompile changed files by comparing timestamps.
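A minimal sketch of how that can look (the file names are placeholders; checkedCompile() is the lessphp call that recompiles only when the source is newer than the output):
<?php
// Hypothetical usage of lessphp: compile on request, but only when the .less file changed
require 'lessc.inc.php';

$less = new lessc();
try {
    // Writes css/modules/rotator.css only if css/modules/rotator.less is newer
    $less->checkedCompile('css/modules/rotator.less', 'css/modules/rotator.css');
} catch (Exception $e) {
    error_log('LESS compile failed: ' . $e->getMessage());
}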
Assuming that 'on the fly' means 'on page load', that would likely be even slower than sending multiple files. What I would recommend is recompiling the stylesheets whenever a module is saved.
The issue of including only the necessary modules should be solved by the CMS. It has nothing to do with SASS or LESS.
If your CMS is aware of which modules the current page has, do not run a SASS/LESS compilation on the fly (it will be painfully slow unless you implement caching, which is not a trivial task). Instead, adjust your CMS's logic so that it includes each module's CSS file.
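As a rough illustration of that idea (the module list and paths are made up for the example):
<?php
// Hypothetical CMS logic: emit one <link> per module used on the current page
$modules = array('rotator', 'gallery', 'contact_form'); // normally provided by the CMS
foreach ($modules as $module) {
    echo '<link rel="stylesheet" href="/css/modules/'
        . htmlspecialchars($module) . '.css">' . "\n";
}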
Advanced CMSs like Drupal not only automatically fetch only the necessary CSS files, but also assemble them into a single file and compress it.
And if your CMS is not aware of which modules the current page has (e.g. "modules" are simply HTML code saved into the post body), then you can't really do anything.
UPD: As sequoia mcdowell says in his answer, making users download one large CSS file once is better than making them download a number of smaller CSS files that contain duplicate code. The cumulative size of all those smaller CSS files will turn out to be larger than the size of the single full CSS file.

Compress images on load that are within frames

We have developed this site: http://www.aloha-connect.com/
Our issue is that within this page we have many frames - in some cases, a frame within a frame.
On connections with low bandwidth, the page can take a long time to load. We have tried using the PHP gzip code, but we are noticing that the frame contents are not loading any quicker. We then tried putting the code within the frame pages as well, which didn't make any difference.
Same happens on this page as well http://aloha-connect.com/rates/
Any support/suggestions appreciated.
Take a look at and attempt to fix any high- and medium-priority items from Google PageSpeed.
Get rid of the unnecessary frames (all of them). You tagged this as PHP, so use PHP include() to include content from other files; see the sketch after this list. (As noted by @cryptic, you will have to edit the HTML from your included files so it will display properly.)
Use CSS sprites when you have lots of icons and small images: put your icons in one image, then use CSS to show only the correct icon: http://css-tricks.com/css-sprites/
Combine all your CSS into one CSS file and all your JS into one minified JS file. See: How to minify JS or CSS on the fly
Use a CDN for jQuery.
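To illustrate the include() point from the list above, a minimal sketch (the fragment names are assumptions; each former frame becomes a plain include in one HTML document):
<?php
// Hypothetical replacement for the frameset: one page, no frames
include 'header.php';  // was the top frame
include 'content.php'; // was the main content frame
include 'footer.php';  // was the bottom frame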
Are all the frames served from the same domain?
Keep in mind that browsers restrict the number of simultaneous requests to the same domain. I think this is currently set to 2. This includes HTML pages, images, scripts, and CSS files.
A common workaround is to host some resources on other domains, like cdn1.mydomain.com and cdn2.mydomain.com. This will allow the browser to fetch more resources at once, potentially all of them at once.
If you open the Net tab in Firebug or the Network tab in Chrome you can see which requests are pending.

Loading my website is still slow after gzip, md5-named cache files on the server, and unlimited cache control - need help with caching

So, I want to cache everything on my website, http://apolloinvest.hu.
I am sending gzipped, optimized images, JS, and CSS; the whole site is gzipped; the JS files are loaded deferred with LAB.js; and I have set up browser caching, so everything should be fantastic. But my site still takes about 1 second to load any page, rather than loading instantly.
Could you please help me figure out why?
My REDbot report is: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2F
The Google PageSpeed rank is 99/100 (because I don't want to remove the comments from jQuery UI).
The report for the CSS file: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2Fda232d78aa810382f2dcdceae308ff8e.css
For the JS file: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2F5ec01c6d8ca5258bf9dcef1fc6bfb38c.js
To tell the truth, I don't know what the matter is - my caching or my JS files. Thanks for the help, guys.
Répás
The site is pretty fast as it is, but here are a few possible improvements:
Directly render the HTML page instead of using JavaScript to do so. Put all the <script> elements at the bottom of the HTML document (just before </body>) so that the browser can render the page even before the JavaScript code is downloaded.
You can concatenate all the JavaScript files into one. Currently, http://apolloinvest.hu/475a641fc1d70f7c92efa3488e27568f.js is just empty.
If possible, serve static content such as JavaScript files and styles with Cache-Control and Expires headers far in the future.
A couple of unrelated notes:
The site is not valid HTML. The additional overhead caused by the browser transforming it into valid HTML does not matter much, but the readability (and compatibility) does.
Your stylesheet is restricted to screen. When printed out (or viewed on another non-screen device), the site looks ugly.
The site breaks for users without JavaScript: it just shows a loading bar, forever.
I am sending gzipped, optimized images, JS, and CSS; the whole site is gzipped; the JS files are loaded deferred with LAB.js
THAT IS exactly your problem.
Instead of doing all that fancy stuff, you should have profiled your application first, determined the actual bottleneck, and then optimized the exact part that is causing the slowness.
Let me suggest starting from the "Net" tab of Firebug, where you can watch the actual response times of the requests. It is very likely that your code runs FAST but some JS-based web counter prevents the page from displaying immediately.
If it's the PHP code that takes 1 second to execute, it's time to profile it. Xdebug or simple microtime(true)-based manual profiling can tell you where the problem is. Once you find it, you'll be able to ask a more specific question here.
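A minimal sketch of the microtime()-based approach (the measured sections are placeholders):
<?php
// Crude manual profiling: time the suspected sections of the page
$start = microtime(true);

// ... connect to the database and run queries here ...
$afterDb = microtime(true);

// ... render the page here ...
$afterRender = microtime(true);

error_log(sprintf('db: %.3fs, render: %.3fs',
    $afterDb - $start, $afterRender - $afterDb));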

How to properly preload images, JS and CSS files?

I'm creating a website from scratch, and I was really into this in the late 90's, but the web has changed a lot since then! I'm more of a designer, so when I started putting this site together, I basically did a system of PHP includes to make the site more "dynamic".
When you first visit the site, you'll be presented with a logon screen if you're not already logged on (cookies). If you're not logged on, a page called access.php is introduced.
I thought I'd preload the heaviest images at this point, so that when the user is done logging on, the images are already cached. And this is working as I want. But I still notice that the biggest image isn't rendered immediately anyway, so it seems kinda pointless.
All of this has made me rethink how the site is structured and how scripts and CSS files are loaded. Using Firebug and YSlow with Firefox, I see a few pointers like expires headers and reducing the size of each script. But is this really the culprit?
For example, would this be really really stupid in the main index.php? The entire site is basically structured like this
<?php
require("dbconnect.php"); // database connection, needed on every page
?>
<?php
include("head.php"); // doctype, <head> section, stylesheets and scripts
?>
And below this is basically just the body and the content of the site.
head.php, however, consists of the doctype, the head portion, the linking of two CSS stylesheets, the jQuery library, the jQuery validation engine, Cufon and the Cufon font file, and then the small Cufon.Replace snippet.
The rest of the body comes with the index.php file, but at the bottom of this is again an include of a file called "footer.php", which basically consists of the loading of a couple of jsLoader scripts and a slide panel, and then a JS function.
All of this makes the end page source look like a typical complete webpage, but I'm wondering if any of you can see immediately that "this is really really stupid" and "don't do that, do this instead", etc. :) Are includes a bad way to go?
This site is also pretty image intensive and I can probably do a little more optimization.
But I don't think that it's the primary culprit. YSlow gives me a report of what takes up the most space:
doc(1) - 5.8K
js(5) - 198.7K
css(2) - 5.6K
cssimage(8) - 634.7K
image(6) - 110.8K
I know it looks like it's cssimage(8) that weighs the most, but I've already preloaded these images, and it doesn't really affect the rendering.
To speed things up a little, you could assemble all your images into one image sprite, so that only 1 request downloads all the images. But that requires you to fine-tune your CSS to display just the relevant part of the sprite.
For a better explanation, check out: http://css-tricks.com/css-sprites/
Another answer that could seem a little stupid, but I like to think of this when I make a website: Just Keep It Simple. I mean, does all your JS add real value, are all these images fine, could you display less and make a lighter design? I'm not criticizing your work at all, just a suggestion...
I used the following approach on an extranet project:
Using jQuery and an array of file names, I ajax in all the images, .js and .css files so that they are preloaded in the cache. As I iterate through the array, I update a progress bar on the screen indicating that the site is loading - much like a Flash loader.
It worked well.
What I would do is show a loading page, by default, with pure CSS and HTML, then wait for jQuery to load and preload the images with ImageLoader. Once that's done, redirect to the normal website; since the images will already be in the cache, they won't be loaded again.
Another optimization you can do is minify all JS files and combine all of them except jquery.js. Put jquery.js first in your HTML so it loads first. Also put your SCRIPT tags at the bottom of the HTML.
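A rough sketch of the combining step on the server side (the file list is hypothetical, and real minification would need a separate tool such as JSMin):
<?php
// Hypothetical build step: concatenate every script except jquery.js into one bundle
$scripts = array('js/jquery.validate.js', 'js/cufon.js', 'js/site.js');
$bundle  = 'js/bundle.js';

// Rebuild only when a source file is newer than the existing bundle
$stale = !file_exists($bundle);
foreach ($scripts as $file) {
    if (!$stale && filemtime($file) > filemtime($bundle)) {
        $stale = true;
    }
}
if ($stale) {
    $out = '';
    foreach ($scripts as $file) {
        // Keep the original order and guard against missing semicolons
        $out .= file_get_contents($file) . ";\n";
    }
    file_put_contents($bundle, $out);
}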
It sounds like you have pretty much nailed preloading: if you have loaded a resource once and its expiry header is set correctly, you have preloaded it, no matter what kind of content it is.
File combination can be key to a quick website; each extra file adds load time. In the worst cases of network and server lag you might add up to a second extra for each separate file; more commonly it will be around 100-200 milliseconds per file.
If they are not already minified, minify the scripts and put them in the same file - just remember to keep the order. I have no idea why Ivo Sabev wouldn't include jQuery.
Same thing with the CSS files.
How much have you done about testing image compression? There can really be a gain from trying out different compression settings and comparing size vs. quality. For PNG images, IrfanView with PNGOUT can often make files 25% smaller than other programs. On top of that, a very big gain in size reduction can be achieved by reducing the image to 8-bit colour; with a lot of graphic elements you simply can't tell the difference. Right here on Stack Overflow there is a great example of well-compressed and stacked images in the editor control buttons: http://sstatic.net/so/Img/wmd-buttons.png
