I'm in the process of optimizing my CMS website's CSS usage, and I decided it would be a good idea to serve the CSS through PHP so I could send the ID of the current page to the stylesheet, e.g.:
<link href="/css/style.php?id=<?php echo $page_id; ?>" type="text/css" rel="stylesheet" />
I did this because I thought it would be a good way to stop CSS meant for one page from being loaded on pages that didn't need it. Then today it struck me: this setup means that when a new user clicks an internal link, they won't be able to use their cached stylesheet and will have to download a new stylesheet for every page.
Obviously this isn't the way forward. Does anybody know a better way of doing it? I've considered using session data, but I'd rather not, because it would break for anybody with cookies turned off. I've also considered using $_SERVER['REQUEST_URI'] in the stylesheet, but I'm worried about false positives.
Any ideas would be appreciated. Thanks!
I am pretty sure that caching is always better than serving up dynamic stylesheets.
The bottleneck in pretty much every web app is bandwidth/latency. So not having to request a file at all is better than serving lots of smaller files that may each require a bit less processing power.
Goal: minimizing the loading time of my website
I did lots of scans and speed tests.
They all indicate that a lot of time is spent uselessly, waiting for an HTTP response.
The time spent actually sending data is very small compared to that.
[Screenshot of a speed test]
I started loading all the scripts and styles in PHP and just echoing them:
<!-- loading a style -->
<style><?php echo file_get_contents("style.css"); ?></style>
<!-- loading a script (similar) -->
<script type="text/javascript"><?php echo file_get_contents("script.js"); ?></script>
That does work; however, the images still slow things down a little. How do I use PHP to read them and echo them as an <img> tag?
(The mentioned website is https://www.finnmglas.com/)
Everyone seeking answers to this:
You should not echo back images with PHP into the main file:
It messes up the loading time (makes it way longer)
And it delays the first render of your website. Even if you have fewer requests, you will want a quick first render (users will notice the delay)
Echoing images from a separate PHP file and pointing an <img> tag at that PHP file, however, should not be a problem ^^
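For reference, here is a minimal sketch of what such a separate PHP file could look like. The file name image.php, the img/ directory, and the name parameter are all assumptions for illustration:
<?php
// image.php - serves a single image file with the right Content-Type.
// basename() strips any directory parts, which blocks path traversal.
$file = __DIR__ . '/img/' . basename($_GET['name'] ?? '');
if (!is_file($file)) {
    http_response_code(404);
    exit;
}
header('Content-Type: ' . mime_content_type($file));
header('Content-Length: ' . filesize($file));
readfile($file);
You would then point the tag at it, e.g. <img src="/image.php?name=bg.png" />.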
Yes, you can put a data URI instead of a URL in the src attribute:
<img src="data:image/jpeg;base64,<?php echo base64_encode($data); ?>" />
but I only recommend this for small images. A better way to handle many small images is to combine them into sprites: https://www.toptal.com/developers/css/sprite-generator/
and for CSS/JavaScript, use a minifying preprocessor such as gulp with gulp-clean-css and gulp-uglify-es.
I noticed the same kind of waiting and lag...
There are many factors: the location of the server, the location of the visitor (in the screenshot's case, the location of the server that ran the test), the routers between internet providers, etc.
I discovered that using a CDN is way better: the browser connects to the CDN, which has many servers around the world; Cloudflare then asks your server and caches the HTML, JS, CSS, images, videos, and so on, so delivery to the endpoint becomes much faster.
You can try https://www.cloudflare.com/
What are the benefits of using an external CSS stylesheet over a PHP file set to Content-type: text/css? If you place this header at the top of a PHP file, I feel like you have so much more potential:
<?php
header("Content-type: text/css");
if ($query_string == "contact_us") {
    echo "#nav {}";
}
?>
(^ that's a .php file). http://shapeshed.com/using_php_to_enhance_css/
If there are no downfalls (and I checked how they were cached in Chrome's Network Panel and I believe it's the same), isn't it kind of like whether to use .html or .php?
Thanks for any feedback.
Here are some differences:
PHP will create extra CPU / memory overhead compared with serving purely static CSS. Probably not all that much, but still a consideration. The more processing you do the bigger a deal it is, obviously.
You can't push PHP files to a CDN
SASS and LESS have been developed to deal with any dynamic features you might need, so PHP isn't likely necessary
Sounds like you're worried about serving more CSS than is needed for certain pages. Realistically it's not an issue since browsers will cache the CSS after the first download.
Additional thoughts:
I wrote a UI template engine that isolates both JS and CSS code to only the specific views on which they are used. If a CSS or JS is used more than once, it is pushed to the "kitchen sink level" and included globally. This limits selector conflicts and also best balances the number of HTTP requests and total download size per request. Also keeping relevant code (i.e. button event listeners or page / element-specific styles) close together helps with more rapid programming, especially for non-expert teams / developers.
That could have been nice before CSS preprocessors such as SASS or LESS.
Dynamic CSS isn't even as useful as dynamic JavaScript, much less dynamic HTML. Having one large file with all the rules in it is more effective than having a file with rules that change since you can more easily cache the former in the client than the latter.
The reality is that there isn't much benefit to this... dynamic CSS doesn't have many valid use cases, and the one you're describing certainly isn't one of them. If you just combine and minify all of your CSS into a single file, it'll cache on the client and be downloaded again only when you bust the cache.
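As a rough sketch of that build step (the file list, the output path, and the deliberately naive "minification" are all assumptions for illustration):
<?php
// build-css.php - combine all stylesheets into one cacheable file.
$files = ['reset.css', 'layout.css', 'theme.css'];
$css = '';
foreach ($files as $f) {
    $css .= file_get_contents($f) . "\n";
}
$css = preg_replace('!/\*.*?\*/!s', '', $css); // strip comments
$css = preg_replace('/\s+/', ' ', $css);       // collapse whitespace
file_put_contents('all.min.css', trim($css));
In production you would more likely use a real minifier, but the point is that the result is one static file the browser can cache.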
In a modular CMS, though, dynamic CSS could be useful. As long as your application can generate a .php URL that consistently produces the exact same CSS for caching purposes, you could dramatically reduce the amount of CSS to download. For example, pages that use 1 theme and 5 modules (each providing CSS) could return the CSS for just that combination, instead of the CSS for 1 theme and 50 modules. That could be the difference between 50KB of CSS and 500KB -- a huge savings for slower connections.
If your website is hand-made, as in a website that has specific goals known ahead of time, then there's really not a good reason to do this as others have answered.
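To make the modular-CMS idea concrete, here is a minimal sketch of a style.php that serves one cacheable stylesheet per theme/module combination. All paths, parameter names, and the cache directory are assumptions:
<?php
// style.php?theme=default&modules=blog,gallery
header('Content-Type: text/css');

$theme   = preg_replace('/[^a-z0-9_-]/i', '', $_GET['theme'] ?? 'default');
$modules = array_filter(explode(',', $_GET['modules'] ?? ''), 'ctype_alnum');
sort($modules); // same module set => same key => same cache entry

$key   = md5($theme . '|' . implode(',', $modules));
$cache = __DIR__ . "/cache/$key.css";

if (!is_file($cache)) {
    $css = file_get_contents(__DIR__ . "/themes/$theme.css");
    foreach ($modules as $m) {
        $css .= "\n" . file_get_contents(__DIR__ . "/modules/$m.css");
    }
    file_put_contents($cache, $css);
}
readfile($cache);
As long as the CMS always emits the module list in the same order, every page with the same combination reuses the same URL and the browser's cached copy.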
I created a private distribution in Cloudfront to prevent hotlinking. I managed to create links to my objects with signed URL which is working fine now.
My only concern is that image links inside my CSS stylesheets are not working because they are not signed. So if I have, for instance:
background-image: url('../img/bg.png');
The background image is not going to show up, since the stylesheet does not include a signed URL and therefore CloudFront refuses to serve the content.
Is there anything I can do to prevent this?
Let me step back and ask a fundamental question: Are you really worried about people hotlinking your images? Really? And if someone does, what is the realistic impact it will have on you? Really?
If you have a legitimate reason for preventing people from hot linking, then I'm not really sure that any CDN service (in this case, CloudFront) is the right solution for you.
Hey, I'm just being honest…
There are a few ways, each with drawbacks.
Instead of a static CSS file, you generate it off a template (or some other smart way to map resources to CloudFront locations). You can use some degree of caching here by using Last-Modified and max-age Cache headers. The hardest solution but arguably the best protection.
Set up a redirection path for all your CSS resources, basically a small script that rewrites the path to CF (take care to only rewrite CSS resources and nothing else). This allows you to keep your current static CSS but opens up a potential hot linking of your redirection script.
Something in between could be a cron script that generates static CSS files with links that expire in 1.5 days, to reduce server load.
It's generally a really bad idea to use private distributions for your CSS files, because adding the (presumably constantly changing) expiration time and signature to CSS URLs prevents browsers from caching them, greatly reducing the usefulness of using CloudFront in the first place.
And even if you use long expiration times, someone who desperately wants to hotlink your CSS background images will simply set up a script to scrape your CSS file and extract the image URLs from there.
You could rotate the entire CDN host name every day or two; then you don't have to change anything in the CSS (assuming the CSS is on the CDN, which it looks like from the example).
http://www.explainthatstuff.com/blocking-cloudfront-hotlinks.html
Then your CSS doesn't need signed URLs and you can still block hotlinking just as effectively.
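A minimal sketch of that rotation, assuming the host names are placeholders you have already configured as alternate CNAMEs on the distribution:
<?php
// Pick a CDN host that changes every two days.
$hosts = ['cdn-a.example.com', 'cdn-b.example.com', 'cdn-c.example.com'];
$cdn   = $hosts[intdiv(time(), 60 * 60 * 24 * 2) % count($hosts)];
?>
<link rel="stylesheet" type="text/css" href="https://<?php echo $cdn; ?>/css/style.css" />
The idea is that stale hotlinks break on their own once you retire an old host name.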
I just recently heard about Base64-encoding images into HTML/CSS. There are pros and cons, but it might be what you're looking for:
Is embedding background image data into CSS as Base64 good or bad practice?
How about using the <base> tag in your HTML?
I didn't try it, but it might help.
Another option is to use PHP to generate your stylesheet. For example, you get the signed URLs for all the images, then you push them as variables into the style rules.
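For instance, a sketch of such a stylesheet, assuming the AWS SDK for PHP v3 (its CloudFrontClient::getSignedUrl method) and placeholder key details:
<?php
// styles.php - a PHP-generated stylesheet with signed image URLs.
require 'vendor/autoload.php';

use Aws\CloudFront\CloudFrontClient;

header('Content-Type: text/css');

$cf = new CloudFrontClient(['version' => 'latest', 'region' => 'us-east-1']);
$bg = $cf->getSignedUrl([
    'url'         => 'https://d111111abcdef8.cloudfront.net/img/bg.png',
    'expires'     => time() + 3600,
    'private_key' => '/path/to/private_key.pem',
    'key_pair_id' => 'APKAEXAMPLE',
]);
?>
body { background-image: url('<?php echo $bg; ?>'); }
Keep in mind the caching caveat raised above: a signature that changes on every request makes this stylesheet effectively uncacheable.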
If I'm writing code in PHP is there a reason why I would use a CSS Preprocessor instead of PHP? For example, I could use PHP in my CSS file by having this in my header:
<link rel="stylesheet" type="text/css" media="all" href="style.php" />
That way I could pass it variables like style.php?color=#000
Or I could use something like LESS to preprocess my CSS. If I use less.js, I'm not sure how I would be able to pass variables like in the previous example.
Now, I've heard that PHP CSS files can't be cached so I can see why that would be a problem, especially if the CSS file was large. But I'd like the ability to pass variables to my CSS sheet.
Can someone tell me a little more about why I'd use one over the other, and/or how I would pass variables to my .less file if I used less.js?
Now, I've heard that PHP CSS files can't be cached so I can see why that would be a problem, especially if the CSS file was large.
PHP CSS files can be cached, but if you pass dynamic values to them, the point of caching is usually lost. If you have a dynamic value that may change with every request, caching becomes pointless.
Also, shoving huge amounts of mostly static CSS through the PHP preprocessor tends to be a waste of server resources.
The much easier approach is usually to have static CSS files, and to declare all dynamic values in the page body:
<!-- the static style sheet declares all the static values -->
<link rel="stylesheet" type="text/css" href="static.css">
<!-- now you override all the dynamic values -->
<style>
.classname { color: <?php echo $color; ?> }
</style>
This way, you can have dynamic values as you please, but you still avoid having a lot of CSS data being processed by PHP.
Any and all HTTP responses CAN be cached; you just need to generate the appropriate cache headers (see RFC 2616).
Interestingly, caching will work very nicely because if your GET values change then you DON'T want the PHP to be cached anyhow. So go ahead and enjoy using them.
Part of your CSS file should be something like:
<?php
header("Content-type: text/css");
?>
Here is a very interesting tutorial on it: http://css-tricks.com/snippets/php/intelligent-php-cache-control/
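In the same spirit, here is a sketch of conditional-GET caching for a PHP-served stylesheet. The underlying file path and the one-day max-age are assumptions:
<?php
$file = 'style.css';
$lastModified = filemtime($file);
$etag = md5_file($file);

header('Content-Type: text/css');
header('Cache-Control: public, max-age=86400'); // clients may reuse it for a day
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
header('ETag: "' . $etag . '"');

// If the client already has this exact version, answer 304 with no body.
$since = isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) : false;
$match = trim($_SERVER['HTTP_IF_NONE_MATCH'] ?? '', '"');
if (($since !== false && $since >= $lastModified) || $match === $etag) {
    http_response_code(304);
    exit;
}
readfile($file);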
Besides browser caching, static files are much better for server-side caching:
Static CSS files can be cached in memory (and even precompressed with some servers like nginx), which enables you to serve them from a cookie-less static-serving domain. Using a web server like nginx can create a huge performance boost since less RAM is used. If you don't have much RAM or have a lot of traffic, the difference can be enormous.
If you have a small website, then it does not matter much.
So, I want to cache everything on my website, http://apolloinvest.hu.
I'm sending gzipped, optimized images, JS, and CSS; the whole site is gzipped; the JS files are loaded deferred with LAB; and I have set up browser caching, so everything should be fantastic. But my site still takes about 1 second to load any page instead of doing it instantly.
Could you please help me figure out why?
My REDbot result is: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2F
My Google PageSpeed score is 99/100 (because I don't want to remove the comments from jQuery UI).
The result for CSS files: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2Fda232d78aa810382f2dcdceae308ff8e.css
For JS files: http://redbot.org/?uri=http%3A%2F%2Fapolloinvest.hu%2F5ec01c6d8ca5258bf9dcef1fc6bfb38c.js
So, to tell the truth, I don't know what the matter is with my caching or my JS. Thanks for the help, guys.
Répás
The site is pretty fast as it is, but here are a few possible improvements:
Directly render the HTML page instead of using JavaScript to do so. Put all the <script> elements at the bottom of the HTML document (just before </body>) so that the browser can render the page even before the JavaScript code is downloaded.
You can concatenate all the JavaScript files into one. Currently, http://apolloinvest.hu/475a641fc1d70f7c92efa3488e27568f.js is just empty.
If possible, serve static content such as JavaScript files and styles with Cache-Control and Expires headers far in the future.
A couple of unrelated notes:
The site is not valid HTML. The additional overhead caused by the browser transforming it to valid HTML does not matter, but the readability (and compatibility) does.
Your stylesheet is restricted to screen. When printed out (or viewed on another non-screen device), it looks ugly.
The site breaks for users without JavaScript. It's just showing a loading bar, forever.
I'm sending gzipped, optimized images, JS, and CSS; the whole site is gzipped; the JS files are loaded deferred with LAB
THAT IS exactly your problem.
Instead of doing all that fancy stuff, you should have profiled your application first, determined the actual bottleneck, and then optimized the exact part that is causing the slowness.
Let me suggest you start with the "Net" tab of Firebug, where you can watch the actual response times of the requests. It is very likely that your code runs FAST but some JS-based web counter prevents the page from displaying immediately.
If it takes 1 second for the PHP code to execute, it's time to profile it. Xdebug or simple microtime(true)-based manual profiling can tell you where the problem is. Once you find it, you'll be able to ask a more specific question here.
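For example, a minimal microtime()-based profiling sketch (the section labels are placeholders):
<?php
$t0 = microtime(true);

// ... the suspected-slow part, e.g. the database queries ...

$t1 = microtime(true);

// ... the template rendering ...

$t2 = microtime(true);

error_log(sprintf('queries: %.3fs, rendering: %.3fs', $t1 - $t0, $t2 - $t1));
Move the checkpoints around until one block accounts for most of the time; that block is your bottleneck.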