I want to use compression so that I can speed up my website. Which is the best compression available? Is compression using ob_start("ob_gzhandler"); the best? Does it compress the embedded images? Is there a way to compress those images?
P.S.: I don't have access to the server configuration; I only have an FTP account.
I would suggest you use the web server's own compression for that. If you use Apache, for instance, you can check out mod_deflate: http://httpd.apache.org/docs/2.0/mod/mod_deflate.html
Yes, ob_start("ob_gzhandler") is the best you can get if you don't control the server.
The difference between the various methods is mainly in efficiency, e.g. some servers or reverse proxies can cache compressed data to avoid re-compressing it. However, gzip is very fast on today's hardware, so even the most basic compression in PHP is going to be a net gain.
You should also compress your JS and CSS files, but you shouldn't do that by simply wrapping them in a PHP script, because by default PHP makes its output non-cacheable. Any gain from compression will be lost when browsers are forced to re-download those files over and over again.
Ideally you should either use a CDN that will compress them for you or ask your host to enable server-level compression for static text files.
No, ob_gzhandler doesn't compress images if you just link to them from your HTML; those are separate requests that never pass through PHP. It only applies if you load them into a PHP file as binary data, e.g. with file_get_contents(), and output them through the compressed buffer.
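For the pages themselves, the pattern is just a buffer callback at the very top of the script, before any output. A minimal sketch (it assumes zlib.output_compression is off, since the two can't be combined):
<?php
// Compress this page's output; ob_gzhandler checks the browser's
// Accept-Encoding header itself and falls back to plain output if needed.
if (!ini_get('zlib.output_compression')) {
    ob_start('ob_gzhandler');
}
?>
<html>
  <head><title>Example page</title></head>
  <body>Regular page content goes here and is gzipped on the way out.</body>
</html>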
Related
Is there a compression library that has both PHP and JavaScript implementations, such that I can compress/decompress interchangeably between JavaScript and PHP?
I know there are different options to compress data with PHP or JavaScript; the problem is that they store their output with different metadata. This means, for example, that if some data is compressed into the file output.bin using PHP, there is no way I can decompress that stream once I read output.bin using JavaScript (ajax).
PS: I'm not looking for compression of HTTP requests; I have a bunch of files compressed (through PHP) which I need to retrieve and decompress using JavaScript.
Thanks
I don't know if something like node.js would fit into your scheme, but it provides fast zlib compression and decompression. PHP has zlib as well, so that would meet your requirements. If you need a zlib clone written entirely in JavaScript, there are several out there that show up in a Google search, though I don't know if any are complete or correct.
It's going to be trickier on the Javascript side than the PHP side, but there's lots of discussion and links here.
I'd suggest looking into HTTP-level compression. But if you really want to do this by hand, you could use zlib for example. Both PHP and JS have implementations:
http://www.php.net/manual/en/ref.zlib.php
https://github.com/imaya/zlib.js
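If it helps, here is a rough sketch of the PHP side only (file names are placeholders): gzencode() writes a standard gzip container and gzcompress() writes a raw zlib stream, and a JS zlib port such as the zlib.js linked above can inflate either format.
<?php
// build_compressed.php (hypothetical) - compress a payload once and store it
// so the browser can fetch output.bin via ajax and inflate it client-side.
$payload = file_get_contents('data.json');                      // whatever needs shipping
file_put_contents('output.bin', gzencode($payload, 9));         // gzip container
// or: file_put_contents('output.bin', gzcompress($payload, 9)); // raw zlib stream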
In my php.ini, gzip compression is already enabled. May I still use ob_start('ob_gzhandler'); in my PHP pages? And what's the difference between the two?
Enabling compression in php.ini is done like this:
zlib.output_compression = On
this will mean every page PHP serves will be compressed, which may or may not be what you want.
Using ob_start('ob_gzhandler') however will only compress that particular buffer / page and will not affect anything else served by PHP.
Use the second method if you want to compress only certain output. Mixing the two will be pointless, and will probably just use extra CPU cycles trying to compress the already compressed output.
It could be that PHP is clever enough to only do the compression once, but it's still a fruitless exercise to use both approaches together.
It's usually better to enable compression in your web server, although this depends on what you are trying to achieve.
You cannot use the two together.
"You cannot use both ob_gzhandler() and zlib.output_compression. Also note that using zlib.output_compression is preferred over ob_gzhandler()."
http://www.php.net/manual/en/function.ob-gzhandler.php
You can still use ob_start() if you need to buffer your output; you just cannot use the gzhandler callback.
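If you're not sure what the server's php.ini says, a small sketch of a safe pattern is to check at runtime:
<?php
// Attach the gzip callback only when zlib.output_compression is off;
// otherwise use a plain buffer, since the two must not be combined.
if (ini_get('zlib.output_compression')) {
    ob_start();                // zlib already compresses the output
} else {
    ob_start('ob_gzhandler');  // compress this page ourselves
}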
I'm caching the main page of my site as a flat HTML file, and then using .htaccess to serve that file if the user is not logged in (since no user-specific info is displayed), instead of loading my entire PHP framework.
The one downside to this is that PHP doesn't gzip the file automatically, since PHP isn't even being used; it's just a plain HTML file that is loaded by the browser.
I tried this:
$html = gzencode($html);
$fp = fopen($cachefile, 'w');
fwrite($fp, $html);
But when the file URL is loaded in a browser it is just a bunch of weird characters.
EDIT: I guess one simple solution would be to save the files as .php instead of .html, so that PHP's ob_gzhandler compresses the file. I'm wondering if there's a performance gain to be had by serving up HTML that is already gzipped and skipping PHP altogether...
UPDATE: As OP discovered, ob_gzhandler() can deal with this sort of use case, and is not a bad way to go.
ORIGINAL ANSWER: It is likely that, even if you manage to make this work somehow, it will result in poorer performance than simply having the file as a plain text file on your file system.
If you want to take advantage of gzip compression, have Apache do it with mod_deflate. Store your file uncompressed. Apache will deal with the compression issue.
But before you go through any trouble getting that set up: how big can this file be? If it is not a very large HTML file, the overhead of decompressing the file on every request probably outweighs the benefit of the actual compression. You'd really only see a benefit with very large HTML files, and those files would probably cause the browser to come to a screeching halt anyway.
All that said, you can use gzdecode() to uncompress the file, but then you're not serving up the static file--you're running it through PHP before serving it. Again, for this use case, your best bet is probably to just serve up the straight HTML rather than messing with compression.
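For completeness: the "weird characters" come from the missing Content-Encoding header, not from gzencode() itself. If you did route the cached file through a tiny PHP shim instead of serving it statically, a sketch would look like this (the cache path is a placeholder):
<?php
// serve_cached.php (hypothetical) - send a pre-gzipped cache file correctly.
$cachefile = '/path/to/cache/home.html.gz';
header('Content-Type: text/html; charset=utf-8');
$acceptsGzip = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false;
if ($acceptsGzip) {
    header('Content-Encoding: gzip');   // tell the browser the body is gzipped
    readfile($cachefile);
} else {
    // decompress on the fly for the rare client without gzip support
    readfile('compress.zlib://' . $cachefile);
}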
Any tips or methods (encryption, plugins, etc.) to load Flash files (i.e. SWF) as quickly as possible on the client side?
Consider that file sizes are between 5 MB and 10 MB...
Make sure gzip/deflate and HTTP caching are active during transport.
If you can break up your files, you have the option to load a small file first and then bring in additional files as needed, or at least once the initial interface is up. I no longer actively develop for Flash, so this might be a bit outdated, but I always had the best results when I structured my desktop applications with the ultimate goal of easy portability to the web.
You mean, load them more quickly than a standard <embed>? Hardly, seeing as the file is so and so large, and that number of bytes needs to be transferred to the client in any case.
Make sure compression is enabled as stillstanding says. There's probably not much more you can do except work on the Flash files themselves, reducing the quality of embedded images, splitting it into several files, reducing embedded fonts, etc..
I'm currently using PHP to include multiple css (or js) files into a single file (as well as compress the content using GZIP).
E.g. the HTML page calls resources like this...
<link rel="stylesheet" href="Concat.php?filetype=css&files=stylesheet1,stylesheet2,stylesheet3"></link>
<script src="Concat.php?filetype=js&files=script1,script2,script3"></script>
Example of my Concat.php file can be found here: http://dl.dropbox.com/u/3687270/Concat.php (feel free to comment on any problems with the code)
But instead of having to open up my command prompt and run YUI Compressor manually on my CSS/JS files, I want the Concat.php file to handle this, at least for the CSS side of things (I say CSS only because I appreciate that YUI Compressor does minification of variable names and other optimisations, so it isn't feasible to replicate in PHP - but that is part 2 of my question).
I know this can be done with some Regex magic and I haven't a problem doing that.
So, my question has 2 parts, which are:
1.) What are the performance implications of having the server minify a CSS file with preg_replace (or a set of CSS files that could have a few hundred lines of code per file - normally it would be a lot less, but I'm thinking that if the server compresses the file then I wouldn't have to worry too much about extra whitespace in my CSS)?
2.) And how can I get the JavaScript files that are concatenated via my Concat.php file run through YUI Compressor? Maybe run it on the server (I have direct access to the server, so I could install YUI Compressor there if necessary), but would this be a good idea? Surely optimising on the server every time a page is requested will be slow and bad for the server, increase bandwidth, etc.
The reason this has come up is that I'm constantly having to go back and make changes to existing 'compressed/minified' JS/CSS files, which is a real pain because I need to grab the original source files, make changes, then re-minify and upload. Really, I'd rather just edit my files and let the server handle the minification.
Hope someone can help with this.
If your web server is Apache, you should use mod_concat and let Apache take care of compression using gzip:
http://code.google.com/p/modconcat/
You should minify the JS just once and save the minified version on the server.
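If you do want Concat.php itself to trigger that, one hedged approach (assuming Java and the YUI Compressor jar are installed on the server; all paths here are placeholders) is to shell out only when the source has changed and cache the result:
<?php
// minify_js.php (hypothetical) - re-minify only when the source is newer.
$source   = '/var/www/js/combined.js';          // concatenated source
$minified = '/var/www/cache/combined.min.js';   // cached minified output
if (!file_exists($minified) || filemtime($source) > filemtime($minified)) {
    $cmd = sprintf('java -jar /opt/yuicompressor.jar --type js -o %s %s',
        escapeshellarg($minified), escapeshellarg($source));
    exec($cmd, $out, $exitCode);
    if ($exitCode !== 0) {
        copy($source, $minified);   // fall back to the unminified file
    }
}
header('Content-Type: application/javascript');
readfile($minified);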
As suggested in the comments, you could use one of the pre-built scripts for that. They make use of YUI Compressor as well as other solutions, even if you can't run Java on the server.
The first one was probably PHP Speedy, which still works but has been abandoned.
A new one is Minify, which offers a lot of features including general caching solution depending on the server's capabilities (APC, Memcached, File cache).
Another advantage of these projects is that your URLs won't have query strings in them (unlike your current method), which causes trouble in a lot of browsers when it comes to caching. They also take care of gzipping and handling Expires headers for your content.
So I definitely recommend that you try out one of these projects as they offer immediate positive effects, with some simple steps of configuration.
Here's how I recommend you do it:
Turn on GZIP for that specific folder (Web server level)
Use one of the tools to strip out whitespace and concatenate the files. This will serve as a backup for search engines/proxy users who don't have gzip enabled. You'd then cache the output of this, so the expensive regex calls aren't hit again.
The above won't be very expensive CPU-wise if you configure your server correctly. The PHP overhead won't really be much, as you'll have a cached version of the CSS, i.e.:
-- css.php --
if (!isset($_GET['f'])) {
    exit();
}

// set headers (content type, ETag, a future Expires date, etc.)
header('Content-Type: text/css');

$cachefile = '/path/to/cache/css/' . md5($_GET['f']);

// if we've already built this combination, just serve the cached copy
if (file_exists($cachefile)) {
    readfile($cachefile);
    exit();
}

// otherwise concatenate the requested files
// (in real use you'd want to whitelist the allowed file names here)
$files = explode(',', $_GET['f']);
ob_start();
foreach ($files as $file) {
    readfile($file);
}
$css = ob_get_clean();

// remove whitespace etc. (crude minification)
$css = preg_replace('!/\*.*?\*/!s', '', $css);   // strip comments
$css = preg_replace('/\s+/', ' ', $css);         // collapse whitespace

// send it and write the cached file for next time
echo $css;
file_put_contents($cachefile, $css);
exit();
You can then do href="css.php?f=style.css,something.css,other.css" and the script will create a cache file named after the md5 of the files included.
The above example isn't complete; it's more pseudo-code, really.