I'm caching the main page of my site as a flat HTML file, and then using .htaccess to load that file when the user is not logged in (since no user-specific info is displayed), instead of loading my entire PHP framework.
The one downside to this is that the file isn't gzipped automatically: PHP never runs at all, since the browser is simply served a plain HTML file.
I tried this:
$html = gzencode($html);
$fp = fopen($cachefile, 'w');
fwrite($fp, $html);
fclose($fp);
But when the file url is loaded in a browser it is just a bunch of weird characters.
EDIT: I guess one simple solution would be to save the cache file as .php instead of .html, so that PHP's ob_gzhandler compresses it. I'm wondering if there's a performance gain to be had by serving up HTML that is already gzipped and skipping PHP altogether.
UPDATE: As OP discovered, ob_gzhandler() can deal with this sort of use case, and is not a bad way to go.
ORIGINAL ANSWER: It is likely that, even if you manage to make this work somehow, it will result in poorer performance than simply having the file as a plain text file on your file system.
If you want to take advantage of gzip compression, have Apache do it with mod_deflate. Store your file uncompressed. Apache will deal with the compression issue.
But before you go through any trouble getting that set up: how big can this file be? If it is not a very large HTML file, the per-transaction overhead of compressing and decompressing it probably outweighs the benefit of the compression itself. You'd really only see a benefit with very large HTML files, and those would probably bring the browser to a screeching halt anyway.
All that said, you can use gzdecode() to uncompress the file, but then you're not serving a static file any more--you're running it through PHP before serving it. Again, for this use case, your best bet is probably to just serve up the straight HTML rather than messing with compression.
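If you do go the ob_gzhandler route from the UPDATE above, a minimal sketch (the cache path is just a placeholder) looks something like this:

<?php
// Sketch only: keep the cached page as plain HTML on disk and let a tiny PHP
// wrapper gzip it on the way out. ob_gzhandler compresses only when the browser
// sends Accept-Encoding: gzip, so other clients still get the plain file.
$cachefile = __DIR__ . '/cache/index.html';   // placeholder path

ob_start('ob_gzhandler');
header('Content-Type: text/html; charset=utf-8');
readfile($cachefile);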
We want to merge a lot of PDF files into one big file and send it to the client. However, the resources on our production server are very restricted, so merging all files in memory first and then sending the finished PDF file results in our script being killed because it exhausts its available memory.
The only solution (besides getting a better server, obviously) would be starting to stream the PDF file before it is fully created to bypass the memory limit.
However, I wonder whether that is even possible. Can PDF files be streamed before they're fully created? Or does the PDF format not allow streaming unfinished files, because some headers or other data can only be written once the full contents are known?
If it is possible, which PDF library supports creating a file as a stream? Most libraries that I know of (like TCPDF) seem to create the full file in memory and only output the finished result at the end (i.e. via the $tcpdf->Output() method).
The PDF file format can certainly be streamed; there's nothing in the format itself that prevents it.
As an example, we recently had a customer who required reading a single page over an HTTP connection from a remote PDF, without downloading or reading the whole file. We were able to do this by making many small HTTP requests for specific content within the PDF: we use the trailer at the end of the PDF and the cross-reference table to find the required content without having to parse the whole document.
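To give a rough idea of what that looks like in practice, here is a hedged PHP sketch (the URL is made up) that fetches only the tail of a remote PDF with an HTTP Range request and pulls the startxref offset out of it:

<?php
// Fetch just the last kilobyte of a remote PDF; the trailer and the startxref
// pointer live there, so we never download the whole document.
$url = 'https://example.com/large.pdf';   // made-up URL

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_RANGE          => '-1024',    // suffix range: last 1024 bytes
    CURLOPT_FOLLOWLOCATION => true,
]);
$tail = curl_exec($ch);
curl_close($ch);

// The number after "startxref" is the byte offset of the cross-reference table,
// which maps object numbers to byte offsets; follow-up Range requests can then
// fetch individual objects (e.g. a single page) directly.
if (preg_match('/startxref\s+(\d+)/', $tail, $m)) {
    $xrefOffset = (int) $m[1];
}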
If I understand your problem correctly, it looks like the library you're currently using loads each PDF into memory before creating or streaming out the merged document.
If we look at this problem a different way, the better solution would be for the PDF library to only take references to the PDFs to be merged, then when the merged PDF is being created or streamed, pull in the content and resources from the PDFs to be merged, as-and-when required.
I'm not sure how many PHP libraries can do this, as I'm not too up to date with PHP, but there are probably a few C/C++ libraries that can, and PHP can call those through extensions. The only downside is that they'll likely have commercial licenses.
Disclaimer: I work for the Mako SDK R&D group, hence why I know for sure there are some libraries which will do this. :)
I quickly wrote this up: http://www.ionfish.org/php-shrink/ . A user uploads a .php file with comments and whitespace in it, and it will "minify" it to a smaller file size.
It does exactly what http://devpro.it/remove_phpcomments/ does, except it's web-based and instant: you get a download prompt after you click upload, to save the processed file.
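The core of it is essentially what PHP's built-in php_strip_whitespace() does; a stripped-down sketch of the upload handler (the form field name here is made up) would be:

<?php
// Stripped-down sketch: php_strip_whitespace() returns the PHP source with
// comments and extra whitespace removed, without executing the file.
if (isset($_FILES['file']) && is_uploaded_file($_FILES['file']['tmp_name'])) {
    $minified = php_strip_whitespace($_FILES['file']['tmp_name']);

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($_FILES['file']['name']) . '"');
    echo $minified;
}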
My questions:
Is a 2-megabyte upload limit enough, or should I make it 4, 8, or more?
Is the output (requires you test it with some random PHP) satisfactory, or should it be tweaked?
Would there be any use for this for the general public, and should I add support to minify HTML, CSS, JS, and even C++ and Python etc. ?
EDIT: Changed to 250K for now, will see if it suffices.
A file that needs such a minification is proof of bad practice.
The file size of a single php file should never be a problem when using best practice.
You should not be uploading files during your deployment anyway. Instead you should be checking out files from your VCS, and you don't want 'minified' files in your VCS.
Such minification will not improve site performance either, since every serious project uses opcode caching.
Conclusion: Such a service is not needed.
I want to use compression so that I can speed up my website. Which is the best compression available? Is compression using ob_start("ob_gzhandler"); the best? Does it compress the embedded images? Is there a way to compress those images?
P.S.: I don't have access to the server configuration; I only have an FTP account.
I would suggest you use the web server's functions for that. If you use Apache, you can check out http://httpd.apache.org/docs/2.0/mod/mod_deflate.html for instance.
Yes, ob_start("ob_gzhandler") is the best you can get if you don't control the server.
The difference between various methods is mainly in efficiency, e.g. some servers or reverse proxies are able to cache compressed data to avoid re-compressing them again. However, gzip is very fast for today's hardware, so even most basic compression in PHP is going to be a net gain.
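A quick way to sanity-check that claim on your own content is to time gzencode() directly; the sample HTML below is just filler:

<?php
// Rough micro-benchmark: how long does gzip take, and how much does it save?
$html = str_repeat('<p>Hello, this is sample page content.</p>', 2000);

$start   = microtime(true);
$gzipped = gzencode($html, 6);                    // a typical compression level
$elapsed = (microtime(true) - $start) * 1000;

printf("%d bytes -> %d bytes in %.2f ms\n", strlen($html), strlen($gzipped), $elapsed);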
You should also compress your JS and CSS files, but you shouldn't do that by simply wrapping them in a PHP script, because by default PHP makes its output non-cacheable. Any gain from compression will be lost when browsers are forced to re-download those files over and over again.
Ideally you should either use a CDN that will compress them for you or ask your host to enable server-level compression for static text files.
No, ob_gzhandler doesn't compress images if you just link to them in your HTML; it only touches them if you load them into a PHP file as binary data, for example with file_get_contents(). There is also little point: formats like JPEG and PNG are already compressed, so gzipping them gains almost nothing.
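If you do end up serving CSS or JS through PHP, a minimal sketch of a wrapper that keeps the file cacheable while gzipping it could look like this (the path and cache lifetime are assumptions):

<?php
// Hypothetical css.php wrapper: serves style.css gzipped while keeping it
// cacheable, which a bare ob_gzhandler wrapper would not do by default.
$file         = __DIR__ . '/style.css';           // assumed path
$lastModified = filemtime($file);

header('Content-Type: text/css');
header('Cache-Control: public, max-age=604800');  // one week, as an example
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

// Answer conditional requests without resending the body.
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    http_response_code(304);
    exit;
}

ob_start('ob_gzhandler');   // compress only if the client accepts gzip
readfile($file);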
I'm currently using PHP to combine multiple CSS (or JS) files into a single file (as well as compress the content using gzip).
E.g. the HTML page calls resources like this...
<link rel="stylesheet" href="Concat.php?filetype=css&files=stylesheet1,stylesheet2,stylesheet3">
<script src="Concat.php?filetype=js&files=script1,script2,script3"></script>
Example of my Concat.php file can be found here: http://dl.dropbox.com/u/3687270/Concat.php (feel free to comment on any problems with the code)
But instead of having to open a command prompt and run YUI Compressor manually on my CSS/JS files, I want the Concat.php file to handle this, at least for the CSS side of things (I say CSS only because I appreciate that YUI Compressor does variable-name minification and other optimisations that aren't feasible to replicate in PHP; that is part 2 of my question).
I know this can be done with some Regex magic and I haven't a problem doing that.
So, my question has 2 parts, which are:
1.) What are the performance implications of having the server minify a CSS file (or a set of CSS files) with preg_replace on each request? Each file could have a few hundred lines of code; normally it would be a lot less, but I'm thinking that if the server compresses the file then I wouldn't have to worry too much about extra whitespace in my CSS.
2.) And how can I get the JavaScript files that are concatenated via my Concat.php file run through YUI Compressor? Maybe run it on the server (I have direct access to the server, so I could install YUI Compressor there if necessary), but would this be a good idea? Surely optimising on the server every time a page is requested will be slow, bad for the server, increase bandwidth, etc.
The reason this has come up is that I'm constantly having to go back and make changes to existing 'compressed/minified' JS/CSS files, which is a real pain because I need to grab the original source files, make changes, then re-minify and upload, when really I'd rather just edit my files and let the server handle the minification.
Hope someone can help with this.
If your web server is Apache, you should use mod_concat and let Apache take care of compression using gzip:
http://code.google.com/p/modconcat/
You should minify the JS just once and save the minified version on the server.
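For example, a one-off build step along these lines (file names and the jar version are assumptions) keeps pre-minified copies on disk so nothing has to be compressed at request time:

<?php
// One-off build step: run YUI Compressor once per source file and keep the
// minified copies next to the originals. Assumes Java and the YUI Compressor
// jar are available on the server; file names are placeholders.
$sources = ['script1.js', 'script2.js', 'script3.js'];

foreach ($sources as $src) {
    $out = preg_replace('/\.js$/', '.min.js', $src);
    $cmd = sprintf(
        'java -jar yuicompressor-2.4.8.jar --type js -o %s %s',
        escapeshellarg($out),
        escapeshellarg($src)
    );
    shell_exec($cmd);
}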
As suggested in the comments, you could use one of the pre-built scripts for that. They make use of YUI Compressor, as well as other solutions for when you can't run Java on the server.
The first one was probably PHP Speedy, which still works but has been abandoned.
A newer one is Minify, which offers a lot of features, including a general caching solution that depends on the server's capabilities (APC, Memcached, file cache).
Another advantage of these projects is that your URLs won't have query strings in them (contrary to your current method), which cause caching trouble in a lot of browsers. They also take care of gzipping and setting Expires headers for your content.
So I definitely recommend that you try out one of these projects as they offer immediate positive effects, with some simple steps of configuration.
Here's how I recommend you do it:
Turn on GZIP for that specific folder (Web server level)
Use one of the tools to strip out whitespace and concatenate the files. This will serve as a backup for search engines/proxy users who don't have gzip enabled. You'd then cache the output of this, so the expensive regex calls aren't hit again.
The above won't be very expensive CPU-wise if you configure your server correctly. The PHP overhead won't really be much, as you'll have a cached version of the CSS, i.e.:
-- css.php --
<?php
if (!isset($_GET['f'])) {
    exit();
}

$cacheFile = '/path/to/cache/css/' . md5($_GET['f']);

// Serve the previously built file if we have it and skip all the work below.
if (file_exists($cacheFile)) {
    header('Content-Type: text/css');
    readfile($cacheFile);
    exit();
}

// Concatenate the requested files. In real code, validate each entry of
// $_GET['f'] against a whitelist so arbitrary files can't be read.
$files = explode(',', $_GET['f']);
ob_start();
foreach ($files as $file) {
    readfile($file);
}
$css = ob_get_clean();

// Remove whitespace etc. (the expensive part, only paid on a cache miss).
$css = preg_replace('/\s+/', ' ', $css);

// Set headers (content type, ETag, future expiration dates, ...).
header('Content-Type: text/css');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');

echo $css;

// Write the cached copy for the next request.
file_put_contents($cacheFile, $css);
exit();
You can then use href="css.php?f=style.css,something.css,other.css" and the script will create a cache file named after the md5 of the requested file list.
The above is still only a sketch (the paths are placeholders and the file list ought to be validated against a whitelist), but it shows the idea.
I have a unique problem which is proving difficult to solve using Google. I am consolidating all of my JavaScript and CSS into separate PHP files, which use require_once() to pull the contents of the individual files in. The JavaScript file looks something like this:
<?php
header('Content-Type: text/javascript');
require_once('jquery.form.js');
require_once('jquery.jqtransform.js');
require_once('jquery.validate.js');
?>
My specific problem is that web browsers will 'see' that this is a dynamic page, because of the .php file extension, and then request the content anew each time a page on the site is loaded. What I am trying to do is get the time of the last request from the browser and then check each file's modification time to see whether I really do need to send the file contents again. It is proving difficult to find the time of the last request by the user. Also, I have not yet started to tackle the problem of finding the last-modified date of the included files, so any information on reading a file's details on the server would also be appreciated.
Just to be clear, the reason I am doing this is because (I think) it takes advantage of the gzip compression better than individually gzipped files.
Thanks in advance
I wrote a series of posts about this issue specifically. See Supercharging Javascript in PHP and Supercharging CSS in PHP. These cover:
Combining files;
Gzipping best practices;
Caching best practices; and
Versioning output.
Your premise is incorrect. Browsers don't "see" the PHP file extension and decide not to cache things. See http://www.enhanceie.com/redir/?id=httpperf for information on how browsers really work.
You should set an ETag on your response, and then you can simply check the If-None-Match request header and return a 304 if the content is unchanged.
Browsers don't determine whether a page or a file is dynamic or static by its extension; the response headers do. Just set the proper headers so the browser knows it can cache the results.
Also, ditch the closing ?>. It's not required and is bad practice.
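A hedged sketch of the ETag/304 approach for the combined script (the file names are the ones from the question; the cache lifetime is an assumption):

<?php
// Emit an ETag derived from the newest mtime of the included scripts and answer
// conditional requests with 304 so browsers reuse their cached copy.
$scripts = ['jquery.form.js', 'jquery.jqtransform.js', 'jquery.validate.js'];

$newest = max(array_map('filemtime', $scripts));
$etag   = '"' . md5($newest . implode(',', $scripts)) . '"';

header('Content-Type: text/javascript');
header('ETag: ' . $etag);
header('Cache-Control: public, max-age=86400');   // one day, as an example

if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    http_response_code(304);   // unchanged: the browser keeps its cached copy
    exit;
}

ob_start('ob_gzhandler');      // gzip the combined output if the client accepts it
foreach ($scripts as $script) {
    readfile($script);
}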
Alister Bulman just mentioned a neat library solution for this problem but placed it as a comment. I'm repeating his comment as an answer since I found it valuable:
Minify - code.google.com/p/minify - is a library designed to do what is required here - concatenate the files and send the appropriate headers, also trimming down the contents, and quite possibly gzipping them, while caching the results on disk. – Alister Bulman Jan 10 '10 at 10:44
You can enable automatic gzipping of files using Apache's mod_deflate.
You can also use Apache's mod_rewrite to refer to these files in the HTML as .js files and rewrite the requests to the PHP files, avoiding your caching issues.
Something like this:
RewriteEngine On
RewriteRule ^(.*)\.js$ $1.php
Put this code in a .htaccess file in your directory.