In my php.ini, gzip compression is already enabled. Can I still use ob_start('ob_gzhandler'); in my PHP pages? And what's the difference between the two?
Enabling compression in php.ini can be done like this:
zlib.output_compression = On
This will mean every page PHP serves will be compressed, which may or may not be what you want.
Using ob_start('ob_gzhandler') however will only compress that particular buffer / page and will not affect anything else served by PHP.
Use the second method if you want to compress only certain output. Mixing the two will be pointless, and will probably just use extra CPU cycles trying to compress the already compressed output.
It could be that PHP is clever enough to only do the compression once, but it's still a fruitless exercise to use both approaches together.
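For illustration, here is a minimal sketch of the per-page approach; the ini_get() check is just a guard (an assumption about how you might structure it) so the two mechanisms never overlap:

<?php
// Compress this page only if zlib.output_compression is not already on.
if (!ini_get('zlib.output_compression')) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}
echo '<html><body>Hello, compressed world!</body></html>';
// The buffer is flushed (and compressed, if applicable) when the script ends.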
It's usually better to enable compression in your web server, although this depends on what you are trying to achieve.
You cannot use the two together.
"You cannot use both ob_gzhandler() and zlib.output_compression. Also note that using zlib.output_compression is preferred over ob_gzhandler()."
http://www.php.net/manual/en/function.ob-gzhandler.php
You can still use ob_start() if you need to buffer your output, you just cannot use the gzhandler callback.
I want to use compression so that I can speed up my website. Which is the best compression available? Is compression using ob_start("ob_gzhandler"); the best? Does it compress embedded images? Is there a way to compress those images?
P.S.: I don't have access to the server configuration. I only have an FTP account.
I would suggest you use the web server's functions for that. If you use Apache, you can check out http://httpd.apache.org/docs/2.0/mod/mod_deflate.html for instance.
Yes, ob_start("ob_gzhandler") is the best you can get if you don't control the server.
The difference between the various methods is mainly in efficiency, e.g. some servers or reverse proxies are able to cache compressed data to avoid re-compressing it on every request. However, gzip is very fast on today's hardware, so even the most basic compression in PHP is going to be a net gain.
You should also compress your JS and CSS files, but you shouldn't do that by simply wrapping them in a PHP script, because by default PHP makes files non-cacheable. Any gain from compression will be lost when browsers are forced to re-download those files over and over again.
Ideally you should either use a CDN that will compress them for you or ask your host to enable server-level compression for static text files.
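If you have no choice but to wrap a stylesheet in PHP, a rough sketch (styles.css is a hypothetical file name) is to send explicit cache headers along with the compression so the gain isn't immediately lost:

<?php
// Hypothetical wrapper: gzip a CSS file and make it cacheable for one day.
ob_start('ob_gzhandler');
header('Content-Type: text/css');
header('Cache-Control: public, max-age=86400');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime('styles.css')) . ' GMT');
readfile('styles.css');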
No, ob_gzhandler doesn't compress images that you merely link to in your HTML. It only compresses them if you load them into the PHP output as binary data, e.g. with file_get_contents().
I'm caching the main page of my site as a flat HTML file, and then using .htaccess to load that file if the user is not logged in (since no user-specific info is displayed), instead of loading my entire PHP framework.
The one downside to this is that PHP doesn't gzip the file automatically, since PHP isn't involved at all; it's just a plain HTML file being loaded by the browser.
I tried this:
$html = gzencode($html);        // compress the generated page
$fp = fopen($cachefile, 'w');
fwrite($fp, $html);             // write the gzipped bytes to the cache file
fclose($fp);
But when the file's URL is loaded in a browser, it's just a bunch of weird characters.
EDIT: I guess one simple solution would be to save the files as .php instead of .html, so that PHP's ob_gzhandler compresses the file. I'm wondering if there's a performance gain to be had by serving up HTML that is already gzipped and skipping PHP altogether...
UPDATE: As OP discovered, ob_gzhandler() can deal with this sort of use case, and is not a bad way to go.
ORIGINAL ANSWER: It is likely that, even if you manage to make this work somehow, it will result in poorer performance than simply having the file as a plain text file on your file system.
If you want to take advantage of gzip compression, have Apache do it with mod_deflate. Store your file uncompressed. Apache will deal with the compression issue.
But before you go through any trouble getting that set up: how big can this file be? If it is not a very large HTML file, the overhead of having to decompress the file on every request probably outweighs the benefit of the compression. You'd really only see a benefit with very large HTML files, and those would probably bring the browser to a screeching halt anyway.
All that said, you can use gzdecode() to decompress the file, but then you're no longer serving a static file; you're running it through PHP before serving it. Again, for this use case, your best bet is probably to just serve up the straight HTML rather than messing with compression.
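That said, if you do want to keep serving the pre-gzipped cache file through a tiny PHP front end, a hedged sketch (the path and the Accept-Encoding check are assumptions) is to send a Content-Encoding header, which is exactly what the browser was missing when it showed the weird characters:

<?php
// Assumed path to the file written earlier with gzencode().
$cachefile = 'cache/index.html.gz';

header('Content-Type: text/html; charset=utf-8');
if (isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
    // The browser understands gzip: tell it the body is compressed.
    header('Content-Encoding: gzip');
    header('Content-Length: ' . filesize($cachefile));
    readfile($cachefile);
} else {
    // Rare client without gzip support: decompress on the fly.
    echo gzdecode(file_get_contents($cachefile));
}
exit;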
I have always had issues with large file uploads in PHP.
I heard that Perl is an alternative and a reliable way of handling large file uploads.
Or is there a better way in PHP (using SWFUpload etc.) to manage large file uploads?
Do you have any idea about this?
Thanks,
B2W 2011
There are 3 configuration options that affect file uploading in PHP, all of them in php.ini, and some of them configurable at runtime.
You should take care of:
**max_input_time** is the time a script may spend parsing the input data
**file_uploads** should be set to On; it determines whether uploads are allowed at all
**upload_max_filesize** is the maximum size allowed for an uploaded file
**post_max_size** needs to be raised to at least the value you specified in upload_max_filesize, since uploads arrive inside POST requests
After you change these settings in php.ini, remember to restart Apache.
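As a quick sanity check after the restart, a small sketch like this prints the values PHP actually sees:

<?php
// Print the effective upload-related settings so you can confirm
// the php.ini changes were picked up.
foreach (array('file_uploads', 'max_input_time', 'upload_max_filesize', 'post_max_size') as $key) {
    echo $key . ' = ' . ini_get($key) . "\n";
}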
It is also advised to remove the max execution time limit with set_time_limit(0); at code level.
Aside from that, remember that if you upload large files, you should never put the content of a file directly inside a variable; you would run out of memory if you did.
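A minimal upload handler along those lines might look like this (the form field name "upload" and the target directory are assumptions):

<?php
// Lift the execution time limit for this request, as suggested above.
set_time_limit(0);

if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
    $target = __DIR__ . '/uploads/' . basename($_FILES['upload']['name']);
    // move_uploaded_file() moves the temporary file on disk;
    // the content is never read into a PHP variable.
    if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
        echo 'Upload stored as ' . htmlspecialchars($target);
    } else {
        echo 'Could not move the uploaded file.';
    }
} else {
    echo 'Upload failed (check upload_max_filesize and post_max_size).';
}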
Normally, when you disable the timeout limit using set_time_limit(0), it should not produce any errors.
How large?
I believe that if the file is too large (hundreds of megabytes), perhaps use a service dedicated to this (S3/DropBox, etc.)?
Perl is an interpreted server-side language that runs on top of the web server, just like PHP, so switching languages is unlikely to change anything.
Is there a better way? Since you don't say what your issues are, we can't suggest a way to fix them ;-)
I use the file_get_contents function to grab data from sites and store it in a database. It would be very inconvenient for me if one day the script stopped working.
I know that it can break if they change the structure of the site, but now I'm afraid that maybe there are mechanisms to disable this function, maybe from the server side?
I tried to find documentation about this but couldn't, so maybe you can help me?
Thanks
"I know that it can break if they change the structure of the site, but now I'm afraid that maybe there are mechanisms to disable this function, maybe from the server side?"
Yes, it can be disabled in php.ini with the allow_url_fopen option. You have other options such as the cURL extension too.
Note also that you will need the openssl extension enabled in php.ini if you are going to use file_get_contents to read over a secure protocol (HTTPS).
So in case file_get_contents is (or gets) disabled, you can fall back to the cURL extension.
It is possible to disable certain functions using disable_functions. Furthermore, the support of URLs in filesystem functions like file_get_contents can be disabled with allow_url_fopen. So chances are that file_get_contents might not work as expected one day.
There are at least two PHP configuration directives that can break your script:
If allow_url_fopen is disabled, then file_get_contents() will not be able to fetch files that are not on the local disk, i.e. it will not be able to load remote pages via HTTP.
Note: I've seen that option disabled quite a few times.
And, of course, with disable_functions, any PHP function can be disabled.
Chances are pretty low that file_get_contents() itself will ever get disabled...
But remote-file loading... Well, it might be wise to add an alternative loading mechanism to your script that would use cURL in case allow_url_fopen is disabled.
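A sketch of such a fallback loader (fetch_url() and the example URL are made-up names) could look like this:

<?php
// Use file_get_contents() when allow_url_fopen is enabled,
// otherwise fall back to the cURL extension.
function fetch_url($url) {
    if (ini_get('allow_url_fopen')) {
        return file_get_contents($url);
    }
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
        $body = curl_exec($ch);
        curl_close($ch);
        return $body;
    }
    return false; // neither mechanism is available
}

$html = fetch_url('http://example.com/');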
I noticed the other day that a new script I wrote for PHP 5 began outputting HTML that was viewable before the script had actually finished. Did this happen with PHP 4?
For instance, I have a long loop that echoes something out with each iteration. The output was small in terms of KB, so I don't think it was lag due to the download speed. Can someone explain the difference in output?
Maybe there is a difference in the configuration of the output_buffering directive in php.ini?
If output_buffering is enabled, PHP will "keep" the generated output in memory (at least, if it doesn't become bigger than the size of the memory buffer), and only send it to the browser when the page's generation is finished.
If output_buffering is disabled, the output is sent immediately when generated, even if the script's execution is not finished yet.
I doubt there is a difference in this regard between PHP 4 and 5, but you can get this behaviour on both versions, namely by enabling/disabling output_buffering. Maybe the default value for PHP 5 is different than it was for PHP 4? (I haven't checked.)
When the data is sent depends on the PHP configuration: it's an output buffer, and it behaves like one.
Having said that, you can use the functions ob_start() and ob_end_flush() to take control of the buffer. The Zend Framework does some clever stuff with output buffering, for instance...
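For instance, here is a minimal sketch that produces chunk-by-chunk output regardless of the output_buffering default:

<?php
ob_start();              // buffer explicitly, independent of php.ini

for ($i = 0; $i < 5; $i++) {
    echo "chunk $i<br>\n";
    ob_flush();          // push PHP's buffer to the output layer
    flush();             // ask the web server to send it to the browser now
    sleep(1);            // simulate slow work per iteration
}

ob_end_flush();          // send whatever is left and stop buffering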
The usual suspects are:
Browser and HTML structure
Output buffering or output handlers
HTTP compression handled by PHP or by the web server
A close look at phpinfo() and a tool to inspect HTTP headers can help you.