How can I disable nginx gzip from PHP?

I am trying to stop nginx from gzipping a single PHP request. I already have the following:
#ini_set('zlib.output_compression', 'Off');
#ini_set('implicit_flush', 1);
header('X-Accel-Buffering: no');
According to everything I have found, the X-Accel-Buffering header alone should disable gzip; however, when I load this page from a browser, I can still see the header:
Content-Encoding:gzip
I'm using php7-fpm, nginx 1.10.1, and Debian 8.
EDIT:
I did a test using sleep() to delay the output. It looks like header('X-Accel-Buffering: no'); IS working, but it only prevents buffering, not gzipping. I guess gzip is being applied as a stream somehow.
I can see that if I output 1,000 bytes by looping over an echo statement with 1 character each, the browser receives about 11 kB. If I echo a single str_repeat() of 1,000 characters, much less data is sent, so there must be some per-chunk overhead.
Regardless, I need to disable gzip so that I can send a large amount of content and measure the download time. If it's gzipped, I don't know what the actual throughput is.

Nginx will not run its gzip filter if a Content-Encoding header is already present in the response. So you can set a Content-Encoding: identity header on the backend, and nginx will pass the response to the client without gzipping it. identity means "not encoded".
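A minimal sketch of that approach (the payload line is an assumed placeholder, not from the answer):
<?php
// nginx skips its gzip filter when the response already carries a
// Content-Encoding header; "identity" declares the body as unencoded.
header('Content-Encoding: identity');
echo str_repeat('x', 100000); // placeholder payload, reaches the client uncompressed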

I haven't found any way to disable it explicitly; however, I did find that there is a list of default MIME types that the gzip module will encode.
Since the data doesn't have to be rendered and is just being sent as a speed test, changing the content type to anything not on that list will prevent it from being gzipped.
header('Content-Type: raw/data');

Related

How can I disable gzip inside php code on HHVM? (eg setting content-encoding header)

I'm converting PHP code to HHVM. One page in particular sometimes needs to flush() a status message to the browser before sending some emails and running a few other slow tasks, then updating the status message.
Before hhvm (using php-fpm and nginx) I used:
header('Content-Encoding: none;');
echo "About to send emails...";
if (ob_get_level() > 0) { ob_end_flush(); }
flush();
// Emails sent here
echo "Emails sent.";
So the Content-Encoding header stops gzip being used, the flush sends the first message, and the second message is sent when the page ends.
Using HHVM (and nginx), setting the Content-Encoding header works (it shows up in the browser), but either HHVM or nginx ignores it and sends the page gzipped anyway, so the browser tries to interpret the gzipped binary data as unencoded content.
How can I disable gzip inside php code on HHVM?
(I know I could turn it off in the config files, but I want it kept on for nearly every page load except a few that will run slower.)
While my suggestion would be to use different nginx location paths with different gzip configuration, here's a better alternative for achieving what you want.
Better Solution:
It is often considered bad practice to keep a connection open (and the browser's loading bar spinning) while you're doing work in the background.
Since PHP 5.3.3 there has been a function, fastcgi_finish_request(), which flushes the data and closes the connection while the script continues to work in the background.
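A minimal sketch of how that looks under PHP-FPM (send_status_emails() is a hypothetical stand-in for the slow work):
<?php
echo "About to send emails...";
// Flush the full response to the client and close the connection;
// the script keeps executing afterwards (PHP-FPM only, PHP >= 5.3.3).
fastcgi_finish_request();
send_status_emails(); // hypothetical: runs after the browser stops loading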
Now, this is unfortunately not supported yet on HHVM. However, there is an alternative way of doing this.
HHVM alternative:
You can use register_postsend_function('function_name'); instead. This closes the connection, and the given function will be executed in the background.
Here is an example:
<?php
echo "and ...";
register_postsend_function(function() {
    // Runs after the response has been sent, so the client never sees this.
    echo "... you should not be seeing this";
    sleep(10); // do a lot of work
});
die();

php inserts HEX number of characters before the content

I'm moving a website to a new server (the old server had PHP 5.3.2, the new one has PHP 5.5.9; CentOS, Apache/2.2.26).
I've copied the files over and everything works fine, except for one weird thing: a strange hex number is inserted before the content of pages.
Also, at the bottom of the page, a 0 is inserted after the </html> tag.
I've noticed two things:
1) In my case only two headers are sent from the PHP script:
header("HTTP/1.1 200 OK");
header("Status: 200");
If I comment out the first header, then it's OK: no strange numbers.
2) It looks like that number is the number of characters on the page (I've checked it).
If the page is less than 8000 characters, the number doesn't appear, but if the page has 8001 characters, then 1f41 appears (0x1F41 = 8001 in decimal).
P.S. I was advised to remove any BOM from the files. The files were fine, already without a BOM, so it's not a BOM issue.
UPD:
I made a very simple test (index.php):
<?php header("HTTP/1.1 200 OK"); ?>
Lorem Ipsum ... (8000 characters)
Everything is OK.
<?php header("HTTP/1.1 200 OK"); ?>
Lorem Ipsum ... (8001 characters)
The bug appears: 1f41 in front of "Lorem Ipsum".
This is neither PHP nor a BOM. You have a problem with Transfer-Encoding.
The server is sending the response with chunked transfer encoding (this is usually done when the Content-Length is unavailable), which the client apparently doesn't know how to handle, or the header combination makes the client believe it can skip dechunking.
Therefore, what is actually the "length of the next chunk" gets interpreted by the client as content, and you see it as a hex value before the content:
05
These
05
Are
03
the
1F
first characters of the senten
03
ce.
instead of
Content-Length: 48
These are the first characters of the sentence.
(I calculated the lengths by eye, they're probably wrong)
The likely cause is that you have some kind of buffering which interferes with the content encoding. If everything stays in the buffer, all well and good: a Content-Length is available, it gets sent, and Bob's your uncle. But if you send more than the 8000 bytes of the buffer, the buffer is flushed and something untoward happens. Try looking in the documentation for zlib and output buffering; you might have a conflict in php.ini between what Apache does and what PHP does.
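If buffering the whole response is acceptable, a hedged workaround along those lines is to guarantee that a Content-Length can be sent, so the server never needs chunked encoding:
<?php
ob_start();
echo str_repeat('Lorem Ipsum ', 1000); // well past the 8000-byte threshold
$body = ob_get_clean();
header('Content-Length: ' . strlen($body)); // explicit length: no chunking required
echo $body;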
Interesting links that show how exactly 8000 is (was?) a buffer size used in some portions of apache:
https://bugs.php.net/bug.php?id=45945
https://serverfault.com/questions/366996/how-can-the-apache-2-2-deflate-module-length-limit-be-increased
This last link prompts me to suggest that you check whether zlib, mod_gzip, ob_gzhandler and/or mod_deflate are enabled and possibly conflicting, or that you try exchanging one for the other.
Update (as per comment)
Adding the appropriate Transfer-Encoding header fixes the problem.
So what really happened? We know that the output was correctly chunk-encoded (i.e., the chunking itself was correct) and that the client was able to interpret it once told to. So what was missing was simply the declaration that the content was chunked, which looks absurd and a good bit buggy: whatever chunked the content had to know it would be chunked (d'oh!), so it was responsible for adding the appropriate header, and yet it did not (or it did, but something else stripped it), until the OP rectified the situation by adding the header by hand.
The problem is now solved, but I think there must still be a bug lingering somewhere in the workflow, whether in the modules, in the application, or in how the headers are processed.
As discussed in the PHP chat room yesterday, 9F 1A is the BOM for UTF-16BE encoding.
In my case only two headers are sent from php script:
header("HTTP/1.1 200 OK"); header("Status: 200");
What's going on here? Why are you setting both? One is a proper HTTP response line, the other is a CGI-style response header. You shouldn't need to set either, as PHP will default to 200 OK if the script completes. And setting a CGI-style header is probably wrong unless you're definitely running PHP in CGI mode under Apache.
I had this problem when serving HTML as well as plain text.
Explicitly setting the Transfer-Encoding header to "chunked":
header("Transfer-Encoding: chunked");
indeed solved the problem (though I'm not sure how reliable that is).
That Transfer-Encoding header was being added automatically even when I didn't set it, yet the problem was still there. Setting the header explicitly in PHP code solved it, but not entirely (see below). Tested using websniffer.cc, PHP 7.4.9, CentOS 7.8.
UPDATE: adding the header explicitly might confuse some clients. https://securityheaders.com/ failed to connect to the site with this header explicitly set (regardless of content length).

PHP flush the output to browser

I'm working on a PHP project and I use flush().
I did a lot of searching and found that PHP sends long script output to the browser in chunks rather than sending it all when the script terminates.
I want to know the size of this data: how many bytes must the output reach before PHP sends them to the browser?
It's not only PHP that chunks the data; it's actually the job of Apache (or Tomcat, etc.) to do this. That's why the default is to turn off the "chunking" in PHP and leave it to Apache. Even if you force a flush from PHP, the output can still get trapped by Apache. From the manual:
flush() may not be able to override the buffering scheme of your web server and it has no effect on any client-side buffering in the browser. It also doesn't affect PHP's userspace output buffering mechanism. This means you will have to call both ob_flush() and flush() to flush the ob output buffers if you are using those.
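In practice that means draining PHP's userspace buffers before handing bytes to the server, along these lines:
<?php
// Drain any userspace output buffers first, then flush the SAPI buffer.
while (ob_get_level() > 0) {
    ob_end_flush();
}
echo "partial output\n";
flush(); // the web server (or a proxy) may still hold or compress it from here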
There's a Wikipedia article on transfer encoding / chunking: http://en.wikipedia.org/wiki/Chunked_transfer_encoding
Apache gets more complicated with gzip or deflate encoding; you'll need to consult the Apache documentation on how you can configure it.
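For example, mod_deflate honours the no-gzip environment variable, so a per-path override in Apache config might look like this (the path is hypothetical):
# Skip mod_deflate for one location so flushed output isn't held back for compression
<Location "/flush-test">
    SetEnv no-gzip 1
</Location>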
I think you are wrong. See this code:
echo str_repeat(' ', 1024); // padding: the browser needs some initial data before it starts rendering
for ($i = 0; $i < 10; $i++) {
    echo $i;
    flush();
    sleep(1);
}
If you run it, you'll see each digit arrive at the browser and get printed one per second; the str_repeat() only fills the browser's display buffer and does nothing else.

php flush not working

My flush mechanism stopped working, and I'm not sure why.
I'm trying to run a simple flush example now, with no luck:
echo "before sleep";
flush();
sleep(5);
echo "after sleep";
After doing some reading and learning that nginx had recently been installed on my server, I asked for it to be disabled for my domain (the server admin said he disabled it for this specific domain).
I also tried disabling gzip by adding these lines to .htaccess:
SetOutputFilter DEFLATE
SetEnv no-gzip dont-vary
I also tried adding these to my PHP file:
ini_set('output_buffering','on');
ini_set('zlib.output_compression', 0);
Nothing helps: it sleeps 5 seconds and then displays all the content at once.
I've used flush() successfully before, including through the output buffer (ob_start(), ob_flush(), etc.); now I'm just trying to make the simplest example work.
"Stopped working" is a pretty high level. You should actually take a look what works or not to find out more.
This can be done by monitoring the network traffic. You will see how much of the response is already done and in which encoding it's send.
If the response is being compressed, keep in mind that most compression functions need a certain number of bytes before they can start compressing. So even if you call flush() to tell PHP to flush the output buffer, there can still be a stage, either in PHP's output filtering or in the server, that waits for more data before compressing. So in addition to the compression done by Apache, check whether your PHP configuration compresses as well, and disable it.
If you don't want to monitor your network traffic, the curl command-line utility does a pretty good job of showing what's going on, and it might be easier than network monitoring:
curl -Ni --raw URL
Make sure you use the -N switch, which disables buffering by curl, so you see your script's/server's output directly.
Please see the section Inspecting HTTP Compression Problems with Curl in a previous answer of mine, which shows some curl commands for looking into the output of a compressed request.
curl can show you possibly-compressed data in uncompressed form, and you can disable compression per request, so regardless of the server or PHP output-compression settings you can test in a more fine-grained way.
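For example (the URL is a placeholder):
# Show response headers and the raw, still-encoded stream as it arrives
curl -Ni --raw http://example.com/flush-test.php
# Same request, but asking for gzip explicitly, to compare the two behaviours
curl -Ni --raw -H 'Accept-Encoding: gzip' http://example.com/flush-test.php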
<?php
// Turn off every PHP-side compression/buffering knob reachable at runtime.
// Note: output_buffering and output_handler are PHP_INI_PERDIR settings, so
// ini_set() may have no effect on them here; set them in php.ini or .htaccess
// if these calls change nothing.
ini_set('zlib.output_handler', '');
ini_set('zlib.output_compression', 0);
ini_set('output_handler', '');
ini_set('output_buffering', false);
ini_set('implicit_flush', true);
apache_setenv('no-gzip', '1');
for ($i = 0; $i < 5; $i++) {
    echo str_repeat(chr(0), 4096); // flood Apache with null bytes so it considers the packet big enough to send
    echo "$i<br/>";
    flush();
    sleep(1);
}
?>

Strange http gzip issue

Here's a strange one:
I've got nginx reverse proxying requests to apache 2 with mod_php.
A user (using Firefox 3.1b3) reported that he has recently started getting sporadic "What should Firefox do with this file?" popups during normal navigation. We haven't had any other reports of this issue and haven't been able to reproduce it ourselves.
I checked nginx's and Apache's logs. Nothing in the error logs, and both show a normal HTTP 200 for the request.
I had him send me the downloaded file, and it's generated HTML, as it should be, except it has some leading and trailing bytes tacked on.
The opening byte sequence is the gzip magic header: 1F 8B 08.
Here are the opening characters, C-escaped for convenience:
\x1F\x8B\x089608\r\n<!DOCTYPE HTML ...
and the file ends with:
...</html>\n\r\n0\r\n\r\n
When I fetch the same URL via wget, the response starts with the DOCTYPE as expected; the mysterious opening and closing bytes are nowhere to be seen.
Has anyone ever seen anything similar to this? Could this be a FF 3.1b3 bug?
I've never seen an issue exactly like that, but I did once have an issue with a transparent proxy that claimed to the web server that it could handle gzip-compressed content when, in fact, it received the gzipped content from the server, stripped the gzip headers without decompressing it, and sent the result to the browser. The behavior we saw was what you describe: a save/open file dialog for what should have been a normal web page. In that case the browser was IE.
I'm not sure if your problem is related, but as an experiment you could look at the requests between the proxy and Apache and see whether they are gzipped, or turn off gzip compression in Apache and see if that fixes the issue. If it does, you probably have a problem with gzip handling in your proxy.
wget doesn't request a compressed response. Try:
curl --compressed <URL>
You could also try adding -v to print the response headers, and check that a sensible Content-Type is being returned.
