PHP Query String Limit

I have PHP 5.1.6 (cli) installed, and whenever the GET query string is more than 128 characters it fails with an HTTP 406 Not Acceptable error. Any suggestions for how I can fix this so I can use more than 128 characters? POST is not an option.
The error is being returned by the server, so I don't think it's a browser issue.
And the reason I think it's PHP and not Apache is that it works fine with an HTML file.
GET /test.php?phptestof129characterstring-NEW-WOVEN-FENCE-PANELS-GARDEN_W0QQitemZ200303392512QQihZ010QQcategoryZ139954QQtcZphotoQQcmdZViewItem HTTP/1.1
Host: *****
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.0.5) Gecko/2008120122 Firefox/3.0.5
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-gb,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cookie: agent_name=Tim
HTTP/1.1 406 Not Acceptable
Date: Tue, 03 Feb 2009 12:05:33 GMT
Server: Apache/2.2.3 (Red Hat)
X-Powered-By: PHP/5.1.6
Content-Length: 0
Connection: close
Content-Type: text/html
GET /test.html?phptestof129characterstring-NEW-WOVEN-FENCE-PANELS-GARDEN_W0QQitemZ200303392512QQihZ010QQcategoryZ139954QQtcZphotoQQcmdZViewItem HTTP/1.1
Host: *****
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.0.5) Gecko/2008120122 Firefox/3.0.5
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-gb,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cookie: agent_name=Tim
HTTP/1.1 200 OK
Date: Tue, 03 Feb 2009 12:18:19 GMT
Server: Apache/2.2.3 (Red Hat)
Last-Modified: Fri, 19 Dec 2008 15:01:17 GMT
ETag: "156960d-221-94be8940"
Accept-Ranges: bytes
Content-Length: 545
Connection: close
Content-Type: text/html

Do you have mod_security enabled on your web server? This sounds like something it would do. If so, you may be able to disable it locally inside your <VirtualHost> block or with an .htaccess file. For v1.x:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
Version 2.x uses a different configuration syntax:
<IfModule mod_security2.c>
SecRuleEngine Off
</IfModule>
That's a bit of a brute-force approach; you may want to read the documentation to see how you might allow particular URIs to pass through, for instance by scoping the directives as sketched below. See also Handling False Positives and Creating Custom Rules.
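For example, a more targeted variant for v2.x might look like this in the server config or <VirtualHost> (a sketch only; /test.php stands in for whichever script receives the long query strings):
<IfModule mod_security2.c>
<LocationMatch "^/test\.php$">
# Hypothetical: disable the rule engine only for this one script
SecRuleEngine Off
</LocationMatch>
</IfModule>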

As a workaround, you can try using JavaScript to put the data in cookies. The cookies will be sent automatically with every GET request, and give you an extra 2KB of data space (if I'm not mistaken).
This is very risky if you don't want to transmit that data with every request, so generally speaking I would recommend against it.
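Reading the data back out on the PHP side is then trivial. A minimal sketch, assuming the client-side script stored the payload in a cookie named data (a made-up name for illustration):
<?php
// Hypothetical cookie "data", set by client-side JavaScript.
// It arrives with every request, sidestepping the query string entirely.
$payload = isset($_COOKIE['data']) ? $_COOKIE['data'] : '';
if ($payload !== '') {
    // ... use $payload in place of the long query string ...
}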

It's a long shot, but try adding:
header('Content-Type: text/html');
to your server-side code. If that doesn't help, check your Apache configuration; maybe it's misconfigured so that PHP files can't emit the text/html MIME type. If that doesn't help either, how about setting up Apache so that .html files are treated as PHP, and renaming the target script to .html?
BTW, from http://www.checkupdown.com/status/E406.html :
A client (e.g. your Web browser or our CheckUpDown robot) can indicate to the Web server characteristics of the data it will accept back from the Web server. This is done using 'accept headers' of the following types:
Accept: The MIME types accepted by the client. For example, a browser may only accept back types of data (HTML files, GIF files etc.) it knows how to process.
Accept-Charset: The character sets accepted by the client.
Accept-Encoding: The data encoding accepted by the client e.g. the file formats it understands.
Accept-Language: The natural languages (English, German etc.) accepted by the client.
Accept-Ranges: Whether the client accepts ranges of bytes from the resource i.e. a portion of the resource.
If the Web server detects that the data it wants to return is not acceptable to the client, it returns a header containing the 406 error code.

Found the answer, thanks to the comment from Ben.
Although this generates 406 error:
test.php?129+characters
This works fine:
test.php?data=129+characters
So my guess is that in the first instance PHP is attempting to use all 129 characters as the name in the $_GET array, whereas in the second example only 4 characters form the name and the rest is assigned as the value; so the array must have a 128-character limit on index names.
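A minimal sketch to confirm that parsing behaviour (a diagnostic only, not a fix): a query string with no equals sign becomes a single $_GET entry whose key is the whole string and whose value is empty.
<?php
// test.php - dump how PHP parsed the query string into $_GET.
// test.php?somelongstring      -> one key "somelongstring", value ""
// test.php?data=somelongstring -> one key "data", value "somelongstring"
var_dump(array_keys($_GET));
var_dump($_GET);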

Related

HTTP416 with Laravel/Excel

Since my hosting provider switched to HTTP/2, I am getting an HTTP 416 Range Not Satisfiable error when trying to export data to Excel or PDF formats. What I've tried already (the .htaccess changes are sketched below):
adding RequestHeader unset Range to .htaccess
adding Header unset Accept-Range to .htaccess
setting the Cache-Control headers expiration to 0
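For reference, those first two attempts amount to roughly this (assuming mod_headers is loaded; note the response header is actually spelled Accept-Ranges):
<IfModule mod_headers.c>
RequestHeader unset Range
Header unset Accept-Ranges
</IfModule>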
I've talked to the technical support of my hosting provider, and whatever they were doing for the next 20 minutes didn't work either. I think they even tried to switch off HTTP/2 for my account; however, the error persisted.
What I find even more bizarre is that the error comes back before the logs have been written, meaning that the file has not been generated at that point at all. It seems like the request comes to the server and bounces straight back with a 416 without actually launching any process in the first place. Any ideas what it could be? I have not touched the code in the last 2 weeks and it seems to have been working alright.
Here are the request headers
:authority: [....xxx].com
:method: POST
:path: /v1/reports/shifts/export
:scheme: https
accept: application/json
accept-encoding: gzip, deflate, br
accept-language: en-US,en;q=0.9,de;q=0.8,ru;q=0.7,be;q=0.6,pl;q=0.5,fr;q=0.4
authorization: Bearer: eyJ0eXAiOiJKV1QiLCJhbGc....
cache-control: no-cache
content-length: 132
content-type: application/json
origin: https://[....xxx].com
pragma: no-cache
referer: https://[....xxx].com
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "macOS"
sec-fetch-dest: empty
sec-fetch-mode: cors
sec-fetch-site: same-site
user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.64 Safari/537.36
and here are the headers coming back with the HTTP 416 response:
accept-ranges: none
access-control-allow-headers: Origin,X-Requested-With,Content-Type,Accept,Authorization,Accept-Language,Content-Language,Last-Event-ID,X-HTTP-Method-Override
access-control-allow-methods: GET, PATCH, POST, PUT, DELETE, OPTIONS
access-control-allow-origin: https://[....xxx].com
access-control-request-headers: X-Requested-With
cache-control: public
content-disposition: attachment; filename=shift.xls
content-length: 0
content-range: bytes */6144
content-type: application/vnd.ms-excel
date: Fri, 27 May 2022 11:54:20 GMT
last-modified: Fri, 27 May 2022 11:54:20 GMT
server: nginx
vary: Authorization
Any ideas?

php - Output Compression problems

I'm facing a problem trying to use PHP output compression. I've been searching for many hours and I still have no clues...
Let's look at a simple script:
<?php
$response = "abcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefgh";
if (function_exists('ob_gzhandler')) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}
echo $response;
ob_end_flush();
This is a method I've found all over the internet, and it used to work for me... but not any more (and I've got no idea why).
If I look at the HTTP headers when I call this script:
Request :
Host: 192.168.51.191
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:43.0) Gecko/20100101 Firefox/43.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Connection: keep-alive
Cache-Control: max-age=0
Response :
Connection: Keep-Alive
Content-Type: text/html
Date: Tue, 26 Jan 2016 15:19:07 GMT
Keep-Alive: timeout=5, max=100
Server: Apache/2.4.9 (Win64) OpenSSL/1.0.1g PHP/5.5.12
Transfer-Encoding: chunked
Vary: Accept-Encoding
X-Powered-By: PHP/5.5.12
You can see that the response is NOT zipped (Firebug gives me a 0.06 KB response), and the server sends the response using chunked encoding.
I tried an alternate method to send zipped responses:
<?php
$response = "abcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefgh";
$replyBody = gzencode($response, 9, FORCE_GZIP);
header("Content-Encoding: gzip");
echo $replyBody;
And the response headers are as follows (the request headers are always the same):
Response :
Connection: Keep-Alive
Content-Type: text/html
Date: Tue, 26 Jan 2016 15:29:01 GMT
Keep-Alive: timeout=5, max=100
Server: Apache/2.4.9 (Win64) OpenSSL/1.0.1g PHP/5.5.12
Transfer-Encoding: chunked
X-Powered-By: PHP/5.5.12
As you can see, this is basically the same behavior as with the first method.
Then, if I try this:
<?php
$response = "abcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefghabcdefgh";
$replyBody = gzencode($response, 9, FORCE_GZIP);
echo $replyBody;
I receive something that looks like a zipped response (random characters), and the output size is 0.03 KB.
Here are the corresponding response HTTP headers:
Connection: Keep-Alive
Content-Length: 31
Content-Type: text/html
Date: Tue, 26 Jan 2016 15:32:46 GMT
Keep-Alive: timeout=5, max=100
Server: Apache/2.4.9 (Win64) OpenSSL/1.0.1g PHP/5.5.12
X-Powered-By: PHP/5.5.12
This makes me think that the zipping part is working correctly, because the output size has been reduced (it is obviously not readable, because the browser can't know that it is zipped content).
That's where I'm lost...
If I understand it right, when I manually send zipped data (using gzencode) and set the "Content-Encoding: gzip" header, the web server/PHP seems to UNZIP it before sending it to the browser??? How is that possible?
And why is it sending it as "chunked" data instead of setting the Content-Length header?
I tried to set the Content-Length manually in the response; it doesn't change anything (it won't appear in the response headers, and I'll still have a "chunked" response).
I've seen somewhere that I have to write the "Content-Length" header before sending any other data or headers to avoid the "chunked" response; I tried it and still had the same results.
I thought it could be a problem with BOM characters at the beginning of my PHP test script, but it is saved in UTF-8 without BOM encoding, so I don't think that's the problem.
I have this problem on my dev computer (using WampServer) AND in the production environment (IIS); previously it was working on both servers.
I have this problem in several browsers, and I double-checked the response sizes quoted above with Fiddler.
Does anyone see where the problem could be?
Thanks in advance
I'd go through the following checklist if I were you.
1. Check whether the zlib extension is installed.
ob_gzhandler needs the zlib extension to work. Without it, it just silently falls back to the default settings.
2. Verify that you don't have zlib.output_compression enabled in your php.ini.
As explained here, even though zlib.output_compression is preferred over ob_gzhandler(), you cannot use both simultaneously. So your code becomes:
if (extension_loaded('zlib') && !ini_get('zlib.output_compression')) {
    ob_start('ob_gzhandler');
}
3. Check whether headers have already been sent before the ob_start('ob_gzhandler') call, e.g. some character before <?php or an echo somewhere further up in the code. This will prevent the compressed output from being detected as such.
4. Make sure you aren't using all of the above in addition to gzipping in Apache (mod_deflate).
That would only cause the output to be double-gzipped, which will most probably confuse the browser.
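Putting checks 1-3 together, a minimal sketch (just the shape of the guard, not your exact setup):
<?php
$response = "abcdefghabcdefghabcdefghabcdefgh";
// Only hand the buffer to ob_gzhandler when it can actually work:
// zlib loaded, ini-level compression off, and no output sent yet.
if (extension_loaded('zlib')
        && !ini_get('zlib.output_compression')
        && !headers_sent()) {
    ob_start('ob_gzhandler');
} else {
    ob_start(); // fall back to plain, uncompressed buffering
}
echo $response;
ob_end_flush();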

why does php add pre- and postfix to body?

I'm trying to get a PHP API working, but when I monitored the requests I saw that PHP adds a prefix and a postfix to the body. No idea why. They aren't shown in my browser (the 2c and the 0). Strange.
here the request:
GET /tasks?dk=123 HTTP/1.1
Host: device.mydomain.eu
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.114 Safari/537.36
DNT: 1
Accept-Encoding: gzip,deflate,sdch
Accept-Language: de-DE,de;q=0.8,en-US;q=0.6,en;q=0.4
HTTP/1.1 200 OK
Date: Tue, 10 Jun 2014 10:20:20 GMT
Server: Apache/2.2.22
X-Powered-By: PHP/5.4.9
Connection: close
Transfer-Encoding: chunked
Content-Type: application/json
2c
{"tasks":[{"feature":"start","action":"1"}]}
0
any idea?
It's not added by PHP; it is added by Apache. This is due to the chunked transfer encoding. You are receiving one chunk, the first (and only) one being 44 bytes long (2c in hexadecimal notation).
Chunked responses always end with a chunk containing 0 bytes, to indicate the end.
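On the wire, each chunk is prefixed by its size in hex and terminated by CRLF, so the raw body of your response looks roughly like this (\r\n written out for clarity):
2c\r\n
{"tasks":[{"feature":"start","action":"1"}]}\r\n
0\r\n
\r\n
An HTTP client normally strips this framing before handing over the body, which is why your browser never shows the 2c and the 0.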

PHP + jQuery + MySQL async fetching data, script ends unexpected

I'm trying to use jQuery's $.ajax() to GET a PHP page on a fixed interval, which prints the latest entries from a MySQL table. If there are new entries, they are printed on the page; if not, "NaN" gets printed.
This is my PHP function which checks for new entries: http://pastebin.com/R6NHDU3t
This is the HTML + jQuery which checks for a response from the above function (got it from some answer here on Stack): http://pastebin.com/myT7evAx
The idea is this: I access the HTML page, the jQuery inside GETs the PHP page every 3 seconds, and the PHP checks for new entries in MySQL; if there are new entries, they are printed on the PHP page, fetched by jQuery, and appended into #messages; if not, "NaN" is printed (because of JSON, doesn't matter).
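For context, the server side of that flow generally has this shape (a generic sketch only, not the pastebin code; the query and identifier names are made up):
<?php
// handler.class.php?Q=comments - emit entries newer than the last poll.
header('Content-Type: application/json');
$rows = array(); // ... filled from MySQL, e.g. WHERE id > :last_seen_id ...
echo json_encode($rows);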
This works fine as long as no new entries are available: every 3 seconds "NaN" gets printed on the page; some Live HTTP Headers:
http://localhost/handler.class.php?Q=comments&_=1348715334458
Host: localhost
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:14.0) Gecko/20100101 Firefox/14.0.1
Accept: */*
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
X-Requested-With: XMLHttpRequest
Referer: http://localhost/live.html
HTTP/1.1 200 OK
Date: Thu, 27 Sep 2012 03:08:54 GMT
Server: Apache/2.2.22 (Win32) PHP/5.3.13
X-Powered-By: PHP/5.3.13
Content-Length: 15 <---- the "NaN", no new data
Keep-Alive: timeout=5, max=97
Connection: Keep-Alive
Content-Type: text/html
If new entries are made, the script stops appending "NaN" (doh, new data) and it should append my new entries, but instead it stops: no new data is appended and no more requests are made. Here is the latest Live HTTP Headers log, showing the last request:
http://localhost/handler.class.php?Q=comments&_=1348715337473
GET /handler.class.php?Q=comments&_=1348715337473 HTTP/1.1
Host: localhost
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:14.0) Gecko/20100101 Firefox/14.0.1
Accept: */*
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
X-Requested-With: XMLHttpRequest
Referer: http://localhost/live.html
HTTP/1.1 200 OK
Date: Thu, 27 Sep 2012 03:08:57 GMT
Server: Apache/2.2.22 (Win32) PHP/5.3.13
X-Powered-By: PHP/5.3.13
Content-Length: 3257 <-- new data, so it got the response, but stopped here. no appending, no more requests.
Keep-Alive: timeout=5, max=96
Connection: Keep-Alive
Content-Type: text/html
Why does it stop when handler.class.php?Q=comments returns new data?

PHP File Serving Script: Unreliable Downloads?

This post started as a question on ServerFault ( https://serverfault.com/questions/131156/user-receiving-partial-downloads ), but I determined that our PHP script was the culprit. So I'm asking an updated question here about what I believe is the actual issue.
I am using a PHP script to verify permissions and then serve up a file for users of my website to download. Most of the time this works, but recently one user has been seeing problems with larger downloads. He is only getting ~80% of downloads for files that are > 100MB in size. Also, all downloads from this script fail to report a filesize. Further, tests revealed that the same user COULD reliably download each of the failed files if given a direct link (at which point the filesize is reported).
Here's the relevant snippet of code that we are using to serve the file:
header("Content-type:$contenttype");
$len = filesize($filename);
header("Content-Length: $len");
header("Content-Disposition: attachment; filename=".$title.".".$ext);
readfile($filename);
Note that $contenttype, $filename, $title, and $ext are all set correctly before we get here. These have been triple-checked. None of them are the problem. Also, $len does provide the correct filesize.
While researching this issue, I came across this post: Content-Length header always zero
It seems that I am encountering the same issue. When I use the script, I get chunked encoding on the file and no size is set for content-length. I'm hypothesizing that something is going wrong on the large downloads, leading him to get a zero-length chunk before the end of the file.
Here's what the headers look like for a direct request:
http://www.grinderschool.com/videos/zfff5061b65ae00e8b21/KillsAids021.wmv
GET /videos/zfff5061b65ae00e8b21/KillsAids021.wmv HTTP/1.1
Host: www.grinderschool.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Referer: http://www.grinderschool.com/phpBB3/viewtopic.php?f=14&p=29468
Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.1 200 OK
Date: Sun, 11 Apr 2010 12:57:41 GMT
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635
Last-Modified: Sun, 04 Apr 2010 12:51:06 GMT
Etag: "eb42d6-7d9b843-48368aa6dc280"
Accept-Ranges: bytes
Content-Length: 131708995
Keep-Alive: timeout=10, max=30
Connection: Keep-Alive
Content-Type: video/x-ms-wmv
And here's what they look like for the request answered by my script:
http://www.grinderschool.com/download_video_test.php?t=KillsAids021&format=wmv
GET /download_video_test.php?t=KillsAids021&format=wmv HTTP/1.1
Host: www.grinderschool.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 115
Connection: keep-alive
Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884
HTTP/1.1 200 OK
Date: Sun, 11 Apr 2010 12:58:02 GMT
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635
X-Powered-By: PHP/5.2.11
Content-Disposition: attachment; filename=KillsAids021.wmv
Vary: Accept-Encoding
Content-Encoding: gzip
Keep-Alive: timeout=10, max=30
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: video/x-ms-wmv
So the question is... what can I do to make downloads from the script work properly? Again, for 99% of users it works as is (though I find it annoying now that no filesize is reported, and thus no time estimate can be computed for the download).
It's your GZIP compression. When you specify a content length but turn compression on, it gums everything up. It's happened to me a few times: try turning it off in your script.
Generally you'd turn it on with:
ob_start("ob_gzhandler");
...so just comment that line out. If that's not in your code, chances are there's a setting in either your php.ini file somewhere or your apache.conf/conf.d.
Hope this helps!
Content-Encoding: gzip
Hmm. Presumably PHP's zlib.output_compression is doing that. (Doesn't look like Apache's mod_deflate.)
Try turning it off and see if that's what's forcing the chunked encoding. You don't want to compress the download of a filetype like WMV, which is already highly compressed.
However, chunked encoding would only explain the lack of a size report; the download itself should still work. Is it possible you're being hit by a timeout (e.g. PHP's set_time_limit, or Apache's Timeout)?
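A minimal sketch of switching the PHP-level compression off just for this script (it must run before any output; alternatively disable it in php.ini or per directory):
<?php
// Disable PHP's transparent output compression for this download script.
// zlib.output_compression is PHP_INI_ALL, so ini_set() works at runtime.
ini_set('zlib.output_compression', 'Off');
// ... then send the headers and the file as before ...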
If it is the script, have you tried using a substitute for the readfile() function that reads and outputs the file a bit at a time? The reasoning is that a memory limit may be reached somewhere, causing the download to fail.
From http://php.net/manual/en/function.readfile.php :
function readfile_chunked($filename) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
    }
    return fclose($handle);
}
Also, try to flush the output as often as you can.
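Wiring that into the serving code from the question might look like this (a sketch; readfile_chunked() is the function above, and flushing per chunk follows the advice just given):
<?php
// Hypothetical replacement for the readfile($filename) call in the question.
header("Content-Type: $contenttype");
header('Content-Length: ' . filesize($filename));
header('Content-Disposition: attachment; filename=' . $title . '.' . $ext);
readfile_chunked($filename); // add flush() inside its loop to push each chunk out immediately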
