I have a download script written in PHP which works fine, but I've been running into problems allowing users to download .zip files of between 300 and 600 MB from a hosted server. Files of around 70 MB download fine, but larger ones arrive either incomplete or corrupt. I've been reading that this problem can be caused by gzip being enabled on the server.
As the server is hosted, I have no way of editing the Apache config file to disable gzip. Is there another way? I've tried adding the code below
to a .htaccess file that I created in the same directory as the PHP download script, but it did not disable gzip.
#Disable GZIP in directory
RewriteRule ^(.*)$ $1 [NS,E=no-gzip:1,E=dont-vary:1]
What can I do to disable gzip?
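One thing worth ruling out first is PHP-level output compression inside the script itself; it won't help if the web server is doing the compressing, but it eliminates the PHP side. Below is a minimal sketch of the idea (the file path is a placeholder, and apache_setenv() only exists when PHP runs as an Apache module; note, too, that the Server header below reports nginx, so Apache-style .htaccess rules may never be consulted at all):

<?php
// Turn off PHP's own output compression for this request.
ini_set('zlib.output_compression', 'Off');
// Under mod_php, also ask Apache's mod_deflate to skip this response.
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}
// Large downloads can exceed the default max_execution_time.
set_time_limit(0);

$file = '/path/to/archive.zip'; // placeholder path to the download
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="archive.zip"');

// Drop any output buffers so a 300-600 MB file streams straight through
// instead of being held (and possibly truncated) in memory.
while (ob_get_level() > 0) {
    ob_end_clean();
}
readfile($file);
exit;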
This is what the response headers send back when I try to download:
Cache-Control: max-age=604800
Connection: keep-alive
Content-Encoding: gzip
Content-Type: text/html; charset=UTF-8
Date: Tue, 13 Feb 2018 16:12:05 GMT
Expires: Tue, 20 Feb 2018 16:12:05 GMT
Server: nginx/1.12.2
Transfer-Encoding: chunked
Related
Is there any clever way to trick nginx into not gzipping when the backend has already set the Content-Encoding header?
Nginx is configured to gzip the output from the php fastcgi backend.
This works great in 99% of the cases.
Except that, on rare occasions, PHP will send a raw gzipped file and attach a Content-Encoding: gzip header itself.
Nginx unfortunately will go right ahead and try to gzip that content a second time.
This produces a doubled header (Content-Encoding: gzip, Content-Encoding: gzip) and a double-encoded gzipped body.
Most modern browsers (Firefox, Chrome) can handle this.
IE8 cannot, mobile Safari cannot, and the old Safari 5 for Windows cannot: they merge the Content-Encoding headers, decode the gzipped body only once, and show garbled gzipped content.
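In code terms, the rare case looks something like this (a sketch only; the path is made up):

// PHP streams a file that is already gzip-compressed on disk and labels it.
// nginx then gzips it AGAIN, producing the double encoding described above.
header('Content-Type: text/html; charset=utf-8');
header('Content-Encoding: gzip');
readfile('/var/cache/pages/index.html.gz'); // made-up path to a pre-gzipped file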
Thanks for any ideas.
Somewhere in nginx.conf, wherever it applies (there should be a fastcgi_params file somewhere):
fastcgi_param HTTP_ACCEPT_ENCODING "";
This hides the client's Accept-Encoding header from the backend, so the backend stops compressing its output.
I would hope that nginx still serves encoded content to the client after this (I am not sure).
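To see why this helps: a backend typically decides whether to compress based on that very header, along these lines (a sketch of typical backend logic, not your actual code):

// With HTTP_ACCEPT_ENCODING blanked by nginx, this branch never runs,
// so PHP sends plain output and nginx applies the one and only gzip pass.
$accepts = isset($_SERVER['HTTP_ACCEPT_ENCODING']) ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';
if (strpos($accepts, 'gzip') !== false) {
    ob_start('ob_gzhandler');
}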
I use the following code in a .htaccess file located in /projects/testsite:
AddType x-mapp-php5 .php
RewriteEngine On
RewriteBase /
Options -MultiViews
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . projects/testsite/index.php [L,QSA]
#ErrorDocument 404 /404.php
When I am at http://www.mydomain.com/projects/testsite/admin/articles/1/edit
and I press Save, which redirects the request to http://www.mydomain.com/projects/testsite/admin/articles/1/save,
all POST data is lost.
What I get when I try to debug is:
POST: Array print_r: Array ( )
GET: Array print_r: Array ( )
What should I do to my .htaccess file to keep redirecting all requests to index.php while preserving POST data?
Thank you in advance!
P.S. My site works normally if I run it on a Windows server with web.config rewrite rules.
UPDATE #1
From Firefox's Live HTTP Headers I see a significant issue: a 301 Moved Permanently response is captured only on Apache (this does not happen on IIS):
HTTP/1.1 301 Moved Permanently
Date: Sun, 06 Apr 2014 13:48:06 GMT
Server: Apache
Location: http://mydomain.gr/projects/testsite/admin/articles/6/save
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 270
Keep-Alive: timeout=3, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1
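I can also reproduce the redirect from the command line (the form data here is just an illustration):

curl -s -o /dev/null -D - -d "title=test" http://www.mydomain.gr/projects/testsite/admin/articles/6/save

The 301 comes back, and the follow-up request the browser makes to the Location is a plain GET, which would explain why the POST body is gone.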
UPDATE #2
It seems to be related to this issue:
How to preserve POST data via ajax request after a .htaccess redirect?
where the asker found out that something was forcing a 301 redirect rule for every request, on top of the .htaccess file.
Is the hosting provider to blame?
This was quite baffling; I hope I saved someone from the same headache.
The problem was located in Parallels Panel > Websites & Domains > mydomain.gr > Hosting Settings.
Locate the select field Preferred domain: it was set to domain.tld, but all my requests went to www.domain.tld, so a 301 redirect (which loses all POST data) was forced by Parallels and not by my application's files.
Note that the label in Parallels warns that:
Regardless of the domain's URL that visitors specify in a browser (with the www prefix or without it), a page with the preferred domain's URL opens. The HTTP 301 code is used for such a redirection. The 'None' value means that no redirection is performed.
So my solution was to change Preferred domain to None.
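If you still want the www canonicalisation after setting Preferred domain to None, one option is to do it yourself for safe requests only, so POST bodies are never dropped. A rough sketch (the hostnames are placeholders for your own):

// Redirect only GET/HEAD requests to the canonical www host;
// POST requests pass through untouched, so no form data is lost.
$host   = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
$method = $_SERVER['REQUEST_METHOD'];
if ($host === 'mydomain.gr' && ($method === 'GET' || $method === 'HEAD')) {
    header('Location: http://www.mydomain.gr' . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}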
Sorry to bother you with a question that seems very well documented, but none of the things I tried was completely satisfying.
My company switched from a hosting package to a managed server just last week, and I'm still in the process of optimising it.
Now Google's PageSpeed Insights tells me my site is not compressed, as does GTmetrix; GIDNetwork tells me compression works fine.
I have already added
<IfModule mod_deflate.c>
<FilesMatch "\.(html|php|txt|xml|js|css)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
to my .htaccess (as recommended here), which works correctly; other settings I've changed are fine as well. I have also added
zlib.output_compression = On
to my php.ini.
The entire .htaccess and php.ini can be seen at jsFiddle.
Headers sent and received in both Firefox and Chrome claim that compression is happening.
I also created a httpd.conf in my home directory, because none existed on my server yet. Should I move the file somewhere else?
What I really want to know:
So... what am I doing wrong? Is it compressed or not? How can I make Google 'see' the compression?
Thank you very much for your help.
This should be the function you need; it takes care of the gzip decoding for you:
http://www.php.net/manual/en/function.gzdecode.php
Anyway, check your PHP version, because it only works with PHP 5.4.0 or later.
Although my browser accepts deflate/gzip encoding (Accept-Encoding: gzip, deflate), your server does not answer using compressed data:
HTTP/1.1 200 OK
Date: Tue, 11 Mar 2014 09:41:45 GMT
Server: Apache
Connection: Keep-Alive
Keep-Alive: timeout=2, max=200
Etag: "96-100e9-4f4517d791912"
Expires: Fri, 11 Apr 2014 13:28:25 GMT
Cache-Control: max-age=2692000, public
Vary: User-Agent
If it were compressed, the server would also send:
Content-Encoding: deflate
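You can run the same check yourself with curl, explicitly offering compression (substitute your own URL):

curl -I -H "Accept-Encoding: gzip, deflate" http://www.yourdomain.com/

If compression is active for that resource, a Content-Encoding line should appear in the output.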
Use Firebug or the dev console to see the headers. It must be your httpd.conf: you cannot simply create that file in your home directory and expect Apache to load it.
Have a look in /etc/apache* and /etc/httpd/* for the config files.
You have already enabled gzip compression, but you haven't set it to compress some file types, such as JavaScript and CSS. That is why Google's PageSpeed suggested enabling compression. To enable compression for those two types, use:
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
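Depending on how the server labels JavaScript, older setups sometimes serve it as text/javascript or application/x-javascript rather than application/javascript, so it can be worth covering those types too (this is an assumption about your setup, not something your headers confirm):

AddOutputFilterByType DEFLATE text/javascript application/x-javascript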
What worked for me in the end, to compress the JS/CSS files that were left uncompressed for reasons I don't quite get yet, is described here.
In this method, JavaScript and CSS files are forcibly gzipped by a PHP script, which also sets new Expires headers; if you want caching for more than 5 minutes, change the number. Adding different files is trickier, but possible, I think.
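I won't reproduce the linked script here, but the core of the approach is small enough to sketch; the file name and the 5-minute lifetime below are illustrative, not the article's exact code:

// gzip-css.php: serve a stylesheet compressed, with a short Expires header.
$file = 'style.css'; // illustrative stylesheet path
header('Content-Type: text/css');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT'); // 5 minutes
ob_start('ob_gzhandler'); // compresses the output if the client accepts gzip
readfile($file);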
I'm on a shared hosted Linux server, so I must use .htaccess. I'm busy working on compressing and caching things. I actually have two questions, but first, here is how I have caching set up in my .htaccess file:
ExpiresActive on
<FilesMatch "\.(bmp|png|ico|gff|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>
Question 1: this does cache those things, with the exception of one PNG file.
Most of my files are lowercase, as I'm on Linux, but a few PNG files have slipped through with an uppercase extension.
What's strange is that all of the PNG files cache except one called addon2.PNG. At first I thought it was because of the case, but I've checked, and I have three other PNG files with uppercase extensions which Google PageSpeed says are cached. So, any ideas, or is Google PageSpeed just B.S.?
And question 2: as I'm wary after my host's mess-up with their Varnish issue, I'm adding things to the cache a little at a time and waiting to see if my stuff screws up. When I try to cache HTML files, the login/logout features of my site (written in PHP) stop working:
you have to log in and refresh, or log out and refresh. I'm wondering, is that because the page HTML is output by the PHP files? All my main pages are PHP, and I only have a few actual HTML files. I thought caching HTML would only affect files with .htm and .html extensions using the code below, but it's as if the server is trying to cache the HTML output by the PHP files. Am I out of my mind here?
ExpiresActive on
ExpiresDefault "now plus 60 minutes"
ExpiresByType text/html "now plus 60 minutes"
<FilesMatch "\.(css|bmp|png|ico|htm|gff|html|jpg|jpeg|gif|gcf)$">
FileETag MTime Size
ExpiresDefault "now plus 60 minutes"
</FilesMatch>
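If it turns out those blanket rules really are hitting my PHP output, I assume I could override them per page from within PHP, with something like this at the top of the login/logout scripts:

// Dynamic pages opt out of caching regardless of the .htaccess defaults.
header('Cache-Control: no-cache, no-store, must-revalidate');
header('Pragma: no-cache');
header('Expires: 0');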
Your long explanation basically boils down to: a PNG image is not being cached, and HTML is not refreshing. It could be that the PNG image you say is having issues (addon2.PNG) is simply damaged. But the problem with your question is that you have not provided any headers to show what the output is.
If you are using Linux, just go to the terminal and use curl to get headers. For example, this is how I can get headers sent with the main Google image on their homepage:
curl -I https://www.google.com/images/srpr/logo11w.png
And the output of that command is:
HTTP/1.1 200 OK
Content-Type: image/png
Last-Modified: Wed, 09 Oct 2013 01:35:39 GMT
Date: Tue, 17 Dec 2013 23:39:21 GMT
Expires: Wed, 17 Dec 2014 23:39:21 GMT
X-Content-Type-Options: nosniff
Server: sffe
Content-Length: 14022
X-XSS-Protection: 1; mode=block
Cache-Control: public, max-age=31536000
Age: 1647616
Alternate-Protocol: 443:quic
So in your case just type in this command—remember to set it to the real path of the image—like so:
curl -I https://url/to/your/site/addon2.PNG
And compare the headers against something you know is caching as you wish it to.
Also, ExpiresDefault & Cache-Control can sometimes slightly conflict with each other. So check those two values to see what your host is setting versus what your .htaccess is setting.
And regarding this:
So any ideas or is Google Page Speed just B.S.?
All of those page benchmarking tools have varying usefulness. In general for the specific issue you are facing, they are not B.S. but utterly useless. It’s like you are in a car with an oil leak & some automatic system in the car is telling you to change your oil. You don’t need to “change your oil” you need to find the leak & plug it up so the oil stays in!
So in your case the best bet is to use Unix tools like curl to check what the header output is, and adjust/tweak until you get it right.
EDIT Also, please look over this pretty in-depth—and useful—caching guide from Google.
I have FLV files on my server which have changing content but not necessarily changing names.
I am having an issue whereby the FLV files are sent with headers that set the file to cache. As the same user may later require the same file with different content, the files need to tell the browser not to cache them.
I have tried using something similar to PHP's header() command, but when I run:
curl -I myfile.com/file1.flv
The headers are still there.
Any help please?
I am not sure how you tried using PHP for this; it is Apache that is processing and dispatching the file, so it's best to start there.
Try the below:
1) Enable the headers module (mod_headers) in Apache; otherwise this won't work. (Example commands for a Debian-style system are at the end of this answer.)
2) Add the below to your .htaccess. This will match files of all the listed formats and set them not to cache.
<FilesMatch "\.(jpg|gif|js|css|ico|swf|zip|pdf|doc|htc|xls|rtf|odt|wav|mp3|avi|wmv|mov|txt|flv)$">
FileETag None
<IfModule mod_headers.c>
Header unset ETag
Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
Header set Pragma "no-cache"
Header set Expires "Wed, 11 Jan 1984 05:00:00 GMT"
</IfModule>
</FilesMatch>
3) Restart Apache.
4) Run curl -I www.url.com/file.flv again.
You should see the headers telling the browser not to cache the file.
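On a Debian/Ubuntu-style system, for example, steps 1 and 3 would typically look like this (the commands assume that distro family; adjust for yours):

sudo a2enmod headers
sudo service apache2 restart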