We have a Magento Commerce site running on an IIS 6.0 server with PHP 5.2.11.
Whenever a user tries to print/download a PDF from the admin panel, the download never completes. I can see that the full file arrives on the computer, but the browser keeps saying it is downloading. As a result the file is saved with a .part extension and users can't open it as a PDF. If I remove the .part extension that Firefox adds, the PDF opens correctly. So the data is sent from the server to the browser in full, but the download never terminates.
Here are the response headers sent when the PDF download starts:
HTTP/1.x 200 OK
Cache-Control: must-revalidate, post-check=0, pre-check=0
Pragma: public
Content-Length: 1456781
Content-Type: application/pdf
Content-Encoding: gzip
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Last-Modified: Fri, 18 Dec 2009 10:23:37 +0000
Vary: Accept-Encoding
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET, PHP/5.2.11
Content-Disposition: attachment; filename=invoice2009-12-18_10-23-37.pdf
Date: Fri, 18 Dec 2009 10:23:37 GMT
I guess it has something to do with the connection not being closed after the whole file has been sent? Please help!
Thanks.
I had the exact same problem (on Apache); I temporarily solved it by turning off gzip compression on the responses. My guess is that the size Magento reports to the browser (which it gets from a strlen() call on the PDF content) does not reflect the actual size the browser receives, because the content gets compressed later on. This leaves the browser waiting for more data that is never going to arrive.
Edit: worth noting that in my case I was accessing the site through a reverse proxy.
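Not the poster's code, but a minimal sketch of that workaround, assuming $pdfContent holds the raw PDF string (as the strlen() call in Magento implies). The idea is to stop PHP's zlib compression for this response so the Content-Length matches the bytes that are actually sent; if the compression is applied by IIS/Apache or a proxy instead, it has to be disabled there.
// Disable PHP output compression for this response so the Content-Length
// computed from the uncompressed string stays accurate.
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="invoice.pdf"');
header('Content-Length: ' . strlen($pdfContent)); // now matches what is sent

echo $pdfContent;
exit; // make sure nothing is appended after the binary data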
Have you tried explicitly calling exit; after you output the PDF data? Sounds like an IIS thing.
I have seen this method used on about three sites now, including Facebook, Dropbox and Microsoft's SkyDrive. It works like this: say you want to look at the image without downloading it, you'd just use this URL:
https://fbcdn-sphotos-a.akamaihd.net/hphotos-ak-xxxx/xxx_xxxxxxxxxxxxxxx_xxxxxxxxx_o.jpg
But if I want to download it, I'd add ?dl=1
https://fbcdn-sphotos-a.akamaihd.net/hphotos-ak-xxxx/xxx_xxxxxxxxxxxxxxx_xxxxxxxxx_o.jpg?dl=1
Easy peasy, right? Well, it's probably not that easy on the server side, and that is where my problem is. I would know how to do this if that .jpg file were a PHP script, with one $_GET parameter pointing to the image and another specifying whether the image should be downloaded or not. But that's not the case.
So, what methods did I try? None, because I honestly have no idea how this works; it's like magic to me. Maybe it's something you do in .htaccess? That sounds reasonable to me, but after a while of googling I didn't find anything even close to what I'm asking for.
You have some options.
One option would be to use a PHP script instead of the .jpg file. So your URL would point to a PHP file and in the PHP file you would do something like this:
header('Content-Type: image/jpeg');
if (isset($_GET['dl']) && $_GET['dl'] == 1) {
    header('Content-Disposition: attachment; filename="downloaded.jpg"');
}
$file = $_GET['file'];
// Do some checking to make sure the user is allowed to get the file specified
// (e.g. restrict it to a known image directory) before reading it.
echo file_get_contents($file);
Another option would be to use mod_rewrite in your .htaccess file to check for ?dl=1 and if found, redirect to the PHP script that will download the file (the same way as above).
I'm sure there are more options, but those two are the only ones popping into my head right now.
I would redirect all of the images to a single PHP file that handles them based on their URI parameters.
In the .htaccess I would put:
Options +FollowSymLinks +ExecCGI
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_URI} \.(jpg|png|gif)$
RewriteRule (.*) images.php [QSA]
</IfModule>
That will make all requests for files with the extensions jpg, png and gif get rewritten to your images.php file.
In the images.php file I would check for the presence of ?dl=1 and decide how to serve the image based on that:
$requestedImage = $_SERVER['REQUEST_URI'];
if (strpos($requestedImage, '?dl=1') !== false) {
    // serve the image as an attachment
} else {
    // just print it as usual
}
The response headers when displaying such a Facebook URL:
HTTP/1.1 200 OK
Content-Type: image/jpeg
Content-Length: 5684
Last-Modified: Fri, 01 Jan 2010 00:00:00 GMT
X-Backend: hs675.ash3
X-BlockId: 157119
X-Object-Type: PHOTO_PROFILE
Access-Control-Allow-Origin: *
Cache-Control: max-age=1209600
Expires: Fri, 12 Oct 2012 13:07:08 GMT
Date: Fri, 28 Sep 2012 13:07:08 GMT
Connection: keep-alive
And the download response headers:
HTTP/1.1 200 OK
Content-Type: image/jpeg
Content-Length: 5684
Last-Modified: Fri, 01 Jan 2010 00:00:00 GMT
X-Backend: hs675.ash3
X-BlockId: 157119
X-Object-Type: PHOTO_PROFILE
Content-Disposition: attachment
Access-Control-Allow-Origin: *
Cache-Control: max-age=1209600
Expires: Fri, 12 Oct 2012 13:07:17 GMT
Date: Fri, 28 Sep 2012 13:07:17 GMT
Connection: keep-alive
Note the Content-Disposition: attachment line, which is the difference.
So, since you're already serving the images from a PHP script, when the download parameter is set, add:
header('Content-Disposition: attachment');
and you should be fine.
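Putting the two together, a minimal sketch of the PHP side, assuming the script has already resolved and validated $imagePath for the requested image (the variable name is just illustrative):
header('Content-Type: image/jpeg');

if (isset($_GET['dl']) && $_GET['dl'] == 1) {
    // ?dl=1 present: ask the browser to download instead of displaying inline
    header('Content-Disposition: attachment; filename="' . basename($imagePath) . '"');
}

header('Content-Length: ' . filesize($imagePath));
readfile($imagePath);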
Our web application has a version number that gets served out to the client on each request so we can detect an update to the code (i.e. rolling updates) and display a popup informing the user to reload to take advantage of the latest update.
But I'm experiencing some weird behaviour after updating the version number on the server: some requests return the new version number and some return the old, so the popup keeps popping up until you have reloaded the page a few times.
Originally I suspected Apache might be caching files read off disk via file_get_contents(), so instead of storing the version number in a plain text file I now store it in a PHP file that gets included with each request, but I'm experiencing the exact same issue!
Does anyone have any idea what might be causing Apache or PHP itself to serve out old information after I have done an update?
EDIT: I have confirmed it's not browser caching, as I can have the client generate unique URLs to the server (which it can deal with via a rewrite) and I still see the same issue where some requests return the old version number and some the new; clearing the browser cache doesn't help.
EDIT 2: The response headers as requested
HTTP/1.1 200 OK
Date: Mon, 23 Jul 2012 16:50:53 GMT
Server: Apache/2.2.14 (Ubuntu)
X-Powered-By: PHP/5.3.2-1ubuntu4.7
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 500
Connection: close
Content-Type: text/html
EDIT 3: While trying to reproduce this to capture the response headers, I found I could only make it happen by going through our full deploy process, which involves creating versioned folders for the code and symlinking the relevant folder into the webroot. Just changing the version number wasn't enough to cause it! So it seems to be somehow related to the symlinks I create!
I have the same problem when the symlink changes. Have a look at https://bugs.php.net/bug.php?id=36555; it may be what you are looking for.
Try (as suggested in that bug report) setting realpath_cache_size to 0.
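Note that realpath_cache_size is a system-level setting, so it goes in php.ini (e.g. realpath_cache_size = 0) rather than ini_set(). As a hedged sketch, you can also inspect or clear the cache at runtime, for example from a deploy/warm-up script:
// See what this PHP process currently has cached for resolved paths (PHP 5.3.2+).
var_dump(realpath_cache_size());
print_r(realpath_cache_get());

// After swapping the symlink, drop the cached entries for this process;
// passing true clears the realpath cache as well as the stat cache.
clearstatcache(true);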
EDIT #2
OK, the problem is different from what I originally thought, so I'm putting this edit at the top. I've also updated the question title (the old one was 'Streaming wav audio from a mysql blob').
The problem now seems to be related to CodeIgniter sessions. The script (below, in the original text) will only run if the user is logged in. For some reason, when I manually set the response headers (either with PHP's header() or CodeIgniter's output class), I can see from my logs that everything gets reloaded and reinitialised -- and that the session data is lost, so the user is no longer logged in and the script is actually outputting an error.
Once I removed any requirement for or reference to session data, the audio plays fine... but this isn't really an option unless I can manage to authenticate the user some other way. Very frustrating.
** Original Text **
I have a mysql blob which contains audio data in wav format. I'm trying to output that to the browser to stream in whatever audio player plugin the browser wants to use.
The audio player works fine if I point it at a physical .wav file, so that doesn't seem to be a problem. However, if I point it at my PHP script I get a 'Search for suitable plugin?' popup in Firefox (which fails to find anything), and just an empty inactive player in Chrome.
Here's the PHP, the $aud object contains information retrieved from the database:
header("Content-length: $aud->size");
header("Content-type: audio/x-wav");
echo $aud->recording;
exit();
If I add header("Content-Disposition: attachment; filename=$name"); to the above I get to download the file, which then plays fine in an external audio player, but that's not what I want.
This snippet is part of a CodeIgniter application, if that makes a difference. I have routing set up so that /audio/$id.wav will grab the appropriate data and output it with the code above.
Can anyone see or think of any reason the above might not be working?
EDIT
These are the headers returned by the php script:
Date: Tue, 22 Mar 2011 22:12:06 GMT
Server: Apache/2.2.16 (Ubuntu)
X-Powered-By: PHP/5.3.3-1ubuntu9.3
Set-Cookie: ci_session=<long encrypted session string>; expires=Wed, 23-Mar-2011 00:12:06 GMT; path=/
Content-Length: 12345
Keep-Alive: timeout=15, max=98
Connection: Keep-Alive
Content-Type: audio/x-wav
200 OK
And for comparison, these headers are returned when I force a download of the above audio and open that wav file directly in the browser:
Date: Tue, 22 Mar 2011 22:10:53 GMT
Server: Apache/2.2.16 (Ubuntu)
Last-Modified: Tue, 22 Mar 2011 22:08:30 GMT
Etag: "200a83-3039-49f197bfcb380"
Accept-Ranges: bytes
Content-Length: 12345
Content-Type: audio/x-wav
200 OK
Saving and then opening the file directly does work. Having the PHP script output to the browser does not.
I would start investigating this problem by comparing the HTTP content-length, content-type, and all the other headers sent by the web server when pointing to the physical .wav file, with the headers sent when trying to open the PHP script. I think the actual content body is correct, according to your post stating that if you download the file as an attachment, it can be played with an audio player application.
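If you want to do that comparison quickly, a throwaway script along these lines works (the URLs are placeholders):
// Print the response headers for both URLs so they can be diffed side by side.
print_r(get_headers('http://example.com/test/static-copy.wav')); // physical file
print_r(get_headers('http://example.com/audio/123.wav'));        // CodeIgniter route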
Try adding this header in there:
header('Content-Transfer-Encoding: binary');
See if that helps it out.
I believe this would be a more CPU-friendly method: instead of gzipping the content for every request, I compress the files once and serve those instead. Can it be implemented with PHP? =)
Yes, this is quite easy to do with Apache.
Store the uncompressed and compressed files side by side. E.g.:
\-htdocs
|-index.php
|-javascript.js
\-javascript.js.gz
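If you want to produce those .gz copies from PHP (which is what the question asks about), a minimal sketch along these lines would do; the file name matches the layout above:
// Pre-compress the asset once so the web server can serve the .gz copy later.
$source = dirname(__FILE__) . '/javascript.js';
$target = $source . '.gz';

// Only rebuild the compressed copy when the source has changed.
if (!file_exists($target) || filemtime($source) > filemtime($target)) {
    file_put_contents($target, gzencode(file_get_contents($source), 9));
}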
Enable content negotiation in Apache. Use:
Options +MultiViews
Now when "/javascript" is requested, Apache will serve the gzipped version if the client declares it accepts it (through Accept-encoding).
Example of two HTTP requests (some headers omitted):
Client claims to accept gzip
GET /EP/Exames/2006-2007/exame2B HTTP/1.1
Host: lebm.geleia.net
Accept-Encoding: gzip, identity
HTTP/1.1 200 OK
Date: Fri, 13 Aug 2010 16:22:59 GMT
Content-Location: exame2B.nb.gz
Vary: negotiate,accept-encoding
TCN: choice
Last-Modified: Sun, 04 Feb 2007 15:33:53 GMT
ETag: "0-c9d-428a84de03a40;48db6d490abee"
Accept-Ranges: bytes
Content-Length: 3229
Content-Type: application/mathematica
Content-Encoding: gzip
(binary gzip data)
(response continues)
Client does not claim to accept gzip
GET /EP/Exames/2006-2007/exame2B HTTP/1.1
Host: lebm.geleia.net
Accept-Encoding: identity
HTTP/1.1 200 OK
Date: Fri, 13 Aug 2010 16:23:14 GMT
Content-Location: exame2B.nb
Vary: negotiate,accept-encoding
TCN: choice
Last-Modified: Sun, 04 Feb 2007 15:33:53 GMT
ETag: "0-257f-428a84de03a40;48db6d490abee"
Accept-Ranges: bytes
Content-Length: 9599
Content-Type: application/mathematica
(************** Content-type: application/mathematica **************
CreatedBy='Mathematica 5.2'
(response continues)
See a more complete version here http://pastebin.com/TAwxpngX
Yes, this is a sensible approach to save both bandwidth and connections. (You can enable gzip compression within Apache if so desired, but it's potentially worth doing this anyway as you'll save connections.)
In essence, use a PHP function to check whether the browser supports gzip compression. (If it doesn't, you'll need to serve the JavaScript/CSS as normal.) If it does, you can simply point the JavaScript or CSS source location at a PHP script which is responsible for:
Checking to see if there's a compressed version in place. (Simply output the existing 'on disk' version if there is.)
Creating a compressed version of the required files.
You'll also probably want to be able to enable/disable this from a define/top-level config (for testing purposes, etc.). As a suggestion, you could store the required CSS/JavaScript file paths in a set of arrays, which could be used as the basis for creating the cache file, or for including the files in the traditional manner as a fallback.
I've written a solution along these lines in the past that created a file based on a hash of the required filenames. As such, the cache was automatically rebuilt if a different/additional file was included. (It also rebuilt the cache after 'n' hours, but that was only to keep things fresh in case the filenames didn't change but the content did.)
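Roughly, that hash-based cache could look like the sketch below (the file list, cache path and rebuild interval are purely illustrative):
// The files to combine; the hash changes automatically when this list changes.
$files = array('css/reset.css', 'css/layout.css', 'css/theme.css');
$cacheFile = 'cache/' . md5(implode('|', $files)) . '.css.gz';
$maxAge = 6 * 3600; // also rebuild after 'n' hours to pick up content-only changes

if (!file_exists($cacheFile) || filemtime($cacheFile) < time() - $maxAge) {
    $combined = '';
    foreach ($files as $file) {
        $combined .= file_get_contents($file) . "\n";
    }
    file_put_contents($cacheFile, gzencode($combined, 9));
}

if (isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
    // Browser accepts gzip: send the pre-compressed cache file as-is.
    header('Content-Type: text/css');
    header('Content-Encoding: gzip');
    header('Content-Length: ' . filesize($cacheFile));
    readfile($cacheFile);
} else {
    // Fallback: serve the files uncompressed in the traditional manner.
    header('Content-Type: text/css');
    foreach ($files as $file) {
        readfile($file);
    }
}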
I've got a page on a web app that has about 13 images that are generated by my application, which is written in the Kohana PHP framework. The images are actually graphs. They are cached so they are only generated once, but the first time the user visits the page, when the images all have to be generated, about half of the images don't load in the browser. Once the page has been requested once and the images are cached, they all load successfully.
Doing some ad-hoc testing, if I load an individual image in the browser, it takes from 450-700 ms to load with an empty cache (I checked this using Google Chrome's resource tracking feature). For reference, it takes around 90-150 ms to load a cached image.
Even if the image cache is empty, I have the data and some of the application's startup tasks cached, so that after the first request, none of that data needs to be fetched.
My questions are:
Why are the images failing to load? It seems like the browser just decides not to download the image after a certain point, rather than waiting for them all to finish loading.
What can I do to get them to load the first time, with an empty cache?
Obviously one option is to decrease the load times, and I could figure out how to do that by profiling the app, but are there other options?
As I mentioned, the app is in the Kohana PHP framework, and it's running on Apache. As an aside, I've solved this problem for now by fetching the page as soon as the data is available (it comes from a batch process), so that the images are always cached by the time the user sees them. That feels like a kludgey solution to me, though, and I'm curious about what's actually going on.
Edit: A commenter asked to see the headers for the request:
Request
Request URL: http://domain.com/charts/chart_name/1234/
Request Method: GET
Status Code: 200 OK
Request Headers
Cache-Control: max-age=0
Referer: http://domain.com/home/chart_page
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.55 Safari/533.4
Response Headers
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection: Keep-Alive
Content-Length: 6354
Content-Type: image/png
Date: Wed, 26 May 2010 21:10:45 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive: timeout=15, max=94
Pragma: no-cache
Server: Apache
X-Powered-By: PHP/5.3.1
With the image cached, the only difference in the response headers is:
Content-Length: 1129
Keep-Alive:timeout=15, max=96
I am looking into the strange difference in content length, as it should be exactly the same content. I realize that this is likely not optimized in terms of getting the browser to cache the image, but once the images have been generated, the entire page load (including downloading images, scripts, etc.) takes about 1-2 seconds. Without the images cached on the server, the page load takes 20-30s and several of the images fail to load at all.
After noticing the discrepancy in file sizes there, I realized that I had Kohana's profiler set up incorrectly, so it was outputting a bunch of profile data at the end of the images. Not a lot per request, but in aggregate, it made a considerable difference. The images all load now.
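Not the fix used here (that was correcting the profiler configuration), but as a general guard when a framework might append output after binary data, something along these lines helps; $png is a placeholder for the generated image string:
// Discard anything already sitting in the output buffers, then send the image
// and stop, so nothing (profilers, debug hooks) can be appended afterwards.
while (ob_get_level() > 0) {
    ob_end_clean();
}
header('Content-Type: image/png');
header('Content-Length: ' . strlen($png));
echo $png;
exit;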