I'm using readfile() and header() to make the user download a file. This works fine, but today when I tried the script with an MP4 video file, the video came out corrupted.
The video was definitely uploaded successfully, because if I access it directly in the address bar I can download it, but if I use my script (for example download.php?id=105) I get a corrupted video. What I can't understand is why all other files (videos, images, PDFs, etc.) were downloaded correctly and only this file is corrupted.
P.S. The script I use (again, it works for every file except this new video) is:
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header('Content-Type: '.$mime);
header("Content-Transfer-Encoding: Binary");
header('Content-length: '.filesize($href));
header('Content-Disposition: inline; filename="'.(pathinfo($href, PATHINFO_BASENAME)).'"');
readfile($href);
EDIT: If I remove the headers and readfile(), no error is shown.
Your problem is here:
echo readfile($href);
readfile() returns an integer (the number of bytes read) and outputs the file contents by itself. You don't need to echo it (and, in fact, you shouldn't). Use:
readfile($href);
because otherwise you will definitely not get the output you are expecting.
Maybe change max_execution_time and max_input_time in php.ini (the PHP configuration file):
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 300
; Maximum amount of time each script may spend parsing request data. It's a good
; idea to limit this time on productions servers in order to eliminate unexpectedly
; long running scripts.
; Note: This directive is hardcoded to -1 for the CLI SAPI
; Default Value: -1 (Unlimited)
; Development Value: 60 (60 seconds)
; Production Value: 60 (60 seconds)
; http://php.net/max-input-time
max_input_time = 300
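If you can't edit php.ini on your host, a minimal sketch of raising the execution limit for a single script at runtime (this is an assumption about your setup; the call may be ignored if the host forbids runtime overrides, and max_input_time itself can only be changed in php.ini or per-directory configuration):
<?php
// Allow this one script up to 300 seconds of execution time.
// Has no effect if the host disallows changing the limit at runtime.
set_time_limit(300);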
My guess would be a wrong file path, but you said it is correct.
That leaves only one possible error source: the code above the headers, which you did not show us.
EDIT: If I remove the headers and readfile, no error is shown.
Please remove the headers again (but keep the readfile() call) and enable error reporting:
<?php
// Put these lines to the very top of your script
error_reporting(E_ALL);
ini_set('display_errors', true);
ini_set('display_startup_errors', true);
ini_set('xmlrpc_errors', true);
Quite sure you will see an error now.
I have the same problem. The downloaded file is corrupted, but the one on the server is fine. It happens somewhat randomly. Sometimes the downloaded file takes the name of the script plus its .php extension. I guess it's a problem with Windows. Try uploading to a Linux server and see if the problem persists.
I've been using PHP for most of my web apps, which usually included some form of download or another. They all worked fine until my most recent project. Videos/other files are uploaded correctly, but are corrupted on download. In order to inspect the problem, I used Burp Suite to intercept/inspect web traffic and here's what I got:
Image showing unwanted space in Video file
Notice the number of blank lines in the response at the right of the image (after "Content-Length: 11028102"). The HTTP standard allows only one blank line (a newline sequence) between the headers and the response body (which in this case is the video, the apparently garbled text starting at line 21). In other words, the response body/video data should have started at line 13.
Initially, I thought it was a server problem, so I switched from Apache running on LAMPP to Nginx. Still, the problem persisted. I then suspected the PHP readfile() function, so I set up a Node server on port 3000 strictly for downloads: the PHP /download page processes the request and fetches the file from the Node server. I thought it would be sufficient and faster, since the Node server does not create a new thread for each connection, unlike PHP and many other setups. It worked out well: the video file started at line 13 (in my case) as it should, the files were no longer corrupted, and it was very fast. However, this isn't a proper solution.
Simple solution: Recently I got curious about this problem again and inspected my index page using Burp Suite once more. I noticed something similar to what was corrupting the files: unnecessary space before the HTML, as can be seen in the image below:
Image showing index file wrongly returned
The same space was there, but the HTML comment showed up correctly starting at line 12. This was the top portion of my index file:
<!--header-->
<?php
require_once "core/init.php";
require_once "header.php";
?>
This showed that the comment was output normally, starting at line 12, but there was space before the <!DOCTYPE html> contained in "header.php". The only thing sitting between the HTML comment and "header.php" (where the HTML starts) was "init.php". This had to mean something was wrong with my "init.php" file.
I then opened the file and noticed several blank lines, corresponding in number to the blank lines output in Burp Suite, as shown in this image:
Image showing culprit blank lines in "init.php"
I simply deleted the blank lines and my files (in this case videos) were output properly, as shown in this image:
Image showing properly outputted file from PHP
So, if you've got corrupted files on download, this is the problem. On that page, remove ANY space before your PHP tags, and check the files included on that page directly or indirectly by PHP, e.g. "init.php", function files, database connection files; make sure there is NO space before the PHP opening tag and after the PHP closing tag. Clear all these blank lines and you're good to go! Good luck with your projects!
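If you can't spot the stray whitespace by eye, a minimal sketch that uses PHP's built-in headers_sent() to report exactly which file and line produced output first (the include of "core/init.php" is taken from the example above; adjust to your own includes):
<?php
require_once "core/init.php"; // any include that might contain stray whitespace or a BOM

$file = '';
$line = 0;
if (headers_sent($file, $line)) {
    // Output has already started, so header() calls would fail and
    // readfile() output would be polluted; report where it began.
    exit("Output already started in $file on line $line");
}

// Safe to send download headers and call readfile() from here.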
I'm randomly getting download errors from a link on a page. I simplified the link to point to a directory so it's easy to use in emails to users.
On the main page the link looks like this:
<a href="http://myPage.com/Mac" target="_blank" id="macDownloadButton" class="downloadbutton w-button">Download Mac version</a>
On my server, that's a directory with an index.php in it which looks like this:
<?php
// mac version
$file="http://www.myPage.com/downloads/myApp_Mac.zip";
$filename="myApp_Mac.zip";
header('Content-Transfer-Encoding: binary');
header('Accept-Ranges: bytes');
header('Content-Length: ' . filesize($file));
header('Content-Encoding: none');
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . $filename);
readfile($file);
exit;
?>
Again, the reason I do this is so there's a simple link to send to users in email, like "http://myPage.com/Mac" and "http://myPage.com/Windows".
The weird thing is that it mostly works...but sometimes it doesn't.
What am I doing wrong?
It's hard to know precisely what's wrong unless you check for errors on your readfile() call.
But you're invoking your web server from your web server here when you specify a filename starting with http. You're doing
readfile('http://www.myPage.com/downloads/myApp_Mac.zip');
where you could just as easily do
readfile('../downloads/myApp_Mac.zip');
and read the zip file from the local file system to send to your user.
What's more, filesize('../downloads/myApp_Mac.zip') will quickly yield a numeric value that you can send in the Content-Length header. That will allow the browser, by knowing the total size of the file you're sending, to display a meaningful progress bar.
You should remove the Accept-Ranges header; the php program you showed us doesn't honor range requests. If you lie to the browser by telling it you do honor those requests, the browser may get confused and corrupt the downloaded copy of your file. That will baffle your user.
Your Content-Disposition header is perfect. It defines the filename to be used on your user's machine in the downloads folder.
Simple operations are more reliable, and this may help you.
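Putting those points together, a minimal sketch of what the index.php could look like (the relative path ../downloads/myApp_Mac.zip is an assumption; adjust it to wherever the zip actually sits on your file system):
<?php
// mac version: read the zip from the local file system, not over HTTP.
$file     = '../downloads/myApp_Mac.zip'; // assumed local path
$filename = 'myApp_Mac.zip';

if (!is_readable($file)) {
    http_response_code(404);
    exit('File not found');
}

header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename=' . $filename);
// Note: no Accept-Ranges header, since this script does not honor range requests.
readfile($file);
exit;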
The reason you got "stat failed" with the link as an error message is this: stat(2) is an operating-system call that operates on files in local and mounted file systems.
As previously mentioned by O. Jones you should definitely always use your local file path.
Most of my previous issues have been browser related: I needed to tweak or add an HTTP header, and in one case I needed to send all the HTTP headers in lowercase, but I haven't had an issue like that in years. My personal recommendation would be to use a solid download library/function; it will make a noticeable difference to your productivity as well as rule out most browser-related issues you may come across.
I have used the CodeIgniter download helper for the last 3 years and recommend it for 99% of use cases. At the very least I would recommend you read through its code; you will probably find a few cases you have never even considered, such as clearing the output buffer, MIME detection, and even a special case for Android 2.1, as well as a few headers you may or may not need.
If all else fails, I have no idea what server you're running this on, but if you continue to have issues I would recommend monitoring which processes your machine is running, paying close attention to RAM and IO usage. I've encountered bad or misbehaving services that run periodically, using 99% of my IO or RAM for short intervals at a time, which caused a few really odd and unexpected errors.
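To give a rough idea of what such a helper takes care of, here is a minimal sketch (not the CodeIgniter code; the function name send_download() is made up for illustration):
<?php
// Sketch of a download helper: clear buffered output, detect the MIME
// type, send the headers, then stream the file.
function send_download(string $path, string $downloadName): void
{
    if (!is_readable($path)) {
        http_response_code(404);
        exit('File not found');
    }

    // Discard anything already buffered (stray whitespace, BOMs, notices).
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    $mime = mime_content_type($path) ?: 'application/octet-stream';

    header('Content-Type: ' . $mime);
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');

    readfile($path);
    exit;
}

send_download('../downloads/report.pdf', 'report.pdf'); // example call; paths are placeholders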
The following script is trivial and works on Apache without issue:
header('Content-Type: image/jpeg');
echo file_get_contents('./photo.jpg');
On NGINX/PHP-FPM I get a blank page. I have tried two different virtual servers: one I created myself, and the Homestead Improved box (https://github.com/Swader/homestead_improved), which is based on Laravel Homestead.
Error reporting is on, there are no errors. If I remove the header and just use:
echo file_get_contents('./photo.jpg');
I get the binary converted to ASCII and see the strange characters; the file is being loaded correctly.
I thought the issue might be a missing header, so I tried content length:
header('Content-Type: image/jpeg');
$contents = file_get_contents('./photo.jpg');
header('Content-length: ' . strlen($contents));
echo $contents;
This gives a different result: The page never loads, as if the browser never receives all the bytes it's expecting.
If I print strlen($contents) it displays the file size in bytes. PHP is loading the image correctly, but it's never reaching the browser.
The script works on an Apache server so the issue seems to be NGINX or PHP-FPM.
I have tried different images (one 80kb, one 2.2mb), the result is the same. I've also tried readfile instead of file_get_contents.
Update
In Chrome developer tools, the full image is downloaded and shown in the Network tab. The browser is getting the data but it's not displayed.
Your problem lies in process memory. PHP uses a different configuration file when running under PHP-FPM than when running under Apache, for instance.
The problem with file_get_contents() is that it reads the entire file into memory. In the case of an image file, memory reaches its limit and the HTTP response never completes.
To fix the problem, you can either stream the image using fopen() or increase the memory limit in PHP-FPM's php.ini.
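A minimal sketch of the streaming approach, reading the image in fixed-size chunks so it never has to fit in memory all at once (./photo.jpg is the file from the question):
<?php
$path = './photo.jpg';

$fh = fopen($path, 'rb');
if ($fh === false) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));

// Send the file 8 KB at a time instead of loading it all with file_get_contents().
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();
}
fclose($fh);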
As Scriptonomy said, memory can be the issue. You can also use readfile():
https://secure.php.net/manual/en/function.readfile.php
The block of code shown below uses a PHP header redirect. It works locally, but not on my Bluehost server:
if ($_POST['submit']=='No')
{
$url ='Location: index.php?id='.$id.'&page='.$page;
header($url);
exit;
}
When the server gets to this block of code, absolutely nothing happens. No error, no warning, just a blank page. The page that my form submit redirects to isn't supposed to do anything except reroute the user to the relevant page.
I'm pretty dang positive it has nothing to do with the common problem of including HTML before the redirect (since it works locally). Therefore I suspect it has something to do with differences between my php.ini files. I've pulled up phpinfo() for both servers: my local server has a module named mod_headers while my Bluehost server has none. I think this could potentially be the problem, although normally Bluehost has no trouble with header redirects, except in this one instance.
So I suspect the problem has something to do with my ini file, but I don't know exactly what.
What makes this problem even stranger is that there are other blocks of code which work just fine, for instance
if(!empty($_POST['id']))
{
$id = htmlentities(strip_tags($_POST['id']));
$sql = "UPDATE entries SET title=?, entry=? WHERE id=? LIMIT 1";
$stmt = $db->prepare($sql);
$stmt->execute(array($title,$entry,$id));
$stmt->closeCursor();
$url= 'Location: ../index.php?id='.$id.'&page='.$page;
header($url);
exit;
}
works just great.
I had the same problem: it works fine on XAMPP but not on Bluehost or 1&1.
The PHP documentation says there should be no output before calling the header() function:
http://php.net/manual/en/function.header.php
In my case there was a space before the opening <?php, which made the header() function not work.
This should help solve the problem.
The question remains, though: why does it work on XAMPP and not on the servers of Bluehost, 1&1, etc.?
(Since I use Firefox for both tests, it is definitely not a browser issue.)
I just had this problem. I realized that I was using an editor that was inserting a BOM at the beginning of the file. This BOM character is virtually invisible in most editors but will stop PHP's header() from working because it does count as output. Here is more information on the BOM character:
http://en.wikipedia.org/wiki/Byte_order_mark
I was using Notepad++ and was able to disable this BOM with these steps:
Go to Settings > Preferences > New Document/Default Directory
Change the encoding to UTF-8 without BOM
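If you want to verify a file without trusting the editor, a minimal sketch that checks for the three BOM bytes at the start of a file (script.php is just a placeholder path):
<?php
// A UTF-8 BOM is the byte sequence EF BB BF at the very start of a file.
$path  = 'script.php'; // placeholder: the file you want to check
$bytes = file_get_contents($path, false, null, 0, 3);

echo $bytes === "\xEF\xBB\xBF"
    ? "$path starts with a UTF-8 BOM\n"
    : "$path has no UTF-8 BOM\n";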
The HTTP standard requires that the URL in the Location: header be an absolute URL. You can't just use index.php for that redirect. You'll need to use:
header("Location: http://example.com/index.php");
Some browsers ignore the standard and allow relative URLs.
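If you don't want to hard-code the host name, a minimal sketch of building the absolute URL from the current request ($id and $page are assumed to be set, as in the question):
<?php
// Build an absolute redirect URL from the incoming request.
$scheme = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
$host   = $_SERVER['HTTP_HOST'];

header('Location: ' . $scheme . '://' . $host . '/index.php?id=' . urlencode($id) . '&page=' . urlencode($page));
exit;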
I faced the same problem when moving my WordPress blog to Bluehost.
The solution was to change the output_buffering option in php.ini.
Here is my config, along with the explanation of this option:
; Output buffering allows you to send header lines (including cookies) even
; after you send body content, at the price of slowing PHP's output layer a
; bit. You can enable output buffering during runtime by calling the output
; buffering functions. You can also enable output buffering for all files by
; setting this directive to On. If you wish to limit the size of the buffer
; to a certain size - you can use a maximum number of bytes instead of 'On', as
; a value for this directive (e.g., output_buffering=4096).
output_buffering = 4096
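If you can't touch php.ini on shared hosting, the php.ini comment above points at an alternative: enable output buffering at runtime, at the very top of the script, before anything is echoed. A minimal sketch (example.com is a placeholder; $id and $page come from the question):
<?php
// Buffer all output so header() still works even if some HTML
// has already been produced further down the page.
ob_start();

// ... page logic that may echo content ...

if (isset($_POST['submit']) && $_POST['submit'] == 'No') {
    header('Location: http://example.com/index.php?id=' . urlencode($id) . '&page=' . urlencode($page));
    ob_end_clean(); // throw away anything buffered; we only want the redirect
    exit;
}

ob_end_flush(); // otherwise send the buffered page normally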
We have a web app using Andrew Valums' ajax file uploader. If we kick off 5 to 10 image uploads at once, more often than not at least 2 or 3 will result in the same GD error, "Corrupt JPEG data":
Warning: imagecreatefromjpeg() [function.imagecreatefromjpeg]:
gd-jpeg, libjpeg: recoverable error: Corrupt JPEG data:
47 extraneous bytes before marker 0xd9 in ....
However, this did not happen on our old test server or on our local development boxes, only on our new production server.
The file size on the server is the same as the original on my local machine, so the upload completes, but I think the data is being corrupted by the server.
I can "fix" the broken files by deleting them and uploading again, or by manually uploading via FTP.
We had a shared host on GoDaddy and have just started to have this issue on a new box (that I set up, so that probably explains a lot :) CentOS 5.5+, Apache 2.2.3, PHP 5.2.10.
You can see some example good and bad pictures here: http://174.127.115.220/temp/pics.zip
When I BinDiffed them I saw a consistent pattern: the corruption is always in 64-byte blocks, and while the distance between corrupted blocks is not constant, the number 4356 comes up a lot.
I really think we can rule out the Internet, as error checking and retransmission with TCP is pretty reliable; furthermore, there seems to be no difference between browser versions, or if I turn anti-virus and firewalls off.
So my guess is the configuration of Apache / PHP?
Some cameras will append some data inside the file that gets interpreted incorrectly (most likely due to character encoding within the headers).
A solution I found was to read the file in binary mode, like so:
// Read the JPEG in binary mode so nothing mangles the raw bytes.
$fh  = fopen('test.jpg', 'rb');
$str = '';
if ($fh !== false) {
    while (!feof($fh)) {
        $str .= fread($fh, 1024);
    }
    fclose($fh);
}
// Build the image from the raw string; @ suppresses the "Corrupt JPEG data" warning.
$test = @imagecreatefromstring($str);
imagepng($test, 'save.png');
Well, I think the problem is the JPEG header data, and as far as I know there is nothing PHP can do about it. I think the problem is your file uploader; maybe there is some configuration for it that you are missing.
Hmm, a 64-byte corruption? Or did you mean 64-bit?
I'm going to suggest that the issue is in fact a result of the PHP script. The problem that regularly comes up here is that the script inserts CRLFs into the data stream being uploaded, caused by differences between the Windows and *nix line-ending conventions.
The solution is to force the PHP script to handle the upload in binary mode (use the 'b' flag for ALL fopen() calls in the PHP upload code). It is safe to open a text file in binary mode; at least you can still see the data.
Read here for more information on this issue:
http://us2.php.net/manual/en/function.fopen.php
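For illustration, a minimal sketch of copying an uploaded file with binary-mode fopen() (the field name 'upload' and the destination path are placeholders):
<?php
// Copy the uploaded file byte for byte; the 'b' in the mode keeps PHP
// (especially on Windows) from translating line endings in binary data.
$src = $_FILES['upload']['tmp_name']; // assumed form field name
$dst = 'uploads/photo.jpg';           // placeholder destination

$in  = fopen($src, 'rb');
$out = fopen($dst, 'wb');

while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);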
This can be solved with:
ini_set ('gd.jpeg_ignore_warning', 1);
I had this problem with GoDaddy hosting.
I had created the database on GoDaddy using their cPanel interface. It was created with a "latin" collation (or something like that). The database on the development server was UTF-8. I tried all the solutions on this page, to no avail. Then I converted the database to UTF-8, and it worked.
Database encoding shouldn't affect BLOB data (or so I would think). BLOB stands for Binary Large OBject, to my knowledge!
Also, strangely, the data that was copied from the dev to the production server while the database was still "latin" was not corrupted at all. It's only when inserting new images that the problem appeared. So I guess the image data was being fed to MySQL as text data; I think there is a way (when using SQL) of inserting binary data, and I did not follow it.
Edit: just took a look at the MySQL export script, here it is:
INSERT INTO ... VALUES (..., _binary 0xFFD8FF ...
Anyway, hope this will help someone. The OP did not indicate what solved his problem...
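For reference, a minimal sketch of inserting image data as a bound parameter so MySQL never treats it as text (the images table, the data column, and the $db PDO connection are assumptions for illustration):
<?php
// $db is an existing PDO connection (assumed); 'images'/'data' are placeholder names.
$jpeg = file_get_contents('photo.jpg');

$stmt = $db->prepare('INSERT INTO images (data) VALUES (?)');
$stmt->bindParam(1, $jpeg, PDO::PARAM_LOB);
$stmt->execute();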
I'm working on a PHP script which generates large (multi-MB) output on the fly without knowing the length in advance. I am writing directly to php://output via fwrite() and have tried both standard output and Transfer-Encoding: chunked (encoding the chunks as required), but no matter what I try, the browser waits until all the data is written before displaying a download dialog. I have also tried flush()ing after the headers and after each chunk, but this makes no difference.
I'm guessing that Apache is caching the output, as the browser would normally display the download dialog after receiving a few kB from the server.
Does anyone have any ideas on how to stop this caching and flush the data to the browser as it is generated?
Thanks,
J
First of all, as BlaM mentioned in his comment, if output buffering is enabled in the PHP configuration it won't work, so it would be useful to know your phpinfo().
Next, try whether it works with a big file that is stored on your webserver, outputting it using readfile(). And, together with this, check that you are sending the correct headers. Hints on how to use readfile() and send the correct headers are provided here: StackOverflow: How to force a file download in PHP
And while you are at it, call ob_end_flush() or ob_end_clean() at the top of your script.
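A minimal sketch of streaming generated output as it is produced, with output buffering turned off first (generate_chunk() is a made-up placeholder for whatever produces your data, and Apache-level buffering such as mod_deflate can still delay things):
<?php
// Flush and close any PHP-level output buffers so chunks go straight out.
while (ob_get_level() > 0) {
    ob_end_flush();
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="export.dat"');

for ($i = 0; $i < 1000; $i++) {
    echo generate_chunk($i); // placeholder for your on-the-fly data
    flush();                 // ask PHP and the web server to send what we have
}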