I cannot seem to figure this issue out. I'm downloading a file (Moodle .mbz extension) using PHP with HTTP headers. The file is first downloaded from Amazon S3 to a server using the AWS PHP SDK, which works fine. The issue is downloading the file from the server to another workstation. No matter what I change, the MD5 checksums of the file on the workstation and the file on the server do not match, and clients cannot restore the .mbz file they have downloaded to their workstation. It seems something is happening to the file to change it in some way, but I cannot figure out what.
I have referenced:
1. this tutorial
2. this similar SO question
and various other resources via Google. I'm desperate, so please help if you can. I have ensured compression is off in httpd.conf and turned debugging completely off in php.ini. I can replicate the issue in both my development and staging environments.
Here's the code for the direct download:
<?php
if (!file_exists($filename)) {
    // file does not exist
    die('file ' . $filename . ' not found');
} else {
    $this->load->model('Job_model', 'job');
    $save_as = $this->job->get_file_name_from_key($filename);

    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"$save_as\"");
    header("Expires: 0");
    header("Pragma: public");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Content-Length: " . filesize($filename));
    readfile($filename);
}
Using PHP with the CodeIgniter framework, if that's relevant.
Thank you!
Edit: An octal dump of the first 16 bytes shows that four spaces are being prepended to the file downloaded using the above code, compared to the file on the server.
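If anyone wants to reproduce the comparison in PHP rather than with an octal dump, here's a minimal sketch; the two paths are placeholders:
<?php
// Dump the first 16 bytes of each copy as hex; four leading 0x20 (space)
// bytes in the downloaded copy would confirm the prepended whitespace.
echo bin2hex(file_get_contents('/srv/backups/course.mbz', false, null, 0, 16)), "\n";
echo bin2hex(file_get_contents('/tmp/downloaded-course.mbz', false, null, 0, 16)), "\n";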
This is not so much an answer as a suggestion for an experiment.
I'm wondering if there is a MIME problem. I found this file which, on line 145, describes the MIME type for 'mbz' as 'application/vnd.moodle.backup'. Maybe by changing the Content-Type header you can get better results.
As I understand it, 'mbz' is basically a zip file. So try adding the following to the bottom of /application/config/mimes.php.
'mbz' => array('application/vnd.moodle.backup', 'application/zip'),
If you want to cover all the bases you could expand the definition to
'mbz' => array('application/vnd.moodle.backup', 'application/x-zip', 'application/zip', 'application/x-zip-compressed', 'application/s-compressed', 'multipart/x-zip'),
That takes all the possibilities for a zip type and makes 'application/vnd.moodle.backup' the default. I'd try the simpler version first.
Try this header
header("Content-Type: application/vnd.moodle.backup");
So I found this question, specifically the answer from MrPanda. I went through all the helpers and libraries (that I had written) that I called/initialized in my controller and deleted any whitespace after the closing PHP tag. Problem solved. Thank you to the users who tried to help me! Two days of frustration finally come to a close.
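For anyone hitting the same thing: the culprit is a helper or library file that ends with a closing PHP tag followed by stray whitespace, which PHP emits as output ahead of the file body. A sketch of the fix (the helper name is illustrative):
<?php
// application/helpers/job_helper.php (illustrative name)
function clean_job_key($key)
{
    return trim($key);
}
// No closing ?> tag: anything after it (even spaces or newlines)
// is sent as output and ends up prepended to binary downloads.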
Related
So my company stores all PDF files privately in Amazon S3.
When a user requests one, our system pulls it from Amazon S3 and then serves it to the user with the following code:
header("Cache-Control: public");
header("Pragma: public");
header("Expires: 0");
header("Content-Description: File Transfer");
header('Content-Disposition: attachment; filename="'.$fileName.'"');
header('Content-Length: ' . strlen($res->body));
header("Content-type: application/pdf");
header("Content-Transfer-Encoding: binary");
header('Connection: close');
echo $res->body;
$res is the response returned from Amazon, with the file content in $res->body.
I see random slow download speeds when users try to download the PDF files, especially for the larger PDFs (~5MB) compared to the rest, which are only 800KB-1.5MB.
Solutions tried:
1) Removing the Content-Length header doesn't help.
2) Removing 'EnableSendfile off' in httpd.conf doesn't help either.
I also checked the server to make sure it wasn't server workload that was causing this.
The speed tests of both the server and the user's workstation look good too.
Does anyone have any idea what could be causing this slowness?
It might be that multiple people from your company are using the internet connection at once.
You should do a full scan of the network, but even then you're not alone on the internet, and their "virtual" server environments are shared as well.
From my understanding, the issue is simply that it takes time to fetch the file from S3 in order to return it to the user.
Do yourself a favor:
create a signed URL for a short time period
redirect the user to that URL
Don't worry: creating signed URLs doesn't expose any private information that compromises your security (if you do it correctly).
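A minimal sketch of this approach with the AWS SDK for PHP v3, assuming $s3Client is an already-configured Aws\S3\S3Client; the bucket and key are placeholders:
// Generate a short-lived signed URL and redirect, so the download
// streams straight from S3 instead of passing through PHP.
$command = $s3Client->getCommand('GetObject', [
    'Bucket' => 'my-bucket',       // placeholder
    'Key'    => 'docs/report.pdf', // placeholder
]);
$request = $s3Client->createPresignedRequest($command, '+15 minutes');
header('Location: ' . (string) $request->getUri());
exit;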
I have used the following code to download an approximately 920MB file:
set_time_limit(0);
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("X-Sendfile: $zipname"); // For Large Files
header("Content-type: application/zip");
header("Content-Disposition: attachment; filename=\"".$zipname."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($directory_location . '/' . $zipname));
ob_end_flush();
readfile($directory_location . '/' . $zipname);
Before writing this code I did some study with the following links: Using X-Sendfile with Apache/PHP, Streaming a large file using PHP, and Limit download speed using PHP, but they weren't much help to me, because the download still takes a long time even with just a 2MB file. It doesn't show any transfer rate or anything else. I want the download to start serving the file at around 60Kbps, for all files (large or small).
UPDATE: One more thing I noticed: it doesn't show any download progress. It just executes, and after some time displays the pop-up to choose a location; after hitting Save, the file saves directly to the computer without any download progress window :(
Please help me find the right way.
Based on the above comments there are two solutions:
1) Just download the file directly. You don't appear to be doing any validation, so if that's the case, simply point the user at the file and let Apache handle it.
2) If you do need validation / pre-processing, then check mod_xsendfile. Adding the header isn't enough; you actually need to add the module to Apache. If you're on Linux, compile it from source (https://tn123.org/mod_xsendfile/). If you're not on Linux, the question "mod_xsendfile for Win x64?" has a response from the author saying he can provide binaries, but that was in 2010. There's a bit of advice around the web, although it's been a while since I looked at it, so I can't really help much more.
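If it helps, enabling the module looks roughly like this in httpd.conf; the module path and download directory are assumptions for your setup:
LoadModule xsendfile_module modules/mod_xsendfile.so
XSendFile On
# Restrict which directory X-Sendfile is allowed to serve from
XSendFilePath /var/www/downloads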
I am currently developing a PHP application in which my server downloads a file and the user can download the same file almost simultaneously. I have already thought about the problem of the user downloading faster than the server, but that's not an issue at the moment.
To do so, I used PHP's header and readfile functions. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I must use the Content-Length header to set the proper size of the file, not the size that has already been downloaded when the user clicks the link. However, after some seconds or minutes, the download stops and I need to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
I have experienced that this is directly related to maximum-runtime settings, which are enforced upon you if you run with safe_mode on.
If you have the option, try setting set_time_limit(0) and see if that makes it work.
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically for sending large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;
A PHP application is offering binary data as a download:
header("Content-Type: application/octet-stream");
header("Pragma: public");
header("Cache-Control: private");
header("Content-Disposition: attachment; filename=\"$filename\"");
header("expires: 0");
set_time_limit(0);
ob_clean();
flush();
#readfile($completefilename); exit;
$completefilename is a stream like "ftp://user:pwd@..."
The size of the data can be several megabytes. It works fine, but sporadically I get the following error:
It's most likely that the remote stream occasionally goes down or times out.
Also, as @fab says, it could be that the file you are trying to load is larger than your script's memory limit.
You should start logging the errors readfile() returns, e.g. using the error_log php.ini directive.
If this needs to be completely foolproof, I think you'll have to use something more refined than readfile() that allows you to set a timeout (like curl, or readfile() with stream context options).
You could then catch any errors that occur while downloading and serve a locally hosted fallback document instead. That document could, for example, be a text file containing the message "Resource xyz could not be loaded".
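A rough sketch of that idea using curl; the timeout values and fallback path are assumptions:
// Stream the remote file to the client with explicit timeouts.
$ch = curl_init($completefilename);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);    // assumed connect timeout
curl_setopt($ch, CURLOPT_TIMEOUT, 300);          // assumed overall timeout
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false); // write directly to output
if (curl_exec($ch) === false) {
    error_log('Download failed: ' . curl_error($ch));
    readfile('/var/www/fallback.txt'); // locally hosted fallback document
}
curl_close($ch);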
Do you have anything in your error logs?
Maybe PHP is running out of memory. readfile() itself streams the file, but with output buffering active it can end up holding the whole file in memory, so make sure memory_limit is larger than the largest file you work on with readfile(). Another option is to output the file in chunks using fread().
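A minimal chunked-output sketch along those lines:
// Send the file in 8 KB chunks so memory use stays flat
// regardless of the file size.
$handle = fopen($completefilename, 'rb');
if ($handle === false) {
    die('could not open file');
}
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush(); // push each chunk to the client
}
fclose($handle);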
I have a bad problem on my site where my file downloads are getting truncated when I use PHP to set the Content-Disposition to "attachment" and then sending the file using readfile(). I know this is somewhat of a notorious problem since there is plenty of discussion about it on PHP's readfile() manual page. I have tried each of the solutions posted there with no luck (sending chunked output, disabling gzip compression in both Apache/PHP). Here is the gist of that code, without all the workarounds, just as you'd find it on the PHP manual:
$file = '/path/to/ugly.file.name.mp4';
$filename = 'ANiceFileName.mp4';
header('Content-Description: File Transfer');
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="'.$filename.'"'); // $filename already includes the .mp4 extension
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
Most importantly, notice that I'd like to be able to send a dynamic file name to the browser, to avoid the more machine-friendly codes I use to store the files on disk. And yes, I have done plenty of security checks before this bit of code executes -- I already know readfile() can be a security flaw. :-)
The only reliable way I have found to send a complete file is to simply take PHP out of the picture and instead use Apache's mod_headers to force the file as a download. Like so:
<FilesMatch "\.mp4$">
ForceType application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
This works very well, but there's a big problem: I can no longer generate dynamic file names upon request, which is somewhat important to my content. I already know about specifying ;filename=file_name.mp4, but remember, I need to specify a dynamic name rather than just file_name.mp4.
It would be really nice if I could somehow notify Apache (using PHP) of the dynamic file name when sending a file. Is that possible? Or am I going to be forced to rename all my files on disk to a user-friendly name? The PHP solutions just aren't working at all. My files are sometimes up to 15MB, and those are often truncated to less than 2MB when transferring. Ouch!
Help me Stack Overflow Kenobi! You're my only hope.
Try mod_xsendfile as described here. This basically allows PHP to step out of the middle of the HTTP conversation at the point where PHP sets the X-Sendfile header, but, as you can see on that blog, it still lets you set your own Content-Disposition header.
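In your case that would look something like this; mod_xsendfile must be installed and enabled, and $file / $filename are the variables from your snippet:
// Apache streams the file from disk; PHP only sets the headers,
// including the dynamic, user-friendly download name.
header('X-Sendfile: ' . $file);
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="' . $filename . '"');
exit;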
Is it getting truncated because the PHP script times out? set_time_limit()?