I have the following jQuery:
$(".download").click(function(){
$.post('get_bot.php', "url_code="+url_code, function (response) {
alert(response);
});
});
url_code is a variable that holds the URL for a JSON structure; here is a live example of the return:
https://services.sapo.pt/Codebits/botmake/01,02,03,04,05,06,07,08,I%20Rule!
Those numbers are parameters to generate different images.
On my get_bot.php page I'm doing:
$urlc=$_POST['url_code'];
$bot = file_get_contents($urlc);
header("content-type: image/png");
echo $bot;
I'm looking for a way to get the response as a .png file download, so that when the user clicks .download a download prompt appears for the .png file.
Passing in a correct URL and echoing the file_get_contents result seems to work fine (although if I try to right-click and save the image, it actually saves the PHP file...).
Any help with this would be great. I'm not very experienced with JSON structures; so far I've only dealt with array structures, never image output.
I'm aware I'm probably way off here on getting an actual result, but any pointers would be appreciated.
RFC 2616 describes what you need to do. Basically, if I'm not mistaken, you need to add
Content-Disposition: attachment; filename="fname.ext"
to the headers.
EDIT
Here is a sample script. I've confirmed that this works on two of my servers with different setups.
<?php
header("Content-Type: image/jpeg");
header('Content-Disposition: attachment; filename="pic.jpg"');
readfile('http://lorempixel.com/400/200/');
?>
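Applied to the question's get_bot.php, a minimal sketch (assuming url_code still arrives via POST as in the question, and that the service returns PNG data) might look like:
<?php
// Minimal sketch, not a drop-in: assumes $_POST['url_code'] holds the image URL, as in the question.
$urlc = $_POST['url_code'];
$bot = file_get_contents($urlc);
header('Content-Type: image/png');
header('Content-Disposition: attachment; filename="bot.png"'); // prompts a save dialog instead of displaying inline
echo $bot;
?>
Note that as long as the request is made with $.post, the bytes land in the JavaScript callback rather than triggering the browser's save dialog; pointing the browser at the script directly (for example via window.location) is the usual way to get the prompt.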
Just use this function in your get_bot.php to start a download of the file instead of showing it in the browser (it should work cross-browser):
function download($file, $path)
{
$size = filesize($path.$file);
#ob_end_clean();
if(ini_get('zlib.output_compression'))
ini_set('zlib.output_compression', 'Off');
header('Content-Type: application/force-download');
header('Content-Disposition: attachment; filename="'.basename($file).'"');
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
header("Cache-control: no-cache, pre-check=0, post-check=0");
header("Cache-control: private");
header('Pragma: private');
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
if(isset($_SERVER['HTTP_RANGE']))
{
list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
list($range) = explode(",",$range,2);
list($range, $range_end) = explode("-", $range);
$range=intval($range);
if(!$range_end) {
$range_end=$size-1;
} else {
$range_end=intval($range_end);
}
$new_length = $range_end-$range+1;
header("HTTP/1.1 206 Partial Content");
header("Content-Length: $new_length");
header("Content-Range: bytes $range-$range_end/$size");
} else {
$new_length=$size;
header("Content-Length: ".$size);
}
$chunksize = 1*(1024*1024);
$bytes_send = 0;
if ($file = fopen($path.$file, 'rb'))
{
if(isset($_SERVER['HTTP_RANGE']))
fseek($file, $range);
while (!feof($file) && !connection_aborted() && $bytes_send < $new_length)
{
$buffer = fread($file, $chunksize);
print($buffer);
flush();
$bytes_send += strlen($buffer);
}
fclose($file);
} else die('Error - can not open file.');
die();
}
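A hypothetical call (the filename and path here are made up, not taken from the question) would be:
// Hypothetical usage: $path must end in a slash because the function concatenates $path.$file.
download('bot.png', $_SERVER['DOCUMENT_ROOT'].'/images/');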
Related
I have a resumable-download PHP script.
It works fine on an Apache server but not on IIS 7 (which my client currently uses).
The problem with IIS is:
When a file is being downloaded, other pages on the same site freeze
(they even display a 500 server error sometimes).
(The same script on an Apache server does not have this problem.)
The problem disappears if I turn off resumable support
(with it on, even downloading through a download manager freezes all browsers viewing the same site).
This makes me believe IIS needs some configuration, or php.ini does?
I have had no luck with Google so far; any help would be appreciated.
And yes, I have access to IIS and php.ini,
and yes, I have already set the maximum connection time on IIS (needed for large file transfers).
The script is:
(Anybody who comes across this and would like to use this script for large file transfers on IIS, please read up on the FastCGI timeout settings for PHP on IIS 7.)
$filename="test.flv";
$filepath="zekkai.flv";
//set mime
$mime_type="";
$known_mime_types=array(
"flv" => "video/x-flv",
"mp4" => "video/mp4",
"mov" => "video/quicktime",
"avi" => "video/x-msvideo",
"wmv" => " video/x-ms-wmv "
);
if($mime_type==''){
$file_extension = strtolower(substr(strrchr($filepath,"."),1));
if(array_key_exists($file_extension, $known_mime_types)){
$mime_type=$known_mime_types[$file_extension];
} else {
$mime_type="application/force-download";
};
};
header("Connection: Keep-Alive");
header("Keep-Alive: timeout=65000");
$fsize=filesize($filepath);
set_time_limit(0);
//turn off buffer
ob_end_clean();
if(ini_get('zlib.output_compression'))
ini_set('zlib.output_compression', 'Off');
header("Content-Description: File Transfer");
header("Content-type: ".$mime_type);
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Transfer-Encoding: binary");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header('Accept-Ranges: bytes');
header("Cache-control: public");
header('Pragma: public');
header("Expires: 0");
// resumable support..
if(isset($_SERVER['HTTP_RANGE'])){
// delete this part to turn off resumable support
list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
list($range) = explode(",",$range,2);
list($range, $range_end) = explode("-", $range);
$range=intval($range);
if(!$range_end) {
$range_end=$fsize-1;
} else {
$range_end=intval($range_end);
}
$new_length = $range_end-$range+1;
header("HTTP/1.1 206 Partial Content");
header("Content-Length: $new_length");
header("Content-Range: bytes $range-$range_end/$fsize");
// resumable support end
} else {
$new_length=$fsize;
header("Content-Length: ".$fsize);
}
/* output the file itself */
$chunksize = 3*(1024*1024); //you may want to change this
$bytes_send = 0;
if ($Source_File = fopen($filepath, 'rb')){
if(isset($_SERVER['HTTP_RANGE'])){
fseek($Source_File, $range);
}
while(!feof($Source_File) && (!connection_aborted()) && ($bytes_send<$new_length) ) {
$buffer = fread($Source_File, $chunksize);
print($buffer); //echo($buffer); // is also possible
flush();
$bytes_send += strlen($buffer);
}
fclose($Source_File);
} else die('Error - can not open file.');
exit();
Note: non-PHP scripts are not affected;
only PHP pages are affected by the download,
so my guess is that the problem is related to the FastCGI module?
I'm using a PHP script to control access to downloadable files. This works fine for anything under 2 GB but fails for larger files.
Apache and PHP are both 64-bit.
Apache will allow the file to be downloaded if it is accessed directly (which I can't allow).
The guts of the PHP (ignoring the access control):
if (ob_get_level()) ob_end_clean();
error_log('FILETEST: '.$path.' : '.filesize($path));
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($path));
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
The error log shows the file size fine:
[Tue Apr 08 11:01:16 2014] [error] [client *.*.*.*] FILETEST: /downloads/file.name : 2251373807, referer: http://myurl/files/
But the access log has a negative size:
*.*.*.* - - [08/Apr/2014:11:01:16 +0100] "GET /files/file.name HTTP/1.1" 200 -2043593489 "http://myurl/files/" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0"
And so browsers refuse to download the file. In fact, using wget, it's not sending anything:
$ wget -S -O - http://myurl/files/file.name
--2014-04-08 11:33:38-- http://myurl/files/file.name
HTTP request sent, awaiting response... No data received.
Retrying.
Try to read the file in chunks and send them to the browser instead of filling your local memory with 2 GB and flushing it all at once.
Replace readfile($path); with:
#ob_end_flush();
flush();
$fileDescriptor = fopen($path, 'rb');
while ($chunk = fread($fileDescriptor, 8192)) {
echo $chunk;
#ob_end_flush();
flush();
}
fclose($fileDescriptor);
exit;
8192 bytes is a critical value in some cases; refer to php.net/fread.
Adding some microtime variables (and comparing them against the file descriptor's pointer position) will also allow you to control the maximum speed of the download.
(Flushing the output buffer also depends somewhat on the web server; use those commands to make sure it at least tries to flush as much as possible.)
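For illustration only, a rough throttling sketch along those lines (the 100 KB/s target and the variable names are assumptions, not part of the original answer):
$bytesPerSecond = 100 * 1024;                 // assumed target transfer rate
$start = microtime(true);
$fileDescriptor = fopen($path, 'rb');
$sent = 0;
while ($chunk = fread($fileDescriptor, 8192)) {
    echo $chunk;
    flush();
    $sent += strlen($chunk);
    $expected = $sent / $bytesPerSecond;      // seconds the transfer should have taken at the target rate
    $elapsed  = microtime(true) - $start;     // seconds it actually took
    if ($expected > $elapsed) {
        usleep((int)(($expected - $elapsed) * 1000000));  // sleep off the difference to cap the speed
    }
}
fclose($fileDescriptor);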
Add this code before readfile($path):
ob_clean();
flush();
I use this code for the download:
if (file_exists($file)) {
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($file));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
exit;
}
Your best choice is to force Apache into HTTP chunked transfer mode with a function like this. You'll save a lot of PHP memory this way.
function readfile_chunked($filename, $retbytes = TRUE) {
$CHUNK_SIZE=1024*1024;
$buffer = '';
$cnt =0;
$handle = fopen($filename, 'rb');
if ($handle === false) {
return false;
}
while (!feof($handle)) {
$buffer = fread($handle, $CHUNK_SIZE);
echo $buffer;
#ob_flush();
flush();
if ($retbytes) {
$cnt += strlen($buffer);
}
}
$status = fclose($handle);
if ($retbytes && $status) {
return $cnt; // return num. bytes delivered like readfile() does.
}
return $status;
}
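A hedged usage sketch (the path and headers here are assumptions; they are not part of the original function):
// Hypothetical usage. No Content-Length header is sent, so Apache can fall back to chunked transfer encoding.
$path = $_SERVER['DOCUMENT_ROOT'].'/downloads/file.name';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($path).'"');
readfile_chunked($path);
exit;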
I came across this issue before and used the script below for downloading files; it breaks the file into chunks instead of trying to read the whole file at once. This script also takes into account the browser being used, as some browsers (namely IE) can handle headers slightly differently.
private function outputFile($file, $name, $mime_type='') {
$fileChunkSize = 1024*30;
if(!is_readable($file)) die('File not found or inaccessible!');
$size = filesize($file);
$name = rawurldecode($name);
$known_mime_types=array(
"pdf" => "application/pdf",
"txt" => "text/plain",
"html" => "text/html",
"htm" => "text/html",
"exe" => "application/octet-stream",
"zip" => "application/zip",
"doc" => "application/msword",
"xls" => "application/vnd.ms-excel",
"ppt" => "application/vnd.ms-powerpoint",
"gif" => "image/gif",
"png" => "image/png",
"jpeg"=> "image/jpg",
"jpg" => "image/jpg",
"php" => "text/plain"
);
if($mime_type=='')
{
$file_extension = strtolower(substr(strrchr($file,"."),1));
if(array_key_exists($file_extension, $known_mime_types))
$mime_type=$known_mime_types[$file_extension];
else
$mime_type="application/force-download";
}
#ob_end_clean();
if(ini_get('zlib.output_compression'))
ini_set('zlib.output_compression', 'Off');
header('Content-Type: ' . $mime_type);
header('Content-Disposition: attachment; filename="'.$name.'"');
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
header("Cache-control: private");
header('Pragma: private');
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
if(isset($_SERVER['HTTP_RANGE']))
{
list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
list($range) = explode(",",$range,2);
list($range, $range_end) = explode("-", $range);
$range=intval($range);
if(!$range_end)
$range_end=$size-1;
else
$range_end=intval($range_end);
$new_length = $range_end-$range+1;
header("HTTP/1.1 206 Partial Content");
header("Content-Length: $new_length");
header("Content-Range: bytes $range-$range_end/$size");
}
else
{
$new_length=$size;
header("Content-Length: ".$size);
}
$chunksize = 1*($fileChunkSize);
$bytes_send = 0;
if ($file = fopen($file, 'rb'))
{
if(isset($_SERVER['HTTP_RANGE']))
fseek($file, $range);
while (!feof($file) && !connection_aborted() && $bytes_send < $new_length)
{
$buffer = fread($file, $chunksize);
print($buffer);
flush();
$bytes_send += strlen($buffer);
}
fclose($file);
}
else die('Error - can not open file.');
die();
}
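Since the method is declared private, it presumably lives inside a class; a hypothetical call from within that class (the path is made up) could be:
// Hypothetical call from inside the same class.
$path = $_SERVER['DOCUMENT_ROOT'].'/downloads/file.name';
$this->outputFile($path, basename($path), 'application/octet-stream');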
I am trying to append some details into a file and offer that file for download.
I am using JavaScript and PHP for this purpose. Clicking the download button fires an AJAX request.
$.ajax({
url:"php/test.php",
type: 'POST',
data: { totalQuery : test1, },
success: function(finalEntityList){
},
});
Let's assume test.php has a single line of code:
$html="Test";
Now I want to add this to a file and make it available for download. I've used this code:
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=data.csv');
$output = fopen('php://output', 'w');
fwrite($output, $html);
fclose($output);
But the download does not start automatically... I have to open the POST request link using Firebug for the download to be initiated. What could be wrong?
Perhaps what you need to do is simply return the path of the file with your AJAX call and then use JavaScript to "initiate" the download by using one of the following -
window.open
window.location.href
$.ajax({
url:"php/test.php",
type: 'POST',
dataType: 'json',
data: { totalQuery : test1, },
success: function(response){
// initiate download using direct path to file
window.location.href = response.URL;
}
});
Now your test.php file will only need to write the CSV into a web-accessible folder and return its URL in JSON format -
$filename = 'data.csv';
$path = $_SERVER['DOCUMENT_ROOT'].'/downloads/';
file_put_contents($path.$filename, $html); // write the CSV contents into the downloads folder
echo json_encode(array('URL' => '/downloads/'.$filename)); // return a browser-reachable URL, not the filesystem path
You might consider returning the URL as a raw string - but I feel using JSON might be better because you can easily add additional information into the response without needing additional parsing functions. All this makes it a more robust choice.
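For example, hypothetical extra fields (none of these names come from the original answer) could ride along in the same JSON response with no extra parsing needed on the client:
echo json_encode(array(
    'URL'  => '/downloads/'.$filename,
    'size' => filesize($path.$filename),
    'mime' => 'text/csv'
));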
It requires some more parameters and header info:
$file = "data.csv";
$mime_type = "text/csv";
$size = filesize($file);
$name = basename($file); // the filename offered to the browser
#ob_end_clean(); //turn off output buffering to decrease cpu usage
// required for IE, otherwise Content-Disposition may be ignored
if(ini_get('zlib.output_compression'))
ini_set('zlib.output_compression', 'Off');
header('Content-Type: ' . $mime_type);
header('Content-Disposition: attachment; filename="'.$name.'"');
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
/* The three lines below basically make the
download non-cacheable */
header("Cache-control: private");
header('Pragma: private');
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
// multipart-download and download resuming support
if(isset($_SERVER['HTTP_RANGE']))
{
list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
list($range) = explode(",",$range,2);
list($range, $range_end) = explode("-", $range);
$range=intval($range);
if(!$range_end)
{
$range_end=$size-1;
}
else
{
$range_end=intval($range_end);
}
$new_length = $range_end-$range+1;
header("HTTP/1.1 206 Partial Content");
header("Content-Length: $new_length");
header("Content-Range: bytes $range-$range_end/$size");
}
else
{
$new_length=$size;
header("Content-Length: ".$size);
}
/* output the file itself */
$chunksize = 1*(1024*1024); //you may want to change this
$bytes_send = 0;
if ($file = fopen($file, 'rb'))
{
if(isset($_SERVER['HTTP_RANGE']))
fseek($file, $range);
while(!feof($file) && (!connection_aborted()) && ($bytes_send<$new_length))
{
$buffer = fread($file, $chunksize);
print($buffer); //echo($buffer); // is also possible
flush();
$bytes_send += strlen($buffer);
}
fclose($file);
}
else
die('Error - can not open file.');
I have an Excel file which can be downloaded, for example NAME.xlsx. It works in Firefox, but in WebKit (Safari/Chrome) an extra extension gets appended to the name, so it becomes NAME.xlsx.html. It should be ONLY .xlsx.
Here are my headers:
$objWriter = new PHPExcel_Writer_Excel2007($objPHPExcel);
$objWriter->save($root.'/application/to_excel/KSW.xlsx');
$this->getResponse()->setHeader('Content-type', 'application/download', true);
$this->getResponse()->setHeader('Content-type', 'application/octet-stream', true);
$this->getResponse()->setHeader('Content-type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', true);
$this->getResponse()->setHeader('Content-disposition', 'attachment;filename='.basename($root.'/application/to_excel/KSW.xlsx').'', true);
$this->getResponse()->setHeader('Cache-Control', 'max-age=0', true);
So what am I doing wrong?
I've not had this function fail yet -- it works with all the Office 2007/2010 files that I've tried so far in Safari (Windows) and Chrome. The get_known_mime_types() function just returns a giant array of all the mime-types that my app supports -- just Google for the MIME types you need. $file is the actual path to the file on your host, and $name is the file name that displays in the download (run/save) dialog. I've also given due credit to the place I got most of it from. Hope you have luck with it too:
function file_download($file, $name, $mime_type='') {
/* The majority of this code was taken from:
* http://w-shadow.com/blog/2007/08/12/how-to-force-file-download-with-php/
*
* So a big thanks to them.
* I have modified parts of it, though, so it's not 100% borrowed.
*/
if(!is_readable($file)) die('File not found or inaccessible!');
$size = filesize($file);
$name = rawurldecode($name);
/* Figure out the MIME type (if not specified) */
$known_mime_types = get_known_mime_types();
if($mime_type==''){
$file_extension = strtolower(substr(strrchr($file,"."),1));
if(array_key_exists($file_extension, $known_mime_types)){
$mime_type=$known_mime_types[$file_extension];
} else {
$mime_type="application/force-download";
}
}
#ob_end_clean(); //turn off output buffering to decrease cpu usage
// required for IE, otherwise Content-Disposition may be ignored
if(ini_get('zlib.output_compression')) {
ini_set('zlib.output_compression', 'Off');
}
header('Content-Type: ' . $mime_type);
header('Content-Disposition: attachment; filename="'.$name.'"');
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
/* The three lines below basically make the download non-cacheable */
header("Cache-control: private");
header('Pragma: private');
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
// multipart-download and download resuming support
if(isset($_SERVER['HTTP_RANGE'])) {
list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
list($range) = explode(",",$range,2);
list($range, $range_end) = explode("-", $range);
$range=intval($range);
if(!$range_end) {
$range_end=$size-1;
} else {
$range_end=intval($range_end);
}
$new_length = $range_end-$range+1;
header("HTTP/1.1 206 Partial Content");
header("Content-Length: $new_length");
header("Content-Range: bytes $range-$range_end/$size");
} else {
$new_length=$size;
header("Content-Length: ".$size);
}
/* output the file itself */
$chunksize = 1*(1024*1024); // 1MB, can be tweaked if needed
$bytes_send = 0;
if ($file = fopen($file, 'rb')) {
if(isset($_SERVER['HTTP_RANGE'])) {
fseek($file, $range);
}
while(!feof($file) && (!connection_aborted()) && ($bytes_send<$new_length)) {
$buffer = fread($file, $chunksize);
print($buffer); //echo($buffer); // is also possible
flush();
$bytes_send += strlen($buffer);
}
fclose($file);
} else {
die('Error - can not open file.');
}
die();
}
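For the asker's case, a hedged call (assuming the PHPExcel file has already been saved as in the question) could look like:
// Hypothetical call for the question's KSW.xlsx case.
$xlsx = $root.'/application/to_excel/KSW.xlsx';
file_download($xlsx, basename($xlsx), 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');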
These are large (20-60 MB) QuickBooks files. Seemingly at random, IE users who are downloading them get "server returned an invalid or unrecognized response", and the download fails.
Works 100% of the time in other browsers.
This is over SSL. These downloads are being forced; I've tried every variation of headers I have seen. Currently:
#ob_end_clean();
if(ini_get('zlib.output_compression')) ini_set('zlib.output_compression', 'Off');
header('Content-Type: application/force-download');
header('Content-Disposition: attachment; filename="'.$file->original_name.'"');
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
header("Cache-Control: public, must-revalidate");
header("Pragma: hack");
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
$size = filesize($_SERVER['DOCUMENT_ROOT'].'/uploads/'.$file->name);
header("Content-Length: ".$size);
$new_length = $size;
/* output the file itself */
$chunksize = 1*(1024*1024); //you may want to change this
$bytes_send = 0;
if ($file_h = fopen($_SERVER['DOCUMENT_ROOT'].'/uploads/'.$file->name, 'rb'))
{
while (!feof($file_h) && !connection_aborted() && $bytes_send < $new_length)
{
set_time_limit(5);
$buffer = fread($file_h, $chunksize);
echo($buffer);
flush();
$bytes_send += strlen($buffer);
}
fclose($file_h);
}
die();
From what I have seen, the problem comes from the Pragma header field when it is set to "no-cache", which does not seem to be your problem. Did you use any tools (like the Firefox Live HTTP Headers extension) to check the value of the Pragma field?
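If the cache headers do turn out to be involved, a minimal sketch of just the cache-related lines (an assumption on my part: older IE versions refuse to save attachments over SSL when caching is fully disabled) would be:
// Avoid fully disabling the cache for IE downloads over SSL:
// header('Pragma: no-cache');
// header('Cache-Control: no-cache, no-store');
// Prefer private caching instead:
header('Pragma: private');
header('Cache-Control: private, must-revalidate');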