Offering file download within a session - php

I am trying to make a file download depend on a session. Here is the code:
<?php
session_name("My-Download");
session_start();
$_SESSION['Download-Authorized'] = 1;
echo "<a class='invlink' rel='nofollow' download target='_blank' href='download.php?download_file=file.to.download.pdf'>Name of File</a><br /><br />";
?>
The download script ('download.php') comes next:
<?php
session_start();
if (!isset($_SESSION['Download-Authorized'])) {
    exit;
}
$path = $_SERVER['DOCUMENT_ROOT']."/downdir/";
$fullPath = $path.$_GET['download_file'];
if ($fd = fopen($fullPath, "r")) {
    $fsize = filesize($fullPath);
    $path_parts = pathinfo($fullPath);
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-Type: application/pdf");
    header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\"");
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . $fsize);
    while (!feof($fd)) {
        $buffer = fread($fd, 2048);
        print($buffer);
        flush();
    }
    fclose($fd);
} else {
    die("File does not exist. Make sure you specified correct file name.");
}
exit;
?>
Everything works fine as long as the check of $_SESSION['Download-Authorized'] is commented out.
As soon as I check whether the session variable $_SESSION['Download-Authorized'] is set, the download fails.
What's wrong with my code?
Any help appreciated.
UPDATE:
After adding session_start() to the beginning of download.php the script still does not work.
It appears that the session ID as well as the session name change when "download.php" is called. Additionally, $_SESSION['Download-Authorized'] is reset.

Your initial script stores the flag in a session that was explicitly renamed (session_name("My-Download");), but the download script uses the default session name (it never calls session_name()).
Therefore your download script starts a different (possibly empty) session.
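A minimal sketch of the fix: select the same named session in download.php before starting it (everything else in the script can stay as it is):
<?php
// download.php (sketch): use the same session name as the page that set the flag
session_name("My-Download");   // must match the name used when 'Download-Authorized' was set
session_start();

if (!isset($_SESSION['Download-Authorized'])) {
    exit; // the flag is not present in the shared session
}
// ... send the file exactly as in the original script ...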

Related

php - file_get_contents() for dynamic script

I have a script that outputs an image.
Works fine
include('../myfolder/myImageScript.php'); // outputs image on page
Fails
echo file_get_contents('../myfolder/myImageScript.php'); // nothing displayed
I think this fails because a PHP script (in my case myImageScript.php) is not executed when it is read with file_get_contents(), whereas it is executed when it is pulled in with include().
I am struggling to get a zip function to work due to the empty output of file_get_contents().
The file I'm trying to call via file_get_contents() is:
myImageScript.php
$imgstr = "data:image/jpeg;base64,/9j/........... rest of string";
if (!preg_match('/data:([^;]*);base64,(.*)/', $imgstr, $matches)) {
    die("error");
}
// Decode the data
$content = base64_decode($matches[2]);
// Output the correct HTTP headers
header('Content-Type: '.$matches[1]);
//header("Content-Type: image/jpeg"); // tried this made no difference
// Output the actual image data
echo $content;
Any help would be greatly appreciated.
Something like this should work, but you need to have ZipArchive enabled (http://php.net/manual/en/class.ziparchive.php), which should not be a problem.
<?php
$imgstr = "data:image/gif;base64,R0lGODlhyAAiALMAAFONvX2pzbPN4p6/2tTi7mibxYiw0d/q86nG3r7U5l2UwZO31unx98nb6nOiyf///yH5BAUUAA8ALAAAAADIACIAAAT/8MlJq7046827/2AojmRpnmiqriwGvG/Qjklg28es73wHxz0P4gcgBI9IHVGWzAx/xqZ0KlpSLU9Y9MrtVqzeBwFBJjPCaC44zW4HD4TzZI0h2OUjON7EsMd1fXcrfnsfgYUSeoYLPwoLZ3QTDAgORAoGWxQHNzYSBAY/BQ0XNZw5mgMBRACOpxSpnLE3qKqWC64hk5WNmBebnA8MjC8KFAygMAUCErA2CZoKq6wHkQ8C0dIxhQRED8OrC1hEmQ+12QADFebnABTr0ukh1+wB20QMu0ASCdn16wgTDmCTNlDfhG/sFODi9iMLvAoOi6hj92LZhHfZ3FEEYNEDwnMK/ykwhDEATAN2C/5d3PiDiYSIrALkg6EAz0hiFDNFJKeqgIEyM1nhwShNo0+glhBhgKlA5qqaE25KY1KAYkGAYlYVSEAgQdU1DFbFe3DgKwysWcHZ+QjAAIWdFQaMgkjk2b4ySLtNkCvuh90NYYmMLUsErVRiC8o8OLmkAYF5hZkRKYCHgVmDAiJJLeZpVUdrq/DA7XB5rAV+gkn/MJ0hc8sKm6OuclDoo8tgBQFgffd335p3cykEjSK1gIXLEl+Oq9OgTIKZtymg/hHuAoHmZJ6/5gDcwvDOyysEDS7B9VkJoSsEhuEyN6KSPyxKrf4qsnIoFQ4syL0qum8i9AW0H/9F/l3gngXwwSAfEQ5csIoFUmH1oAVrTEhXQ+Cdd6GGD4z230b+TQdDgB8S6INeG76AlVSsoYeibBg+cOAX2z1g4Vv2sYggER15uFliZFwWnUAAQmhLGUKe+MMFEa1oH40/FMKYht1RMKVB7+AiwTvEMehdeB2CicwLlAlXI1m5kSjBmACUOQF0HWRpAZcZqngBbxWwqZtkZz4QlEsJvkDiejDIcRh5h4kG5pPBrEHkDw06GKMEhAJwGxx+uBIoAIOmlxaH9TWCh4h2fgqDAWcc019AqwTHwDtu1UmMRQnkdpuHRU6gZ3uWOOaHILmuScc6LlFDhKuwwgiqsjQNgAD/UWgFZaKuq/w0AHIAuHIYReR5+A4C12HkEksSfRvuqiuxR4GebSFw7SraMqoRuXvK2t+Z+JDb22bsxDqBh+YRVCO5RgT81JnEGiNtNvvKKwl/IzJKql8ORadqQuSZis7CANCWYnIScOyAiJHayFIUIpM8r0GUstsrbA4HhC2nJi9LwDuihKkuhEQpgAAiEQpjyc99aWHMppz2gSLBlCL9iFQrW2pdz0TDPCkGCRgQjU9GVPpZQAkgIICWHfQhABkNkM1svQxg9wcJfWSn1AlxI5DA3COYjbbaLJBKzhQRuiF4Cn8nMiMXgQ+uOAkBFDDA2wxABkPJiMe8+OUaECVNLMZUJI755xtoHmwXnoNuugUQp4bGLzf0dvrriy2wsAMD4A377YJjSgDfD0QAADs=";
if (!preg_match('/data:([^;]*);base64,(.*)/', $imgstr, $matches)) {
    die("error");
}
$content = base64_decode($matches[2]);
$zip = new ZipArchive;
$filename = tempnam("/tmp", "testmeZip");
$res = $zip->open($filename, ZipArchive::CREATE);
if ($res === TRUE) {
    $zip->addFromString('test.gif', $content); // you can use $matches[1] to figure out the extension
    $zip->close();
} else {
    die('failed to create the zip archive'); // do not echo anything before the headers below, or the download breaks
}
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"test.zip\"");
header("Content-Transfer-Encoding: binary");
// make sure the file size isn't cached
clearstatcache();
header("Content-Length: ".filesize($filename));
// output the file
readfile($filename);
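As a side note, if the goal really is to capture what myImageScript.php prints when it runs (rather than its source text), wrapping an include() in output buffering is a common alternative; a minimal sketch, assuming the relative path from the question:
<?php
// Capture the executed output of the script instead of its raw source
ob_start();
include '../myfolder/myImageScript.php';  // the script runs and echoes the image data
$imageData = ob_get_clean();              // everything it printed, as a string
// Note: any header() calls inside the included script still run.
// $imageData can now be handed to ZipArchive::addFromString(), written to disk, etc.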

php download fails on large files

I'm trying to get a link to download large zip files without redirecting or using another file for the download.
It works great for files under ~100 MB, but anything larger and it's sent to an error page in Chrome that says: Error code: ERR_INVALID_RESPONSE.
The $file and $path are put into a form with jQuery, and then $('form').submit(); is called.
function downloadFile($file, $path){
    set_time_limit(0);
    $fullPath = $path . $file;
    if ($fd = fopen($fullPath, "r")) {
        $fsize = filesize($fullPath);
        $path_parts = pathinfo($fullPath);
        $ext = strtolower($path_parts["extension"]);
        switch ($ext) {
            case "pdf":
                header("Content-type: application/pdf"); // add here more headers for diff. extensions
                header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\""); // use 'attachment' to force a download
                break;
            default:
                header("Content-type: application/octet-stream");
                header("Content-Disposition: filename=\"".$path_parts["basename"]."\"");
        }
        header("Content-length: $fsize");
        header("Cache-control: private"); //use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 2048);
            echo $buffer;
        }
    }
    fclose($fd);
    exit;
}
Any ideas why it fails on larger files?
Update:
When I use the following code, it downloads the file, but the finished file doesn't open as it's corrupted for some reason:
set_time_limit(0);
ini_set('memory_limit', '1024M');
// http headers for downloads
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".$file_size);
ob_end_flush();
#readfile($fileinfo['path']);
When I use this code, it only downloads the first 49.7 KB and then says it's finished. I even tried ob_flush() and flush() together:
// http headers for downloads
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($fileinfo['path']));
if ($fd = fopen($fileinfo['path'], "r")) {
    set_time_limit(0);
    ini_set('memory_limit', '1024M');
    while (!feof($fd)) {
        echo fread($fd, 4096);
        flush();
    }
}
The only thing I do differently in my downloader is Content-Transfer-Encoding: chunked. Also, take the flush() out of the loop, and make sure you do ob_clean(); flush(); before sending the data and ob_end_flush(); after it.
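One common variant of that idea (not necessarily the answerer's exact setup) is to close every active output buffer before streaming, so the file is never accumulated in memory; a minimal sketch, assuming $fullPath points at the file on disk:
<?php
// Sketch of a buffering-free streaming download (assumption: $fullPath is a readable local file)
set_time_limit(0);

header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($fullPath) . "\"");
header("Content-Length: " . filesize($fullPath));

// Close any active output buffers so the data goes straight to the client
while (ob_get_level() > 0) {
    ob_end_clean();
}

if ($fd = fopen($fullPath, 'rb')) {
    while (!feof($fd)) {
        echo fread($fd, 8192); // small chunks keep memory use flat regardless of file size
    }
    fclose($fd);
}
exit;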

My download.php script sends a corrupt file

I am having difficulty working out my download.php script. The customer gets a unique download link generated from my database, and when they click on it the script is meant to send them the file from my server. Only in my case the zip file is always corrupt and bigger than the original file. The original file is 68KB and the downloaded file ends up being 90KB.
Here is my code that sends the file to the browser:
<?php
$actual_link = "http://$_SERVER[HTTP_HOST]";
$post_id = $wpdb->get_results("SELECT transaction_id FROM slx_jomsocial_plugin_orders WHERE (transaction_id ='". $tx ."' and unique_code='". $token ."')");
$rowCount = $wpdb->num_rows;
if($rowCount==1){
    $name="Unzip_First_Video_Embedding.zip";
    $file_url=$actual_link."/jomsplugins/".$name;
    header("Content-type: application/x-file-to-save");
    header("Content-Disposition: attachment; filename=".$name);
    readfile($file_url);
    echo "<h1>Thank you ".$name=$_SESSION['s_name']." For downloading the plugins.</h1><br />";
}else{
    echo "<h1>You are not permitted to downlaod, Please purchase plugin first.</h1> <br /><div style=\"margin-left: 25%;margin-top: 1%;\">Click Here</div>";
}
?>
What am I doing wrong here?
UPDATE:
I have been experimenting with the script but still my files come out corrupted (Unzipping them gives a .cpgz file). Here is my current script now:
<?php
$actual_link = "http://$_SERVER[HTTP_HOST]";
$post_id = $wpdb->get_results("SELECT transaction_id FROM slx_jomsocial_plugin_orders WHERE (transaction_id ='". $tx ."' and unique_code='". $token ."')");
$rowCount = $wpdb->num_rows;
if($rowCount==1){
    // set example variables
    $filename = "Unzip_First_Video_Embedding.zip";
    $filepath = $actual_link."/jomsplugins/";
    // http headers for zip downloads
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"".$filename."\"");
    header("Content-Transfer-Encoding: binary");
    ob_end_flush();
    ob_clean();
    flush();
    #readfile($filepath.$filename);
    ob_clean();
    flush();
    exit;
}
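Two things in the script above commonly produce exactly this kind of corruption: the file is fetched over HTTP via its own URL instead of being read from disk, and extra output (HTML echoed after readfile(), whitespace, buffered page output) gets mixed into the download. A minimal sketch of a local-path version, assuming the zip sits in a jomsplugins folder under the document root:
<?php
// Sketch (assumptions: the zip lives under the document root in /jomsplugins/, nothing has been output yet)
$filename = "Unzip_First_Video_Embedding.zip";
$filepath = $_SERVER['DOCUMENT_ROOT'] . "/jomsplugins/" . $filename; // read from disk, not over HTTP

if (is_readable($filepath)) {
    header("Content-Type: application/zip");
    header("Content-Disposition: attachment; filename=\"" . $filename . "\"");
    header("Content-Length: " . filesize($filepath));
    readfile($filepath);
    exit; // stop immediately so no HTML gets appended to the zip
}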

php download do not show progress in browser

I have a problem with a PHP-managed file download where the browser does not show the progress of the download. In fact, the browser appears to be waiting and waiting and waiting, until the file is completely downloaded. The file will then appear in the download list (with Chrome and Firefox). I cannot even download the file with IE8. I would like the browser to show the actual file size and the progress of the download.
Strangely, the download is not even visible in Firebug (no line appears in the network tab if you paste the download URL).
I suspected a problem with compression/zlib, so I disabled both: no change. I disabled output buffering with the same result.
Live example can be found here: http://vps-1108994-11856.manage.myhosting.com/download.php
Phpinfo: http://vps-1108994-11856.manage.myhosting.com/phpinfo.php
The code is below, your help is appreciated.
<?php
$name = "bac.epub";
$publicname = "bac.epub";
#apache_setenv('no-gzip', 1);
ini_set("zlib.output_compression", "Off");
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($name));
header("Content-disposition: attachment; filename=" . $publicname) );
ob_end_flush();
flush();
// dump the file and stop the script
$chunksize = 1 * (128 * 1024); // how many bytes per chunk
$size = filesize($name);
if ($size > $chunksize) {
    $handle = fopen($name, 'rb');
    $buffer = '';
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    fclose($handle);
} else {
    readfile($name);
}
exit;
The sleep in the code was to ensure that the download is long enough to see the progress.
Keep it really, really simple.
<?php
header("Content-Type: application/epub+zip");
header("Content-disposition: attachment; filename=" . $publicname) );
if(!readfile($name))
echo 'Error!';
?>
It is all you really need.
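If a progress bar is the point, the browser also has to know the total size up front, so a Content-Length header helps; a minimal variant of the above, assuming $name is a readable local file and nothing has been output yet:
<?php
// Minimal variant with an explicit length
header("Content-Type: application/epub+zip");
header("Content-Disposition: attachment; filename=" . $publicname);
header("Content-Length: " . filesize($name)); // lets the browser show total size and progress
readfile($name);
exit;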
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file_path));
header("Content-disposition: attachment; filename=" . $local_file_name);
// dump the file and stop the script
$chunksize = 128 * 1024; // how many bytes per chunk (128 KB)
$size = filesize($file_path);
if ($size > $chunksize)
{
    $handle = fopen($file_path, 'rb');
    $buffer = '';
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        flush();
        sleep(1);
    }
    fclose($handle);
}
else
{
    readfile($file_path);
}
I have modified your code, Francis, and now it works. :)
This is likely caused by a firewall or some sort of proxy between you and the remote site. I was wrestling with the same problem - disabling gzip, flushing buffers etc. until I tried it under a web VPN and the progress indicator re-appeared.
I don't think that the progress indicator is buggy - it's just that the content is being embargoed before it gets to you, which appears as a waiting state in the download. Then when the content is scanned or approved, it may come down very quickly relative to the normal download speed of your site. For large enough files, maybe you could see a progress indicator at this stage.
There is nothing you can do about it except determine whether this is the real reason for the behaviour.

Serve file to user over http via php

If I go to http://site.com/uploads/file.pdf I can retrieve a file.
However, if I have a script such as:
<?php
ini_set('display_errors',1);
error_reporting(E_ALL|E_STRICT);
//require global definitions
require_once("includes/globals.php");
//validate the user before continuing
isValidUser();
$subTitle = "Attachment";
$attachmentPath = "/var/www/html/DEVELOPMENT/serviceNow/selfService/uploads/";
if(isset($_GET['id']) and !empty($_GET['id'])){
    //first lookup attachment meta information
    $a = new Attachment();
    $attachment = $a->get($_GET['id']);
    //filename will be original file name with user name.n prepended
    $fileName = $attachmentPath.$_SESSION['nameN'].'-'.$attachment->file_name;
    //instantiate new attachmentDownload and query for attachment chunks
    $a = new AttachmentDownload();
    $chunks= $a->getRecords(array('sys_attachment'=>$_GET['id'], '__order_by'=>'position'));
    $fh = fopen($fileName.'.gz','w');
    // read and base64 encode file contents
    foreach($chunks as $chunk){
        fwrite($fh, base64_decode($chunk->data));
    }
    fclose($fh);
    //open up filename for writing
    $fh = fopen($fileName,'w');
    //open up filename.gz for extraction
    $zd = gzopen($fileName.'.gz', "r");
    //iterate over file and write contents
    while (!feof($zd)) {
        fwrite($fh, gzread($zd, 60*57));
    }
    fclose($fh);
    gzclose($zd);
    unlink($fileName.'.gz');
    $info = pathinfo($fileName);
    header('Content-Description: File Transfer');
    header('Content-Type: '.Mimetypes::get($info['extension']));
    header('Content-Disposition: attachment; filename=' . basename($fileName));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($fileName));
    ob_clean();
    flush();
    readfile($fileName);
    exit();
}else{
    header("location: ".$links['status']."?".urlencode("item=incident&action=view&status=-1&place=".$links['home']));
}
?>
This results in sending me the file, but when I open it I receive an error saying:
"File type plain text document (text/plain) is not supported"
First off, I'd start by checking the HTTP headers. You can do this in Firefox easily using the "Live HTTP headers" extension; not sure about equivalents in other browsers offhand. This will let you verify if the header is actually getting set to "application/pdf" and whether your other headers are getting set as well.
If none of the headers are getting set, you might be inadvertently sending output before the calls to header(). Is there any whitespace before the <?php tag?
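A quick way to check that second point from inside the script itself (a hypothetical debugging guard, not part of the original code):
<?php
// header() calls issued after output has started are ignored (PHP logs a "headers already sent" warning),
// so report where output began, e.g. whitespace before the opening <?php tag of an included file.
if (headers_sent($file, $line)) {
    error_log("Output already started at $file:$line; Content-Type cannot be changed anymore");
}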
Are you sure application/pdf is the header your browser is actually seeing?
You can check that out with various HTTP dev tools, for instance HTTP Client for the Mac or Firebug for Firefox.
I use this one and it works.
if(file_exists($file_serverfullpath))
{
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false);
    //sending download file
    header("Content-Type: application/octet-stream"); //application/octet-stream is more generic; it works because nowadays browsers are able to detect the file type anyway
    header("Content-Disposition: attachment; filename=\"" . basename($file_serverfullpath) . "\""); //ok
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . filesize($file_serverfullpath)); //ok
    readfile($file_serverfullpath);
}
Try prepending "error_reporting(0);". I found this in the comments at http://php.net/readfile (where you took this example from).
Another thing that could be a problem is your file size. There have been issues reported in the past about PHP 5 (we're talking 2005 here, so I hope this is fixed by now) having trouble reading files >2MB. If your file size exceeds this, you may want to verify that it reads the whole file.
