Error 324 EMPTY_RESPONSE while echo file_get_contents - php

I'm trying to make something like FTP. I have a home-made server with an application set up on WAMP, and I don't want to keep all my files in the WAMP folder, so I also access local files from this application.
I've already read plenty of posts saying I can't do this and that no browser will let me, but I managed it somehow.
This is the code I use to download files:
function getFile($name, $path) {
    if (file_exists($path.$name)) {
        $name = urldecode($name);
        $fsize = filesize($path.$name);
        header("Content-Disposition: attachment; filename=\"".$name."\"");
        header("Content-Type: application/force-download");
        header("Content-Length: ".$fsize);
        echo file_get_contents($path.$name);
    }
}
It works for small files no matter what extension they have, but I also have to download some .exe files over 200 MB.
I have already raised memory_limit in php.ini, because that was the cause of my first problems, but now I get ERROR 324.
Any ideas how to get past it?

file_get_contents() first reads the whole file into memory and then returns it.
For a 200 MB file, that means more than 200 MB of memory for a single script.
To fix it, read the file in blocks and output each block as you go.
For example:
$fp = fopen($path.$name, "rb");
if ($fp) {
    while (!feof($fp)) {
        // read and output the file in 1 KB blocks to keep memory usage low
        $str = fread($fp, 1024);
        echo $str;
    }
    fclose($fp);
}

Related

Codeigniter force_download fails on large Zip files

When using force_download to download a zip file, my code works for a zip file that is 268 MB (31 MP3 files) but not for one that is 287 MB (32 MP3 files); the only difference is one extra MP3 added to the zip. The download attempts to start, appears to keep restarting a couple of times, and then shows as failed, with Chrome indicating that the zip file is incomplete. Windows reports the downloaded zip file, which is only 61 KB, as invalid when trying to open it.
The zip file is created and the MP3 files added to it by another area of code.
I have increased memory_limit up to 1024M, but it makes no difference.
Below is the code I want working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have an issue with larger files. Is there a setting I am missing somewhere that is forcing a limit?
PHP 7.1.33
CodeIgniter 3.1.9
Try increasing the memory limit by adding this code:
ini_set('memory_limit','1024M');
Increase the memory limit and use fopen()/fread().
Try this:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
if (is_file($zipdlpath))
{
$chunkSize = 1024 * 1024;
$handle = fopen($zipdlpath, 'rb');
while (!feof($handle))
{
$buffer = fread($handle, $chunkSize);
echo $buffer;
ob_flush();
flush();
}
fclose($handle);
exit;
}
I've also tried the following custom download helper; it may work for you.
Ref link: https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files
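The wiki code itself isn't reproduced here, but the core idea is a helper that streams the zip in chunks instead of loading it with file_get_contents(). A minimal sketch of such a helper, assuming a hypothetical function name and location (not the wiki's actual code):

// application/helpers/MY_download_helper.php (hypothetical name and location)
if (!function_exists('force_download_chunked')) {
    function force_download_chunked($path, $chunkSize = 1048576)
    {
        if (!is_file($path)) {
            return false;
        }
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="'.basename($path).'"');
        header('Content-Length: '.filesize($path));
        $handle = fopen($path, 'rb');
        while (!feof($handle)) {
            echo fread($handle, $chunkSize); // push each 1 MB chunk straight to the client
            ob_flush();
            flush();
        }
        fclose($handle);
        exit;
    }
}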

How do I auto download .jar files with PHP

I have looked for an answer for too long. I have the auto download and redirect done; I just need to make the .jar file download properly. It keeps producing an error when you download and open it.
It says "Error: Invalid or corrupt jarfile C:/path/Final frontier(Pre-Alpha 0.3).jar"
<?php
header("Content-Type: 'application/jar', 'true'");
header('Content-Disposition: attachment; filename="Final frontier (Pre-> Alpha 0.3).jar"');
header("Content-Length: " . filesize("public_html/paid/game/Final frontier (Pre-Alpha 0.3).jar"));
$fp = fopen("public_html/paid/game/Final frontier (Pre-Alpha 0.3).jar", "r");
fpassthru($fp);
fclose($fp);
?>
"Thanks you so much!!! That worked! – TJGames 1 min ago"
Comments converted to an answer, since they solved the question.
This fopen("public_html/ seems like it may be playing tricks on you. I'd either remove it (fopen("paid/ if the script runs from the site root), use fopen("../paid/ as another option, or use a full server path, i.e. fopen("/var/usr/you/public_html/paid/
Do the same thing for filesize("public_html/
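For illustration, here's roughly what the corrected script might look like with an absolute server path (the path below is a placeholder, not the asker's real layout):

<?php
// hypothetical absolute path; adjust to match the real docroot layout
$jar = "/var/www/you/public_html/paid/game/Final frontier (Pre-Alpha 0.3).jar";
header('Content-Type: application/java-archive');
header('Content-Disposition: attachment; filename="Final frontier (Pre-Alpha 0.3).jar"');
header('Content-Length: ' . filesize($jar));
$fp = fopen($jar, "rb"); // binary mode so the bytes are not mangled on Windows servers
fpassthru($fp);
fclose($fp);
?>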

Stream protected media (located outside of httpdocs) with jPlayer

I have uploaded some sample MP3 files to a directory outside of httpdocs. I have ensured that this directory is accessible to PHP by configuring open_basedir correctly, and I have tested that it works.
What I would like to do is stream these files via a PHP file as non-authenticated users should never have access to these files. I am currently using jPlayer and expect the setMedia function should look similar to this:
$("#jquery_jplayer").jPlayer("setMedia", { mp3: "stream.php?track=" + id + ".mp3" });
I have tried setting content headers etc in stream.php and it currently looks like this:
$filePath = "../song_files/mp3/";
$fileName = "$_GET[track].mp3";
header("Content-Type: audio/mpeg");
header('Content-Disposition: attachment; filename="'.$fileName.'"');
getFile($filePath . $fileName);
If I load this page directly, the MP3 file downloads and plays fine, but when I use the above JavaScript, jPlayer doesn't play the track.
I have had a look at this post (Streaming an MP3 on stdout to Jplayer using PHP), and it appears the user was trying to achieve exactly what I want, but upon testing the solution I keep running into a problem: all I get is "CURL Failed".
Are there any different methods I can use to achieve this? Pointing me in the right direction would be greatly appreciated.
After searching around some more, I have found a solution that works fine. I used the code from a similar topic ( PHP to protect PDF and DOC ).
I will place the code I used here to help answer the question correctly:
// check the user is logged in and allowed to download; if not, redirect them away
// YOU NEED TO ADD CODE HERE FOR THAT CHECK
// array of file types supported by the download script and their MIME types
$mimeTypes = array(
    'doc' => 'application/msword',
    'pdf' => 'application/pdf',
);
// set the file here (best done using a $_GET[])
$file = "../documents/file.doc";
// get the extension of the file to look it up in the array above
$ext = explode('.', $file);
$ext = end($ext);
// get the file name to send to the browser to force the download
$fileName = explode("/", $file);
$fileName = end($fileName);
// open the file for reading and send headers to the browser
$fp = fopen($file, "rb");
header("Content-Type: ".$mimeTypes[$ext]);
header('Content-Disposition: attachment; filename="'.$fileName.'"');
// read the file and send the raw content to the browser
while (!feof($fp)) {
    $buff = fread($fp, 4096);
    echo $buff;
}
// close the file after we have finished reading it
fclose($fp);

Correctly setting headers so a file can be downloaded via a proxy using PHP

I'm finding it difficult to phrase this question correctly; let me try to explain our problem.
We have an intranet running on an Ubuntu box with Apache2/PHP 5.2.4. We have a bit of PHP code that reads a file from a directory that is not publicly accessible and outputs it to the browser (code below):
$file_path = '/home/path/to/filename.gif';
if (file_exists($file_path)) {
    $output = FALSE;
    // File information
    $path_parts = pathinfo($file_path);
    $file_size = filesize($file_path);
    $file_ext = (isset($path_parts['extension'])) ? strtolower($path_parts['extension']) : null;
    $file_name = $path_parts['basename'];
    // Set up the headers
    if ($file_size > 0) {
        header('Content-Length: ' .$file_size);
    }
    header('Content-Disposition: attachment; filename="'.$file_name.'"');
    header('Content-Type: application/octet-stream');
    // Read the file
    if ($file_size > 0) {
        $handle = fopen($file_path, "r");
        $output = fread($handle, $file_size);
        fclose($handle);
    }
    // Output the file
    echo $output;
}
Inside our network, when browsing to the page that uses this code, the file downloads perfectly and quickly.
However, when accessing the same page via our Cisco ASA/proxy/VPN (not sure what to call it), this code locks up the browser, although it does eventually download the file.
After a bit of experimenting: if I take the headers out and just echo the contents of the file to the browser, it prints without a problem. As soon as I add the header lines back in, the hanging returns, but only when the page is accessed via this box.
Anybody come across this problem before or have any idea what we can try to move forward?
Thanks for any advice...
Have you tried eliminating the Content-Length header entirely? The proxy may be taking it as a firm promise, and if the data you're sending ends up being a different size, the proxy may wait for those last few "missing" bytes to show up.
Just as an aside, you should use readfile() instead of the fopen()/fread()/echo construct you have now.
As it stands, you're slurping the contents of the entire file into memory and then echoing it out. For large files and multiple requests, you'll kill the server with memory starvation. readfile() automatically streams the file in smaller chunks so that memory usage stays minimal.
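In other words, the read/echo section could collapse to something like this (a sketch using the question's own variables):

header('Content-Length: ' . $file_size);
header('Content-Disposition: attachment; filename="' . $file_name . '"');
header('Content-Type: application/octet-stream');
readfile($file_path); // streams the file in chunks instead of loading it all into memory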
Your proxy obviously has problems with Content-Type: application/octet-stream. Try setting it to the real MIME type of each file. You can use the Fileinfo module to find out which MIME type a certain file is, like this:
//You may need to specify the location of your system's magic file
//See http://php.net/finfo_open for more info
$finfo = new finfo(FILEINFO_MIME);
$mimetype = $finfo->file($file_path);
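For example, you might plug the detected type into the download headers like this (a sketch; note that FILEINFO_MIME also appends the charset, e.g. "image/gif; charset=binary", so strip it first):

list($mimetype) = explode(';', $mimetype); // keep only the type part
header('Content-Length: ' . filesize($file_path));
header('Content-Disposition: attachment; filename="' . basename($file_path) . '"');
header('Content-Type: ' . $mimetype);
readfile($file_path);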

Downloading large files reliably in PHP

I have a PHP script on a server to send files to recipients: they get a unique link and then they can download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.
Code:
$f = fopen(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'], 'r');
while (!feof($f)) {
    print fgets($f, 1024);
}
fclose($f);
I have seen functions such as http_send_file and http_send_data, but I am not sure if they will work.
What is the best way to solve this problem?
Regards
erwing
Chunking the file is the fastest and simplest method in PHP, if you can't or don't want to use something a bit more professional like cURL, mod_xsendfile on Apache, or a dedicated script.
$filename = $filePath.$filename;
$chunksize = 5 * (1024 * 1024); // 5 MB (= 5 242 880 bytes) per chunk of the file
if (file_exists($filename))
{
    set_time_limit(300);
    $size = intval(sprintf("%u", filesize($filename)));
    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: '.$size);
    header('Content-Disposition: attachment;filename="'.basename($filename).'"');
    if ($size > $chunksize)
    {
        $handle = fopen($filename, 'rb');
        while (!feof($handle))
        {
            print(@fread($handle, $chunksize));
            ob_flush();
            flush();
        }
        fclose($handle);
    }
    else readfile($filename);
    exit;
}
else echo 'File "'.$filename.'" does not exist!';
Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died even with the maximum allowed memory limit set to 1G, five times the size of the downloaded file.
BTW: I also tested this on files >2 GB, but PHP only managed to write the first 2 GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use INT, so you ultimately hit the 2 GB limit. The solutions mentioned above (i.e. mod_xsendfile) seem to be the only option in this case.
EDIT: Make 100% sure that your file is saved in UTF-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push each chunk of the file to the browser.
If you are sending truly large files and are worried about the impact this will have, you could use the X-Sendfile header.
From the SO question using-xsendfile-with-apache-php, and a how-to: blog.adaniels.nl : how-i-php-x-sendfile/
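A minimal sketch of what an X-Sendfile based script might look like, assuming Apache with mod_xsendfile installed and enabled (the path and filename are placeholders):

// requires mod_xsendfile ("XSendFile On" in the vhost or .htaccess)
$file = '/home/protected/large-file.zip'; // hypothetical path outside the docroot
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="large-file.zip"');
// Apache picks the file up from this header and streams it itself,
// so PHP never holds the file in memory or keeps the script running
header('X-Sendfile: ' . $file);
exit;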
The best solution would be to rely on lighty or Apache, but if it has to be PHP, I would use PEAR's HTTP_Download (no need to reinvent the wheel etc.); it has some nice features, like:
Basic throttling mechanism
Ranges (partial downloads and resuming)
See the intro/usage docs.
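From memory of the PEAR docs, basic usage looks roughly like this (verify the method names against the HTTP_Download documentation; the path is a placeholder):

require_once 'HTTP/Download.php';
$dl = new HTTP_Download();
$dl->setFile('/path/to/large-file.zip');
$dl->setContentType('application/zip');
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'large-file.zip');
$dl->send(); // handles Range requests, so partial and resumed downloads work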
We've been using this in a couple of projects and it works quite fine so far:
/**
 * Copy a file's content to php://output.
 *
 * @param string $filename
 * @return void
 */
protected function _output($filename)
{
    $filesize = filesize($filename);
    $chunksize = 4096;
    if ($filesize > $chunksize)
    {
        $srcStream = fopen($filename, 'rb');
        $dstStream = fopen('php://output', 'wb');
        $offset = 0;
        while (!feof($srcStream)) {
            $offset += stream_copy_to_stream($srcStream, $dstStream, $chunksize, $offset);
        }
        fclose($dstStream);
        fclose($srcStream);
    }
    else
    {
        // stream_copy_to_stream() behaves strangely when filesize > chunksize:
        // it seems to never hit EOF.
        // On the other hand, file_get_contents() is not scalable,
        // so we only use file_get_contents() on small files.
        echo file_get_contents($filename);
    }
}
For downloading files, the easiest way I can think of would be to put the file in a temporary location and give the user a unique URL that they can download via regular HTTP.
As part of generating these links, you could also remove files that are more than X hours old.
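A rough sketch of that idea (the directory, URL, and lifetime are made up for illustration; random_bytes() needs PHP 7+):

// copy the protected file to a web-accessible temp dir under an unguessable name
$token   = bin2hex(random_bytes(16));
$tmpName = $token . '_' . basename($realFile);
copy($realFile, '/var/www/html/tmp_downloads/' . $tmpName);
$downloadUrl = 'https://example.com/tmp_downloads/' . rawurlencode($tmpName);
// separately (e.g. from cron), purge anything older than a few hours
foreach (glob('/var/www/html/tmp_downloads/*') as $old) {
    if (filemtime($old) < time() - 4 * 3600) {
        unlink($old);
    }
}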
Create a symbolic link to the actual file and make the download link point at the symbolic link. Then, when the user clicks on the download link, they get a file download from the real file, but named after the symbolic link. It takes milliseconds to create the symbolic link, and it is better than trying to copy the file to a new name and downloading from there.
For example:
<?php
// validation code here
$realFile = "Hidden_Zip_File.zip";
$id = "UserID1234";
if ($_COOKIE['authvalid'] == "true") {
    $newFile = sprintf("myzipfile_%s.zip", $id); // creates: myzipfile_UserID1234.zip
    system(sprintf('ln -s %s %s', $realFile, $newFile), $retval);
    if ($retval != 0) {
        die("Error getting download file.");
    }
    $dlLink = "/downloads/hiddenfiles/".$newFile;
}
// rest of code
?>
<a href="<?php echo $dlLink; ?>">Download File</a>
That's what I did because GoDaddy kills the script after about 2 minutes 30 seconds of running; this prevents that problem and hides the actual file.
You can then set up a cron job to delete the symbolic links at regular intervals.
This whole process then sends the file to the browser, and it doesn't matter how long it takes, since it's not a script.
When I have done this in the past I've used this:
set_time_limit(0); //Set the execution time to infinite.
header('Content-Type: application/exe'); //This was for a LARGE exe (680MB) so the content type was application/exe
readfile($fileName); //readfile will stream the file.
These three lines of code do all the work of the download: readfile() streams the entire specified file to the client, and be sure to set an infinite time limit, otherwise you may run out of time before the file is finished streaming.
If you are using lighttpd as a web server, an alternative for secure downloads would be to use mod_secdownload. It needs server configuration, but you let the web server handle the download itself instead of the PHP script.
Generating the download URL would look like this (taken from the documentation), and it could of course be generated only for authorized users:
<?php
$secret = "verysecret";
$uri_prefix = "/dl/";
# filename
# please note the file name starts with "/"
$f = "/secret-file.txt";
# current timestamp
$t = time();
$t_hex = sprintf("%08x", $t);
$m = md5($secret.$f.$t_hex);
# generate link
printf('<a href="%s%s/%s%s">%s</a>',
       $uri_prefix, $m, $t_hex, $f, $f);
?>
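On the lighttpd side, the matching configuration (from memory of the mod_secdownload docs, so check the directive names against your lighttpd version) looks roughly like:

server.modules += ( "mod_secdownload" )
secdownload.secret        = "verysecret"
secdownload.document-root = "/path/to/protected/files/"
secdownload.uri-prefix    = "/dl/"
secdownload.timeout       = 3600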
Of course, depending on the size of the files, using readfile() as proposed by Unkwntech is excellent. And using X-Sendfile as proposed by garrow is another good idea, also supported by Apache.
header("Content-length:".filesize($filename));
header('Content-Type: application/zip'); // ZIP file
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="downloadpackage.zip"');
header('Content-Transfer-Encoding: binary');
ob_end_clean();
readfile($filename);
exit();
I'm not sure this is a good idea for large files. If the thread for your download script runs until the user has finished the download, and you're running something like Apache, just 50 or more concurrent downloads could crash your server, because Apache isn't designed to run large numbers of long-running threads at the same time. Of course I might be wrong, if the Apache thread somehow terminates and the download sits in a buffer somewhere while the download progresses.
I have used the following snippet, found in the comments of the PHP manual entry for readfile():
function _readfileChunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}
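Usage is then just a matter of sending the headers yourself and calling the function (a sketch):

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));
_readfileChunked($filename);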
I had the same problem; mine was solved by adding this before starting the session:
session_cache_limiter('none');
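That is, at the top of the download script, before session_start() (a sketch):

session_cache_limiter('none'); // stop PHP from sending the no-cache/Expires headers that can break downloads
session_start();
// ... then send the download headers and stream the file as shown in the other answers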
This was tested on files of 200+ MB on a server with a 256 MB memory limit.
header('Content-Type: application/zip');
header("Content-Disposition: attachment; filename=\"$file_name\"");
set_time_limit(0);
$file = @fopen($filePath, "rb");
while (!feof($file)) {
    print(@fread($file, 1024*8));
    ob_flush();
    flush();
}
