How to upload large files in PHP, up to 50 GB - php

I need to raise the PHP maximum upload size to 50 GB. The server has the capacity, but I'm not sure how this task should be accomplished.
My first question: is it possible to upload a 50 GB file in one go with PHP?
Second question: if so, is there a way to upload the file in chunks, so that if the connection is lost for some reason the upload can continue from the chunks that are missing while the already uploaded chunks remain on the server?
Sorry, I don't have much experience with PHP and have never done a task like this. I tried to google it but couldn't find a solution.
Thanks

1. It's possible, but it depends on many factors, like your internet connection (execution timeout), PHP version, and PHP settings, e.g.:
post_max_size = 0
upload_max_filesize = 0
2. Sending such big files without chunking shouldn't be considered. It can be achieved with protocols other than HTTP (which is not recommended) or with JS/PHP chunking.
Look at other answers, because this topic has been covered many times and there are many libraries for it, e.g.:
Upload 1GB files using chunking in PHP
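For orientation, the receiving endpoint for chunked uploads can be fairly small. Below is a minimal sketch, not tied to any particular library; the "chunk" file field and the "fileName" POST field are assumptions that depend on the client-side uploader you pick:
<?php
// Minimal sketch of a chunk-receiving endpoint.
// Assumptions: the client posts one chunk per request as the "chunk" file
// field plus a "fileName" field; these names are hypothetical.
$uploadDir = __DIR__ . '/uploads/';
$name      = basename($_POST['fileName'] ?? 'upload.bin');
$target    = $uploadDir . $name . '.part';

// Append this chunk to the partial file. Chunks already written stay on
// disk, so an interrupted upload can resume where it left off.
$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($target, 'ab');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Report how many bytes have arrived so the client knows where to resume;
// after the last chunk the client would rename the .part file to its final name.
echo filesize($target);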

To upload a large file, consider the two most important things:
Good internet connection
Upload the file chunk by chunk
I use the code below to upload large files greater than 5 MB. You can increase the chunk size. I don't know your file type, but you may try it.
/**
 * Copy an uploaded file to the upload directory in small chunks.
 *
 * @param  string $file        Path to the source (temporary) file
 * @param  int    $fileSize    Size of the source file in bytes
 * @param  mixed  $applicantID Identifier of the applicant
 * @param  string $name        Target file name
 * @return int    200 on success, 400 on failure
 */
public function chunkUpload($file, $fileSize, $applicantID, $name) {
    $targetFile  = 'upload/' . $name;
    $chunkSize   = 256; // chunk size in bytes; increase this for large files
    $uploadStart = 0;

    $handle = fopen($file, 'rb');
    $fp     = fopen($targetFile, 'wb');
    if ($handle === false || $fp === false) {
        return 400;
    }

    # Start copying chunk by chunk
    try {
        while ($uploadStart < $fileSize) {
            $contents = fread($handle, $chunkSize);
            fwrite($fp, $contents);
            $uploadStart += strlen($contents);
            fseek($handle, $uploadStart); // fread() already advanced the pointer; this just keeps it explicit
        }
        fclose($handle);
        fclose($fp);
        return 200;
    } catch (\Exception $e) {
        return 400;
    }
}
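A hypothetical call from a controller might look like this; the "upload" form field name and the $applicantID value are assumptions, not part of the original answer:
// $applicantID comes from your own application logic (assumption)
$tmp  = $_FILES['upload']['tmp_name'];
$size = $_FILES['upload']['size'];
$code = $this->chunkUpload($tmp, $size, $applicantID, $_FILES['upload']['name']);
// $code is 200 on success, 400 on failure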

Related

Codeigniter force_download fails on large Zip files

When using force_download to download a zip file, my code works for a zip file that is 268 MB (31 MP3 files) but not for a zip file that is 287 MB (32 MP3 files), the difference being one extra MP3 file added to the zip. The download attempts to start, appears to keep starting over a couple of times, and then shows as failed, with Chrome indicating that the zip file is incomplete. Windows reports that the zip file, which is only 61 KB, is invalid when trying to open it.
The zip file gets created and MP3 files added to it by another area of code.
I have increased the memory_limit up to 1024M but it's no different.
Below is the code I want working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have an issue with larger files. Or is there a setting I am missing somewhere that is forcing a limit somehow?
PHP 7.1.33
CodeIgniter 3.1.9
Try to increase memory limit by adding this code:
ini_set('memory_limit','1024M');
Increase the memory limit and use fopen() and fread().
Try this:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
if (is_file($zipdlpath))
{
    // Send the download headers ourselves instead of calling force_download(),
    // which reads the whole zip into memory before sending it.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$lastbasket.'"');
    header('Content-Length: '.filesize($zipdlpath));

    $chunkSize = 1024 * 1024; // stream in 1 MB chunks
    $handle = fopen($zipdlpath, 'rb');
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
    exit;
}
I've tried the following custom download helper; maybe it will work for you.
Ref Link - https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files

ftp_nb_fput not transferring more than 4096 bytes

I am trying to upload a file to a server with ftp_nb_fput, but it doesn't upload more than 4096 bytes of the file, and the file is about 700 kB.
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$fp = fopen($directory_to_move_files.$file_to_move, 'r');
ftp_nb_fput($connection_to, $file_to_move, $fp, FTP_ASCII);
ftp_close($connection_to);
I am interested in using this function, not file_put_contents() or cURL.
I don't get any error.
There are two things to take into consideration when working with the non-blocking FTP functions (ftp_nb_put() / ftp_nb_fput()):
They work asynchronously, i.e. in chunks, meaning that
ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
will only upload a small chunk of data and return the FTP_MOREDATA flag, so to complete the upload using this command you will need to iterate (a sketch adapted to the question's ftp_nb_fput() follows below):
$ret = ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
while ($ret == FTP_MOREDATA) {
    $ret = ftp_nb_continue($my_connection);
}
if ($ret != FTP_FINISHED) {
    // the transfer did not complete successfully
}
There are the following directives to take into account so you can upload big files; these directives are located in php.ini and cannot be modified from the current script:
; Maximum allowed size for uploaded files.
upload_max_filesize = XXM
; Must be greater than or equal to upload_max_filesize
post_max_size = XXM
where XX is the number of megabytes (do not forget the M).
After any modification you will need to restart the server.
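Adapted to the question's ftp_nb_fput() call, the FTP_MOREDATA loop might look roughly like this (a sketch; the connection variables are the question's own, and swapping FTP_ASCII for FTP_BINARY is deliberate):
// Sketch: non-blocking upload from an open file handle, looping until done.
$connection_to = ftp_connect($host_to);
ftp_login($connection_to, $user_to, $pass_to);

$fp  = fopen($directory_to_move_files . $file_to_move, 'rb');
$ret = ftp_nb_fput($connection_to, $file_to_move, $fp, FTP_BINARY);

while ($ret == FTP_MOREDATA) {
    // keep pushing chunks until the server has the whole file
    $ret = ftp_nb_continue($connection_to);
}

if ($ret != FTP_FINISHED) {
    echo "There was an error uploading the file.";
}

fclose($fp);
ftp_close($connection_to);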
If you want to transfer the whole file at once, use ftp_put(), not ftp_nb_fput(). It'll make your code a bit simpler:
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$local_file = $directory_to_move_files . $file_to_move;
ftp_put($connection_to, $file_to_move, $local_file, FTP_BINARY);
ftp_close($connection_to);
Side note: don't use FTP_ASCII unless you're absolutely sure the file you're transferring is plain text. It will corrupt binary files, including images. Using FTP_BINARY is always safe.

How to decompress gzip stream chunk by chunk using PHP?

I can't read from an active HTTP gzip stream chunk by chunk.
In short, I can't decompress the stream chunk by chunk: it requires the first chunk when decompressing the second one, and the first and second when decompressing the third one; otherwise it returns strange characters (the raw gzip bytes, I guess).
I guess there is no existing way to do this, as I have googled it for 2 days; anyway, I'll be appreciative if you have any suggestions.
Following is the function which I am using for decompressing:
function gzdecode1($data) {
    $g = tempnam('./', 'gz');
    file_put_contents($g, $data);
    ob_start();
    readgzfile($g);
    $d = ob_get_clean();
    unlink($g);
    return $d;
}
Here are ten example chunks
http://2.youpiaoma.com/chunk_s.rar
Use gzopen() and gzread()
$chunksize = 8192; // bytes per gzread() call
$h = gzopen($filename, 'r');
while ($chunk = gzread($h, $chunksize)) {
    // do magic
}
gzclose($h);
If it's a remote file you might need to enable remote file opens (allow_url_fopen); I've never done it in that kind of environment though.
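On PHP 7+ there is also the incremental zlib API, which can decompress a gzip stream chunk by chunk without temporary files. A minimal sketch, assuming $chunks holds the pieces read off the HTTP stream in order:
// Incremental gzip decompression (PHP 7+). $chunks is a placeholder for
// the chunks as they arrive from the stream.
$ctx = inflate_init(ZLIB_ENCODING_GZIP);

foreach ($chunks as $chunk) {
    // inflate_add() keeps the decompression state between calls,
    // so each chunk can be decoded as soon as it arrives.
    $decoded = inflate_add($ctx, $chunk, ZLIB_SYNC_FLUSH);
    // do magic with $decoded
}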

Cannot resume downloads bigger than 300M

I am working on a PHP program to download files.
The script request looks like: http://localhost/download.php?file=abc.zip
I use some script mentioned in Resumable downloads when using PHP to send the file?
It definitely works for files under 300 MB, for both multi-threaded and single-threaded downloads, but when I try to download a file larger than 300 MB, I get a problem with the single-threaded download: only about 250 MB of data comes through, then the HTTP connection seems to break. It does not break at the break-point. Why?
Debugging the script, I pinpointed where it broke:
$max_bf_size = 10240;
$pf = fopen("$file_path", "rb");
fseek($pf, $offset);
while (1)
{
    $rd_length = $length < $max_bf_size ? $length : $max_bf_size;
    $data = fread($pf, $rd_length);
    print $data;
    $length = $length - $rd_length;
    if ($length <= 0)
    {
        //__break-point__
        break;
    }
}
It seems like every requested document only gets about a 250 MB data buffer to echo or print. But it works when I use a multi-threaded download for the file.
fread() will read up to the number of bytes you ask for, so you are doing some unnecessary work calculating the number of bytes to read. I don't know what you mean by single-thread and multi-thread downloading. Do you know about readfile() to just dump an entire file? I assume you need to read a portion of the file starting at $offset up to $length bytes, correct?
I'd also check my web server (Apache?) configuration and ISP limits if applicable; your maximum response size or time may be throttled.
Try this:
define('MAX_BUF_SIZE', 10240); // the constant name must be a quoted string

$pf = fopen($file_path, 'rb');
fseek($pf, $offset);
while (!feof($pf)) {
    $data = fread($pf, MAX_BUF_SIZE);
    if ($data === false) {
        break;
    }
    print $data;
}
fclose($pf);
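If the connection still drops around the same point, it is worth ruling out script limits and output buffering before the loop. A hedged prelude, assuming zlib output compression or a PHP output buffer is interfering:
set_time_limit(0);                         // don't let max_execution_time cut the stream
ini_set('zlib.output_compression', 'Off'); // avoid buffering the whole response for compression
while (ob_get_level() > 0) {               // drop any active output buffers
    ob_end_clean();
}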

Downloading large files reliably in PHP

I have a PHP script on a server to send files to recipients: they get a unique link and can then download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.
Code:
$f = fopen(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'], 'r');
while (!feof($f)) {
    print fgets($f, 1024);
}
fclose($f);
I have seen functions such as
http_send_file
http_send_data
But I am not sure if they will work.
What is the best way to solve this problem?
Regards
erwing
Chunking files is the fastest / simplest method in PHP, if you can't or don't want to make use of something a bit more professional like cURL, mod-xsendfile on Apache or some dedicated script.
$filename = $filePath.$filename;
$chunksize = 5 * (1024 * 1024); // 5 MB (= 5,242,880 bytes) per chunk of the file.

if (file_exists($filename))
{
    set_time_limit(300);

    $size = intval(sprintf("%u", filesize($filename)));
    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: '.$size);
    header('Content-Disposition: attachment;filename="'.basename($filename).'"');

    if ($size > $chunksize)
    {
        $handle = fopen($filename, 'rb');
        while (!feof($handle))
        {
            print(@fread($handle, $chunksize));
            ob_flush();
            flush();
        }
        fclose($handle);
    }
    else
    {
        readfile($filename);
    }
    exit;
}
else
{
    echo 'File "'.$filename.'" does not exist!';
}
Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died, even with the maximum allowed memory limit set to 1G, which is five times more than the downloaded file size.
BTW: I also tested this on files >2GB, but PHP only managed to write the first 2GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use a signed integer internally, so on 32-bit builds you ultimately hit the 2GB limit. The above-mentioned solutions (i.e. mod-xsendfile) seem to be the only option in this case.
EDIT: Make 100% sure that your file is saved in UTF-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push chunks of the file to the browser.
If you are sending truly large files and worried about the impact this will have, you could use the x-sendfile header.
From the SO question using-xsendfile-with-apache-php, and a how-to at blog.adaniels.nl : how-i-php-x-sendfile/
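For reference, the PHP side of the X-Sendfile approach is just headers. The sketch below assumes Apache with mod_xsendfile enabled and XSendFilePath configured for the storage directory; the file path is a placeholder:
$file = '/var/files/large-download.zip';   // hypothetical path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('X-Sendfile: ' . $file);  // Apache streams the file itself; PHP is done here
exit;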
The best solution would be to rely on lighty or Apache, but if it has to be PHP, I would use PEAR's HTTP_Download (no need to reinvent the wheel etc.); it has some nice features, like:
Basic throttling mechanism
Ranges (partial downloads and resuming)
See intro/usage docs.
We've been using this in a couple of projects and it works quite fine so far:
/**
 * Copy a file's content to php://output.
 *
 * @param  string $filename
 * @return void
 */
protected function _output($filename)
{
    $filesize  = filesize($filename);
    $chunksize = 4096;
    if ($filesize > $chunksize)
    {
        $srcStream = fopen($filename, 'rb');
        $dstStream = fopen('php://output', 'wb');

        $offset = 0;
        while (!feof($srcStream)) {
            $offset += stream_copy_to_stream($srcStream, $dstStream, $chunksize, $offset);
        }

        fclose($dstStream);
        fclose($srcStream);
    }
    else
    {
        // stream_copy_to_stream() behaves strangely when filesize > chunksize;
        // it seems to never hit EOF.
        // On the other hand, file_get_contents() is not scalable,
        // therefore we only use file_get_contents() on small files.
        echo file_get_contents($filename);
    }
}
For downloading files the easiest way I can think of would be to put the file in a temporary location and give them a unique URL that they can download via regular HTTP.
As part of generating these links you could also remove files that were more than X hours old.
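A hedged sketch of that idea, with a random token as the unique part of the URL (the paths and the example.com host are placeholders):
$token = bin2hex(random_bytes(16));   // PHP 7+; md5(uniqid('', true)) works on older versions
$dir   = __DIR__ . '/public/tmp/' . $token;
mkdir($dir, 0755, true);
copy($sourceFile, $dir . '/' . basename($sourceFile)); // or a symlink, to avoid duplicating the data
$url = 'https://example.com/tmp/' . $token . '/' . basename($sourceFile);
// A scheduled job can later delete token directories older than X hours.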
Create a symbolic link to the actual file and make the download link point at the symbolic link. Then, when the user clicks on the DL link, they'll get a file download from the real file but named from the symbolic link. It takes milliseconds to create the symbolic link and is better than trying to copy the file to a new name and download from there.
For example:
<?php
// validation code here

$realFile = "Hidden_Zip_File.zip";
$id = "UserID1234";

if ($_COOKIE['authvalid'] == "true") {
    $newFile = sprintf("myzipfile_%s.zip", $id); // creates: myzipfile_UserID1234.zip
    system(sprintf('ln -s %s %s', $realFile, $newFile), $retval);
    if ($retval != 0) {
        die("Error getting download file.");
    }
    $dlLink = "/downloads/hiddenfiles/".$newFile;
}
// rest of code
?>
<a href="<?php echo $dlLink; ?>">Download File</a>
That's what I did because Go Daddy kills the script after it has run for about 2 minutes 30 seconds; this prevents that problem and hides the actual file.
You can then set up a CRON job to delete the symbolic links at regular intervals.
This whole process will then send the file to the browser and it doesn't matter how long it runs since it's not a script.
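A hedged PHP version of that cleanup job (run from cron; the directory and the myzipfile_*.zip pattern follow the example above and are assumptions):
foreach (glob('/path/to/downloads/hiddenfiles/myzipfile_*.zip') as $link) {
    $st = lstat($link);               // lstat() looks at the link itself, not its target
    if ($st !== false && $st['mtime'] < time() - 3600) {
        unlink($link);                // remove symlinks older than an hour
    }
}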
When I have done this in the past I've used this:
set_time_limit(0); //Set the execution time to infinite.
header('Content-Type: application/exe'); //This was for a LARGE exe (680MB) so the content type was application/exe
readfile($fileName); //readfile will stream the file.
These 3 lines of code will do all the work of the download: readfile() will stream the entire specified file to the client. Be sure to set an infinite time limit, else you may run out of time before the file is finished streaming.
If you are using lighttpd as a webserver, an alternative for secure downloads would be to use ModSecDownload. It needs server configuration but you'll let the webserver handle the download itself instead of the PHP script.
Generating the download URL would look like that (taken from the documentation) and it could of course be only generated for authorized users:
<?php
$secret = "verysecret";
$uri_prefix = "/dl/";

# filename
# please note file name starts with "/"
$f = "/secret-file.txt";

# current timestamp
$t = time();

$t_hex = sprintf("%08x", $t);
$m = md5($secret.$f.$t_hex);

# generate link
printf('<a href="%s%s/%s%s">%s</a>',
       $uri_prefix, $m, $t_hex, $f, $f);
?>
Of course, depending on the size of the files, using readfile() such as proposed by Unkwntech is excellent. And using xsendfile as proposed by garrow is another good idea also supported by Apache.
header("Content-length:".filesize($filename));
header('Content-Type: application/zip'); // ZIP file
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="downloadpackage.zip"');
header('Content-Transfer-Encoding: binary');
ob_end_clean();
readfile($filename);
exit();
I'm not sure this is a good idea for large files. If the thread for your download script runs until the user has finished the download, and you're running something like Apache, just 50 or more concurrent downloads could crash your server, because Apache isn't designed to run large numbers of long-running threads at the same time. Of course I might be wrong, if the apache thread somehow terminates and the download sits in a buffer somewhere whilst the download progresses.
I have used the following snippet found in the comments of the php manual entry for readfile:
function _readfileChunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $cnt = 0;

    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
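Hypothetical usage, sending the download headers before streaming (the header values are the usual ones and are not part of the original snippet):
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));
_readfileChunked($filename);
exit;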
I had the same problem;
it was solved by adding this before starting the session:
session_cache_limiter('none');
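For context, a minimal ordering sketch; the call has to come before session_start():
session_cache_limiter('none'); // disable the cache-limiter headers
session_start();               // ...then start the session as usual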
This was tested on files of 200+ MB on a server with a 256 MB memory limit.
header('Content-Type: application/zip');
header("Content-Disposition: attachment; filename=\"$file_name\"");
set_time_limit(0);

$file = @fopen($filePath, "rb");
while (!feof($file)) {
    print(@fread($file, 1024 * 8));
    ob_flush();
    flush();
}
