I made an AJAX-based multiple file upload which works well enough. The issue is that I have to upload very large files, up to 200 MB each, and maybe 20-25 files of that size at the same time. The files do get uploaded, but it takes a very long time.
I have changed several settings in php.ini:
post_max_size = 10G
upload_max_filesize = 10G
max_execution_time = 3600
memory_limit = -1
So what is the best solution for handling this type of file upload so that it performs fast?
My internet connection is 100 MB/s, with an upload speed of 20 MB/s.
Please suggest a good solution.
Depending on which type of file you're uploading, you can try sending a zipped file to the server and unzipping it once the upload is finished.
You can read more about PHP's zip functions here.
To create a zip file:
Source: http://davidwalsh.name/create-zip-php
/* creates a compressed zip file */
function create_zip($files = array(), $destination = '', $overwrite = false) {
    // if the zip file already exists and overwrite is false, return false
    if (file_exists($destination) && !$overwrite) { return false; }
    // vars
    $valid_files = array();
    // if files were passed in...
    if (is_array($files)) {
        // cycle through each file
        foreach ($files as $file) {
            // make sure the file exists
            if (file_exists($file)) {
                $valid_files[] = $file;
            }
        }
    }
    // if we have good files...
    if (count($valid_files)) {
        // create the archive
        $zip = new ZipArchive();
        if ($zip->open($destination, $overwrite ? ZipArchive::OVERWRITE : ZipArchive::CREATE) !== true) {
            return false;
        }
        // add the files
        foreach ($valid_files as $file) {
            $zip->addFile($file, $file);
        }
        // debug
        // echo 'The zip archive contains ', $zip->numFiles, ' files with a status of ', $zip->status;
        // close the zip -- done!
        $zip->close();
        // check to make sure the file exists
        return file_exists($destination);
    } else {
        return false;
    }
}
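For example, the helper above might be called like this (the file names and paths here are just placeholders for illustration):
// hypothetical file names; adjust to your own paths
$files_to_zip = array('uploads/photo1.jpg', 'uploads/photo2.jpg');
$result = create_zip($files_to_zip, 'uploads/my-archive.zip', true);
// $result is true on success, false on failure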
To unzip files on the server:
<?php
$zip = new ZipArchive;
$res = $zip->open('file.zip');
if ($res === TRUE) {
    $zip->extractTo('/myzips/extract_path/');
    $zip->close();
    echo 'woot!';
} else {
    echo 'doh!';
}
?>
From your application's point of view, the only thing you can do is limit the number of simultaneous file uploads. Extend your AJAX multiple file upload script to upload just one file at a time; this will probably speed things up a little.
However, your problem is probably not caused by the application itself, but by your server's network speed or its disk write speed. Some VPS providers also limit the number of disk write operations per second. So the best thing might be to migrate your app to another server with better network speed and performance. :)
The best way is to use good old FTP with an FTP client such as FileZilla.
Related
I am trying to upload a CSV file so that I can insert its data into the database.
The code for this is below:
public function upload(Request $request) {
    set_time_limit(0);
    ini_set('MAX_EXECUTION_TIME', 36000);
    $file = Input::file('file');
    $filePath = $file->getRealPath();
    $handle = fopen($filePath, "r");
    while (!feof($handle))
    {
        << DO THE DATABASE OPERATION >>
    }
    fclose($handle);
    return redirect()->back()->with('success', 'File uploaded successfully');
}
This works fine if the file is small, say about 100 or 200 MB. But when the file is big, like 2 GB, it closes the local server. The console says out of memory.
My php.ini settings are:
post_max_size=10000M
upload_max_filesize=10000M
My system config:
Windows machine, 64-bit
Problem
It's not only failing to upload, it is also closing the development server, i.e. localhost:8000.
Can anyone please tell me why this is happening and how I can fix it?
I did follow a couple of threads on StackOverflow, such as:
phpMyAdmin: Can't import huge database file, any suggestions?
Large file uploads failing php
Laravel out of memory issue?
Uploading a file larger than 2GB using PHP
But unfortunately, these solutions did not help.
I would check your Windows Event Viewer to try to track down why the server is crashing. That should give you some insight into what the issue is.
If you can't find the source of the crash, then since you are using PHP 7.1 and large files, try processing the file with a generator. Right now you're loading the whole thing into memory, and even with a large value for memory_limit, loading the entire file into memory is probably your issue.
http://php.net/manual/en/language.generators.php
This will stream it line by line, but you can find other examples that allow you to chunk the file.
public function read_file($file)
{
    // generator: yields one line at a time instead of loading the whole file
    $fp = fopen($file, 'r');
    while (($line = fgets($fp)) !== false) {
        yield $line;
    }
    fclose($fp);
}

public function upload(Request $request)
{
    set_time_limit(0);
    ini_set('max_execution_time', 36000); // ini directive names are case-sensitive
    $file = Input::file('file');
    $filePath = $file->getRealPath();
    // call the generator as a method of this controller
    foreach ($this->read_file($filePath) as $line)
    {
        << DO THE DATABASE OPERATION >>
    }
    return redirect()->back()->with('success', 'File uploaded successfully');
}
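If line-by-line reading is too slow for a 2 GB file, the same generator approach can also read fixed-size chunks. A rough sketch (the 1 MB chunk size is an arbitrary choice):
public function read_chunks($file, $chunkSize = 1048576)
{
    // yields the file in 1 MB pieces instead of line by line
    $fp = fopen($file, 'r');
    while (!feof($fp)) {
        yield fread($fp, $chunkSize);
    }
    fclose($fp);
}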
Is it possible to lock the file img.jpg until Imagick creates it?
$image->writeImage('img.jpg')
I'm not entirely sure the problem you are describing actually exists. No one else has ever reported it.
However, even if it is a problem, you don't want to use file locking here; that is for solving a separate set of problems.
Instead, what you want is an atomic operation, one that the computer performs 'instantaneously'.
$created = false;

for ($i = 0; $i < 5 && $created == false; $i++) {
    // Create a temp name
    $tmpName = "temp".rand(10000000, 99999999).".jpg";

    // Open it. The 'x+' mode creates the file and fails if it already exists.
    $fileHandle = @fopen($tmpName, 'x+');

    if ($fileHandle === false) {
        // A file with $tmpName already exists, or we otherwise failed
        // to create the file, so loop again.
        continue;
    }

    // We don't actually want the file handle, we just wanted to make sure
    // we had a uniquely named file ending with .jpg, so close it again.
    // You could also use tempnam() if you don't care about the file extension.
    fclose($fileHandle);

    // Write the image data to the temp file name.
    $image->writeImage($tmpName);

    rename($tmpName, 'img.jpg');
    $created = true;
}

if ($created === false) {
    throw new FailedToGenerateImageException("blah blah");
}
There's no locking in there, but because the rename is atomic it is not possible for any process to read data from img.jpg while it is being written. If any other process has img.jpg open while the rename occurs, its file handle to the old version of the file continues to exist, and it keeps reading the old file until it closes and reopens it.
I am trying to transfer an entire folder to an FTP server using PHP.
Right now I am using this code:
function ftp_copyAll($conn_id, $src_dir, $dst_dir) {
    if (is_dir($dst_dir)) {
        return "<br> Dir <b> $dst_dir </b> Already exists <br> ";
    } else {
        $d = dir($src_dir);
        ftp_mkdir($conn_id, $dst_dir);
        echo "create dir <b><u> $dst_dir </u></b><br>";
        while ($file = $d->read()) { // do this for each file in the directory
            if ($file != "." && $file != "..") { // to prevent an infinite loop
                if (is_dir($src_dir."/".$file)) { // do the following if it is a directory
                    ftp_copyAll($conn_id, $src_dir."/".$file, $dst_dir."/".$file); // recursive part
                } else {
                    $upload = ftp_put($conn_id, $dst_dir."/".$file, $src_dir."/".$file, FTP_BINARY); // put the files
                    echo "create files::: <b><u>".$dst_dir."/".$file." </u></b><br>";
                }
            }
            ob_flush();
            sleep(1);
        }
        $d->close();
    }
    return "<br><br><font size=3><b>All Copied ok </b></font>";
}
But is it possible to transfer the entire folder without iterating through the files? I have about 100+ files and PHP is taking a lot of time to transfer them.
Is there any way to increase the transfer speed?
No, there is no other generic way supported by a common FTP server, other than packing the files (zip, gzip, etc.) locally, uploading the archive, and unpacking it remotely.
But if you have FTP access only, you have no way to unpack the archive remotely anyway, unless the FTP server explicitly allows it, either by letting you execute an arbitrary remote shell command (typically not allowed) or via a proprietary "unpack" extension (very few servers support that).
The FTP protocol is generally very inefficient for transferring a large number of small files, because each file transfer carries the overhead of opening a separate data transfer connection.
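If ending up with a single archive on the server is acceptable (for example, something server-side can unpack it later), here is a rough sketch of the pack-then-upload idea, assuming an already-open FTP connection in $conn_id, a local folder in $src_dir (both names borrowed from the question's code), and an illustrative remote path:
// Pack the whole local folder into one zip, then upload that single file.
$zipPath = sys_get_temp_dir().'/folder-upload.zip';
$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($src_dir, FilesystemIterator::SKIP_DOTS));
    foreach ($it as $fileInfo) {
        if ($fileInfo->isFile()) {
            // store each file under its path relative to $src_dir (assumes no trailing slash)
            $relative = substr($fileInfo->getPathname(), strlen($src_dir) + 1);
            $zip->addFile($fileInfo->getPathname(), $relative);
        }
    }
    $zip->close();
    // one ftp_put instead of hundreds of small transfers
    ftp_put($conn_id, 'upload/folder-upload.zip', $zipPath, FTP_BINARY);
}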
I have a project where the user can add multiple files to a cart and then zip them into a single file and download all of them at once.
It seems, though, that as soon as the cart gets bigger than 10 MB it cannot create the zip file.
I thought this might be because of upload_max_filesize, but that is set to 200M, and so is post_max_size.
What would be limiting this?
Here is my code.
It creates a random file name, then loops through the cart and adds each file.
// create object
$zip = new ZipArchive();
$filename = "loggedin/user_files/ABX-DOWNLOAD-".rand(1, 1000000).".zip";
while (file_exists($filename)):
    $filename = "loggedin/user_files/ABX-DOWNLOAD-".rand(1, 1000000).".zip";
endwhile;

// open archive
if ($zip->open('../'.$filename, ZipArchive::CREATE) !== TRUE) {
    die("Could not open archive");
}

// add files
//echo $all_audio;
foreach ($audio_ids as $audio_id) {
    if ($audio_id != "") {
        $audio = Audio::find_by_id($audio_id);
        $selected_category = Categories::find_by_id($audio->category_id);
        $selected_sub_category = Sub_Categories::find_by_id($audio->sub_category_id);
        $selected_sub_sub_category = Sub_Sub_Categories::find_by_id($audio->sub_sub_category_id);
        $f = "uploads/mp3/".$selected_category->category_name."/".$selected_sub_category->sub_category_name."/".$selected_sub_sub_category->sub_sub_category_name."/".$audio->media_path;
        $zip->addFile($f) or die("ERROR: Could not add file: $f");
    }
}

// close and save archive
$zip->close() or die("ERROR: COULD NOT CLOSE");
If the archive is being created in memory, you can increase the PHP memory limit through PHP.ini or other means, or alternatively write the file directly to disk while it is being created.
You can also stream the file to the user as it is being created. See php rendering large zip file - memory limit reached
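Once the archive exists on disk, it can also be sent to the user without pulling it into PHP's memory. A minimal sketch, reusing the $filename path from the question's code (readfile() outputs the file without loading it whole into memory):
// send the finished archive from disk
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="'.basename($filename).'"');
header('Content-Length: '.filesize('../'.$filename));
readfile('../'.$filename);
exit;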
Are you using the addFile() method?
If so, try replacing that call with addFromString() + file_get_contents().
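For example, the addFile() call in the loop above could be replaced like this (note that file_get_contents() reads the whole file into memory, so this trades open file handles for RAM):
// hypothetical replacement for the addFile() call in the loop above
$zip->addFromString(basename($f), file_get_contents($f))
    or die("ERROR: Could not add file: $f");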
Well, narrowed it down. A pathname was corrupted and was making the zip error out because the file did not exist. Looks like I need some better error checking.
I'm currently writing an upload class for uploading images. I do extension checks to verify that the uploaded images are of the supported types, and the photos are always chmod(0664) when the uploaded file is copied to its resting place. Is this relatively safe? I don't know much about image encoding, but even if someone somehow tricked my extension check, the file could never be run on the server anyway, unless there was a security hole elsewhere and the attackers were already into my file system, correct? Here's my extension check:
function validate_ext() { // validates that the file's extension is in the list of allowed extensions
    $extension = $this->get_ext($this->theFile);
    $ext_array = $this->extensions;
    if (in_array($extension, $ext_array)) { // check if the file's ext is in the list of allowed exts
        return true;
    } else {
        $this->error[] = "That file type is not supported. The supported file types are: ".$this->extString;
        return false;
    }
}
And here's the function that copies the uploaded file to its final resting place.
if ($_FILES[$this->uploadName]['error'] === UPLOAD_ERR_OK) {
    $newfile = $this->uploadDir.$this->theFile;
    if (!move_uploaded_file($this->tempFile, $newfile)) {
        $this->error[] = "The file could not be moved to the new directory. Check permissions and folder paths.";
        die($this->error_text());
    } else {
        $this->error[] = "The file ".$this->originalName." was successfully uploaded.";
        if ($this->renameFile == true) {
            $this->error[] = $this->originalName." was renamed to ".$this->theFile;
        }
        chmod($newfile, $this->fileperm);
    }
} else {
    $this->error[] = $this->file_upload_error_message($_FILES[$this->uploadName]['error']);
    die($this->error_text());
}
Reading the extension really isn't a good way to check file type. You should read the file's MIME type instead; granted, that can be faked too, but it's more of a hassle to fake.
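A rough sketch of a MIME-type check using the fileinfo extension, assuming the temporary upload path is in $this->tempFile as in the code above (the allowed-types list is just an example):
// check the actual file contents, not the user-supplied name or extension
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $this->tempFile); // e.g. "image/jpeg"
finfo_close($finfo);

$allowed = array('image/jpeg', 'image/png', 'image/gif'); // example whitelist
if (!in_array($mime, $allowed, true)) {
    $this->error[] = "That file type is not supported.";
}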
In the Linux world, as long as you give the file non-executable permissions, the file cannot be executed, whether it's a .jpeg or a .bash. The reverse is also true: a .jpeg with executable permissions could be executed (if the content of that .jpeg file is actually an executable, not image data).
You can use getimagesize() to check the file itself.
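For example, a sketch that rejects anything getimagesize() cannot parse as an image, again assuming the upload path is in $this->tempFile:
// getimagesize() inspects the actual image data and returns false if the
// file is not a parseable image
$info = @getimagesize($this->tempFile);
if ($info === false) {
    $this->error[] = "The uploaded file is not a valid image.";
} else {
    list($width, $height) = $info; // dimensions, if you need them
}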