ftp_nb_fput not transferring more than 4096 bytes - php

I am trying to upload a file to a server with ftp_nb_fput, but it never transfers more than 4096 bytes of the file, even though the file is about 700 KB.
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$fp = fopen($directory_to_move_files.$file_to_move, 'r');
ftp_nb_fput($connection_to, $file_to_move, $fp, FTP_ASCII);
ftp_close($connection_to);
I specifically want to use this function, not file_put_contents or cURL.
I don't get any errors.

There are two things to take into consideration when working with the ftp_nb_put function (the same applies to ftp_nb_fput):
It works asynchronously, in chunks. That means
ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
will upload only a small chunk of data and return the FTP_MOREDATA flag, so to complete the upload you need to iterate:
$ret = ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
while ($ret == FTP_MOREDATA) {
    $ret = ftp_nb_continue($my_connection);
}
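Applied to the question's ftp_nb_fput call, the whole upload can be sketched like this. The host, credentials, file names, and the function name are placeholders, not part of the question's code:

```php
<?php
// Sketch of a complete non-blocking upload using ftp_nb_fput, as in the
// question: keep calling ftp_nb_continue until the transfer finishes.
function upload_resumable($host, $user, $pass, $local_path, $remote_name)
{
    $conn = ftp_connect($host);
    if ($conn === false || !ftp_login($conn, $user, $pass)) {
        return false;
    }

    $fp  = fopen($local_path, 'rb');           // binary-safe read mode
    $ret = ftp_nb_fput($conn, $remote_name, $fp, FTP_BINARY);

    // Keep pumping chunks until the transfer finishes or fails.
    while ($ret === FTP_MOREDATA) {
        $ret = ftp_nb_continue($conn);
    }

    fclose($fp);
    ftp_close($conn);
    return $ret === FTP_FINISHED;
}
```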
There are also directives to take into account so you can upload large files. These directives live in php.ini and cannot be modified from the running script (note that they govern HTTP file uploads handled by PHP, not FTP transfers your script initiates):
; Maximum allowed size for uploaded files.
upload_max_filesize = XXM
; Must be greater than or equal to upload_max_filesize
post_max_size = XXM
where XX is the number of megabytes; do not forget the M suffix.
After any modification you will need to restart the web server.

If you want to transfer the whole file at once, use ftp_put(), not ftp_nb_fput(). It'll make your code a bit simpler:
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$local_file = $directory_to_move_files . $file_to_move;
ftp_put($connection_to, $file_to_move, $local_file, FTP_BINARY);
ftp_close($connection_to);
Side note: don't use FTP_ASCII unless you're absolutely sure the file you're transferring is plain text. It will corrupt binary files, including images. Using FTP_BINARY is always safe.

Related

Codeigniter force_download fails on large Zip files

When using force_download to download a zip file, my code works for a zip file that is 268 MB (31 MP3 files) but not for one that is 287 MB (32 MP3 files); the only difference is one extra MP3 file added to the zip. The download attempts to start, appears to restart over and over a couple of times, and then shows as failed, with Chrome indicating that the zip file is incomplete. Windows reports the resulting zip file, which is only 61 KB, as invalid when trying to open it.
The zip file gets created and MP3 files added to it by another area of code.
I have increased the memory_limit up to 1024M, but it makes no difference.
Below is the code I want working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have an issue with larger files. Is there a setting I'm missing somewhere that is enforcing a limit somehow?
PHP 7.1.33
CodeIgniter 3.1.9
Try increasing the memory limit by adding this line:
ini_set('memory_limit','1024M');
increase memory limit and use fopen, fread
try this (note that force_download would exit before a chunked loop ever runs, so the headers are sent manually instead):
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
if (is_file($zipdlpath))
{
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$lastbasket.'"');
    header('Content-Length: '.filesize($zipdlpath));
    $chunkSize = 1024 * 1024; // read 1 MB at a time
    $handle = fopen($zipdlpath, 'rb');
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
    exit;
}
I've had success with the following custom download helper; maybe it will work for you.
Ref Link - https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files

error 500 image base64

Good day to all, it's my first time posting here.
I want to upload an image to my domain using an image that is encoded to base64.
The image is completely uploaded to the server, but I'm still getting a server error 500.
The memory_limit in my php.ini file is 128M.
I'm using a XAMPP server.
<?php
header('Content-type : bitmap; charset=utf-8');
$encoded_string = $_POST['string_encoded']; //encoded string
$imagename = 'image.png';
$decoded_string = base64_decode($encoded_string);
$path = 'imageses/'.$imagename;
$file = fopen($path, 'wb');
fwrite($file, $decoded_string);
fclose($file);
?>
Let's suppose image.png has a size of 2 MB. Decoding it from base64 means holding several copies of the data in memory at once (the raw POST body, the $_POST string, and the decoded binary), so peak usage is a multiple of the file size and can approach your 128M limit. This could be one cause of the issue. To fix it, increase memory_limit in your php.ini. Another possible problem is that the script is loaded several times, doing the same large decode in parallel. If everything else fails, you can still succeed by not decoding the whole file at once, but one smaller packet at a time, discarding each packet as soon as it has been written.
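The "one smaller packet at a time" idea can be sketched as follows: read the base64 text in chunks whose length is a multiple of 4, so each chunk decodes independently, and append the decoded bytes to the output file as you go. The file paths and function name here are made up for illustration:

```php
<?php
// Decode a base64-encoded file to binary without holding either the full
// encoded text or the full decoded data in memory at once.
function base64_decode_stream($encoded_path, $decoded_path)
{
    $in  = fopen($encoded_path, 'rb');
    $out = fopen($decoded_path, 'wb');

    // 8192 is a multiple of 4, so every chunk is independently decodable.
    while (($chunk = fread($in, 8192)) !== false && $chunk !== '') {
        fwrite($out, base64_decode($chunk));
    }

    fclose($in);
    fclose($out);
}
```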

PHP upload and download not working for larger files (>1 MB)

I have an upload.php script, that I access from a mobile device, to upload documents to.
Whenever I upload an image that is < 1 MB, it works perfectly; however, when uploading a file larger than that, it ends up corrupted.
The upload script also renames the file, and removes the extension from it, if this could have anything to do with the error...
Here's the script:
<?php header('Access-Control-Allow-Origin: *');
//Read parameters - I use this when I'm adding the file to the database.
$documentuniqueid = addslashes($_REQUEST['documentuniqueid']);
$type = addslashes($_REQUEST['type']);
$notes = addslashes($_REQUEST['notes']);
//Get file name
$filename = urldecode($_FILES["file"]["name"]);
// Check for errors
if($_FILES['file']['error'] > 0){
outputJSON('An error ocurred when uploading.');
}
// Check filetype
//if($_FILES['file']['type'] != 'image/png'){
// outputJSON('Unsupported filetype uploaded.');
//}
// Check filesize
if($_FILES['file']['size'] > 500000){
outputJSON('File uploaded exceeds maximum upload size.');
}
// Check if the file exists
if(file_exists('upload/' . $_FILES['file']['name'])){
outputJSON('File with that name already exists.');
}
// Upload file
if(!move_uploaded_file($_FILES['file']['tmp_name'], 'documents/'.$documentuniqueid)){
outputJSON('Error uploading file - check destination is writeable.');
}
?>
Download script looks like this:
<?php
sleep(2);
$document_id = addslashes($_REQUEST['document_id']);
if($document_id != "") {
$real_document_name = GetRealFileName($document_id);
if($real_document_name != "ERROR") {
$original_filename = "http://www.whatever.com/documents/".$document_id;
$new_filename = $real_document_name;
//Headers
$finfo = finfo_open(FILEINFO_MIME_TYPE);
header('Content-Type: '.finfo_file($finfo, $original_filename));
header("Content-Length: " . filesize($original_filename));
header('Content-disposition: attachment; filename="'.$new_filename.'"');
//clean up
ob_clean();
flush();
readfile($original_filename);
exit();
}
}
?>
Any tips on improving this?
Or any insight into why this is not working correctly?
You can see that I rename the files upon upload to a random string, and then when downloading I look up the filename and rename it back to the original one.
That works as expected for the small file sizes.
I should also note that even if I go into the FTP manually, download the uploaded file, and add the right extension myself, I'm unable to open it. The images look messed up, and PDFs, for instance, are corrupted.
PHP DETAILS:
Both post_max_size and upload_max_filesize are set to 100M. max_file_uploads is set to 20 and file_uploads is On.
max_input_time : 60
max_execution_time : 3000
Increase the upload file size limit in the php.ini file inside /etc/php5/apache2/.
Set:
upload_max_filesize = 50M
I think you may not have enough space on your host. Try uploading via FTP (FileZilla, for example) and check the log messages.
Also do not forget to check these values:
upload_max_filesize
post_max_size
max_input_time
max_execution_time
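As a quick check, the values above can be dumped from the running script to confirm which configuration is actually loaded (the loaded php.ini may not be the one you edited); a minimal sketch:

```php
<?php
// Print the effective values of the upload-related directives.
foreach (['upload_max_filesize', 'post_max_size',
          'max_input_time', 'max_execution_time'] as $directive) {
    echo $directive, ' = ', ini_get($directive), PHP_EOL;
}
```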
To keep the original name of the file you need to change
move_uploaded_file($_FILES['file']['tmp_name'], 'documents/'.$documentuniqueid)
to
move_uploaded_file($_FILES['file']['tmp_name'], 'documents/'.$_FILES['file']['name'])
or add this line to the upload script to see information about the uploaded file:
print_r($_FILES);
Sorry for my English, I hope I helped.

base64_decode image or pdf is failing in PHP when size is higher than 500kb

I'm uploading files (image or PDF) using AJAX. My process converts the file to base64, sends the data via AJAX, then processes it server-side (PHP) to recreate the image or PDF. This is my server-side code; it works fine but fails when the file size is above 500 KB.
if ($picture_ext == 'pdf') { //pdf
$image_generated_name = $select_name . '_' . $generate_rand_num . '_file.pdf';
file_put_contents(WP_PLUGIN_DIR.'/plugin_name/uploads/'.$image_generated_name, base64_decode(substr($product_img_upload,28)));
} else { //image
file_put_contents(WP_PLUGIN_DIR.'/plugin_name/uploads/'.$image_generated_name, base64_decode(substr($product_img_upload,22)));
}
We don't know what the error is, but I suspect it's related to your post_max_size and upload_max_filesize values. Note that both directives are PHP_INI_PERDIR, so setting them with ini_set() at runtime has no effect; change them in your php.ini config (or a per-directory config) and see if it works:
post_max_size = 10M
upload_max_filesize = 10M
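A side note on the question's code: the hard-coded substr() offsets (28 and 22) assume a specific data-URI prefix length, which varies with the MIME type. A more robust sketch splits on the comma that ends the prefix (the helper name is made up):

```php
<?php
// Strip an optional "data:<mime>;base64," prefix and decode the rest.
function decode_data_uri($product_img_upload)
{
    $comma = strpos($product_img_upload, ',');
    if ($comma !== false) {
        $product_img_upload = substr($product_img_upload, $comma + 1);
    }
    return base64_decode($product_img_upload);
}
```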

PHP filesize reporting old size

The following code is part of a PHP web-service I've written. It takes some uploaded Base64 data, decodes it, and appends it to a file. This all works fine.
The problem is that when I read the file size after the append operation I get the size the file was before the append operation.
$fileOut = fopen($filepath.$filename, "ab");
fwrite($fileOut, base64_decode($data));
fflush($fileOut);
fclose($fileOut);
$newSize = filesize($filepath.$filename); // gives old file size
What am I doing wrong?
System is:
PHP 5.2.14
Apache 2.2.16
Linux kernel 2.6.18
On Linux-based systems, the data fetched by filesize() is held in the stat cache.
Try calling clearstatcache() before the filesize() call.
According to the PHP manual:
"The results of this function are cached. See clearstatcache() for more details."
http://us2.php.net/manual/en/function.filesize.php
Basically, you have to clear the stat cache after the file operation:
$fileOut = fopen($filepath.$filename, "ab");
fwrite($fileOut, base64_decode($data));
fflush($fileOut);
fclose($fileOut);
clearstatcache();
$newSize = filesize($filepath.$filename);
PHP stores all file metadata it reads in a cache, so it's likely that the file size is already stored in that cache and you need to clear it. See clearstatcache() and call it before you call filesize().
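A minimal reproduction of the caching behaviour, using a throwaway temp file:

```php
<?php
// filesize() results are cached per path; clearstatcache() discards them.
$path = tempnam(sys_get_temp_dir(), 'sz');

file_put_contents($path, 'aaaa');      // file is now 4 bytes
$before = filesize($path);             // fresh stat: 4 (and now cached)

file_put_contents($path, 'aaaaaaaa');  // file is now 8 bytes
$stale = filesize($path);              // may still report the cached 4

clearstatcache();
$fresh = filesize($path);              // fresh stat again: 8

unlink($path);
```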
