Good day to all, it's my first time posting here.
I want to upload an image to my domain using a base64-encoded string.
The image gets written to the server completely, but I'm still getting a server error 500.
The memory_limit in my php.ini file is 128M.
I'm using a XAMPP server.
<?php
header('Content-type : bitmap; charset=utf-8');
$encoded_string = $_POST['string_encoded']; //encoded string
$imagename = 'image.png';
$decoded_string = base64_decode($encoded_string);
$path = 'imageses/'.$imagename;
$file = fopen($path, 'wb');
fwrite($file, $decoded_string);
fclose($file);
?>
Let's suppose image.png is about 2 MB. The request then carries roughly 2.7 MB of base64 text, and during the decode the encoded string (in $_POST plus the $encoded_string copy) and the decoded binary all sit in memory at the same time, so a single request can need several times the image's size, on top of everything else the script allocates. If that pushes you over memory_limit, PHP aborts with a 500. To fix it, increase memory_limit in your php.ini. Another possible problem is that the script is being requested several times at once, doing the same large decode in parallel. If everything else fails, you can still succeed by not decoding the whole string at once: decode one smaller chunk at a time, write it out, and discard it as soon as possible.
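A rough sketch of that chunked approach (assuming the posted string contains no whitespace or line breaks, and reusing the field and path names from the question):

<?php
// Sketch: decode the base64 payload piecewise instead of all at once.
// Each slice is a multiple of 4 characters, so it decodes independently.
$encoded = $_POST['string_encoded'];
$path = 'imageses/image.png';
$chunk = 8192; // multiple of 4

$file = fopen($path, 'wb');
for ($offset = 0, $len = strlen($encoded); $offset < $len; $offset += $chunk) {
    fwrite($file, base64_decode(substr($encoded, $offset, $chunk)));
}
fclose($file);
?>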
Related
I am trying to upload a file to a server with ftp_nb_fput, but it never uploads more than 4096 bytes of the file, which is about 700 KB in size.
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$fp = fopen($directory_to_move_files.$file_to_move, 'r');
ftp_nb_fput($connection_to, $file_to_move, $fp, FTP_ASCII);
ftp_close($connection_to);
I'm interested in using this function, not file_put_contents or cURL.
I don't get any errors.
There are two things to take into consideration when working with the non-blocking FTP functions.
First, they work asynchronously, in chunks. That means a call like
ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
will only upload a small chunk of data and then return FTP_MOREDATA, so to complete the upload you need to keep calling ftp_nb_continue() in a loop:
$ret = ftp_nb_put($my_connection, "test.remote", "test.local", FTP_BINARY);
while ($ret == FTP_MOREDATA) {
    $ret = ftp_nb_continue($my_connection);
}
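Applied to the file handle from the question (a sketch, not tested), that loop plus a basic failure check could look like this:

$fp = fopen($directory_to_move_files . $file_to_move, 'rb');
$ret = ftp_nb_fput($connection_to, $file_to_move, $fp, FTP_BINARY);
while ($ret === FTP_MOREDATA) {
    $ret = ftp_nb_continue($connection_to); // push the next chunk
}
if ($ret !== FTP_FINISHED) {
    echo 'Upload failed';
}
fclose($fp);
ftp_close($connection_to);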
Second, there are php.ini directives that limit how large an upload can be; they are located in php.ini and cannot be modified from the running script:
; Maximum allowed size for uploaded files.
upload_max_filesize = XXM
; Must be greater than or equal to upload_max_filesize
post_max_size = XXM
where XX is the number of megabytes; don't forget the M suffix.
After any modification you will need to restart the server.
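A quick one-off check to confirm the new limits are active after the restart:

<?php
// Prints the limits the running PHP actually sees.
echo ini_get('upload_max_filesize'), ' / ', ini_get('post_max_size');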
If you want to transfer the whole file at once, use ftp_put(), not ftp_nb_fput(). It'll make your code a bit simpler:
$connection_to = ftp_connect($host_to);
$ftp_to = ftp_login($connection_to, $user_to, $pass_to);
$local_file = $directory_to_move_files . $file_to_move;
ftp_put($connection_to, $file_to_move, $local_file, FTP_BINARY);
ftp_close($connection_to);
Side note: don't use FTP_ASCII unless you're absolutely sure the file you're transferring is plain text. It will corrupt binary files, including images. Using FTP_BINARY is always safe.
I'm using fopen, fwrite and fclose to save a PNG onto my server using this code:
ini_set('memory_limit', '128M');
$f = fopen('../../myFolder/myImage.png', 'w+');
fwrite($f, base64_decode($lowerDesign));
$success = fclose($f);
echo $success != false ? '1' : '0';
Now this works perfectly for small file sizes (1-5kb) but fails for larger images. I'm getting absolutely no errors in my logs whatsoever. All I get is a '0' instead of a '1' and there is no PNG saved.
Obviously, the file size is the issue but I can't think of how to get around it.
Any ideas?
Split $lowerDesign into smaller chunks; decoding the whole string with one base64_decode() call keeps the encoded string and the full decoded binary in memory at the same time, which can blow past memory_limit for a large image.
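One way to do that (a sketch, assuming $lowerDesign holds a plain base64 string with no line breaks) is to let a stream filter decode while the encoded data is fed in small pieces:

// Attach PHP's convert.base64-decode filter to the handle, then write the
// encoded string in chunks so only a small piece is decoded at a time.
$f = fopen('../../myFolder/myImage.png', 'w');
stream_filter_append($f, 'convert.base64-decode', STREAM_FILTER_WRITE);
foreach (str_split($lowerDesign, 8192) as $chunk) { // 8192 is a multiple of 4
    fwrite($f, $chunk);
}
$success = fclose($f);
echo $success != false ? '1' : '0';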
I'm trying to debug this issue by posting raw PNG image data to the server with the help of Postman.
On the server I'm receiving the file as follows:
$png = $GLOBALS["HTTP_RAW_POST_DATA"];
Then I write the data to a new file:
$fh = fopen($myFile, 'w') or die("can't open file");
fwrite($fh, $png);
fclose($fh);
The file gets saved correctly, but it now has a different file size,
417KB instead of 279KB which is the size of the original file.
Now of course, I can't do any image operation as none of the functions (such as getimagesize which returns bool(false)) recognizes the file as a valid image.
I have debugged this process to the point where the issue must be somewhere in the file operations, but I don't understand why the saved file doesn't end up with the same type and size as the original when all I'm doing is writing the same raw data.
UPDATE:
I've now compared the encodings of the original file with the uploaded one,
and the former is in ISO-8859-1 and it displays correctly, the latter is in UTF-8 and has about 138kB more in file size.
I've now managed to convert the file on the server to ISO-8859-1:
fwrite($fh, iconv("UTF-8", "ISO-8859-1", $png));
The resulting file now has the same file size as the original (279kB),
but it is still not recognized as a PNG image; some information still seems to get lost.
UPDATE (1):
I've been able to examine the issue further and found out, that the original file is exactly 4 bytes bigger than the generated file, thus the resulting PNG seems to be corrupted.
UPDATE (2):
I'm now able to save the file and open it as a valid PNG. The following code seems to be saving the image correctly:
$input = fopen("php://input","r+");
$destination = fopen($myFile, 'w+');
stream_copy_to_stream($input, $destination);
fclose($input);
fclose($destination);
However when trying to open the file with the imagecreatefrompng function I get a 500 error. I'm now trying to figure out if it's a memory issue in PHP.
The problem might be the way you test your POST by copying the "binary" data into a text field.
If you paste the same data into a text editor and save it with a .png extension, you won't get a valid image file either.
Try building a simple form with a file field to test your upload.
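Something along these lines is enough to rule out the transport (a throwaway sketch: the script and field names are made up, and $myFile is the target path from the question):

<!-- test_upload.php: post the PNG as a normal file field -->
<form action="test_upload.php" method="post" enctype="multipart/form-data">
    <input type="file" name="png">
    <input type="submit" value="Upload">
</form>

<?php
// The upload arrives as untouched binary; move it and check that GD accepts it.
if (isset($_FILES['png']) && $_FILES['png']['error'] === UPLOAD_ERR_OK) {
    move_uploaded_file($_FILES['png']['tmp_name'], $myFile);
    var_dump(getimagesize($myFile)); // should no longer be bool(false)
}
?>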
I use nginx for uploads and haven't had a problem, but I use the standard PHP way of uploading files as per: http://www.php.net/manual/en/features.file-upload.post-method.php
I would suggest trying that.
Try using:
<?php $postdata = file_get_contents("php://input"); ?>
to get the raw data. I sometimes use it to read data sent by an AJAX POST in CakePHP.
I have an interesting problem. I need to show a progress bar for a file that PHP is downloading asynchronously. I thought the best way to do it is that, before the download starts, the script writes a txt file containing the file name and the original file size.
Then an AJAX function calls a PHP script that is meant to check the size of the local (partially downloaded) file. I have two main problems:
the files are bigger than 2 GB, so the filesize() function is out of business
I tried to find a different way to determine the local file size, like this:
function getSize($filename) {
    $a = fopen($filename, 'r');
    fseek($a, 0, SEEK_END);
    $filesize = ftell($a);
    fclose($a);
    return $filesize;
}
Unfortunately the second way gives me tons of errors; it seems I cannot open a file that is currently being downloaded.
Is there any way to check the size of a file that is still downloading, when that size will be bigger than 2 GB?
Any help is greatly appreciated.
I found the solution by using the exec() function:
exec("ls -s -k /path/to/your/file/".$file_name,$out);
Just change your OS and PHP to a 64-bit build, and you can still use filesize().
From the filesize() manual:
Return Values
Returns the size of the file in bytes, or FALSE (and generates an error of level E_WARNING) in case of an error.
Note: Because PHP's integer type is signed and many platforms use 32bit integers, some filesystem functions may return unexpected results for files which are larger than 2GB.
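A quick way to check whether the running PHP build has 64-bit integers (and filesize() can therefore report sizes above 2 GB):

<?php
// PHP_INT_SIZE of 8 means 64-bit integers; 4 means the 2 GB limit applies.
var_dump(PHP_INT_SIZE, PHP_INT_MAX);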
I recently asked and solved a question about uploading .PDF files greater than 2 MB into a MySQL database as BLOBs. I had to change some settings in my php.ini file and MySQL's maximum packet setting. However, fixing that issue has led me to discover a new issue with my script.
Now since I can upload files to my BLOB database I attempted to download the file for testing purposes. Much to my dismay when I went to open the .PDF file I received the following error: Failed to load document (error 3) 'file:///tmp/test-13.pdf'. Upon further investigation I found out that the file being downloaded, test.pdf, was only 1 MB, a little less than half of its supposed size in the database of a little more than 2 MB. This is obviously the reason for the error.
The following piece of code is the part of my script I am using for downloading files from the database. It is at the very top of the script and works flawlessly for files that are less than 1 MB.
foreach ($_REQUEST as $key => $value)
{
    if ($value == 'Open')
    {
        header();
        session_start();
        $dbh = new PDO('mysql:host=' . $_SESSION['OpsDBServer'] . '.ops.tns.its.psu.edu;dbname=' . $_SESSION['OpsDB'],
                       $_SESSION['yoM'], $_SESSION['aMa']);
        $id = $key;
        $sqlDownload = "SELECT name, type, content, size FROM upload WHERE id='" . $id . "'";
        $result = $dbh->query($sqlDownload);
        $download = $result->fetchAll();
        $type = $download[0]['type'];
        $size = $download[0]['size'];
        $name = $download[0]['name'];
        $content = $download[0]['content'];
        header("Content-type: $type");
        header("Content-Disposition: inline; filename=$name");
        header("Content-length: $size");
        header("Cache-Control: maxage=1");
        header("Pragma: public");
        echo $content;
        exit;
    }
}
I am thinking that maybe I have some header statements wrong? I am very confused about what to do. I have searched through php.ini and found no settings that I think need to be changed, and my maximum packet setting for MySQL is 4 MB, so a 2 MB file should download.
Thanks for any help.
According to http://dev.mysql.com/doc/refman/5.0/en/blob.html:
The maximum size of a BLOB or TEXT object is determined by its type, but the largest value you actually can transmit between the client and server is determined by the amount of available memory and the size of the communications buffers. You can change the message buffer size by changing the value of the max_allowed_packet variable, but you must do so for both the server and your client program.
According to http://dev.mysql.com/doc/refman/5.0/en/server-parameters.html, the default value for max_allowed_packet is 1048576.
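So on the server side you would raise it in my.cnf to something larger, for example (the 16M value here is arbitrary; the client side, PDO in this case, has its own limit):

[mysqld]
max_allowed_packet = 16M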
I actually fixed the issue. I changed all of the values that were recommended here in php.ini and my.cnf but I also needed to change a setting for PDO.
I changed:
PDO::MYSQL_ATTR_MAX_BUFFER_SIZE (integer)
Maximum buffer size. Defaults to 1 MiB.
This has to be set when the PDO object is created for it to take effect, though. All is good now.
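In code, that looks roughly like the sketch below, reusing the connection details from the question. Note that this attribute is only available when PDO_MySQL is compiled against libmysqlclient, not mysqlnd.

$dbh = new PDO(
    'mysql:host=' . $_SESSION['OpsDBServer'] . '.ops.tns.its.psu.edu;dbname=' . $_SESSION['OpsDB'],
    $_SESSION['yoM'],
    $_SESSION['aMa'],
    array(PDO::MYSQL_ATTR_MAX_BUFFER_SIZE => 10 * 1024 * 1024) // 10 MiB, up from the 1 MiB default
);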