I'm using fopen, fwrite and fclose to save a PNG onto my server using this code:
ini_set('memory_limit', '128M');
$f = fopen('../../myFolder/myImage.png', 'w+');
fwrite($f, base64_decode($lowerDesign));
$success = fclose($f);
echo $success != false ? '1' : '0';
Now this works perfectly for small file sizes (1-5kb) but fails for larger images. I'm getting absolutely no errors in my logs whatsoever. All I get is a '0' instead of a '1' and there is no PNG saved.
Obviously, the file size is the issue but I can't think of how to get around it.
Any ideas?
Split $lowerDesign into smaller chunks and decode them one at a time; decoding the whole string with base64_decode() in one go means the encoded and decoded copies sit in memory together, which can exhaust the memory limit on larger images, as sketched below.
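A minimal sketch of that idea, assuming $lowerDesign holds the full base64 string as in the question; the chunk length is arbitrary, but must stay a multiple of 4 so no base64 quantum is split across chunks:

$f = fopen('../../myFolder/myImage.png', 'wb');
if ($f === false) {
    echo '0';
    exit;
}

// Strip any whitespace so the 4-byte alignment of the chunks holds.
$lowerDesign = preg_replace('/\s+/', '', $lowerDesign);

// Decode and write piece by piece instead of holding the whole decoded image in memory.
$len = strlen($lowerDesign);
for ($i = 0; $i < $len; $i += 4096) {
    fwrite($f, base64_decode(substr($lowerDesign, $i, 4096)));
}

echo fclose($f) ? '1' : '0';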
When using force_download to download a zip file, my code works for a zip file that is 268MB (31 MP3 files) but not for one that is 287MB (32 MP3 files); the only difference is one extra MP3 file in the zip. The download attempts to start, appears to keep restarting a couple of times, and then shows as failed, with Chrome indicating that the zip file is incomplete. Windows reports the downloaded zip file, which is only 61KB, as invalid when trying to open it.
The zip file gets created and MP3 files added to it by another area of code.
I have increased memory_limit up to 1024M, but it makes no difference.
Below is the code I want working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have an issue with larger files. Is there a setting I am missing somewhere that is forcing a limit somehow?
PHP 7.1.33
CodeIgniter 3.1.9
Try to increase the memory limit by adding this code:
ini_set('memory_limit','1024M');
Increase the memory limit and read the file with fopen()/fread() in chunks, instead of loading it all at once with file_get_contents().
Try this: send the download headers yourself and stream the file in chunks with fopen()/fread(), instead of going through force_download():
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;

if (is_file($zipdlpath))
{
    // Send the download headers, then push the file out in 1 MB chunks
    // so it is never loaded into memory in one piece.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$lastbasket.'"');
    header('Content-Length: '.filesize($zipdlpath));

    $chunkSize = 1024 * 1024;
    $handle = fopen($zipdlpath, 'rb');
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        if (ob_get_level() > 0)
        {
            ob_flush();
        }
        flush();
    }
    fclose($handle);
    exit;
}
I've also tried the custom download helper linked below; maybe it will work for you.
Ref Link - https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files
I have an API endpoint that receives a POST payload (JSON, XML, or even raw binary data) in the request body, which should be written immediately to a file on the filesystem.
For backwards compatibility reasons, it cannot be a multipart/form-data.
It works with no problems for body content up to a certain size (around 2.3GB with an 8GB script memory limit).
I've tried all of the following.
First, both with and without setting the stream buffer sizes:
$filename = '/tmp/test_big_file.bin';
$input = fopen('php://input', 'rb');
$output = fopen($filename, 'wb');
stream_set_read_buffer($input, 4096);
stream_set_write_buffer($output, 4096);
stream_copy_to_stream($input, $output);
fclose($input);
fclose($output);
and
$filename = '/tmp/test_big_file.bin';
file_put_contents($filename, file_get_contents('php://input'));
and
$filename = '/tmp/test_big_file.bin';
$input = fopen('php://input', 'rb');
$output = fopen($filename, 'wb');
while (!feof($input)) {
fwrite($output, fread($input, 8192), 8192);
}
fclose($input);
fclose($output);
Unfortunately, none of them works. At some point, I always get the same error:
PHP Fatal error: Allowed memory size of 8589934592 bytes exhausted (tried to allocate 2475803056 bytes) in Unknown on line 0
Disabling enable_post_data_reading also makes no difference, and all the relevant php.ini post/memory/upload size settings are set to 8GB.
I'm using php-fpm.
Watching memory with free -mt, I can see that memory usage increases slowly at the beginning, then faster after a while, up to the point where no free memory is left and the error occurs.
In the temp directory, the payload is not stream-copied directly to the output file; instead it is written to a temporary file named php7NARsX (or some other random name) that is not deleted after the script crashes, so on the following free -mt check the available memory is 2.3GB lower.
Now my questions:
Why is the stream not copied directly from php://input to the output instead of being loaded into memory? (Also, using php://temp as the output stream leads to the same error.)
Why is PHP using so much memory? I'm sending a 3GB payload, so why does it need more than 8GB?
Of course, any working solution will be much appreciated. Thank You!
Good day to all, it's my first time posting here.
I want to upload an image to my domain using an image that is encoded as base64.
The image is uploaded to the server completely, but I'm still getting a server error 500.
The memory_limit in my php.ini file is 128M.
I'm using a XAMPP server.
<?php
header('Content-type : bitmap; charset=utf-8');
$encoded_string = $_POST['string_encoded']; //encoded string
$imagename = 'image.png';
$decoded_string = base64_decode($encoded_string);
$path = 'imageses/'.$imagename;
$file = fopen($path, 'wb');
fwrite($file, $decoded_string);
fclose($file);
?>
Let's suppose image.png has a size of 2MB. Decoding it from base64 means the raw request body, the copy in $_POST, $encoded_string and the decoded binary all sit in memory at the same time, so peak usage is several times the image size. This could be one cause of the issue; to address it, increase memory_limit in your php.ini. Another possible problem is that the script gets loaded several times, doing the same large decoding in parallel. If everything else fails, you can still get there by not decoding the whole file at once, but decoding one smaller chunk at a time and discarding each chunk as soon as it has been written.
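If you go the chunked route, a rough sketch (assuming the encoded string arrives in $_POST['string_encoded'] as in the question) could use PHP's built-in convert.base64-decode stream filter, so each chunk is decoded as it is written and no complete decoded copy is ever held in memory:

// Strip whitespace so the 4-byte alignment of the chunks holds.
$encoded = preg_replace('/\s+/', '', $_POST['string_encoded']);
$out = fopen('imageses/image.png', 'wb');

// PHP's built-in base64 filter decodes each written chunk on the fly.
stream_filter_append($out, 'convert.base64-decode', STREAM_FILTER_WRITE);

// Chunk length is a multiple of 4 so no base64 quantum is split across writes.
$len = strlen($encoded);
for ($i = 0; $i < $len; $i += 8192) {
    fwrite($out, substr($encoded, $i, 8192));
}

fclose($out);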
I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are chunked in 2MB chunks, data is being sent from the browser, and there's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side: I'm unable to write the contents of the upload stream to disk, and I always end up with a 0-byte file. The idea is to append the chunks to the partially uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure whether Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean the stream is already empty by the time my controller actually tries to read it.
The controller looks as follows:
public function upload()
{
$filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
As far as I know, fopen() returns a file handle resource rather than the file's contents, so it may not behave the way you expect as the $data parameter of file_put_contents().
Can you try this workaround instead of your file_put_contents() call?
// Read the raw request body and append it to the target file in 1 KB chunks.
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");

while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}

fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to stop sending the upload as multipart/form-data and instead read the raw body with file_get_contents("php://input"), passing the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
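For completeness, a minimal controller sketch along those lines. Since the upload is no longer multipart/form-data, Input::file('file') is gone, so this assumes the client passes the target file name in a hypothetical name query parameter (that parameter and the path handling are illustrative only):

public function upload()
{
    // 'name' is an assumed query parameter, not part of the original code;
    // basename() just guards against path traversal in this sketch.
    $filename = Config::get('tms.upload_path') . basename(Input::get('name'));

    // Append the raw request body to the file being assembled chunk by chunk.
    file_put_contents($filename, file_get_contents('php://input'), FILE_APPEND);
}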
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024))
fwrite($fp, $data);
fclose($fp);
fclose($putdata);
Is there anything wrong with this? Please help.
I think it's because the entire file is not completely uploaded yet when you try to read it, so the read sometimes returns zero bytes even though there is still data being uploaded.
Maybe you can try using the feof function to check if there is any more data to be read?
see "http://www.php.net/manual/en/function.feof.php"
If you are on Windows, you should add "b" to the mode parameter of fopen() (see the manual). It is a good idea to add it anyway, for code portability.