How to prevent PUT requests from saving data in a temp file - PHP

I'm having a problem when uploads happen using PUT.
When an upload runs, a temp file is created for the entire request body, so a PUT of a 1GB file creates a 1GB temp file.
A bit of background: I'm running this in a container, and the base image is "php:7.3-apache".
In it I'm running a PHP website and users can do a PUT request to an endpoint to upload large files.
Because I was reading from "php://input" I wasn't expecting a temp file the size of the PUT to be created.
The code I'm using is basically what's on the PHP website:
https://www.php.net/manual/en/features.file-upload.put-method.php
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
?>
When I add logging I can see the code being called right away, so how can I make sure that data I've already processed and written to its final location doesn't stay in /tmp, or that no temp file is used at all?
The goal is to write the data to object storage, and I have no control over how much free disk space there will be on the host the container runs on.
How do other people solve this? Just add storage? I'd rather not, since there can be any number of users uploading files of unspecified size.
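For the object-storage goal, here is a minimal sketch of streaming the request body straight to S3 with the AWS SDK for PHP (bucket name and key are placeholders, and this only removes the application-side staging file; whether the server itself buffers the request body still depends on the SAPI configuration):
<?php
require 'vendor/autoload.php'; // assumes composer require aws/aws-sdk-php

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

/* Hand the request-body stream to the SDK; parts are read sequentially,
   so only one part at a time is held in memory rather than the whole file */
$source = fopen('php://input', 'r');
$uploader = new MultipartUploader($s3, $source, [
    'bucket' => 'my-bucket',             // placeholder
    'key'    => 'uploads/myputfile.ext', // placeholder
]);
$uploader->upload();
fclose($source);
?>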

Related

Codeigniter force_download fails on large Zip files

When using force_download to download a zip file, my code works for a zip file that is 268MB (31 MP3 files) but not for one that is 287MB (32 MP3 files); the only difference is one extra MP3 file added to the zip. The download attempts to start, appears to restart a couple of times, and then fails, with Chrome indicating that the zip file is incomplete. Windows reports the downloaded zip, which is only 61KB, as invalid when trying to open it.
The zip file gets created and the MP3 files added to it by another area of code.
I have increased memory_limit up to 1024M, but it makes no difference.
Below is the code I want to get working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have a problem with larger files. Is there a setting I'm missing somewhere that is enforcing a limit?
PHP 7.1.33
CodeIgniter 3.1.9
Try increasing the memory limit by adding this line:
ini_set('memory_limit','1024M');
Increase the memory limit and use fopen/fread instead of loading the whole file. Try this:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;

if (is_file($zipdlpath))
{
    // Send the download headers ourselves instead of calling force_download(),
    // which reads the entire file into memory
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$lastbasket.'"');
    header('Content-Length: '.filesize($zipdlpath));

    $chunkSize = 1024 * 1024; // stream 1MB at a time
    $handle = fopen($zipdlpath, 'rb');
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
    exit;
}
I've tried the following custom download helper; maybe it will work for you.
Ref Link - https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files

PHP creating multimedia file on server

I receive multimedia files as JSON strings sent from devices.
If I store the string in a database and display it in HTML afterwards, all is fine.
My problem is that when there are hundreds of files (pics, video, audio), the code to build and display the files stored in the database takes far too long.
So I want to save the media files to the server and reference them instead; maybe this will help with speed.
I'm having trouble saving correct media files, though: the data is not viewable in a file.
Code for creating the file from the incoming data:
$myfile = fopen("./Media/".$mFileName.$mFileExtension, "w") or die("ERROR");
$mMediaToWrite = base64_encode($mTheIncomingMediaData);
fwrite($myfile, $mMediaToWrite );
fclose($myfile);
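If the device is already sending the media base64-encoded inside the JSON (an assumption, but a common pattern), the code above encodes it a second time, which would explain the unusable files. In that case the write needs to decode instead:
<?php
/* Assumption: $mTheIncomingMediaData is the base64 string from the JSON payload */
$myfile = fopen("./Media/".$mFileName.$mFileExtension, "wb") or die("ERROR");
fwrite($myfile, base64_decode($mTheIncomingMediaData)); // decode, don't re-encode
fclose($myfile);
?>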

PHP Amazon S3 access private files through URL

I'm using the AWS PHP SDK to save images on S3. Files are saved privately. I then show the image thumbnails in my web application using the S3 file URL, but since the files are private, the images are displayed as broken.
When the user clicks on the name of a file, a modal opens to show the file at a larger size, but the file is displayed as broken there as well, due to the same issue.
Now, I know there are two ways to make this work: 1. make the files public, or 2. generate pre-signed URLs for the files. But I cannot go with either of these options due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it, and serves it. You can then restrict access however you like without making the images public.
Example pass through script:
$headers = get_headers($realpath); // $realpath being wherever the file really is (a URL this server can reach)
foreach ($headers as $header) {
    header($header); // forward the upstream headers, status line included
}
$filename = $version->getFilename(); // however your app derives the filename
// Uncomment these lines if it's a download you want to do:
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");
$file = fopen($realpath, 'r');
fpassthru($file); // stream the file straight to the client
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
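As a usage example (hypothetical script name and parameter), the page would then embed <img src="/passthrough.php?file=123"> instead of the raw S3 URL, and the script can verify the user's session before resolving $realpath and streaming the object.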
You will need to access the files through a script on your server. That script will do some kind of authentication to make sure the request is valid and you want the requester to see the file, then fetch the file from S3 using a valid IAM profile that can access the private files, and finally output the file.
Instead of requesting the file from S3, request it from
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is a sketch of fetchimages.php, fleshed out from pseudocode with the AWS SDK for PHP (the key validation and database lookup are left as stubs):
<?php
require 'vendor/autoload.php'; // assumes the AWS SDK for PHP is installed

// Check the requester is authorized to get this image before going further
$key = $_GET['key'];
// Validate that $key is the proper format, then get the S3 object key
// from a database based on $key (lookUpObjectKey() is your own stub)
$objectKey = lookUpObjectKey($key);

// Connect to S3 securely and read the private file
$s3 = new Aws\S3\S3Client(['region' => 'us-east-1', 'version' => 'latest']);
$result = $s3->getObject(['Bucket' => 'my-bucket', 'Key' => $objectKey]);

// Output the file
header('Content-Type: '.$result['ContentType']);
echo $result['Body'];
?>
As far as I know, you could try to make your S3 bucket act as a web server (static website hosting), but then you would effectively be making the files public. If you have some kind of logic to restrict access, you could create a bucket policy for it.
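For instance (bucket name and domain are placeholders), applying a referer-based bucket policy through the PHP SDK might look like the sketch below; note that Referer headers are easily spoofed, so this is a weak control at best:
<?php
require 'vendor/autoload.php';

$s3 = new Aws\S3\S3Client(['region' => 'us-east-1', 'version' => 'latest']);

/* Allow GetObject only when the request carries your site's Referer header */
$policy = json_encode([
    'Version'   => '2012-10-17',
    'Statement' => [[
        'Effect'    => 'Allow',
        'Principal' => '*',
        'Action'    => 's3:GetObject',
        'Resource'  => 'arn:aws:s3:::my-bucket/*',
        'Condition' => ['StringLike' => ['aws:Referer' => ['https://www.yourdomain.com/*']]],
    ]],
]);
$s3->putBucketPolicy(['Bucket' => 'my-bucket', 'Policy' => $policy]);
?>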

HTTP PUT to unique filename in PHP

I would like to use HTTP to transfer a file to a webserver and save it with a meaningful filename. I currently have the transfer part working; I need to make the meaningful-filename piece work.
If I use Curl to transfer the file, I specify which file to send:
curl -X PUT http://user:passwd@x.x.46.9/put.php --data-binary @session
The PHP script on the webserver is as follows:
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
?>
Is there a way to extract the filename sent in the HTTP PUT message and use this for the filename? Or, at the very least, is there a way to save the file using the IP address of the source device?
Much obliged.
Yes: pass the filename as a query-string parameter and you will have it in $_REQUEST (or $_GET); the source device's IP address is available in $_SERVER['REMOTE_ADDR'].
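A minimal sketch of put.php along those lines (the name query parameter is an invented convention here; basename() strips any path components a client might inject):
<?php
/* Choose a target filename: prefer a ?name=... query parameter,
   fall back to the source device's IP address */
$name = isset($_GET['name'])
    ? basename($_GET['name'])
    : $_SERVER['REMOTE_ADDR'] . '.bin';

$putdata = fopen("php://input", "r");
$fp = fopen("./uploads/" . $name, "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
?>
The matching upload would then be, for example:
curl -X PUT "http://user:passwd@x.x.46.9/put.php?name=session.bin" --data-binary @session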

Saving large files sent from .Net in PHP through PUT or POST method

I am having trouble saving a large file (~200MB and above) sent from a desktop application made in the .Net framework and received in PHP. The desktop application sends the file through .Net's writeStream function.
Problems:
1) I am getting the data in PHP, but the final file size exceeds the original file size. For example, if the file is 36MB, the file PHP saves will be 55MB.
2) The file is also corrupted after being saved on the server.
Requirements:
1) I need to save a large file sent from .Net to PHP with any method that works.
2) Any working example of this will be highly appreciated.
Note:
I am making a Dropbox like application. I hope this will give you a general idea of what kind of application I need to make.
PHP Code
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open the target file in binary write mode; the original "a" (append) mode
   will grow and corrupt the file if the client retries or resends the upload */
$fp = fopen($targetDir.$fileName, "wb");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
?>
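Note that this raw-stream approach only applies to PUT (or non-multipart POST) bodies; php://input is not available when the client sends a multipart/form-data POST. If the .Net side uploads that way, PHP parses the body into $_FILES instead, and the handler would look more like this sketch (the form-field name "file" is an assumption):
<?php
/* Handler for a multipart/form-data POST upload; "file" is the assumed field name */
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    /* move_uploaded_file() moves PHP's temp copy to the final location
       without re-reading the whole file into memory */
    move_uploaded_file($_FILES['file']['tmp_name'], $targetDir.$fileName);
}
?>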
