HTTP PUT to unique filename in PHP

I would like to use HTTP to transfer a file to a webserver and save it with a meaningful filename. I currently have the transfer part working, I need to make the meaningful filename piece work.
If I use Curl to transfer the file, I specify which file to send:
curl -X PUT http://user:passwd@x.x.46.9/put.php --data-binary @session
The PHP script on the webserver is as follows:
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
?>
Is there a way to extract the filename sent in the HTTP PUT message and use this for the filename? Or, at the very least, is there a way to save the file using the IP address of the source device?
Much obliged.

Yes, that information is available in PHP's superglobals. The body of a PUT request does not populate $_REQUEST, but the request path (which can carry a filename) is in $_SERVER['REQUEST_URI'], and the client's IP address is in $_SERVER['REMOTE_ADDR'].
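A minimal sketch of that idea, assuming the client PUTs to a URL like /put.php/session.bin so the filename rides in the path (the helper name is made up for illustration):

```php
<?php
// Derive a safe local filename from the PUT request path, falling back to
// the client IP when no name is present. Helper name is illustrative only.
function target_filename(string $uri, string $ip): string
{
    $path = parse_url($uri, PHP_URL_PATH);
    // basename() drops directory components, preventing path traversal
    $name = basename(is_string($path) ? $path : '');
    if ($name === '' || $name === 'put.php') {
        // No filename in the path: fall back to the source IP
        $name = str_replace(':', '_', $ip) . '.bin'; // ':' replaced for IPv6
    }
    return $name;
}

// In the real endpoint you would then do something like:
//   $name = target_filename($_SERVER['REQUEST_URI'], $_SERVER['REMOTE_ADDR']);
//   $in  = fopen('php://input', 'r');
//   $out = fopen($name, 'w');
//   stream_copy_to_stream($in, $out);
```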

Related

How to prevent PUT requests saving data in a temp file

I'm having a problem with uploads that use PUT: a temp file is created for the entire request body, so a PUT of a 1GB file creates a 1GB temp file.
A bit of background: I'm using a container image whose base is "php:7.3-apache".
In it I'm running a PHP website, and users can do a PUT request to an endpoint to upload large files.
Because I was reading from "php://input", I wasn't expecting a temp file the size of the PUT body to be created.
The code I'm using is basically what's on the PHP website:
https://www.php.net/manual/en/features.file-upload.put-method.php
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
?>
When I add logging I see the code being called right away. How can I make sure that data I've already processed and written to its final location doesn't linger in /tmp, or that no temp file is used at all?
The goal is to write the data to object storage; I have no control over how much free disk space there will be on the host the container runs on.
How do other people solve this? Just add storage? I'd rather not, as there can be any number of users uploading files of unspecified size.
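On the application side, a fixed-size chunked copy from php://input keeps PHP's own memory use flat regardless of upload size; a minimal sketch (note this does not by itself stop the SAPI from buffering the request body to disk, which is server-level behaviour):

```php
<?php
// Copy the request body to a destination stream in fixed-size chunks, so
// PHP never holds more than one chunk in memory. $dest could be a local
// file handle or a writable stream backed by an object-storage SDK
// (the s3:// wrapper below is an assumption, not built into PHP).
function stream_body($source, $dest, int $chunkSize = 8192): int
{
    $total = 0;
    while (!feof($source)) {
        $chunk = fread($source, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $total += fwrite($dest, $chunk);
    }
    return $total; // bytes written
}

// Real endpoint usage (sketch):
//   $in  = fopen('php://input', 'r');
//   $out = fopen('s3://bucket/key', 'w'); // assumes a registered stream wrapper
//   stream_body($in, $out);
```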

PHP input stream returning 0 data - Laravel

I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are chunked in 2MB chunks, and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side: I'm unable to write the contents of the upload stream to disk, and the server always ends up creating a 0-byte file. The idea is to append the chunks to the already-uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure if Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean that by the time my controller tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
    $filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
    file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but it's length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
FWIW, file_put_contents() does accept a stream resource as its $data argument, so that part should work in principle. Still, can you try this explicit read/write loop instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to stop sending multipart/form-data and use file_get_contents("php://input") to read the raw body, passing the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
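For larger chunked uploads, writing each chunk at an explicit byte offset guards against out-of-order or retried chunks corrupting the file; a sketch (the client-supplied offset header is an assumption, not something Laravel provides):

```php
<?php
// Write one upload chunk at a stated byte offset, so out-of-order or
// retried chunks can't corrupt the file. The offset would come from a
// client-supplied header (hypothetical), e.g. X-Chunk-Offset.
function append_chunk(string $path, string $chunk, int $offset): bool
{
    $fp = fopen($path, 'c'); // create if missing, never truncate
    if ($fp === false) {
        return false;
    }
    fseek($fp, $offset);     // position at this chunk's slot
    $ok = fwrite($fp, $chunk) === strlen($chunk);
    fclose($fp);
    return $ok;
}
```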

Saving large files sent from .Net in PHP through PUT or POST method

I am having trouble saving a large file (~200MB and above) sent from a desktop application built on the .Net framework and received in PHP. The desktop application sends the file through .Net's writeStream function.
Problems:
1) I am receiving the data in PHP, but the final file size exceeds the original. For example, a 36MB file ends up saved by PHP as 55MB.
2) The file is also corrupted after being saved on the server.
Requirements:
1) I need to save a large file sent from .Net to PHP using any method that works.
2) Any working example would be highly appreciated.
Note:
I am making a Dropbox-like application; I hope this gives you a general idea of the kind of application I need to build.
PHP Code
<?php
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for appending */
$fp = fopen($targetDir.$fileName, "a");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
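A size mismatch like 36MB growing to 55MB usually means the raw body isn't the bare file bytes (for example, multipart boundaries or an encoding wrapper being written verbatim). One way to catch this early is to compare a checksum of what PHP received against one the client sends; the X-Content-MD5 header below is an assumption for illustration, and the .Net client would need to send md5(file) alongside the upload:

```php
<?php
// Verify the received file against a client-supplied checksum before
// keeping it. The header name X-Content-MD5 is hypothetical.
function body_matches_checksum(string $path, string $expectedMd5): bool
{
    // md5_file() streams the file, so this stays cheap for large uploads;
    // hash_equals() compares in constant time.
    return hash_equals(strtolower($expectedMd5), md5_file($path));
}

// Endpoint usage (sketch):
//   if (!body_matches_checksum($targetDir.$fileName, $_SERVER['HTTP_X_CONTENT_MD5'] ?? '')) {
//       unlink($targetDir.$fileName);  // received bytes differ from what was sent
//       http_response_code(400);
//   }
```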

HTTP PUT Request: Passing Parameters with File

After numerous tests uploading files through HTTP POST requests, it looks like HTTP PUT requests are the most suitable for uploading very large files (1GB+).
The simple code listed below, which I have tested for an HTTP PUT file upload, works well:
JavaScript:
var req = createRequest();
req.open("PUT", "PHP/filePutLoad.php");
req.setRequestHeader("Content-type", "text/plain");
req.onload = function (event)
{
    console.log(event.target.responseText);
};
req.send(aUploadedFile.file_object);
PHP:
include 'ChromePhp.php';
require_once 'mysqlConnect.php';
ini_set('max_execution_time', 0);
ChromePHP::log( '$_PUT :' . print_r($_PUT));
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
However, I'm having difficulty delivering arguments and variables along with the file uploaded from JavaScript to PHP. For example, I need to pass the target upload folder where the new data should be stored, the ID of the uploader, etc.
Is there a way to combine HTTP PUT Request with HTTP POST to submit arguments?
What are my options if I wish to deliver parameters from JavaScript to PHP along HTTP PUT file upload?
Thank you.
Using PUT, it also works when you append the parameters to the query string. I'm still looking for a better way, but this is the workaround I'm using currently:
curl -X PUT "http://www.my-service.com/myservice?param1=val1" --data-binary @file.txt
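On the PHP side, query-string parameters are populated in $_GET regardless of the HTTP method, so they can be read alongside the raw PUT body; a sketch (the parameter names are made up):

```php
<?php
// Query-string parameters arrive in $_GET even for PUT requests, so a PUT
// to /filePutLoad.php?folder=backups&uploader=42 still sees them. This
// helper parses a raw query string the same way; names are illustrative.
function put_params(string $queryString): array
{
    parse_str($queryString, $params);
    return $params;
}

// Endpoint usage (sketch):
//   $params   = put_params($_SERVER['QUERY_STRING'] ?? '');
//   $folder   = $params['folder']   ?? 'uploads';
//   $uploader = $params['uploader'] ?? 'anonymous';
//   $body     = fopen('php://input', 'r');  // the file contents, streamed separately
```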

Streaming a file from FTP and letting the user download it at the same time

I'm creating a backup system where backups are generated automatically, so I will be storing them on a different server. When I want to download one, I want the link to be a one-time link; that isn't hard to make. However, to make this secure I was thinking about storing the files so they're not accessible via HTTP on the other server.
So what I would do is connect via FTP, download the file to the main server, present it for download, and then delete it. However, this will take a long time if the backup is large. Is there a way to stream it from FTP, without revealing the actual location to the person downloading, and without storing the file on the server?
Here is a very basic example using cURL. It specifies a write callback which cURL invokes each time a chunk of data arrives from the FTP server, and echoes that data to the browser, serving a simultaneous download to the client while the FTP transfer with the backup server is taking place.
This is a very basic example which you can expand on.
<?php
// ftp URL to file
$url = 'ftp://ftp.mozilla.org/pub/firefox/nightly/latest-firefox-3.6.x/firefox-3.6.29pre.en-US.linux-i686.tar.bz2';
// init curl session with FTP address
$ch = curl_init($url);
// specify a callback function for handling data as it is received
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'writeCallback');
// send download headers for client
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="backup.tar.bz2"');
// execute request; our write callback will be called as data arrives
curl_exec($ch);
curl_close($ch);

// write callback: takes the curl handle and the chunk of data just received,
// and must return the number of bytes it handled
function writeCallback($curl, $data)
{
    // echo the chunk to the client, contributing to their download
    echo $data;
    // returning the byte count tells cURL to continue the transfer
    return strlen($data);
}
See curl_setopt() for more info on the CURLOPT_WRITEFUNCTION option.
