Form boundaries and writing php://input to a file in PHP

So I want users to be able to upload big files without having to worry about the post_max_size limit.
The alternative is using PUT and sending the file as raw data.
When using jQuery I can do this:
var data = new FormData();
jQuery.each($('#file_upload')[0].files, function(i, file) {
    data.append('file-' + i, file);
});
$.ajax({
    url: 'upload.php?filename=test.pdf',
    data: data,
    cache: false,
    contentType: false,
    processData: false,
    type: 'PUT',
});
In PHP I can do this:
$f = fopen($_GET['filename'], "w");
$s = fopen("php://input", "r");
while ($kb = fread($s, 1024)) {
    fwrite($f, $kb);
}
fclose($f);
fclose($s);
header("HTTP/1.1 201 Created");
I am not doing:
$client_data = file_get_contents("php://input");
since putting the whole file into a variable would exhaust memory when uploading huge files.
The thing I cannot figure out is how to write the file data without the form boundaries.
Right now it writes at the top of the file something like this:
------WebKitFormBoundaryVz0ZGHLGxBOCUVQG
Content-Disposition: form-data; name="file-0"; filename="somename.pdf"
Content-Type: application/pdf
and at the bottom something like this:
------WebKitFormBoundaryVz0ZGHLGxBOCUVQG--
So I need to parse the data, but for that I would have to read the whole stream into memory, which I want to avoid with large video files.
I did read something about maybe using a php://temp stream, but no luck with that yet.
How can I write just the content to a file, without the boundary header? And without first pumping all the data into a variable?

Maybe use a combination of fgets, to stop reading at a newline, and a check for the boundaries:
while ($kb = fgets($s, 1024)) {
    if (strpos($kb, '------') === false) { // or !== 0 to only match at the start of the line
        fwrite($f, $kb);
    }
}
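Filtering out only the boundary lines still leaves the Content-Disposition/Content-Type headers and the blank line after them in the output, and the CRLF that precedes the closing boundary also ends up in the file. A rough sketch of a fuller variant, assuming a single file part and that the body ends with CRLF, the closing boundary and a final CRLF (which is what WebKit sends here):

$s = fopen("php://input", "r");
$f = fopen($_GET['filename'], "w");

// The first line of the body is the boundary marker itself
$boundary = rtrim(fgets($s), "\r\n");

// Skip the part headers up to the blank line
while (($line = fgets($s)) !== false && rtrim($line, "\r\n") !== '') {
    // Content-Disposition, Content-Type, ... are discarded here
}

// Stream the rest of the body to disk in small blocks (binary-safe)
while ($kb = fread($s, 8192)) {
    fwrite($f, $kb);
}

// What we copied still ends with "\r\n" . $boundary . "--\r\n", so chop that off
ftruncate($f, ftell($f) - strlen("\r\n" . $boundary . "--\r\n"));

fclose($f);
fclose($s);

This keeps memory use flat, but it only handles the single-part case; for anything more general a proper multipart parser is the safer route.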

You can use this (there are many like it). It supports chunked uploads, which means you won't hit any post/file max sizes as long as each upload chunk is smaller than the post max size.
It also includes the PHP code you would need on the server side.
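For reference, the server side of a chunked upload usually boils down to appending each chunk to the same target file. A minimal sketch, where the file, name, chunk and chunks parameters are assumptions about what the client sends rather than any particular library's API:

// Hypothetical parameters sent by the chunking client
$name   = basename($_REQUEST['name']);
$chunk  = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
$chunks = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 1;

$target = '/path/to/uploads/' . $name . '.part';

// Append this chunk to the partial file (truncate on the first chunk)
$out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Last chunk received: give the file its final name
if ($chunk === $chunks - 1) {
    rename($target, '/path/to/uploads/' . $name);
}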

There is no need to reinvent the wheel. Just use POST and raise PHP's limits; those limits can also be set on a per-directory / per-host basis using .htaccess or your Apache configuration:
php_value upload_max_filesize 10G
php_value post_max_size 10G
It is also a good idea to adjust other limits, like max_input_time.
Don't forget to relocate the received file using move_uploaded_file to avoid any extra work.
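A minimal handler for that plain POST upload might look something like this (the file field name and the target directory are assumptions):

// upload.php -- plain multipart POST handler (sketch)
$uploadDir = '/path/to/uploads/';

if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $target = $uploadDir . basename($_FILES['file']['name']);
    // move_uploaded_file also checks the file really came from this request
    if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
        header('HTTP/1.1 201 Created');
    } else {
        header('HTTP/1.1 500 Internal Server Error');
    }
} else {
    header('HTTP/1.1 400 Bad Request');
}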

Related

Upload big videos from a form to server - PHP

I have to set up a form allowing the upload of videos of roughly 20 GB each.
This processing must be done in PHP.
I did a test with Plupload, but it does not work very well beyond 100 MB: the file is uploaded, but its data is unusable (see screenshot).
Do you have any recommendations/best practices?
Thanks.
Just found the solution: in the .js file that handles the upload you just need to add the multipart option and set its value to false :)
This means the chunk is sent as a binary stream (i.e. multipart: false) and not as multipart/form-data (the default, i.e. multipart: true).
If needed, here is an example of how I handled video file uploads with Plupload:
https://github.com/Rapkalin/bigupload
Hope this helps :)
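On the PHP side, a non-multipart chunk can then be appended straight from php://input. A rough sketch, assuming the client passes the target file name and a chunk index as request parameters (the parameter names here are made up):

// Hypothetical parameters passed alongside the raw chunk body
$name  = basename($_GET['name']);
$chunk = isset($_GET['chunk']) ? (int) $_GET['chunk'] : 0;

$target = '/path/to/uploads/' . $name;

// php://input is the raw chunk data because multipart is disabled
$in  = fopen('php://input', 'rb');
$out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
stream_copy_to_stream($in, $out);
fclose($out);
fclose($in);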
First you need to make some changes in your php.ini configuration: look for upload_max_filesize and post_max_size, and then for max_execution_time and max_input_time. As I can see your file is missing an extension; the easiest and fastest way to handle this is:
$strpos = strrpos($file, '.'); // position of the last dot
if (!$strpos) return false;
$name = substr($file, 0, $strpos);
$ext = substr($file, $strpos + 1);
Now you can encode the name and append the extension later. It would also help if you pasted your script or a sample so we can take a look at it.
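pathinfo() does the same split in one call, if that is easier (a small sketch; $file is assumed to hold the client-supplied file name):

$info = pathinfo($file);                                      // e.g. "holiday.video.mp4"
$name = $info['filename'];                                    // "holiday.video"
$ext  = isset($info['extension']) ? $info['extension'] : '';  // "mp4"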

PHP - Read content of BLOB file

I uploaded a *.mp4 video and converted it to a Blob type using JavaScript's new Blob() function.
I send this Blob to PHP, using AJAX.
Now I want to read the content of this Blob inside PHP. In other words, I need the binary data of this Blob and want to store it in a PHP variable.
However, it seems impossible to read the Blob with PHP, since fread, fopen and file_get_contents are all failing! When I open the Blob URL in my browser, the video plays fine.
My question is, how do I get the binary data of this Blob with PHP, without installing extensions/libraries?
var_dump($_FILES['video']);
Array
(
[name] => blob
[type] => video/mp4
[tmp_name] => C:\xampp\tmp\php43B8.tmp
[size] => 58
[error] => 0
)
// Try 1
if ($stream = fopen($_FILES['video']['tmp_name'], 'r')) {
echo stream_get_contents($stream, filesize($_FILES['video']['tmp_name']));
fclose($stream);
}
// Try 2
if ($fp = fopen($_FILES['video']['tmp_name'], 'r')) {
echo fread($fp, filesize($_FILES['video']['tmp_name']));
fclose($fp);
}
// Try 3
file_get_contents($_FILES['video']['tmp_name'])
Result is always: blob:http://localhost/d53e40bd-686b-46c8-9e81-94789351466f
I know this is a late reply and you may have fixed this issue already, but here's what I did for my blob processing. Since the blob is already uploaded to the temp file location, we first move the uploaded file somewhere using PHP's move_uploaded_file function. From there, you can read its contents and then delete the file from your server when processing is complete.
Here's a very basic example using an image file:
// Ajax data for posting the blob, remember to set process data to false
formData.append('code', 'uploadedImage');
formData.append('image', blob);
var url = "script.upload.php";
$.ajax({
type: 'POST',
url: url,
data: formData,
processData: false,
contentType: false,
success: function (data) {}
});
// PHP Code for uploading the file to the server
if ($_POST['code'] == "uploadedImage") {
$data = $_FILES['image']['tmp_name'];
move_uploaded_file($data, $_SERVER['DOCUMENT_ROOT'] . "/img/" . time() . ".png");
}
We now have a file like 1542839470.png in our base directory / images location. From here, the image can be read, moved, streamed, stored, whatever. I am also using a handy plugin called Croppie to resize, rotate and crop my images, for anyone looking for a neat tool. By processing them first, I avoid having to do anything with them afterwards. As you are doing video this won't apply in your specific case, but it's a handy plugin for people doing similar stuff (like me).
Once you are done with the file, whether you send it by curl to the YouTube API or do whatever other processing on it, you can simply delete it. Cache the file name and location from above (simple to do by setting a location $var) and feed that into this command:
unlink("path_to_file_location"); //delete it
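Putting the pieces together for the original question (getting the binary data into a PHP variable), a short sketch, with the stored path treated as a placeholder:

$path = $_SERVER['DOCUMENT_ROOT'] . "/img/" . time() . ".png"; // placeholder target
move_uploaded_file($_FILES['image']['tmp_name'], $path);

$binary = file_get_contents($path); // the blob's raw bytes, now in a PHP variable

// ... process $binary ...

unlink($path); // clean up when done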

(PHP) Unzip and rename CSV files?

I want to download different feeds from some publishers. The trouble is that they are, first of all, zipped as .gz and, second, not in the right format. You can download one of the feeds and check it out. They do not have any file extension, so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I know how to rename them, but how do I unzip them?
I already searched for it and found this one:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? - Would be very grateful!
Greetings!
This works for me on Windows 10 + PHP 7.1.4.
The following code has the same effect:
ob_start();
readgzfile($file_name);
file_put_contents($out_file_name, ob_get_contents());
ob_end_clean();
Or you can use the gzip command-line tool to decompress the file and then work with the result; see PHP's program execution functions.
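If the shell route is easier, something along these lines could work (a sketch; the file names are placeholders and it assumes the gzip binary is on the PATH):

$file_name = '2013-07-16.dump.gz'; // placeholder input
$out_file_name = str_replace('.gz', '', $file_name) . '.csv';

// -d decompress, -c write to stdout so the original .gz is kept
$cmd = 'gzip -d -c ' . escapeshellarg($file_name) . ' > ' . escapeshellarg($out_file_name);
exec($cmd, $output, $exitCode);

if ($exitCode !== 0) {
    // gzip failed (corrupt archive, missing binary, ...)
}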

How to save a .gif in Classic ASP - converting from PHP script - options for fopen, fwrite

I have this very tiny PHP script which does exactly what I need - however, I need to convert it to Classic ASP. I have googled but have been unable to find information on anything similar to 'fopen' or 'fwrite' in Classic ASP.
My original PHP Script is:
<?php
$responseImg = file_get_contents("http://url.to/the/api/thatreturnsagif");
$fp = fopen("/my/server/public_html/mydirectory/samepicture.gif", "w");
fwrite($fp, $responseImg);
fclose($fp);
?>
Really short, really simple and does just what I need. It makes a call to an API that returns a gif. I save the gif on my local server and a cron-job runs the script every so often to keep the gif up to date.
I'm moving to an IIS server which does not have php on it, so classic ASP will have to suffice.
I've gotten this far:
<%
url = "http://url.to/the/api/thatreturnsagif"
set xmlhttp = server.CreateObject("Msxml2.ServerXMLHTTP.6.0")
xmlhttp.open "GET", url, false
xmlhttp.send ""
Response.write xmlhttp.responseText
set xmlhttp = nothing
%>
I was able to put that together from some other stuff online.
I just need to figure out how to save the gif that will be returned on the server - then I will setup scheduled tasks to run it on an interval.
Any help appreciated.
xmlhttp (an instance of IServerXMLHTTPRequest) has a responseBody property that returns the raw bytes; use it instead of responseText.
Then write it into a stream and save it as a file.
url = "http://url.to/the/api/thatreturnsagif"
set xmlhttp = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")
xmlhttp.open "GET", url, false
xmlhttp.send
With Server.CreateObject("Adodb.Stream")
.Type = 1 '1 for binary stream
.Open
.Write xmlhttp.responseBody
.SaveToFile Server.Mappath("\mydirectory\samepicture.gif"), 2 ' 2 for overwrite
.Close
End With
set xmlhttp = nothing
EDITED
First of all, installing PHP on IIS isn't that difficult; it might be a better option for you than rewriting everything in Classic ASP.
Defining Response.ContentType is important. Other than that, I've never tried this with an image file before, so I'm guessing a bit here.
Edit: I've tried this and it works. Save the code below as a separate file and give it a name like mygif.asp:
<% url = "http://url.to/the/api/thatreturnsagif"
Set mygif = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")
mygif.open "GET",url, false
mygif.send
Response.ContentType = "image/gif"
Response.binarywrite mygif.ResponseBody
set mygif=nothing %>
Then you can embed it with an img tag just as you would embed a flat image.
<img src="mygif.asp">

PHP input stream returning 0 data - Laravel

I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are chunked in 2 MB pieces and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side, as I'm unable to write the contents of the upload stream to disk. I always end up with a 0-byte file. The idea is to append the chunks to the already-uploaded file as they arrive.
The project is built on Laravel 4, and I'm not sure if Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean that by the time my controller tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
$filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
AFAIK fopen returns a file pointer and not the stream's contents, so it is probably not good as a parameter for file_put_contents.
Can you try this workaround instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this was simple: I needed to turn off multipart/form-data and use file_get_contents("php://input") to read the contents, passing the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
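If the individual chunks can themselves get large, a streaming variant of the same idea avoids holding an entire chunk in memory (a sketch, not Laravel-specific):

$in  = fopen('php://input', 'rb');
$out = fopen($filename, 'ab'); // append each incoming chunk
stream_copy_to_stream($in, $out);
fclose($out);
fclose($in);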
