How to upload a file byte by byte in PHP

I have a form with an input type="file" element that accepts a file. When I submit it, the file is passed on to the server-side PHP script.
How do I write the temporary file stored in $_FILES["file"]["tmp_name"] byte by byte?
The code below does not work; it seems to write only at the end of the request. If, for example, the connection is lost partway through, I would like to see that e.g. 40% was completed so that I can resume the upload.
Any pointers?
$target_path = "uploads/";
$target_path = $target_path . basename($_FILES['file']['name']);
if (isset($_FILES['file']['tmp_name']) && is_uploaded_file($_FILES['file']['tmp_name'])) {
    // Open the destination file for writing
    $out = fopen($target_path, "wb");
    if ($out) {
        // Read the uploaded temp file and copy it to the destination in 4 KB chunks
        $in = fopen($_FILES['file']['tmp_name'], "rb");
        if ($in) {
            while ($buff = fread($in, 4096)) {
                fwrite($out, $buff);
            }
            fclose($in);
        }
        fclose($out);
    }
}

PHP does not hand over control to the upload's target script until AFTER the upload is complete (or has failed). You don't have to do anything to 'accept' the file - PHP and the web server take care of writing it to the path given in the ['tmp_name'] entry of the $_FILES array.
If you're trying to resume failed uploads, you'll need a considerably more complicated script.
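For comparison, here is a minimal sketch of the normal flow described above, assuming a form field named file and a writable uploads/ directory (both assumptions, not taken from the question). By the time the script runs, the whole file already sits in tmp_name, so there is nothing to copy byte by byte:
<?php
// Minimal sketch of the normal (non-resumable) flow.
// Assumes the form field is named "file" and uploads/ exists and is writable.
if (isset($_FILES['file']) && is_uploaded_file($_FILES['file']['tmp_name'])) {
    // PHP has already written the complete upload to tmp_name; just move it.
    $target = 'uploads/' . basename($_FILES['file']['name']);
    if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
        echo 'Upload complete: ' . htmlspecialchars($target);
    } else {
        echo 'Could not move the uploaded file.';
    }
} else {
    echo 'No upload received.';
}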

Related

Resumable Upload with ng-file-upload + PHP (Merging parts)

I'm trying to use ng-file-upload in resumable-upload mode to split big files into chunks and merge them once uploaded. I have used ng-file-upload in many projects, but it's my first time using it for files this large.
My issue is that I don't know how to write the server-side part in PHP. So far I only manage to upload the chunks under different names, but I can't merge them.
Could anybody post an example of server-side PHP code that makes this feature work?
This is what I have done up to this point:
AngularJS
$scope.uploadMediaFile = function (file) {
    if (file) {
        Upload.upload({
            ignoreLoadingBar: true,
            url: 'app/api/upload/mediaFile.php',
            resumeChunkSize: '1MB',
            file: file
        }).then(function (response) {
            if (response.data.success) {
                $scope.post.mediaFile = response.data.filename;
                $scope.post.duration = response.data.duration;
            } else {
                console.error(response.data.error);
            }
        }, null, function (evt) {
            // Progress notification callback
            file.progress = Math.min(100, parseInt(100.0 * evt.loaded / evt.total));
        });
    }
};
mediaFile.php
$filename = $_FILES['file']['name'];
$file_tmp = $_FILES['file']['tmp_name'];
$file_ext = pathinfo($filename, PATHINFO_EXTENSION);
$file_des = $_SERVER['DOCUMENT_ROOT'] . '/storage/content/temp/';
if (!file_exists($file_des)) mkdir($file_des);
// Give each uploaded chunk a different name
$new_filename = uniqid() . "." . pathinfo($filename, PATHINFO_EXTENSION);
move_uploaded_file($file_tmp, $file_des . $new_filename);
So far, all I get is many pieces of the same file under different names.
In case someone has a similar question, here is the solution I ended up with.
<?php
// Incoming file chunk
$filename = $_FILES['file']['name'];
$file_tmp = $_FILES['file']['tmp_name'];
// Temporary directory for the chunks
$file_des = $_SERVER['DOCUMENT_ROOT'] . '/storage/content/temp/';
// Create the temp dir if it does not exist yet
if (!file_exists($file_des)) mkdir($file_des);
// The first chunk keeps the original name of the uploaded file,
// so if that name already exists in the temp dir, store the
// remaining chunks under unique names and append them to it.
if (file_exists($file_des . $filename)) {
    $new_name = uniqid() . "." . pathinfo($filename, PATHINFO_EXTENSION);
    move_uploaded_file($file_tmp, $file_des . $new_name);
    // Append this chunk to the base file.
    $handle = fopen($file_des . $new_name, 'rb');
    $buff = fread($handle, filesize($file_des . $new_name));
    fclose($handle);
    $final = fopen($file_des . $filename, 'ab');
    $write = fwrite($final, $buff);
    fclose($final);
    // Delete the chunk
    unlink($file_des . $new_name);
} else {
    /* MAKE SURE WE DELETE THE CONTENT OF THE DESTINATION FOLDER FIRST,
       OTHERWISE CHUNKS WILL BE APPENDED FOREVER
       IF YOU UPLOAD A FILE WITH THE EXACT SAME NAME AGAIN.
       CAREFUL: YOU MAY PREFER TO DELETE ONLY THE FILE
       INSTEAD OF THE FOLDER'S CONTENT, IN CASE
       YOUR FOLDER CONTAINS MORE THAN ONE FILE. */
    $files_to_delete = glob($file_des . "*"); // get all file names
    foreach ($files_to_delete as $file) {     // iterate over the files
        if (is_file($file)) {
            unlink($file);                    // delete the file
        }
    }
    // The first chunk keeps the original file name.
    move_uploaded_file($file_tmp, $file_des . $filename);
}
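A variation on the same idea, sketched under the same assumptions (field name file, one chunk per request, chunks arriving in order): the chunk can be appended straight from the uploaded temp file with stream_copy_to_stream(), which avoids reading the whole chunk into memory and the intermediate renamed copy. The response fields shown are only an illustration of what the AngularJS code above expects.
<?php
// Sketch only: same assumptions as above (field name "file",
// one chunk per request, chunks arriving in order).
$filename = basename($_FILES['file']['name']);
$file_tmp = $_FILES['file']['tmp_name'];
$file_des = $_SERVER['DOCUMENT_ROOT'] . '/storage/content/temp/';
if (!file_exists($file_des)) mkdir($file_des, 0775, true);
// Append this chunk directly to the target file ('ab' creates it if missing).
// The same caveat as above applies: clean up stale files with the same name
// before the first chunk, otherwise chunks keep being appended.
$in  = fopen($file_tmp, 'rb');
$out = fopen($file_des . $filename, 'ab');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
echo json_encode(['success' => true, 'filename' => $filename]);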

Read single file from ZIP within a directory

I have a ZIP file (with a VPK extension) and I wish to extract a file that sits inside a directory within the zip. The file uploads correctly. Here is my current code, but unfortunately it throws an error.
$hbid = substr(md5(time()), 0, 16);
mkdir("pkg/" . $hbid, 0700);
mkdir("pkg_image/" . $hbid, 0700);
$target_dir = "pkg/" . $hbid . "/";
$target_file = $target_dir . basename($_FILES["fileToUpload"]["name"]);
...
FILE UPLOADING CODE HERE
...
// ERROR is thrown on the next line:
$handle = fopen('zip://./' . $target_file . '#/sce_sys/icon0.png', 'r');
$result = '';
if ($handle) {
    while (!feof($handle)) {
        $result .= fread($handle, 8192);
    }
    fclose($handle);
    $file = fopen("pkg_image/" . $hbid . "/icon0.png", 'w');
    fwrite($file, $result);
    fclose($file);
}
The error message is this:
fopen(zip://./pkg/0152cc9c0c52da70/4rows_1_1.vpk#/sce_sys/icon0.png): failed to open stream: operation failed
I've never extracted a file this way before, but looking at other answers on the topic, they all extract a file from the root of the zip, whereas the file I need is in a subdirectory of the zip. I'm not entirely sure what I am doing wrong.
Thanks.
Figured it out. The correction is to replace #/sce_sys with #sce_sys; the leading / is not required before a path inside the archive.
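A minimal sketch of the corrected read, assuming $target_file still points at the uploaded .vpk and that sce_sys/icon0.png exists inside it; ZipArchive::getStream() is shown as an alternative to the stream wrapper because it reports errors more clearly. The paths here are only illustrative.
<?php
// Sketch: read one file out of the archive. $target_file is the uploaded .vpk
// (path shown is illustrative); the icon is assumed to live at sce_sys/icon0.png.
$target_file = 'pkg/0152cc9c0c52da70/4rows_1_1.vpk';

// Option 1: the corrected zip:// wrapper (no leading slash after the #).
// A relative archive path resolves against the current working directory.
$icon = file_get_contents('zip://' . $target_file . '#sce_sys/icon0.png');

// Option 2: ZipArchive, which gives clearer error reporting.
$zip = new ZipArchive();
if ($zip->open($target_file) === true) {
    $stream = $zip->getStream('sce_sys/icon0.png');
    if ($stream) {
        $icon = stream_get_contents($stream);
        fclose($stream);
    }
    $zip->close();
}

if (!empty($icon)) {
    file_put_contents('pkg_image/icon0.png', $icon);
}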

Write and read from the same file - PHP

I am trying to write to a file and then read the data back from the same file, but sometimes the reading starts before the writing has finished. How can I solve this and make sure the write finishes before reading?
// writing to the file
$string = <12 kb of specific data which I need>;
$filename .= "/ttc/";
$filename .= "datasave.html";
if ($fp = fopen($filename, 'w')) {
    fwrite($fp, $string);
    fclose($fp);
}
// reading from the same file
$handle = fopen($filename, "r");
$datatnc = fread($handle, filesize($filename));
$datatnc = addslashes($datatnc);
fclose($handle);
The reason it does not work is that after you finish writing a string to the file, the file pointer points to the end of the file, so when you later try to read with the same file pointer there is nothing left to read. All you have to do is rewind the pointer to the beginning of the file. Here is an example:
<?php
$fileName = 'test_file';
$savePath = "tmp/tests/" . $fileName;
// Create the file pointer handle ('r+' allows both reading and writing)
$fp = fopen($savePath, 'r+');
fwrite($fp, "Writing and Reading with same fopen handle!");
// Now rewind the file pointer to start reading from the beginning
rewind($fp);
// This will output "Writing and Reading with same fopen handle!"
echo fread($fp, filesize($savePath));
fclose($fp);
?>
Here is more info on the rewind() method http://php.net/manual/en/function.rewind.php
I already mentioned the URL where I found the solution and implemented the same approach. In case you want the text from that link copied here, this is it:
$file = fopen("test.txt", "w+");
// Acquire an exclusive lock before writing
if (flock($file, LOCK_EX)) {
    fwrite($file, "Write something");
    // Release the lock
    flock($file, LOCK_UN);
} else {
    echo "Error locking file!";
}
fclose($file);
Use fclose after writing to close the file pointer, then fopen again to reopen the file for reading.
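Combining the suggestions above, here is a minimal sketch (the file name and content are made up for illustration): write under an exclusive lock, close the handle, then reopen the file with a shared lock to read it, so the read never starts on a half-written file.
<?php
// Sketch: write with an exclusive lock, then reopen and read with a shared lock.
// The file name and content are illustrative only.
$filename = '/ttc/datasave.html';
$string   = '... 12 kB of data ...';
// Write phase
$fp = fopen($filename, 'w');
if ($fp) {
    if (flock($fp, LOCK_EX)) {
        fwrite($fp, $string);
        fflush($fp);          // flush the buffers before releasing the lock
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}
// Read phase: the write above has finished and its handle is closed
clearstatcache();             // make sure filesize() is not served from cache
$handle = fopen($filename, 'r');
if ($handle) {
    if (flock($handle, LOCK_SH)) {
        $datatnc = fread($handle, filesize($filename));
        flock($handle, LOCK_UN);
    }
    fclose($handle);
}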

getimagesize() limiting file size for remote URL

I could use getimagesize() to validate an image, but the problem is: what if a mischievous user puts in a link to a 10 GB random file? It would whack my production server's bandwidth. How do I limit the file size getimagesize() is getting (e.g. 5 MB max image size)?
PS: I did research before asking.
You can download the file yourself, imposing whatever maximum size you wish to download:
function mygetimagesize($url, $max_size = -1)
{
    // Create a temporary file to store data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }
    // Open input and output streams
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        unlink($tmpfname);
        return false;
    }
    // Copy at most $max_size bytes (-1 means no limit)
    stream_copy_to_stream($in, $out, $max_size);
    // Close input and output streams
    fclose($in);
    fclose($out);
    // Retrieve image information
    $info = getimagesize($tmpfname);
    // Get rid of the temporary file
    unlink($tmpfname);
    return $info;
}
You don't want to do something like getimagesize('http://example.com') to begin with, since that downloads the image once, checks the size, and then discards the downloaded image data. That's a real waste of bandwidth.
So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read it little by little while writing it to a temporary file, and keep count of how much you have read. Once you cross 5MB and are still not finished reading, stop and reject the image.
You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it, since it can be spoofed or omitted.
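As a pre-check only (subject to the caveat above that the header can be spoofed or missing), something like this sketch could reject obviously oversized URLs before any download happens; the helper name, URL and limit are made up for illustration, and it falls back to the size-limited download when no usable header is present.
// Sketch: pre-check the Content-Length header before downloading.
// The header can be spoofed or omitted, so this only weeds out the obvious cases.
function looks_small_enough($url, $max_bytes = 5242880) // 5 MB
{
    $headers = @get_headers($url, true);
    if ($headers === false || !isset($headers['Content-Length'])) {
        return true; // no usable header, fall back to the size-limited download
    }
    // Servers may return an array of values if there were redirects
    $length = is_array($headers['Content-Length'])
        ? (int) end($headers['Content-Length'])
        : (int) $headers['Content-Length'];
    return $length <= $max_bytes;
}

// Usage together with the mygetimagesize() helper above (illustrative URL)
$url = 'http://example.com/image.jpg';
if (looks_small_enough($url)) {
    $info = mygetimagesize($url, 5242880);
}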
Here is an example; you will need to make some changes to fit your requirements.
function getimagesize_limit($url, $limit)
{
    // unique_id() and $phpbb_root_path come from phpBB, where this function
    // was originally written; adapt them to your environment.
    global $phpbb_root_path;
    $tmpfilename = tempnam($phpbb_root_path . 'store/', unique_id() . '-');
    $fp = fopen($url, 'r');
    if (!$fp) return false;
    $tmpfile = fopen($tmpfilename, 'w');
    $size = 0;
    // Read at most $limit bytes into the temporary file
    while (!feof($fp) && $size < $limit) {
        $content = fread($fp, 8192);
        $size += strlen($content);
        fwrite($tmpfile, $content);
    }
    fclose($fp);
    fclose($tmpfile);
    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}

getimagesize() on stream instead of string

I'm using Valum's file uploader to upload images with AJAX. This script submits the file to my server in a way that I don't fully understand, so it's probably best to explain by showing my server-side code:
$pathToFile = $path . $filename;
// Here I get a "file not found" error, because the file is not yet at this address
getimagesize($pathToFile);
$input = fopen('php://input', 'r');
$temp = tmpfile();
$realSize = stream_copy_to_stream($input, $temp);
// Here I get a "string expected, resource given" error
getimagesize($input);
fclose($input);
$target = fopen($pathToFile, 'w');
fseek($temp, 0, SEEK_SET);
// Here I get a "file not found" error, because the image is not at $target yet
getimagesize($pathToFile);
stream_copy_to_stream($temp, $target);
fclose($target);
// Here it works, because the image is at the desired location, so I can access it
// via $pathToFile. However, the (potentially) malicious file is already on my server.
getimagesize($pathToFile);
The problem is that I want to perform some file validation here using getimagesize(). getimagesize() only accepts a string (a path), and I only have resources available, which results in the error: getimagesize() expects a string, resource given.
It does work when I call getimagesize($pathToFile) at the end of the script, but by then the image has already been uploaded and the damage could already have been done. Doing the check afterwards and then maybe deleting the file seems like bad practice to me.
The only thing in $_REQUEST is the filename, which I use for $pathToFile. $_FILES is empty.
How can I perform file validation on streams?
EDIT:
The solution is to first place the file in a temporary directory and validate that temporary copy before copying it to the destination directory.
// Store the file in tmp dir, to validate it before storing it in destination dir
$input = fopen('php://input', 'r');
$tmpPath = tempnam(sys_get_temp_dir(), 'upl'); // upl is 3-letter prefix for upload
$tmpStream = fopen($tmpPath, 'w'); // For writing it to tmp dir
stream_copy_to_stream($input, $tmpStream);
fclose($input);
fclose($tmpStream);
// Store the file in destination dir, after validation
$pathToFile = $path . $filename;
$destination = fopen($pathToFile, 'w');
$tmpStream = fopen($tmpPath, 'r'); // For reading it from tmp dir
stream_copy_to_stream($tmpStream, $destination);
fclose($destination);
fclose($tmpStream);
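For completeness, a minimal sketch of the validation step the comments above refer to, assuming getimagesize() on the temporary copy is the only check needed; the file is only copied to the destination on success, and the error response shown is illustrative rather than taken from the original code.
<?php
// Sketch: validate the temporary copy before it reaches the destination dir.
// Assumes $tmpPath, $path and $filename are set as in the snippet above.
$imageInfo = getimagesize($tmpPath);
if ($imageInfo === false) {
    // Not a valid image: discard the temporary file and stop here.
    unlink($tmpPath);
    http_response_code(400);
    echo json_encode(['success' => false, 'error' => 'Invalid image file']);
    exit;
}
// The file is a valid image, so copy it to the destination directory.
$pathToFile  = $path . $filename;
$destination = fopen($pathToFile, 'w');
$tmpStream   = fopen($tmpPath, 'r');
stream_copy_to_stream($tmpStream, $destination);
fclose($destination);
fclose($tmpStream);
unlink($tmpPath); // clean up the temporary copy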
PHP 5.4 now supports getimagesizefromstring
See the docs:
http://php.net/manual/pt_BR/function.getimagesizefromstring.php
You could try:
$input = fopen('php://input', 'r');
$string = stream_get_contents($input);
fclose($input);
getimagesizefromstring($string);
Instead of using tmpfile() you could make use of tempnam() and sys_get_temp_dir() to create a temporary path.
Then use fopen() to get a handle to it and copy over the stream.
That way you've got both a path string and a handle for the operations you need to do.
// Copy PHP's input stream data into a temporary file
$inputStream = fopen('php://input', 'r');
$tempDir = sys_get_temp_dir();
$tempPrefix = 'upload'; // tempnam() takes a filename prefix, not an extension
$tempFile = tempnam($tempDir, $tempPrefix);
$tempStream = fopen($tempFile, "w");
$realSize = stream_copy_to_stream($inputStream, $tempStream);
fclose($inputStream);
fclose($tempStream);
getimagesize($tempFile);
