I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are chunked in 2MB pieces and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side, as I'm unable to write the contents of the upload stream to disk. The system always ends up with a 0 byte file created. The idea is to append the chunks to the already uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure if Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean that by the time my controller actually tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
    $filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
    file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
AFAIK fopen returns a pointer to a file, not a stream, so it is probably not a good parameter for file_put_contents.
Can you try this workaround instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to turn off multipart/form-data and use file_get_contents("php://input") to read the contents and pass the result to file_put_contents(), like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
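For context, a minimal sketch of what the controller might look like with that change; it assumes the client sends the raw chunk as the request body and passes the target filename in a query parameter called name (both of those details are assumptions for illustration, not part of the original answer):
public function upload()
{
    // Assumed: the client passes the original filename as ?name=...
    $name     = basename(Input::get('name'));
    $filename = Config::get('tms.upload_path') . $name;
    // Read the raw request body once and append it to the partially uploaded file.
    file_put_contents($filename, file_get_contents('php://input'), FILE_APPEND);
    return Response::json(array('received' => filesize($filename)));
}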
Related
We are trying to read our website as a file in PHP, and the fread function can only read 255 chars. Can anyone help us with this problem?
Code:
$filename = "oururl";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Also, when we try to read the file through file_get_contents, we get the same result of only 255 chars.
You're trying to get the size of a remote source/file specified by a URL. The filesize function is only able to get the size of a file if the URL wrapper supports/provides file statistics. In your case, http or https would be used as the URL wrapper:
$filename = "http://www.someurl.com";
But the stateless HTTP/HTTPS protocols don't provide file statistics.
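One way around this, sketched below, is to read the remote resource until EOF instead of passing filesize() to fread (the URL is a placeholder):
$filename = "http://www.someurl.com";
$handle   = fopen($filename, "r");
$contents = "";
while (!feof($handle)) {
    // Read in 8 KB blocks until the end of the response.
    $contents .= fread($handle, 8192);
}
fclose($handle);
// Or simply let PHP read the whole response in one call:
// $contents = file_get_contents($filename);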
I am having trouble saving a large file (~200MB and above) sent from a desktop application made in the .NET Framework and received in PHP. The desktop application sends the file through .NET's writeStream function.
Problems:
1) I am getting the data in PHP, but the final file size exceeds the original file size. For example, if the file size is 36MB, the file PHP saves will be 55MB.
2) The file also gets corrupted after being saved on the server.
Requirements:
1) I need to save a large file sent from .NET to PHP with any method that works.
2) Any working example would be highly appreciated.
Note:
I am making a Dropbox-like application. I hope this gives you a general idea of what kind of application I need to make.
PHP Code
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");

/* Open a file for writing */
$fp = fopen($targetDir.$fileName, "a");

/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}

/* Close the streams */
fclose($fp);
fclose($putdata);
Please advise me on the most optimal way to save large files from PHP stdin.
An iOS developer sends large video content to my server, and I have to store it in files.
I read the stdin stream with the video data and write it to a file, for example like this:
$handle = fopen("php://input", "rb");
while (!feof($handle)) {
$http_raw_post_data .= fread($handle, 8192);
}
Which function is better to use: file_get_contents, fread, or something else?
I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, then you can use file_put_contents() on a stream instead of looping through the file yourself.
$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle))
{
// handle error
}
fclose($handle);
It's cleaner than iterating through the file, and it might be faster (I haven't tested).
Don't use file_get_contents, because it would attempt to load all the content of the file into a string.
FROM PHP DOC
file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.
I'm sure you just want to create the movie file on your server; this is a more efficient way:
$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'w');
// Copy the request body to disk in 8 KB blocks, so the whole file never has to fit in memory.
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);
If you use nginx as your web server, I want to recommend the nginx upload module, which supports resuming uploads.
I'm trying to debug this issue by posting raw PNG image data to the server with the help of Postman. Here's a screenshot, which might help to understand the issue.
On the server I'm receiving the file as follows:
$png = $GLOBALS["HTTP_RAW_POST_DATA"];
Then I write the data to a new file:
$fh = fopen($myFile, 'w') or die("can't open file");
fwrite($fh, $png);
fclose($fh);
The file gets saved correctly, but it now has a different file size: 417KB instead of 279KB, which is the size of the original file.
Now of course, I can't do any image operation as none of the functions (such as getimagesize which returns bool(false)) recognizes the file as a valid image.
I have debugged this process to a point where the issue must be somewhere in the file operations, but I don't understand why the file just doesn't result in the very same file type and size as the original, when the only thing I am doing is using the same raw data.
UPDATE:
I've now compared the encodings of the original file with the uploaded one: the former is ISO-8859-1 and displays correctly, while the latter is UTF-8 and is about 138kB larger.
I've now managed to convert the file on the server to ISO-8859-1:
fwrite($fh, iconv("UTF-8", "ISO-8859-1", $png));
The resulting file now has the same output file size (279kB), but it is still not recognized as a PNG image; some information still seems to get lost.
UPDATE (1):
I've been able to examine the issue further and found out that the original file is exactly 4 bytes bigger than the generated file, so the resulting PNG seems to be corrupted.
UPDATE (2):
I'm now able to save the file and open it as a valid PNG. The following code seems to be saving the image correctly:
$input = fopen("php://input","r+");
$destination = fopen($myFile, 'w+');
stream_copy_to_stream($input, $destination);
fclose($input);
fclose($destination);
However, when trying to open the file with the imagecreatefrompng function, I get a 500 error. I'm now trying to figure out if it's a memory issue in PHP.
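One quick way to check that, sketched below, is to compare the configured memory limit against the peak usage after decoding; $myFile is the file saved above, and the rest is just illustrative diagnostics:
ini_set('display_errors', '1');
error_reporting(E_ALL);
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
// GD decodes the PNG into an uncompressed bitmap, so memory use scales with width * height.
$img = imagecreatefrompng($myFile);
echo 'peak usage: ' . memory_get_peak_usage(true) . " bytes\n";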
The problem might be the way you test your POST, by copying the "binary" data into a text field.
If you paste the same data into a text editor, you won't get a valid image file either when saving it with the .png extension.
Try building a simple form with a file field to test your upload.
I use nginx for uploads and haven't had a problem, but I use the standard PHP way of uploading files as per: http://www.php.net/manual/en/features.file-upload.post-method.php
I would suggest trying that.
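For reference, a minimal sketch of that standard multipart flow from the linked manual page; the userfile field name and the /var/www/uploads/ target directory are assumptions here:
$uploaddir  = '/var/www/uploads/';
$uploadfile = $uploaddir . basename($_FILES['userfile']['name']);
// move_uploaded_file() only works on files that arrived via a multipart POST.
if (move_uploaded_file($_FILES['userfile']['tmp_name'], $uploadfile)) {
    echo "The file was successfully uploaded.";
} else {
    echo "Upload failed.";
}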
Try using <?php $postdata = file_get_contents("php://input"); ?> to get the raw data. I sometimes use it to get data sent from an AJAX post in Cake.
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
Is there anything wrong with this? Please help.
I think it is because the entire file has not been completely uploaded yet when you try to read it, so a read will sometimes return zero bytes even though there is still data being uploaded.
Maybe you can try using the feof function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
If you are on Windows, you should add "b" to the mode parameter of fopen() (see the manual). BTW, it is a good idea to add that flag anyway, for code portability.
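For example, the two fopen() calls above would become (binary-safe modes, otherwise unchanged):
$putdata = fopen("php://input", "rb");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "wb");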