I am using PHP to upload files. On the server side, the action that gets called after the upload completes is simply:
<?php
print_r($_FILES);
?>
Which outputs something like:
Array
(
    [file] => Array
        (
            [name] => space_needlenever_lonely_alone.mp3
            [type] => audio/mp3
            [tmp_name] => /srv/www/uploads/php1TCIY9
            [error] => 0
            [size] => 3768714
        )

)
The problem is I don't want to store the files on the server's disk (/srv/www/uploads/php1TCIY9) at all. I actually want to store them in MongoDB using GridFS. Is there a way to stream the upload directly into PHP without first writing the file to disk on the server? Storing files on disk brings all kinds of headaches, such as filesystem permissions and the php.ini upload settings.
A WebSocket connection would be awesome, if I could stream the binary data directly into the server-side page without writing to disk.
Or, is it possible to interact with MongoDB directly from client-side JavaScript and skip PHP entirely?
Possible, or a pipe dream?
Your PHP application is not even executed until the file upload is done. Because of this, I believe this is impossible with PHP.
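That said, you can get close: as soon as your script runs, stream the temporary file straight into GridFS, so nothing lands anywhere permanent besides PHP's own tmp file. A minimal sketch, assuming the mongodb/mongodb Composer library (the database name "uploads" is a placeholder):
<?php
require 'vendor/autoload.php'; // mongodb/mongodb via Composer

// Select a GridFS bucket on the (hypothetical) "uploads" database
$bucket = (new MongoDB\Client)->selectDatabase('uploads')->selectGridFSBucket();

// Stream the uploaded temp file into GridFS without loading it into memory
$stream = fopen($_FILES['file']['tmp_name'], 'rb');
$bucket->uploadFromStream($_FILES['file']['name'], $stream);
fclose($stream);
?>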
Related
I’m encountering a problem with file upload in PHP. I’m using Dropzone.js which calls a PHP script.
The PHP environment runs in Docker, and the PHP version is 7.2.28.
When I upload an image with Firefox 72 on Mac OSX, I get this in $_FILES :
Array
(
    [file] => Array
        (
            [name] => image.png
            [type] => 
            [tmp_name] => 
            [error] => 3
            [size] => 0
        )

)
According to the documentation, error 3 means UPLOAD_ERR_PARTIAL: the file was only partially uploaded.
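For reference, here is how I'm turning that code into something readable while debugging (the message strings are my own shorthand, not from the manual):
$messages = array(
    UPLOAD_ERR_OK         => 'No error',
    UPLOAD_ERR_INI_SIZE   => 'File exceeds upload_max_filesize',
    UPLOAD_ERR_FORM_SIZE  => 'File exceeds MAX_FILE_SIZE from the form',
    UPLOAD_ERR_PARTIAL    => 'File was only partially uploaded',
    UPLOAD_ERR_NO_FILE    => 'No file was uploaded',
    UPLOAD_ERR_NO_TMP_DIR => 'Missing temporary folder',
    UPLOAD_ERR_CANT_WRITE => 'Failed to write file to disk',
    UPLOAD_ERR_EXTENSION  => 'A PHP extension stopped the upload',
);

$code = $_FILES['file']['error'];
error_log($messages[$code] ?? 'Unknown error code ' . $code);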
The problem happens only with Firefox on Mac OS X, with PNG images of this specific size (158 KB).
Other file sizes (even tiny or big files), other browsers, other file types, and other operating systems work fine.
The Docker image runs on 3 different servers, and the problem happens on each installation.
I tried some solutions that I read about on the internet, but none of them worked:
php_sapi_name() returns apache2handler
I tried adding an 'Accept-Ranges: none' header in my PHP file.
Do you have a clue on what might happen?
Thanks in advance.
I am using the SabreDAV PHP library to connect to a WebDAV server and download some files, but it is taking forever to download a 1MB file, and I have to download files of up to 1GB from that server. I looked at this link http://code.google.com/p/sabredav/wiki/WorkingWithLargeFiles but it is not helpful: it says that I will get a stream when I do a GET, but that is not the case.
Here is my code:
$settings = array(
    'baseUri'  => 'file url',
    'userName' => 'user',
    'password' => 'pwd'
);

$client = new \Sabre\DAV\Client($settings);
$response = $client->request('GET');
$response is an array with a 'body' key that contains the entire content of the file. What am I doing wrong? I only need the file read-only. How can I read through the file line by line as quickly as possible?
Thanks in advance.
If it's taking that long just to download a 1MB file, then I think it's not a SabreDAV problem but a problem with your server, your network, or perhaps the remote server.
The Google Code link you mentioned just describes an approach for transferring very large files; for that you would have to use the stream/fopen method they describe. But I think I was able to transfer 1GB files the normal way, without it, when I last used SabreDAV with ownCloud.
If you have a VPS/dedicated server, open SSH and use the wget command to test the speed and the time it takes to download that remote file from WebDAV. If it's the same as with SabreDAV, then it's a server/network problem and not SabreDAV; otherwise, it's a problem with Sabre or your code.
Sorry, but I do not have any code to post to help you, since the problem itself is not clear and there could be more than ten things causing it.
PS: You will also need to raise the related PHP limits: execution time, max file upload size, and max POST size.
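If it does turn out to be the in-memory buffering (the Client returns the whole response body in that array), one workaround is to bypass SabreDAV for the download itself and stream the file to disk with plain cURL. This is a sketch of that idea, not SabreDAV's API; the URL and paths are placeholders, and the credentials mirror the $settings above:
// Stream a WebDAV file to a local file instead of buffering it in memory
$url = 'https://webdav.example.com/path/to/big.bin'; // placeholder

$out = fopen('/tmp/big.bin', 'w');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_USERPWD, 'user:pwd');   // basic auth, as in the SabreDAV settings
curl_setopt($ch, CURLOPT_FILE, $out);            // write the response body straight to the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($out);

// Now the file can be read line by line without loading it all at once
$fh = fopen('/tmp/big.bin', 'r');
while (($line = fgets($fh)) !== false) {
    // process $line
}
fclose($fh);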
I have a PHP web app that I deploy to multiple servers. It works on my local installation as well as on every other server I put it on.
But on the current server I'm deploying to, I'm having trouble with all of the file upload features.
When trying to debug it, I realized the uploads are failing when the code tries to get the MIME type via finfo(FILEINFO_MIME_TYPE).
Example: a form has an upload field called "file_import". I put the following line of code into the shared import function.
die(print_r($_FILES['file_import']));
On my local installation, it returns:
Array
(
    [name] => US Capital Division.csv
    [type] => application/vnd.ms-excel
    [tmp_name] => C:\wamp\tmp\php3B7C.tmp
    [error] => 0
    [size] => 1268
)
1
Uploading the same file on the current deployment, it returns:
Array
(
    [name] => US Capital Division.csv
    [type] => application/vnd.ms-excel
    [tmp_name] => /tmp/phpJdRZDW
    [error] => 0
    [size] => 1268
)
1
They both say they have no errors, but the deployment server seems to give only a folder name for the tmp_name value, while the local installation gives the temp folder AND a file name. So when my code tries to use $_FILES['file_import']['tmp_name'], it errors.
I feel like it has to be a server setting or folder setting, but I can't find anything on the web about what would cause that.
Thanks.
Actually, phpJdRZDW in [tmp_name] => /tmp/phpJdRZDW is not a folder. It's just a temporary file name (with no extension) while the file sits inside the /tmp/ folder, waiting to be moved and (usually) renamed.
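That also means finfo should work against that path directly. A minimal sketch, using the field name from your question:
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['file_import']['tmp_name']);
// e.g. "text/csv" or "text/plain" for a CSV, depending on the magic database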
I worked with some MP3 files and needed to show the duration of an uploaded MP3 file. I used the following:
http://www.zedwood.com/article/127/php-calculate-duration-of-mp3
as suggested in this question:
PHP Function to get MP3 duration.
But now I'm facing a problem: some files are not returning any file information.
The array just contains the following:
Array
(
    [Filesize] => 16756370
    [Encoding] => Unknown
)
As the "Encoding" is Unknown its not returning any data.
I found a great PHP library, getID3(), that works for VBR files as well.
You can find it right here; it's free and is actively developed (the latest version is from February 2013):
http://sourceforge.net/projects/getid3/files/getID3%28%29%201.x/1.9.5/
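A minimal sketch of reading the duration with getID3 (the file path is a placeholder; playtime_seconds and playtime_string are the keys it reports on a successful analysis):
require_once 'getid3/getid3.php';

$getID3 = new getID3;
$info = $getID3->analyze('/path/to/file.mp3'); // placeholder path

if (isset($info['playtime_seconds'])) {
    echo $info['playtime_string']; // e.g. "4:03"
} else {
    // getID3 reports parse problems in $info['error'], if any
    echo 'Could not determine duration';
}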
How can I execute a PHP file stored in an Amazon S3 bucket while connecting to the bucket with the S3 SDK?
I tried $s3->getBucket and got the following array, but what I need is the result of executing this file.
Array
(
    [content/content.php] => Array
        (
            [name] => content/content.php
            [time] => 1353105202
            [size] => 1223
            [hash] => 355c51389b51814d4b6c822a6ec28cfe
        )

)
Is there any function/method to execute a PHP file?
You can't execute PHP files directly on Amazon S3.
What you would have to do is download its contents, write them to a local file, and then run that file (using include() or require()).
If the file outputs stuff directly, you can wrap it in ob_start() and ob_get_clean() to get any output.
$source = $s3->getObject($bucket, $object);

// tempnam() needs both a directory and a filename prefix
$filename = tempnam(sys_get_temp_dir(), 's3');

$file = fopen($filename, 'w');
fwrite($file, $source->body); // with the standalone S3 class, the file contents are in the body property
fclose($file);

ob_start();
require_once $filename;
$result = ob_get_clean();

echo $result; // the result of the executed PHP.
You could potentially use something like s3fs to mount the S3 bucket in your local filesystem and then just include the file as you would any other file.
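For example, if the bucket were mounted at /mnt/s3 (a hypothetical mount point):
require '/mnt/s3/content/content.php';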
Depending on the allow_url_fopen and allow_url_include settings in php.ini, this could be implemented using a custom stream wrapper. We use one such wrapper in a project of ours; I'm not authorized to share the code since it's proprietary, but it is based on http://docs.aws.amazon.com/aws-sdk-php/v2/guide/feature-s3-stream-wrapper.html
The Amazon S3 stream wrapper allows you to store and retrieve data from Amazon S3 using built-in PHP functions like file_get_contents, fopen, copy, rename, unlink, mkdir, rmdir, etc.
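With SDK v2, registering the wrapper looks roughly like this (the bucket name and credentials are placeholders):
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'YOUR_KEY',
    'secret' => 'YOUR_SECRET',
));

// Makes s3:// paths usable with PHP's built-in filesystem functions
$client->registerStreamWrapper();

$code = file_get_contents('s3://my-bucket/content/content.php');

// With allow_url_include enabled (generally discouraged), you could even:
// include 's3://my-bucket/content/content.php';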