I would like to create an upload script that isn't constrained by the PHP upload limit.
There might be an occasion where I need to upload a 2GB or larger file, and I don't want to have to raise the limit for the whole server above 32MB.
Is there a way to write directly to disk from PHP?
What method would you propose to accomplish this? I have read around Stack Overflow but haven't quite found what I am looking for.
The simple answer is you can't, due to the way Apache handles POST data.
If you're adamant about having larger file uploads and still using PHP for the backend, you could write a simple file upload receiver using the PHP sockets API and run it as a standalone service. Some good details can be found at http://devzone.zend.com/article/1086#Heading8
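Below is a minimal sketch of what such a standalone receiver could look like, using PHP's stream socket functions. The port, target directory, and the "first line is the file name" framing are illustrative assumptions, not something taken from the linked article.

<?php
// Standalone upload receiver sketch (port, directory and framing are assumptions).
$server = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
if (!$server) {
    die("Could not start listener: $errstr ($errno)\n");
}

while ($conn = stream_socket_accept($server, -1)) {
    // Treat the first line sent by the client as the target file name.
    $name = basename(trim(fgets($conn)));
    $out  = fopen('/var/uploads/' . $name, 'wb');

    // Copy the rest of the stream to disk in 1 MB chunks so memory use stays flat.
    while (!feof($conn)) {
        fwrite($out, fread($conn, 1024 * 1024));
    }

    fclose($out);
    fclose($conn);
}

Because the data never passes through Apache or PHP's normal POST handling, post_max_size and upload_max_filesize don't apply here; only disk space and the script's own logic limit the file size.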
Though this is an old post, it comes up easily via Google when looking for a way to handle big file uploads with PHP.
I'm still not sure whether file uploads that exceed the memory limit are possible, but I think there is a good chance they are. While looking for a solution to this problem, I found contradicting sources. The PHP manual states:
post_max_size: Sets max size of post data allowed. This setting also affects file upload. To upload large files, this value must be larger than upload_max_filesize. If memory limit is enabled by your configure script, memory_limit also affects file uploading. Generally speaking, memory_limit should be larger than post_max_size. (http://php.net/manual/en/ini.core.php)
...which implies that your memory limit should be larger than the file you want to upload. However, another user (ragtime at alice-dsl dot com) at php.net states:
I don't believe the myth that 'memory_size' should be the size of the uploaded file. The files are definitely not kept in memory... instead uploaded chunks of 1MB each are stored under /var/tmp and later on rebuilt under /tmp before moving to the web/user space. I'm running a linux-box with only 64MB RAM; setting the memory_limit to 16MB and uploading files of about 100MB is no problem at all! (http://php.net/manual/en/features.file-upload.php)
He reports some other related problems with the garbage collector but also states how they can be solved. If that is true, an uploaded file may well exceed the memory limit. (Note, however, that actually processing the uploaded file is a different matter - for that you might have to load it into memory, or read it in chunks.)
I'm writing this before having tried handling large file uploads with PHP myself, since I'm evaluating whether to use PHP or Python for the task.
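To illustrate the low-memory claim, here is a small sketch of handling a regular upload without ever holding the whole file in memory. The field name "userfile" and the target path are assumptions for the example.

<?php
// Move the uploaded temp file into place; this is a rename/copy on disk,
// so the file contents never count against memory_limit.
if (is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    $target = '/var/uploads/' . basename($_FILES['userfile']['name']);
    move_uploaded_file($_FILES['userfile']['tmp_name'], $target);

    // If the file has to be processed, read it in chunks rather than with
    // file_get_contents(), which would pull the whole file into memory.
    $fh = fopen($target, 'rb');
    while (!feof($fh)) {
        $chunk = fread($fh, 1024 * 1024);   // 1 MB at a time
        // ... process $chunk ...
    }
    fclose($fh);
}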
You can do some interesting things with PHP's sockets. Have you considered writing a Java applet to upload the file to a listening PHP daemon? This probably won't work on most shared hosting providers, but if you're running your own server, you could make it work. Consider the following sequence:
Applet starts up, sends a request to PHP to open a listening socket
(You'll probably have to write a basic HTTP client in Java to make this work)
Java Applet reads the file from the file system and uploads it to PHP through the socket that was created in step 1.
It's not the cleanest way to do it, but if you disable the PHP script timeout in your php.ini file, you could make something work.
It isn't possible to upload a file larger than PHP's configured limits with PHP alone; it's that simple.
Possible workarounds include using a client-side technology - like Java; I'm not sure whether Flash or JavaScript can do this - to split the original file into smaller chunks.
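On the server side, the chunks then have to be stitched back together. A rough sketch of such an endpoint follows; the parameter names (fileName, chunkIndex, isLastChunk, chunk) are assumptions, and the client-side splitter is not shown.

<?php
// Receive one chunk per request and append it to a partial file.
$name  = basename($_POST['fileName']);
$index = (int) $_POST['chunkIndex'];
$part  = '/var/uploads/' . $name . '.part';

$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($part, $index === 0 ? 'wb' : 'ab');   // truncate on the first chunk, append afterwards
while (!feof($in)) {
    fwrite($out, fread($in, 1024 * 1024));
}
fclose($in);
fclose($out);

// When the client signals the last chunk, rename the finished file into place.
if (!empty($_POST['isLastChunk'])) {
    rename($part, '/var/uploads/' . $name);
}

Each individual request only has to fit within the normal PHP limits, which is the whole point of chunking.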
I'm trying to convert a website to use S3 storage instead of local (expensive) disk storage. I solved the download problem using a stream wrapper interface on the S3Client. The upload problem is harder.
It seems to me that when I post to a PHP endpoint, the $_FILES array is already populated and the upload copied to /tmp/ before I can even intercept it!
On top of that, S3Client->upload() expects a file that is already on disk!
It seems like a double whammy against what I'm trying to do, and most advice I've found uses Node.js or Java streaming, so I don't know how to translate it.
It would be better if I could intercept the code that populates $_FILES and then send up 5MB chunks from memory with the S3\ObjectUploader, but how do you crack open the PHP multipart handler?
Thoughts?
EDIT: It is a very low volume of files, 0-20 per day, mostly 1-5MB, sometimes hitting 40-70MB. Periodically (once every few weeks) a 1-2GB file will be uploaded. Hence the desire to move off an EC2 instance and onto a Heroku/Beanstalk-type PaaS where I won't have much /tmp/ space.
It's hard to comment on your specific situation without knowing the performance requirements of the application and the volume of users accessing it, so I'll answer assuming a basic web app uploading profile avatars.
There are good reasons for this behaviour: the file is streamed to disk for multiple purposes, one of which is to conserve memory. If your file is not on disk, then it is in memory (think disk usage is expensive? Bump up your memory usage and see how expensive that gets). That is fine for a single user uploading a small file, but not so great for many users uploading small files, or worse, large files. You'll likely see the best performance if you use the defaults of these libraries and let them stream to and from the disk.
But again, I don't know your use case, and you may actually need to avoid the disk at all costs for some reason.
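For reference, here is a sketch of what that looks like with the AWS SDK for PHP v3, streaming the already-written temp file to S3 instead of reading it into memory. The bucket name, key prefix, region, and the "file" field name are assumptions.

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\ObjectUploader;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// $_FILES['file']['tmp_name'] is the file PHP already streamed to /tmp.
// Passing an open stream lets ObjectUploader switch to a multipart upload
// for large bodies, sending one part at a time.
$source = fopen($_FILES['file']['tmp_name'], 'rb');

$uploader = new ObjectUploader(
    $s3,
    'my-bucket',
    'uploads/' . basename($_FILES['file']['name']),
    $source
);
$result = $uploader->upload();

echo $result['ObjectURL'] ?? 'uploaded';

The temp file on disk is still needed, but memory use stays at roughly one part size rather than the whole file, which is close to the behaviour you were hoping for with S3\ObjectUploader.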
I need your help. I want to create an upload script with HTML, jQuery and PHP.
Is it possible to write a script that can upload very large files (> 5 GB)?
I've tried it with FileReader, FormData and Blobs, but even with these, I can't upload large files (my browser crashes after selecting a large file).
PS: I want to write it myself. Don't post any finished scripts.
Regards
Yes. I wrote PHP to upload a file of exactly 5GB more than a year ago.
FileReader, FormData and Blobs will fail because they all require pre-processing and conversion in JavaScript before the upload starts.
But you can easily upload a large file with a plain XMLHttpRequest:
var xhr = new XMLHttpRequest();
var fileInput = document.forms[0]['fileinput'];
xhr.open('POST', '/upload.php');   // endpoint name is illustrative
xhr.send(fileInput.files[0]);      // send the File object as the raw request body
This was not a well-documented approach at the time, but at least Chrome and Firefox support it. Note, however, that it sends the file content as-is: not as multipart/form-data and not as an HTTP form post.
You will need to prepare your own HTTP headers to provide additional information:
var xhr = new XMLHttpRequest();
var fileInput = document.forms[0]['fileinput'];
var file = fileInput.files[0];
xhr.open('POST', '/upload.php');                                     // open before setting headers
xhr.setRequestHeader("X-File-Name", encodeURIComponent(file.name));
xhr.setRequestHeader("X-File-Size", file.size);
xhr.send(file);                                                      // raw file body, not multipart/form-data
PS: Well, actually it was not pure PHP - it was a mix of PHP and a Java servlet.
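On the PHP side, since the body is not multipart/form-data, $_FILES is never populated; a receiver has to read the raw request body instead. A minimal sketch (using the X-File-Name header set above; the target directory is an assumption, and server limits such as post_max_size may still get in the way depending on your configuration):

<?php
// Raw-body upload receiver sketch.
$name = basename(rawurldecode($_SERVER['HTTP_X_FILE_NAME'] ?? 'upload.bin'));

$in  = fopen('php://input', 'rb');          // the raw request body
$out = fopen('/var/uploads/' . $name, 'wb');

while (!feof($in)) {
    fwrite($out, fread($in, 1024 * 1024));  // copy in 1 MB chunks to keep memory flat
}

fclose($in);
fclose($out);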
The problem is that it's not really practical. For one thing, you have to restart the whole upload if the browser runs into a problem, and with a big file that is quite likely to happen.
Here is another solution with Ajax:
php uploading large files
AX-JQuery Uploader
Look into "chunking", possibly with a plugin like AX Ajax multi uploader, which should help with both client- and server-side file size limits.
Keep in mind that it is important to adjust the php.ini setting related to script timing, max_execution_time (the maximum execution time of each script, in seconds), to prevent your script from timing out, because uploading large files is, as you know, time-consuming. Also check max_input_time, the maximum amount of time each script may spend parsing request data. It's a good idea to limit this on production servers to eliminate unexpectedly long-running scripts, but in your case you may need to increase it.
Consider also changing the following settings: memory_limit, upload_max_filesize and post_max_size.
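For example, a php.ini fragment along these lines (the values are purely illustrative - size them for your own workload):

; illustrative values only
upload_max_filesize = 6G
post_max_size       = 6G
max_execution_time  = 3600
max_input_time      = 3600
memory_limit        = 256M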
I don't think web uploads were designed for 5GB+ files, or that the browser is going to transfer that amount of data happily. File system limitations can also be an issue. You should rethink the upload depending on the usage scenario. Is the web the only option? FTP, streaming, or remote dumping are probably better solutions that will not block your web server/web page while doing the transfer. HTTP is not the best protocol for this.
Remember that the browser, PHP and Apache all have limited memory. My antivirus warns me when Chrome uses more than 250 MB per page (which is not considered normal). PHP has a default of 128 MB of dedicated memory; now imagine 100 simultaneous Apache users each uploading a 5GB file. That is why they invented FTP.
Why do you think those limits exist in PHP, Apache and friends? Because unbounded uploads are an attack vector, a security issue and an easy way to block the server that can be exploited by... everybody.
We have files that are hosted on RapidShare which we would like to serve through our own website. Basically, when a user requests http://site.com/download.php?file=whatever.txt, the script should stream the file from RapidShare to the user.
The only thing I'm having trouble getting my head around is how to properly stream it. I'd like to use cURL, but I'm not sure if I can read the download from RapidShare in chunks and then echo them to the user. The best way I've thought of so far is to use a combination of fopen and fread, echoing each chunk of the file to the user, flushing, and repeating that process until the entire file is transferred.
I'm aware of the PHP readfile() function as well, but would that be the best option? Bear in mind that these files can be several GBs in size, and although we have servers with 16GB of RAM, I want to keep memory usage as low as possible.
Thank you for any advice.
HTTP has a header called "Range" which basically allows you to fetch any chunk of a file (assuming you already know the file size), but since PHP isn't multi-threaded, I don't see much benefit in using it here.
Afaik, if you don't want to consume all your RAM, the only way to go is a two-step approach.
First, stream the remote file using fopen()/fread() (or any PHP function that works on streams), splitting the read into small chunks (2048 bytes may be enough), and write/append the result to a temporary file; then "echo" it back to your user by reading that temporary file.
That way, even a 2 TB file basically consumes only a chunk's worth of memory, since only the current chunk and the file handles are held in memory.
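Here is a rough sketch of that two-step relay; the remote URL and file name are illustrative placeholders:

<?php
// Step 1: copy the remote file to a temporary file in small chunks.
$remote = fopen('https://rapidshare.example/whatever.txt', 'rb');
$tmp    = tmpfile();                      // anonymous temporary file handle

while (!feof($remote)) {
    fwrite($tmp, fread($remote, 8192));   // 8 KB chunks keep memory use tiny
}
fclose($remote);

// Step 2: send the temporary file to the client, again chunk by chunk.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="whatever.txt"');

rewind($tmp);
while (!feof($tmp)) {
    echo fread($tmp, 8192);
    flush();                              // push each chunk out to the client
}
fclose($tmp);

In practice you could also skip the temporary file and echo each chunk as it arrives, which halves the I/O at the cost of tying the client's download speed to the remote read.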
You may also write some kind of proxy manager to cache already-downloaded files and skip the remote read when a file is downloaded heavily (keeping it locally for a given time).
I am working on a website and I have a big problem when I try to upload files. I increased upload_max_filesize and post_max_size, but the code still accepts only 10M as the maximum. For any other folder PHP accepts 100M, but inside the site folder (the one I am working in) it doesn't pick up the change. I have checked for a local php.ini or .htaccess.
Note: I am running a Linux server.
For uploading bigger files I would suggest a dedicated uploader plug-in, like SWF or Java, for these reasons:
Security - you can easily encode the sent data (encoding a ByteArray in AS3.0 is very easy and can even be tokenized, so it is hard to intercept the stream)
Reliability - with plain HTTP requests it is hard to monitor the upload progress, so the user might choose to close the uploader (because they think it got stuck)
User friendly - again, progress bar
Not limited by the server - if you accept the data directly with custom PHP code, you won't need to configure annoying things like the maximum upload file size.
On the server side you will need either a socket listener, or an HTTP tunnel if sockets are unavailable.
You can use JumpLoader, which is a Java applet, and it is able to split large files into partitions, and upload them one by one. Then a PHP script rebuilds the original file from the uploaded partitions on the server.
Plupload can split large files into smaller chunks. See the documentation.
Do you run Apache with mod_security? Then check whether LimitRequestBody is in effect (mod_security also has its own SecRequestBodyLimit).
Here is a good tutorial about Settings for uploading files with PHP.
Thanks guys,
I found the problem. I don't know why, but the file /public_html/site/.htaccess was not visible to me.
I overwrote it, and that seems to be working.
Thanks a lot for your efforts.
I've been having trouble with a PHP and JavaScript upload script accepting large file uploads on Dreamhost. I realize that you are supposed to edit php.ini to change the post max size and the memory limit, but it isn't behaving as it should.
The only way I have ever had a large file upload succeed was switching to Dreamhost PS and setting the memory limit as high as the file (1GB), but there has to be a more cost-effective way - otherwise how would sites like YouTube survive? I get I/O errors if I do not have all this memory available.
Could anyone help? I've struggled with this for over a month.
You usually can't increase the maximum file size on a shared hosting webspace. Run phpinfo() to see what the exact size limit is. Anything beyond that is probably not going to work on that web space without an upgrade.
Don't confuse upload_max_filesize and memory_limit, though. memory_limit controls how much RAM one instance of your PHP script is allowed to use and has nothing to do with file uploads.
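If you want a quick, scriptable view of the limits that actually apply (rather than scanning the full phpinfo() output), a small check like this works:

<?php
// Print the upload-related limits in effect for this vhost/directory.
foreach (['upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time'] as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}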
Looks like the answer was to write a Perl script instead. Using Perl I don't see even a blip in the server's memory usage.
Editing php.ini is ultimately the solution. You have to change upload_max_filesize and maybe post_max_size, and you may also want to increase the script's max_execution_time. These settings will increase your uploading abilities. You also need to place the modified php.ini in the directory where you want the changes to apply.
Reference: PHP Core Variables
Try creating/copying php.ini into the directory, then visit info.php for details (info.php is just a blank PHP file containing: <?php phpinfo(); ?>).
Also, never leave an info.php file on your server; I change its extension on my sites.
I agree: with changes to php.ini you can choose just how crazy a file size you want to allow. With no limit on size, you are opening yourself up to a world of problems. I think even in Perl you should bail out if the number of bytes in a file exceeds a certain amount. Depending on how tech-savvy your users are, you may end up with more than you bargained for, and you wouldn't want to crash your whole web server because one user uploaded a 200GB file.