Upload very large files (> 5 GB) - PHP

I need your help. I want to create an upload script with HTML, jQuery and PHP.
Is it possible to write a script that can upload very large files (> 5 GB)?
I've tried it with FileReader, FormData and Blobs, but even with these I can't upload large files (my browser crashes after selecting a large file).
PS: I want to write it myself. Don't post any finished scripts.
Regards

Yes. I wrote PHP to upload a file of exactly 5 GB more than one year ago.
FileReader, FormData and Blobs will fail because they all require pre-processing and conversion in JavaScript before the file gets uploaded.
But you can easily upload a large file with a plain, simple XMLHttpRequest.
var xhr = new XMLHttpRequest();
xhr.open("POST", "/upload.php"); // open the request first; the URL here is just a placeholder
xhr.send(document.forms[0]['fileinput']); // non-standard: sends the raw file content
It is not a standard or documented way; however, some Chrome and Firefox versions do support it. Note that it sends the file content as-is - not multipart/form-data, not HTTP form data.
You will need to prepare your own HTTP headers to provide the additional information.
var xhr = new XMLHttpRequest(), fileInput = document.forms[0]['fileinput'];
xhr.open("POST", "/upload.php"); // open before setting headers; the URL is just a placeholder
// Custom headers carry the metadata that multipart/form-data would normally provide.
xhr.setRequestHeader("X-File-Name", encodeURIComponent(fileInput.files[0].name));
xhr.setRequestHeader("X-File-Size", fileInput.files[0].size);
xhr.send(fileInput);
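On the receiving side, a rough PHP sketch (the file-name handling and the target directory are only assumptions) could read the raw request body from php://input and stream it to disk:
<?php
// Sketch only: assumes the client sends the raw file bytes as the request body,
// plus the custom X-File-Name / X-File-Size headers shown above.
$name = isset($_SERVER['HTTP_X_FILE_NAME'])
    ? basename(rawurldecode($_SERVER['HTTP_X_FILE_NAME']))
    : 'upload.bin';
$in  = fopen('php://input', 'rb');    // raw request body, not $_FILES
$out = fopen('/tmp/' . $name, 'wb');  // hypothetical target directory
// Copy in small chunks so memory use stays flat even for multi-GB uploads.
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);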
PS: Well, actually it was not pure PHP. It was a mix of PHP and a Java Servlet.

The problem is that it's not really practical. First, you have to restart your upload whenever there is a browser problem - and that is likely to happen when you upload a big file.
Here is another solution with Ajax:
php uploading large files
AX-JQuery Uploader

Look into "chunking", possibly with a plugin like AX Ajax multi uploader, which should help with both client and server-side file size limits.

Keep in mind that it is important to adjust the php.ini variable related to script timing - the maximum execution time of each script, in seconds, max_execution_time = xxxx - in order to prevent your script from timing out, because uploading large files is, as you know, time-consuming. Also check the variable max_input_time = xxxx, which is the maximum amount of time each script may spend parsing request data. It's a good idea to limit this time on production servers in order to eliminate unexpectedly long-running scripts, but in your case you may need to increase it.
Consider also changing the following variables: memory_limit, upload_max_filesize and post_max_size.
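As an illustration, a hypothetical set of php.ini values for an endpoint that should accept uploads of around 5 GB might look like this (the numbers are only examples; tune them to your own needs):
; illustrative values only
max_execution_time  = 3600
max_input_time      = 3600
memory_limit        = 256M
upload_max_filesize = 6G
post_max_size       = 6G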

I don't think web uploads were designed for 5 GB+ files, or that the browser is going to transfer that much data happily. File system limitations are also an issue. You should rethink the file upload depending on the usage scenario. Is the web the only option? FTP, streaming or remote dumping are probably better solutions that will not block your web server/page while doing the transfer. HTTP is not the best protocol for this.
Remember that the browser, PHP and Apache all have limited memory. My antivirus warns me when Chrome uses more than 250 MB per page (which is not considered normal). PHP has 128 MB of dedicated memory by default, and imagine having 100 simultaneous Apache users uploading 5 GB files. That is why they invented FTP.
Why do you think those limits exist in PHP, Apache and so on? Because large uploads are an attack vector, a security issue and a way of blocking the server that can easily be exploited by... everybody.

Related

Handling Very Large Uploads [duplicate]

I want to allow uploads of very large files into our PHP application (hundreds of megs - 8 gigs). There are a couple of problems with this, however.
Browser:
HTML uploads have crappy feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all
Flash uploader puts entire file into memory before starting the upload
Server:
PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd like to not set this setting globally.
The server also requires some other variables to be there in the POST vars, such as a secret key. We'd like to be able to refuse the request right away, instead of after the entire file is uploaded.
Requirements:
HTTP is a must.
I'm flexible with client-side technology, as long as it works in a browser.
PHP is not a requirement, if there's some other technology that will work well on a linux environment, that's perfectly cool.
upload_max_filesize can be set on a per-directory basis; the same goes for post_max_size
e.g.:
<Directory /uploadpath/>
    php_value upload_max_filesize 10G
    php_value post_max_size 10G
</Directory>
Python Handler?
Use a Python POST handler instead of PHP. Generate a unique identifier from your PHP app that the client can put in the HTTP headers, and use mod_python to reject or accept the large upload before the entire POST body is transmitted.
I think
http://www.modpython.org/live/current/doc-html/dir-handlers-hph.html
It allows you to check headers and decline the rest of the POST input. I haven't tried it, but it might be the right path.
Looking at the source of mod_python, the buffering of the input via read() seems to allow bit-at-a-time evaluation of the HTTP input. Headers are first.
https://svn.apache.org/repos/asf/quetzalcoatl/mod_python/trunk/src/filterobject.c
It's old, I know, but maybe someone has this problem nowadays, too.
Now you can do this with only Javascript and, say, PHP. No Flash or Java required on client side.
demo: http://dnduploader.filkor.org/
The idea is to slice the files with JavaScript's Blob slice() method...
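On the PHP side, a minimal sketch of a chunk receiver (the field names and the temporary path are hypothetical) could simply append each slice to a partial file as it arrives:
<?php
// Sketch only: assumes each Blob slice is posted as a normal multipart upload
// in the field "chunk", together with "name" and "index" form fields,
// and that chunks arrive in order.
$name   = basename($_POST['name']);
$index  = (int) $_POST['index'];
$target = '/tmp/' . $name . '.part';  // hypothetical temporary location
file_put_contents($target, file_get_contents($_FILES['chunk']['tmp_name']), FILE_APPEND);
// After the last chunk, the .part file can be renamed to its final name.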
How about a Java applet? That's how we had to do it at a company I previously worked for. I know applets suck, especially in this day and age with all our options available, but they really are the most versatile solution to desktop-like problems encountered in web development. Just something to consider.
You can set post_max_size for scripts in just one directory. Place your upload script there, and allow only that script to handle large sizes. It's still possible for that script to be attacked with large/useless files, but it avoids setting the limit globally.
Use that with APC and you might be able to work out something good:
IBM Developer works article on APC
Tried all of this... this is by far the best I have used yet...
http://www.uploadify.com/
Take a look at jumploader.com
A good java-applet for uploading.
I've used it for uploading images and it works fine. I haven't tried it with files bigger than 10 MB, but it should work for really big files too.
Have you looked into using APC to check the progress and total file size? Here is a good blog post about it. It might help.
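As a rough sketch of how the APC approach tends to look (assuming apc.rfc1867 = 1 is set in php.ini and a hidden APC_UPLOAD_PROGRESS field is placed before the file input), a polling script might do something like this; the parameter name and the exact status fields are assumptions to verify against the APC docs:
<?php
// Sketch only: reads APC's RFC1867 upload-progress entry for a given key.
$key    = $_GET['progress_key'];        // same value as the hidden form field
$status = apc_fetch('upload_' . $key);  // default apc.rfc1867_prefix is "upload_"
if ($status !== false) {
    header('Content-Type: application/json');
    echo json_encode(array(
        'current' => $status['current'], // bytes received so far
        'total'   => $status['total'],   // total bytes expected
        'done'    => $status['done'],
    ));
}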
Maybe you could use WebDAV and JavaScript in the browser:
AJAX Big file upload, with progress, to WebDAV
http://www.webdavsystem.com/ajax/programming/upload_progress
A simple library
http://debris.demon.nl/projects/davclient.js/doc/README.html
You can then get the JS to redirect the user to a success page. Secret keys and what-not can be handled in a PHP prelude before handing off the JS Client->WebDAV
I would look into FTP, SSH or SCP; these allow you to upload a large file and still have access control over the file as well. This might take a little longer to implement, but it's probably the most secure way I can think of.
I know it sucks to add another dependency, but in my experience most websites that are doing something like this are using Flash on the client side and uploading the large file as chunks.
Adobe has a how-to on Flash file uploads.
I also found this tutorial on codeproject:
Multiple File Upload With Progress Bar Using Flash and ASP.NET
PS - I know you're using PHP and not .NET; I figured the important part was the Flash ;)
I've had success with uploadify, and I would recommend it. It's a jQuery/Flash script that handles large uploads, and you can pass extra parameters to it (like the secret key). To solve the server-side issues, simply use the following code. The changes take effect just for the script they're called in:
//Check to see if the key is there
if (!isset($_POST['secret_key']) || !isValid($_POST['secret_key']))
{
    exit("Invalid request");
}

function isValid($key)
{
    //Put your validation code here.
}

//This line changes the timeout.
//Give it a value in seconds (3600 = 1 hour)
set_time_limit(3600);

//Set these amounts to whatever you need.
//Note: post_max_size and upload_max_filesize are per-directory (PHP_INI_PERDIR)
//settings, so ini_set() may silently have no effect on them; in that case set
//them in php.ini or .htaccess instead.
ini_set("post_max_size", "8192M");
ini_set("upload_max_filesize", "8192M");

//Generally speaking, the memory_limit should be higher
//than your post size. So make sure that's right too.
ini_set("memory_limit", "8200M");
EDIT In response to your comment:
Given what you've said, I'm afraid you may not be able to meet your requirements over HTTP. All of the solutions out there are code that adds features to HTTP that it was never designed for.
Like you said yourself, it's a simple protocol. Apart from writing your own client software that runs outside of the browser, a Java applet, or using a different protocol (like FTP, which was designed for this), you might not get what you want.
I've done the best I could within the given constraints. Sorry I couldn't do better.
Try this: http://www.simple2ftp.com uses a Java based FTP applet from within a clever PHP application wrapper.

php upload_max_filesize problem

I'm working on a website and I have a big problem when I try to upload files. I increased upload_max_filesize and post_max_size, but the code still only accepts a maximum of 10M. For any other folder PHP accepts 100M, but inside the site folder (the one I'm working in) it doesn't pick up the change. I checked for a local php.ini or .htaccess.
Note: I'm running a Linux server.
For uploading bigger files I would suggest a dedicated uploader plug-in, like an SWF or Java applet, for these reasons:
Security - you can easily encode the sent data (encoding a ByteArray in AS3.0 is very easy, and it can even be tokenized so it is hard to intercept the stream)
Reliability - with simple HTTP requests it is hard to actually monitor the upload progress, so the user might choose to close the uploader (because they think it got stuck)
User friendly - again, progress bar
Not limited by server - if you accept it directly with PHP custom code, you won't need any configuring for annoying things like max file size on upload.
On the server side you will need either a socket listener, or an HTTP tunnel if that is unavailable.
You can use JumpLoader, which is a Java applet, and it is able to split large files into partitions, and upload them one by one. Then a PHP script rebuilds the original file from the uploaded partitions on the server.
Plupload can split large files into smaller chunks. See the documentation.
Do you run Apache with mod_security? Then check whether LimitRequestBody is in effect.
Here is a good tutorial about Settings for uploading files with PHP.
Thanks guys,
I found the problem. I don't know why, but the file /public_html/site/.htaccess was not visible to me.
I tried overwriting it, and it seems to be working now.
Thanks a lot for your efforts.

Is uploading very large files (e.g. 500 MB) via PHP advisable?

I created a simple web interface to allow various users to upload files. I set the upload limit to 100 MB, but now it turns out that the client occasionally wants to upload files of 500 MB+.
I know how to alter the PHP configuration to change the upload limit, but I was wondering if there are any serious disadvantages to uploading files of this size via PHP.
Obviously FTP would be preferable, but if possible I'd rather not have two different methods of uploading files.
Thanks
Firstly FTP is never preferable. To anything.
I assume you mean that you are transferring the files via HTTP. While not quite as bad as FTP, it's not a good idea if you can find another way of solving the problem. HTTP (and hence its component programs) is optimized around transferring relatively small files around the internet.
While the protocol supports server-to-client range requests, it does not allow for the reverse operation. Even if the software at either end were unaffected by the volume, the more data you push across, the greater the interval during which you could lose the connection. And the biggest problem is the caveat in that last sentence.
Regardless of the server technology you use (PHP or something else), it's never a good idea to push that big a file in one sweep in synchronous mode.
There are lots of plugins for any technology/framework that will do asynchronous upload for you.
Besides the connection timing out, there is one more disadvantage in that file uploading consumes the web server's memory. You don't normally want that.
PHP will handle as many and as large a file as you'll allow it. But consider that it's basically impossible to resume an aborted upload in PHP, as scripts are not fired up until AFTER the upload is completed. The larger the file gets, the larger the chance of a network glitch killing the upload and wasting a good chunk of time and bandwidth. As well, without extra work with APC, or using something like uploadify, there's no progress report and users are left staring at a browser showing no visible signs of actual work except the throbber chugging away.

Writing direct to disk with php

I would like to create an upload script that doesn't fall under the PHP upload limit.
There might be an occasion where I need to upload a 2 GB or larger file, and I don't want to have to change the whole server's configuration to allow more than 32 MB.
Is there a way to write directly to disk from PHP?
What method might you propose someone would use to accomplish this? I have read around stack overflow but haven't quite found what I am looking to do.
The simple answer is you can't, due to the way that Apache handles POST data.
If you're adamant about having larger file uploads and still using PHP for the backend, you could write a simple file upload receiver using the PHP sockets API and run it as a standalone service. Some good details can be found at http://devzone.zend.com/article/1086#Heading8
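A very rough sketch of such a standalone receiver is below; the port, the target path and the complete lack of framing or authentication are all assumptions, so treat it as an outline rather than a finished service:
<?php
// Sketch only: accepts a TCP connection and streams whatever bytes arrive to disk.
$server = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
if (!$server) {
    die("Could not start listener: $errstr ($errno)\n");
}
while ($conn = stream_socket_accept($server, -1)) {
    $out = fopen('/tmp/upload_' . uniqid() . '.bin', 'wb');
    // 8 KB at a time keeps memory use constant regardless of file size.
    while (!feof($conn)) {
        fwrite($out, fread($conn, 8192));
    }
    fclose($out);
    fclose($conn);
}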
Though this is an old post, you can find it easily via Google when looking for a solution to handle big file uploads with PHP.
I'm still not sure whether file uploads that exceed the memory limit are possible, but I think there is a good chance that they are. While looking for a solution to this problem, I found contradicting sources. The PHP manual states:
post_max_size: Sets max size of post data allowed. This setting also affects file upload. To upload large files, this value must be larger than upload_max_filesize. If memory limit is enabled by your configure script, memory_limit also affects file uploading. Generally speaking, memory_limit should be larger than post_max_size. (http://php.net/manual/en/ini.core.php)
...which implies that your memory limit should be larger than the file you want to upload. However, another user (ragtime at alice-dsl dot com) at php.net states:
I don't believe the myth that 'memory_size' should be the size of the uploaded file. The files are definitely not kept in memory... instead uploaded chunks of 1MB each are stored under /var/tmp and later on rebuilt under /tmp before moving to the web/user space. I'm running a linux-box with only 64MB RAM, setting the memory_limit to 16MB, and uploading files of sizes about 100MB is no problem at all! (http://php.net/manual/en/features.file-upload.php)
He reports some other related problems with the garbage collector but also states how they can be solved. If that is true, the uploaded file size may well exceed the memory limit. (Note, however, that processing the uploaded file is another matter - for that you might have to load it into memory.)
I'm writing this before having tried handling large file uploads with PHP myself, since I'm evaluating whether to use PHP or Python for this task.
You can do some interesting things based around PHP's sockets. Have you considered writing an applet in Java to upload the file to a listening PHP daemon? This probably won't work on most professional hosting providers, but if you're running your own server, you could make it work. Consider the following sequence:
Applet starts up, sends a request to PHP to open a listening socket
(You'll probably have to write a basic web browser in Java to make this work)
Java Applet reads the file from the file system and uploads it to PHP through the socket that was created in step 1.
Not the cleanest way to do it, but if you disable the PHP script timeout in your php.ini file, then you could make something work.
It isn't possible to upload a file larger than PHP allowed limits with PHP, it's that simple.
Possible workarounds include using a client-side technology - like Java; I'm not sure whether Flash and JavaScript can do this - to "split" the original file into smaller chunks.

Upload 1GB files using chunking in PHP

I have a web application that accepts file uploads of up to 4 MB. The server side script is PHP and web server is NGINX. Many users have requested to increase this limit drastically to allow upload of video etc.
However, there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file), but I cannot find any equivalent of request.getInputStream in PHP.
Increasing the browser client_post limits or the php.ini upload and max_execution_time settings is not really a solution for really large files (~1 GB), because the browser may time out - and think of all those blobs stored in memory.
Is there any way to solve this problem using PHP on server side? I would appreciate your replies.
Plupload is a JavaScript/PHP library; it's quite easy to use and allows chunking.
It uses HTML5 though.
Take a look at the tus protocol, which is an HTTP-based protocol for resumable file uploads, so you can carry on where you left off without re-uploading the whole file again in case of any interruptions. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, called Uppy, together with a Go or PHP based server implementation.
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
I have created a JavaFX client to send large files in chunks of the max post size (I am using 2 MB) and a PHP receiver script to assemble the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute.
Try using the bigupload script. It is very easy to integrate and can upload up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the uploading and PHP for the processing?
You can find an example here for Jupload:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package; it supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume uncompleted uploads.
You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and then send it to your web service one chunk at a time. However, it seems a lot easier to create an SFTP dir and let clients just SFTP files up using some pre-existing client app.
