Is there any way to upload large files (more than 80 GB) through a web browser? Previously I have been uploading files (img, png, jpg) using plupload, but it doesn't seem to work for larger files. I would also like to know how to implement a web page where users can upload files the way Mega.co.nz or Drive.google.com do.
If it is impossible to do with web development tools, can anyone guide me on how to divide a file and upload it in segments?
Thanks.
You can use the JavaScript Blob object to slice large files into smaller chunks and transfer them to the server to be merged together. This has the added benefit of being able to pause/resume uploads and indicate progress.
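Here is a minimal sketch of that approach, assuming a hypothetical /upload-chunk endpoint on your server that stores each piece and merges the pieces by index once all of them have arrived:

```javascript
// Minimal sketch: slice a File (which is a Blob) into chunks and upload them
// one at a time. The /upload-chunk endpoint and its parameters are assumptions;
// the server is expected to merge the pieces using the index and total count.
async function uploadInChunks(file, chunkSize = 10 * 1024 * 1024) {
  const totalChunks = Math.ceil(file.size / chunkSize);

  for (let index = 0; index < totalChunks; index++) {
    const start = index * chunkSize;
    const chunk = file.slice(start, start + chunkSize); // Blob.slice()

    const form = new FormData();
    form.append('fileName', file.name);
    form.append('index', index);
    form.append('total', totalChunks);
    form.append('chunk', chunk);

    await fetch('/upload-chunk', { method: 'POST', body: form });

    // Progress indication: index + 1 of totalChunks chunks sent so far.
    console.log(`Uploaded chunk ${index + 1}/${totalChunks}`);
  }
}

// Usage: wire it to an <input type="file" id="file-input"> element.
document.querySelector('#file-input').addEventListener('change', (e) => {
  uploadInChunks(e.target.files[0]);
});
```

Pausing and resuming then comes down to remembering which index was last acknowledged and continuing the loop from there.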
If you don't fancy doing it yourself, there are existing solutions that use this approach. One example is HTML5 Uploader by Filkor.
If I were you I would use something like FTP to accomplish this. If you can use ASP.NET, good libraries for file transfer already exist.
Here is a post that shows an example of uploading a file: Upload file to ftp using c#
The catch is that you will need an FTP server. I suggest FileZilla: https://filezilla-project.org/
Amazon S3 has a very nice feature that allows large files to be uploaded in parts. This would be very useful to me if I were using S3, but I am not.
Here's my problem: I am going to have Android phones uploading reasonably large files (~50 MB each of binary data) on a semi-regular basis. Because these phones are using the mobile network to do this, and coverage is spotty in some of the places where they're being used, a strong signal cannot be guaranteed. Therefore, doing a simple PUT with 40 MB of data in the content body will not work very well. I need to split the data up somehow (probably into 10 MB chunks) and upload the chunks whenever the signal allows it. Once all of the chunks have been uploaded, they need to be merged into a single file.
I have a basic understanding of how the client needs to behave to support this from reading Amazon's S3 client APIs, but I have no idea what the server does to allow it. I'm willing to write the server in Python or PHP. Are there any libraries for either language that allow this sort of thing? I couldn't find anything after about an hour of searching.
Basically, I'm looking for anything that can help point me in the right direction. Information on this and what protocols and headers to use to make this as RESTful as possible would be fantastic. Thanks!
From the REST API documentation for multipart upload, it seems that Amazon expects the client to break the large file into multiple smaller parts and upload them individually. Prior to uploading you need to obtain an upload ID, and on every upload you include the upload ID and a part number for the portion of the file being uploaded.
The way to structure this would be to create a client that can split a huge file into multiple parts and upload them in parallel following the convention above.
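As a hedged illustration of the request sequence your own server would need to handle, here is a sketch that mimics Amazon's multipart convention (the base URL, endpoints, and JSON responses are assumptions; real S3 additionally requires signed requests, which is omitted here):

```javascript
// Sketch of an S3-style multipart upload against your own server.
// 1) initiate to get an upload id, 2) PUT each part with its part number,
// 3) ask the server to merge the parts in order.
async function multipartUpload(baseUrl, key, file, partSize = 10 * 1024 * 1024) {
  // 1. Initiate: the server creates a staging area and returns an upload id.
  const initRes = await fetch(`${baseUrl}/${key}?uploads`, { method: 'POST' });
  const { uploadId } = await initRes.json();

  // 2. Upload each part, tagged with its part number and the upload id.
  const partCount = Math.ceil(file.size / partSize);
  for (let partNumber = 1; partNumber <= partCount; partNumber++) {
    const start = (partNumber - 1) * partSize;
    const body = file.slice(start, start + partSize);
    await fetch(
      `${baseUrl}/${key}?partNumber=${partNumber}&uploadId=${uploadId}`,
      { method: 'PUT', body }
    );
  }

  // 3. Complete: the server concatenates the parts in part-number order.
  await fetch(`${baseUrl}/${key}?uploadId=${uploadId}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ parts: partCount }),
  });
}
```

On the server side, each part can simply be written to a temporary file named after the upload ID and part number, so the "complete" call only has to concatenate them and delete the staging files. Failed parts can be retried individually, which is exactly what you want on a flaky mobile connection.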
I need a flash uploader, to use it in my CMS project.
I need something like this, but with a greater maximum upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader which can upload large files independently of the ini settings.
If you want to get around the ini limit, one option would be to switch to an FTP uploader.
I used net2ftp once and it was easy enough to install; I haven't used it since (almost a year and a half ago), but I see from their page that the project is still updated and not dead, so you might give it a try.
You just download the package, place it in your webapp, customize it, and you're set.
You might want to create a dedicated FTP user with appropriate permissions, and not use the root one, of course.
You won't be able to post more data to the server than the php.ini limits (post_max_size / upload_max_filesize) allow.
As a workaround you can upload the data to Amazon S3 and sync it back via s3sync.
We have a setup with plupload in place for one of our clients and are able to upload up to 2 GB per file (that's a client-imposed restriction; I don't know about S3's limits).
Mind you that S3 costs some money.
We are developing a video file upload feature for a PHP website. We need to upload files of at least 100 MB. We are using some Flash upload tools which show progress bars.
When we try, even 10 MB files take a long time: the progress bar seems to finish very quickly, and then we have to wait a long time for the upload to complete. Are there any good progress bar plugins for large file uploads?
Also, can we use any file upload methods other than HTTP upload?
Is it possible to use FTP for video file uploads? I have seen a few samples, but nothing seems to be working.
There are several options. If you wish to go down the PHP route, take a look at http://www.sitepoint.com/upload-large-files-in-php/.
Otherwise, you can try the Java applet route; a good place to start would be http://jupload.sourceforge.net/
I think in this case I would use the Java applet.
FTP can be trickier for the user than a plain HTML upload form, but it would be the preferred way to upload in my opinion (robust protocol, resume support, etc.).
For a nice HTML upload form, you could have a look at plupload, which offers a wide variety of options for client-side runtimes. It supports graceful degradation, and where HTML5 file uploads are supported it can also chunk the upload, as sketched below.
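As a rough illustration, a plupload configuration with chunking enabled might look like this (option names follow plupload 2.x, so check the version you ship; the upload.php chunk-assembling handler and the element IDs are assumptions for this sketch):

```javascript
// Rough plupload setup with chunking: large files are split into 10 MB pieces
// and each piece is POSTed to upload.php, which must reassemble them.
var uploader = new plupload.Uploader({
  runtimes: 'html5,flash,html4',      // graceful degradation order
  browse_button: 'pick-files',        // id of the "browse" button element
  url: 'upload.php',                  // server script that receives each chunk
  chunk_size: '10mb',                 // split large files into 10 MB chunks
  max_retries: 3,                     // retry a failed chunk a few times
  filters: { max_file_size: '2gb' }
});

uploader.init();

uploader.bind('FilesAdded', function () {
  uploader.start();
});

// Per-file progress, which keeps moving while chunks are actually being sent,
// rather than jumping to 100% as soon as the request body has been buffered.
uploader.bind('UploadProgress', function (up, file) {
  document.getElementById('progress').textContent = file.percent + '%';
});
```

Because each request only carries one chunk, the per-request size stays well under typical upload_max_filesize/post_max_size limits, and the progress bar reflects real transfer progress.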
I am developing a website that involves uploading videos above 50 MB.
Which is a better (faster) way of uploading the files to the server:
uploading the video files via ftp
or
uploading the files via a form
Thanks
The best way would be with FTP.
FTP is much faster for larger file sizes. For files below 1 MB it won't matter as much.
P.S. If you are not the one uploading, then think about which is easier for your users. A form is easier, but FTP is still faster.
For user experience you should go with the form file upload; the speed of both depends on the internet connection and on the load on the server and client, and won't differ that much. It might be a bit much for your web server if it's handling a lot of users, but you can use nginx, for example, to make that less of a problem.
Edit: here is a comparison: http://daniel.haxx.se/docs/ftp-vs-http.html
I use Jupload.
It splits the files and uploads them via HTTP. It's also good because you don't need to worry about the file upload limits in the server config. Speed depends mostly on the client's connection for both HTTP and FTP; there are some differences, but they're not that big.
Why not offer both? (Seriously - I wrote an app about ten years ago that did this.) Look up "MOVEit DMZ" or research various FTP servers with web portal integration to see how it's being done today.
There's also a third way you should consider, which was touched on by the Jupload comment: a local control (Flash, Java, ActiveX, Firefox plug-in, etc.) that optimizes the upload experience. If people are uploading multiple large files to your site they may appreciate the speed/reliability boost.
I have a web application that accepts file uploads of up to 4 MB. The server side script is PHP and web server is NGINX. Many users have requested to increase this limit drastically to allow upload of video etc.
However there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file), but I cannot find any equivalent of request.getInputStream in PHP.
Increasing the client post limits or the php.ini upload and max_execution_time settings is not really a solution for really large files (~1 GB), because the browser may time out, and think of all those blobs stored in memory.
Is there any way to solve this problem using PHP on server side? I would appreciate your replies.
plupload is a JavaScript/PHP library; it's quite easy to use and allows chunking.
It uses HTML5 though.
Take a look at the tus protocol, which is an HTTP-based protocol for resumable file uploads, so you can carry on where you left off without re-uploading the whole file after an interruption. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, Uppy, together with a Go- or PHP-based server implementation.
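To give a sense of how little client code that takes, here is a hedged sketch using Uppy's tus plugin (package and option names follow Uppy's documentation, so verify against the version you install; the endpoint URL is a placeholder for your tusd or tus-php server):

```javascript
// Resumable upload with Uppy over the tus protocol.
import Uppy from '@uppy/core';
import Dashboard from '@uppy/dashboard';
import Tus from '@uppy/tus';

const uppy = new Uppy()
  .use(Dashboard, { inline: true, target: '#uploader' }) // drop-in UI with progress
  .use(Tus, {
    endpoint: 'https://uploads.example.com/files/', // placeholder: your tusd / tus-php server
    chunkSize: 10 * 1024 * 1024,                    // send the file in 10 MB pieces
    retryDelays: [0, 1000, 3000, 5000],             // retry/resume after interruptions
  });
```

The tus server keeps track of how many bytes it has received for each upload, which is what lets the client resume from the right offset after the browser is closed or the connection drops.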
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
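That works when the client sends the file bytes as the raw request body rather than wrapped in multipart form data, so the PHP script can stream them straight from php://input to disk. A small sketch of the client side (the /receive.php endpoint and the X-File-Name header are assumptions):

```javascript
// Send the file as the raw request body (no multipart encoding), so that
// fopen('php://input', 'rb') on the PHP side sees exactly the file's bytes.
async function uploadRaw(file) {
  await fetch('/receive.php', {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/octet-stream',
      'X-File-Name': encodeURIComponent(file.name),
    },
    body: file,
  });
}
```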
I have created a JavaFX client to send large files in chunks of the maximum post size (I am using 2 MB) and a PHP receiver script to assemble the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute.
Try using the bigupload script. It is very easy to integrate and can upload up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the upload and PHP for the processing?
You can find an example here for Jupload:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package; it supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume incomplete uploads.
You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and send it to your web service one chunk at a time; a sketch of the server half follows. However, it seems a lot easier to create an SFTP directory and let clients just SFTP files up using some pre-existing client app.
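A minimal sketch of the append-a-block idea (Node/Express is an assumption, since the answer above doesn't name a stack; the /chunk route, the name query parameter, and in-order delivery of chunks are simplifications for illustration):

```javascript
// Accept a block of data via POST and append it to a file on disk.
// Assumes an uploads/ directory exists next to this script and that the
// client sends chunks in order.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();

// Parse the raw request body (up to 10 MB per chunk) instead of JSON/form data.
app.post(
  '/chunk',
  express.raw({ type: 'application/octet-stream', limit: '10mb' }),
  (req, res) => {
    const name = path.basename(req.query.name || 'upload.bin'); // avoid path traversal
    fs.appendFileSync(path.join(__dirname, 'uploads', name), req.body);
    res.sendStatus(200);
  }
);

app.listen(3000);
```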