I am debating whether to use FTP or HTTP for large file uploads and downloads. The uploads will consist of text and audio files, ranging from a couple of KB to 200 MB. I have a couple of questions, such as:
Which would be faster, HTTP or FTP?
Which would be more reliable?
Which would be of greater ease and convenience for the end user?
What other alternatives are there for larger file uploads?
Note: I need to somehow keep track of which files each user uploads.
Thanks a bunch!
In my opinion, file transfers that are part of website navigation should use the same protocol: switching protocols may require additional work on your server, and it opens a new connection.
Moreover, HTTP supports the POST and PUT methods, which are designed for exactly that purpose.
If you simply want to upload files, then FTP is the dedicated protocol, but it may not be implemented in all web browsers.
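To illustrate the HTTP route, here is a minimal sketch of a POST upload handler in PHP that also records which user uploaded which file. The form field name, session key, upload directory and log file are illustrative, not something from the question.

```php
<?php
// upload.php - minimal sketch of an HTTP POST upload handler.
// Assumes a form field named "file", a logged-in user id in the session,
// and an existing uploads/ directory; all of these are placeholders.
session_start();

if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed');
}

$userId = $_SESSION['user_id'] ?? 'anonymous';
$name   = basename($_FILES['file']['name']);
$target = __DIR__ . '/uploads/' . uniqid('', true) . '_' . $name;

if (!move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
    http_response_code(500);
    exit('Could not store file');
}

// Keep track of which user uploaded which file (simple append-only log).
file_put_contents(
    __DIR__ . '/uploads.log',
    sprintf("%s\t%s\t%s\n", date('c'), $userId, $target),
    FILE_APPEND
);

echo 'OK';
```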
Let's say I have a file common.php used by many pages on my website. Now I want to update the file via FTP, so there will be around 1-2 seconds where the file is not available or only partially uploaded.
During that time, require('common.php') reports an error, so the website does not load properly.
How do I solve cases like this?
Thanks!
You can upload the file with a different name and rename it only after the upload completes. That minimizes the downtime.
Some clients can even do this automatically, which further minimizes the downtime.
For example, the WinSCP SFTP/FTP client supports this, but only with the SFTP protocol, if that's an option for you.
In WinSCP preferences, enable Transfer to temporary filename for All files.
WinSCP will then upload all files with a temporary .filepart extension, overwriting the target file only after the upload finishes.
(I'm the author of WinSCP)
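The same idea also works from a script. Here is a minimal sketch using PHP's built-in FTP functions; the host, credentials and file names are placeholders, and error handling is omitted for brevity.

```php
<?php
// Upload under a temporary name, then rename, so the live file is never half-written.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

// Transfer the new version to a temporary remote name.
ftp_put($conn, 'common.php.filepart', 'common.php', FTP_BINARY);

// Swap the new file into place only after the transfer has finished
// (on most servers the rename replaces the old file in one step).
ftp_rename($conn, 'common.php.filepart', 'common.php');

ftp_close($conn);
```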
I have a standard HTML form, including a file input, allowing users of a web application to upload files (pictures, documents or videos).
It technically works, except when it comes to large files...
I have a personal dedicated server on which I can change the PHP configuration to handle larger files;
But this is not a reliable solution, as my client is on shared hosting.
I know HTTP has different limits and is definitely not the best protocol to handle files.
So my objective is to avoid HTTP upload.
I was wondering if there is any way to rely on FTP to upload the selected file.
Is there any solution to upload files through FTP, directly from the client's browser?
EDIT:
I've read some solutions involving Java applets, however this is not really something I can, or even want to, provide.
And as the files are confidential, using a third-party service is not possible either.
I am developing a website that involves uploading videos above 50 MB.
Which is a better (faster) way of uploading the files to the server:
uploading the video files via ftp
or
uploading the files via a form
Thanks
The best way would be with FTP.
FTP is much faster for larger files; for files below 1 MB the difference won't matter as much.
P.S. If you are not the one uploading, then think about which is easier for your users. A form is easier, but FTP is still faster.
For user experience you should go with the form file upload. The speed of both depends on the internet connection and the load on the server and client, and won't differ that much. It might be a bit much for your web server if it's handling a lot of users, but you can use nginx, for example, to make that less of a problem.
edit:
Here's a comparison: http://daniel.haxx.se/docs/ftp-vs-http.html
I use Jupload.
It splits the files and uploads them via HTTP. It's also good because you don't need to care about file upload limitations in the server config. Speed depends mostly on the client's connection, for both HTTP and FTP. Of course there are some differences between them, but they are not that big.
Why not offer both? (Seriously - I wrote an app about ten years ago that did this.) Look up "MOVEit DMZ" or research various FTP servers with web portal integration to see how it's being done today.
There's also a third way you should consider, touched on by the Jupload comment: a local control (Flash, Java, ActiveX, Firefox plug-in, etc.) that optimizes the upload experience. If people are uploading multiple large files to your site, they may appreciate the speed/reliability boost.
I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that this limit be increased drastically to allow the upload of video etc.
However there seems to be no easy solution to this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file) but I cannot find any equivalent of request.getInputStream in PHP.
Increasing the browser's client_post limits, php.ini upload limits or max_execution_time is not really a solution for really large files (~1 GB), because the browser may time out, and think of all those blobs stored in memory.
Is there any way to solve this problem using PHP on the server side? I would appreciate your replies.
plupload is a JavaScript/PHP library; it's quite easy to use and allows chunking.
It uses HTML5, though.
Take a look at the tus protocol, an HTTP-based protocol for resumable file uploads, so you can carry on where you left off without re-uploading the whole file after an interruption. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, uppy, together with a Go- or PHP-based server implementation on the server.
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
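To sketch what that looks like, assuming the client posts each chunk as a raw request body (not a multipart form), the stream can be copied straight to disk without buffering it all in memory; the target path is just an example.

```php
<?php
// receive.php - rough equivalent of request.getInputStream in PHP:
// stream the raw POST body to disk in small reads.
$in  = fopen('php://input', 'rb');
$out = fopen('/tmp/upload.bin', 'ab');   // append so repeated chunks accumulate

while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);
```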
I have created a JavaFX client that sends large files in chunks of the maximum post size (I am using 2 MB) and a PHP receiver script that assembles the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute it.
Try the bigupload script. It is very easy to integrate and can upload files of up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the upload and PHP for the processing?
You can find an example for Jupload here:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package.
It supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume incomplete uploads.
You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and send it to your web service one chunk at a time. However, it seems a lot easier to create an SFTP directory and let clients just SFTP files up using some pre-existing client app.
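As a rough sketch of such a web service in PHP: the following accepts one raw chunk per POST and writes it at the position the client reports. The name and offset query parameters are made up for this example, and the client would be responsible for splitting the file and reporting each chunk's byte offset.

```php
<?php
// chunk.php - accept a raw chunk via POST and write it into the target file.
// Assumes an existing uploads/ directory; error handling omitted for brevity.
$name   = basename($_GET['name'] ?? 'upload.bin');
$offset = (int) ($_GET['offset'] ?? 0);

$out = fopen(__DIR__ . '/uploads/' . $name, 'cb'); // create if missing, never truncate
fseek($out, $offset);                              // position at the chunk's offset
stream_copy_to_stream(fopen('php://input', 'rb'), $out);
fclose($out);

echo 'chunk stored';
```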
I am building a large website where members will be allowed to upload content (images, videos) up to 20 MB in size (maybe a little less, like 15 MB; we haven't settled on a final upload limit yet, but it will be somewhere between 10 and 25 MB).
My question is, should I go with HTTP or FTP upload in this case? Bear in mind that 80-90% of uploads will be smaller, around 1-3 MB, but from time to time some members will also want to upload large files (10 MB+).
Is HTTP uploading reliable enough for such large files or should I go with FTP? Is there a noticeable speed difference between HTTP and FTP while uploading files?
I am asking because I'm using Zend Framework which already has HTTP adapter for file uploads, in case I choose FTP I would have to write my own adapter for it.
Thanks!
HTTP definitely puts less of a burden on your clients. A lot of places have proxies or firewalls that block all FTP traffic (in or out).
The big advantage of HTTP is that it goes through firewalls and is very easy to encrypt: just use HTTPS on port 443 instead of HTTP on port 80. Both go through proxies and firewalls. And these days it's pretty easy to upload a 20 MB file over HTTP/HTTPS using a POST.
The problem with HTTP is that uploads are not restartable. If you get 80% of the file sent and then there is a failure, you will need to restart at the beginning. That's why vendors are increasingly using Flash-based, Java-based or JavaScript-based uploaders and downloaders. These systems can see how much of the file has been sent, send a MAC to make sure that it has arrived properly, and resend the parts that are missing.
A MAC is more important than you might think. TCP checksums are only 16 bits, so roughly 1 in 65,536 corrupted segments can slip through undetected. With today's internet traffic volumes, that happens more often than you'd expect.
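As a sketch of that kind of application-level check in PHP, assuming the client sends the file's SHA-256 hash in a custom X-Content-Sha256 header (a plain hash rather than a keyed MAC, but it catches transport corruption in the same way); the header and field names are illustrative.

```php
<?php
// verify.php - compare the received file against a client-supplied hash
// before accepting it. Paths and names are placeholders.
if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed');
}

$expected = $_SERVER['HTTP_X_CONTENT_SHA256'] ?? '';          // sent by the client
$actual   = hash_file('sha256', $_FILES['file']['tmp_name']); // computed server-side

if (!hash_equals($expected, $actual)) {
    http_response_code(422);
    exit('Checksum mismatch - please resend the file');
}

move_uploaded_file($_FILES['file']['tmp_name'], __DIR__ . '/uploads/verified.bin');
echo 'Upload verified';
```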
"Is HTTP uploading reliable enough for such large files?"
One major advantage of FTP would be the ability to resume aborted uploads. Most FTP servers and clients support this, though it's not always activated. Whereas with HTTP, it's theoretically possible using special headers, but a normal client (i.e. browser) will not support it.
Another advantage would be bulk uploads: very simple in FTP, not so in HTTP.
But why not simply offer both? HTTP for those who are behind proxies or won't/can't use an FTP client, and FTP for people who need to do many or large uploads over unreliable connections.
I do not want to be sarcastic, but the File Transfer Protocol must surely be more reliable at file transfer :)
Resource availability/usage is more of an issue than reliability or speed. Each upload consumes resources (thread, memory, etc.) on your web server for the duration of the upload. If upload traffic for large files is significant, it would be better to use FTP simply to free your HTTP server to be more responsive to page requests.
I definitely opt for the HTTP approach, like the rest of the people here. The reason is what you've said about most of the files being one to three megabytes.
The problem is with the "rest", so:
Have you considered allowing users to send larger files by e-mail to a daemon script that picks up the messages and saves the attachments to the account associated with the sender?
Or there is the flash-uploader solution, in a Facebook-like approach.
FTP will consume less bandwidth than HTTP if the latter base64-encodes the binary content into plain text, which increases the total transfer size by about a third.
However, bandwidth consumption is not necessarily the major concern compared to other factors like usability and security, in which HTTP prevails.
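For what it's worth, the roughly one-third base64 overhead mentioned above is easy to verify, assuming the content really is base64-encoded for transfer:

```php
<?php
// Quick check of the base64 size overhead on 3 MB of random binary data.
$binary  = random_bytes(3 * 1024 * 1024);
$encoded = base64_encode($binary);

printf("raw: %d bytes, base64: %d bytes, ratio: %.2f\n",
    strlen($binary), strlen($encoded), strlen($encoded) / strlen($binary));
// Prints a ratio of 1.33, i.e. about a third larger.
```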