I am developing a website that involves uploading videos larger than 50 MB.
Which is a better (faster) way of uploading the files to the server:
uploading the video files via ftp
or
uploading the files via a form
Thanks
The best way would be with FTP.
FTP is much faster for larger files. For files below 1 MB the difference won't matter as much.
P.S. If you are not the one uploading, think about which is easier for your users. A form is easier, but FTP is still faster.
For user experience you should go with the form file upload. The speed of both depends on the internet connection and the load on the client and server, and won't differ that much. It might be a bit much for your web server if it's handling a lot of users, but you can use, for example, nginx to make that less of a problem.
Edit:
Here is a comparison: http://daniel.haxx.se/docs/ftp-vs-http.html
I use Jupload
It splits the files and uploads them via HTTP. It's also good because you don't need to care about file upload limitations in the server config. Speed depends mostly on the client's connection, for both HTTP and FTP. Of course there are some differences, but they are not that big.
Why not offer both? (Seriously - I wrote an app about ten years ago that did this.) Look up "MOVEit DMZ" or research various FTP servers with web portal integration to see how it's being done today.
There's also a third way you should consider, which was touched on by the Jupload comment: a local control (Flash, Java, ActiveX, Firefox plug-in, etc.) that optimizes the upload experience. If people are uploading multiple large files to your site, they may appreciate the speed/reliability boost.
Okay, here's the rundown.
I'm developing a video hosting site with PHP and jQuery primarily. My client is absolutely set on using Filezilla to upload his videos via FTP instead of allowing me to handle the upload directly with PHP. This makes me mildly psychotic, but hey, the money man calls the shots.
Since I have no way of detecting the completion of those FTP uploads, I set him up with a special FTP account that is rooted in the uploads directory, and designed an interface that monitors that folder and allows him to start transcoding those files for the web at his discretion. This is less than optimal. I'd like to show the progress of currently processing uploads on my mobile app, and hopefully start encoding files automatically when the FTP transfer completes.
So what I'm really asking for is either a method of detecting that a file is still being uploaded, or some way of finding the size of the original file that is being uploaded so that I can do comparisons.
Thanks!
On Linux, you can use fuser to see if a file is in use by a process. A more reliable option would be the inotify system, which will give you real-time notifications of file change events.
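A rough sketch of the inotify route, assuming the PECL inotify extension is installed and using a placeholder uploads path; IN_CLOSE_WRITE fires when the FTP server closes a file it has finished writing:

    <?php
    // Sketch only: requires the PECL inotify extension. The uploads path
    // and the transcoding hook are placeholders.

    $uploadsDir = '/home/ftpuser/uploads';

    $fd = inotify_init();
    inotify_add_watch($fd, $uploadsDir, IN_CLOSE_WRITE | IN_MOVED_TO);

    while (true) {
        // Blocks until at least one event is available.
        foreach (inotify_read($fd) as $event) {
            $file = $uploadsDir . '/' . $event['name'];
            // Placeholder: kick off transcoding / notify the web app here.
            echo "Finished upload: $file\n";
        }
    }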
In case anyone else is ever wondering: the answer, reached through the comments and confirmed by more research on my end, is that the total size of a file is not passed to the server at all during FTP transfers. Thanks to Jonathan Amend.
After tearing my hair out for the last week, I am looking for some sort of web uploader that allows my customers to upload a bunch of files (often up to 200) and have them stored on a remote FTP server. What I am looking for is something similar to uploadify, swfupload, etc., but it must be able to accept uploads via my web page (at my hosting company) and store them on my local FTP server.
I am looking for something similar to uploadify, swfupload and such, but it is absolutely critical that it can store the files on my local server.
If this is somehow impossible to do, it could also just upload the files to my website via an HTML form (which uploadify etc. does) and, after completion, copy the files from the web server to my local FTP.
The closest thing I found was something called filechunker, and it looked like the perfect solution, BUT it won't let me add multiple files, just one by one.
All help would be greatly appreciated!
Unfortunately I can't give you a concrete answer, but let me say that it should be theoretically possible to do for a Flash or Java application since they can use raw TCP sockets and implement the FTP protocol (but I am not aware of any Flash-based implementation).
If I'm not wrong all major browsers offer native file upload via FTP by browsing to the FTP directory itself (but you can't influence the visual appearance), just like Windows Explorer can access FTP servers and use them like a network drive.
However, I discourage you from using an FTP server at all. The protocol, with its double connection and its passive/active modes, often causes problems. It's usually much better to upload via HTTP and implement an HTTP-based file server yourself, which is rather easy after all (but be very careful not to expose too much of your server's file system).
I see no real reason for using FTP unless you really want to allow your users to use their FTP client of choice, but that is contrary to your question.
Hope this helps.
Update: I just noticed the sentence "copy the files from the web server to my local ftp". In case you are really talking about two different servers, I would still suggest an HTTP upload and then forwarding the file to the FTP server via the PHP script (your web server acting as a proxy).
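A rough sketch of that proxying script using PHP's built-in FTP functions; the host, credentials and the "userfile" form field name are made-up placeholders:

    <?php
    // upload.php (sketch): receive a file from an HTML form and forward it
    // to the FTP server. Credentials and field name are placeholders.

    $ftpHost = 'ftp.example.com';
    $ftpUser = 'uploader';
    $ftpPass = 'secret';

    if (!isset($_FILES['userfile']) || $_FILES['userfile']['error'] !== UPLOAD_ERR_OK) {
        exit('Upload failed');
    }

    $conn = ftp_connect($ftpHost);
    if (!$conn || !ftp_login($conn, $ftpUser, $ftpPass)) {
        exit('Could not reach the FTP server');
    }
    ftp_pasv($conn, true);   // passive mode usually gets through firewalls more easily

    $remote = basename($_FILES['userfile']['name']);
    if (!ftp_put($conn, $remote, $_FILES['userfile']['tmp_name'], FTP_BINARY)) {
        ftp_close($conn);
        exit('FTP transfer failed');
    }

    ftp_close($conn);
    echo 'Done';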
I don't think it's feasible to upload directly from the browser to your FTP server, as you would have to have your credentials more or less visible on the website (e.g. in your JavaScript source).
I once created something similar, but because of that issue I decided to upload via plupload to Amazon S3 and sync the files afterwards via s3sync. The advantages were:
Large file sizes (2 GB+)
One-time tokens for the upload, no need to send credentials to the client
No traffic to your web server (the communication runs client -> S3)
Take a look at this thread for an implementation: http://www.plupload.com/punbb/viewtopic.php?id=133
After a wild search I finally found something that I could use. This Java applet lets me upload endless amounts of files, zips them down, and I managed to pass a PHP variable into the applet so the zip file is stored with the user's e-mail address as the filename. Cost me $29 though, but well worth it since I now have full control of where the files go and who uploaded them.
I created a simple web interface to allow various users to upload files. I set the upload limit to 100 MB, but now it turns out that the client occasionally wants to upload files of 500 MB+.
I know how to alter the PHP configuration to change the upload limit, but I was wondering if there are any serious disadvantages to uploading files of this size via PHP?
Obviously FTP would be preferable, but if possible I'd rather not have two different methods of uploading files.
Thanks
Firstly FTP is never preferable. To anything.
I assume you mean that you are transferring the files via HTTP. While not quite as bad as FTP, it's not a good idea if you can find another way of solving the problem. HTTP (and hence the associated client and server programs) is optimized around transferring relatively small files around the internet.
While the protocol supports server-to-client range requests, it does not allow for the reverse operation, so an interrupted upload cannot simply be resumed. Even if the software at either end were unaffected by the volume, the more data you are pushing across, the greater the interval during which you could lose the connection, and that, combined with the inability to resume, is the biggest problem.
Regardless of the server technology you use (PHP or something else), it's never a good idea to push a file that big in one sweep in synchronous mode.
There are lots of plugins for any technology/framework that will do asynchronous upload for you.
Besides the connection timing out, there is one more disadvantage: file uploads of this size consume web server memory. You don't normally want that.
PHP will handle as many and as large a file as you'll allow it. But consider that it's basically impossible to resume an aborted upload in PHP, as scripts are not fired up until AFTER the upload is completed. The larger the file gets, the larger the chance of a network glitch killing the upload and wasting a good chunk of time and bandwidth. As well, without extra work with APC, or using something like uploadify, there's no progress report and users are left staring at a browser showing no visible signs of actual work except the throbber chugging away.
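For reference, the relevant php.ini directives all have to be raised together; the values below are just examples for roughly 500 MB uploads, so adjust as needed:

    ; php.ini - example values for ~500 MB uploads
    upload_max_filesize = 512M
    post_max_size       = 520M    ; must be at least upload_max_filesize
    max_execution_time  = 600
    max_input_time      = 600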
I have a general question about this.
When you have a gallery, sometimes people need to upload thousands of images at once. Most likely, it would be done through a .zip file. What is the best way to go about uploading this sort of thing to a server? Many times, servers have timeouts etc. that need to be accounted for. I am wondering what kinds of things I should be looking out for and what the best way is to handle a large number of images being uploaded.
I'm guessing that you would allow a user to upload a zip file (assuming the timeout does not affect you), and this zip file is uploaded to a specific directory; let's assume in this case a directory is created for each user in the system. You would then unzip the archive on the server and scan the user's folder for any directories containing .jpg, .png or .gif files (etc.), and then import them into a table accordingly, I'm guessing labeled by folder name (a rough sketch of this step is below).
What kind of server side troubles could I run into?
I'm aware that there may be many issues. Even general ideas would be good so I can then research further. Thanks!
Also, I would be programming in Ruby on Rails, but I think this question applies across any language.
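For illustration, the extract-and-scan step I have in mind looks roughly like this; it's sketched in PHP purely because the idea is language-agnostic, and the paths and the actual table import are placeholders:

    <?php
    // Sketch: extract an uploaded archive and collect the image files in it.
    // Paths are placeholders; the "import" is just a printout here.

    $zipPath   = '/uploads/user_42/batch.zip';
    $extractTo = '/uploads/user_42/extracted';

    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        exit('Could not open archive');
    }
    $zip->extractTo($extractTo);
    $zip->close();

    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($extractTo));
    foreach ($it as $file) {
        if ($file->isFile() && preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
            // Placeholder for the real import, keyed by the containing folder name.
            printf("%s => %s\n", basename($file->getPath()), $file->getPathname());
        }
    }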
There's no reason why you couldn't handle this kind of thing with a web application. There are a couple of excellent components that would be useful for this:
Uploadify (based on jquery/flash)
plupload (from moxiecode, the tinymce people)
The reason they're useful is that, in the first instance, a Flash component handles the uploads, so you can select groups of files from the file browser window (assuming no one is going to individually select thousands of images..!), and with plupload drag and drop is supported too, along with more platforms.
Once you've got your interface working, the server side stuff just needs to be able to handle individual uploads, associating them with some kind of user account, and from there it should be pretty straightforward (see the sketch below).
With regards to server side issues, that's really a big question, depending on how many people will be using the application at the same time, size of images, any processing that takes place after. Remember, the files are kept in a temporary location while the script is processing them, and either deleted upon completion or copied to a final storage location by your script, so space/memory overheads/timeouts could be an issue.
If the images are massive in size, say RAW or TIFF, then this kind of thing could still work with chunked uploads, but implementing some kind of FTP upload might be easier. It's a bit of a vague question, but there should be plenty here to get you going ;)
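To give an idea of the server side, a minimal receiver for those individual uploads could look something like this; the field name, target directory and user association are placeholders, so check what your chosen component actually posts:

    <?php
    // receive.php (sketch): handle one uploaded file per request.
    // "file", the directory layout and the user id are made up for illustration.

    $userId    = isset($_POST['user_id']) ? (int) $_POST['user_id'] : 0;
    $targetDir = '/var/www/gallery/uploads/' . $userId;

    if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
        exit('No file or upload error');
    }
    if (!is_dir($targetDir)) {
        mkdir($targetDir, 0755, true);
    }

    // Sanitize the client-supplied name before reusing it.
    $name = preg_replace('/[^A-Za-z0-9._-]/', '_', basename($_FILES['file']['name']));
    move_uploaded_file($_FILES['file']['tmp_name'], $targetDir . '/' . $name);

    echo 'OK';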
For that many images it has to be a serious app, which gives you the liberty to suggest a piece of software running on the client (something like Yahoo Mail or Picasa does) that will take care of "managing" the upload of images (network interruptions, resume support, etc.).
For the server side, you could process these one at a time (assuming your client is sending them that way), thus keeping it simple.
Take a peek at http://gallery.menalto.com
They have a dozen methods for uploading pictures into galleries.
You can choose the one that suits you.
Either have a client app, or some Ajax code that sends the images one by one, preventing timeouts. Alternatively, if this is not available to the public, FTP still works...
I'd suggest a client application (maybe written in AIR or Titanium) or telling your users what FTP is.
deviantArt.com for example offers FTP as an upload method for paying subscribers and it works really well.
Flickr instead has its own app for this, the "Flickr Uploadr".
I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that this limit be increased drastically to allow uploads of video etc.
However there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file) but I cannot find any equivalent of request.getInputStream in PHP.
Increasing the client post limits or the php.ini upload and max_execution_time settings is not really a solution for really large files (~1 GB): the browser may time out, and think of all those blobs stored in memory.
Is there any way to solve this problem using PHP on server side? I would appreciate your replies.
plupload is a javascript/php library, and it's quite easy to use and allows chunking.
It uses HTML5 though.
Take a look at the tus protocol, an HTTP-based protocol for resumable file uploads, so you can carry on where you left off without re-uploading the whole file again after an interruption. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, uppy, together with a Go- or PHP-based server implementation.
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
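As a rough sketch (the query parameter and target directory are made up), a chunk receiver can read the raw request body and append it to the file assembled so far:

    <?php
    // receive_chunk.php (sketch): the client POSTs each chunk as the raw
    // request body, e.g. receive_chunk.php?name=video.mp4
    // The parameter name and target directory are made up for illustration.

    $targetDir = '/var/www/uploads';
    $name = isset($_GET['name']) ? $_GET['name'] : 'upload.bin';
    $name = preg_replace('/[^A-Za-z0-9._-]/', '_', basename($name));

    $in  = fopen('php://input', 'rb');         // raw POST body, streamed
    $out = fopen("$targetDir/$name", 'ab');    // append this chunk to the file so far
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    clearstatcache();
    echo filesize("$targetDir/$name");         // report how many bytes have arrived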
I have created a JavaFX client to send large files in chunks of the maximum post size (I am using 2 MB) and a PHP receiver script to assemble the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute.
Try using the bigupload script. It is very easy to integrate and can upload files of up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the upload and PHP for the processing?
You can find an example here for Jupload:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package.
It supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume incomplete uploads.
You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and send it to your web service one chunk at a time. However, it seems a lot easier to create an SFTP directory and let clients just upload files with some pre-existing SFTP client app.