I have a standard HTML form, including a file input, allowing users of a web application to upload files (pictures, documents or videos).
It technically works, except when it comes to large files...
I have a personal dedicated server on which I can change the PHP configuration to handle larger files;
but this is not a reliable solution, as my client is on shared hosting.
I know HTTP has different limits and is definitely not the best protocol to handle files.
So my objective is to avoid HTTP upload.
I was wondering if there is any way to rely on FTP to upload the selected file.
Is there any solution to upload files through FTP, directly from the client's browser?
EDIT:
I've read about some solutions based on Java applets, however that is not
really something I can, or even want to, provide.
And as the files are confidential, using a third-party service is
not possible either.
I'm currently trying to get an upload screen up and running in a PHP application (in combination with Zend Framework 1.9).
I know how to do the upload itself and also do some basic validations of the files.
What I am wondering is: what are common methods to ensure that the file has been uploaded correctly?
I have looked through the validator examples (especially the MD5 one). Example:
$upload = new Zend_File_Transfer();
$upload->addValidator('Md5', false, '3b3652f336522365223');
But as I understand it, you would already need to know what the resulting MD5 hash SHOULD be.
Thus the user would have to put that info into the upload screen so that it can be transmitted alongside the file (it could be that I'm mistaken there though).
So my question is: is there any way in Zend Framework to validate files without needing additional input from the user (beyond the file itself)?
Or what are the practices used there?
Thanks
There is no need for such a thing in my opinion. This is the responsibility of the TCP protocol, web browser, and PHP engine.
TCP guarantees that packets received by a server are the same as they were when sent by a client. Web browsers are tested to ensure that HTTP uploads are handled properly and files are uploaded correctly, and similarly PHP is tested to guarantee that HTTP file uploads are saved to the server with the same contents that were uploaded.
Doing something like this is just extra work and will likely never catch an error. To do it, you'd have to use the newer HTML5 features that allow JavaScript to read local files and implement a hash function in JS, which means clients would need HTML5-capable browsers supporting the File API.
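If you really did want an end-to-end check despite that, one approach is to have the client send a hash along with the file and compare it on the server. A minimal sketch, assuming hypothetical form fields named userfile and expected_md5:

<?php
// Sketch: compare a client-supplied MD5 against the hash of the received file.
// 'userfile' and 'expected_md5' are hypothetical form field names.
if (isset($_FILES['userfile'], $_POST['expected_md5'])
    && is_uploaded_file($_FILES['userfile']['tmp_name'])) {

    $actual = md5_file($_FILES['userfile']['tmp_name']);

    if ($actual === strtolower(trim($_POST['expected_md5']))) {
        move_uploaded_file(
            $_FILES['userfile']['tmp_name'],
            '/var/private/' . basename($_FILES['userfile']['name'])
        );
        echo 'Upload verified and stored.';
    } else {
        echo 'Checksum mismatch, please retry the upload.';
    }
}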
I need to write a script to upload big files (~2GB+) to a server.
I don't think HTTP is the right way to do this so I want to use (S)FTP.
There are several tutorials about this (using cURL or ftp_connect) and I understand that I have to set several things in php.ini.
But all these tutorials upload the file to a remote server; what I want to do is upload it to the server the script is running on, without having to upload the file to that server over HTTP first.
Is this possible? If so, how would I do that?
HTTP can be the right way to upload large files. You can use resumable.js or a similar library to split the file into "chunks" and then reassemble the file on the server.
If you decide not to go with HTTP and you have shell access, I recommend rsync (with the --partial flag), which will do the heavy lifting for you.
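If you go the chunked route, the receiving script mostly just appends the chunks in order. A rough sketch follows; the chunk, chunks and name parameter names are assumptions here, since each library (resumable.js, plupload, ...) uses its own:

<?php
// Sketch: append each uploaded chunk to a single target file.
// 'chunk', 'chunks' and 'name' are assumed parameter names; adapt them
// to whatever your upload library actually sends.
$targetDir = '/var/uploads'; // assumed destination directory
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;
$name   = basename(isset($_POST['name']) ? $_POST['name'] : $_FILES['file']['name']);

$partFile = $targetDir . '/' . $name . '.part';

// Truncate on the first chunk, append on the rest.
$out = fopen($partFile, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
while ($buf = fread($in, 8192)) {
    fwrite($out, $buf);
}
fclose($in);
fclose($out);

// When the last chunk arrives, move the assembled file into place.
if ($chunk === $chunks - 1) {
    rename($partFile, $targetDir . '/' . $name);
}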
I am debating whether to use FTP or HTTP for large file uploads and downloads. File uploads will consist of text and audio files, ranging from a couple of KB to 200 MB. I have a couple of questions, such as:
Which would be faster, HTTP or FTP?
Which would be more reliable?
Which would be easier and more convenient for the end user?
What other alternatives are there for larger file uploads?
Note: I need to somehow keep track of which files each user uploads.
Thanks a bunch!
In my opinion, file transfers which are part of website navigation should use the same protocol, as switching protocols may require additional work on your server and will open a new connection.
Moreover, HTTP supports the POST and PUT methods, which are designed for exactly that purpose.
If you simply want to upload files, then FTP is the dedicated protocol, but it may not be implemented in all web browsers.
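To illustrate the PUT path mentioned above, a small sketch: the request body is streamed straight to disk, and the destination path is just an example.

<?php
// Sketch of handling an HTTP PUT upload: stream the request body
// straight to disk. The destination path is only an example.
if ($_SERVER['REQUEST_METHOD'] === 'PUT') {
    $in  = fopen('php://input', 'rb');
    $out = fopen('/var/private/upload.bin', 'wb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}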
I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged-in user, the appropriate headers are sent (likely application/octet-stream for automatic downloading), the output buffer is flushed, and the file is sent via readfile.
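Roughly, I imagine the download script looking something like this (the path and file name are just examples, and I'm assuming the session check has already passed):

<?php
// Sketch: serve a private file through PHP after the login check has passed.
// Path and file name are examples only.
$file = '/var/private/video.mp4';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

// Drop any output buffering so the file streams out directly.
while (ob_get_level() > 0) {
    ob_end_clean();
}
readfile($file);
exit;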
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file through normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so I can't access the file, neither can PHP).
Crap, while I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
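A minimal sketch of what that looks like with Apache's mod_xsendfile enabled (the path is just an example):

<?php
// Requires Apache with mod_xsendfile enabled (XSendFile On).
// Apache reads the X-Sendfile header and serves the file itself,
// so PHP never has to stream it. The path below is an example.
$file = '/var/private/video.mp4';

header('X-Sendfile: ' . $file);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
exit;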
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this?
HTTP isn't really meant for large files but FTP is.
The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can protect the directory with a password via an .htaccess file. That way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend you go with your method.
After tearing my hair out for the last week, I am looking for some sort of web uploader that allows my customers to upload a bunch of files (often up to 200) and stores them on a remote FTP server. What I am looking for is something similar to uploadify, swfupload etc., but which uploads files via my web page (at my hosting company) and stores them on my local FTP server.
It is absolutely critical that it can store the files on my local server.
If this is somehow impossible to do, it could also just upload the files to my website via HTML (which uploadify etc. does) and, after completion, copy the files from the web server to my local FTP server.
The closest thing I found was something called filechunker, and it looked like the perfect solution, BUT it won't let me add multiple files, only one at a time.
All help would be greatly appreciated!
Unfortunately I can't give you a concrete answer, but let me say that it should be theoretically possible to do for a Flash or Java application since they can use raw TCP sockets and implement the FTP protocol (but I am not aware of any Flash-based implementation).
If I'm not mistaken, all major browsers offer native file upload via FTP by browsing to the FTP directory itself (but you can't influence the visual appearance), just like Windows Explorer can access FTP servers and use them like a network drive.
However, I discourage you from using an FTP server at all. That protocol, with its dual connections and its passive/active modes, often causes problems. It's usually much better to upload via HTTP and implement an HTTP-based file server yourself, which is rather easy after all (but be very careful not to expose too much of your server's file system).
I see no real reason for using FTP unless you really want to allow your users to use their FTP client of choice, but that is contrary to your question.
Hope this helps.
Update: I just noticed the sentence "copy the files from the web server to my local ftp". In case you are really talking about two different servers, I would still suggest an HTTP upload and then forwarding the file to the FTP server via a PHP script (your web server acting as a proxy).
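A rough sketch of that proxy idea using PHP's FTP extension; the host, the credentials and the userfile field name are placeholders:

<?php
// Sketch: receive the HTTP upload, then push it on to the FTP server.
// Host, credentials and the 'userfile' field name are placeholders.
if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $conn = ftp_connect('ftp.example.com');
    if ($conn && ftp_login($conn, 'username', 'password')) {
        ftp_pasv($conn, true); // passive mode is usually friendlier to firewalls
        ftp_put(
            $conn,
            '/incoming/' . basename($_FILES['userfile']['name']),
            $_FILES['userfile']['tmp_name'],
            FTP_BINARY
        );
        ftp_close($conn);
    }
}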
I don't think it's feasible to upload directly from the browser to your FTP server, as you would have to have your credentials more or less visible on the website (e.g. in your JavaScript source).
I once created something similar, but because of that issue I decided to upload via plupload to Amazon S3 and sync the files afterwards via s3sync. The advantages were:
Large file sizes (2GB+)
One-time tokens for upload, no need to send credentials to the client
No traffic to your web server (the communication runs client -> S3)
Take a look at this thread for an implementation: http://www.plupload.com/punbb/viewtopic.php?id=133
After a wild search I finally found something that I could use. This Java applet lets me upload endless amounts of files, zips them up, and I managed to pass a PHP variable into the applet so the zip file is stored with the user's e-mail address as the filename. Cost me $29 though, but well worth it since I now have full control of where the files go and who uploaded them.