Large file uploads from web pages - PHP

I code primarily in PHP and Perl. I have a client who is insisting on seeking video submissions (any encoding) from the public via one of their pages rather than letting YouTube do its job.
The server in question is a virtual machine, and I can adjust ini settings for max post size, max upload size, etc. as needed.
My initial thought is to use a Flash-based uploader with PHP on the back end, but I wondered whether someone might have useful advice or experience on the subject.

Doing large file transfers over HTTP is not usually fun -- but sometimes it's necessary.
For large files, you'll definitely want to provide some kind of progress gauge for end-users.
There are flash-based tools that do this (swfUpload comes to mind).
If you want to avoid Flash and do it with plain HTML/JavaScript/CSS, you can leverage PHP's APC extension, which (somewhat surprisingly) provides support for querying upload status on the server, as explained here.
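To illustrate, here is a minimal sketch of that APC approach, assuming apc.rfc1867 is enabled in php.ini and the upload form's first field is a hidden input named APC_UPLOAD_PROGRESS containing a unique key; the script name and the "key" query parameter are illustrative, not part of any particular library:

<?php
// progress.php - polled by the page's JavaScript to draw a progress gauge.
header('Content-Type: application/json');

$key    = isset($_GET['key']) ? $_GET['key'] : '';
$status = apc_fetch('upload_' . $key);   // APC stores progress under "upload_<key>"

if ($status === false || $status['total'] == 0) {
    echo json_encode(array('percent' => 0));
} else {
    echo json_encode(array(
        'percent' => round($status['current'] / $status['total'] * 100),
        'done'    => (bool) $status['done'],
    ));
}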

You can adjust the post size and use a normal HTML form. The big problem is not Apache, it's HTTP. If anything goes wrong in the transmission you will have no way to detect the error. Furthermore, there is no way to resume the transfer. This is exactly why BitTorrent is so popular.

I don't know how opposed to YouTube your client is, but you can use their API to do the uploads from a page on your own site.
http://code.google.com/apis/youtube/2.0/developers_guide_protocol.html#Uploading_Videos
See the section on browser-based uploading.

For web-based uploads, there aren't many options. Regardless of web platform, web server, etc., you're still transferring over HTTP, and the transfer is all or nothing.
Your best option might be to find a Flash, Java, or other client-side component that can chunk files and upload them piecemeal, then do a checksum to verify. That would allow for resuming uploads. Unfortunately, I don't know of an open-source component that does this.
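As a rough sketch of what the server half of such a component could look like in PHP (the field names upload_id, chunk and md5 are assumptions for illustration; the client-side chunking in Flash/Java/JavaScript is not shown):

<?php
// receive_chunk.php - append one uploaded chunk, or verify the finished file.
$dir    = '/tmp/uploads';
$id     = preg_replace('/[^a-zA-Z0-9]/', '', $_POST['upload_id']); // sanitise the id
$target = $dir . '/' . $id . '.part';

if (isset($_FILES['chunk'])) {
    // Append this chunk to the partially assembled file.
    file_put_contents($target, file_get_contents($_FILES['chunk']['tmp_name']), FILE_APPEND);
    echo 'OK';
} elseif (isset($_POST['md5'])) {
    // Final request: compare against the client-supplied checksum so the
    // client knows whether it has to re-send anything.
    if (md5_file($target) === $_POST['md5']) {
        rename($target, $dir . '/' . $id . '.complete');
        echo 'COMPLETE';
    } else {
        echo 'CHECKSUM_MISMATCH';
    }
}

This assumes the client sends the chunks strictly in order; resuming then just means asking the server how many bytes it already has (filesize($target)) and continuing from there.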

Try to convince your client to change their point of view.
Using HTTP (and the browser, of all things) for this kind of task is rarely a good deal. Will his users really leave the computer and the browser running for 40 minutes until the upload is complete?
I don't think so.
Maybe you could set up a public FTP account where users can upload but not download or see other users' files. Then whoever wants to use FTP software can, and whoever prefers to do it via the browser can too.
The big problem with using a browser is that if something goes wrong you can't resume; you have to restart from zero again.
Last year I had the same issue. I took a look at ZUpload, but I didn't end up using it, so I can't really vouch for it. (In the end we wrote a small Python script that we send to our customer; the script creates a torrent of the folder the customer needs to send to us, and we download it via uTorrent.)

I used JUpload. Yes, it looks horrible, but it just works.
With that said, it's still a better idea to convince the client that doing so is stupid.

I would agree with others that using plain HTML is a poor option. I believe there is a size limitation with Flash as well. I know of a script that uses a Java applet to perform an actual FTP transfer. It is called Simple2FTP and can be found at http://www.simple2ftp.com
Not sure but perhaps worth a try?

Related

Handling Very Large Uploads [duplicate]

I want to allow uploads of very large files into our PHP application (hundreds of megabytes up to 8 GB). There are a couple of problems with this, however.
Browser:
HTML uploads have crappy feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all.
The Flash uploader puts the entire file into memory before starting the upload.
Server:
PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd rather not set this globally.
The server also requires some other variables to be present in the POST vars, such as a secret key. We'd like to be able to refuse the request right away, instead of after the entire file has been uploaded.
Requirements:
HTTP is a must.
I'm flexible with client-side technology, as long as it works in a browser.
PHP is not a requirement, if there's some other technology that will work well on a linux environment, that's perfectly cool.
upload_max_filesize can be set on a per-directory basis; the same goes for post_max_size
e.g.:
<Directory /uploadpath/>
    php_value upload_max_filesize 10G
    php_value post_max_size 10G
</Directory>
(The same php_value lines can also go in an .htaccess file for that directory, if AllowOverride permits it.)
Python Handler?
Consider using a Python POST handler instead of PHP. Generate a unique identifier from your PHP app that the client can put in the HTTP headers, and use mod_python to reject or accept the large upload before the entire POST body is transmitted.
I think
http://www.modpython.org/live/current/doc-html/dir-handlers-hph.html
allows you to check the headers and decline the rest of the POST input. I haven't tried it, but it might be the right path?
Looking at the source of mod_python, the buffering of the input via read() seems to allow bit-at-a-time evaluation of the HTTP input. Headers are first.
https://svn.apache.org/repos/asf/quetzalcoatl/mod_python/trunk/src/filterobject.c
It's old, I know, but maybe someone has this problem nowadays too.
Now you can do this with only JavaScript and, say, PHP. No Flash or Java required on the client side.
demo: http://dnduploader.filkor.org/
The idea is to slice the files with JavaScript's Blob slice() method...
How about a Java applet? That's how we had to do it at a company I previously worked for. I know applets suck, especially in this day and age with all our options available, but they really are the most versatile solution to desktop-like problems encountered in web development. Just something to consider.
You can set post_max_size for scripts in just one directory. Place your upload script there, and allow only that script to handle large sizes. It's still possible for that script to be attacked with large/useless files, but it avoids setting the limit globally.
Use that with APC and you might be able to work out something good:
IBM developerWorks article on APC
I've tried all of these... this is by far the best I have used yet:
http://www.uploadify.com/
Take a look at jumploader.com
A good Java applet for uploading.
I've used it for uploading images and it works fine. I haven't tried with files bigger than 10 MB, but it should work for really big files too.
Have you looked into using APC to check the progress and total file size? Here is a good blog post about it. It might help.
Maybe you could use WebDAV and JavaScript in the browser.
AJAX big-file upload, with progress, to WebDAV:
http://www.webdavsystem.com/ajax/programming/upload_progress
A simple library
http://debris.demon.nl/projects/davclient.js/doc/README.html
You can then get the JS to redirect the user to a success page. Secret keys and whatnot can be handled in a PHP prelude before handing off to the JS client -> WebDAV; a rough sketch of such a prelude is below.
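A minimal sketch of that PHP prelude, assuming session-based login; the token scheme and the WebDAV URL are illustrative assumptions, and how the WebDAV server validates the token is left out:

<?php
// prelude.php - authorise the user before the JS client ever talks to WebDAV.
session_start();

if (!isset($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not logged in');
}

// Issue a short-lived token that the WebDAV server (or a proxy in front of it)
// can be configured to check on each request.
$token = md5(uniqid(mt_rand(), true));
$_SESSION['upload_token'] = $token;

header('Content-Type: application/json');
echo json_encode(array(
    'davUrl' => 'https://files.example.com/dav/' . $_SESSION['user_id'] . '/',
    'token'  => $token,
));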
I would look into FTP, SSH or SCP. This allows you to upload a large file and still have access control over the file as well. It might take a little longer to implement, but it's probably the most secure approach I can think of.
I know it sucks to add another dependency, but in my experience most websites doing something like this use Flash on the client side and upload the large file as chunks.
Adobe has a how-to on Flash file uploads.
I also found this tutorial on codeproject:
Multiple File Upload With Progress Bar Using Flash and ASP.NET
PS - I know you're using PHP and not .NET; I figured the important part was the Flash ;)
I've had success with Uploadify, and I would recommend it. It's a jQuery/Flash script that handles large uploads, and you can pass extra parameters to it (like the secret key). To address the server-side issues, use something like the following code. The changes take effect only for the script they're called in:
// Check that the secret key is present and valid before doing anything else.
if (!isset($_POST['secret_key']) || !isValid($_POST['secret_key'])) {
    exit("Invalid request");
}

function isValid($key)
{
    // Put your validation code here.
}

// This line changes the timeout.
// Give it a value in seconds (3600 = 1 hour).
set_time_limit(3600);

// Set these amounts to whatever you need. Note that post_max_size and
// upload_max_filesize are PHP_INI_PERDIR settings, so ini_set() calls like
// these may have no effect at runtime; if they don't, set the same values
// in php.ini or a per-directory .htaccess instead.
ini_set("post_max_size", "8192M");
ini_set("upload_max_filesize", "8192M");

// Generally speaking, memory_limit should be higher than your post size,
// so make sure that's right too.
ini_set("memory_limit", "8200M");
EDIT In response to your comment:
Given what you've said, I'm afraid you may not be able to meet your requirements over HTTP. All of the solutions out there are code that bolts onto HTTP features it was never designed for.
As you said yourself, it's a simple protocol. Short of writing your own client software that runs outside the browser, using a Java applet, or using a different protocol (like FTP, which was designed for this), you might not get what you want.
I've done the best I could within the given constraints. Sorry I couldn't do better.
Try this: http://www.simple2ftp.com uses a Java-based FTP applet from within a clever PHP application wrapper.

File Upload and Security issues

I want to set the default value of an input of type file. I have searched a lot, but everyone says it is impossible for security reasons.
Is there any way to set a default value so that, when the user uploads a file without navigating to it, the browser simply prompts him "you are going to upload a file from this location" and uploads only if he agrees?
Then there would be no security conflict. Please tell me whether there is any API for this problem, even in HTML5, or some other kind of solution.
In my case the user has to upload a file from the same location 500 times a day.
He wants to set the path once, and from then on it uploads from the same (previous) location.
Also, what if I use a Java applet for this purpose?
Nope, still a security issue. Browsers do not even let you open a file dialog via JavaScript.
As requested by the OP (although fastreload has already stated this in his answer, and therefore I believe my answer to be unnecessary)...
Browsers block the setting of the value of an <input type="file"> control for very good security reasons. This includes both pre-setting of the value in the HTML (from something like PHP / ASP.NET / static HTML) and the setting via client-side JavaScript.
The reasons are clear: browsers cannot trust the authors of HTML. If they did, websites would be able to upload any file from the local computer without the user's permission.
You could use an ActiveX control (OCX) or a Java Applet to achieve this, but it will still require the user to approve the installation of it.
I will also add what has been mentioned a couple of times in the comments: a user being expected to upload a file "500 times a day"(!!) sounds like an exceedingly bad piece of design. Consider instead building an application (non-web, just a normal desktop app) that can be installed on the client machine to upload the file in question.
A trusted Java applet could achieve the stated functionality.
But to save the user visiting the applet page 500 times a day, I would go with the suggestion of #fastreload and make it a (trusted) desktop application that is launched using Java Web Start (if it is a Java based app.).

FTP-upload with PHP, print simple percentage?

Hey guys,
I know there are a lot of "for-me-too-complicated" versions of progress bars for PHP uploads out there.
However, I have only a really basic knowledge of PHP and I have no idea how to implement this stuff.
I wrote a working file-upload script that transfers files from the user to my FTP server. I'm using ftp_connect and ftp_put to do so.
I wonder how complicated it is to print a SIMPLE percentage value on the page, to let the user know how far the upload has progressed.
I don't want any animated JavaScript stuff, just a simple percentage that shows the progress.
Do you know a tutorial or something, or can you maybe give me a little explanation of how I could do that? At least, which methods return a progress value?
Thank you in advance, Matt
I know you said the Flash uploaders are too complicated for you and you need a simple solution, but the truth is there are none. If you could start your project over, I would recommend using some known CMS with file-upload support.
I think you should really give something like Uploadify another chance. If you have problems with it, ask here! There is an uploadify tag and really helpful people.
Edit after your comment: as seen on this page, the idea is to use Uploadify to get the file to your server and then move it normally using FTP to your other space/server.
PHP/Apache talks to the client in a single request only. There is no simple way to let the client know how far along the server is in the process. On uploads the file travels from the client to the server, so we generally use Flash, which can give us that information.
client (flash) -> server
What you are asking for is something a bit fancier:
client -> server -> ftp
You want to know the progress between the server and the FTP server. Mind you, even if you don't realize it, the files are actually being transferred to your server first and then from your server to the FTP server.
You will probably want to have the server update a database (or a file) at given intervals with the progress so far, and have the client poll the server via AJAX to find out where it is at; a rough sketch of that approach follows below.
You can also give socket.io a look!
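For reference, a rough sketch of the polled approach using PHP's non-blocking FTP functions; the progress-file naming, the credentials and the id parameter are assumptions for illustration:

<?php
// upload_to_ftp.php - push the uploaded file to the FTP server while recording
// a percentage that a second, polled script can echo back to the browser.
$id        = preg_replace('/[^a-zA-Z0-9]/', '', $_GET['id']);
$localFile = $_FILES['userfile']['tmp_name'];
$total     = filesize($localFile);

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

$fp  = fopen($localFile, 'rb');
$ret = ftp_nb_fput($conn, 'remote-name.dat', $fp, FTP_BINARY);

while ($ret === FTP_MOREDATA) {
    // ftell() on the local handle tells us how many bytes have been sent so far.
    $percent = $total > 0 ? round(ftell($fp) / $total * 100) : 0;
    file_put_contents("/tmp/progress-$id.txt", $percent);
    $ret = ftp_nb_continue($conn);
}
file_put_contents("/tmp/progress-$id.txt", $ret === FTP_FINISHED ? 100 : -1);

fclose($fp);
ftp_close($conn);

A second script, requested periodically by the page (a plain meta refresh is enough if you really want no JavaScript), would simply read /tmp/progress-<id>.txt and print the number.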

Best Practice for Uploading Many (2000+) Images to A Server

I have a general question about this.
When you have a gallery, sometimes people need to upload thousands of images at once. Most likely, it would be done through a .zip file. What is the best way to go about uploading this sort of thing to a server? Many times, servers have timeouts etc. that need to be accounted for. I am wondering what kinds of things I should be looking out for and what the best way is to handle a large number of images being uploaded.
I'm guessing that you would allow a user to upload a zip file (assuming the timeout does not affect you), and this zip file is uploaded to a specific directory; let's assume in this case a directory is created for each user in the system. You would then unzip it on the server and scan the user's folder for any directories containing .jpg, .png or .gif files (etc.) and import them into a table accordingly, I'm guessing labelled by folder name (a sketch of that step follows this question).
What kind of server side troubles could I run into?
I'm aware that there may be many issues. Even general ideas would be good, so I can then research further. Thanks!
Also, I would be programming in Ruby on Rails, but I think this question applies across languages.
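A minimal sketch of the unzip-and-scan step described above, in PHP since that is this document's language rather than Rails; the paths, the $userId value and the images table are illustrative assumptions:

<?php
// import_zip.php - extract an uploaded zip and record every image it contains.
$userId  = 42;
$userDir = "/var/www/uploads/$userId";
$zipPath = "$userDir/batch.zip";

$zip = new ZipArchive();
if ($zip->open($zipPath) === true) {
    $zip->extractTo($userDir);
    $zip->close();
}

// Recursively find image files and record them, labelled by their folder name.
$db   = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'password');
$stmt = $db->prepare('INSERT INTO images (user_id, label, path) VALUES (?, ?, ?)');
$it   = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($userDir));

foreach ($it as $file) {
    if ($file->isFile() && preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
        $label = basename(dirname($file->getPathname()));
        $stmt->execute(array($userId, $label, $file->getPathname()));
    }
}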
There's no reason why you couldn't handle this kind of thing with a web application. There are a couple of excellent components that would be useful for this:
Uploadify (based on jQuery/Flash)
Plupload (from Moxiecode, the TinyMCE people)
The reason they're useful is that, in the first instance, a Flash component handles the uploads, so you can select groups of files from the file browser window (assuming no one is going to individually select thousands of images..!), and with Plupload drag and drop is supported too, along with more platforms.
Once you've got your interface working, the server side stuff just needs to be able to handle individual uploads, associating them with some kind of user account, and from there it should be pretty straightforward.
With regard to server-side issues, that's really a big question: it depends on how many people will be using the application at the same time, the size of the images, and any processing that takes place afterwards. Remember, the files are kept in a temporary location while the script is processing them, and either deleted upon completion or copied to a final storage location by your script, so space, memory overheads and timeouts could all be issues.
If the images are massive in size, say RAW or TIFF, then this kind of thing could still work with chunked uploads, but implementing some kind of FTP upload might be easier. It's a bit of a vague question, but there should be plenty here to get you going ;)
For that many images it has to be a serious app, which gives you the liberty to suggest a piece of software running on the client (something like Yahoo Mail or Picasa does) that will take care of 'managing' the upload of images (network interruptions, resume support, etc.).
For the server side, you could process the images one at a time (assuming your client sends them that way), thus keeping it simple.
Take a peek at http://gallery.menalto.com
They have a dozen methods for uploading pictures into galleries.
You can choose the one that suits you.
Either have a client app, or some AJAX code that sends the images one by one, preventing timeouts. Alternatively, if this is not available to the public, FTP still works...
I'd suggest a client application (maybe written in AIR or Titanium) or telling your users what FTP is.
deviantArt.com for example offers FTP as an upload method for paying subscribers and it works really well.
Flickr instead has its own app for this: the "Flickr Uploadr".

Quantify streamed video

I'm developing a PHP application which will charge users for the videos they watch. The business model is "everyone pays for how much she watches". For this purpose, I need to;
Implement secure video (FLV) access. (Authorized sessions will gain access)
Calculate how much video (FLV) data is sent from the server.
A trivial solution for this is to read the FLV with PHP ("fread") and send it to the client chunk by chunk (just "echo"). However, I have real performance concerns about this method, because the application server has 1.7 GB of RAM and just a single core.
In the short run we're expecting a large number of impressions, but we would like to upgrade the hardware as late as possible. That's why I want to implement the requirement with the minimum overhead, in the most effective way.
I'm not tied to a web server. I prefer Apache 2.2; however, lighttpd could also be deployed if it offers a feature that helps with the implementation.
Any idea is appreciated.
Thanks!
The PHP fread solution looks like the way to go, but with the server restriction, I think you will need to tweak the Flash player. The Flash player could send the server messages based on how much of the video has been played. This might be something to think about. Take a look at the JW FLV Media Player; its customisation and JavaScript integration will allow you to send XMLHttpRequests to the server.
Why not use a video-streaming server like Red5? I'm sure it has triggers that could write some statistics to a DB or something similar.
Another advantage would be that the user could skip forward in the video.
So, to sum up and for future reference, I decided to go with the PHP fread method, since no satisfactory alternative was suggested; a rough sketch of that approach follows.
Thanks to all contributors.
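For completeness, a minimal sketch of that fread approach; the session check, file layout and the logBytesServed() helper are assumptions, not a finished implementation:

<?php
// serve_flv.php - authorise the viewer, stream the FLV in chunks with
// fread()/echo, and count the bytes actually delivered for billing.
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$file = '/var/videos/' . basename($_GET['video']) . '.flv';  // basename() avoids path traversal
$size = filesize($file);

header('Content-Type: video/x-flv');
header('Content-Length: ' . $size);

$bytesSent = 0;
$fp = fopen($file, 'rb');
while (!feof($fp) && !connection_aborted()) {
    $chunk = fread($fp, 8192);
    echo $chunk;
    flush();
    $bytesSent += strlen($chunk);
}
fclose($fp);

// Record what was actually delivered, even if the client disconnected early.
logBytesServed($_SESSION['user_id'], $_GET['video'], $bytesSent);  // hypothetical helper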
