Hey, just a quick question for anyone who has done this. I want to create a video tube site. I have done file uploads before, but I was wondering if anyone could give me suggestions on what I am planning to do.
The way I am planning it is to have a folder in my web directory and to upload videos into it after virus scanning and checking the MIME type. The video will then be converted and compressed into FLV using FFmpeg.
I will change the file name and store the video's reference ID in MySQL so the file name can be fetched and served.
I will serve the files to a Flash player using HTTP_Download:
require_once 'HTTP/Download.php'; // PEAR HTTP_Download

$dl = new HTTP_Download();
$dl->setFile($path); // no need to quote the variable
// send only the base name, so the full server path isn't leaked to the client
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, basename($path));
$dl->setContentType('video/x-flv'); // 'video/flv' is not a registered MIME type
$dl->send();
Anyone have any suggestions? Is it a good idea to put all videos in one directory?
You may want to consider a Java-based uploader, as PHP can run into timeout problems on large uploads.
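For reference, these are the php.ini limits that usually cause large uploads to fail; the values below are purely illustrative, not recommendations:

; php.ini limits that commonly bite on large uploads
upload_max_filesize = 200M
post_max_size       = 210M   ; must be >= upload_max_filesize
max_execution_time  = 300    ; seconds the script may run
max_input_time      = 300    ; seconds PHP will spend reading the upload
memory_limit        = 256M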
Also, do your FFmpeg processing as a cron job, not at upload time, as it takes a long time.
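A minimal sketch of such a cron job, assuming a hypothetical videos table with id, source_path and status columns (adjust to your own schema and paths):

<?php
// Cron-driven converter: pick up pending uploads and transcode them to FLV.
$db = new PDO('mysql:host=localhost;dbname=tube', 'user', 'pass');

$pending = $db->query("SELECT id, source_path FROM videos WHERE status = 'pending'");
foreach ($pending as $video) {
    $out = '/var/media/flv/' . $video['id'] . '.flv';
    // escapeshellarg() keeps odd file names from breaking the command
    $cmd = 'ffmpeg -i ' . escapeshellarg($video['source_path'])
         . ' -ar 22050 -b:v 500k ' . escapeshellarg($out) . ' 2>&1';
    exec($cmd, $output, $ret);

    $stmt = $db->prepare('UPDATE videos SET status = ? WHERE id = ?');
    $stmt->execute([$ret === 0 ? 'ready' : 'failed', $video['id']]);
}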
Look into something like Wowza Streaming Server to serve the videos. It allows streaming, and everything is kept above the web root. I name each video with a UID and send a parameter to the Flash video player to decide which one to play.
Where and how you store them will largely depend on how secure they need to be (i.e. should people be able to access the files in the directory directly, or should they be stored more securely than that?).
If direct access is fine, then putting them all in one folder is okay. If not, then you may want to obscure folder names, store them in a database, or keep them in a folder that is not accessible from outside the server.
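If you go the obscured-names route, one sketch ($originalName here is just a placeholder for the uploaded file's name, and random_bytes() needs PHP 7+; on older versions something like md5(uniqid(mt_rand(), true)) would do):

<?php
// Store files under a random hex name instead of the user-supplied one.
$ext        = strtolower(pathinfo($originalName, PATHINFO_EXTENSION));
$storedName = bin2hex(random_bytes(16)) . '.' . $ext;
// e.g. "9f2c4b7e...d41a.jpg"; record the mapping in your database.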
Also, I'm hoping you're aware of the massive amounts of storage space and bandwidth such a service will consume? I hope you have a scalable solution ready to deploy if you're really serious about this.
I have seen many questions concerning the storage of user uploaded image files onto a web application, but most of these are dealing with the following:
Indexing of the images, so as to retrieve them later
How to store them (on the server itself as a file or in the database)
I have a question in regards to this subject, but the question is:
In what directory do I put the uploaded image file? (or other file type, for that matter)
I have a small group I am running PHP apps for. There are very few files that get uploaded, but nonetheless, they get uploaded.
I currently have them in my public HTML document root under /var/www/images/*; however, I am told that it is not smart to store user-uploaded content straight into the /var/www/* directory and that it should be stored elsewhere.
However I cannot find a straightforward statement of where "elsewhere" is.
Keep in mind I do not have a server farm where I can establish certain servers for specific purposes (such as uploaded user files).
Therefore, on a single webserver that hosts usual scripting files, etc. what is the best storage practice for such content?
Thank you.
I don't think there's necessarily a 'best practice' per se; anywhere on your server will be fine, so long as you're able to retrieve the images later on. Typically they'd go inside a folder under /var/www/images/.
Personally, I'd recommend creating a dedicated folder to store these user-uploaded images (something like /var/www/images/user_uploads), so that they don't get mixed up with other images you might have uploaded directly to /var/www/images/ (such as backgrounds or core imagery).
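A minimal upload handler along those lines might look like this (the form field name "image" is just an example):

<?php
// Drop incoming files into a dedicated user_uploads folder.
$dir = '/var/www/images/user_uploads';
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}

$name = basename($_FILES['image']['name']);   // strip any client-sent path
$dest = $dir . '/' . time() . '_' . $name;    // crude collision avoidance

if (move_uploaded_file($_FILES['image']['tmp_name'], $dest)) {
    echo 'Stored as ' . $dest;
}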
I'm facing a dilemma on how to implement file upload and download in a PHP website.
I have these criteria:
Performance - must not cause performance issues for the website
File size - around 2GB and up.
Authorization - I want to be able to control in PHP who can access the files, and to allow multiple users to gain access to a single file.
User friendly - no additional tools to use.
So here are the methods I'm currently looking at and how I assess them based on my criteria:
Database BLOB
Writing the file data into the output stream will take time and blocks other requests (is this correct?)
I read somewhere that there's a size limit for BLOBs (in MySQL a LONGBLOB tops out at 4GB, and max_allowed_packet limits what you can move in a single query).
OK - I can easily control who can download the files here.
OK - No additional tools, just the website.
FTP
OK - since it is designed to store files.
OK - file system is the limit.
I would need to create separate credentials for each user aside from the username and password for the website. I assume I would have to move a file from one location to another to update authorization, but how would that work if multiple users can access one file? A shared directory? It looks messy.
Users need another tool/program for accessing their files, and they need to remember another username and password.
My questions:
Based on my assumptions, do you think I understand the methods correctly?
If my assumptions are wrong, is there a way I can implement this functionality while meeting my criteria?
PS Please excuse my English.
Why not just use the file system to store the files, and store the path to each file (plus permissions, if needed) in a database?
The upload folder isn't accessible to the public, and a wrapper script serves the content to the user.
Performance shouldn't be a problem, as you just move/copy the uploaded file to a dedicated data directory.
File size isn't a problem (as long as you have enough disk space).
The wrapper script handles permissions and serves files to the users (see the sketch after this list).
It's as friendly as you design your UI.
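A sketch of such a wrapper script, assuming a hypothetical files table with path, name, mime and user_id columns and a logged-in user in the session:

<?php
// download.php?id=123
session_start();
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $db->prepare('SELECT path, name, mime FROM files WHERE id = ? AND user_id = ?');
$stmt->execute([$_GET['id'], $_SESSION['user_id']]);
$file = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$file || !is_readable($file['path'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not allowed');
}

header('Content-Type: ' . $file['mime']);
header('Content-Length: ' . filesize($file['path']));
header('Content-Disposition: attachment; filename="' . $file['name'] . '"');
readfile($file['path']);   // streams in chunks, so 2GB files won't exhaust memory

Since multiple users may share one file, a files_users join table checked in that query would cover the authorization criterion; for multi-gigabyte downloads you can also hand the actual transfer off to the web server with X-Sendfile (Apache) or X-Accel-Redirect (nginx).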
For me, using BLOBs is not the best option. I thought about BLOBs for uploading pictures on my own website, but the best way to store files is to put them directly on your server's file system.
My website allows users to upload photographs, which I store on Amazon S3. I store the original upload as well as an optimized image and a thumbnail. I want to allow users to export all of their original versions when their subscription expires. So I am thinking the following problems arise:
Could be a large volume of data (possibly around 10GB)
How to manage the download process, e.g. making sure it can resume if it gets interrupted, and how to verify successful download of the files
Should this be done with individual files, or by zipping the files and downloading them as one file or a series of smaller zipped files?
Are there any tools out there that I can use for this? I have seen Fzip, which is an ActionScript library for handling zip files. I have an EC2 instance running that handles file uploads, so I could use this for downloads also, e.g. copy the files from S3 to EC2, zip them, then send them to the user via a Flash downloader, using Fzip to uncompress the zip on the user's hard drive.
Has anyone come across a similar service / solution?
All input appreciated, thanks!
I have not dealt with this problem directly but my initial thoughts are:
Flash or possibly jQuery could be leveraged for a homegrown solution, having the client send back information on what it has received and storing that information in a database log. You might also consider using BitTorrent as a mediator: your users could download a free torrent client, and you could investigate a server-side torrent service (maybe RivetTracker or PHPBTTracker). I'm not sure how detailed these get, but at the very least, since you are assured you are dealing with a single user, once they become a seeder you can wipe the old file and begin on the next.
Break files larger than 2GB into 2GB chunks to accommodate users with FAT32 drives, which can't handle files larger than ~4GB. Break them down to 1GB chunks if space on the server is limited, keeping a benchmark in a database record of what's been zipped from S3.
Fzip is cool, but I think it's more for client-side archiving. PHP has ZIP and RAR libraries (http://php.net/manual/en/book.zip.php) you can use to round up files server-side. I think any solution you find will require you to manage security on your own, by keeping database records of who's got what along with download keys. Not doing so may lead to people leeching your resources as a file delivery system.
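For illustration, a rough ZipArchive sketch that builds capped zip volumes from files already copied down from S3 (the paths, user directory, and the 1GB cap are all placeholders):

<?php
// Bundle a user's originals into volumes no larger than ~1GB each.
$files    = glob('/tmp/exports/user42/*');
$maxBytes = 1024 * 1024 * 1024;   // 1GB per volume

$volume = 1;
$size   = 0;
$zip    = new ZipArchive();
$zip->open("/tmp/exports/user42_part{$volume}.zip", ZipArchive::CREATE);

foreach ($files as $file) {
    if ($size > 0 && $size + filesize($file) > $maxBytes) {
        $zip->close();            // seal this volume and start the next
        $volume++;
        $size = 0;
        $zip->open("/tmp/exports/user42_part{$volume}.zip", ZipArchive::CREATE);
    }
    $zip->addFile($file, basename($file));
    $size += filesize($file);     // uncompressed size; JPEGs barely shrink anyway
}
$zip->close();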
Good luck!
I have a Flash .swf file that I embed on my webpage. On my server I have the .swf file and multiple image folders. I would like to load every file in one of those folders into the Flash slideshow. How should I go about doing this? I tried using AIR, but it doesn't work on my system as an application, so I doubt it will work online. Eventually I plan on making a menu where you can select different folders to display, and since they are of different sizes, a foreach loop would be optimal. Keeping a .txt file with the number of images is also possible if there's a way to read that in, but I would prefer the more dynamic approach. I am working towards using PHP for the website, if that helps find a solution.
Thanks,
-Mike
Also, my slideshow currently works great online, but I have to hardcode the number of files.
I would suggest having a PHP script on your server that takes care of parsing those folders and returns the list of files to Flash (with valid public URLs).
Basically, at application startup, you would call the PHP script to retrieve the full list of files (XML is a good format to return, or AMF if you have a lot of folders/files).
After that, all you have to do is manipulate that data to load whatever folders/files the user wants to see.
Just for your information, Flash doesn't have access to the filesystem, so it's impossible to parse folders directly from Flash. (It is possible in an AIR application, however.)
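A minimal version of that PHP script might look like this (folder names are examples; the whitelist is there to block path traversal, and note GLOB_BRACE is not available on every platform):

<?php
// list_images.php?folder=gallery1
// Returns the folder's images as XML, assuming the image folders
// sit next to this script in the web root.
$allowed = array('gallery1', 'gallery2', 'vacation');
$folder  = isset($_GET['folder']) ? $_GET['folder'] : 'gallery1';

if (!in_array($folder, $allowed, true)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: text/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>';
echo '<images>';
foreach (glob($folder . '/*.{jpg,png,gif}', GLOB_BRACE) as $file) {
    echo '<image url="' . htmlspecialchars($file) . '"/>';
}
echo '</images>';

On the Flash side, a URLLoader can fetch this URL and you can walk the <image> nodes to build the slideshow.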
I have a general question about this.
When you have a gallery, sometimes people need to upload thousands of images at once. Most likely, it would be done through a .zip file. What is the best way to go about uploading this sort of thing to a server? Servers often have timeouts etc. that need to be accounted for. I am wondering what kinds of things I should be looking out for, and what the best way is to handle a large number of images being uploaded.
I'm guessing that you would allow a user to upload a zip file (assuming the timeout does not affect you), and this zip file is uploaded to a specific directory; let's assume in this case that a directory is created for each user in the system. You would then unzip it on the server and scan the user's folder for any directories containing .jpg, .png, or .gif files (etc.), and then import them into a table accordingly, labeled by folder name I'm guessing. Roughly like the sketch below.
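To make that concrete (PHP shown for consistency with the rest of the thread; in Rails, the rubyzip gem plus Dir.glob covers the same ground, and the paths and user directory here are placeholders):

<?php
// Unzip the upload into the user's directory, then scan for images.
$userDir = '/var/uploads/user_42';
$zip     = new ZipArchive();

if ($zip->open($_FILES['archive']['tmp_name']) === true) {
    $zip->extractTo($userDir);
    $zip->close();
}

// Walk the extracted tree and queue image files for import,
// labeled by their containing folder.
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($userDir));
foreach ($it as $file) {
    if (preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
        $label = basename(dirname($file->getPathname()));
        // INSERT INTO images (user_id, label, path) VALUES (...)
    }
}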
What kind of server side troubles could I run into?
I'm aware that there may be many issues. Even general ideas would be good so I can research further. Thanks!
Also, I would be programming in Ruby on Rails, but I think this question applies across any language.
There's no reason why you couldn't handle this kind of thing with a web application. There are a couple of excellent components that would be useful for this:
Uploadify (based on jQuery/Flash)
plupload (from Moxiecode, the TinyMCE people)
The reason they're useful is that Uploadify uses a Flash component to handle uploads, so you can select groups of files from the file browser window (assuming no one is going to individually select thousands of images!), and plupload supports drag and drop too, along with more platforms.
Once you've got your interface working, the server side stuff just needs to be able to handle individual uploads, associating them with some kind of user account, and from there it should be pretty straightforward.
With regards to server-side issues, that's really a big question, depending on how many people will be using the application at the same time, the size of the images, and any processing that takes place afterwards. Remember, the files are kept in a temporary location while the script is processing them, and are either deleted upon completion or copied to a final storage location by your script, so disk space, memory overheads, and timeouts could all be issues.
If the images are massive in size, say RAW or TIFF files, then this kind of thing could still work with chunked uploads, but implementing some kind of FTP upload might be easier. It's a bit of a vague question, but there should be plenty here to get you going ;)
For that many images it has to be a serious app, which gives you the liberty to suggest a piece of software running on the client (something like Yahoo Mail or Picasa does) that will take care of 'managing' the upload of images (network interruptions, resume support, etc.).
For the server side, you could process these one at a time (assuming your client is sending them that way), thus keeping it simple.
Take a peek at http://gallery.menalto.com. They have a dozen methods for uploading pictures into galleries; you can choose the one that suits you.
Either have a client app, or some Ajax code that sends the images one by one, preventing timeouts. Alternatively, if this is not available to the public, FTP still works...
I'd suggest a client application (maybe written in AIR or Titanium) or telling your users what FTP is.
deviantArt.com, for example, offers FTP as an upload method for paying subscribers, and it works really well.
Flickr instead has its own app for this, the "Flickr Uploadr".