I'm stuck on what the best solution is for handling large file uploads and sending them to a third-party API. Any pointers would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore - (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I use a regular HTML form upload, the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time-consuming and feels very inefficient, but I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
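For reference, here is a simplified sketch of what that relay step looks like on my server. It at least streams the temp file from disk rather than loading 5GB into memory. The library ID, video GUID, and API key are placeholders, and I'm going off the endpoint shape in the linked docs:

    <?php
    // Stream a locally-stored upload to the Bunny Stream upload endpoint
    // without reading the whole file into memory. Assumes the video object
    // was already created via the API, per the Bunny docs.
    $libraryId = 12345;                       // placeholder
    $videoId   = 'the-video-guid';            // placeholder
    $apiKey    = getenv('BUNNY_STREAM_API_KEY');
    $path      = $_FILES['video']['tmp_name'];

    $fh = fopen($path, 'rb');
    $ch = curl_init("https://video.bunnycdn.com/library/$libraryId/videos/$videoId");
    curl_setopt_array($ch, [
        CURLOPT_PUT            => true,       // raw binary PUT, not multipart
        CURLOPT_INFILE         => $fh,        // cURL reads from disk in chunks
        CURLOPT_INFILESIZE     => filesize($path),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            "AccessKey: $apiKey",
            'Content-Type: application/octet-stream',
        ],
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    fclose($fh);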
With option 2, I can already upload large files directly to S3 with pre-signed URLs, and that process runs fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files, but when I looked at Lambda, its file size limits looked far too small for 5GB videos. Because my site is hosted on Lightsail, I noticed they have a container option, but I don't know if that can be used for this purpose and, if so, how.
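If option 2 is the way to go, what I picture is a small worker script on the Lightsail box doing something like this untested sketch (assuming the AWS SDK for PHP; the bucket, key, and region are placeholders, as are the Bunny values again):

    <?php
    // Relay an S3 object to the Bunny endpoint as a stream, so the server
    // never holds the whole 5GB file in memory or on its own disk.
    require 'vendor/autoload.php';

    $libraryId = 12345;                       // placeholders, as above
    $videoId   = 'the-video-guid';
    $apiKey    = getenv('BUNNY_STREAM_API_KEY');

    $s3 = new Aws\S3\S3Client(['region' => 'eu-west-1', 'version' => 'latest']);
    $s3->registerStreamWrapper();             // lets fopen() read s3:// paths

    $size = $s3->headObject([
        'Bucket' => 'my-upload-bucket',       // placeholder
        'Key'    => 'uploads/video.mp4',      // placeholder
    ])['ContentLength'];

    $fh = fopen('s3://my-upload-bucket/uploads/video.mp4', 'rb');
    $ch = curl_init("https://video.bunnycdn.com/library/$libraryId/videos/$videoId");
    curl_setopt_array($ch, [
        CURLOPT_PUT            => true,
        CURLOPT_INFILE         => $fh,        // reads straight from S3
        CURLOPT_INFILESIZE     => $size,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ["AccessKey: $apiKey"],
    ]);
    curl_exec($ch);
    curl_close($ch);
    fclose($fh);

But I don't know whether something like that belongs in a cron job, a queue worker, or one of those Lightsail containers.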
Basically, I'm not sure which solution is best, or how to proceed with it. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
Related
Good afternoon!
I am trying to move image hosting from my web host to Amazon S3. The website currently uploads images successfully using an uploader without problems, so my questions are about migrating from host to S3. Trust me, I have read the Amazon support blogs and searched the net, but I can find no single, concise, easy-to-understand piece of technical advice, just a lot of Amazon jargon.
For info, at the moment the uploader does the two things you would expect it to do well:
1) it uploads the files to a target folder at the UPLOAD path specified via define() (no problem)
2) it generates the image POSTs ready for processing (no problem).
It is a nice uploader with lots of goodies for image management, so I don't want to replace it. I simply want to build a bridge from the outputs it creates (as above) into Amazon S3.
So my specific questions are: can anyone tell me, in simple terms, how to:
- enable users to upload the images posted by my uploader to Amazon S3
- allow users to get images back via their own dynamic page views (see the sketch at the end of this post)
Many thanks in anticipation
PS: for info, this is the uploader, but I'm not sure that is needed bearing in mind it works as expected, produces the outputs above, and I cannot replace it at this stage of the development process:
https://github.com/blueimp/jQuery-File-Upload
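The kind of bridge I have in mind would be something like this untested sketch (assuming the AWS SDK for PHP; the bucket name and paths are placeholders):

    <?php
    // After the uploader has written the file to the UPLOAD path,
    // hand it on to S3.
    require 'vendor/autoload.php';

    $localPath = 'uploads/photo.jpg';         // wherever the uploader saved it

    $s3 = new Aws\S3\S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $s3->putObject([
        'Bucket'     => 'my-image-bucket',    // placeholder
        'Key'        => 'images/' . basename($localPath),
        'SourceFile' => $localPath,           // the file the uploader just saved
    ]);

    // For question 2, pages could then link to the stored object, e.g.:
    $url = $s3->getObjectUrl('my-image-bucket', 'images/' . basename($localPath));

...but I have no idea if that is the right approach.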
I'm relatively new to web development, but I am trying to implement an image upload feature: the uploaded image will be previewed to the person (an administrator) uploading it, then stored in a database (and displayed to the end user on a different page).
I found a resource that uses the Imageshack API, and was a bit confused about what this is and how the person implemented the API to achieve the image upload. The code for this is here: http://www.sceditor.com/posts/how-to-upload-and-insert-an-image/
When I googled "Imageshack API," I kept running across something that said I need to request a key. What does this mean, and do I have to do it? Is this the easiest way to go about creating an image upload feature for my purposes?
Thank you all very much!
The Imageshack API is for uploading image files to an account hosted at Imageshack.com. It seems that you want to upload image files to your own website and store them on your own web servers (either in a cloud service such as AWS or on co-located/managed servers at a data centre). So you probably do not want to use Imageshack.
As to how to upload image files using HTML & PHP, you may want to check out a short tutorial at:
www.w3schools.com/php/php_file_upload.asp
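The core of that tutorial boils down to something like this minimal sketch (the form field name and target directory here are just examples):

    <?php
    // Minimal handler for a form with <input type="file" name="image">.
    // PHP stores the upload in a temp file; move it somewhere permanent.
    $target = 'uploads/' . basename($_FILES['image']['name']);
    if (move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
        echo 'Stored at ' . $target;
    } else {
        echo 'Upload failed.';
    }

(In production you would also want to validate the file type and size before moving it.)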
Also, by the way, storing image files in a database such as MySQL may not be a good idea: image files should be stored as files. It is faster to serve such files from a web server than to fetch image contents stored in a database.
I am creating a video website, and I have CodeIgniter libraries that successfully upload the files to Amazon S3 and encode them to MP4s with encoding.com.
My problem is that if I upload the file to S3 first and then send it to encoding.com, the encoding works perfectly, but the result doesn't get sent back to Amazon S3. Would it be proper to just somehow download the file from the encoding.com URL onto my server and then re-upload it to S3? The URL is like:
http://encoding.com.result.s3.amazonaws.com/684981684684-681684384384_53169775.mp4
I don't see anything in the encoding.com API about re-uploading the finished file back to S3 or onto the host server. Is it standard practice to just use the encoding.com-generated URL to serve the files to the client side? Like using encoding.com as a CDN?
I'm just confused on the best order to do what I'm trying to accomplish. Anyone have any ideas?
Probably not exactly the answer you're looking for, but this is super straightforward with Zencoder. S3-to-S3 is fully supported and is one of the nice, easy features of the platform.
An S3 URL looks something like:
"url": "s3://mybucket/path/file.mp4"
Works for both inputs and outputs.
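Submitting an S3-in, S3-out job from PHP would look roughly like this untested sketch (assuming the v2 jobs endpoint; the API key and bucket paths are placeholders):

    <?php
    // Ask Zencoder to read the input from S3 and write the MP4 back to S3.
    $job = json_encode([
        'input'   => 's3://mybucket/raw/file.mov',
        'outputs' => [['url' => 's3://mybucket/encoded/file.mp4']],
    ]);

    $ch = curl_init('https://app.zencoder.com/api/v2/jobs');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $job,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            'Zencoder-Api-Key: YOUR_API_KEY', // placeholder
            'Content-Type: application/json',
        ],
    ]);
    $response = curl_exec($ch);
    curl_close($ch);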
Zencoder is straightforward to implement, faster, and has better features, so save yourself some time and energy.
I'm currently working on a Django application using AngularJS for the frontend part.
For the moment, I want to upload some images to the server and get back as a callback a list of the paths of the uploaded files. Afterwards I want to send the callback to the API (I am using the Tastypie framework) as a POST request and insert it into the database in a specific field. My issue is that I don't know how to approach the image upload part (should I use PHP?) and, most importantly, how to receive the callback. I hope I explained that clearly enough. :D
Documentation for uploading files in Django: https://docs.djangoproject.com/en/dev/topics/http/file-uploads/
HTML only allows uploading one file per upload control. To upload multiple files, either the user can create a zip file (or similar), upload that, and on the server side you extract it into multiple files; or you have to use a Flash component. There are lots of these available, for example http://developer.yahoo.com/yui/uploader/
I'm starting a new project that involves users paying to see educational videos. These videos (FLV) are hosted with Amazon S3 while the site itself is hosted on a regular web host.
I've tried to read up on securing the S3 files, but I can't find a good solution. I don't want my users to be able to download the videos directly.
I read something about setting up a HTTP streaming server, but I'm not quite sure how a service like this works, and how to set it up.
Does anybody have any experience with how to solve this?
You might like to look at s2Member for WordPress - it has protected Amazon S3 files built in, with time-limited (expiring) links, plus protected pages etc. that you can set up pretty easily.
I don't want my users to download the videos directly.
Get used to it.
Even with an RTMP streaming server, it's pretty easy to save the streams. You can add a load of obfuscation at the server and decoder to try to defeat the automated tools, but in the end what you have here is the unsolvable Copy Protection Problem.
There is no way to hide the network path from the end user. However, you can use expiring links that depend on time, user, and content section; with S3 this is done with pre-signed URLs generated by a PHP script rather than with .htaccess files.
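An untested sketch of such a time-limited link, assuming the AWS SDK for PHP (the bucket and key names are placeholders):

    <?php
    // Generate a time-limited link to a private S3 object. Only hand this
    // out to logged-in, paying users; it stops working after 10 minutes.
    require 'vendor/autoload.php';

    $s3  = new Aws\S3\S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $cmd = $s3->getCommand('GetObject', [
        'Bucket' => 'my-video-bucket',        // placeholder
        'Key'    => 'videos/lesson1.flv',     // placeholder
    ]);
    $url = (string) $s3->createPresignedRequest($cmd, '+10 minutes')->getUri();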