Encoding.com and Amazon S3 workflow? - php

I am creating a video website, and I have CodeIgniter libraries that successfully upload files to Amazon S3 and encode them to MP4 with encoding.com.
My problem is that if I upload the file to S3 first and then send it to encoding.com, the encoding works perfectly, but the result never gets sent back to Amazon S3. Would it be proper to just download the finished file from the encoding.com URL onto my server and then re-upload it to S3? The URL is like:
http://encoding.com.result.s3.amazonaws.com/684981684684-681684384384_53169775.mp4
I don't see anything in the encoding.com API about sending the finished file back to S3 or to the host server. Is it standard practice to just use the encoding.com-generated URL to serve the files client-side, i.e. to use encoding.com as a CDN?
I'm just confused about the best order of operations for what I'm trying to accomplish. Anyone have any ideas?

Probably not exactly the answer you're looking for, but this is very straightforward with Zencoder. S3-to-S3 is fully supported and is one of the nice, easy features of the platform.
An S3 URL looks something like:
"url": "s3://mybucket/path/file.mp4"
Works for both inputs and outputs.
Zencoder is straightforward to implement, faster, and has better features, so save yourself some time and energy.
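For illustration, here is a minimal sketch of an S3-to-S3 job submitted to the Zencoder jobs endpoint with cURL; the bucket paths and API key are hypothetical placeholders, and in a real setup you would check the response for errors:

<?php
// Hypothetical S3-to-S3 Zencoder job: Zencoder reads the input from S3
// and writes the encoded MP4 back to S3, with no intermediate download.
$job = array(
    'input'   => 's3://mybucket/uploads/source.mov',
    'outputs' => array(
        array('url' => 's3://mybucket/encoded/output.mp4', 'format' => 'mp4'),
    ),
);

$ch = curl_init('https://app.zencoder.com/api/v2/jobs');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($job),
    CURLOPT_HTTPHEADER     => array(
        'Content-Type: application/json',
        'Zencoder-Api-Key: YOUR_API_KEY', // placeholder
    ),
    CURLOPT_RETURNTRANSFER => true,
));
$response = curl_exec($ch);
curl_close($ch);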

Related

Handling large file uploads and sending to API

I'm stuck on the best way to handle large file uploads and send them to a third-party API. Any pointers on a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore - (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using a regular HTML form upload, the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time-consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
With option 2, I can already upload large files directly to S3 with pre-signed URLs, and this process seems to run fine. But I don't know how I would then send the file from S3 to the API. I could use an S3 trigger on new files, but when I looked at Lambda, it has a tiny file size limit. Because my site is hosted on Lightsail, I noticed they have a container option, but I don't know if that can be used for this purpose and, if so, how.
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
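For what it's worth, one way to sketch option 2 without buffering the whole file on the web server is to stream the S3 object straight into a PUT against the Bunny Stream upload endpoint. This assumes the AWS SDK for PHP is installed via Composer and that the video object was already created through the API; the bucket, key, library ID, video ID, and access key below are hypothetical placeholders:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Credentials are assumed to come from the environment or an instance role.
$s3 = new S3Client(array('region' => 'eu-west-1', 'version' => 'latest'));

// Open the finished S3 object as a read stream instead of downloading it
// to disk first; the SDK registers the s3:// stream wrapper for this.
$s3->registerStreamWrapper();
$source = fopen('s3://my-bucket/uploads/video.mp4', 'r');
$size   = $s3->headObject(array(
    'Bucket' => 'my-bucket',
    'Key'    => 'uploads/video.mp4',
))['ContentLength'];

// PUT the raw bytes to the Bunny Stream upload endpoint; cURL reads the
// request body from the stream, so memory use stays small even at 5GB.
$ch = curl_init('https://video.bunnycdn.com/library/LIBRARY_ID/videos/VIDEO_ID');
curl_setopt_array($ch, array(
    CURLOPT_UPLOAD         => true, // streamed PUT request body
    CURLOPT_INFILE         => $source,
    CURLOPT_INFILESIZE     => $size,
    CURLOPT_HTTPHEADER     => array('AccessKey: YOUR_API_KEY'), // placeholder
    CURLOPT_RETURNTRANSFER => true,
));
$response = curl_exec($ch);
curl_close($ch);
fclose($source);

Since this runs as an ordinary PHP script, it could be kicked off from the existing Lightsail box (e.g. by a queue worker or cron job) rather than from Lambda.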

Upload to S3 via Cloudfront

So, as we all know, CloudFront now supports uploading to S3 via its edge locations.
However, I'm not really sure how to do this. I know it won't be fully featured (i.e. it won't support authorization headers or multipart uploads), but I'm keen to do straight uploads.
I'm working in PHP, though this doesn't appear to be supported as a method in the SDK yet, and even doing something simpler seems to require clearing various authorisation hurdles.
Has anyone found the best way to do this yet, or does anyone have suggestions? I've been trying cURL POSTs and other such things to no avail.
You should probably read more about the browser-based POST uploads feature of Amazon S3 in the AWS docs. Doing a POST upload to Amazon S3 requires you to send a special JSON policy document along with your POST request and upload. The S3 PostObject class in the AWS SDK for PHP is helpful for generating this policy based on your provided options, as well as generating the other form element values you need to include with your form/request.
Though I haven't tried it yet, you probably just need to swap out the S3 bucket endpoint for your CloudFront distribution endpoint to do the upload via the CloudFront edge location. Also, make sure your distribution is configured to accept POST requests. To get a more official answer to your question, I'd ask the Amazon CloudFront team on the CloudFront forum.
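As a rough illustration, here is how the policy and form fields might be generated with the PostObjectV4 helper from v3 of the AWS SDK for PHP (the successor to the PostObject class mentioned above); the bucket name and key prefix are hypothetical:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client(array('region' => 'us-east-1', 'version' => 'latest'));

// Fields the browser form will submit, plus the policy conditions that
// constrain what the signed policy allows.
$formInputs = array('acl' => 'private', 'key' => 'uploads/${filename}');
$options = array(
    array('acl' => 'private'),
    array('bucket' => 'my-bucket'),
    array('starts-with', '$key', 'uploads/'),
);

// The signed policy expires one hour from now.
$postObject = new PostObjectV4($client, 'my-bucket', $formInputs, $options, '+1 hours');

$attributes = $postObject->getFormAttributes(); // form action, method, enctype
$inputs     = $postObject->getFormInputs();     // policy, signature, key, etc.

In principle the form's action could then be pointed at the CloudFront distribution's domain rather than the bucket endpoint, per the suggestion above, though I haven't verified that against CloudFront myself.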
You should be able to do this now, I would give it another try. Be sure to use the S3 bucket's DNS name when signing your request, not your CloudFront distribution's.

How to store an image directly in an AWS S3 bucket, uploaded using FCKeditor?

Recently I moved my existing site to Amazon AWS, and I'm having trouble storing images (or any other file type) uploaded through FCKeditor directly in an AWS S3 bucket. I have searched Google for my problem but haven't found any proper solution.
Does anyone have any advice or possible solutions?
Thanks for your time.
Doing this directly would require the client (here: the browser) to have your secret credentials to access S3 and store anything. Unless the system is only going to be used by you alone, handing your secret credentials to random website visitors sounds like a bad idea. You're better off uploading to your server first, which can then store the file in S3. This also allows you to do some validation on the uploaded data first.
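A minimal sketch of that server-side approach, assuming the AWS SDK for PHP and a hypothetical bucket and form field name:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(array('region' => 'us-east-1', 'version' => 'latest'));

// 'upload' is a hypothetical field name; adjust to what the editor posts.
if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
    // Validate on the server before anything reaches S3.
    $allowed = array('image/jpeg', 'image/png', 'image/gif');
    $mime    = mime_content_type($_FILES['upload']['tmp_name']);
    if (in_array($mime, $allowed, true)) {
        $s3->putObject(array(
            'Bucket'      => 'my-bucket',
            'Key'         => 'editor-uploads/' . basename($_FILES['upload']['name']),
            'SourceFile'  => $_FILES['upload']['tmp_name'],
            'ContentType' => $mime,
            'ACL'         => 'public-read', // so the editor can embed the URL
        ));
    }
}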
I think it's a unique requirement.
You could just hack FCKeditor's upload section and implement S3 uploads there.
This question will help you get a basic idea of S3 uploads.
Hope this helps.

Zend_Cloud_Storage how to send files to browser?

I'd like to store my assets in the cloud, and I thought Zend_Cloud_Storage might be the right library. For now they are saved locally using Zend_Cloud_StorageService_Adapter_Filesystem, and uploading works so far.
What is the right way to send files from Zend_Cloud_StorageService_Adapter_Filesystem to the browser? I guess this is only possible by streaming them through PHP? If so, are there any code examples out there? I'm having problems getting it working with all the file headers.
Using S3, I would probably just redirect to the files, or link directly to them in the S3 bucket?
I'm pretty new to cloud storage so I really appreciate your help.
Kind regards, Manuel
The Zend Cloud services are fairly new, and from my experience the documentation still needs a bit of work. Depending on the cloud service, you can either fetch your objects and deliver them to the user yourself, or in some cases give the user a direct URL to the asset (this largely depends on whether you need to control access to the object).
To fetch an item programmatically, you should be able to get it to work with the right headers - note that fetchItem() returns the item's contents, so you can echo them directly. For example:
$image = $storage->fetchItem("/my/remote/path/picture.jpg");
header("Content-Type: image/jpeg");
echo $image;
In theory you can get the content type for the header from the metadata, but I'm not sure whether it is consistent between adapters.
Linking directly depends on your cloud. For the filesystem adapter you should know the path where your files live, so you can link from there if they are publicly accessible. On Amazon, you can set the permissions to public when you upload the file - there is an example in the S3 doco here.
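For instance, a minimal sketch of that public upload using Zend_Service_Amazon_S3 directly (the bucket and paths are hypothetical):

// Upload with a public-read ACL so the object can be linked directly.
$s3 = new Zend_Service_Amazon_S3($accessKey, $secretKey);
$s3->putObject(
    'my-bucket/images/picture.jpg',
    file_get_contents('/local/path/picture.jpg'),
    array(Zend_Service_Amazon_S3::S3_ACL_HEADER =>
          Zend_Service_Amazon_S3::S3_ACL_PUBLIC_READ)
);
// The file is then reachable at:
// http://my-bucket.s3.amazonaws.com/images/picture.jpg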
I'd recommend having a look at the Zend Service S3 doco; Zend Cloud is just a wrapper for it, and in trying to be generic it loses a lot of functionality. If you work with it for any length of time, I suspect you'll eventually end up making calls straight to the service.
Good luck!

Loading secret FLV files from Amazon S3

I'm starting a new project that involves users paying to see educational videos. These videos (FLV) are hosted on Amazon S3, while the site itself is hosted with a regular web host.
I've tried to read up on securing S3 files and can't find any good solution for this. I don't want my users to be able to download the videos directly.
I read something about setting up an HTTP streaming server, but I'm not quite sure how a service like this works or how to set it up.
Anybody with any experiences on how to solve this?
You might like to look at s2Member for WordPress - it has Amazon S3 protected files built in, with time-limited links, plus protected pages etc. that you can set up pretty easily.
I don't want my users to download the videos directly.
Get used to it.
Even with an RTMP streaming server, it's pretty easy to save the streams. You can add a load of obfuscation at the server and decoder to try to defeat the automated tools, but in the end what you have here is the unsolvable Copy Protection Problem.
There is no way to hide the network path from the end user. However, you can use expiring, signed URLs that depend on time, user, and content section - Amazon S3 supports query-string authentication with an expiry time, and you can generate these URLs from a PHP script.
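As a rough sketch with the AWS SDK for PHP (the bucket, key, and expiry are hypothetical; note this only limits link sharing, it does not stop a determined user from saving the stream):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(array('region' => 'us-east-1', 'version' => 'latest'));

$cmd = $s3->getCommand('GetObject', array(
    'Bucket' => 'my-video-bucket',
    'Key'    => 'lessons/lesson-01.flv',
));

// The signed URL stops working 15 minutes after it is issued.
$request   = $s3->createPresignedRequest($cmd, '+15 minutes');
$signedUrl = (string) $request->getUri();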
