I want to transcode video to 360p, 480p, and 720p and then upload it to Amazon S3.
Currently we are using the PHP library FFMpeg.
I have successfully transcoded video on my server, but I don't understand how to achieve the same on Amazon S3.
Do I need to first upload the original video to S3, then fetch it, transcode it into the different formats, and send the results back to Amazon S3? Is that possible?
Or, if there is any other way, please suggest it.
S3 is not a block file system; it is an object store. The difference is that, normally, you can't mount an S3 bucket like a standard Unix FS and work on files with fopen(), fwrite(), etc. Some tricks exist to treat S3 like any other FS, but I would suggest another option:
You have to transcode the video on a locally mounted FS (like an AWS EFS, or a local file system), then "push" (upload) the whole transcoded video to the S3 bucket, as in the sketch below. Of course, you can improve this process in many ways (remove temp files, do work in parallel, use the Lambda service, or run tasks in containers...). You should avoid doing many uploads/downloads from or to S3, because it is time- and cost-consuming. Use local storage as much as possible, then push the resulting data to S3 when it is ready.
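For example, a minimal sketch of that flow, assuming the php-ffmpeg wrapper and the AWS SDK for PHP (paths, bucket name, and resolution are illustrative):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Transcode on local storage first (repeat per target resolution).
$ffmpeg = FFMpeg\FFMpeg::create();
$video  = $ffmpeg->open('/tmp/input.mp4');
$video->filters()->resize(new FFMpeg\Coordinate\Dimension(854, 480));
$video->save(new FFMpeg\Format\Video\X264(), '/tmp/output_480p.mp4');

// Then push the finished file to S3 in a single upload.
$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
$s3->putObject([
    'Bucket'     => 'my-video-bucket',   // hypothetical bucket
    'Key'        => 'videos/output_480p.mp4',
    'SourceFile' => '/tmp/output_480p.mp4',
]);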
Also, AWS has a service for video transcoding: https://aws.amazon.com/en/elastictranscoder/
Related
We have a zip file containing a large JSON file that we need to unzip. We currently fetch the file in a Lambda using Laravel, copy and unzip it locally (in the Lambda), and then upload the JSON file back to S3 for processing.
However, it seems the Lambda can't handle large files (180 MB) when moving them back to S3 after unzipping. We could keep increasing the Lambda's resources (deployed via Laravel Vapor), but we're looking for an in-memory option that could perhaps provide a streaming way of unzipping.
Is there such a solution for PHP?
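One option that avoids holding the JSON in memory is PHP's zip:// stream wrapper combined with the SDK's ObjectUploader, which falls back to multipart uploads for large bodies. A minimal sketch, assuming the zip has already been fetched to the Lambda's /tmp and contains an entry named data.json (all names are illustrative):

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\ObjectUploader;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// zip:// reads a single entry as a stream, without extracting the archive.
$stream = fopen('zip:///tmp/archive.zip#data.json', 'r');

// ObjectUploader streams the body to S3, switching to multipart uploads
// when the body is large, so the JSON never sits fully in memory.
$uploader = new ObjectUploader($s3, 'my-bucket', 'processed/data.json', $stream);
$uploader->upload();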
I want users to upload large files (like HD videos and big PDFs) and am using this line of code to upload them:
Storage::disk('s3')->putFile('uploads', new File($request->file('file_upload')));
The problem is that even though I have a good internet connection, the upload takes a very long time. What can I do to make file uploads faster?
There are actually two network calls involved in the process.
From the client side, the file gets uploaded to your server.
Via a server-to-server call, the file then gets uploaded from your server to S3.
The only way to reduce the delay is to upload the files directly from the client to S3 using the client-side SDKs, securely (see the presigned-URL sketch after the links below). With this, the files are stored directly in the S3 bucket.
Once the files are uploaded to S3 via the AWS client-side SDKs, you can post the file's attributes along with its download URL to Laravel and save them to the DB.
A plus point of this approach is that it lets you show actual upload progress on the client side.
This can be done via the AWS Amplify library, which provides great integration with S3: https://docs.amplify.aws/start
The other options:
JS: https://softwareontheroad.com/aws-s3-secure-direct-upload/
Android: https://grokonez.com/android/uploaddownload-files-images-amazon-s3-android
iOS: https://aws.amazon.com/blogs/mobile/amazon-s3-transfer-utility-for-ios/
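For a plain PHP/Laravel backend without Amplify, the usual pattern is for the server to hand the client a short-lived presigned PUT URL. A minimal sketch, assuming the AWS SDK for PHP (bucket, key, and expiry are illustrative):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// Describe the PutObject request the client is allowed to perform.
$cmd = $s3->getCommand('PutObject', [
    'Bucket' => 'my-upload-bucket',      // hypothetical bucket
    'Key'    => 'uploads/' . uniqid() . '.mp4',
]);

// Sign it; the browser then PUTs the file straight to this URL,
// so the upload never passes through your server.
$url = (string) $s3->createPresignedRequest($cmd, '+20 minutes')->getUri();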
Please use this:
$file = $request->file('file_upload');
$filePath = 'uploads/' . $file->getClientOriginalName();
$disk = Storage::disk('s3');
// Passing a stream lets the upload run in chunks instead of
// loading the whole file into memory.
$disk->put($filePath, fopen($file->getRealPath(), 'r'));
Instead of:
Storage::disk('s3')->put($filePath, file_get_contents($file->getRealPath()));
And also increase post_max_size, upload_max_filesize, max_execution_time, and memory_limit in your PHP configuration.
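For example, in php.ini (the values are illustrative; tune them to your largest expected upload, and keep post_max_size at least as large as upload_max_filesize):

; php.ini
upload_max_filesize = 512M
post_max_size = 512M
memory_limit = 256M
max_execution_time = 300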
I have been working on a video website [platform: PHP] where I need to upload videos to an Amazon S3 server and extract thumbnails.
I have created a bucket and uploaded a video file to it successfully. But I don't know how to extract the thumbnail from that uploaded video, so that's where I'm stuck.
Any help will be appreciated. Thanks in advance!
Here you've got some options.
First - you can "extract" a thumbnail from the video before you upload it to AWS. Something like: upload the video to your server, convert it to an appropriate format if needed, take a thumbnail (or thumbnails), save them somewhere (e.g. on S3 or your local server), and then upload the video to S3. The disadvantage of this method is that your local server has to do a lot of extra work instead of serving your web visitors.
Second - you can use the Amazon EC2 computing service for that: upload the video to S3, trigger EC2 (e.g. with cron jobs) to take the video from S3, convert it, take thumbnails, and upload the final result (converted video + thumbnails) back to S3. The disadvantages are that this "communication" is not easy to implement (you'll have to solve a lot of problems, like ensuring stable conversions and creating job queues), and you'll have to use one more AWS service alongside S3.
What about video conversion and thumbnail extraction? There are many tools and programs for that. I like using ffmpeg for video conversion (there is also a PHP wrapper for using its functionality from PHP - php-ffmpeg - but using ffmpeg itself, e.g. via PHP's exec() function, gives you more flexibility and features; please read the documentation for details). FFmpeg can extract thumbnails from videos as well, but that takes some time (there are lots of discussions about how to do it efficiently), so I'd suggest you use ffmpegthumbnailer for this purpose. It has simpler usage and is optimized specifically for extracting thumbnails from video files.
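As a hedged sketch, grabbing a frame with the php-ffmpeg wrapper mentioned above looks roughly like this (paths are illustrative):

require 'vendor/autoload.php';

$ffmpeg = FFMpeg\FFMpeg::create();
$video  = $ffmpeg->open('video.mp4');

// Seek to the 10-second mark and save that frame as the thumbnail.
$frame = $video->frame(FFMpeg\Coordinate\TimeCode::fromSeconds(10));
$frame->save('thumbnail.jpg');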
You should do this on the machine you are using for the upload. Here you have direct file-system access to the file. Once it is in S3 it is accessible via HTTP only.
The tool ffmpeg can be used to create a thumbnail of many video formats.
Example:
ffmpeg -i "video.flv" -ss 00:00:10 -vframes 1 -f image2 "thumbnail.jpg"
This creates a thumbnail from second 10 of the video and saves it as thumbnail.jpg (-vframes 1 stops ffmpeg after a single frame).
When using PHP, you can use system() to execute the command.
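A minimal sketch reusing the command above (escapeshellarg() matters if the file names ever come from user input):

$video = escapeshellarg('video.flv');
$thumb = escapeshellarg('thumbnail.jpg');

// Run the same command as above; $status is 0 when ffmpeg succeeds.
system("ffmpeg -i $video -ss 00:00:10 -vframes 1 -f image2 $thumb", $status);
if ($status !== 0) {
    // ffmpeg failed; inspect the exit code and log output
}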
It seems like the images read from Amazon S3 load really slowly. I had the images on the same server as the website and they loaded super fast. Are they loading slowly because they now have to be fetched from S3?
Is there anything I can really do about it?
Using this to read the image files:
$secure_link = gs_prepareS3URL("myAmazon" . "/thumb/thumb_" . $id, $bucket);
readfile($secure_link);
The function is from: http://www.richardpeacock.com/blog/2010/07/amazon-aws-s3-query-string-authentication-php
If you're embedding the images, you should serve them through Amazon CloudFront (Amazon's CDN service). CloudFront loads the image/file from S3 (or a custom origin), then caches it on their edge servers.
CloudFront Tutorial - http://www.hongkiat.com/blog/amazon-cloudfront-how-to-setup-cloudfront-to-work-with-s3/
S3 is just static storage; by default it is not optimized for delivery performance, though there are ways to make it better.
Before you configure CloudFront, you should try enabling Transfer Acceleration on the S3 bucket (a minimal SDK sketch follows the pricing note below).
Source: https://docs.aws.amazon.com/AmazonS3/latest/userguide/transfer-acceleration.html
The benefits are:
Your customers upload to a centralized bucket from all over the world.
You transfer gigabytes to terabytes of data on a regular basis across continents.
You can't use all of your available bandwidth over the internet when uploading to Amazon S3.
This comes at a price (https://aws.amazon.com/about-aws/whats-new/2016/04/transfer-files-into-amazon-s3-up-to-300-percent-faster/):
Pricing for Amazon S3 Transfer Acceleration is simple, with no upfront costs or long-term commitments. You simply pay a low, per-GB rate for data transferred through the service. The pricing is designed to be risk free: if Amazon S3 Transfer Acceleration isn't likely to make a difference in the speed of an upload (like when you upload data over the short distance from a client in Tokyo to an S3 bucket in Japan), you won't be charged anything extra for that upload. For more information on pricing, see Amazon S3 pricing.
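Once Transfer Acceleration is enabled on the bucket, the client has to opt in to the accelerate endpoint. A minimal sketch with the AWS SDK for PHP (bucket and paths are illustrative):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'                  => 'us-east-1',
    'version'                 => 'latest',
    'use_accelerate_endpoint' => true,   // route via the accelerate endpoint
]);

$s3->putObject([
    'Bucket'     => 'my-bucket',         // hypothetical bucket
    'Key'        => 'uploads/big-file.bin',
    'SourceFile' => '/path/to/big-file.bin',
]);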
I have a CodeIgniter web app that uploads many tiny files to Amazon S3 every hour, which is causing my S3 request charges to shoot up fast. One way to overcome this would be to zip up the files, upload the zip file to S3, then unzip it once it is on S3.
Can this be done using EC2? Or is there a better way to achieve this? Thank you!!
EDIT: If I were to use EC2, would I use PHP to trigger the creation of an EC2 instance, upload the PHP file required to unzip the zipped files, copy the uncompressed files to S3, then destroy the EC2 instance?
If you have an EC2 machine in the same region, I would suggest you upload the zip there and then push the unzipped contents to S3 from it. S3 cannot unzip it on its own, as it is all static storage.
There are no data-transfer charges between EC2 and S3 in the same region, so EC2 can handle the unzipping and then write the files out to your S3 bucket without additional transfer costs.
You can write code in a Lambda function to unzip a file from an S3 bucket; AWS Lambda will do this for you.
References:
https://github.com/carloscarcamo/aws-lambda-unzip-py/blob/master/unzip.py
https://github.com/mehmetboraezer/aws-lambda-unzip
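The linked examples show the idea; the same flow in PHP might look roughly like this (assuming the AWS SDK for PHP and a PHP-on-Lambda runtime layer such as Bref; bucket and key names are illustrative):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// Download the archive to Lambda's writable /tmp space.
$s3->getObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'incoming/archive.zip',
    'SaveAs' => '/tmp/archive.zip',
]);

// Stream each entry back to S3 without extracting it to disk first.
$zip = new ZipArchive();
if ($zip->open('/tmp/archive.zip') === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        if (substr($name, -1) === '/') {
            continue; // skip directory entries
        }
        $s3->putObject([
            'Bucket' => 'my-bucket',
            'Key'    => 'extracted/' . $name,
            'Body'   => $zip->getStream($name),
        ]);
    }
    $zip->close();
}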
S3 is just storage: whatever file you upload is the file that is stored. You cannot upload a zip file and then extract it once it's in S3. If you wrote the application, the best advice I can give is to try to redesign how you store the files. S3 requests are pretty cheap... you must be making a lot of requests.
I have been using a service to unzip files full of thousands of tiny image files: each zip I upload is about 4 GB and costs around $1 to unzip using http://www.cloudzipinc.com/service/s3_unzip. Maybe that will help someone.
Having said that, you might find it easier to use Python with the Boto library. That will work far more efficiently than PHP.