I am using ImageMagick in CodeIgniter 2.x to save multiple copies of an image for a social site. After each crop, it sends the image to an AWS S3 bucket.
I am wondering if there is any way to make this process asynchronous in PHP, to keep the user from having to wait for four crops and saves before the response is returned.
You can introduce a queuing system in your CodeIgniter project. If you are on AWS, one option is to send a message to SQS indicating that an image has been uploaded, then process the images asynchronously without holding up the user: https://aws.amazon.com/articles/PHP/1602
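A minimal sketch of the producer side with the AWS SDK for PHP (the queue URL and message shape below are assumptions, not CodeIgniter specifics):

    <?php
    // Sketch: enqueue the crop work instead of doing it in the request.
    // The queue URL and message shape are assumptions.
    require 'vendor/autoload.php';

    use Aws\Sqs\SqsClient;

    $sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);

    // In the upload controller: after saving the original image, queue a job...
    $sqs->sendMessage([
        'QueueUrl'    => 'https://sqs.us-east-1.amazonaws.com/123456789012/image-crops',
        'MessageBody' => json_encode(['s3_key' => 'uploads/photo.jpg']),
    ]);
    // ...and return the response immediately. A worker (a CLI CodeIgniter
    // controller run by cron, for example) receives the messages and performs
    // the four crops and S3 uploads in the background.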
You can also take a serverless approach with AWS Lambda: http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser-create-test-function-create-function.html
Alternatively, Cloudinary (http://cloudinary.com/) offers a cloud solution with a very flexible image resizing/processing service that is free for most projects.
I'm stuck wondering what the best solution is for handling large file uploads and sending them to a third-party API. Any pointers toward a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail, and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
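For context, generating the pre-signed PUT URL server-side with the AWS SDK for PHP looks roughly like this (bucket, key, and expiry below are placeholders, not my exact setup):

    <?php
    // Rough shape of the pre-signed PUT URL generation (bucket, key
    // and expiry are placeholders, not my exact setup).
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

    $cmd = $s3->getCommand('PutObject', [
        'Bucket' => 'my-upload-bucket',
        'Key'    => 'uploads/video.mp4',
    ]);
    // The browser PUTs the file straight to this URL, bypassing my server.
    $request = $s3->createPresignedRequest($cmd, '+60 minutes');
    $url = (string) $request->getUri();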
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore: (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way for users to upload files directly to the API. If I use a regular HTML form upload, the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time-consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
With option 2, I can already upload large files directly to S3 with pre-signed URLs, and this process runs fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files, but when I looked at Lambda, it has a tiny file size limit. Since my site is hosted on Lightsail, I also noticed Lightsail has a container option, but I don't know if that can be used for this purpose and, if so, how.
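For option 2, I imagine the transfer could look something like this sketch, streaming the S3 object to the API so the 5GB file is never buffered in memory (the endpoint, header name, and IDs are my guesses from the bunny.net docs):

    <?php
    // Sketch of option 2: stream a finished S3 object straight to the
    // video API. Endpoint, header name and IDs are guesses from the docs.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use GuzzleHttp\Client;

    $libraryId = '12345';          // placeholder video library ID
    $videoId   = 'abc-def';        // placeholder video ID from the create call
    $apiKey    = 'MY_STREAM_KEY';  // placeholder API key

    $s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

    // Open the S3 object as a read stream instead of downloading it first.
    $object = $s3->getObject([
        'Bucket' => 'my-upload-bucket',
        'Key'    => 'uploads/video.mp4',
    ]);

    // PUT the stream to the API; Guzzle reads it in chunks.
    $http = new Client();
    $http->put(
        "https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}",
        [
            'headers' => ['AccessKey' => $apiKey],
            'body'    => $object['Body'],   // PSR-7 stream, not a string
        ]
    );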
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
What is the most efficient and least resource-intensive implementation for uploading images to S3 with a RESTful API setup?
Should the web application handle the generation of thumbnails, upload to Amazon S3, and make an API request on successful upload?
OR
Should the web application pass the image to the REST API, which handles the generation of thumbnails, uploads to Amazon S3, and then saves the data to the DB on success?
Ideally, you would want to write an AWS Lambda function to deal with the image uploading (see the AWS docs). This way you would only need to upload one image (saving on throughput), and Amazon would seamlessly handle the resizing separately from the API (image handling, such as resizing and uploading, should ideally be a separate service).
Out of the two choices that you posted in your question, I would definitely choose the second one, because:
You don't want your users to upload multiple images: UX, mobile data costs, upload time, and the possibility of failure all get dramatically worse if you let the web app handle this task.
You have much more freedom and more sophisticated tools, like Imagick, to work with your images (see the sketch after this list).
You can potentially handle the upload (to Amazon) and the resizing asynchronously, further improving the perceived speed.
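A minimal sketch of the second choice, assuming Imagick and the AWS SDK for PHP are installed (bucket and field names are placeholders):

    <?php
    // Sketch: the API receives one original image, generates a thumbnail
    // with Imagick and uploads it to S3. Bucket/field names are placeholders.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    function makeThumbnail(string $srcPath, int $width): string
    {
        $img = new Imagick($srcPath);
        $img->thumbnailImage($width, 0);  // height 0 preserves aspect ratio
        $dst = tempnam(sys_get_temp_dir(), 'thumb_') . '.jpg';
        $img->writeImage($dst);
        $img->clear();
        return $dst;
    }

    $s3    = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $thumb = makeThumbnail($_FILES['image']['tmp_name'], 200);

    $s3->putObject([
        'Bucket'     => 'my-images-bucket',
        'Key'        => 'thumbs/' . uniqid() . '.jpg',
        'SourceFile' => $thumb,
    ]);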
I am developing an application which allows people to upload photos to an Amazon S3 instance.
Though I have just edited my PHP config variables to allow a maximum of 10 MB per image, my web app returns a 500 server error without any response whenever I try to upload a photo below but close to that 10 MB maximum (like 9.4 MB).
The project is made up of a web app built using NodeJS and a Laravel PHP API. The file upload runs via this web app.
I would very much appreciate any ideas on what I could do to resolve this, or any experiences with AWS and photo uploads.
Thanks!
We have six application servers (under LVS); requests are sent randomly to all servers.
Around 8 years back we used to store images in databases.
Pros: can be accessed from all the application servers.
Cons: slow.
Around 5 years back, we shifted to storing images as files on one of the six application servers, with nginx rules that make sure all image read/write requests go to that single server.
Pros: fast.
Cons: all image read/write requests go to a single server.
Question: Is there a better way to store images that addresses both of the following?
1. Can be accessed from all application servers.
2. Fast access.
Note: we move images to a common image server after some time. We don't move them instantly because we don't want to rely on that server, and it would also increase user upload time.
You can leverage Content Delivery Networks (CDNs) and storage buckets, which are provided by services like AWS.
Upload all images to a single store, say an AWS S3 bucket (https://aws.amazon.com/s3/), so that all images live in one central place and are accessible from all application servers.
You can then link your S3 bucket to a CDN service like AWS's CloudFront, or a free service like Cloudflare.
https://aws.amazon.com/cloudfront/
Read more about how to use S3 with PHP here:
https://devcenter.heroku.com/articles/s3
http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrieveObjSingleOpPHP.html
Read more about linking an S3 bucket to CloudFront here:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/MigrateS3ToCloudFront.html
So AWS's S3 will give you globally accessible images, and the CloudFront CDN will give you excellent delivery speed.
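As a rough sketch of what the application code looks like once images live in S3 (bucket and paths are placeholders):

    <?php
    // Sketch: with images in S3, every application server reads and
    // writes them the same way. Bucket and paths are placeholders.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $s3->registerStreamWrapper();   // makes s3:// paths work with fopen() etc.

    // Any of the six servers can write an upload...
    file_put_contents('s3://my-images-bucket/avatars/123.jpg',
        file_get_contents($_FILES['avatar']['tmp_name']));

    // ...and any of them can read it back; no nginx routing rules needed.
    $bytes = file_get_contents('s3://my-images-bucket/avatars/123.jpg');

    // For delivery, point a CloudFront distribution at the bucket and
    // build public URLs against the distribution's domain.
    $url = 'https://dxxxxxxxxxxxx.cloudfront.net/avatars/123.jpg';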
I am planning on using a CDN for user image and video uploads on my site. The only problem is that I cannot upload directly to a CDN, as I need to process and manipulate the images and videos with the GD image library and FFMPEG before they are stored on my server.
What is the best way for me to get the capacity and delivery benefit of a CDN but still be able to process the files that get uploaded there?
I want the videos and images to be available for review after upload almost instantly. I don't want the user to have to experience a double upload time (sent to my server, then sent to CDN).
The only solution I can think of is storing a temporary version of the files on my server and then, behind the scenes, sending a copy to the CDN. Once the upload to the CDN is complete, all paths to the file would be switched to the CDN instead of my server. Does this make sense?
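Roughly, I picture something like this sketch (every name is a placeholder, and I've used S3 as an example CDN origin just for illustration):

    <?php
    // Sketch of the two-phase idea: serve locally first, push to the CDN
    // origin in the background, then flip the stored URL. All names are
    // placeholders; S3 stands in for the CDN origin here.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $fileId    = '42';
    $localPath = "/var/www/media/{$fileId}.mp4";
    $pdo       = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $setUrl    = $pdo->prepare('UPDATE media SET url = ? WHERE id = ?');

    // Phase 1: right after GD/FFMPEG processing, serve from my own server.
    $setUrl->execute(["/media/{$fileId}.mp4", $fileId]);

    // Phase 2, run by a cron job or queue worker: push to the CDN origin,
    // flip the stored path, then drop the local copy.
    $s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $s3->putObject([
        'Bucket'     => 'my-cdn-origin',
        'Key'        => "media/{$fileId}.mp4",
        'SourceFile' => $localPath,
    ]);
    $setUrl->execute(["https://cdn.example.com/media/{$fileId}.mp4", $fileId]);
    unlink($localPath);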
Use a CDN that provides an API.
A great CDN service in the UK and USA is Rackspace. They provide an API for a lot of programming languages such as PHP.
You may have to spend the processing power yourself, but from then on the bandwidth can be on the CDN's back. You can also outsource video encoding to a service like Panda Stream.