I am developing an application which allows people to upload photos to an Amazon S3 bucket.
Though I have just edited my PHP config variables to allow a maximum of 10 MB per image, my webapp returns a 500 server error with no response body whenever I try to upload a photo below but close to that 10 MB maximum (9.4 MB, for example).
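For context, note that more than one PHP setting comes into play here: a multipart request carrying a 9.4 MB file is larger than 9.4 MB on the wire, so if post_max_size is also set to 10M the request as a whole can be rejected. A minimal php.ini sketch of the kind of change described above (values are illustrative, not a confirmed fix for this case):

    ; values are illustrative
    upload_max_filesize = 10M
    ; post_max_size must exceed upload_max_filesize, because the
    ; multipart encoding adds overhead on top of the raw file size
    post_max_size = 12M
    memory_limit = 128M

If a proxy such as nginx sits in front of the Laravel API, its client_max_body_size imposes a separate limit that can also produce errors with no useful response body.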
The project is made up of a webapp built with NodeJS and a Laravel PHP API. The file upload runs via this webapp.
I would very much appreciate any ideas on how to resolve this, or any experience with AWS and photo uploads.
Thanks!
Related
I'm stuck wondering what the best solution is for handling large file uploads and sending them to a third-party API. Any pointers on a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
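For context, the pre-signed URL step looks roughly like this with the AWS SDK for PHP v3 (bucket, key and region are placeholders):

    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'region'  => 'us-east-1',
        'version' => 'latest',
    ]);

    // Build a PutObject command and pre-sign it; the browser then PUTs
    // the video straight to S3 without touching the Lightsail server
    $command = $s3->getCommand('PutObject', [
        'Bucket' => 'my-upload-bucket',
        'Key'    => 'videos/example.mp4',
    ]);
    $request = $s3->createPresignedRequest($command, '+20 minutes');
    echo (string) $request->getUri();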
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore: (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using the regular HTML form upload, then the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
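One note on the option 1 transfer step: even when the file has to pass through the server, cURL can stream it from the temp file rather than reading it into memory. A minimal sketch, assuming the endpoint and AccessKey header follow Bunny's docs (library/video IDs and the key are placeholders):

    $path = $_FILES['video']['tmp_name'];
    $fp   = fopen($path, 'rb');

    $ch = curl_init('https://video.bunnycdn.com/library/LIBRARY_ID/videos/VIDEO_ID');
    curl_setopt_array($ch, [
        CURLOPT_PUT            => true,               // raw PUT body
        CURLOPT_INFILE         => $fp,                // stream from disk
        CURLOPT_INFILESIZE     => filesize($path),
        CURLOPT_HTTPHEADER     => ['AccessKey: YOUR_API_KEY'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    fclose($fp);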
With option 2, I can already upload large files directly to S3 with pre-signed URLs and this process seems to run fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files, but when I looked at Lambda, its payload and default temporary-storage limits are far below 5 GB. Because my site is hosted on Lightsail, I noticed they have a container option, but I don't know if that can be used for this purpose and, if so, how.
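For what it's worth, the option 2 transfer doesn't have to run in Lambda; a queued job or cron task on the Lightsail instance could stream the object from S3 to the API without buffering it in memory. A rough sketch using the AWS SDK for PHP and Guzzle (bucket, IDs and key are placeholders; the endpoint follows Bunny's docs):

    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use GuzzleHttp\Client;

    $s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

    // 'Body' is a PSR-7 stream, so the file is read in chunks
    // rather than loaded into memory all at once
    $object = $s3->getObject([
        'Bucket' => 'my-upload-bucket',
        'Key'    => 'videos/example.mp4',
    ]);

    $http = new Client();
    $http->put('https://video.bunnycdn.com/library/LIBRARY_ID/videos/VIDEO_ID', [
        'headers' => ['AccessKey' => 'YOUR_API_KEY'],
        'body'    => $object['Body'],
    ]);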
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
What is the most efficient and least resource-intensive implementation for uploading images to S3 with a RESTful API setup?
Should the web application handle the generation of thumbnails and the upload to Amazon S3, then make an API request on successful upload?
OR
Should the web application pass the image request to the REST API, which handles thumbnail generation and the upload to Amazon S3, then saves the data to the DB on success?
Ideally, you would write an Amazon Lambda function to deal with the image processing (see the AWS docs). This way you would only need to upload one image (saving on throughput), and Amazon would seamlessly handle the resizing separately from the API (image handling, such as resizing and uploading, should ideally be a separate service).
Out of the two choices that you posted in your question, I would definitely choose the second one, because:
You don't want your user to upload multiple images: UX, users' data costs on mobile, upload time, and the possibility of failure all increase dramatically if you let the web app handle this task.
You can have much more freedom and more sophisticated tools, like Imagick, to work with your images (see the sketch after this list).
You can potentially handle the upload (to Amazon) and the resizing asynchronously, further improving the perceived speed.
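For illustration, the Imagick resize step mentioned above might look like this on the API side (paths and sizes are placeholders):

    // Generate a 300px-wide thumbnail; passing 0 for the height
    // makes Imagick preserve the aspect ratio
    $image = new Imagick('/tmp/original.jpg');
    $image->thumbnailImage(300, 0);
    $image->writeImage('/tmp/thumb_300.jpg');
    $image->destroy();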
I am using ImageMagick in CodeIgniter 2.x to save multiple copies of an image for a social site. After each crop, it sends the image to an AWS S3 bucket.
I am wondering if there is any way to make this process asynchronous in PHP, to keep the user from having to wait for four crops and saves before getting a response.
You can introduce a queuing system in your CodeIgniter project. If you are on AWS, one option is to send a message to SQS to indicate that an image has been uploaded; you can then process the images asynchronously without holding up the user (see https://aws.amazon.com/articles/PHP/1602 and the sketch below).
You can also take a serverless approach using AWS Lambda: http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser-create-test-function-create-function.html
Alternatively, Cloudinary (http://cloudinary.com/) offers a great cloud solution that provides a very flexible image resizing/processing service, free for most projects.
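A minimal sketch of the SQS approach, using the AWS SDK for PHP (queue URL, bucket and message fields are hypothetical):

    require 'vendor/autoload.php';

    use Aws\Sqs\SqsClient;

    $sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);

    // Enqueue a pointer to the uploaded image; a background worker polls
    // the queue and runs the four crops without blocking the response
    $sqs->sendMessage([
        'QueueUrl'    => 'https://sqs.us-east-1.amazonaws.com/123456789012/image-crops',
        'MessageBody' => json_encode([
            'bucket' => 'my-bucket',
            'key'    => 'uploads/photo.jpg',
        ]),
    ]);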
Please help me understand the process of uploading files to an Amazon S3 server via PHP. I have a website on EC2, which will have a PHP script for uploading files to S3 from the client's machine. What I need to understand is whether the file will go directly to S3 from the client's machine, or if it will first be uploaded onto EC2 and then to S3. If it's the second option, how can I optimize the upload so that the file goes directly to S3 from the client's machine?
It is possible to upload a file to S3 using either of the scenarios you specified.
In the first scenario, the file gets uploaded to your PHP backend on EC2 and then you upload it from PHP to S3 via a PUT request. Basically, in this scenario, all uploads pass through your EC2 server.
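With the current AWS SDK for PHP (v3), the pass-through PUT in this first scenario looks roughly like this (bucket and region are placeholders):

    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

    // The file arrives in PHP's temp dir first, then is PUT to S3
    // from the EC2 box, so every byte passes through your server
    $s3->putObject([
        'Bucket'     => 'my-bucket',
        'Key'        => 'uploads/' . basename($_FILES['photo']['name']),
        'SourceFile' => $_FILES['photo']['tmp_name'],
    ]);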
The second option is to upload the file directly to S3 from the client's browser. This is done with a POST request sent directly to S3, together with a policy that you generate in your PHP code and attach to the request. The policy is basically a set of rules that allows S3 to accept the upload (without it, anyone would be able to upload anything to your bucket).
In this second scenario, your PHP scripts on EC2 only need to generate a valid policy for the upload; the actual file being uploaded goes directly to S3 without passing through your EC2 server.
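As a sketch of what the policy generation can look like in PHP (the signature style follows the article linked below; bucket, key prefix and secret key are placeholders):

    // Build and sign the POST policy on the server; the browser form
    // then POSTs the file directly to the bucket URL
    $policy = base64_encode(json_encode([
        'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
        'conditions' => [
            ['bucket' => 'my-bucket'],
            ['starts-with', '$key', 'uploads/'],
            ['content-length-range', 0, 10485760], // cap uploads at 10 MB
        ],
    ]));
    $signature = base64_encode(hash_hmac('sha1', $policy, 'YOUR_SECRET_KEY', true));
    // Embed $policy, $signature and your access key id as hidden
    // fields in the HTML form.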
You can get more info on the second scenario here:
http://aws.amazon.com/articles/1434
Even though it's not PHP-specific, it explains how to generate the policy and how to form the POST request.
You can also get more information by reading through the API docs for POST requests:
http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPOST.html
EDIT: The official AWS SDK for PHP contains a helper class for doing this: http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.Model.PostObject.html
I've got a media-rich site that's growing just beyond what our server can handle in terms of storage. I estimate between 500 GB and 2 TB.
The media is uploaded through the website, usually 500 KB to 30 MB at a time, and consists of videos and photos users have uploaded.
Using the PHP FTP functions, the media is then copied from the temp directory into the media directory.
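For reference, that FTP copy step presumably looks something like this (host, credentials and paths are illustrative):

    $conn = ftp_connect('media.example.com');
    ftp_login($conn, 'user', 'password');

    // Copy the uploaded file from PHP's temp dir to the media directory
    $filename = basename($_FILES['upload']['name']);
    ftp_put($conn, '/media/' . $filename, $_FILES['upload']['tmp_name'], FTP_BINARY);
    ftp_close($conn);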
I'm looking for the best way to handle storing the file after the user has uploaded it.
EDIT
I have a cloud computing account with Mosso and all our sites are hosted on dedicated boxes with Rackspace (traditional). My question applies to the actual process of getting media into the site the way it currently works, and then what to do next...
Try Rackspace Mosso or Amazon S3 with CloudFront.
Also think about using a content delivery network to speed up your visitors' load times.
Both have an Application Programming Interface (API) that can be used with your internal systems for uploading new content.
Twitter uses Amazon S3 for user avatars.