How does uploading to S3 through a PHP script on EC2 work? - php

Please help me understand the process of uploading files to Amazon S3 via PHP. I have a website on EC2, which will have a PHP script for uploading files to S3 from the client's machine. What I need to understand is whether the file will go directly to S3 from the client's machine, or whether it will first be uploaded to EC2 and then to S3. If it's the second option, how can I optimize the upload so that the file goes directly to S3 from the client's machine?

It is possible to upload a file to S3 using either of the scenarios you described.
In the first scenario, the file is uploaded to your PHP backend on EC2, and you then upload it from PHP to S3 via a PUT request. In this scenario, all uploads pass through your EC2 server.
The second option is to upload the file directly to S3 from the client's browser. This is done by making a POST request directly to S3, with a policy that you generate in your PHP logic attached to the request. This policy is a set of rules that allows S3 to accept the upload (without it, anyone would be able to upload anything to your bucket).
In this second scenario, your PHP scripts on EC2 only need to generate a valid policy for the upload; the actual file being uploaded goes directly to S3 without passing through your EC2 server.
You can get more info on the second scenario here: http://aws.amazon.com/articles/1434. Even though it's not PHP-specific, it explains how to generate the policy and how to form the POST request.
You can also get more information by reading through the API docs for POST requests:
http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPOST.html
EDIT: The official AWS SDK for PHP contains a helper class for doing this: http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.Model.PostObject.html
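As a minimal sketch of the policy generation described above, here is a plain-PHP (no SDK) version of building a browser-POST policy and its Signature V4 signature. The bucket name, region, credentials, key prefix, and size limit below are all placeholder assumptions; in production you would use the SDK helper class linked above.

```php
<?php
// Sketch: build a POST policy and SigV4 signature in plain PHP.
// All identifiers below are placeholders, not real credentials.
$bucket    = 'example-bucket';
$region    = 'us-east-1';
$accessKey = 'AKIAEXAMPLE';
$secretKey = 'secret-example';

$date       = gmdate('Ymd');
$amzDate    = gmdate('Ymd\THis\Z');
$credential = "$accessKey/$date/$region/s3/aws4_request";

// The policy is the set of rules S3 checks before accepting the upload.
$policy = base64_encode(json_encode([
    'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
    'conditions' => [
        ['bucket' => $bucket],
        ['starts-with', '$key', 'uploads/'],        // restrict key prefix
        ['content-length-range', 0, 10485760],      // max 10 MB
        ['x-amz-credential' => $credential],
        ['x-amz-algorithm'  => 'AWS4-HMAC-SHA256'],
        ['x-amz-date'       => $amzDate],
    ],
]));

// Signature V4 key derivation: a chain of HMAC-SHA256 operations,
// then the policy itself is signed with the derived key.
$kDate     = hash_hmac('sha256', $date, 'AWS4' . $secretKey, true);
$kRegion   = hash_hmac('sha256', $region, $kDate, true);
$kService  = hash_hmac('sha256', 's3', $kRegion, true);
$kSigning  = hash_hmac('sha256', 'aws4_request', $kService, true);
$signature = hash_hmac('sha256', $policy, $kSigning);

// The browser form then POSTs to the bucket URL with fields: key,
// policy, x-amz-credential, x-amz-algorithm, x-amz-date,
// x-amz-signature, and finally the file itself.
echo $signature, "\n";
```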

Related

Feed Pull from Amazon s3 Bucket in a third party application

I need your help, or some ideas, with integrating S3 through a third-party vendor application. I have some issues that I am not able to resolve, since I am not an expert with Amazon S3.
I have an application called Medallia, and I am trying to perform a feed pull to export or download a file that was uploaded to the S3 bucket. I was able to figure out how to upload the file to the S3 bucket, but I am stuck on downloading (feed-pulling) the files into the app.
I have added the information below for you to review!
Amazon s3 (Vendor Application Settings)
Pull method: Amazon S3
Feed pull: feedpull.va.dev.us-west-1.medallia
Amazon S3 access key: AKIA**** (confidential)
Endpoint: s3-website-us-gov-west-1.amazonaws.com
I have added the URL where I took the endpoint from. I am not sure whether I have to use the Web or the Storage endpoint. (http://docs.aws.amazon.com/govcloud-us/latest/UserGuide/using-govcloud-endpoints.html) Please advise!
Why am I doing this?
- So that the files that get uploaded to the Amazon S3 bucket can be pulled on a schedule at the end of every week to update/maintain the third-party application.
Additional Info Needed (If Possible)
I need to have Python installed and a command-line (CLI) way to drop files onto the S3 bucket. This will help me test the S3 bucket (on the web service) through the automatic feed pull.
ERRORS
Before setting an endpoint:
Error while pulling feedpull. The configuration is using the wrong endpoint for the specified bucket for s3. exception: nonfatal error.
After setting the endpoint:
Error while pulling feedpull: s3 Pull (feedpull I) Unable to execute HTTP request. Connect timed out. One possible cause is a malformed endpoint. nonfatal error exception.
Thank you!
The website hosting endpoints are for unsecured, publicly accessible content and don't support authentication... so that isn't likely what you are looking for.
Assuming you're working in US GovCloud, you want one of the REST endpoints:
s3-fips-us-gov-west-1.amazonaws.com
s3-us-gov-west-1.amazonaws.com
The first one is FIPS 140-2 Level 2 compliant, so I would suggest that is the one you want, unless you have a reason not to use it.
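Since the bucket name in the question contains dots, path-style addressing avoids TLS certificate mismatches with S3's wildcard certificate. A minimal sketch of building the REST URL against the GovCloud endpoints above (the object key is a placeholder):

```php
<?php
// Sketch: construct a path-style REST URL for a GovCloud bucket.
// Path style (https://<endpoint>/<bucket>/<key>) is used because bucket
// names containing dots break virtual-hosted-style TLS validation.
function s3GovCloudUrl(string $bucket, string $key, bool $fips = true): string {
    $host = $fips ? 's3-fips-us-gov-west-1.amazonaws.com'
                  : 's3-us-gov-west-1.amazonaws.com';
    // Encode each path segment individually so the slashes survive.
    $encodedKey = implode('/', array_map('rawurlencode', explode('/', $key)));
    return "https://$host/$bucket/$encodedKey";
}

// 'export/feed.csv' is a hypothetical object key for illustration.
echo s3GovCloudUrl('feedpull.va.dev.us-west-1.medallia', 'export/feed.csv');
```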

AWS Lambda as a file verification system for s3 uploads

I need to use S3 to store content from users and govern access, in a social-network type of service.
So far this is what I have thought of doing:
1. The client tells the LAMP server it wants to upload a file.
2. LAMP authenticates, and generates a presigned URL for S3 where the user can upload. It also creates a digital signature of that key using a private key, then adds the key and the user who requested it to a MySQL table (along with when it was started).
3. LAMP sends the key and the digital signature from step 2 to the client.
4. The client uploads the file to S3.
5. After finishing, the client tells LAMP that the upload is complete, sending in the key and the digital signature.
6. LAMP makes sure the key and the signature match. If they do, LAMP knows the client is honest about the key it was given (and has not randomly generated it).
7. LAMP then checks S3 to make sure a file with that key exists; if it does, it deletes the row that was added in step 2.
After asking around, I was told that it is not possible for S3 itself to verify that a file is a 'valid' file of a certain type (I want to enforce images only).
So I decided to use AWS Lambda to verify it (if it's a 'wrong' file, just delete it). Lambda can be fired just after the file is uploaded to S3.
However, it is possible that step 7 above gets executed BEFORE Lambda finishes checking the file. This would mean my server thinks the file is valid.
Is there any way to make an S3 upload plus Lambda execution atomic?
Any suggestions are welcome.
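The key-plus-signature handshake in steps 2, 3, 5, and 6 can be sketched in PHP like this; the SECRET constant and the 'uploads/' key prefix are placeholder assumptions, and in the real flow the key would be the S3 object key handed out alongside the presigned URL:

```php
<?php
// Sketch of the key + signature handshake from the steps above.
const SECRET = 'server-secret';   // assumption: kept private on the LAMP server

// Steps 2-3: mint an object key and sign it, so only keys we issued verify.
function issueUploadKey(): array {
    $key = 'uploads/' . bin2hex(random_bytes(16));
    $sig = hash_hmac('sha256', $key, SECRET);
    return [$key, $sig];          // both are sent to the client
}

// Step 6: verify the pair the client sends back, in constant time so a
// client cannot forge a key it was never issued.
function verifyUploadKey(string $key, string $sig): bool {
    return hash_equals(hash_hmac('sha256', $key, SECRET), $sig);
}

[$key, $sig] = issueUploadKey();
var_dump(verifyUploadKey($key, $sig));              // genuine pair verifies
var_dump(verifyUploadKey('uploads/forged', $sig));  // forged key fails
```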

Download multiple images from Amazon S3 in a zip file?

I have a web page on a web host, and the images are stored on Amazon S3. Using PHP, I want to be able to download multiple images from Amazon S3 through my web page as a zip file.
What are my options, and which is best?
As far as I know, it is not possible to compress files on S3. Can I use AWS Lambda?
The best solution I've come across:
The user selects on my website which images they want to download. I get the file names from my database on my web host, download the images from S3 to a temporary directory on my web host, create a zip file in that directory, and send a link to the user. After a certain time, a script clears the temporary directory on my web host.
But it would be great if there were a way to create and download the zip file without going through my hosting.
AWS S3 provides "basic building blocks", so it doesn't support a feature like zipping multiple objects together.
You've come up with a good method, though you could stream the objects into a zip file rather than downloading each one to disk first. EC2 instances can do this very quickly because they tend to have fast connections to S3.
Lambda doesn't work for this, as it is triggered when an object is placed into an S3 bucket. You are doing the opposite.
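Building the zip on the web host could look like this minimal sketch, assuming PHP's zip extension is available and the objects are readable by URL (for private objects you would use presigned URLs or the AWS SDK instead of file_get_contents(); the name-to-URL mapping is hypothetical):

```php
<?php
// Sketch: pull each S3 object and write it straight into a zip archive,
// without keeping a separate temp copy of every image on disk.
function zipImages(array $sources, string $zipPath): int {
    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
    foreach ($sources as $nameInZip => $url) {
        // The object bytes are buffered in memory and added directly.
        $zip->addFromString($nameInZip, file_get_contents($url));
    }
    $count = $zip->numFiles;
    $zip->close();
    return $count;   // number of entries written to the archive
}
```

The returned path can then be served as the download link, and the cleanup script from the question deletes it later.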

Make requests from amazon S3 to EC2

I'm using S3 for storing generated HTML and JS files (from an EC2 instance).
I'm trying to make AJAX requests from these JS files to my PHP files on my EC2 instance.
So, I've added this line to my .htaccess:
Header set Access-Control-Allow-Origin *
When I test locally, with the AJAX requests made from S3 to my localhost, it works perfectly.
But when I try to make these requests to EC2, I get the "not allowed by Access-Control-Allow-Origin" message...
Does anyone know if it's possible to do such a thing?
If not, is there any way to dynamically load content into the S3-hosted files?
Thanks
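One thing worth checking: if Apache's mod_headers is not enabled on the EC2 instance, the .htaccess line above is silently ignored. A minimal sketch of a workaround is to emit the CORS headers from the PHP endpoint itself; the permissive '*' origin and header set below are assumptions, and you would normally restrict the origin to your S3 website URL:

```php
<?php
// Sketch: send CORS headers from PHP so the response carries
// Access-Control-Allow-Origin even without mod_headers.
function corsHeaders(string $method): array {
    return [
        // Preflight OPTIONS requests get an empty 204 response.
        'status'                       => $method === 'OPTIONS' ? 204 : 200,
        'Access-Control-Allow-Origin'  => '*',   // placeholder: your S3 origin
        'Access-Control-Allow-Methods' => 'GET, POST, OPTIONS',
        'Access-Control-Allow-Headers' => 'Content-Type',
    ];
}

foreach (corsHeaders($_SERVER['REQUEST_METHOD'] ?? 'GET') as $name => $value) {
    if ($name === 'status') {
        http_response_code($value);
        continue;
    }
    header("$name: $value");
}
```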

How to Delete a file from Amazon S3 by HTTP Request using PHP

I would like to find a simple PHP script that will send an HTTP request to my bucket on Amazon S3 and delete a file. I have found PHP code around the web that manages S3 files, but what I am looking for is just the delete request. I already know the name of the file and the bucket; I just want to send the request to delete it.
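A hedged sketch of doing this without the SDK, using plain PHP with curl and a Signature V4-signed DELETE request; the bucket, key, region, and credentials are placeholders for your own values, and using the official AWS SDK for PHP is the more robust option:

```php
<?php
// Signature V4 signing-key derivation (a chain of HMAC-SHA256 calls).
function awsV4SigningKey(string $secret, string $date, string $region, string $service): string {
    $k = hash_hmac('sha256', $date, 'AWS4' . $secret, true);
    $k = hash_hmac('sha256', $region, $k, true);
    $k = hash_hmac('sha256', $service, $k, true);
    return hash_hmac('sha256', 'aws4_request', $k, true);
}

// Sketch: sign and send a DELETE for one object; returns the HTTP status.
function s3DeleteObject(string $bucket, string $key, string $region,
                        string $accessKey, string $secretKey): int {
    $host        = "$bucket.s3.$region.amazonaws.com";
    $amzDate     = gmdate('Ymd\THis\Z');
    $date        = substr($amzDate, 0, 8);
    $payloadHash = hash('sha256', '');   // a DELETE request has no body
    $uri = '/' . implode('/', array_map('rawurlencode', explode('/', $key)));

    // Canonical request -> string to sign -> signature.
    $canonicalRequest = "DELETE\n$uri\n\n"
        . "host:$host\nx-amz-content-sha256:$payloadHash\nx-amz-date:$amzDate\n\n"
        . "host;x-amz-content-sha256;x-amz-date\n$payloadHash";
    $scope        = "$date/$region/s3/aws4_request";
    $stringToSign = "AWS4-HMAC-SHA256\n$amzDate\n$scope\n" . hash('sha256', $canonicalRequest);
    $signature    = hash_hmac('sha256', $stringToSign,
                              awsV4SigningKey($secretKey, $date, $region, 's3'));

    $ch = curl_init("https://$host$uri");
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => 'DELETE',
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => [
            "x-amz-date: $amzDate",
            "x-amz-content-sha256: $payloadHash",
            "Authorization: AWS4-HMAC-SHA256 Credential=$accessKey/$scope, "
                . "SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=$signature",
        ],
    ]);
    curl_exec($ch);
    $status = (int) curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status;   // S3 answers 204 No Content on a successful delete
}
```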
