Feed pull from an Amazon S3 bucket in a third-party application - PHP

I need your help, or some ideas, with integrating S3 into a third-party vendor application. I have some issues that I am not able to resolve, since I am not an expert with Amazon S3.
I have an application called Medallia, and I am trying to perform a feed pull to export or download a file that has been uploaded to the S3 bucket. I was able to figure out how to upload the file to the S3 bucket, but I am stuck on downloading (feed pulling) the files into the app.
I have added the information below for you to review!
Amazon S3 (vendor application settings):
- Pull method: Amazon S3
- Feed pull: feedpull.va.dev.us-west-1.medallia
- Amazon S3 access key: AKIA**** (confidential)
- Endpoint: s3-website-us-gov-west-1.amazonaws.com
I have added the URL where I took the endpoint from (http://docs.aws.amazon.com/govcloud-us/latest/UserGuide/using-govcloud-endpoints.html). I am not sure whether I have to use the website or the storage endpoint. Please advise!
Why am I doing this?
- So that the files that get uploaded to the Amazon S3 bucket are pulled on a schedule at the end of every week, to update/maintain the third-party application.
Additional info needed (if possible)
I need to have Python installed and a CLI command to drop files onto the S3 bucket. This will help me test the S3 bucket (on the web services) through the automatic feed pull.
Errors
Before adding the endpoint:
Error while pulling feedpull. The configuration is using the wrong endpoint for the specified bucket for s3. exception: nonfatal error.
After adding the endpoint:
Error while pulling feedpull: s3 Pull (feedpull I) Unable to execute HTTP request. Connect timed out. One possible cause is a malformed endpoint. nonfatal error exception.
Thank you!

The website hosting endpoints are for unsecured, publicly accessible content, and don't support authentication... so that isn't likely what you are looking for.
Assuming you're working in US GovCloud, you want one of the REST endpoints:
s3-fips-us-gov-west-1.amazonaws.com
s3-us-gov-west-1.amazonaws.com
The first one is FIPS 140 Level 2 compliant, so I would suggest that this is the one you want, unless you have a reason not to.
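If the vendor settings ultimately map onto an SDK-style S3 client, a minimal sketch of pointing the AWS SDK for PHP v3 at the GovCloud REST endpoint might look like this (the credentials, object key, and local path are placeholders, not values from the question):

```php
<?php
// A minimal sketch, assuming the AWS SDK for PHP v3 is installed via Composer.
// Credentials, object key, and local path below are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-gov-west-1',
    // Use the FIPS REST endpoint rather than the website endpoint:
    'endpoint'    => 'https://s3-fips-us-gov-west-1.amazonaws.com',
    'credentials' => [
        'key'    => 'AKIA...',      // your access key
        'secret' => 'YOUR_SECRET',  // your secret key
    ],
]);

// Download ("pull") an object from the bucket to a local file.
$client->getObject([
    'Bucket' => 'feedpull.va.dev.us-west-1.medallia', // bucket from the question
    'Key'    => 'exports/feed.csv',                   // assumed object key
    'SaveAs' => '/tmp/feed.csv',
]);
```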

Related

Store moodledata on S3 and manage using AWS SDK for PHP

I would like to store my moodledata folder in an S3 bucket and share it with multiple EC2 instances for autoscaling.
So, without using s3fs, I want to manage this moodledata in the S3 bucket using the AWS SDK for PHP.
I have searched many forums, but I did not find any solution based on the AWS SDK for PHP. Regards.
There is a plugin, https://moodle.org/plugins/tool_objectfs, which almost does this. It does not replace the file system, but shifts files to S3 based on criteria, for example if the file size is greater than 10 KB. You will still need a shared file system, but you would be able to keep costs down by shifting old and large files off to S3. In AWS, the best way to have a shared file system is the EFS service. It would be challenging to completely replace the file system with S3, as S3 is not a file system but an object store.
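That said, if you do want to manipulate individual files in the bucket with the AWS SDK for PHP directly, its S3 stream wrapper lets ordinary file functions operate on S3. A minimal sketch (the bucket name and region are assumptions):

```php
<?php
// Minimal sketch of managing files in S3 with the AWS SDK for PHP v3,
// using its S3 stream wrapper. Bucket name and region are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumed region
]);

// Register the s3:// stream wrapper so ordinary file functions work on S3.
$client->registerStreamWrapper();

// Write a file into the bucket as if it were a local path.
file_put_contents('s3://my-moodledata-bucket/filedir/sample.txt', 'hello');

// Read it back.
echo file_get_contents('s3://my-moodledata-bucket/filedir/sample.txt');
```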

How to integrate OpenCart with Amazon CDN?

I have been trying to integrate Amazon S3 and Amazon CloudFront with my website (hosted on Amazon EC2), developed using OpenCart, for the past few days. While searching, I found a lot of extensions, of which not a single one fits the requirement.
According to the extensions, all data is stored on local volume storage; you can create a sub-domain with its document root pointing to the /image/ directory and access the images from the sub-domain. But here I do not see how the images ever get to Amazon S3. I might be missing something here, but below is what I want to implement.
What I want is to store all images and downloadables in Amazon S3 and retrieve them from Amazon S3 through Amazon CloudFront. When the admin uploads an image, it should be stored in Amazon S3 instead of local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After reviewing the files, it looks impossible to implement what I want within the current structure. The solutions I see are either to create my own library for this and update every file of OpenCart where images are used, or to use one of the existing extensions (which will cause issues when using Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?
Searching the marketplace, I found this solution: Amazon CloudFront / S3 Integration.
The extension says:
"Speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront."
Finally, I found the way to do this. Basically, we have two questions in my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice for this is to synchronize all directories and files inside '{$ROOT}/image' completely with S3. Ultimately, the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't create any issues, as files are no longer saved on local storage.
To achieve this, one has to customize the application so that whenever the admin adds/updates any images, they are added/updated in S3 instead of local storage. Likewise, when images are pulled into the website front end, they are pulled from S3 instead of local storage. A sketch of the upload side is shown below.
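Purely as an illustration (this is not OpenCart's actual upload code), a hypothetical handler that pushes an admin-uploaded image to S3 with the AWS SDK for PHP could look like this; the bucket name, key prefix, and region are assumptions:

```php
<?php
// Illustrative sketch only: a hypothetical handler that stores an
// admin-uploaded image in S3 instead of the local /image/ directory.
// Bucket name, key prefix, and region are assumptions, not OpenCart internals.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

function storeImageInS3(array $upload): string
{
    $client = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1', // assumed region
    ]);

    $key = 'image/catalog/' . basename($upload['name']);

    // Upload the temporary file PHP received to the bucket.
    $client->putObject([
        'Bucket'     => 'my-opencart-images', // placeholder bucket
        'Key'        => $key,
        'SourceFile' => $upload['tmp_name'],
        'ACL'        => 'public-read',        // so CloudFront/S3 can serve it
    ]);

    return $key;
}

// Usage: storeImageInS3($_FILES['image']);
```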
Integrate Amazon CloudFront
This has two options:
2a. With S3 implemented: just provide the S3 bucket as the origin in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used): instead of the S3 bucket, give Amazon CloudFront the image directory URL of our application as the origin, for example www.example.com/image/, and that's it. Now change the image URLs throughout the web application, and the images will be pulled from the Amazon CloudFront URL.
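In both options, the recurring step is rewriting image URLs to point at the CDN. A tiny, hypothetical helper for that (the distribution domain is a placeholder):

```php
<?php
// Hypothetical helper: rewrite a local image path to its CloudFront URL.
// The distribution domain below is a placeholder.
function cdnImageUrl(string $localPath): string
{
    $cdnDomain = 'https://d1234567890.cloudfront.net';
    // Map e.g. "image/catalog/logo.png" to the same key on the CDN.
    return $cdnDomain . '/' . ltrim($localPath, '/');
}

// Usage: echo cdnImageUrl('image/catalog/logo.png');
```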
The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe:
How to set up an instance in Lightsail
How to select the free tier plan
Update the system and PHP version in AWS Lightsail
OpenCart installation steps in the AWS Lightsail LAMP stack
Create a static IP
Create a DNS zone
Add name servers to your domain registrar
Create a database, database user, and grant access
Install a free Let’s Encrypt certificate
Configure HTTP to HTTPS redirection for OpenCart
Activate the SEO URL
How to set up FileZilla SFTP in AWS Lightsail to transfer files?
phpMyAdmin access
How to upgrade to a higher Lightsail package?
How to set up the CDN?
Let us know if you have any questions or concerns.

Change CloudFront next origin file request

I have set up a CloudFront distribution for a website.
To serve on-the-fly picture transformations, I added a custom origin: the website itself.
So, in my distribution, I have two origins:
- the S3 bucket
- mywebsite.com/images
When I call cdn.mywebsite.com/500/picture.jpg,
it calls my website like: website.com/api.php/file/500/picture.jpg
I get the S3 object, create the thumbnail, save it on the server, then upload it to S3.
Up to here, it all works.
Now, I would like the next request for this file to go not to my custom website origin, but to the file stored in S3.
I cannot find a way to define an order of importance (weight) for multiple origins.
It seems that once CloudFront has a "route", it keeps the same one.
Any ideas?
You cannot set up multiple origins for a CloudFront distribution from the AWS console; you have to customize this using either:
- the Amazon CloudFront REST API
- Bucket Explorer
Check this guide, which has the steps for both: http://www.bucketexplorer.com/documentation/amazon-s3--how-to-create-distributions-post-distribution-with-multiple-origin-servers.html
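Whichever way the origins are wired up, the generator endpoint described in the question can at least be made idempotent, so it only does work when the thumbnail is missing from S3. A rough sketch (the bucket name, path layout, and resize helper are all assumptions):

```php
<?php
// Rough sketch of the generator endpoint described above: if the thumbnail
// already exists in S3, serve it from there; otherwise create and upload it
// first. Bucket name and path layout are assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['version' => 'latest', 'region' => 'eu-west-1']);
$bucket = 'my-thumbnails-bucket';

// e.g. request path /file/500/picture.jpg -> key "500/picture.jpg"
$key = ltrim(preg_replace('#^/file/#', '', $_SERVER['PATH_INFO'] ?? ''), '/');

if (!$client->doesObjectExist($bucket, $key)) {
    // Fetch the original, resize it, and upload under the requested key,
    // so subsequent requests can be answered from the S3 copy.
    $thumb = createThumbnail($bucket, $key); // hypothetical resize helper
    $client->putObject([
        'Bucket'     => $bucket,
        'Key'        => $key,
        'SourceFile' => $thumb,
    ]);
}

// One option: redirect to the S3 object now that it exists there.
header('Location: https://' . $bucket . '.s3.amazonaws.com/' . $key, true, 302);
```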

WordPress installed with Elastic Beanstalk overwrites uploaded images

I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy with the EB scripts, the whole application is replaced, including uploaded images that were attached to posts. So after such an automatic deploy, posts have missing pictures. :)
I tried to solve this:
1) I logged into the Amazon machine with SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite only part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. This is still untested by me :). Do you know if this is a good approach?
3) Any other approach to this problem? How should it be organized on Amazon: should a machine backup be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way, in my opinion, is to use a plugin that writes all media files to AWS S3, something similar to the Amazon S3 and Cloudfront plugin; a sketch of the underlying idea follows below.
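To illustrate what such plugins do under the hood (this is not a replacement for one), a stripped-down sketch using WordPress's attachment hooks and the AWS SDK for PHP; the bucket name and region are placeholders:

```php
<?php
// Stripped-down sketch of what an "offload media to S3" plugin does.
// Bucket name and region are placeholders; real plugins handle many more
// cases (thumbnails, deletions, private files, etc.).
use Aws\S3\S3Client;

// Copy each newly uploaded attachment to S3.
add_filter('wp_generate_attachment_metadata', function ($metadata, $attachment_id) {
    $client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
    $file   = get_attached_file($attachment_id);

    $client->putObject([
        'Bucket'     => 'my-wp-media-bucket',
        'Key'        => basename($file),
        'SourceFile' => $file,
        'ACL'        => 'public-read',
    ]);

    return $metadata;
}, 10, 2);

// Serve attachment URLs from S3 instead of the local (ephemeral) disk.
add_filter('wp_get_attachment_url', function ($url, $attachment_id) {
    return 'https://my-wp-media-bucket.s3.amazonaws.com/' . basename($url);
}, 10, 2);
```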
Your log files should also be shipped to a remote syslog server, which you can either build yourself or get from a third party.
Google: Loggly, Logstash, Graylog, Splunk.

How does uploading to S3 through a PHP script on EC2 work?

Please help me understand the process of uploading files to the Amazon S3 server via PHP. I have a website on EC2, which will have a PHP script for uploading a file to the S3 server from the client's machine. What I need to understand is whether the file will go directly to S3 from the client's machine, or whether it will first be uploaded onto EC2 and then to S3. If it's the second option, then how can I optimize the upload so that the file goes directly to S3 from the client's machine?
It is possible to upload a file to S3 in either of the scenarios you specified.
In the first scenario, the file gets uploaded to your PHP backend on EC2, and then you upload it from PHP to S3 via a PUT request. Basically, in this scenario, all uploads pass through your EC2 server; a minimal sketch follows below.
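For illustration, a minimal sketch of this pass-through scenario, assuming the AWS SDK for PHP v3 (the bucket name and region are placeholders):

```php
<?php
// Minimal sketch of the pass-through scenario: the browser posts the file
// to this EC2-hosted script, which then PUTs it to S3. Bucket and region
// are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

if (!empty($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $client->putObject([
        'Bucket'     => 'my-upload-bucket',
        'Key'        => 'uploads/' . basename($_FILES['file']['name']),
        'SourceFile' => $_FILES['file']['tmp_name'],
    ]);
    echo 'Uploaded via EC2 to S3.';
}
```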
The second option is to upload the file directly to S3 from the client's browser. This is done by making a POST request directly to S3, with a policy that you generate in your PHP logic and attach to the POST request. This policy is basically a set of rules that allows S3 to accept the upload (without it, anyone would be able to upload anything to your bucket).
In this second scenario, your PHP script on EC2 only needs to generate a valid policy for the upload, and the actual file being uploaded goes directly to S3 without passing through your EC2 server.
You can get more info on the second scenario here:
http://aws.amazon.com/articles/1434
Even though it's not PHP-specific, it explains how to generate the policy and how to form the POST request.
You can also get more information by reading through the API docs for POST requests:
http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPOST.html
EDIT: The official AWS SDK for PHP contains a helper class for doing this: http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.Model.PostObject.html (a sketch using its SDK v3 successor follows below).
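That class belongs to SDK v2; a rough sketch with the SDK v3 equivalent, Aws\S3\PostObjectV4, might look like this (the bucket name and key prefix are placeholders):

```php
<?php
// Sketch of browser-direct upload using the SDK v3 helper Aws\S3\PostObjectV4.
// Bucket name, key prefix, and region are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$bucket = 'my-upload-bucket';

// Form fields the browser will send; ${filename} is expanded by S3.
$formInputs = ['acl' => 'private', 'key' => 'uploads/${filename}'];

// Policy conditions S3 will enforce when accepting the POST.
$options = [
    ['acl' => 'private'],
    ['bucket' => $bucket],
    ['starts-with', '$key', 'uploads/'],
];

$postObject = new PostObjectV4($client, $bucket, $formInputs, $options, '+1 hours');

$attributes = $postObject->getFormAttributes(); // action URL, method, enctype
$inputs     = $postObject->getFormInputs();     // hidden fields incl. signature

// Render a form that posts the file straight to S3, bypassing EC2.
echo '<form action="' . $attributes['action'] . '" method="post" enctype="multipart/form-data">';
foreach ($inputs as $name => $value) {
    echo '<input type="hidden" name="' . $name . '" value="' . htmlspecialchars($value) . '">';
}
echo '<input type="file" name="file"><input type="submit" value="Upload"></form>';
```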
