How to integrate OpenCart with Amazon CloudFront (CDN)? - php

I have been trying for the past few days to integrate Amazon S3 and Amazon CloudFront with my website (hosted on Amazon EC2), developed using OpenCart. While searching, I found a lot of extensions, but not a single one fits the requirement.
According to the extensions, all data stays on local volume storage; you create a sub-domain whose document root points to the /image/ directory and serve the images from that sub-domain. But I do not see how the images ever get to Amazon S3. I might be missing something here, but below is what I want to implement.
What I want is to store all images and downloadables on Amazon S3 and serve them from S3 through Amazon CloudFront. When an admin uploads an image, it should be stored in Amazon S3 instead of local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After reviewing them, it looks impossible to implement what I want in the current structure. The options I see are either to write my own library and update every OpenCart file where images are used, or to use one of the existing extensions (which will cause issues with Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?

Searching the marketplace I found this solution: Amazon CloudFront / S3 Integration
The extension says:
Speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront.

Finally I found the way to do this. Basically there are two questions in my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice here is to mirror all directories and files inside '{$ROOT}/image' completely to S3. Ultimately the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't create any issues, as files are no longer saved on local storage.
To achieve this, one has to customize the application so that whenever the admin adds or updates an image, it is added/updated on S3 instead of local storage. Likewise, when images are requested on the storefront, they are pulled from S3 instead of local storage.
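As a minimal sketch of that customization (the bucket name my-shop-images and the region are placeholders, and the AWS SDK for PHP is assumed to be installed via Composer), the upload handler maps the local image path to an S3 key and calls putObject instead of writing to disk:

```php
<?php
// Pure helper: turn a local OpenCart image path such as
// "/var/www/image/catalog/demo/iphone_1.jpg" into the S3 key
// "catalog/demo/iphone_1.jpg" so the bucket mirrors the /image/ tree.
function s3_key_for_image(string $localPath, string $imageRoot): string
{
    $key = substr($localPath, strlen(rtrim($imageRoot, '/')) + 1);
    return str_replace('\\', '/', $key); // normalize Windows separators
}

// The upload itself (requires aws/aws-sdk-php; bucket and region are placeholders):
//
// require 'vendor/autoload.php';
// $s3 = new Aws\S3\S3Client(['version' => 'latest', 'region' => 'us-east-1']);
// $s3->putObject([
//     'Bucket'     => 'my-shop-images',
//     'Key'        => s3_key_for_image($file, DIR_IMAGE),
//     'SourceFile' => $file,
// ]);
```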
Integrate Amazon CloudFront
This has two options:-
2a. With S3 implemented - provide the S3 bucket as the origin in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used) - instead of an S3 bucket, give Amazon CloudFront the image directory URL of the application as the origin, for example www.example.com/image/. Then change the image URLs throughout the web application and the images will be served from the Amazon CloudFront URL.
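A sketch of the URL change itself (the CloudFront domain d1234abcd.cloudfront.net is a placeholder for your distribution's domain): only URLs under the local /image/ path are rewritten, everything else is left alone.

```php
<?php
// Rewrite an OpenCart image URL so the browser fetches it from CloudFront
// instead of the origin server. Non-image URLs pass through unchanged.
function cdn_image_url(string $url, string $siteBase, string $cdnBase): string
{
    $prefix = rtrim($siteBase, '/') . '/image/';
    if (strpos($url, $prefix) === 0) {
        return rtrim($cdnBase, '/') . '/image/' . substr($url, strlen($prefix));
    }
    return $url;
}
```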

The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack image and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe
How to set up an instance in Lightsail
How to select the free tier plan
Update system and PHP version in AWS Lightsail
Opencart installation steps in the AWS Lightsail LAMP stack
Create Static IP
Create DNS Zone
Add Name servers to your domain registrar
Create a database, database user and grant access
Install free Let’s Encrypt Certificate
Configure HTTP to HTTPS redirection for Opencart
Activate the SEO URL
How to set up FileZilla SFTP in AWS Lightsail to transfer files
phpMyAdmin access
How to upgrade to a higher Lightsail package?
How to set up the CDN?
Let us know if any questions or concerns.

Related

Store moodledata on S3 and manage using AWS SDK for PHP

I would like to store my moodledata folder in an S3 bucket and share it with multiple EC2 instances for autoscaling.
So, without using s3fs, I want to manage this moodledata in an S3 bucket using the AWS SDK for PHP.
I have searched many forums but did not find any solution using the AWS SDK for PHP.
There is a plugin, https://moodle.org/plugins/tool_objectfs, which almost does this. It does not replace the file system but shifts files to S3 based on criteria, for example if the file size is greater than 10 KB. You will still need a shared file system, but you can keep costs down by shifting old and large files off to S3. In AWS the best way to have a shared file system is the EFS service. Completely replacing the file system with S3 would be challenging, as S3 is not a file system but an object store.
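For illustration only (this is not the plugin's API, just the shape of such a policy), a size/age criterion like the one tool_objectfs applies could look like:

```php
<?php
// Decide whether a file should be shifted off the shared file system to S3:
// large files go immediately, small files go once they are old enough.
function should_offload_to_s3(int $sizeBytes, int $ageDays,
                              int $minSizeBytes = 10240, int $minAgeDays = 30): bool
{
    return $sizeBytes > $minSizeBytes || $ageDays >= $minAgeDays;
}
```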

How to back up an Amazon AWS website (Laravel) to an Amazon AWS S3 bucket?

It strikes me as something rather mundane, but so far I cannot find any reference to how I can back up my Amazon AWS hosted application to my AWS S3 bucket automatically on a regular basis.
Long story short, I have an AWS VPS with MySQL and WordPress installed, and am now looking to do weekly/monthly backups of the site, including the DB. This seems possible via a WP plugin called "UpdraftPlus", which allows backups to be created and pushed over to S3 buckets, but given that this is a rather simple feature, I am quite sure Amazon must already have a solution for this; I just can't seem to figure out how to do it.
I run both a WP site and a Laravel application, if that matters, so I could rely on either to assist with the backup.
Thanks for any advice on this.
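One way to script this yourself is a small PHP script run weekly from cron; this is a sketch under stated assumptions (mysqldump on the server's PATH, a bucket named my-backups, and the AWS SDK for PHP installed are all placeholders). Only the key-building helper is shown runnable; the dump and upload are sketched in comments:

```php
<?php
// Pure helper: build a dated S3 object key for the dump, e.g.
// "backups/wp/2024-01-01.sql.gz".
function backup_object_key(string $site, string $date): string
{
    return sprintf('backups/%s/%s.sql.gz', $site, $date);
}

// Run from cron, e.g.:  0 3 * * 0  php /var/www/backup.php
//
// shell_exec('mysqldump -u backup -pSECRET wordpress | gzip > /tmp/site.sql.gz');
// $s3 = new Aws\S3\S3Client(['version' => 'latest', 'region' => 'us-east-1']);
// $s3->putObject([
//     'Bucket'     => 'my-backups',
//     'Key'        => backup_object_key('wp', date('Y-m-d')),
//     'SourceFile' => '/tmp/site.sql.gz',
// ]);
```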

WordPress installed with Elastic Beanstalk overwrites uploaded images

I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy the instance with the EB scripts, the whole page gets replaced, including uploaded images attached to posts. After such an automatic deploy, posts have missing pictures. :)
I have tried to solve this:
1) I logged into the Amazon machine with SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite only part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I have not tested this yet. :) Do you know if this is a good approach?
3) Any other approach to this problem? How should it be organized on Amazon; should machine backups be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way in my opinion is to use a plugin that writes all media files to AWS S3, something similar to the Amazon S3 and CloudFront plugin.
Your log files should also be shipped to a remote syslog server, which you can either build yourself or use a third party for.
Google: Loggly, Logstash, Graylog, Splunk

PHP and Amazon S3 as a File Storage System for dynamic website

I have a dynamic website on which users can upload different files to be reviewed by different experts.
Users can upload/download their files to/from our server.
How can I use Amazon S3 and PHP to expand my storage? I already have an AWS account and have researched this on the net, but I still can't figure out how to make sure that every user can access only his own files, and that everything he uploads goes into a specific bucket under his username.
You shouldn't create a separate bucket for each user. You should use one bucket per application/environment. To work with Amazon S3 you can use the AWS SDK for PHP. Your file "structure" will look something like this: /{bucket}/{userid}/images/*, /{bucket}/{userid}/videos/*, etc.
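A sketch of that layout with the AWS SDK for PHP (the bucket name my-app-uploads and the expiry are placeholders): a pure helper builds the per-user key, and short-lived pre-signed URLs keep each user's files private to him.

```php
<?php
// Pure helper: build the per-user object key, e.g. user 42 uploading
// "avatar.png" as an image gets the key "42/images/avatar.png".
function user_object_key(int $userId, string $type, string $filename): string
{
    return sprintf('%d/%s/%s', $userId, $type, basename($filename));
}

// Keep the bucket private and hand out pre-signed URLs, so each user can
// only fetch keys under his own prefix (requires aws/aws-sdk-php):
//
// $cmd = $s3->getCommand('GetObject', [
//     'Bucket' => 'my-app-uploads',
//     'Key'    => user_object_key($userId, 'images', $filename),
// ]);
// $url = (string) $s3->createPresignedRequest($cmd, '+10 minutes')->getUri();
```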

Using AWS services to create subdomains on the fly to host a site on S3

INFO: I am working on a PHP app that lets users create HTML templates and publish them on the web. As the templates are static, I thought of using Amazon S3 to store them, since it can host static websites and has good overall infrastructure for the application.
QUERY: I am facing an issue while publishing to the web. I want each template to be published as a subdomain of my domain. For example, I own www.ABC.com; I ask the user to name the template, and if he names it mysite, I publish his template as mysite.ABC.com (and similarly for all users). I can store the template in the S3 bucket using putObjectFile in the AWS S3 API, but I am not sure how I can create a subdomain on the fly and publish to that subdomain (I want to automate the process for the user). Also, can I configure the bucket for static website hosting using the API?
Earlier I worked with cPanel, whose APIs allow us to create domains and FTP the content to them; I am not clear how I can achieve that here.
RESEARCH: What I have achieved so far: I have hosted a site using the S3 console, and using the AWS services I have moved the files to a bucket with the same name as the user's subdomain. I now want the bucket endpoint mapped to that subdomain.
REFERENCE: This website does the same; they create a directory-like structure and publish the website on the web. I don't know if they host it on Amazon, but I want to achieve something similar. I hope I am clear, and I would appreciate some guidance.
Thank you for the attention.
I researched a similar scenario some time ago but was unable to use S3 for this because of an S3 limitation.
To host a static website on S3 on a custom domain or subdomain, you need to create a bucket whose name matches that domain. And because each S3 account is limited to 100 buckets, you will only be able to host that many domains or subdomains on a single account.
Based on the use case you described, I suspect this S3 limitation will also force you to find another solution.
In my case, the solution was to set up an EC2 instance that proxies requests to S3 after rewriting them. For example if someone requests:
http://mysite.abc.com/file.html
that goes through our EC2 server where the request is rewritten and forwarded to S3 as something like:
http://ourbucket.s3.amazonaws.com/mysite.abc.com/file.html
UPDATE:
There are several proxy servers you could use for this, but I would recommend nginx, as it worked great in our case. To get started, check out the following nginx docs:
http://wiki.nginx.org/HttpRewriteModule
http://wiki.nginx.org/HttpProxyModule
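A sketch of such a rewrite-and-proxy server block (the bucket name ourbucket, the domain abc.com, and a wildcard DNS record *.abc.com pointing at the EC2 instance are all assumptions):

```nginx
server {
    listen 80;
    # Capture the subdomain, e.g. "mysite" from mysite.abc.com.
    server_name ~^(?<sub>.+)\.abc\.com$;

    location / {
        # Prefix the request path with the full host name, so
        # /file.html becomes /mysite.abc.com/file.html on the bucket.
        rewrite ^(.*)$ /$sub.abc.com$1 break;
        proxy_pass http://ourbucket.s3.amazonaws.com;
        proxy_set_header Host ourbucket.s3.amazonaws.com;
    }
}
```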
