I'm new to AWS and I'm running code in an Elastic Beanstalk environment. I want to regularly deploy code to the Beanstalk environment to push updates to all our running instances.
But I also have a WordPress blog for our main website that is separate from the main website code. I have already set up the RDS instance that WordPress uses. The problem is that every time I deploy code to our main Beanstalk environment, it overwrites the WordPress files we have locally. For example, if an author made a new post before I deployed the code, the WordPress files get overwritten and the new post's files (images and so on) are removed.
So my question is: how can I detach WordPress from our Beanstalk environment? I don't want to create a separate Beanstalk environment just for WordPress.
Is there any way I can use S3 buckets to host the WordPress files and somehow make them available in the Beanstalk environment running our main site, without creating a new environment? If there is, what happens to the dynamic files uploaded by users? Will WordPress save them to S3?
You should definitely separate WordPress from your application. They are different systems and there's no reason to run them on the same host.
There are some extensions for WordPress that can publish the WordPress site as static HTML, which can then be hosted from an Amazon S3 bucket. This makes the site read-only, so interactive features (e.g. search, eCommerce) won't work, but it's fine for normal blog pages.
WP Static HTML Output
Simply Static
WP-Static
If that's not suitable, simply run WordPress on a separate EC2 instance outside the Beanstalk environment. You might even consider using Amazon Lightsail.
The main problem seems to be that you are not using the right configuration for a WordPress + Elastic Beanstalk installation.
Elastic Beanstalk creates a new application version when you deploy.
Therefore you cannot access anything from the previous application version, including the uploads folder.
In conclusion, you need to detach dynamic files from the application level while keeping the database the same.
How? Use a mounted EFS file system and/or S3, in combination with a WordPress S3 offload plugin.
I am assuming you are using an RDS database that doesn't run on the host instance; if not, that is definitely not recommended. I can highly recommend following the best-practices step-by-step installation, including files, here.
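If you go the EFS route, the mount can be wired in through an .ebextensions config. The following is only a sketch: the file-system ID (fs-12345678), paths, and the amazon-efs-utils package are placeholders/assumptions for an Amazon Linux platform, so adjust for your own setup.

```yaml
# .ebextensions/storage-efs.config -- hypothetical sketch; fs-12345678 is a
# placeholder for your EFS file-system ID, and the paths are illustrative.
packages:
  yum:
    amazon-efs-utils: []

commands:
  01_mount_efs:
    # Mount the shared file system on the instance (idempotent).
    command: |
      mkdir -p /mnt/efs
      mountpoint -q /mnt/efs || mount -t efs fs-12345678:/ /mnt/efs

container_commands:
  01_link_uploads:
    # Point wp-content/uploads at the shared EFS directory so uploads
    # survive deployments and are shared across instances.
    command: |
      mkdir -p /mnt/efs/wp-uploads
      rm -rf wp-content/uploads
      ln -snf /mnt/efs/wp-uploads wp-content/uploads
```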
Related
I have been trying to integrate Amazon S3 and Amazon CloudFront with my website (hosted on Amazon EC2), developed using OpenCart, for the past few days. While searching, I found a lot of extensions, but not a single one fits the requirement.
According to the extensions, all data stays on local volume storage; you can create a sub-domain with its document root pointed at the /image/ directory and access the images from that sub-domain. But I do not see how the images get to Amazon S3 this way. I might be missing something here, but below is what I want to implement.
What I want is to store all images and downloadable files in Amazon S3 and retrieve them from S3 through Amazon CloudFront. When an admin uploads an image, it should be stored in Amazon S3 instead of on local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After reviewing them, it looks impossible to implement what I want in the current structure. The options I see are to either write my own library for this and update every OpenCart file where images are used, or use one of the existing extensions (which will cause issues with Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?
Searching the marketplace I found this solution: Amazon CloudFront / S3 Integration.
The extension says:
Speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront.
Finally I found the way to do this. Basically there are two questions within my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice is to keep all directories and files inside '{$ROOT}/image' fully in sync with S3. Ultimately the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't cause any issues, since files are no longer saved to local storage.
To achieve this, one has to customize the application so that whenever an admin adds or updates any images, they are added or updated in S3 instead of local storage. Likewise, when images are served on the website front end, they are pulled from S3 instead of local storage.
Integrate Amazon CloudFront
This has two options:
2a. With S3 implemented: provide the S3 bucket as the origin in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used): instead of an S3 bucket, give Amazon CloudFront the image directory URL of your application as the origin, for example www.example.com/image/. Then change the image URLs throughout the web application and the images will be served from the CloudFront URL.
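For option 2b in particular, "change the image URLs" just means prefixing the existing /image/ paths with the distribution's domain. A minimal shell sketch of that rewrite, where d1234abcd.cloudfront.net is a placeholder for a real CloudFront distribution domain (in OpenCart itself you would do the equivalent in the PHP templates/models):

```shell
# Rewrite local /image/ paths to a CloudFront URL (domain is a placeholder).
rewrite_to_cdn() {
  sed 's|="/image/|="https://d1234abcd.cloudfront.net/image/|g'
}

echo '<img src="/image/catalog/demo.jpg">' | rewrite_to_cdn
# prints: <img src="https://d1234abcd.cloudfront.net/image/catalog/demo.jpg">
```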
The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe:
How to set up an instance in Lightsail
How to select the free tier plan
Update system and PHP version in AWS Lightsail
Opencart installation steps in the AWS Lightsail LAMP stack
Create Static IP
Create DNS Zone
Add Name servers to your domain registrar
Create a database, database user and grant access
Install free Let’s Encrypt Certificate
Configure HTTP to HTTPS redirection for Opencart
Activate the SEO URL
How to setup FileZilla SFTP in AWS Lightsail to transfer files?
PHPMyAdmin access
How to upgrade to a higher Lightsail package?
How to set up the CDN?
Let us know if you have any questions or concerns.
It strikes me as something rather mundane, but so far I cannot find any reference to how I can back up my AWS-hosted application to my AWS S3 bucket automatically on a regular basis.
Long story short, I have an AWS VPS with MySQL and WordPress installed, and I'm now looking to do weekly/monthly backups of the site, including the DB. This seems to be possible via a WP plugin called "UpdraftPlus", which allows backups to be created and pushed to S3 buckets, but given that this is a rather simple feature, I am quite sure Amazon must already have a solution for it; I just can't seem to figure out how.
I run both a WP site and a Laravel application, if that matters, so I could rely on either to assist with the backup.
Thanks for any advice on this.
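In case it helps, the closest I've gotten without a plugin is a plain cron job that dumps the database and pushes it to S3. A hypothetical crontab entry (bucket name, database name and schedule are placeholders, and the instance needs the AWS CLI configured with credentials):

```
# Weekly, Sundays at 03:00: dump the DB, compress it, push it to S3.
0 3 * * 0 mysqldump --single-transaction wordpress | gzip > /tmp/wp-db.sql.gz && aws s3 cp /tmp/wp-db.sql.gz s3://my-backup-bucket/wp-db-$(date +\%F).sql.gz
```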
I'm trying to build autoscaled infrastructure for a WordPress site on Google Compute Engine. For WordPress, I want to use the LEMP stack (Ubuntu 18, Nginx, MySQL, PHP) but with a separate Cloud SQL instance as the database.
Here's my plan:
Create a Boot disk with the WordPress site installed & setup
Create an Instance template from that Boot Disk
Create Instance Groups for my required regions with the Template above.
Create an HTTP Load Balancer to autoscale the instances.
But I'm really confused by the first step: how should I set up the WordPress site to create an instance template? I don't know how to set up our apps on a custom image or boot disk.
Is the approach above the right one?
How can I set up my WordPress site to use in an Instance Template?
Help me, please!
Thanks in advance!
The autoscaling feature of managed instance groups is usually applicable to stateless VM instances. An autoscaler adds or removes instances from a managed instance group, so any data stored on the root disks of VMs can be lost.
As you specified in your plan, the stateful component of your LEMP stack (the database) has to be implemented outside of the managed instance group.
To create a template for the managed instance group, you can take the following steps:
Set up, configure and test your website on a single VM (the stateless components), configured to connect to the Cloud SQL instance (the stateful component).
Create a custom image from the VM's disk
gcloud compute images create [IMAGE_NAME] --source-disk [SOURCE_DISK] --source-disk-zone [ZONE]
Use this custom image to create an instance template for your managed instance group
These steps can be done by using gcloud command or Google Cloud Console.
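The third step would then look something like the following, using the same bracket-placeholder convention as the image command above (names in brackets stand for your own values):

```shell
gcloud compute instance-templates create [TEMPLATE_NAME] \
    --image [IMAGE_NAME] \
    --image-project [PROJECT_ID] \
    --machine-type [MACHINE_TYPE]
```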
I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy with the EB scripts, the whole page is replaced, including uploaded images that are attached to posts. After such an automatic deploy, posts have missing pictures :)
I tried to solve this:
1) I logged into the Amazon machine over SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite only part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I haven't tested this yet :). Do you know if this is a good approach?
3) Is there any other approach to this problem? How should I organize it on Amazon? Should machine backups be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way in my opinion is to use a plugin that writes all media files to Amazon S3, something similar to the Amazon S3 and CloudFront plugin.
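If you go with that plugin (since renamed WP Offload Media), its settings can be pinned in wp-config.php. This is only a sketch: the keys shown are illustrative, the credentials and bucket name are placeholders, and the exact constants may vary by plugin version, so check the plugin's documentation.

```php
// wp-config.php -- hypothetical offload settings (placeholders throughout).
define( 'AS3CF_SETTINGS', serialize( array(
    'provider'          => 'aws',
    'access-key-id'     => 'YOUR_ACCESS_KEY',  // placeholder
    'secret-access-key' => 'YOUR_SECRET_KEY',  // placeholder
    'bucket'            => 'my-media-bucket',  // placeholder
    'copy-to-s3'        => true,   // send new uploads to S3
    'serve-from-s3'     => true,   // rewrite media URLs to S3/CloudFront
) ) );
```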
Your log files should also be shipped to a remote syslog server, which you can either build yourself or get from a third party.
Google: Loggly, Logstash, Graylog, Splunk.
I am currently building a storage service, but I am very small and don't want to set up or pay for an Amazon S3 account. I already have my own hosting service, which I want to use. However, I want to make it simple to switch to Amazon S3 if the need arises. Therefore, I would essentially like an S3 'clone' on my server, which I can simply redirect to Amazon's servers at a later time. Is there any package that can do this?
EDIT: I am on a shared server where I cannot install software, so is there a simple PHP page that can do this?
Nimbus allows for that. From the FAQ:
Cumulus is an open source implementation of the S3 REST API. Some features, such as versioning and COPY, are not yet implemented, but some additional features are added, such as file system usage quotas.
http://www.nimbusproject.org/doc/nimbus/faq/#what-is-cumulus
http://www.ubuntu.com/cloud
You would need several host computers, but it works well.
I too have a backup service, but it's based on several 48 TB RAID arrays.