How to back up an Amazon AWS website (Laravel) to an Amazon S3 bucket? - php

It strikes me as something rather mundane, but so far I cannot find any reference to how I can back up my Amazon AWS hosted application to my AWS S3 bucket automatically on a regular basis.
Long story short, I have an AWS VPS with MySQL and WordPress installed and am now looking to do weekly/monthly backups of the site, including the DB. This seems to be possible via a WP plugin called "UpdraftPlus", which allows backups to be created and pushed over to S3 buckets, but given that this is a rather basic feature, I am quite sure Amazon must have a solution for this already; I just can't seem to figure out how to do it.
I run both a WP site and a Laravel application, if that matters, so I could rely on either to assist with the backup.
Thanks for any advice on this.
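For context, the kind of thing I'm imagining is a cron-driven script along these lines (just a sketch; the credentials, bucket and region below are placeholders):

```php
<?php
// Weekly backup sketch: dump the MySQL database, then push the archive
// to S3. Assumes `composer require aws/aws-sdk-php` and that mysqldump
// is on the PATH; credentials, bucket and region are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$dumpFile = '/tmp/site-backup-' . date('Y-m-d') . '.sql.gz';

// Dump and compress the database (user/password/db are placeholders).
exec('mysqldump -u backup_user -psecret my_database | gzip > ' . escapeshellarg($dumpFile));

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder region
]);

$s3->putObject([
    'Bucket'     => 'my-backup-bucket', // placeholder bucket
    'Key'        => 'backups/' . basename($dumpFile),
    'SourceFile' => $dumpFile,
]);
```

Scheduled from cron, that would cover the DB; on the Laravel side, packages like spatie/laravel-backup appear to wrap the same idea (database dump plus application files, pushed to any configured filesystem disk, including S3).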

Related

Resourcespace DAM system - setting external storage as AWS S3

I've been spending a few days trying to figure out how to set AWS S3 as external storage for ResourceSpace, and I've been getting more and more confused by this app.
I'm using the open-source version and trying to customize it to my needs.
I've been through the web app's lengthy documentation but couldn't find anything about setting storage (like other web apps out there). However, I found a feature called syncdir that sets an alternative external storage location (for backup), but not a true external storage; from the documentation, there doesn't seem to be a direct method to specify storage or integrate S3 with it.
I've tried the following:
I tried the generic "integrate AWS S3 with any PHP website" approach, changing the storage directory 'storagedir' and the directory 'syncdir' in the config.default file (I added the required S3 autoload file and added the AWS keys to the config file), but it's not working; the site is still storing files locally.
Note: I've successfully integrated AWS S3 before with the Laravel 5.7 & CodeIgniter 3 frameworks.
I tried adding the require for aws-autoload in the file where the uploading functions are, and tried to look for the code responsible for uploads, but the code confuses me as to where the upload functionality actually is (it's not a PHP function where $_FILES receives your upload).
I also moved the require for aws-autoload into include/general.php, but no luck.
I followed up with some forums on the matter, like:
using external storage
Amazon S3 integration
I'm assuming that with the config file storing the AWS credentials (and storage set to the S3 bucket URL), and with the aws-autoload included in the general/upload file, the app would automatically understand where it should upload; but no error or bug is reported to point me in the right direction.
Most of what I found relates to the paid version of the DAM system, where it seems to be already set up on Amazon.
Please advise; any help is appreciated.
I'm using WAMP on a Windows 10 PC, by the way.
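For reference, the generic "integrate S3 with any PHP website" pattern I'm referring to is roughly this (keys, bucket and paths below are placeholders):

```php
<?php
// Generic S3 upload pattern for a PHP app (not ResourceSpace-specific).
require 'vendor/autoload.php'; // the aws-autoload mentioned above

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1', // placeholder
    'credentials' => [
        'key'    => 'AWS_KEY_FROM_CONFIG',    // placeholder
        'secret' => 'AWS_SECRET_FROM_CONFIG', // placeholder
    ],
]);

// Push a locally stored file to the bucket.
$s3->putObject([
    'Bucket'     => 'my-bucket',
    'Key'        => 'filestore/photo.jpg',
    'SourceFile' => '/var/www/filestore/photo.jpg',
]);
```

This works fine in a framework where I control the upload code; the problem is that I can't find the equivalent place to call it from in ResourceSpace.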
Check out this discussion; it might help you:
https://groups.google.com/forum/#!topic/resourcespace/JT833klfwjc
It looks like this is still a work in progress, so you may see WIP code. You will find links to the code in the discussion above.

How to integrate OpenCart with Amazon CDN?

For the past few days I have been trying to integrate Amazon S3 & Amazon CloudFront with my website (hosted on Amazon EC2), developed using OpenCart. While searching, I found a lot of extensions, but not a single one fits the requirement.
According to the extensions, all data stays on local volume storage & you can create a sub-domain with its document root pointed at the /image/ directory & access the images from that sub-domain. But I do not see how the images ever get to Amazon S3 with that. I might be missing something, but below is what I want to implement.
What I want is to store all images & downloadables in Amazon S3 & retrieve them from Amazon S3 using Amazon CloudFront. When an admin uploads an image, it should be stored in Amazon S3 instead of local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After seeing the files, it looks impossible to implement what I want in the current structure. The solution I see is to either create my own library for this & update each & every file of OpenCart where images are used, or use one of the existing extensions (which will cause issues when using Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?
Searching the marketplace, I found this solution: Amazon CloudFront / S3 Integration
The extension says:
speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront.
Finally, I found the way to do this. Basically, there are two questions inside my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice for this is to sync all directories and files inside '{$ROOT}/image' completely with S3. Ultimately, the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't create any issues, since files are no longer saved on local storage.
To achieve this, one has to customize the application so that whenever an admin adds/updates any image, it gets added/updated in S3 instead of local storage. Likewise, when images are pulled into the website front end, they all get pulled from S3 instead of local storage.
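A rough sketch of that customization, as a hypothetical helper called from the admin upload path (the bucket, region and the helper itself are assumptions, not OpenCart APIs):

```php
<?php
// Hypothetical helper: after OpenCart's admin saves an image locally,
// mirror it to S3 and return the S3 key to store in place of the
// local path. Bucket and region are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

function push_image_to_s3(string $localPath, string $imageKey): string
{
    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1', // placeholder
    ]);

    $s3->putObject([
        'Bucket'     => 'my-opencart-images', // placeholder
        'Key'        => 'image/' . $imageKey,
        'SourceFile' => $localPath,
    ]);

    return 'image/' . $imageKey;
}
```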
Integrate Amazon CloudFront
There are two options here:
2a. With S3 implemented: just provide the S3 bucket ARN as the origin in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used): instead of the S3 bucket ARN, we give Amazon CloudFront the image directory URL of our application, for example www.example.com/image/, and that's it. Now change the image URLs throughout the web application and the images will be pulled from the Amazon CloudFront URL (see the sketch below).
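For option 2b, the URL swap itself is trivial; a sketch (the CloudFront domain below is a placeholder):

```php
<?php
// Hypothetical helper: rewrite a local image URL to the CloudFront
// distribution domain (d1234abcd.cloudfront.net is a placeholder).
function cdn_url(string $localUrl): string
{
    return str_replace(
        'https://www.example.com/image/',
        'https://d1234abcd.cloudfront.net/image/',
        $localUrl
    );
}

echo cdn_url('https://www.example.com/image/catalog/product.jpg');
// => https://d1234abcd.cloudfront.net/image/catalog/product.jpg
```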
The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe:
How to set up an instance in Lightsail
How to select the free-tier plan
Update the system and PHP version in AWS Lightsail
OpenCart installation steps in the AWS Lightsail LAMP stack
Create a static IP
Create a DNS zone
Add name servers to your domain registrar
Create a database and database user, and grant access
Install a free Let's Encrypt certificate
Configure HTTP-to-HTTPS redirection for OpenCart
Activate the SEO URL
How to set up FileZilla SFTP in AWS Lightsail to transfer files?
phpMyAdmin access
How to upgrade to a higher Lightsail package?
How to set up the CDN?
Let us know if you have any questions or concerns.

Deploy WordPress using S3 to be used in Elastic Beanstalk

I'm new to AWS; I'm running code in an Elastic Beanstalk environment. I want to regularly deploy code to the Beanstalk environment to push updates to all our running instances.
But I also have a WordPress blog for our main website, separate from the main website code. I have already set up the RDS instance to be used by WordPress. The thing is, every time I deploy code to our main Beanstalk environment, it overwrites the WordPress files that we have available locally. For example, if an author made a new post before I deployed the code, the WordPress files get overwritten, removing the new post's files (images and such).
So my question is: how can I detach WordPress from our Beanstalk environment? I don't want to create a separate Beanstalk environment just for WordPress.
Is there any way I can use S3 buckets to host the WordPress files and then make them available in the Beanstalk environment we're running for our main site, without creating a new environment? If so, what happens to dynamic files uploaded by users? Will WordPress save them in S3?
You should definitely separate WordPress from your application. They are different systems; there's no reason to run them on the same host.
There are some extensions for WordPress that can publish the WordPress site as static HTML, which can then be hosted from an Amazon S3 bucket. This makes the site read-only, so interactive features won't work (e.g. search, e-commerce), but it's fine for normal blog pages (see the sketch after the list below).
WP Static HTML Output
Simply Static
WP-Static
Question on Quora
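Once one of those plugins has produced a static export, pushing it to a bucket is straightforward with the AWS SDK for PHP's Transfer helper; a sketch (the export path and bucket are placeholders):

```php
<?php
// Sketch: recursively upload a static HTML export to S3.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Transfer;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder
]);

// Uploads every file under the local directory to the bucket.
$transfer = new Transfer($s3, '/var/www/static-export', 's3://my-static-site');
$transfer->transfer();
```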
If that's not suitable, simply run it on a separate EC2 instance outside of the Beanstalk environment. You might even consider using Amazon Lightsail.
The main problem seems to me to be that you are not using the right configuration for a WordPress + Elastic Beanstalk installation.
Elastic Beanstalk creates a new application version when you deploy.
Therefore you cannot access anything from the previous application version, including the uploads folder.
Conclusion: you need to detach dynamic files from the application level while keeping the database the same.
How? Use a mounted EFS and/or S3, in combination with the WordPress S3 offload plugin.
I am assuming you are using an RDS database that doesn't run on the host instance. If not, that is definitely not recommended. I can highly recommend following the best-practice step-by-step installation, including files, here.
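For a feel of what the offload approach does under the hood, here is a deliberately simplified sketch hooking WordPress's upload handler; real offload plugins also handle thumbnails, deletions and site-wide URL rewriting (the bucket, region and URL are placeholders):

```php
<?php
// Simplified offload sketch: after WordPress handles an upload, copy
// the file to S3 and point its URL at the bucket. Not a substitute
// for a real offload plugin.
require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;

add_filter('wp_handle_upload', function (array $upload): array {
    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1', // placeholder
    ]);

    $key = 'uploads/' . basename($upload['file']);

    $s3->putObject([
        'Bucket'     => 'my-media-bucket', // placeholder
        'Key'        => $key,
        'SourceFile' => $upload['file'],
    ]);

    // Serve the file from S3 instead of the instance's disk.
    $upload['url'] = 'https://my-media-bucket.s3.amazonaws.com/' . $key;

    return $upload;
});
```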

WordPress installed with Elastic Beanstalk overwrites uploaded images

I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy with EB scripts, the whole application is replaced, including uploaded images that were attached to posts. After such an automatic deploy, posts have missing pictures. :)
I tried to solve this:
1) I logged into the Amazon machine with SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite only part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I have not tested this yet. :) Do you know if this is a good approach?
3) Is there any other approach to this problem? How should it be organized on Amazon; should a machine backup be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way, in my opinion, is to use a plugin that writes all media files to AWS S3; something similar to the Amazon S3 and CloudFront plugin.
Your log files should also be shipped to a remote syslog server, which you can either build yourself or use a 3rd party for.
Google: loggly, logstash, graylog, splunk

How can I create an Amazon S3 clone on my host?

I am currently building a storage service, but I am very small and don't want to set up or pay for an Amazon S3 account. I already have my own hosting service which I want to use. However, I want to make it simple to switch to Amazon S3 if the need arises. Therefore, I would like to have essentially an S3 'clone' on my server, which I can simply redirect to Amazon's servers at a later time. Is there any package that can do this?
EDIT: I am on a shared server where I cannot install software, so is there a simple PHP page that can do this?
Nimbus allows for that. From the FAQ:
Cumulus is an open source implementation of the S3 REST API. Some
features such as versioning and COPY are not yet implemented, but some
additional features are added, such as file system usage quotas.
http://www.nimbusproject.org/doc/nimbus/faq/#what-is-cumulus
http://www.ubuntu.com/cloud
You would need several host computers, but it works well.
I too have a backup service, but it's based on several 48 TB RAID arrays.
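Because Cumulus speaks the S3 REST API, the client code can stay the same and only the endpoint changes when you later move to real S3. A sketch with the AWS SDK for PHP (the endpoint URL, bucket and credentials are placeholders):

```php
<?php
// Point the standard S3 client at an S3-compatible service; switching
// to real Amazon S3 later just means removing the 'endpoint' override.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version'                 => 'latest',
    'region'                  => 'us-east-1',                       // often ignored by clones
    'endpoint'                => 'http://storage.example.com:8888', // placeholder clone URL
    'use_path_style_endpoint' => true, // clones rarely support virtual-hosted buckets
    'credentials'             => [
        'key'    => 'CLONE_KEY',    // placeholder
        'secret' => 'CLONE_SECRET', // placeholder
    ],
]);

$s3->putObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'hello.txt',
    'Body'   => 'Hello from an S3-compatible endpoint',
]);
```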
