I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy a new version with the EB scripts, the whole application directory is replaced, including images that were uploaded and attached to posts. So after each automatic deploy, posts are missing their pictures :)
I tried to solve this:
1) I logged into the Amazon machine over SFTP, but my user ec2-user only has read access to the files, so I could not overwrite just part of the application while keeping the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I have not tested this yet :). Do you know if this is a good approach?
3) Is there any other approach to this problem? How should it be organized on Amazon; should a machine backup be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that any data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way in my opinion is to use a plugin that writes all media files to AWS S3, something similar to the Amazon S3 and Cloudfront plugin.
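To illustrate what such a plugin does under the hood, here is a minimal, hedged sketch that copies each new WordPress upload to S3 with the AWS SDK for PHP. The bucket name and region are placeholders, and a real plugin also rewrites the media URLs, which this sketch does not do.

    <?php
    // Minimal sketch only: copy each finished WordPress upload to S3.
    // Assumes the AWS SDK for PHP (composer require aws/aws-sdk-php) and
    // credentials from the instance profile; bucket/region are placeholders.
    require __DIR__ . '/vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',                 // placeholder region
    ]);

    add_filter('wp_handle_upload', function (array $upload) use ($s3) {
        $s3->putObject([
            'Bucket'     => 'my-wp-media-bucket',  // placeholder bucket
            'Key'        => 'uploads/' . basename($upload['file']),
            'SourceFile' => $upload['file'],       // absolute path of the new upload
        ]);

        return $upload;                            // leave WordPress's own handling intact
    });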
Your log files should also be shipped to a remote syslog server, which you can either build yourself or get from a 3rd party.
Google: loggly, logstash, graylog, splunk
I would like to store my moodledata folder in an S3 bucket and share it with multiple EC2 instances for autoscaling.
So, without using s3fs, I want to manage this moodledata in the S3 bucket using the AWS SDK for PHP.
I have searched many forums but did not find any solution using the AWS SDK for PHP. Regards,
There is a plugin, https://moodle.org/plugins/tool_objectfs, which almost does this. It does not replace the file system but shifts files to S3 based on criteria, for example if the file size is greater than 10 KB. You will still need a shared file system, but you would be able to keep the cost down by shifting old and large files off to S3. In AWS the best way to have a shared file system is the EFS service. It would be challenging to completely replace the file system with S3, as S3 is not a file system but an object store.
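For the "manage moodledata on S3 with the AWS SDK for PHP" part of the question, a rough sketch of the same shift-by-criteria idea is below. The bucket name, region, local moodledata path and the 10 KB threshold are assumptions for illustration; this is not tool_objectfs or a Moodle API.

    <?php
    // Rough sketch: push moodledata files larger than 10 KB to S3 with the
    // AWS SDK for PHP. Paths, bucket and region are placeholders.
    require __DIR__ . '/vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'ap-south-1']); // placeholder region

    $moodledata = '/var/moodledata';   // assumed local moodledata path
    $threshold  = 10 * 1024;           // 10 KB, the criterion used as an example above

    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($moodledata, FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        if ($file->isFile() && $file->getSize() > $threshold) {
            $key = ltrim(substr($file->getPathname(), strlen($moodledata)), '/');
            $s3->putObject([
                'Bucket'     => 'my-moodledata-bucket',  // placeholder bucket
                'Key'        => $key,
                'SourceFile' => $file->getPathname(),
            ]);
        }
    }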
I have been trying to integrate Amazon S3 and Amazon CloudFront with my website (hosted on Amazon EC2, built with OpenCart) for the past few days. While searching, I found a lot of extensions, but not a single one fits the requirement.
According to the extensions, all data stays on local volume storage; you create a sub-domain whose document root points to the /image/ directory and access the images from that sub-domain. But I do not see how the images ever get to Amazon S3. I might be missing something here. Below is what I want to implement.
I want to store all images and downloadables in Amazon S3 and serve them from Amazon S3 through Amazon CloudFront. When an admin uploads an image, it should be stored in Amazon S3 instead of local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After reviewing them, it looks impossible to implement what I want in the current structure. The solutions I see are either to write my own library for this and update every file of OpenCart where images are used, or to use one of the existing extensions (which will cause issues with Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?
Searching the marketplace, I found this solution: Amazon CloudFront / S3 Integration.
The extension says:
speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront.
Finally I found the way to do this. Basically, there are two questions in my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice here is to sync all directories and files inside '{$ROOT}/image' completely with S3. Ultimately, the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't create any issues, because files are no longer saved on local storage.
To achieve this, you have to customize the application so that whenever the admin adds or updates any image, it is added or updated in S3 instead of local storage. Likewise, when images are pulled into the website front end, they are pulled from S3 instead of local storage.
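As a rough, hedged sketch of that customization (not OpenCart's own API): mirror every image the admin saves under {$ROOT}/image to S3 with the AWS SDK for PHP, and serve the front end from the S3/CloudFront URL. Bucket, region and the helper name are placeholders.

    <?php
    // Hypothetical helper to mirror a saved OpenCart image to S3; the call site
    // (admin upload / image model code) is your own customization.
    require __DIR__ . '/vendor/autoload.php';

    use Aws\S3\S3Client;

    function mirror_image_to_s3(string $localPath, string $relativeKey): void
    {
        $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']); // placeholder region

        $s3->putObject([
            'Bucket'     => 'my-opencart-images',   // placeholder bucket
            'Key'        => 'image/' . $relativeKey,
            'SourceFile' => $localPath,
        ]);
    }

    // Example call from wherever the admin upload is handled:
    // mirror_image_to_s3(DIR_IMAGE . 'catalog/demo.jpg', 'catalog/demo.jpg');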
Integrate Amazon CloudFront
This has two options:-
2a. With S3 implemented - just provide the S3 bucket ARN in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used) - instead of the S3 bucket ARN, give Amazon CloudFront the image directory URL of our application, for example www.example.com/image/, and that's it. Then change the image URLs throughout the web application and the images will be served from the Amazon CloudFront URL.
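For either option, the "change the image URLs throughout the application" step boils down to a tiny helper along these lines; the CloudFront domain is a placeholder for your own distribution.

    <?php
    // Placeholder CloudFront domain; the distribution's origin is either the
    // S3 bucket (option 2a) or www.example.com/image/ (option 2b).
    function cdn_image_url(string $relativePath): string
    {
        return 'https://d1234567890.cloudfront.net/image/' . ltrim($relativePath, '/');
    }

    // echo cdn_image_url('catalog/demo.jpg');
    // => https://d1234567890.cloudfront.net/image/catalog/demo.jpg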
The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe
How to set up an instance in Lightsail
How to select the free tier plan
Update system and PHP version in AWS Lightsail
Opencart installation steps in the AWS Lightsail LAMP stack
Create Static IP
Create DNS Zone
Add Name servers to your domain registrar
Create a database, database user and grant access
Install free Let’s Encrypt Certificate
Configure HTTP to HTTPS redirection for Opencart
Activate the SEO URL
How to set up FileZilla SFTP in AWS Lightsail to transfer files?
PHPMyAdmin access
How to upgrade to a higher Lightsail package?
How to set up the CDN?
Let us know if you have any questions or concerns.
We have developed our application in Laravel and are now planning to move it to Amazon servers, where we want to separate the application logic from file storage. Basically, we want to move our whole application storage to cloud storage (Amazon S3) and the application logic to an Amazon EC2 server.
In our system, we manipulate many stored files locally (resize images, merge images, make thumbnails from videos, etc.). We will not store any files on the application server once we migrate to Amazon. So our concern is: how can we manipulate files on the cloud storage?
Earlier, all files were on the application server, so file manipulation was easy. After migrating the whole storage to the cloud, how can we manipulate files that live on the cloud storage while the manipulation logic resides on the application server?
Any response will be helpful.
Thanks in advance...
To manipulate an S3 file, I think we first need to download it locally. Once we have the file locally, we can apply any operation to it, and delete the local copy later.
Here are the documents on directly uploading from, or downloading to, a local file using Amazon S3:
https://aws.amazon.com/blogs/developer/transferring-files-to-and-from-amazon-s3/
https://docs.aws.amazon.com/aws-sdk-php/v3/guide/
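Putting that together, here is a minimal Laravel-style sketch of the download, manipulate, re-upload flow, assuming an "s3" disk is configured in config/filesystems.php; the object keys are placeholders and the thumbnail step uses plain GD.

    <?php
    // Minimal sketch: fetch an image from S3, build a 200px-wide thumbnail
    // with GD, and write the result back to S3. Keys are placeholders.

    use Illuminate\Support\Facades\Storage;

    $source = 'uploads/photo.jpg';          // placeholder source key on S3
    $target = 'uploads/thumbs/photo.jpg';   // placeholder destination key

    // 1. Download the object (use a temp file instead for very large objects).
    $contents = Storage::disk('s3')->get($source);

    // 2. Manipulate it locally.
    $image = imagecreatefromstring($contents);
    $thumb = imagescale($image, 200);

    ob_start();
    imagejpeg($thumb);
    $thumbData = ob_get_clean();

    // 3. Upload the result back to S3 and free the local copies.
    Storage::disk('s3')->put($target, $thumbData);
    imagedestroy($image);
    imagedestroy($thumb);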
Thanks
I'm new to AWS and I'm running code in an Elastic Beanstalk environment. I want to regularly deploy code to the Beanstalk environment to push updates to all our running instances.
But I also have a WordPress blog for our main website that is separate from the main website code. I have already set up the RDS instance used by WordPress. The thing is, every time I deploy code to our main Beanstalk environment, it overwrites the WordPress files we have locally. For example, if an author made a new post before I deployed the code, the WordPress files get overwritten and the new post's files (images and so on) are removed.
So my question is: how can I detach WordPress from our Beanstalk environment? I don't want to create a separate Beanstalk environment just for WordPress.
Is there any way I can use S3 buckets to host the WordPress files and then somehow make them available in the Beanstalk environment we're running for our main site, without creating a new environment? If there is, what happens to dynamic files uploaded by users? Will WordPress save them to S3?
You should definitely separate WordPress from your application. They are different systems, there's no reason to run them on the same host.
There are some extensions for WordPress that can publish the WordPress site as static HTML, which can then be hosted from an Amazon S3 bucket. This makes the site read-only, so interactive features won't work (e.g. search, eCommerce), but it's fine for normal blog pages.
WP Static HTML Output
Simply Static
WP-Static
Question on Quora
If that's not suitable, simply run it on a separate EC2 instance outside of the Beanstalk environment. You might even consider using Amazon Lightsail.
The main problem, it seems to me, is that you are not using the right configuration for a WordPress + Elastic Beanstalk installation.
Elastic Beanstalk creates a new application version when you deploy.
Therefore you cannot access anything from the previous application version, including the uploads folder.
In conclusion, you need to detach dynamic files from the application level while keeping the database the same.
How? Use either a mounted EFS and/or S3, in combination with a WordPress S3 offload plugin.
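As a rough illustration of the offload-plugin route: plugins like WP Offload Media can usually read their settings from wp-config.php, so the S3 credentials can come from Elastic Beanstalk environment properties instead of the database. The AS3CF_SETTINGS keys below follow that plugin's documented format, but treat them as an assumption and check the docs for your plugin and version.

    // Hedged sketch for wp-config.php, assuming a WP Offload Media-style plugin;
    // key names vary by plugin/version, so verify against its documentation.
    define('AS3CF_SETTINGS', serialize([
        'provider'          => 'aws',
        'access-key-id'     => getenv('AWS_ACCESS_KEY_ID'),
        'secret-access-key' => getenv('AWS_SECRET_ACCESS_KEY'),
        // or, when the instance profile should be used instead of keys:
        // 'use-server-roles' => true,
    ]));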
I am assuming you are using an RDS database that doesn't run on the host instance. If not, that is definitely not recommended. I highly recommend following the best-practices step-by-step installation, including the files, here.
I use this code
move_uploaded_file($file_tmp,$us_id.".jpg");
but after running this script there is no error, yet the file does not appear in the folder. How can I fix this?
Before this, I tested it on localhost and it worked.
You haven't specified what $file_tmp contains, but... in an Azure Web App, the root folder is at d:\home\site\wwwroot. And you'll find the %HOME% environment variable set to d:\home.
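Based on that, a hedged sketch of the upload script with an absolute destination path; the form field names and the uploadimg folder are assumptions taken from the discussion, not known values.

    <?php
    // Sketch: build the destination from the HOME environment variable instead
    // of relying on the current working directory. Field names are assumed.
    $file_tmp = $_FILES['file']['tmp_name'];   // assumed form field name
    $us_id    = $_POST['us_id'];               // assumed form field name

    $home   = getenv('HOME') ?: 'd:\\home';    // d:\home on Azure Web Apps
    $target = $home . '\\site\\wwwroot\\uploadimg\\' . $us_id . '.jpg';

    if (move_uploaded_file($file_tmp, $target)) {
        echo 'Saved to ' . $target;
    } else {
        echo 'Upload failed - check that the uploadimg folder exists and is writable.';
    }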
Edited based on #David's comments and Kudu's Azure runtime environment
In a cloud environment, saving files to the local filesystem of your Web App is not advised. If you simply upload your site through FTP, you might not have issues, but if you rely on continuous integration or automated deployment scenarios, your site folder might change or have its content overwritten.
That is why, for files that need to be accessed in the future or permanently saved, you might want to use something like Azure Blob Storage. Not only is it really cheap, but you can also put a CDN over it to improve file delivery.
Here is how to use it from PHP with the Azure SDK for PHP.
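A short, hedged sketch of that approach with the Azure Storage SDK for PHP (composer require microsoft/azure-storage-blob); the connection-string app setting, container name and file path are placeholders.

    <?php
    // Upload a local file to Azure Blob Storage; values below are placeholders.
    require __DIR__ . '/vendor/autoload.php';

    use MicrosoftAzure\Storage\Blob\BlobRestProxy;

    $blobClient = BlobRestProxy::createBlobService(
        getenv('AZURE_STORAGE_CONNECTION_STRING')   // assumed app setting name
    );

    $blobClient->createBlockBlob(
        'images',                                   // container (must already exist)
        'example.jpg',                              // blob name
        fopen('uploadimg/example.jpg', 'r')         // local file to upload
    );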
I tried your code from "Azure window sever can't upload file from php" in my test project, and it works perfectly on my side. Even without setting $us_id in your code, the picture is still uploaded to the uploadimg folder with the name .jpg.
I suspect that your Web App's disk space has reached the limit of your App Service plan.
Every App Service pricing tier has a disk space limit, which is shared by all the web apps in that App Service plan. You can check the metric on the dashboard page of your web app portal, e.g.
You can refer to https://azure.microsoft.com/en-us/pricing/details/app-service/ for details.