My web services are coded in PHP and hosted on AWS using Elastic Beanstalk. Whenever I want to edit any of my code, I connect to the EC2 instance with FileZilla and edit the files on the server. Since the environment is auto scaling, it automatically scales up and down depending on traffic, and when it scales, new instances are built from the latest zip file uploaded through the AWS dashboard; it does not keep any backup of the files uploaded via FileZilla. Is there any way I can get back the files I previously uploaded to the server with FileZilla?
I even tried connecting to the EC2 instance using SSH, but I could not find my previous files there either. Is it correct that the only proper way to upload an application is through the Elastic Beanstalk dashboard, and not by editing files with FileZilla?
You are right that it will pick up the version of the file originally deployed when the instances scale up or down.
The recommended workflow for this scenario is to upload the zip file through the AWS console using the "Upload and Deploy" button.
You can also use CLI tools or APIs such as:
awscli: http://docs.aws.amazon.com/cli/latest/reference/elasticbeanstalk/index.html
eb CLI: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/command-reference-eb.html
UpdateEnvironment API: http://docs.aws.amazon.com/elasticbeanstalk/latest/APIReference/API_UpdateEnvironment.html
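If you prefer to script the update rather than click through the console, here is a minimal sketch using the AWS SDK for PHP (the application, environment, bucket, and key names are placeholders, and the SDK is assumed to be installed via Composer):

```php
<?php
// Sketch: register an already-uploaded zip in S3 as a new application version,
// then point an existing Elastic Beanstalk environment at it.
// "my-app", "my-env", "my-bucket", and "my-app-v2.zip" are placeholder names.
require 'vendor/autoload.php';

use Aws\ElasticBeanstalk\ElasticBeanstalkClient;

$eb = new ElasticBeanstalkClient([
    'version' => 'latest',
    'region'  => 'us-east-1',   // adjust to your environment's region
]);

// Create a new application version from the zipped source bundle in S3.
$eb->createApplicationVersion([
    'ApplicationName' => 'my-app',
    'VersionLabel'    => 'v2',
    'SourceBundle'    => [
        'S3Bucket' => 'my-bucket',
        'S3Key'    => 'my-app-v2.zip',
    ],
]);

// Deploy that version to the environment (this is the UpdateEnvironment API).
$eb->updateEnvironment([
    'EnvironmentName' => 'my-env',
    'VersionLabel'    => 'v2',
]);
```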
Given that your current workflow already involves the console, uploading a new application version there is probably the simplest option.
Read the walkthrough here for more details:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/GettingStarted.Walkthrough.html
I am trying to create my first PHP app using GCS.
I have placed all my PHP app files (including my .yaml file) in the bucket created by App Engine, but I am struggling to access the files to deploy the app using the Cloud Shell in App Engine.
If I had them stored in Git I would have used the clone commands, but I assume that this would have placed the files in the same bucket that I manually uploaded the files to through the bucket browser?
I either can't find the correct file path, since the buckets do not appear to be part of my /home/username directory, or I am missing a key link such as an API to the Cloud Storage service (gsutil?).
Can anyone point me in the right direction?
JH
You shouldn't be manually placing any files into the app's staging GCS bucket to deploy an app to App Engine.
Instead, place the files (including your app.yaml) in a working directory on your machine (or cloud shell), then use the gcloud app deploy command to deploy them. You can see an example of how to deploy a PHP app here. The gcloud app deploy command is going to look for the source files on the local filesystem.
Note that using git isn't a requirement (though using some form of source control is always a good development practice); it is just how the files are copied onto the local machine in this example.
(Likewise, GCS buckets won't show up in the Cloud Shell filesystem: GCS is a blob storage service rather than a normal filesystem, so you would need to use a tool like gsutil or the GCS API to access it.)
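If you do need to pull the files you already uploaded back out of the bucket programmatically, a rough sketch with the google/cloud-storage PHP client could look like this (the bucket name and destination paths are placeholders):

```php
<?php
// Sketch: download objects from a GCS bucket to the local filesystem
// so they can be deployed from a working directory with `gcloud app deploy`.
// "my-app-bucket" and "app.yaml" are placeholder names.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient();               // uses application default credentials
$bucket  = $storage->bucket('my-app-bucket');

// Copy one object (e.g. your app.yaml) into the working directory.
$bucket->object('app.yaml')->downloadToFile(__DIR__ . '/app.yaml');

// Or list everything in the bucket and download each object.
foreach ($bucket->objects() as $object) {
    $object->downloadToFile(__DIR__ . '/' . basename($object->name()));
}
```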
I created multiple REST APIs for my website on a local server (WAMP) and now I am trying to move them to an online server (AWS Elastic Beanstalk). My question is: where do I build my database with the tables, rows, etc., just like in phpMyAdmin? I figured that "Upload and Deploy" means uploading the PHP files I made, but when I do I get an error saying "health degraded". So what I want to do is basically move my local server to an online one with AWS EB. I watched a bunch of videos and did a lot of research but can't seem to find the way to go about this.
My question is where do I build my database with the tables and rows etc just like in phpMyAdmin?
You can connect to the RDS instance created by Beanstalk with a database management client such as MySQL Workbench or HeidiSQL, using the endpoint (DNS name) created for the RDS instance. One challenge you will face is accessing the RDS instance from your client machine, since it is not good practice to make the database publicly accessible. You can create an EC2 instance (Windows, or Linux with a GUI) inside the same VPC, connect to it (Remote Desktop or SSH), and install the tools there, so that you connect to the RDS instance from inside that server.
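Once the application itself is running on Beanstalk, the PHP code can connect to the same RDS instance using the RDS_* environment variables that Elastic Beanstalk injects when the database is provisioned through it. A minimal sketch:

```php
<?php
// Sketch: connect to the Beanstalk-provisioned RDS instance using the
// environment variables Elastic Beanstalk sets for the PHP platform
// (RDS_HOSTNAME, RDS_PORT, RDS_DB_NAME, RDS_USERNAME, RDS_PASSWORD).
$dsn = sprintf(
    'mysql:host=%s;port=%s;dbname=%s;charset=utf8mb4',
    $_SERVER['RDS_HOSTNAME'],
    $_SERVER['RDS_PORT'],
    $_SERVER['RDS_DB_NAME']
);

try {
    $pdo = new PDO($dsn, $_SERVER['RDS_USERNAME'], $_SERVER['RDS_PASSWORD'], [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);
} catch (PDOException $e) {
    http_response_code(500);
    exit('Database connection failed: ' . $e->getMessage());
}
```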
I figured that "upload and deploy" means upload the php file i made
but when i do i get an error saying "health degraded"
To understand the structure required of the code inside the zip file, I would recommend creating a Beanstalk environment from one of the sample applications available in Beanstalk and downloading the sample artifact (zip file) from S3, so that you can compare your own project structure against it.
If you prefer to go through the documentation, you can refer to this. If nothing goes well, connect to the EC2 instance provisioned by Elastic Beanstalk (Remote Desktop for Windows, or SSH for Linux) and investigate the deployed artifacts.
I have an image upload in my Laravel application and would like those images to be uploaded to a shared network drive located on an external Windows computer.
Is it somehow possible to mount a specific folder on my shared network drive as Laravel Storage? Laravel 5.4's manual says that only S3 and Rackspace are included as built-in drivers.
Thanks in advance.
I had a similar requirement and solved it by mounting the network share to a folder (or you could mount to a drive) in the local file system. There are plenty of guides on the net for doing this depending on your server operating system. You can then use the "local" file system driver to access the files.
Note that this means some sort of script on the server needs to handle connecting with the required credentials, rather than the Laravel application, which may or may not be suitable in your environment.
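As a rough sketch of that approach (the mount point /mnt/images and the disk name network_share are assumptions), you would register the mounted folder as an extra disk in config/filesystems.php and then use the normal Storage API:

```php
// config/filesystems.php -- sketch: expose the mounted network share as a disk.
// '/mnt/images' is a placeholder for wherever you mounted the share.
'disks' => [
    // ...existing disks...
    'network_share' => [
        'driver' => 'local',
        'root'   => '/mnt/images',
    ],
],
```

```php
// Somewhere in a controller: store an uploaded image on the share.
Storage::disk('network_share')->putFile('uploads', $request->file('image'));
```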
You can use a different driver to store your uploaded media. For example, if you want to use Google Drive as your storage disk, you just need to follow the steps below:
1) Pull in a filesystem adapter for Google Drive
2) Set up config/filesystems.php
3) Set up a service provider
4) Set up .env with the client ID and authentication details, and you're done.
You can check the details of how to set up Google Drive in this example code.
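As a rough sketch of steps 2 and 3, assuming one of the community Flysystem Google Drive adapters (the exact package, class names, and config keys may differ for the adapter you pull in):

```php
// config/filesystems.php -- sketch of a 'google' disk; the matching
// GOOGLE_DRIVE_* keys would go into .env (step 4).
'google' => [
    'driver'       => 'google',
    'clientId'     => env('GOOGLE_DRIVE_CLIENT_ID'),
    'clientSecret' => env('GOOGLE_DRIVE_CLIENT_SECRET'),
    'refreshToken' => env('GOOGLE_DRIVE_REFRESH_TOKEN'),
    'folderId'     => env('GOOGLE_DRIVE_FOLDER_ID'),
],
```

```php
// In a service provider's boot() method (step 3) -- sketch, assuming the
// nao-pon/flysystem-google-drive adapter and google/apiclient packages.
Storage::extend('google', function ($app, $config) {
    $client = new \Google_Client();
    $client->setClientId($config['clientId']);
    $client->setClientSecret($config['clientSecret']);
    $client->refreshToken($config['refreshToken']);

    $service = new \Google_Service_Drive($client);
    $adapter = new \Hypweb\Flysystem\GoogleDrive\GoogleDriveAdapter($service, $config['folderId']);

    return new \League\Flysystem\Filesystem($adapter);
});
```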
I have an Azure virtual machine which has more than 30 GB of images on its drive. There is also a MySQL database running on it, which I manage with MySQL Workbench.
I want to use this database and these images in my Azure-hosted cloud app (Web + MySQL).
My PHP web app pulls images from a folder on the VM's drive and shows the corresponding record for each image from the MySQL database that is also on the VM. How can I do that?
You cannot directly request the image files on the VM from the web application. However, there is a workaround that leverages Azure File Storage: store the image files in a shared folder in File Storage, and then access the files via the REST API from PHP.
You can follow https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-files#mount-the-file-share to mount the file share in your VM.
Additionally, do not forget to allow port 445 outbound in the VM's firewall and in the Network Security Group if you are using a VM in the ARM deployment model.
When this succeeds, the file share shows up in the VM like a normal file system.
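Once the share is reachable, serving one of the images from PHP is straightforward. A minimal sketch that reads from the mounted share (the mount path Z:\images and the query parameter name are placeholders):

```php
<?php
// Sketch: stream an image from the mounted Azure file share.
// 'Z:\\images' is a placeholder for wherever the share is mounted.
$shareRoot = 'Z:\\images';
$file      = isset($_GET['file']) ? basename($_GET['file']) : ''; // basename() avoids path traversal
$path      = $shareRoot . DIRECTORY_SEPARATOR . $file;

if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit('Image not found');
}

header('Content-Type: ' . mime_content_type($path));
header('Content-Length: ' . filesize($path));
readfile($path);
```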
Can anyone suggest what configuration needs to be done to run a simple PHP file in an Azure cloud service?
I am able to view my simple PHP file in an Azure Web Site, but I can't view the same file in an Azure cloud service.
Can someone give me some suggestions?
When you say "I am able to view my simple php file in Azure Web Site, But i cant view the same file in Azure Cloud service.", I believe you mean you can browse your PHP site in Windows Azure Websites however not with Windows Azure Service configured as something.cloudapp.net. Isn't it?
As you asked for suggestions, here are some steps you can use to troubleshoot the problem:
When you access your PHP file, i.e. servicename.cloudapp.net/phpinfo.php, do you get a file-not-found error or an error related to the PHP runtime? I ask because you need to package the PHP runtime as well as all of your PHP files before deploying. If any important content is missing, your site will not run, so make sure that all the content specific to your application is in the package you deploy to the Windows Azure cloud service.
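For reference, the phpinfo.php mentioned above only needs to be a one-line script; if even this does not render, the PHP runtime itself is most likely missing from your package:

```php
<?php
// phpinfo.php -- smoke test: renders the PHP configuration page if the
// PHP runtime is correctly packaged and wired up in the cloud service.
phpinfo();
```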
To verify that you have all the contents in your package, rename the .cspkg file to .zip and unzip it. After that, rename the .cssx file inside to .zip as well and unzip it to see whether all the contents are there.
You can enable RDP access to your Azure VM and log in to check for consistency.
Also, for Azure Cloud Services you can use this documentation: http://azurephp.interoperabilitybridges.com/articles/packaging-applications