I have developed an Android app that performs GET/PUT operations against a test web server to fetch and store data in JSON format. For PUT, I use a PHP script on the web server, passing values as URL parameters. Now, to scale the solution, I am exploring AWS. How can I achieve this PHP-script-based logic handling with AWS S3 or another cloud-based storage service?
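For reference, a minimal sketch of the kind of PHP endpoint I mean (the file and parameter names here are just illustrative, not my actual setup):

    <?php
    // Minimal sketch (illustrative names): GET returns the stored JSON,
    // PUT stores a value passed as a URL parameter.
    $store = __DIR__ . '/data.json';
    header('Content-Type: application/json');

    if ($_SERVER['REQUEST_METHOD'] === 'PUT' && isset($_GET['payload'])) {
        file_put_contents($store, $_GET['payload']);
        echo json_encode(['status' => 'ok']);
    } else {
        echo file_exists($store) ? file_get_contents($store) : json_encode(new stdClass());
    }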
I would recommend using AWS Lambda. It doesn't require servers to be hosted and maintained, which is a big headache. With AWS Lambda you can get your web application running in a few minutes. Your PHP logic can be hosted as a service (with some extra work, since PHP is not a natively supported runtime; see the last link below), and AWS will take care of its scalability and availability.
https://www.youtube.com/watch?v=eOBq__h4OJ4
https://aws.amazon.com/serverless/build-a-web-app/
https://aws.amazon.com/blogs/compute/scripting-languages-for-aws-lambda-running-php-ruby-and-go/
You can use AWS S3 for storing your static files such as images, CSS, JavaScript, etc.
I have a MySQL database created in Amazon RDS. I have also created a PHP class that connects to that database and returns web service responses in JSON format according to the request, which I use for my mobile and Angular applications.
Currently I am hosting that PHP file on one of my GoDaddy servers and calling it from the app(s).
My question is: can I use AWS Lambda for this purpose (for deploying the PHP file)? In some tutorials I found that Lambda does not support PHP.
The PHP file provides the following functionality:
Creating various JSON files from MySQL tables.
Uploading images to an Amazon S3 bucket.
Sending push notifications (FCM) to Android/iOS devices.
AWS Lambda currently does not support PHP natively.
You could use any of the supported languages for your purpose.
For push notifications you could use Amazon SNS; a sketch follows after the link below.
Read more: https://aws.amazon.com/lambda/faqs/
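If it is your existing PHP server (rather than Lambda) that sends the notifications, a minimal, hedged sketch of publishing through SNS with the AWS SDK for PHP v3 might look like this; the region and endpoint ARN are placeholders for values from your own SNS platform application:

    <?php
    // Hedged sketch: publishing a push notification through Amazon SNS with
    // the AWS SDK for PHP v3. The TargetArn is a placeholder; it comes from
    // registering the device's FCM/APNs token with an SNS platform application.
    require 'vendor/autoload.php';

    use Aws\Sns\SnsClient;

    $sns = new SnsClient([
        'version' => 'latest',
        'region'  => 'us-east-1', // assumption: adjust to your region
    ]);

    $sns->publish([
        'TargetArn' => 'arn:aws:sns:us-east-1:123456789012:endpoint/GCM/my-app/device-id', // placeholder
        'Message'   => json_encode(['title' => 'Hello', 'body' => 'Test notification']),
    ]);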
Can we upload image files directly to a GCP Compute Engine instance using PHP?
I know that it can be done using SSH on the GCP backend.
I have a solution that works on my local system to upload files, but it does not work on GCP.
Based on initial research, it appears that GCP does not allow direct file upload.
I need confirmation, though; or, if there is a way in which file upload can work, please enlighten me.
I do not think it can be done directly through PHP; through SSH, yes.
You can upload files with PHP.
If you want to upload to the server's local disk, things are a bit simpler. Just ensure you have PHP configured for uploads: PHP File Upload
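A minimal sketch of such a handler, assuming a form field named "file" and an existing writable uploads/ directory:

    <?php
    // Minimal PHP upload handler that saves to the server's local disk.
    if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
        $target = __DIR__ . '/uploads/' . basename($_FILES['file']['name']);
        if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
            echo 'Uploaded to ' . $target;
        } else {
            http_response_code(500);
            echo 'Move failed';
        }
    } else {
        http_response_code(400);
        echo 'No file received';
    }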
If you want to upload to Google Storage, say to a bucket, you need to authenticate and get access there. The Google Cloud SDK can provide this. Then you can install the API Client Libraries for PHP and optional Cloud SDK components. That way you can either call gcloud shell commands or use the API. The Cloud SDK provides transparent connection and authentication for you so that you can upload to Google Storage (a sketch follows the links below).
Please see:
Cloud SDK
Google API Client Libraries
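As an illustration of the Google Storage approach above, here is a hedged sketch using the official google/cloud-storage PHP client; the bucket name, file path, and object name are placeholders, and authentication is assumed to come from Application Default Credentials set up through the Cloud SDK:

    <?php
    // Hedged sketch: uploading a file to a Google Cloud Storage bucket with
    // the official PHP client (composer require google/cloud-storage).
    // Credentials come from Application Default Credentials, e.g. after
    // running: gcloud auth application-default login
    require 'vendor/autoload.php';

    use Google\Cloud\Storage\StorageClient;

    $storage = new StorageClient();            // picks up default credentials
    $bucket  = $storage->bucket('my-bucket');  // placeholder bucket name

    $bucket->upload(
        fopen('/path/to/local/file.jpg', 'r'), // placeholder path
        ['name' => 'uploads/file.jpg']         // placeholder object name
    );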
I've developed an Android app that interacts with my database using some PHP scripts (one for each function of my app) that return a JSON object with response data.
Now I need to build a website that performs the same tasks as my app, and I would like to tidy up my server code in the process.
Should I keep my app's PHP scripts separate from the website's scripts (I'm planning to use a PHP framework to develop it), or is there a different way to do it?
No! The same scripts will work for all platforms.
If you follow proper protocols you will be good :)
Use Rest Console or a similar tool to test your web service in the browser.
If you are able to get a JSON response, then it is good for all platforms.
If you want to distinguish between platforms and devices on the server, that can be handled with a user-agent check on the server side.
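For example, a simple (and deliberately rough) user-agent check in PHP could look like this; the substring matches are heuristic, not exhaustive:

    <?php
    // Rough sketch: branch by platform based on the User-Agent header.
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

    if (stripos($ua, 'Android') !== false) {
        $platform = 'android';
    } elseif (stripos($ua, 'iPhone') !== false || stripos($ua, 'iPad') !== false) {
        $platform = 'ios';
    } else {
        $platform = 'web';
    }

    header('Content-Type: application/json');
    echo json_encode(['platform' => $platform, 'data' => []]);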
I am developing PHP plugins for CMS systems that at the moment communicate with my LAMP (PHP server) setup. As I am about to rewrite my server and PHP plugins, I am looking for a way to bypass server configuration, maintenance, and so on.
The server receives JSON, saves information from it to my MySQL database, makes new JSON calls to external APIs, handles their responses, and saves parts of them to the database. It then merges PDF files from the different APIs and creates a final JSON response for the CMS plugins.
My question is in regard to a big update of my modules: is there a setup that allows me to discard my LAMP setup and use a cloud service instead? I have looked at Apigee and Parse, but I don't know whether they can make external API calls and handle the APIs' responses.
If this can be done, is it done using Node.js?
Thanks.
Certainly, Apigee can make outbound calls, either through our policy-based proxies or with a Node-based proxy. Persistence of data can be accomplished through our KVM (key/value map) policies.
You can try it out with the free offering and see if it makes sense for your use case.
So you want standard website hosting with a MySQL database?
Any web host can do this for you. They manage the server, handle updates, etc. You just run your code in your own folder, set up your domain, and connect to the database that they set up.
How much traffic are you handling? Do you need a whole server? A small one or a giant one? Failover? Backups?
You should also look into Application Hosting with one of the big providers if you are worried about scaling.
http://aws.amazon.com/application-hosting/
http://www.rackspace.com/saas
Hi, I'm really new to Amazon S3. I want to download a file to my server's storage (which is in S3) from another given S3 location (credentials will be provided). This must be done in a PHP (CakePHP 1.2) method, and there is no user interface for it (it could probably be a cron job). I couldn't find any good article on this by googling. Is there any sample code for doing this?
Check out the AWS SDK for PHP.
If I understand you correctly, you want to copy an Amazon S3 object (file) from one AWS account to a different AWS account without downloading and uploading it through a separate system (apparently an Amazon EC2 instance)?
It is possible to copy an object within a single account by means of copyObject(), but cross-account operations aren't supported by this API (nor, as far as I know, by the equivalents for other AWS resources, which is likely a deliberate decision to ease and streamline the security architecture and process).
So while your use case is sound, there is no solution other than channeling this process through your server, i.e. downloading from the source account and uploading to the target account.
This shouldn't be much of a problem cost- or performance-wise, though, because "there is no Data Transfer charge between Amazon EC2 and other Amazon Web Services within the same region (i.e. between Amazon EC2 US West and Amazon S3 in US West)" (see the Data Transfer section of Amazon EC2 Pricing), and these operations will take advantage of Amazon's decent internal network infrastructure (rather than crossing the public internet).
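A hedged sketch of that download-then-upload approach with the AWS SDK for PHP v3 (the question mentions CakePHP 1.2, so an older SDK version may apply in practice); the buckets, keys, and credentials are placeholders:

    <?php
    // Hedged sketch: copy an S3 object across accounts by downloading from
    // the source account and uploading to the target account. Two clients,
    // each with its own credentials. All names below are placeholders.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $source = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-west-1',
        'credentials' => ['key' => 'SOURCE_KEY', 'secret' => 'SOURCE_SECRET'],
    ]);
    $target = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-west-1',
        'credentials' => ['key' => 'TARGET_KEY', 'secret' => 'TARGET_SECRET'],
    ]);

    // Download from the source account...
    $object = $source->getObject([
        'Bucket' => 'source-bucket',
        'Key'    => 'path/to/file.pdf',
    ]);

    // ...and upload to the target account.
    $target->putObject([
        'Bucket' => 'target-bucket',
        'Key'    => 'path/to/file.pdf',
        'Body'   => $object['Body'],
    ]);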