Access "data" folder from another OpenShift application - php

I have an existing PHP app running on an OpenShift application which saves some files into the "DATA" directory.
Now I am migrating the same code to Node.js and putting it on another OpenShift application (in the same OpenShift account and domain).
What I would like to achieve is to have the PHP application write files into the "DATA" dir of the Node.js application, so the existing, working PHP code keeps running and creating files while the Node.js app consumes those files.
In the future I plan to migrate the remaining PHP code to Node.js as well.
Thank you!
Best regards,
Camillo

This isn't going to be possible. The DATA directory is local to the OpenShift gear. You can't even share a DATA directory between multiple gears of a single scaled application. You would probably be better off switching to an external storage service such as Amazon S3 or Dropbox that both applications can access.
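A minimal sketch of the S3 approach, assuming the AWS SDK for PHP v3 is installed via Composer and credentials are in the environment; the bucket name, region, and the `shared_key_for()` helper are placeholders, not anything from the original apps:

```php
<?php
// Both apps agree on a key convention instead of sharing a DATA dir.
// Hypothetical helper: maps an app-local file path to a shared S3 key.
function shared_key_for(string $filename): string
{
    return 'shared-data/' . basename($filename);
}

// Sketch of the PHP side. Assumes: composer require aws/aws-sdk-php,
// and AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY set in the environment.
function push_to_shared_storage(string $localPath): void
{
    $s3 = new Aws\S3\S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',        // placeholder region
    ]);
    $s3->putObject([
        'Bucket'     => 'my-shared-bucket',  // placeholder bucket
        'Key'        => shared_key_for($localPath),
        'SourceFile' => $localPath,
    ]);
}
```

The Node.js app would then read the same keys with the AWS SDK for JavaScript, using the same `shared-data/` prefix.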

Related

Creation of large files in Google App Engine flexible using PHP

I have a project that needs to output hundreds of photos from one template file, compress them into a .zip file, and push the archive to the customer's browser. After that, the .zip file can be deleted.
Google App Engine (PHP) does not allow you to write files like you would in a standard web server.
How can this be accomplished with GAE flexible?
As you already know, App Engine flexible does not let you write files to the local filesystem even though it runs on a VM. The reason is that your app runs across multiple Docker containers, so there is no guarantee that a file written by one instance will be found by another.
An alternative is to change your workflow a bit and use Cloud Storage as an intermediary. You can send the photos directly to Cloud Storage, and users will be able to download them directly from Cloud Storage. Here you have a guide on how to achieve this from App Engine flex for PHP.
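As an illustration of that workflow, a sketch that builds the archive in the instance's writable temp directory and then hands it to Cloud Storage; the bucket name is a placeholder, and the upload step assumes the google/cloud-storage Composer package:

```php
<?php
// Build the .zip in the instance's temp dir (ephemeral, but fine as
// scratch space on App Engine flex). Returns true on success.
function build_zip(array $files, string $zipPath): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    foreach ($files as $file) {
        $zip->addFile($file, basename($file));
    }
    return $zip->close();
}

// Sketch of the upload step; assumes composer require google/cloud-storage
// and default application credentials. Bucket name is a placeholder.
function upload_zip(string $zipPath): void
{
    $storage = new Google\Cloud\Storage\StorageClient();
    $bucket  = $storage->bucket('my-photo-exports');
    $bucket->upload(fopen($zipPath, 'r'), ['name' => basename($zipPath)]);
    unlink($zipPath);  // local scratch copy is no longer needed
}
```

The customer then downloads the .zip straight from Cloud Storage (for example via a signed URL), and you can delete the object afterwards.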

How can I deploy my PHP application on Google cloud Platform?

I have previously developed a website using PHP and MySQL, and now I want to deploy it on Google Cloud Platform.
It would be great if someone could point out the steps for the points below:
1. Installing PHP 7.x
2. Creating .yaml files
3. Creating a MySQL database
4. Uploading my PHP files
5. FTP setup
Thank you
You can start setting up your environment as the documentation describes.
For creating .yaml files there is also documentation by Google.
To create your MySQL database, check this documentation about creating a Cloud SQL instance.
To upload your PHP files, follow the documentation and learn more about Cloud Storage and how to upload objects.
About the FTP setup: can you specify what you are trying to achieve with it? If you want to transfer files, see again the documentation from point 4.
You can find a large set of example applications for deploying to App Engine here, or on cloud.google.com.
Try Google App Engine: it is the easiest way to deploy an app without installing LAMP and FTP. Without MySQL it is even free! See the quickstart: https://cloud.google.com/appengine/docs/standard/php7/quickstart
Answering your question in the comment;
I've created the .yaml file, created the MySQL database, and uploaded all PHP files to the bucket as shown in step 4. Now how can I create a URL that calls Index.php?
If you deploy your app to App Engine you don't necessarily need to upload all your application files to Cloud Storage, because App Engine will serve your static files from a managed static file server. You only need to set this up in the app.yaml file.
I suggest you take a look at this sample application's files, so you can get a good idea of how to set up all the configuration files needed to deploy your app to the App Engine standard environment and connect to Cloud SQL.
App Engine will create a default URL for your app under the appspot.com domain; check here for details.
For more information on how you can connect to Cloud SQL from GAE standard take a look here.
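As an illustration, a minimal app.yaml for the PHP standard runtime might look like the following; the runtime version and the `assets` directory are assumptions, and with `script: auto` dynamic requests are routed to your front controller (index.php by default):

```yaml
runtime: php73

handlers:
  # Serve static assets straight from the managed static file server.
  - url: /assets
    static_dir: assets

  # Route everything else to the front controller (index.php).
  - url: /.*
    script: auto
```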
Just follow the steps
here
They will help you set up a LAMP server and upload files to the server using FTP (which you can connect to using the SFTP key-file method), and this document also helps you set up a MySQL database.

How to manipulate cloud server files using Laravel?

We have developed our application in Laravel and now we are planning to move it to Amazon servers, separating the application logic from file storage. Basically, we want to move the whole application's storage to a cloud service (Amazon S3) and the application logic to an Amazon EC2 server.
In our system we manipulate many stored files locally (resize images, merge images, make thumbnails from videos, etc.). We will not store any files on the application server once we migrate to Amazon. So our concern is: how can we manipulate files held in cloud storage?
Earlier, all files were present on the application server, so file manipulation was easy to process; after migrating the whole storage to the cloud, how can we manipulate files that are on the cloud server while the manipulation logic resides on the application server?
Any response will be helpful.
Thanks in advance...
To manipulate an S3 file, I think we first need to download the file locally. Once we have the file locally we can apply any operation to that particular file, and delete the local copy afterwards.
Here are the documents on uploading from and downloading to a local file with Amazon S3:
https://aws.amazon.com/blogs/developer/transferring-files-to-and-from-amazon-s3/
https://docs.aws.amazon.com/aws-sdk-php/v3/guide/
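The download-manipulate-reupload pattern above can be sketched like this; the resize step uses plain GD so it is framework-agnostic, while the surrounding S3 calls use Laravel's Storage facade (the `s3` disk and the `thumbs/` prefix are placeholders):

```php
<?php
// Resize an image file on local disk to a given width, keeping the
// aspect ratio. Returns the new height, or false on failure.
function resize_to_width(string $path, int $width)
{
    $src = imagecreatefromstring(file_get_contents($path));
    if ($src === false) {
        return false;
    }
    $dst = imagescale($src, $width);   // GD preserves the aspect ratio
    imagejpeg($dst, $path);
    return imagesy($dst);
}

// Sketch of the workflow in a Laravel app; assumes the 's3' disk is
// configured in config/filesystems.php.
function make_thumbnail(string $s3Path): void
{
    $tmp = tempnam(sys_get_temp_dir(), 'img');
    file_put_contents($tmp, \Illuminate\Support\Facades\Storage::disk('s3')->get($s3Path));
    resize_to_width($tmp, 200);
    \Illuminate\Support\Facades\Storage::disk('s3')->put('thumbs/' . basename($s3Path), fopen($tmp, 'r'));
    unlink($tmp);                      // local copy is no longer needed
}
```

The same shape works for merging images or extracting video thumbnails: pull the object down to a temp file, run the local tool, push the result back up.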
Thanks

How can I upload a file to the Azure wwwroot folder

I use this code
move_uploaded_file($file_tmp,$us_id.".jpg");
but after running this script there is no error, yet the file does not appear in the folder. How can I fix this?
Before this, I tested it on localhost and it worked.
You haven't specified what $file_tmp contains, but... in an Azure Web App, the root folder is at d:\home\site\wwwroot. And you'll find the %HOME% environment variable set to d:\home.
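A small sketch of the fix: build an absolute destination under the %HOME% share instead of relying on the current working directory. The `uploads` subfolder name and the fallback path are assumptions:

```php
<?php
// Build an absolute destination path under the Azure Web App's
// persistent %HOME% share instead of the current directory.
function upload_target(string $usId): string
{
    $home = getenv('HOME') ?: 'd:/home';   // fallback when HOME is unset
    return $home . '/site/wwwroot/uploads/' . $usId . '.jpg';
}

// Usage in the original handler (sketch):
// move_uploaded_file($file_tmp, upload_target($us_id));
```

Make sure the `uploads` folder exists (e.g. with `mkdir()`), and check the return value of `move_uploaded_file()` so failures are not silent.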
Edited based on @David's comments and Kudu's Azure runtime environment
In a Cloud environment, saving files to the current filesystem of your Web App is not advised. If you simply upload your site through FTP, you might not have issues, but if you rely on Continuous Integration or automated deployment scenarios, your site folder might change or have content overwritten.
That is why, for files that need to be accessed in the future or permanently saved, you might want to use something like Azure Blob Storage. Not only is it really cheap, you can also put a CDN in front of it to improve file delivery.
Here is how to use it on PHP and the Azure SDK for PHP.
I used your code from "Azure window sever can't upload file from php" in my test project, and it works perfectly on my side; even if I don't set $us_id in your code, the picture is still uploaded to the uploadimg folder with the name .jpg.
I suspect your Web App's disk space may have reached the limit of your App Service plan.
Every App Service pricing tier has a disk-space limit that is shared among all the web apps in that App Service plan. You can check the metric on the dashboard page of your web app in the portal.
You can refer to https://azure.microsoft.com/en-us/pricing/details/app-service/ for details.

Uploading and accessing files in Azure without using the Data Storage

If using Windows Azure and an ASP.NET web site with a PHP script to upload files, can I access those files from the server, or must I use the Data Storage facilities?
i.e. I'd like to reference the files directly from html\server... etc. I think I probably should be able to.
Thank you.
How are you hosting your ASP.NET website? If it's hosted in a Web Role, then you won't have access to a persistent "standard" file system for that role (especially if you've scaled over multiple instances).
Have a look at the following tutorial on using blob storage from PHP. http://www.windowsazure.com/en-us/develop/php/how-to-guides/blob-service/
You can use the blob storage quite easily to access them with standard HTTP links from your ASP.NET site. i.e.
http://your-storage-account.blob.core.windows.net/your-container/file.txt
