Shared network drive as Laravel 5.4 Storage - php

I have an image upload in my Laravel application and would like to have those images uploaded to a shared network drive located on an external Windows computer.
Is it somehow possible to mount a specific folder on my shared network drive as Laravel Storage? Laravel 5.4's manual says that only S3 and Rackspace are included as built-in drivers.
Thanks in advance.

I had a similar requirement and solved it by mounting the network share to a folder (or you could mount to a drive) in the local file system. There are plenty of guides on the net for doing this depending on your server operating system. You can then use the "local" file system driver to access the files.
Note that this means some sort of script on the server needs to handle connecting with the required credentials, rather than the Laravel application, which may or may not be suitable in your environment.
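As a sketch, assuming the share is mounted at /mnt/images (both the mount point and the disk name below are placeholders), the corresponding disk entry in config/filesystems.php could look like:

```php
// config/filesystems.php -- hypothetical disk for the mounted share
'disks' => [

    // ...existing disks ('local', 's3', etc.)...

    'network_images' => [
        'driver' => 'local',
        'root'   => '/mnt/images', // wherever the network share is mounted
    ],

],
```

You can then write uploads with `Storage::disk('network_images')->put('photo.jpg', $contents);` exactly as you would with any other disk.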

You can use a different driver to store your uploaded media. For example, if you want to use Google Drive as your storage driver, you just need to follow the steps below:
1) Pull in a filesystem adapter for Google Drive
2) Set up config/filesystems.php
3) Set up the service provider
Then set up .env with the client ID and other authentication details, and you're done.
You can check the details of how to set up Google Drive in this example code.
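For illustration only, the config/filesystems.php entry from step 2 might look like the following, assuming a Flysystem Google Drive adapter package is installed and its service provider registered (the disk name and env variable names are placeholders, not part of Laravel itself):

```php
// config/filesystems.php -- hypothetical 'google' disk; the custom
// 'google' driver is registered by the service provider from step 3
'disks' => [

    // ...

    'google' => [
        'driver'       => 'google',
        'clientId'     => env('GOOGLE_DRIVE_CLIENT_ID'),
        'clientSecret' => env('GOOGLE_DRIVE_CLIENT_SECRET'),
        'refreshToken' => env('GOOGLE_DRIVE_REFRESH_TOKEN'),
        'folderId'     => env('GOOGLE_DRIVE_FOLDER_ID'),
    ],

],
```

The matching keys then go in .env, which keeps the credentials out of version control.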

Related

How do I deploy the files from my app bucket in GCS from cloud shell in App engine?

I am trying to create my first php app using GCS.
I have placed all my PHP app files (including my .yaml file) in the bucket created by App Engine, but I am struggling to access the files to deploy the app using the Cloud Shell in App Engine.
If I had them stored in git I would have used the clone commands, but I assume that would have placed the files in the same bucket that I manually uploaded the files to through the bucket browser?!
I either can't find the correct file path, as the buckets do not appear to be part of my /home/username directory, or I am missing a key link, such as an API to the Cloud Storage service (gsutil?).
Can anyone point me in the right direction?
JH
You shouldn't be manually placing any files into the (app's staging) GCS bucket yourself to deploy an app to app engine.
Instead, place the files (including your app.yaml) in a working directory on your machine (or cloud shell), then use the gcloud app deploy command to deploy them. You can see an example of how to deploy a PHP app here. The gcloud app deploy command is going to look for the source files on the local filesystem.
Note that using git isn't a requirement (though using some form of source control is always a good development practice), it is just the way the files are copied onto the local machine in this example.
(Likewise -- GCS buckets won't show up in the cloud shell filesystem -- GCS is a blob storage service not a normal filesystem, so you would need to use a tool like gsutil or the GCS API to access it).
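Putting the answer's steps together, a Cloud Shell session might look like this sketch (the repository URL and project ID are placeholders for your own):

```shell
# Get the source files into the Cloud Shell home directory,
# e.g. by cloning a repository (or copying them any other way)
git clone https://github.com/your-user/your-php-app.git
cd your-php-app

# app.yaml must be in this working directory;
# gcloud reads the sources from the local filesystem, not from GCS
gcloud app deploy app.yaml --project your-project-id
```

The deploy command handles uploading the sources itself; you never place files into the staging bucket by hand.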

How to utilise AZURE VM's drive data into an azure cloud hosted app

I have an Azure virtual machine which has more than 30 GB of images on its drive. There is also a MySQL database (which I manage with MySQL Workbench) running on it.
I want to use this DB and these images in my Azure cloud hosted app (Web + MySQL).
My PHP web app pulls images from a folder on the VM's drive and shows the corresponding record for each image from the MySQL database, also on the VM. How can I do that?
We cannot directly request the image files on a VM from a web application, but we can consider a workaround that leverages Azure File Storage. You can store the image files in a shared folder in File Storage, then access each file via the REST API from PHP.
You can follow https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-files#mount-the-file-share to mount the file share in your VM.
Additionally, do not forget to allow outbound traffic on port 445 in the VM's firewall and in the Network Security Group if you are using the VM in ARM mode.
When successful, you will see a file system structure similar to the one shown in the linked documentation.
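On a Windows VM, the mount step from the linked documentation boils down to a net use command along these lines (the drive letter, storage account name, share name, and key below are all placeholders for your own values):

```shell
net use Z: \\mystorageacct.file.core.windows.net\myshare /u:mystorageacct <storage-account-key>
```

After the share is mounted, the image files on Z: can be read like any local files, while the web app reaches the same share through the File Storage REST API.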

Azure Web Apps: FTP-uploaded file permissions

I'm in the process of getting a Laravel 5 app working on Azure Web Apps and am encountering an issue via Laravel's temporary storage.
Any time a template renders, Laravel attempts to cache it to the local filesystem. Unfortunately, for some reason Laravel doesn't have permission to write to its storage directory.
I am deploying my application from my build server via FTP
I am running on the free-tier shared infrastructure (just while I'm getting set up)
My deployment server is running Linux
In this circumstance, it's obvious what the problem is. Unfortunately, what I don't understand is why my web server doesn't have access to write to the directories my FTP user uploads.
Ideally any solution offered will be one that I can automate as part of my deploy process.
According to http://clivern.com/working-with-laravel-caching/, you can change the directory of the cache files using the config/cache.php configuration file. I'd suggest using $_SERVER['DOCUMENT_ROOT'] to obtain the root folder of your web app, and then constructing a path for the cache files from it.
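An alternative that is easy to automate as part of a deploy is to override the whole storage path (compiled views, sessions, logs) in bootstrap/app.php via Laravel's Application::useStoragePath(). The relative path below is an assumption; adjust it to a location your Azure web server user can actually write to:

```php
// bootstrap/app.php
$app = new Illuminate\Foundation\Application(
    realpath(__DIR__.'/../')
);

// Point Laravel's storage directory at a writable location.
// The path below is an assumption about the Azure Web App layout;
// change it to suit your deployment.
$app->useStoragePath($_SERVER['DOCUMENT_ROOT'].'/../storage');
```

Because this lives in a tracked source file rather than in server state, it survives every FTP redeploy without manual permission fixes.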

how to get files uploaded from file zilla in elastic beanstalk

I have my web services coded in PHP, hosted on AWS using Elastic Beanstalk. If I want to edit any of my code, I connect to the EC2 instance from FileZilla and edit it on the server. Since Elastic Beanstalk uses auto scaling, it automatically scales up and down depending on traffic, and when it scales down it takes the latest zip file uploaded from the AWS dashboard and replaces the current system configuration with it; it does not back up the files uploaded via FileZilla. So is there any way I can get back the files I previously uploaded to the server from FileZilla?
I even tried connecting to the EC2 instance using SSH, but I could not find my previous files there either. Is the correct way to update an application only through the Elastic Beanstalk dashboard, and not by editing files via FileZilla?
You are right that it will pick up the version of the file originally deployed when the instances scale up or down.
The recommended workflow for this scenario is to upload the zip file to the AWS console using the "Upload and Deploy" button.
You can also use CLI tools or APIs like:
awscli: http://docs.aws.amazon.com/cli/latest/reference/elasticbeanstalk/index.html
eb: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/command-reference-eb.html
UpdateEnvironment API: http://docs.aws.amazon.com/elasticbeanstalk/latest/APIReference/API_UpdateEnvironment.html
Given that your current workflow involves the console, you can upload a new version of the file using the AWS console.
Read the walkthrough here for more details:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/GettingStarted.Walkthrough.html
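With the eb CLI linked above, redeploying after a local code change typically looks like this (run from the project root; assumes the Beanstalk application and environment already exist):

```shell
# one-time setup: associate this directory with your application
eb init

# zip the current source and deploy it as a new application version
eb deploy
```

Every change then becomes a tracked application version, so scale-out instances always receive the latest code instead of losing FTP edits.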

PHP in Azure cloud service

Can anyone suggest what configuration needs to be done to run a simple PHP file in an Azure Cloud Service?
I am able to view my simple PHP file in an Azure Web Site, but I can't view the same file in an Azure Cloud Service.
Can someone provide me some suggestions?
When you say "I am able to view my simple PHP file in an Azure Web Site, but I can't view the same file in an Azure Cloud Service", I believe you mean you can browse your PHP site on Windows Azure Websites but not on a Windows Azure Cloud Service configured as something.cloudapp.net. Is that right?
As you asked for suggestions here are some steps you can use to troubleshoot your problem:
When you access your PHP file, i.e. servicename.cloudapp.net/phpinfo.php, does it show a "file not found" error or an error related to the PHP runtime? You need to package the PHP runtime together with all the PHP files before deploying. If any important content is missing, your site will not run, so make sure everything specific to your application is in the package you deploy to the Windows Azure cloud service.
To verify you have all the contents in your package, rename the .cspkg to .zip and unzip it. Then rename the .cssx file inside to .zip and unzip that as well to see if all the contents are there.
You can also enable RDP access to your Azure VM and log in to check for consistency.
Also, for Azure Cloud Services you can use this documentation: http://azurephp.interoperabilitybridges.com/articles/packaging-applications
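The inspection steps above can be scripted; the package and role file names below are placeholders for whatever your build produces:

```shell
# a .cspkg is an ordinary zip archive
cp MyService.cspkg MyService.zip
unzip MyService.zip -d package_contents

# the role payload (.cssx) inside is another zip archive
cd package_contents
cp WebRole_*.cssx WebRole.zip   # the exact .cssx name varies per project
unzip WebRole.zip -d role_contents
```

Inside role_contents you can then confirm that the PHP runtime and your application files were actually packaged.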
