We have a PHP app where, to encrypt the connection with the database, we need to provide 3 files that shouldn't be publicly accessible but must be present on the server to make the DB connection (https://www.cleardb.com/developers/ssl_connections).
Obviously we don't want to store them in the SCM with the app code, so the only idea that comes to mind is using a post-deploy action hook to fetch those files from a storage account (with keys and URIs provided in the app settings).
Is there a nicer/cleaner way to achieve this? :)
Thank you,
You can use a custom deployment script to execute additional scripts or commands during the deployment task. For example, you can create a PHP script that downloads the certificate files from Blob Storage to a location on the server's file system; your PHP application's DB connection can then use these files.
Following are the general steps:
Enable the Composer extension for your web app in the Azure portal.
Install the azure-cli module via npm; refer to https://learn.microsoft.com/en-us/azure/xplat-cli-install for more info.
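For example (assuming Node.js and npm are already installed):
npm install -g azure-cli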
Create a deployment script for PHP via the command azure site deploymentscript --php
Execute the command composer require microsoft/windowsazure to make sure you have a composer.json with the storage SDK dependency.
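The resulting composer.json should then contain something like this (the version constraint shown is just an example):
{
    "require": {
        "microsoft/windowsazure": "^0.5"
    }
}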
Create a PHP script in your root directory that downloads the files from Blob Storage (e.g. named run.php):
require_once 'vendor/autoload.php';

use WindowsAzure\Common\ServicesBuilder;
use MicrosoftAzure\Storage\Common\ServiceException;

// Storage account connection string, e.g. provided via app settings.
$connectionString = "<connection_string>";
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);

$container = 'certificate';
$blobs = ['client-key.pem', 'client-cert.pem', 'cleardb-ca.pem'];

foreach ($blobs as $b) {
    try {
        // Download each certificate blob and write it next to this script.
        $blobResult = $blobRestProxy->getBlob($container, $b);
        file_put_contents($b, stream_get_contents($blobResult->getContentStream()));
    } catch (ServiceException $e) {
        echo 'Failed to download ' . $b . ': ' . $e->getMessage() . PHP_EOL;
    }
}
Modify the deploy.cmd script: add the line php run.php under the KuduSync step.
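In the generated deploy.cmd, this could look roughly like the following sketch (the exact KuduSync invocation in your generated script may differ):
:: 2. KuduSync
call :ExecuteCmd "%KUDU_SYNC_CMD%" -v 50 -f "%DEPLOYMENT_SOURCE%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
IF !ERRORLEVEL! NEQ 0 goto error

:: 3. Download the certificate files from Blob Storage
pushd "%DEPLOYMENT_TARGET%"
call :ExecuteCmd php run.php
IF !ERRORLEVEL! NEQ 0 goto error
popd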
Deploy your application to Azure Web App via Git.
If you have any further concerns, please feel free to let me know.
Is it possible to get an image from a Google Cloud bucket in a fashion similar to file_get_contents? I need the image to mess around with, without needing to have it on my local machine/server.
I have tried using the Google Cloud Storage library for PHP (https://github.com/googleapis/google-cloud-php-storage), but it keeps giving me a "failed to open stream" error, which leads me to believe that I may have provided the wrong URL to the file_get_contents function.
I should mention that all the files are set to public, with public storage.cloud.google.com links available.
My current code is similar to this:
require 'vendor/autoload.php';
use Google\Cloud\Storage\StorageClient;
$storage = new StorageClient();
$storage->registerStreamWrapper();
$contents = file_get_contents('gs://{BUCKET}/{IMAGE NAME}.jpg');
While my current bucket structure is more like this
BUCKET - FOLDER - IMAGE
I have tried adding the folder to the URL and it still gave me the same "failed to open stream" error. Could anyone point out what I am doing wrong? Do I need to authenticate? Do I even need the library for this? I tried using file_get_contents($public_url), but since storage.cloud.google.com links redirect you, that obviously didn't work.
Any help would be appreciated
Cheers
I have installed the Cloud SDK and used the default service account credentials to connect to Google Cloud services.
In general, the Google Cloud PHP library uses Service Account
credentials to connect to Google Cloud services. When running on
Compute Engine the credentials will be discovered automatically. When
running on other environments, the Service Account credentials can be
specified by providing the path to the JSON keyfile for the account
(or the JSON itself) in environment variables.
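For example, outside of Compute Engine you could either export GOOGLE_APPLICATION_CREDENTIALS=/path/to/keyfile.json before running your script, or pass the key file path explicitly when constructing the client (a sketch; the path and project ID are placeholders):
$storage = new StorageClient([
    'projectId'   => 'my_project',
    'keyFilePath' => '/path/to/keyfile.json',
]);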
Run composer require google/cloud
Create a test.php file:
<?php
# Includes the autoloader for libraries installed with composer
require __DIR__ . '/vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\Storage\StorageClient;

# Your Google Cloud Platform project ID
$projectId = 'my_project';

# Instantiates a client
$storage = new StorageClient([
    'projectId' => $projectId
]);

# Registers the gs:// stream wrapper so file functions can read from GCS
$storage->registerStreamWrapper();

$contents = file_get_contents('gs://my-buck-sample/image.JPG');
echo 'content: ' . $contents;
?>
Run php test.php
So I've been playing around with Heroku and I really like it. It's fast and it just works. However, I have encountered a problem with my gallery app: https://miko-gallery.herokuapp.com . Create a free account, create an album and try uploading a photo. It will not display. I have run the php artisan storage:link command, but it does not work. What am I missing here?
EDIT
I've just tried a new thing: I ran heroku run bash and cd'ed into the storage/app/public folder, and it does not contain the images folder which was supposed to be there.
My code for saving the photo is here (works on localhost):
public function store(Request $request)
{
    $ext = $request->file('items')->getClientOriginalExtension();
    $filename = str_random(32).'.'.$ext;
    $file = $request->file('items');
    $path = Storage::disk('local')->putFileAs('public/images/photos', $file, $filename);

    $photo = new Photo();
    $photo->album_id = $request->album_id;
    $photo->caption = $request->caption;
    $photo->extension = $request->file('items')->getClientOriginalExtension();
    $photo->path = $path.'.'.$photo->extension;
    $photo->mime = $request->file('items')->getMimeType();
    $photo->file_name = $filename;
    $photo->save();

    return response()->json($photo, 200);
}
Heroku's filesystem is dyno-local and ephemeral. Any changes you make to it will be lost the next time each dyno restarts. This happens frequently (at least once per day).
As a result, you can't store uploads on the local filesystem. Heroku's official recommendation is to use something like Amazon S3 to store uploads. Laravel supports this out of the box:
Laravel provides a powerful filesystem abstraction thanks to the wonderful Flysystem PHP package by Frank de Jonge. The Laravel Flysystem integration provides simple to use drivers for working with local filesystems, Amazon S3, and Rackspace Cloud Storage. Even better, it's amazingly simple to switch between these storage options as the API remains the same for each system.
Simply add league/flysystem-aws-s3-v3 ~1.0 to your dependencies and then configure it in config/filesystems.php.
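A minimal sketch of that configuration, assuming the standard Laravel S3 disk keys and that your AWS credentials live in environment variables:
// config/filesystems.php
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],
Your store() method would then only need Storage::disk('s3') instead of Storage::disk('local').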
If you don't have SSH access, simply create a route so you can run this command by hitting a URL:
Route::get('/artisan/storage', function () {
    $command = 'storage:link';
    $result = Artisan::call($command);
    return Artisan::output();
});
First, unlink the existing link from public/storage if one is already there.
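A sketch combining both steps, assuming the default public/storage symlink location:
Route::get('/artisan/storage', function () {
    // Remove a stale symlink first, if one exists.
    $link = public_path('storage');
    if (is_link($link)) {
        unlink($link);
    }
    Artisan::call('storage:link');
    return Artisan::output();
});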
I am now using the old Windows Azure SDK for PHP (2011).
require "../../Microsoft/WindowsAzure/Storage/Blob.php";
$accountName = 'resources';
$accountKey = 'accountkey';
$storageClient = new Microsoft_WindowsAzure_Storage_Blob('blob.core.windows.net',$accountName,$accountKey);
$storageClient->blobExists('resources','blob.jpg');
I want to upgrade to the new SDK but don't know how to install it manually.
The website is running on an "Azure Webapp" via FTP.
I managed to find that I need to copy the "WindowsAzure" folder, and reference the classes I need, but how?
I followed the instructions on this page https://azure.microsoft.com/nl-nl/documentation/articles/php-download-sdk/ but got stuck after that.
Edit: I have no experience with PEAR or Composer; I want to manually install (upload) it via FTP.
EDIT:
Now I've got the following:
require_once('WindowsAzure/WindowsAzure.php');

use WindowsAzure\Common\ServicesBuilder;
use WindowsAzure\Common\ServiceException;

// Create blob REST proxy.
$connectionString = 'DefaultEndpointsProtocol=http;AccountName=ttresources;AccountKey=***************';
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);

$container = '87d57f73-a327-4344-91a4-27848d319a66';
$blob = '8C6CBCEB-F962-4939-AD9B-818C124AD3D9.mp4';
$blobexists = $blobRestProxy->blobExists($container, $blob);
echo $blobexists;
echo 'test';
The line $blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString); is blocking everything that happens after it.
After including HTTP/Request2.php, the page told me to place the Net/URL2 and PEAR packages in the root of my project (the same dir as the page that loads WindowsAzure.php). They can be found at:
http://pear.php.net/package/Net_URL2/redirected
http://pear.php.net/package/PEAR/download
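So my project root now looks roughly like this (a sketch):
/site/wwwroot
    WindowsAzure/        <- from the SDK repository
    HTTP/Request2.php    <- plus the rest of the HTTP_Request2 package
    Net/URL2.php
    PEAR.php
    mypage.php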
You can get the latest Azure SDK for PHP from its GitHub repository at https://github.com/Azure/azure-sdk-for-php. Then upload the WindowsAzure folder to your Azure Web App via FTP.
And include the SDK in PHP scripts like:
require_once 'WindowsAzure/WindowsAzure.php';
The new Azure SDK for PHP requires some other dependencies. As your script is blocked at $blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);, check whether you have the dependency HTTP/Request2; the new SDK leverages it to handle HTTP requests.
You may need to require this dependency first. Also, you can enable display_errors for troubleshooting.
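For example (a sketch; adjust the include paths to wherever you uploaded the packages):
// Surface any errors while troubleshooting.
ini_set('display_errors', '1');
error_reporting(E_ALL);

// Make sure the dependencies load before the SDK is used.
require_once 'HTTP/Request2.php';
require_once 'WindowsAzure/WindowsAzure.php';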
And in the new SDK, it seems there is no blobExists function any more. Instead, you may try:
// Get blob; getBlob() throws a ServiceException (404) if the blob is missing.
try {
    $blob = $blobRestProxy->getBlob("mycontainer", "myblob");
    // blob exists
} catch (ServiceException $e) {
    // blob does not exist (or another service error occurred)
}
The Google docs state:
The GCS stream wrapper is built in to the run time, and is used when you supply a file name starting with gs://.
When I look into app.yaml, I see where the runtime is selected; I have selected the php runtime. However, when I try to write to my bucket I get an error saying the wrapper is not found for gs://. But when I try to write to my bucket using the helloworld.php script that is provided by Google at https://cloud.google.com/appengine/docs/php/gettingstarted/helloworld, modified so that it says
<?php
file_put_contents('gs://<app_id>.appspot.com/hello.txt', 'Hello');
I have to deploy the app in order for the write to be successful. I am not understanding why I have to deploy the app every time to get the wrapper I need to write to my bucket. How come I cannot write to my bucket from a random PHP script?
Google says
"In the Development Server, when a Google Cloud Storage URI is specified we emulate this functionality by reading and writing to temporary files on the user's local filesystem"
So, "gs://" is simulated locally - to actually write to GCS buckets using the stream wrapper, it has to run from App Engine itself.
Try something like this:
$object_url = "gs://bucket/file.png";
$options = stream_context_create(['gs' => ['acl' => 'private', 'Content-Type' => 'image/png']]);

$my_file = fopen($object_url, 'w', false, $options);
fwrite($my_file, $file_data); // $file_data holds the bytes you want to write
fclose($my_file);
So I've followed the guide on how to use GCS on their site: https://developers.google.com/appengine/docs/php/googlestorage/
But the following code does not work. I cannot access my bucket, and the CloudStorageTools class is not even found.
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;

// check if the GCS bucket is writable
$my_bucket = 'gs://my_bucket/';
$check1 = is_writable($my_bucket); // returns nothing
class_exists("CloudStorageTools"); // returns false
I also added google_app_engine.allow_include_gs_buckets to the php.ini file. Still no support for GCS.
Anyone have some running code they could share?
I think it may have something to do with the permissions set on your bucket. Make sure you added your app's service account email (shown under Application Settings -> Service Account Name) to the bucket's permissions.
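For example, with gsutil (a sketch; the service account address is a placeholder, yours is listed under Application Settings):
gsutil acl ch -u your-app-id@appspot.gserviceaccount.com:WRITE gs://your-bucket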
I ported my app with no issues, using Smarty to write into and include from the bucket. I did not even have to declare:
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;
and the app reads and writes to the bucket with no problems.
I have, however, had issues with some WordPress porting: although the bucket existed and had properly set-up access permissions, for some reason the app threw a Fatal Error, and I could see in the app logs that the application was trying to access cloud storage via the gs:// wrapper.
Hope that helps.
Hi dude, make sure your bucket has public write enabled.
Use this gsutil command to do so (google how to configure the gsutil utility):
$ gsutil acl ch -u AllUsers:W gs://example-bucket-name