Download files from Firebase Storage with PHP - php

I am new to Firebase for the web. Is it possible to upload, download, and delete files using PHP? I have uploaded files using JS, but I want to download them using PHP.
Below is my script; I can download files using JS, but I want to do it in PHP.
Thanks in advance...
My Code
# [START storage_quickstart]
# Includes the autoloader for libraries installed with composer
require __DIR__ . '/vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\Storage\StorageClient;

# Your Google Cloud Platform project ID
$projectId = 'My project ID';

# Instantiates a client
$storage = new StorageClient([
    'projectId' => $projectId
]);

# The name for the new bucket
$bucketName = 'my bucket';

# Creates the new bucket
$bucket = $storage->createBucket($bucketName);
echo 'Bucket ' . $bucket->name() . ' created.';
# [END storage_quickstart]

return $bucket;

The short answer is that you should use gcloud-php. This requires that you set up a service account (or use Google Compute Engine/Container Engine/App Engine which provide default credentials).
It's likely that you'll create a service account, download a keyfile.json, and provide it as an argument to the StorageClient, like so:
# Instantiates a client
$storage = new StorageClient([
    'keyFilePath' => '/path/to/key/file.json',
    'projectId' => $projectId
]);
Alternatively, it looks like they've built another layer of abstraction, which takes the same arguments but allows you to use lots of other services:
use Google\Cloud\ServiceBuilder;

$gcloud = new ServiceBuilder([
    'keyFilePath' => '/path/to/key/file.json',
    'projectId' => 'myProject'
]);

$storage = $gcloud->storage();
$bucket = $storage->bucket('myBucket');
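Since the question is specifically about downloading, here is a rough sketch of how you could fetch a file once the client is authenticated; the bucket and object names below are placeholders, and I'm assuming the downloadToFile() / downloadAsString() methods from the google/cloud-storage library:

$bucket = $storage->bucket('myBucket');
$object = $bucket->object('path/to/remote-file.jpg');

// Download the object to a local file...
$object->downloadToFile('/tmp/local-file.jpg');

// ...or read its contents into a string.
$contents = $object->downloadAsString();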

This is an old question, but I was struggling with the same problem... hope my solution helps someone.
I don't know if there is an official way to do this, but I created the method below and it worked for me.
function storageFileUrl($name, $path = []) {
    $base = 'https://firebasestorage.googleapis.com/v0/b/';
    // The default Firebase Storage bucket is named <project-id>.appspot.com
    $bucket = 'your-project-id.appspot.com';
    $url = $base.$bucket.'/o/';
    if (sizeof($path) > 0) {
        // Folder segments are joined with URL-encoded slashes (%2F)
        $url .= implode('%2F', $path).'%2F';
    }
    return $url.$name.'?alt=media';
}
To access files in the root of the bucket:
$address = storageFileUrl('myFile');
Result: https://firebasestorage.googleapis.com/v0/b/your-project-id.appspot.com/o/myFile?alt=media
To access files inside some folder, do:
$address = storageFileUrl('myFile', ['folder', 'subfolder']);
Result: https://firebasestorage.googleapis.com/v0/b/your-project-id.appspot.com/o/folder%2Fsubfolder%2FmyFile?alt=media
Enjoy.
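One caveat, as far as I know: these URLs only work if your Storage security rules allow public reads of the object, or if you append a valid download token. Also, file names containing spaces or special characters should be URL-encoded, for example by tweaking the return line:

// Sketch: URL-encode the file name so names with spaces or special characters still resolve.
return $url.rawurlencode($name).'?alt=media';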

Related

Google Storage - dynamically change ACL on single object (PHP)

I need to update the ACL, basically by adding or removing the allUsers entity.
I have the PHP library, and what I'm doing at the moment is:
$storage = new StorageClient([
    'projectId' => "xxxxx",
    'keyFilePath' => mykey,
]);
$bucket = $storage->bucket('mybucket');
$acl = $bucket->acl('objectAccessControls', 'path/file/on/bucket');

if ($add) {
    $acl->add('allUsers', 'READER');
} else {
    $acl->delete('allUsers');
}
This code actually changes the ACL of the whole bucket, not just the single file.
How can I correctly specify the path of a specific file and change permissions only on path/file/on/bucket? Am I using the wrong functions?
Here is the documentation that I'm using:
https://googleapis.github.io/google-cloud-php/#/docs/google-cloud/v0.90.0/storage/acl
UPDATE 1:
Using this to delete seems to work -> https://cloud.google.com/storage/docs/json_api/v1/objectAccessControls/delete
I tried to include the parameters listed there in the call I make, something like this:
$options = ['object' => 'path/obj'];
$acl->delete('allUsers', $options);
It's still not working.
Actually, I've solved it by using the Google_Service classes:
$client = new Google_Client();
$client->setApplicationName('GoogleBuck/0.1');
$client->useApplicationDefaultCredentials(); // app engine env
$client->addScope('https://www.googleapis.com/auth/devstorage.full_control');

$storage = new Google_Service_Storage($client);

$acl = new Google_Service_Storage_ObjectAccessControl($client);
$acl->setEntity('allUsers');
$acl->setRole('READER');
$acl->setBucket($bucketName);
$acl->setObject($objectName);
To add:
$response = $storage->objectAccessControls->insert($bucketName, $objectName, $acl);
To delete:
$response = $storage->objectAccessControls->delete($bucketName, $objectName, 'allUsers');
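If you'd rather stay with the google/cloud library instead of google/apiclient, I believe the object-level ACL can also be reached through the object itself rather than the bucket. A minimal sketch (the bucket and object names are placeholders):

// Get the ACL of the single object, not the bucket.
$object = $storage->bucket('mybucket')->object('path/file/on/bucket');
$acl = $object->acl();

// Make the object publicly readable...
$acl->add('allUsers', 'READER');

// ...or remove public access again.
$acl->delete('allUsers');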

StorageClient class can't be found?

I'm trying desperately to figure out how to create a simple audio transcription script (for longer audio files) via PHP (the only language I know). I'm getting the error: Class 'Google\Cloud\Storage\StorageClient' not found.
I'm using the gcloud console code editor, and everything should be installed (unless there is a separate Composer install just for Cloud Storage, although I haven't been able to find anything about that in the documentation).
I also ran gcloud auth application-default print-access-token, which printed out an access token, but I don't know what (if anything) I'm supposed to do with it, other than the "set GOOGLE_APPLICATION_CREDENTIALS" command that I copied and pasted into the console shell prompt.
Here's the php code:
<?php
namespace Google\Cloud\Samples\Speech;

require __DIR__ . '/vendor/autoload.php';

use Exception;
# [START speech_transcribe_async_gcs]
use Google\Cloud\Speech\SpeechClient;
use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\ExponentialBackoff;

$projectId = 'xxxx';
$speech = new SpeechClient([
    'projectId' => $projectId,
    'languageCode' => 'en-US',
]);

$filename = "20180925_184741_L.mp3";

# The audio file's encoding and sample rate
$options = [
    'encoding' => 'LINEAR16',
    'sampleRateHertz' => 16000,
    'languageCode' => 'en-US',
    'enableWordTimeOffsets' => false,
    'enableAutomaticPunctuation' => true,
    'model' => 'video',
];

function transcribe_async_gcs($bucketName, $objectName, $languageCode = 'en-US', $options = [])
{
    // Create the speech client
    $speech = new SpeechClient([
        'languageCode' => $languageCode,
    ]);

    // Fetch the storage object
    $storage = new StorageClient();
    $object = $storage->bucket($bucketName)->object($objectName);

    // Create the asynchronous recognize operation
    $operation = $speech->beginRecognizeOperation(
        $object,
        $options
    );

    // Wait for the operation to complete
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($operation) {
        print('Waiting for operation to complete' . PHP_EOL);
        $operation->reload();
        if (!$operation->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });

    // Print the results
    if ($operation->isComplete()) {
        $results = $operation->results();
        foreach ($results as $result) {
            $alternative = $result->alternatives()[0];
            printf('Transcript: %s' . PHP_EOL, $alternative['transcript']);
            printf('Confidence: %s' . PHP_EOL, $alternative['confidence']);
        }
    }
}
# [END speech_transcribe_async_gcs]

transcribe_async_gcs("session_audio", $filename, "en-US", $options);
With apologies, PHP is not a language I'm proficient with, but I suspect you haven't installed (and must install) the client library for Cloud Storage so that your code can access it. This would explain the report that the class is missing.
The PHP client library page includes two alternatives. One applies if you're using Composer; the second -- possibly what you want -- is a direct download, which you'll need to path correctly for your code.
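If you go the Composer route, here is a minimal sketch of a composer.json that pulls in just the two clients used above (package names assumed from the split google/cloud libraries):

{
    "require": {
        "google/cloud-storage": "^1.0",
        "google/cloud-speech": "^1.0"
    }
}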
Some time ago, I wrote a short blog post providing a simple example (using Cloud Storage) for each of Google's supported languages. Perhaps it will help you too.

How do I upload a file to a directory in Google Cloud Storage using the Google_Client library

I want to upload a file to Google Cloud Storage using the Google client PHP library on GitHub. I am able to upload a file to Cloud Storage, but I am not able to upload it to a directory within the bucket. I get the error message No such object: bucketName/abc/test.jpg
$client = new Google_Client();
putenv('GOOGLE_APPLICATION_CREDENTIALS=files/google_cloud.json');
$client->useApplicationDefaultCredentials();

$storage = new Google\Cloud\Storage\StorageClient([
    'projectId' => $googleprojectID
]);

$sPath = "files/com/test.jpg";
$objectName = "/abc/test.jpg";
$bucketName = $googlebucketName;

$bucket = $storage->bucket($bucketName);
$bucket->upload( fopen($sPath, 'r') );

$object = $bucket->object($objectName);
$info = $object->update(['acl' => []], ['predefinedAcl' => 'PUBLICREAD']);
First of all, let me share with you this documentation page where you will find the complete reference for the Google Cloud Storage PHP Client Library. More specifically, if you have a look at the upload() method, you will see that in order to set the name of the object uploaded (and therefore its location, given that GCS has a flat namespace), you have to use the options parameter, which can contain a name field pointing to the right location to upload.
Also, note that the correct object name should not start with a slash /, given that it will automatically be added after the bucket name. Therefore, you should modify your code to add something like this:
$sPath = "files/com/test.jpg";
$objectName = "abc/test.jpg"; # Note the removal of "/" here
$options = [
'name' => $objectName
];
$bucketName = $googlebucketName;
$bucket = $storage->bucket($bucketName);
$bucket -> upload(
fopen($sPath, 'r'),
$options
);
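As a possible follow-up, I believe the same options array can also carry the ACL, which would avoid the separate update() call from the question; treat this as a hedged sketch rather than a confirmed recipe:

// Sketch: set the object name and make it publicly readable in one upload call.
$bucket->upload(
    fopen($sPath, 'r'),
    [
        'name' => $objectName,
        'predefinedAcl' => 'publicRead',
    ]
);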

Bigquery could not get default credentials

I'm trying to set up Google BigQuery with Firebase and am having some issues. I have gcloud installed on my machine (macOS Sierra) and have google/cloud installed via Composer in my project.
The following code on my project:
# Includes the autoloader for libraries installed with composer
require __DIR__ . '/vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\BigQuery\BigQueryClient;

# Your Google Cloud Platform project ID
$projectId = 'hidden here only';

# Instantiates a client
$bigquery = new BigQueryClient([
    'projectId' => $projectId
]);

# The name for the new dataset
$datasetName = 'test_dataset';

# Creates the new dataset
$dataset = $bigquery->createDataset($datasetName);
echo 'Dataset ' . $dataset->id() . ' created.';
All I'm trying to do is create a dataset within BigQuery via the library, but I'm not able to because of the following error:
Fatal error: Uncaught Google\Cloud\Exception\ServiceException: Could not load the default credentials. Browse to https://developers.google.com/accounts/docs/application-default-credentials for more information in /Applications/MAMP/htdocs/projects/work/bigquery-tests/vendor/google/cloud/src/RequestWrapper.php on line 219
I've tried running gcloud beta auth application-default login as the example code says to do, but after logging in in the browser the error is still present. Any help would be great, thanks!
You were very close; you just need to set up the service account default credentials (see the lines with putenv and useApplicationDefaultCredentials()). This is working code I have that uses the library https://github.com/google/google-api-php-client. You need to obtain your service account key file from the console: https://console.cloud.google.com/iam-admin/serviceaccounts/
composer.json
{
    "require": {
        "google/cloud": "^0.13.0",
        "google/apiclient": "^2.0"
    }
}
php file
# Imports the Google Cloud client library
use Google\Cloud\BigQuery\BigQueryClient;
use Google\Cloud\ServiceBuilder;

$query = "SELECT repository_url,
       repository_has_downloads
FROM [publicdata:samples.github_timeline]
LIMIT 10";

$client = new Google_Client();
putenv('GOOGLE_APPLICATION_CREDENTIALS='.dirname(__FILE__) . '/.ssh/dummyname-7f0004z148e1.json'); // this can be created with other ENV mode server side
$client->useApplicationDefaultCredentials();

$builder = new ServiceBuilder([
    'projectId' => 'edited',
]);
$bigQuery = $builder->bigQuery();

$job = $bigQuery->runQueryAsJob($query);
$info = $job->info();
// print_r($info);
// exit;
$queryResults = $job->queryResults();

/*$queryResults = $bigQuery->runQuery(
    $query,
    ['useLegacySql' => true]);*/

if ($queryResults->isComplete()) {
    $i = 0;
    $rows = $queryResults->rows();
    foreach ($rows as $row) {
        $i++;
        $result[$i] = $row;
    }
} else {
    throw new Exception('The query failed to complete');
}

print_r($result);
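As an alternative, the same service account key file can, as far as I know, be passed straight to BigQueryClient via keyFilePath, without involving Google_Client or an environment variable. A minimal sketch with placeholder values:

# Sketch: authenticate the BigQuery client directly with a service account key file.
use Google\Cloud\BigQuery\BigQueryClient;

$bigquery = new BigQueryClient([
    'projectId'   => 'your-project-id',
    'keyFilePath' => '/path/to/service-account-key.json',
]);

$dataset = $bigquery->createDataset('test_dataset');
echo 'Dataset ' . $dataset->id() . ' created.';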

Delete object or bucket in Amazon S3?

I created a new Amazon bucket called "photos". The bucket URL is something like:
www.amazons3.salcaiser.com/photos
Now I upload subfolders containing files, into that bucket for example
www.amazons3.salcaiser.com/photos/thumbs/file.jpg
My questions are: is thumbs/ considered a new bucket, or is it an object?
And if I want to delete the entire thumbs/ directory, do I first need to delete all the files inside it, or can I delete everything in one go?
In the case you are describing, "photos" is the bucket. S3 does not have sub-buckets or directories. Directories are simulated by using slashes in the object key. "thumbs/file.jpg" is an object key and "thumbs/" would be considered a key prefix.
Dagon's examples are good and use the older 1.x version of the AWS SDK for PHP. However, you can do this more easily with the newer 2.4.x version of the AWS SDK for PHP, which includes a helper method for deleting multiple objects.
<?php
// Include the SDK. This line depends on your installation method.
require 'aws.phar';

use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'key'    => 'your-aws-access-key',
    'secret' => 'your-aws-secret-key',
));

// Delete the objects in the "photos" bucket with a prefix of "thumbs/"
$s3->deleteMatchingObjects('photos', 'thumbs/');
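For what it's worth, if you are on version 3 of the AWS SDK for PHP, the factory() constructor is gone but, as far as I know, the same helper still exists; a minimal sketch with placeholder region and credentials:

// Sketch for AWS SDK for PHP v3: the client is constructed directly.
use Aws\S3\S3Client;

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => [
        'key'    => 'your-aws-access-key',
        'secret' => 'your-aws-secret-key',
    ],
]);

// Deletes every object in the "photos" bucket whose key starts with "thumbs/".
$s3->deleteMatchingObjects('photos', 'thumbs/');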
// Include the S3.php file first in your code
if (!class_exists('S3')) {
    require_once('S3.php');
}

// AWS access info
if (!defined('awsAccessKey')) {
    define('awsAccessKey', 'awsAccessKey');
}
if (!defined('awsSecretKey')) {
    define('awsSecretKey', 'awsSecretKey');
}

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);

if ($s3->deleteObject("bucketname", "filename")) {
    echo 'deleted';
} else {
    echo 'no file found';
}
I found some code snippets for 'directory' deletion - I did not write them:
PHP 5.3+:
$s3 = new AmazonS3();
$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each(function($node, $i, $s3) use ($bucket) {
    // Queue each object under the prefix for batch deletion
    $s3->batch()->delete_object($bucket, $node);
}, array($s3));

$responses = $s3->batch()->send();
var_dump($responses->areOK());
Older PHP 5.2.x:
$s3 = new AmazonS3();
$bucket = 'your-bucket';
$folder = 'folder/sub-folder/';

$s3->get_object_list($bucket, array(
    'prefix' => $folder
))->each('construct_batch_delete', array($s3));

function construct_batch_delete($node, $i, &$s3)
{
    global $bucket; // the bucket name is not passed in, so pull it from the global scope
    $s3->batch()->delete_object($bucket, $node);
}

$responses = $s3->batch()->send();
var_dump($responses->areOK());
I have implemented this in Yii as:
$aws = Yii::$app->awssdk->getAwsSdk();
$s3 = $aws->createS3();
$s3->deleteMatchingObjects('Bucket Name', 'object key');
