I'm trying to upload an image to my Firebase Storage using Laravel (PHP).
public function uploadObject($bucketName, $objectName, $source) {
    $projectId = 'cloudestate-d10da';
    $storage = new StorageClient([
        'projectId' => $projectId,
        'keyFilePath' => __DIR__.'/StorageAcc.json'
    ]);
    $file = fopen($source, 'r');
    $bucket = $storage->bucket("images");
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}
But when I try to run it I get the following error:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "storageacc@cloudestate-d10da.iam.gserviceaccount.com does not have storage.objects.create access to images/testezc.png."
      }
    ],
    "code": 403,
    "message": "storageacc@cloudestate-d10da.iam.gserviceaccount.com does not have storage.objects.create access to images/testezc.png."
  }
}
I also gave my service account the "Owner" role, but that didn't seem to fix the error.
First, verify that the configuration variables being used belong to the same project you are trying to run. Try printing the environment variables:
console.log(process.env.FIREBASE_CONFIG)
Firebase sets this configuration by default. If you want to change these variables, install the Firebase CLI:
curl -sL https://firebase.tools | bash
Then proceed to authenticate yourself:
firebase login
Then list the projects that are available:
firebase projects:list
To view all the project aliases:
firebase use
Then, to match the project configuration variables, set the alias you are using as the current local environment:
firebase use <alias>
Finally, verify that the configuration variables are the ones you need.
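If the 403 persists once the configuration matches, note that the error message itself says the service account lacks storage.objects.create on the bucket. As a sketch of how that permission could be granted from the CLI (the account email and bucket name below are taken from the error message in the question and may need adjusting for your project):

```shell
# Grant the service account permission to create objects in the bucket.
# The email and bucket name are placeholders taken from the error message.
gsutil iam ch \
  serviceAccount:storageacc@cloudestate-d10da.iam.gserviceaccount.com:roles/storage.objectCreator \
  gs://images
```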
I ran into a problem while migrating from PHP5 to PHP7.
With PHP5, to get a URL to upload a file to, I was doing something like this:
$options = [ 'gs_bucket_name' => 'mybucket.appspot.com' ];
$myvar = $_GET['myvar'];
$upload_url = CloudStorageTools::createUploadUrl($myvar, $options);
return $upload_url;
I've uploaded the library and now I'm trying to use
Google\Cloud\Storage\StorageClient
I've converted the code above to something like this:
$bucketName = 'mybucket.appspot.com';
$objectName = $_GET['myvar'];
$storage = new StorageClient([
    'scopes' => [
        'https://www.googleapis.com/auth/iam',
        'https://www.googleapis.com/auth/devstorage.full_control',
    ]
]);
$bucket = $storage->bucket($bucketName);
$object = $bucket->object($objectName);
$upload_url = $object->signedUrl(
    new \DateTime('90 min'),
    [
        'version' => 'v4',
    ]
);
return $upload_url;
I got a 403 FORBIDDEN error:
Fatal error: Uncaught GuzzleHttp\Exception\ClientException: Client error: `POST https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/myproject@appspot.gserviceaccount.com:signBlob?alt=json` resulted in a `403 Forbidden` response:
{
"error": {
"code": 403,
"message": "IAM Service Account Credentials API has not been used in project 123456 (truncated...)
....
Any suggestions? In the PHP5 version the upload works (with the older code), so I suppose my App Engine service account has the correct permissions set.
Thanks
Solved by passing the .json key file for authentication:
$storage = new StorageClient([
    'projectId' => "my_project",
    'keyFilePath' => 'key/serviceaccount.json',
]);
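The error in the question also says the IAM Service Account Credentials API has not been used in the project; v4 signed-URL generation calls that API's signBlob method when no local key is available. If you prefer keyless signing over a key file, the API can be enabled from the CLI (a sketch; the project ID below is a placeholder):

```shell
# Enable the IAM Service Account Credentials API (backs signBlob calls).
# Replace my_project with your actual project ID.
gcloud services enable iamcredentials.googleapis.com --project=my_project
```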
I have written an integration for the AWS SDK in PHP to send the keys for files on an S3 bucket and retrieve a pre-signed url, which more or less follows this example. This worked perfectly except I need to now check if the file exists on s3 and return empty string if it does not.
I am doing this with my code below:
<?php

public function getUrl($key, $region, $version, $fileBucket): string
{
    $this->s3Client = new Aws\S3\S3Client([
        'region' => $region,
        'version' => $version,
    ]);

    if ($this->s3Client->doesObjectExist($fileBucket, $key) === false) {
        return '';
    }

    $cmd = $this->s3Client->getCommand('GetObject', [
        'Bucket' => $fileBucket,
        'Key' => $key
    ]);

    $request = $this->s3Client->createPresignedRequest($cmd, $preSignedExpiration);
    $preSignedUrl = (string) $request->getUri();

    return $preSignedUrl ?? '';
}
My code returned the presigned URLs perfectly until I added the if statement calling doesObjectExist(). doesObjectExist() returns false for all of my objects on S3, not just the ones whose $key does not exist in the bucket. That rules out the S3 keys, buckets, or API keys being invalid, which is the answer I have primarily come across while researching this issue.
Also if you happen to know a better way to do this using what is returned in $cmd or $request, without having to call doesObjectExist() (if it is possible) that would simplify my code a bit.
Edit:
The policy for the IAM user I am accessing the AWS S3 API through looks like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::example-dev-us-east-1-111111111111",
        "arn:aws:s3:::example-dev-us-east-1-111111111111/*"
      ]
    }
  ]
}
And the s3 Bucket policy looks like:
{
  "Sid": "dev-web-app-get-policy",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::2222222222:role/example-dev-us-east-1-111111111111"
  },
  "Action": [
    "s3:GetObject",
    "s3:DeleteObject",
    "s3:ListBucket"
  ],
  "Resource": [
    "arn:aws:s3:::example-dev-us-east-1-111111111111",
    "arn:aws:s3:::example-dev-us-east-1-111111111111/*"
  ]
}
So it would seem that I should be able to access the needed resources. I listed fake bucket names for security, but I have checked that they match S3 in the console.
I have also tried calling the HeadObject API like this:
$this->s3Client->headObject([
    'Bucket' => $fileBucket,
    'Key' => $key
]);
And I get the following message:
"Error executing \"HeadObject\" on \"https://example-dev-us-east-1-111111111111/54bf7350-a819-41dd-9da3-3710eb8b095a-Screenshot_2021-03-09_20-15-12.png\"; AWS HTTP error: Client error: `HEAD https://example-dev-us-east-1-111111111111/54bf7350-a819-41dd-9da3-3710eb8b095a-Screenshot_2021-03-09_20-15-12.png` resulted in a `403 Forbidden` response (client): 403 Forbidden (Request-ID: XR6W77BB817407K2) - ",
"url": "/files.json"
Not sure if that helps.
Thanks!
You need the s3:GetObject permission to invoke the HeadObject API, which is what the PHP SDK invokes when your code calls doesObjectExist().
If the object you request does not exist, the error Amazon S3 returns depends on whether you also have the s3:ListBucket permission. If you have the s3:ListBucket permission on the bucket, Amazon S3 returns an HTTP status code 404 ("no such key") error. If you don't have the s3:ListBucket permission, Amazon S3 returns an HTTP status code 403 ("access denied") error.
So, you probably have s3:ListBucket but not s3:GetObject.
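As a sketch of the alternative the question asks about (assuming the Aws\S3\S3Client from the original code; objectExists is a hypothetical helper name), you can call headObject directly and inspect the exception's status code, so a 403 surfaces as a permission problem instead of being silently treated as "not found":

```php
<?php

use Aws\S3\Exception\S3Exception;

// Hypothetical helper: returns true if the object exists, false if S3
// answered 404 (no such key), and rethrows anything else (e.g. a 403),
// so permission errors are not mistaken for a missing key.
function objectExists(\Aws\S3\S3Client $s3Client, string $bucket, string $key): bool
{
    try {
        $s3Client->headObject([
            'Bucket' => $bucket,
            'Key'    => $key,
        ]);
        return true;
    } catch (S3Exception $e) {
        if ($e->getStatusCode() === 404) {
            return false;
        }
        throw $e; // 403 here means a permissions problem, not a missing object
    }
}
```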
I am trying to check if a bucket exists in Laravel PHP.
I am getting a 403 on the exists() method. Why?
See line 160 https://github.com/googleapis/google-cloud-php/blob/master/Storage/src/Bucket.php
$storageClient = new \Google\Cloud\Storage\StorageClient([
    'projectId' => env('GCS_PROJECT_ID'),
    'keyFilePath' => storage_path(env('GCS_KEY_FILE')),
]);

$bucket = $storageClient->bucket('mybucketname');

if (!$bucket->exists()) {
    $bucket = $storageClient->createBucket('mybucketname');
}
{
  "error": {
    "code": 403,
    "message": "myaccount@api-project-xxxxxxxxx.iam.gserviceaccount.com does not have storage.buckets.get access to downloads.",
    "errors": [
      {
        "message": "myaccount@api-project-xxxxxxxx.iam.gserviceaccount.com does not have storage.buckets.get access to mybucketname.",
        "domain": "global",
        "reason": "forbidden"
      }
    ]
  }
}
Your service account does not have the storage.buckets.get permission. To check whether a bucket exists and then create it, assign roles/storage.admin to your service account.
For reference, see:
Cloud IAM roles for Cloud Storage
Cloud IAM permissions for Cloud Storage
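The role assignment described above can be done from the CLI; as a sketch (the project ID and service-account email are placeholders taken from the redacted error message):

```shell
# Grant roles/storage.admin so the service account can call
# storage.buckets.get and create buckets; adjust both placeholders.
gcloud projects add-iam-policy-binding api-project-xxxxxxxxx \
  --member="serviceAccount:myaccount@api-project-xxxxxxxxx.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```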
I've come across an issue using the googleapis/google-api-php-client library, specifically the Dataflow Service that I cannot solve.
When I try to use the library I set up the request like so:
$this->client = new \Google_Client();
$this->client->setAuthConfig(config_path('google-service-account.json'));
$this->client->setIncludeGrantedScopes(true);
$this->client->addScope(\Google_Service_Dataflow::CLOUD_PLATFORM);

$body = [
    "gcsPath" => "gs://{$this->bucket}/{$this->template}",
    "location" => "us-central1",
];

$parameters = new \Google_Service_Dataflow_LaunchTemplateParameters;
$parameters->setJobName($this->jobname);
$parameters->setParameters($body);

$service = new \Google_Service_Dataflow($this->client);
$request = $service->projects_templates->launch($this->project, $parameters);
And I get the following error:
{
  "error": {
    "code": 400,
    "message": "(11f8b78933fc59c3): Bad file name: , expected 'gs://\u003cbucket\u003e/\u003cpath\u003e'",
    "errors": [
      {
        "message": "(11f8b78933fc59c3): Bad file name: , expected 'gs://\u003cbucket\u003e/\u003cpath\u003e'",
        "domain": "global",
        "reason": "badRequest"
      }
    ],
    "status": "INVALID_ARGUMENT"
  }
}
It seems the path is getting corrupted along the way. I've checked, and it is fine until the Guzzle object is instantiated to send the request inside the library.
I'm pretty lost at this point so any suggestion or clue is welcome.
Thank you in advance.
No gcsPath is given in the query params of the request constructed by the SDK. This is because gcsPath has been set as an option on Google_Service_Dataflow_LaunchTemplateParameters, while the documentation specifies that request query parameters must be passed as optional params (see https://github.com/googleapis/google-api-php-client-services/blob/v0.81/src/Google/Service/Dataflow/Resource/ProjectsTemplates.php#L73).
$opt_params = [
    "gcsPath" => "gs://{$this->bucket}/{$this->template}",
    "location" => "us-central1",
];
$template_params = [
    // Keep template params here.
];
$launch_params = new \Google_Service_Dataflow_LaunchTemplateParameters;
$launch_params->setJobName($this->jobname);
$launch_params->setParameters($template_params);
$service = new \Google_Service_Dataflow($this->client);
$request = $service->projects_templates->launch($this->project, $launch_params, $opt_params);
I am trying to build a very simplified piece of code that is to simply upload a local file from server to my personal drive account. Right now I am struggling with authentication issues, and not sure what kind of credential type I need.
Importantly, since this is only accessing my own private drive, I want to provide my credentials once and not have it ask again in the future. I am not trying to use OAuth to access my users' drives at all. It seems most documentation covers authenticating and accessing OTHER users' drives.
I think I can manage the upload code once I can authenticate, but right now I can't even list my existing files.
Here is my code so far:
<?php

require_once 'google-api-php-client/vendor/autoload.php';

/**
 * Returns an authorized API client.
 * @return Google_Client the authorized client object
 */
function getClient($credentialsPath = '')
{
    $client = new Google_Client();
    $client->setApplicationName('Google Drive API');
    $client->setAuthConfig($credentialsPath); // saved some credentials from console, but not sure which ones to use?
    $client->setScopes(Google_Service_Drive::DRIVE);
    return $client;
}
// Get the API client and construct the service object.
$client = getClient($credentialsPath);
$service = new Google_Service_Drive($client);

// Print the names and IDs for up to 10 files.
$optParams = array(
    'pageSize' => 10,
    'fields' => 'nextPageToken, files(id, name)'
);
$results = $service->files->listFiles($optParams);

if (count($results->getFiles()) == 0) {
    print "No files found.\n";
} else {
    print "Files:\n";
    foreach ($results->getFiles() as $file) {
        printf("%s (%s)\n", $file->getName(), $file->getId());
    }
}
When I run this code I get the following error code:
Fatal error: Uncaught exception 'Google_Service_Exception' with message '{ "error": { "errors": [ { "domain": "usageLimits", "reason": "dailyLimitExceededUnreg", "message": "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup.", "extendedHelp": "https://code.google.com/apis/console" } ], "code": 403, "message": "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup." }
The few things I looked at suggest this error is not really about a daily limit, but about incorrect usage of credentials. Any idea how I can fix this?