I've been trying to set permissions on a file that I upload through code to a shared drive via a service account, so that the file is publicly visible. Unfortunately, nothing I've tried so far has worked. Uploading the file to the shared drive works correctly, but making it publicly accessible does not. I create my permission as shown below. Note that the call is deferred.
$permissions = new Google_Service_Drive_Permission([
    'type' => 'domain',
    'domain' => '[DOMAIN]',
    'role' => 'reader',
    'allowFileDiscovery' => false,
    'supportsAllDrives' => true,
    'transferOwnership' => true,
    'fields' => '*'
]);
$service->permissions->create($status['id'], $permissions);
$status['id'] is returned after all the file chunks are uploaded, and I've printed it to confirm that it is the correct file ID. The service account uploads the file, so it should have permission to update these permissions, but that doesn't seem to be the case. If it matters, the service account is a Contributor on the shared drive. Any guidance would be appreciated, and I can update my post with more of my code if needed.
It turns out I was putting the parameters in the wrong spot: supportsAllDrives is a request parameter, not a field of the Permission resource, so it has to be passed separately. To get what I wanted, I used this:
$params = [
    'supportsAllDrives' => true
];
$service->permissions->create($status['id'], new Google_Service_Drive_Permission([
    'role' => 'reader',
    'type' => 'domain',
    'domain' => '[DOMAIN]',
    'allowFileDiscovery' => false
]), $params);
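For what it's worth, since the stated goal was public visibility: a 'domain' permission only shares the file within one Workspace domain, while visibility to everyone would use type 'anyone'. Here is a hedged sketch of that variant ($service and $fileId are placeholder names, and the commented-out API call is left to the reader's setup):

```php
<?php
// Resource fields only: these describe the permission itself.
// 'anyone' (instead of 'domain') makes the file publicly visible.
$permissionBody = [
    'type' => 'anyone',
    'role' => 'reader',
    'allowFileDiscovery' => false,
];

// Request-level options: passed as query parameters, never in the body.
$requestParams = [
    'supportsAllDrives' => true,
];

// $service->permissions->create($fileId,
//     new Google_Service_Drive_Permission($permissionBody), $requestParams);
```

The key point is the separation: nothing request-level (supportsAllDrives, fields, transferOwnership) belongs inside the Permission body.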
I'm working on a Laravel 9 project where I store images in an Azure storage account.
I use this library: https://github.com/matthewbdaly/laravel-azure-storage.
Connecting with a SAS token works, but I get AuthorizationPermissionMismatch when I try to read from the container:
Route::get('/azure-test', function () {
    $path = '';
    $disk = \Storage::disk('azure');
    $files = $disk->files($path);
    dump($files);
    exit;
});
My configuration:
'driver' => 'azure',
'sasToken' => env('AZURE_STORAGE_SAS_TOKEN'),
'container' => env('AZURE_STORAGE_CONTAINER'),
'url' => env('AZURE_STORAGE_URL'),
'prefix' => null,
'endpoint' => env('AZURE_STORAGE_ENDPOINT'),
'retry' => [
    'tries' => 3,
    'interval' => 500,
    'increase' => 'exponential'
],
Just to be clear, the file exists; when I test without the SAS token configuration, it displays information about my test file. I have already searched and made some changes, such as assigning my account the "Storage Blob Data Contributor" and "Storage Queue Data Contributor" roles. My SAS token also still works when I view the file directly: "https://xxxxxxxx.blob.core.windows.net/container_name/im_test_file.pdf?sp=r&st=2022-06-24T15:32:22Z&se=2024-04-30T23:32:22Z&sv=2021-06-08&sr=c&sig=QSz6SZ6UrSMg0jqyKEr4bnnGqrMuxK2EIbGgTTbP%2F10%3D".
Any ideas?
As far as I know, to resolve the AuthorizationPermissionMismatch error, try assigning the Storage Blob Data Reader role to your account.
Please note that Azure role assignments can take up to five minutes to propagate.
Also check whether you granted the necessary permissions (such as Read and List) while creating the SAS token.
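One concrete thing to check: the sp field of the SAS token shown in the question is sp=r, i.e. Read only, while $disk->files($path) performs a List operation, which also needs the l permission. A quick sketch of inspecting a token's granted permissions (the token string below is a trimmed stand-in, not the real one):

```php
<?php
// Parse the SAS token's query parameters; 'sp' lists the granted
// permissions as single letters (r = Read, l = List, w = Write, ...).
$sasToken = 'sp=r&st=2022-06-24T15:32:22Z&se=2024-04-30T23:32:22Z&sr=c';
parse_str($sasToken, $parts);

$canRead = strpos($parts['sp'], 'r') !== false;
$canList = strpos($parts['sp'], 'l') !== false;

// Reading a single blob works, but listing the container will be
// rejected until the token is issued with List included (e.g. sp=rl).
var_dump($canRead, $canList);
```

If that is the cause, reissuing the SAS token with both Read and List selected should make the files() call succeed.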
For more detail, please refer to the links below:
Authorize access to blobs with AzCopy & Azure Active Directory | Microsoft Docs
Fixed – authorizationpermissionmismatch Azure Blob Storage – Nishant Rana's Weblog
I am trying to commit a file to GitLab through the API.
The code had been working for over six months but has now stopped, even though nothing in the code changed. The file is committed to GitLab, but it arrives corrupted.
I have gone through the Guzzle documentation and everything looks correct, and I have done the same with the GitLab documentation on commits.
I am now using Laravel's Illuminate\Http client to send the commits, but the same thing still happens: I can commit to GitLab, but the file is not formatted correctly.
$response = Http::withHeaders([
    'PRIVATE-TOKEN' => $gitlab_token,
])
->post('https://gitlab.com/api/v4/projects/.../repository/commits/', [
    'branch' => 'test-branch',
    'commit_message' => 'Updated audio file test.mp3',
    'actions' => [
        [
            'action' => 'update',
            'file_path' => 'filePath/../.mp3',
            'content' => base64_encode($var)
        ]
    ],
]);
If I do not Base64-encode the contents of the file, I get this error:
Malformed UTF-8 characters, possibly incorrectly encoded in file
Has anything changed on the API side that has affected how files are processed for committing? Has anyone else found a solution?
I think you have two problems. First, there is no content type specified: without one, the POST request will be sent as multipart/form-data (with a file attachment) or application/x-www-form-urlencoded, but this API expects JSON.
Second, the commit endpoint has a per-action encoding parameter. It defaults to "text", but you are sending a Base64-encoded file, so you need to set it to "base64".
I'm not familiar enough with the GitLab API to say for sure, but your file_path property looks strange as well; shouldn't it just be ".mp3" to be put into that directory?
Try this:
$response = Http::withHeaders([
    'PRIVATE-TOKEN' => $gitlab_token,
    'Content-Type' => 'application/json',
])
->post('https://gitlab.com/api/v4/projects/.../repository/commits/', [
    'branch' => 'test-branch',
    'commit_message' => 'Updated audio file test.mp3',
    'actions' => [
        [
            'action' => 'update',
            'file_path' => '.mp3',
            'content' => base64_encode($var),
            'encoding' => 'base64',
        ]
    ],
]);
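The "Malformed UTF-8" error in the question is consistent with this diagnosis: raw MP3 bytes are arbitrary binary, and PHP's json_encode() refuses any string that is not valid UTF-8. Base64-encoding first produces a pure-ASCII string that JSON can carry, which is why the matching encoding flag is needed so GitLab decodes it back. A small self-contained demonstration:

```php
<?php
// Stand-in for binary file contents (e.g. MP3 bytes): not valid UTF-8.
$binary = "\xFF\xFB\x90\x00";

// json_encode() fails outright on invalid UTF-8.
$raw = json_encode(['content' => $binary]);
var_dump($raw === false);
var_dump(json_last_error() === JSON_ERROR_UTF8);

// It succeeds once the payload is Base64 (pure ASCII).
$encoded = json_encode(['content' => base64_encode($binary)]);
var_dump($encoded !== false);

// The receiver can recover the exact original bytes.
var_dump(base64_decode(base64_encode($binary)) === $binary);
```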
Well, the problem is this: I created a local project that creates a product in WooCommerce, which is mounted in WordPress on a remote server. My local project code is the following:
<?php
require __DIR__ . '/vendor/autoload.php';

use Automattic\WooCommerce\Client;

function creaProd() {
    $precio = $_POST['Total'];
    $imagen = $_POST['Imagen'];
    $descrip = $_POST['Descripcion'];
    $tipo = $_POST['Tipo'];

    $woocommerce = new Client(
        'http://example.com',
        'ck_sdfsdfsdfsfdxxx',
        'cs_sdfsdfsfsdfaxxx',
        [
            'wp_api' => true,
            'version' => 'wc/v1',
        ]
    );

    $data = [
        'name' => $tipo,
        'type' => 'simple',
        'regular_price' => $precio,
        'description' => $descrip,
        'short_description' => $descrip,
        'categories' => [
            ['id' => 9],
            ['id' => 14]
        ],
        'images' => [
            [
                'src' => 'http://demo.woothemes.com/woocommerce/wp-content/uploads/sites/56/2013/06/T_2_front.jpg',
                'position' => 0
            ],
            [
                'src' => 'http://demo.woothemes.com/woocommerce/wp-content/uploads/sites/56/2013/06/T_2_front.jpg',
                'position' => 1
            ]
        ]
    ];

    print_r($woocommerce->post('products', $data));
}

creaProd();
Everything works fine locally; the problem is that, despite trying a bunch of things, I can't get the product creation working inside the WordPress project itself.
I put the file in the wp-includes folder and in wp-content, but that didn't work.
I tried calling it via AJAX at example.com/wp-includes/myFile.php, but I can't reach it. I can reach files that are already there, like example.com/wp-includes/option.php, but not the one I upload, and I don't know where to put the vendor folder either.
What is the right way to integrate this project into my real WordPress site?
I hope someone knows how to do this. Thanks.
I think the best way to integrate third-party libraries into WordPress is to create your own plugin. For me this is the best option, because you can use the rest of the WordPress API, including security features such as checking whether a user is logged in or has the right permissions. Plugins are simple to create, and they can be enabled through the WordPress admin dashboard.
Here is a post about it:
How to create a plugin - WordPress Documentation
In this article you can find how to write a WordPress plugin, from the official documentation.
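As a rough sketch of what such a plugin could look like, assuming the question's creaProd() function is moved into it; the plugin name, the crea_prod action name, and the payload helper are all made up for illustration. A file like this dropped into wp-content/plugins/ becomes activatable from the admin dashboard, and the AJAX hooks expose it at admin-ajax.php?action=crea_prod:

```php
<?php
/*
Plugin Name: Crea Producto (hypothetical example plugin)
Description: Creates a WooCommerce product from posted form data.
*/

// Pure helper: build the product payload from the posted fields,
// so the request-handling part stays easy to test.
function crea_prod_payload(array $input): array {
    return [
        'name'          => $input['Tipo'] ?? '',
        'type'          => 'simple',
        'regular_price' => $input['Total'] ?? '0',
        'description'   => $input['Descripcion'] ?? '',
    ];
}

// Register the AJAX endpoints only when running inside WordPress.
if (function_exists('add_action')) {
    add_action('wp_ajax_crea_prod', 'creaProd');        // logged-in users
    add_action('wp_ajax_nopriv_crea_prod', 'creaProd'); // anonymous visitors
}
```

The Composer vendor folder would then live inside the plugin's own directory, alongside this file, so the require of the autoloader keeps working.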
Your project is a REST client, which is usually run from a different server. It is not part of the WordPress server, and I would put its files in their own directory. It is not a plugin, and it is also not AJAX: no JavaScript is used in the REST client-to-server communication, although the client can of course be invoked by JavaScript.
Concerning your specific problem reaching files, it would be helpful if you provided the Network log from your browser. In Chrome: 'More tools' -> 'Developer tools' -> 'Network'. Look for the request for your file and see whether there is an error message.
I am using Laravel 5.0 with the Socialite extension to enable Twitter login. I have encountered a problem retrieving the user's Twitter profile picture.
The URL I receive from Twitter for the profile picture is in the following format:
http://pbs.twimg.com/profile_images/662983942727999489/q5I9DMyE_normal.png
This is saved to my DB and shown when the user logs into their account. The problem is that the image is served over HTTP, which produces browser warnings when users access their account, since not all of the page content is served over HTTPS.
Is there any way to save the Twitter profile picture with an HTTPS URL instead of HTTP?
$user = User::create([
    'provider_id' => $userData->id,
    'name' => $userData->name,
    'username' => $userData->nickname,
    'email' => $userData->email,
    'avatar' => $userData->avatar,
    'active' => 1,
]);
I save the user's Twitter data to my DB as shown above, and it is the $userData->avatar part that is saving the HTTP URL.
I can't seem to work around this and can't find much documentation on the issue. Any help would be appreciated.
Well, checking this URL, it turns out the same URL works over plain https://, so you can do:
$user = User::create([
    'provider_id' => $userData->id,
    'name' => $userData->name,
    'username' => $userData->nickname,
    'email' => $userData->email,
    'avatar' => str_replace('http://', 'https://', $userData->avatar),
    'active' => 1,
]);
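If you want to tidy this up, the replacement can be wrapped in a small helper; as a bonus, Twitter's _normal suffix selects the 48x48 thumbnail, so stripping it (a common Twitter URL convention, though not guaranteed by any contract) yields the full-size image. The function name here is my own invention:

```php
<?php
// Force https:// and optionally drop the '_normal' size suffix
// to get the full-size avatar instead of the 48x48 thumbnail.
function secureAvatarUrl(string $url, bool $fullSize = false): string
{
    $url = str_replace('http://', 'https://', $url);
    if ($fullSize) {
        $url = str_replace('_normal', '', $url);
    }
    return $url;
}

echo secureAvatarUrl('http://pbs.twimg.com/profile_images/662983942727999489/q5I9DMyE_normal.png');
```

You would then save 'avatar' => secureAvatarUrl($userData->avatar) in the User::create call.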
I have an AWS S3 account which contains 3 buckets. I need to be able to generate access credentials for a new user so that they can access the buckets and add/delete files (preferably only their own, but that's not a deal breaker).
I have managed to get as far as granting access to new users using IAM. However, when I read the metadata of uploaded objects (in PHP using the AWS SDK) the owner comes back as the main AWS account.
I've read pages of documentation but can't seem to find anything relating to determining who the owner (or uploader) of the file was.
Any advice or direction massively appreciated!
Thanks.
If your only problem is finding the owner of an uploaded file, you can pass the owner info as metadata on the uploaded object.
Check http://docs.amazonwebservices.com/AmazonS3/latest/dev/UsingMetadata.html
In PHP, while uploading:
// Instantiate the class.
$s3 = new AmazonS3();

$response = $s3->create_object(
    $bucket,
    $keyname2,
    array(
        'fileUpload' => $filePath,
        'acl' => AmazonS3::ACL_PUBLIC,
        'contentType' => 'text/plain',
        'storage' => AmazonS3::STORAGE_REDUCED,
        'headers' => array( // raw headers
            'Cache-Control' => 'max-age',
            'Content-Encoding' => 'gzip',
            'Content-Language' => 'en-US',
            'Expires' => 'Thu, 01 Dec 1994 16:00:00 GMT',
        ),
        'meta' => array(
            'uploadedBy' => 'user1',
        )
    )
);
print_r($response);
Check the PHP API docs for more info.
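When you later fetch the object, S3 hands user-defined metadata back as x-amz-meta-* response headers, with header names lowercased on the wire. A small SDK-agnostic sketch of pulling those out of a header array (the helper name and header shape are my assumptions, not part of any SDK):

```php
<?php
// Extract user-defined metadata from S3 response headers.
// S3 stores custom metadata under the 'x-amz-meta-' prefix and
// lowercases header names, so match case-insensitively.
function extractUserMetadata(array $headers): array
{
    $meta = [];
    foreach ($headers as $name => $value) {
        $key = strtolower($name);
        if (strpos($key, 'x-amz-meta-') === 0) {
            $meta[substr($key, strlen('x-amz-meta-'))] = $value;
        }
    }
    return $meta;
}
```

So after the upload above, a GET of the object should carry an x-amz-meta-uploadedby header with the value 'user1'.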