How to transfer a single file to Google Cloud Storage through a script - PHP

So I'll preface this by stating that my boss is asking me to set up a Google Cloud Storage backup process without my having the Google bucket instance or the .json key file for Google Cloud Platform. Basically, I'm writing a script to batch the process across our Linux and Windows Google Cloud Platform instances: take the backup file that is stored daily, send it to Google Cloud Storage, then delete the file from the system (so that the backup file doesn't take up space on the instance). Since all of this is relatively new to me, I was hoping to get some advice on whether the key (.json private key) is needed for this process and how to include it; obviously, I don't have the key yet, but I want to add a placeholder until we set that up.
I found Google's documentation pretty helpful but a tad overwhelming. Here is the link to what I have used for reference: https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-code-sample
Here is my code:
<?php
require_once 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Set up the storage client - is the keyFilePath or .json key necessary?
// Set the parameters for the Google Cloud bucket name, file names, and paths to the files.
$bucketName = "fishbowl-storage";
$fileNameA = "backup_filename";
$fileNameB = "quickbook_filename";
$dataBackupSource = "";
$dataQBBackupSource = "";

// The function that will upload a file to Google Cloud Storage.
function upload_object($bucketName, $fileName, $source)
{
    // Create the storage client and store the file in the cloud.
    $storage = new StorageClient();
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->upload($file, [
        'name' => $fileName
    ]);
    // Print the success message to the user.
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $fileName);
}

// Check the OS to determine whether this is Linux or Windows.
if (php_uname('s') == "Linux") {
    // Set the file paths of the backups to grab.
    $dataBackupSource = "../Fishbowl/backup";
    $dataQBBackupSource = "../Fishbowl/otherbackup";
    // Use a try/catch statement to catch issues with the private key.
    try {
        // Run the function to place both backup files in the Google Cloud Storage bucket.
        upload_object($bucketName, $fileNameA, $dataBackupSource);
        upload_object($bucketName, $fileNameB, $dataQBBackupSource);
        // Delete the old file stored at the source path (not just the object name).
        if (!unlink($dataBackupSource)) {
            print("The file cannot be deleted or isn't present in this directory...");
        } else {
            print("The file " . $dataBackupSource . " has been deleted.");
        }
    } catch (Exception $e) {
        print "This process failed for the following reason: " . $e->getMessage();
        return false;
    }
} else if (php_uname('s') == "Windows NT") {
    // Set the file paths of the backups to grab.
    $dataBackupSource = "../Fishbowl/backup";
    $dataQBBackupSource = "../Fishbowl/otherbackup";
    // Use a try/catch statement to catch issues with the private key.
    try {
        // Run the function to place both backup files in the Google Cloud Storage bucket.
        upload_object($bucketName, $fileNameA, $dataBackupSource);
        upload_object($bucketName, $fileNameB, $dataQBBackupSource);
        // Delete the old file stored at the source path (not just the object name).
        if (!unlink($dataBackupSource)) {
            print("The file cannot be deleted or isn't present in this directory...");
        } else {
            print("The file " . $dataBackupSource . " has been deleted.");
        }
    } catch (Exception $e) {
        print "This process failed for the following reason: " . $e->getMessage();
        return false;
    }
} else {
    print "The operating system has another name...";
}
?>
TL;DR: Do I need the JSON key to be included in my script to pass a file to Google Cloud Storage and, if so, where do I put it?

I think that this is the documentation you are looking for. Basically, you need to use the Cloud Client Libraries for the Google Cloud Storage API. These are the steps you need to follow:
1. Install the client library:
composer require google/cloud-storage
2. Set up authentication. In this step you will create the service account and the key.
After setting the environment variable, you don't need to explicitly specify your credentials in code when using a Google Cloud client library, as is mentioned here.
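To answer the asker's placeholder question directly: the StorageClient constructor also accepts a keyFilePath option, and when you omit it the library falls back to Application Default Credentials (the GOOGLE_APPLICATION_CREDENTIALS environment variable). A minimal sketch of a helper that keeps the placeholder harmless until the key file exists - the helper name and the key path are mine, not from the docs:

```php
<?php
// Sketch: decide which credential options to hand to StorageClient.
// '/path/to/service-account.json' is a placeholder until the key is created.
function storage_client_options(?string $keyFilePath): array
{
    // Pass the key explicitly when we actually have it; otherwise return no
    // options so the library falls back to Application Default Credentials
    // (the GOOGLE_APPLICATION_CREDENTIALS environment variable).
    if ($keyFilePath !== null && is_readable($keyFilePath)) {
        return ['keyFilePath' => $keyFilePath];
    }
    return [];
}

// Usage (assumes google/cloud-storage is installed):
// $storage = new StorageClient(storage_client_options('/path/to/service-account.json'));
```

With this shape, the script runs unchanged once the .json key is dropped into place.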
3. Use the client library. In this case you want to upload a file:
use Google\Cloud\Storage\StorageClient;

/**
 * Upload a file.
 *
 * @param string $bucketName the name of your Google Cloud bucket.
 * @param string $objectName the name of the object.
 * @param string $source the path to the file to upload.
 *
 * @return Psr\Http\Message\StreamInterface
 */
function upload_object($bucketName, $objectName, $source)
{
    $storage = new StorageClient();
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}


How to zip & download folders and files stored in S3 storage in Laravel [duplicate]

Is there a way to zip and download files and folders which are in an Amazon S3 bucket, together, in Laravel? I want to zip the three folders and one file in the picture together and download it.
Here's a half-baked solution in a route file. Hope it helps.
https://flysystem.thephpleague.com/docs/adapter/zip-archive/
composer require league/flysystem-ziparchive
I put this in routes/web.php just to play with.
<?php
use Illuminate\Support\Facades\Storage;
use League\Flysystem\Filesystem;
use League\Flysystem\ZipArchive\ZipArchiveAdapter;

Route::get('zip', function () {
    // See Laravel's config/filesystem.php for the source disk.
    $source_disk = 's3';
    $source_path = '';
    $file_names = Storage::disk($source_disk)->files($source_path);
    $zip = new Filesystem(new ZipArchiveAdapter(public_path('archive.zip')));
    foreach ($file_names as $file_name) {
        $file_content = Storage::disk($source_disk)->get($file_name);
        $zip->put($file_name, $file_content);
    }
    $zip->getAdapter()->getArchive()->close();
    return redirect('archive.zip');
});
You'll definitely want to do something different than just plopping it in the public dir. Maybe stream it straight out as a download, or save it somewhere better. Feel free to post comments/questions and we can discuss.
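If you go the temp-file route instead of the public dir, plain PHP's built-in ZipArchive class can build the archive in a temporary location; the commented Laravel line at the end is the assumed hand-off (deleteFileAfterSend removes the file once the response is sent). The file contents here are a stand-in for the S3 loop above:

```php
<?php
// Sketch: build the zip in a temp file rather than public/.
$tmpPath = tempnam(sys_get_temp_dir(), 'zip');

$zip = new ZipArchive();
$zip->open($tmpPath, ZipArchive::OVERWRITE);
// In the real route, loop over the S3 file contents here instead.
$zip->addFromString('hello.txt', 'contents fetched from S3');
$zip->close();

// In Laravel, hand the file to the client and delete it afterwards:
// return response()->download($tmpPath, 'archive.zip')->deleteFileAfterSend(true);
```

This avoids leaving archive.zip world-readable in the public directory between requests.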
I did it the following way, after looking at some solutions, by streaming the zip directly to the client using https://github.com/maennchen/ZipStream-PHP :
use Illuminate\Support\Facades\Storage;
use ZipStream\Option\Archive as ArchiveOptions;
use ZipStream\ZipStream;

if ($uploads) {
    return response()->streamDownload(function () use ($uploads) {
        $opt = new ArchiveOptions();
        $opt->setContentType('application/octet-stream');
        $zip = new ZipStream("uploads.zip", $opt);
        foreach ($uploads as $upload) {
            try {
                $file = Storage::readStream($upload->path);
                $zip->addFileFromStream($upload->filename, $file);
            } catch (Exception $e) {
                \Log::error("unable to read the file at storage path: $upload->path and output to zip stream. Exception is " . $e->getMessage());
            }
        }
        $zip->finish();
    }, 'uploads.zip');
}

Google Speech to Text api client return without exception but no actual results

I am using sample code from Google's site and it throws no exceptions but returns no results.
If I use the API Explorer, the same data works just fine. I have tried different files (all from Google's sample code) and different settings, all of which give me the same result: nothing.
use Google\Cloud\Speech\V1\RecognitionAudio;
use Google\Cloud\Speech\V1\RecognitionConfig;
use Google\Cloud\Speech\V1\RecognitionConfig\AudioEncoding;
use Google\Cloud\Speech\V1\SpeechClient;

function transcribe_sync($content)
{
    // Set the string as the audio content.
    $audio = (new RecognitionAudio())
        ->setContent($content);

    // Set the config.
    $encoding = AudioEncoding::LINEAR16;
    $sampleRateHertz = 32000;
    $languageCode = 'en-US';
    $config = (new RecognitionConfig())
        ->setEncoding($encoding)
        ->setSampleRateHertz($sampleRateHertz)
        ->setAudioChannelCount(1)
        ->setMaxAlternatives(1)
        ->setLanguageCode($languageCode);

    // Create the speech client.
    $client = new SpeechClient();
    try {
        $response = $client->recognize($config, $audio);
        echo $response->getResults();
    } catch (\Exception $e) {
        $this->handleError('Error determining recognition. ' . $e->getMessage());
    } finally {
        $client->close();
    }
}
My resolution to this issue was the way I was passing the file (I don't think the file was being populated correctly, or at all). It was weird that I did not get an error. Because of the length of my audio files, I ended up integrating Google Cloud Storage to upload the file () and used:
$audio = (new RecognitionAudio())->setUri("gs://...");
... longRunningRecognize($config, $audio);
Hope this helps someone.
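For reference, the long-running flow the answer alludes to looks roughly like the commented lines below with google/cloud-speech (the bucket URI is a placeholder). Long-running recognition expects the audio to already live in a bucket, so a tiny guard for the gs:// form - a helper of my own naming - is included:

```php
<?php
// Hedged sketch of the longRunningRecognize flow; the google/cloud-speech
// package is assumed and the gs:// URI is a placeholder.
function is_gcs_uri(string $uri): bool
{
    // A usable URI must start with gs:// and name at least a bucket.
    return strncmp($uri, 'gs://', 5) === 0 && strlen($uri) > 5;
}

// $audio = (new RecognitionAudio())->setUri('gs://my-bucket/audio.raw');
// $operation = $client->longRunningRecognize($config, $audio);
// $operation->pollUntilComplete();
// if ($operation->operationSucceeded()) {
//     foreach ($operation->getResult()->getResults() as $result) {
//         echo $result->getAlternatives()[0]->getTranscript(), PHP_EOL;
//     }
// }
```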

Dropbox account as Wordpress uploads folder

As I'm developing a site with the potential of holding a huge number of images, I'm wondering if I can set the standard WordPress uploads folder to a Dropbox account.
And if so, how do I do that?
It would be great if I could implement it in a manner where even WordPress doesn't know it is a 'remote' folder. Media upload should work the same way as in a native WordPress setup.
I've read about the possibility of using another folder instead of wp-content/uploads, but I could not find any info about using Dropbox for this.
Yes, you can do it. As long as you keep the same structure on Dropbox and save the Dropbox shareable links as metadata for both the original file and the generated sizes, a simple yet fully working setup would be something like the following, using thephpleague/flysystem with their Dropbox adapter:
Step 1
Add a file called composer.json to the root of your theme with this content:
{
    "require": {
        "league/flysystem": "^1",
        "league/flysystem-dropbox": "^1"
    }
}
Step 2
Install Composer by following these instructions
Step 3
Using the command line on your terminal/console, go to your theme directory and run:
composer install -o
Step 4
Create a Dropbox app here.
I suggest that you select "App folder" as the type of access.
A directory matching the name of your app will be created in the "Apps" directory at the root of your Dropbox account; this will be your "uploads" directory on Dropbox.
Step 5
Go to your app admin page and generate a new access token.
Save the access token somewhere, and also copy the "App secret".
Step 6
Add the following to your functions.php:
use League\Flysystem\AdapterInterface;
use League\Flysystem\Adapter\Local as LocalAdapter;
use League\Flysystem\Dropbox\DropboxAdapter;
use League\Flysystem\Filesystem;
use League\Flysystem\MountManager;
use Dropbox\Client as DropboxClient;
// Autoload vendors
require_once __DIR__ .'/vendor/autoload.php';
/**
 * Class that will handle uploading to Dropbox
 */
class SO40950172Filesystem {

    /**
     * Contains several mounted filesystems
     *
     * @var League\Flysystem\MountManager object
     */
    protected $filesystem;

    /**
     * Contains the Dropbox client
     *
     * We need this accessible to create shareable links
     *
     * @var Dropbox\Client object
     */
    protected $dropbox_client;
    /**
     * Instantiates this class
     */
    public function __construct() {
        // Get WordPress uploads directory info
        $uploads_info = wp_upload_dir();

        // Create the local filesystem
        $local_adapter = new LocalAdapter($uploads_info['basedir']);
        $local_fs = new Filesystem($local_adapter, [
            'visibility' => AdapterInterface::VISIBILITY_PUBLIC
        ]);

        // Create the Dropbox filesystem
        $this->dropbox_client = new DropboxClient($app_access_token, $app_secret);
        $dropbox_adapter = new DropboxAdapter($this->dropbox_client, '/');
        $dropbox_fs = new Filesystem($dropbox_adapter, [
            'visibility' => AdapterInterface::VISIBILITY_PUBLIC
        ]);

        // Set the filesystem manager
        $this->filesystem = new MountManager([
            'local' => $local_fs,
            'dropbox' => $dropbox_fs
        ]);
    }
    /**
     * Uploads a file to Dropbox
     *
     * @param string $path Path to file
     * @return object Current object
     */
    public function uploadToDropbox($path)
    {
        // Normalize the path
        $path = static::getNormalizedPath($path);
        // Get the content from the local file
        $content = $this->read("local://$path");
        // Push the file to Dropbox
        $this->put("dropbox://$path", $content);
        return $this;
    }

    /**
     * Deletes a file from Dropbox
     *
     * @param string $path Path to file
     * @return object Current object
     */
    public function deleteFromDropbox($path)
    {
        // Normalize the path
        $path = static::getNormalizedPath($path);
        // Delete the file from Dropbox
        $this->delete("dropbox://$path");
        return $this;
    }

    /**
     * Returns the unique identifier path section of a Dropbox URL
     *
     * @param string $path Path to file
     * @return string Dropbox URL unique identifier
     */
    public function getDropboxUrlID($path)
    {
        // Normalize the path
        $path = static::getNormalizedPath($path);
        // Get the unique link
        $url = $this->dropbox_client->createShareableLink("/$path");
        // Parse the URL to retrieve its path
        $url_info = parse_url($url);
        $url_path = $url_info['path'];
        // Remove the "s/" section and the file name from the URL path
        $id = str_replace(['s/', basename($path)], '', $url_path);
        // Return the Dropbox unique identifier for this file URL
        return trim($id, '/');
    }

    /**
     * Returns clean & relative paths
     *
     * @param string $path Raw path
     * @return string Parsed path
     */
    public static function getNormalizedPath($path)
    {
        // Get WordPress uploads directory info
        $uploads_info = wp_upload_dir();
        // Remove the uploads base path so that we end up
        // with the "/YYYY/MM/filename.extension" format
        $path = str_replace($uploads_info['basedir'], '', $path);
        // Remove the uploads base URL so that we end up
        // with the "/YYYY/MM/filename.extension" format
        $path = str_replace($uploads_info['baseurl'], '', $path);
        // Remove forward slashes on both ends
        $path = trim($path, '/');
        // Return the path
        return $path;
    }
    /**
     * Making sure all calls go to $this->filesystem
     *
     * @param string $name Method name
     * @param array $args Method arguments
     * @return mixed
     */
    public function __call($name, array $args)
    {
        // Note the negation: only throw when the method does NOT exist.
        if (!method_exists($this->filesystem, $name))
            throw new \Exception("\League\Flysystem\MountManager doesn't have \"$name\" method");
        return call_user_func_array([$this->filesystem, $name], $args);
    }
}
// Manipulate media URLs sitewide
add_filter('wp_get_attachment_url', 'so_40950172_get_dropbox_url', 9, 2);
function so_40950172_get_dropbox_url($absolute_url, $post_id) {
    // Get the normalized path
    $path = SO40950172Filesystem::getNormalizedPath($absolute_url);
    // Get only the filename
    $path = basename($path);
    // Get the Dropbox URL unique ID
    $id = get_post_meta($post_id, 'dropbox_id_' . $path, true);
    // Return the absolute URL
    return $id ? "https://dl.dropboxusercontent.com/s/$id/$path/?dl=0" : $path;
}

// Upload new and updated files to Dropbox
add_filter('wp_update_attachment_metadata', 'so_40950172_upload_to_dropbox', 9, 2);
function so_40950172_upload_to_dropbox($data, $post_id) {
    // Get the filesystem
    $fs = new SO40950172Filesystem();
    // Upload the original file to Dropbox
    $fs->uploadToDropbox($data['file']);
    // Add the Dropbox URL unique ID to the meta data
    add_post_meta($post_id, 'dropbox_id_' . basename($data['file']), $fs->getDropboxUrlID($data['file']));
    // Upload intermediate image sizes
    if (isset($data['sizes']) && $data['sizes']) {
        // Get the year and month prefix (e.g. /YYYY/MM) from the original file
        $base_path = dirname($data['file']);
        // Loop through all sizes
        foreach ($data['sizes'] as $size_name => $size_data) {
            // Set the path for the current size
            $size_path = $base_path . '/' . $size_data['file'];
            // Upload the size to Dropbox
            $fs->uploadToDropbox($size_path);
            // Add the Dropbox URL unique ID to the meta data
            add_post_meta($post_id, 'dropbox_id_' . basename($size_path), $fs->getDropboxUrlID($size_path));
        }
    }
    return $data;
}

// Delete the Dropbox file
add_filter('wp_delete_file', 'so_40950172_delete_dropbox_file');
function so_40950172_delete_dropbox_file($absolute_path) {
    // Get the filesystem
    $fs = new SO40950172Filesystem();
    // Delete the file from Dropbox
    $fs->deleteFromDropbox($absolute_path);
    // wp_delete_file is a filter, so return the path for the local deletion.
    return $absolute_path;
}
Step 7
In the code you just pasted into functions.php:
Replace $app_access_token with the Dropbox app access token you generated
Replace $app_secret with the Dropbox app secret
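Note that $app_access_token and $app_secret are undefined inside the constructor as written; one way to wire them in, with constant names that are placeholders of my choosing, is via constants defined in wp-config.php:

```php
<?php
// Hypothetical wiring for the credentials (constant names and values are
// placeholders; define the real ones in wp-config.php).
if (!defined('SO40950172_DROPBOX_TOKEN')) {
    define('SO40950172_DROPBOX_TOKEN', 'your-app-access-token');
}
if (!defined('SO40950172_DROPBOX_SECRET')) {
    define('SO40950172_DROPBOX_SECRET', 'your-app-secret');
}

// ...then, inside the constructor shown above:
// $this->dropbox_client = new DropboxClient(SO40950172_DROPBOX_TOKEN, SO40950172_DROPBOX_SECRET);
```

Keeping the credentials out of the theme file also keeps them out of version control if wp-config.php is excluded.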
NOTES
The original file and generated sizes will also be saved locally, but you don't need to worry about them. You can even delete the local files after confirmation of a successful upload, if you want to and/or care about disk space.
I also tested the built-in image editor and it worked without any issues.
If you ever need to move things on the Dropbox side, since there is no info saved in the database (this is just fine), all you need to do is update the functions above.
Apparently you can mirror the WordPress structure on Dropbox, but you can't simply combine a base URL with the WordPress uploads structure to get the URLs; you really need to get the shareable link for each original file and generated size, and store something about them as metadata. In the code above I chose to store only the unique part of the URL as metadata, which really is the only unique thing about them.
I know this is off-topic, but I would recommend either AWS S3 or Google Cloud Storage because you can access your files with the exact same file structure you have on Dropbox. No need to save anything as meta data.
That's an interesting idea, but WordPress uses DB relations to manage the uploaded images, so I think you must handle the images through the media uploader. TL;DR: you can't.

File not visible in Drive after API upload and inserting permissions using php

I'm crowbarring the Google API into a Laravel app.
The file appears to upload correctly but remains invisible in the Drive GUI.
Many searches later, I've got this code to set the permissions of the newly created file:
$id = $createdFileId; // the ID of the created file
$value = 'my@email.address.com';
$type = "anyone";
$role = "writer";
$drive->insertPermission($id, $value, $type, $role);

function insertPermission($fileId, $value, $type, $role) {
    $service = $this->service;
    $newPermission = new \Google_Service_Drive_Permission();
    $newPermission->setType($type);
    $newPermission->setRole($role);
    try {
        return $service->permissions->create($fileId, $newPermission);
    } catch (\Exception $e) {
        print "An error occurred: " . $e->getMessage();
    }
    return NULL;
}
This seems to work. No errors are thrown. But I still can't see the file when I search in my drive. Any ideas?
As pushpendra pointed out, I needed to change
$type = "anyone"; to $type = "user";
and
$role = "writer"; to $role = "reader";
Also, in the end I solved this by creating a folder in the Drive UI and sharing that folder with the service account. The service account is able to create/update files in that shared folder, and I get a notification when it does so.

Google Drive PHP API returns deleted files

I'm trying to sync files between Amazon S3 and Google Drive with a service.
I created a service account and shared a folder with its email.
Then with the automated service I built, I created some folders.
Everything was fine until I tried to delete these folders from Google Drive UI. They disappeared from the UI, but my service still receives them as if they were present.
Here is the code I'm using to list the files/folders:
private function retrieveAllGoogleDriveFiles($service) {
    $result = array();
    $pageToken = NULL;
    do {
        try {
            $parameters = array();
            if ($pageToken) {
                $parameters['pageToken'] = $pageToken;
            }
            $parameters['q'] = "trashed=false";
            $files = $service->files->listFiles($parameters);
            $result = array_merge($result, $files->getItems());
            $pageToken = $files->getNextPageToken();
        } catch (Exception $e) {
            print "An error occurred: " . $e->getMessage();
            $pageToken = NULL;
        }
    } while ($pageToken);
    return $result;
}
Here is the authorization mechanism I'm using:
$credentials = new Google_Auth_AssertionCredentials(
    '1234567890-somehashvalueshere123123@developer.gserviceaccount.com',
    array('https://www.googleapis.com/auth/drive'),
    $private_key,
    'notasecret',
    'http://oauth.net/grant_type/jwt/1.0/bearer'
);
$googleClient = new Google_Client();
$googleClient->setAssertionCredentials($credentials);
if ($googleClient->getAuth()->isAccessTokenExpired()) {
    $googleClient->getAuth()->refreshTokenWithAssertion();
}
$googleService = new Google_Service_Drive($googleClient);
$files_list = $this->retrieveAllGoogleDriveFiles($googleService);
The case where a user deletes a file/folder from Google Drive is a real one for me, and I should sync that back to S3 and remove the file/folder from there as well.
Thanks in advance.
When a folder is shared among multiple users, it is the owner who has the ability to delete the folder, not the other users.
Essentially, what is happening is that by hitting 'remove', the user is removing it from their Drive (unsubscribing); they are not deleting the folder. This is why it still appears in your service account's file list.
One possible solution is to change the owner so that a non-service account owns the file. In that case, remove really will be a trash/delete sort of operation. However, without understanding your use case/underlying problem, it is hard to throw out solutions. Some of these things may be easier if the users are enterprise/work accounts rather than private/personal accounts.
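If you need to distinguish files the service account itself owns from files merely shared with it, the files.list q parameter can express that with "'me' in owners". A sketch of building the query string - the helper name is mine:

```php
<?php
// Sketch: build the files.list query string. "'me' in owners" restricts the
// listing to files the authenticated (service) account itself owns.
function drive_list_query(bool $ownedOnly): string
{
    $q = 'trashed=false';
    if ($ownedOnly) {
        $q .= " and 'me' in owners";
    }
    return $q;
}

// In retrieveAllGoogleDriveFiles():
// $parameters['q'] = drive_list_query(true);
```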
