I am using laravel-google-cloud-storage to store images and retrieve them one by one. Is it possible to get all the folders and images from Google Cloud Storage? If so, how do I do it?
I also tried flysystem-google-cloud-storage to retrieve them, but it is similar to the first package I linked.
What I want to achieve is to browse Google Cloud Storage with all its folders and images, select an image there, and put it in my form, instead of selecting an image from my local machine.
UPDATE:
This is what I have tried so far, based on this documentation.
$storageClient = new StorageClient([
    'projectId' => 'project-id',
    'keyFilePath' => 'myKeyFile.json',
]);

$bucket = $storageClient->bucket('my-bucket');
$buckets = $storageClient->buckets();
Then I tried adding a foreach, which returns nothing, even though I have 6 folders in my bucket.
foreach ($buckets as $bucket) {
    dd($bucket->name());
}
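For reference, buckets() lists the buckets in the project (and needs project-level list permission), not the folders inside a bucket. A minimal sketch of listing the objects inside a single bucket with the same client, where 'some-folder/' is just a placeholder prefix:
use Google\Cloud\Storage\StorageClient;

$storageClient = new StorageClient([
    'projectId' => 'project-id',
    'keyFilePath' => 'myKeyFile.json',
]);

$bucket = $storageClient->bucket('my-bucket');

// List every object in the bucket; "folders" appear as prefixes in the object names.
foreach ($bucket->objects() as $object) {
    echo $object->name() . PHP_EOL;
}

// Or limit the listing to a single "folder" by passing a prefix.
foreach ($bucket->objects(['prefix' => 'some-folder/']) as $object) {
    echo $object->name() . PHP_EOL;
}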
It's been a week and my post hasn't been answered, so I'll just post and share what I did since then for anyone who needs it.
I am using Laravel 5.4 at the moment.
So I installed laravel-google-cloud-storage and flysystem-google-cloud-storage in my application.
I created a separate controller, since I am retrieving the images from Google Cloud Storage via Ajax.
First, get your Google Cloud Storage credentials, which can be found in the Google Cloud Console dashboard: look for the APIs card, click the "Go to APIs overview" link below it, then open Credentials. Download the credentials, which come as a JSON file, and put the file in your project root or wherever you prefer (I still don't know where this file should properly go). Next, get your Google Cloud Storage project ID, which is also shown on the dashboard.
Then this is the setup in my controller that connects my Laravel application to Google Cloud Storage, with which I am able to upload, retrieve, delete, and copy files.
use Google\Cloud\Storage\StorageClient;
use League\Flysystem\Filesystem;
use League\Flysystem\Plugin\GetWithMetadata;
use Superbalist\Flysystem\GoogleStorage\GoogleStorageAdapter;

class GoogleStorageController extends Controller
{
    // in my method
    $storageClient = new StorageClient([
        'projectId' => 'YOUR-PROJECT-ID',
        'keyFilePath' => '/path/of/your/keyfile.json',
    ]);

    // name of your bucket
    $bucket = $storageClient->bucket('your-bucket-name');

    $adapter = new GoogleStorageAdapter($storageClient, $bucket);
    $filesystem = new Filesystem($adapter);

    // this line here will retrieve all your folders and images
    $contents = $filesystem->listContents();

    // you can get a specific directory and the images inside it
    // by passing the directory name as a parameter
    $contents = $filesystem->listContents('directory-name');

    return response()->json([
        'contents' => $contents
    ]);
}
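If it helps, here is a rough sketch of how the result of listContents() could be filtered down to image files before being returned to the Ajax call; the 'images' directory name and the extension list are assumptions, not part of the original setup:
// Each entry looks roughly like ['type' => 'file' or 'dir', 'path' => '...', ...].
$contents = $filesystem->listContents('images', true); // second argument = recursive

$images = array_filter($contents, function ($entry) {
    return $entry['type'] === 'file'
        && in_array(pathinfo($entry['path'], PATHINFO_EXTENSION), ['jpg', 'jpeg', 'png', 'gif']);
});

return response()->json([
    'contents' => array_values($images)
]);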
I want to download files inside a Google Drive shared folder in which I have editor permission.
I am planning to implement that function as a cron job.
So I would like to ask: can I do that with just an HTTP request, or do I need to use the Google API library and write some code to achieve what I want?
I'm using Laravel PHP.
Can you tell me the workflow that I should use?
Thank you so much in advance. :D
The following is a sketch of the code I have in mind.
$response = Http::get('https://www.googleapis.com/drive/v3/files', [
    'key' => $myApiKey,
    'q' => "'{folder_id}' in parents and trashed = false",
    'supportsAllDrives' => true,
    'includeItemsFromAllDrives' => true,
]);

$files = $response->json()['files'];

foreach ($files as $file) {
    // Here I will download the file.
}
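For the download step, a rough sketch under the assumption that an API key is sufficient for these files (for private files an OAuth access token would be needed instead) and that Laravel's Storage facade is available; each file is fetched with the Drive API's alt=media parameter and written under a hypothetical drive-downloads/ directory:
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Storage;

foreach ($files as $file) {
    // Fetch the raw file contents via alt=media.
    $download = Http::get('https://www.googleapis.com/drive/v3/files/' . $file['id'], [
        'key' => $myApiKey,
        'alt' => 'media',
        'supportsAllDrives' => true,
    ]);

    // Save it locally, e.g. under storage/app/drive-downloads/ (path is just an example).
    Storage::put('drive-downloads/' . $file['name'], $download->body());
}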
I couldn't find a PHP tutorial out there for storing a file in the free bucket of a Google App Engine project and retrieving its contents. The idea of this post is to do it step by step, assuming the GAE project was correctly created.
1) If you created a GAE project, it has been granted a free 5 GB Google Cloud Storage bucket. Its name is "YOUR_PROJECT_ID.appspot.com"
https://console.cloud.google.com/storage/browser
2) A Service Account must be created and assigned in order to use the SDK.
Steps Here
3) This is the basic PHP code to store a file with "Hello World" as its content. This code can be executed from the terminal window.
<?php
$filename = "tutorial.txt"; // filename in the bucket
$txt_toSave = "Hello World"; // text content in the file
// let's add code here
?>
php GCStorage_save_example.php
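As a starting point, a minimal sketch of what could go in that placeholder, assuming the google/cloud-storage Composer package is installed and the service-account key from step 2 is saved as keyfile.json next to the script (both names are just examples):
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$filename = "tutorial.txt"; // filename in the bucket
$txt_toSave = "Hello World"; // text content in the file

$storage = new StorageClient([
    'keyFilePath' => __DIR__ . '/keyfile.json', // service account key from step 2
]);

// the free bucket granted to the GAE project
$bucket = $storage->bucket('YOUR_PROJECT_ID.appspot.com');

// upload the string as an object named tutorial.txt
$bucket->upload($txt_toSave, ['name' => $filename]);

echo "Saved gs://YOUR_PROJECT_ID.appspot.com/" . $filename . PHP_EOL;
?>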
4) This is the basic PHP code to retrieve a file's content from the bucket.
<?php
// let's add code here
echo $txt_fileContent;
?>
php GCStorage_retrieve_example.php
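And a matching sketch for the retrieve script, under the same assumptions:
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$filename = "tutorial.txt"; // filename in the bucket

$storage = new StorageClient([
    'keyFilePath' => __DIR__ . '/keyfile.json',
]);

$bucket = $storage->bucket('YOUR_PROJECT_ID.appspot.com');

// download the object's contents as a string
$txt_fileContent = $bucket->object($filename)->downloadAsString();

echo $txt_fileContent;
?>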
a) If permissions must be granted, feel free to add that step.
b) If any other step must be done, feel free to add it.
You can find the PHP code samples for Google Cloud Storage operations in the Official Google Cloud Storage documentation [1].
For instance, an example to store a file in Google Cloud Storage according to the documentation [2] would be:
use Google\Cloud\Storage\StorageClient;

/**
 * Upload a file.
 *
 * @param string $bucketName the name of your Google Cloud bucket.
 * @param string $objectName the name of the object.
 * @param string $source the path to the file to upload.
 *
 * @return Psr\Http\Message\StreamInterface
 */
function upload_object($bucketName, $objectName, $source)
{
    $storage = new StorageClient();
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}
You can follow the How-To [3] for retrieving an object from the bucket as well. Note that this is no different than storing/downloading objects from other buckets, as you can specify from which bucket to pull/push from.
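For completeness, a download sketch along the lines of that How-To; the function name mirrors the upload sample above, so treat it as an illustration rather than the exact documented snippet:
use Google\Cloud\Storage\StorageClient;

/**
 * Download an object to a local file.
 *
 * @param string $bucketName the name of your Google Cloud bucket.
 * @param string $objectName the name of the object.
 * @param string $destination the local path to save the object to.
 */
function download_object($bucketName, $objectName, $destination)
{
    $storage = new StorageClient();
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->object($objectName);
    $object->downloadToFile($destination);
    printf('Downloaded gs://%s/%s to %s' . PHP_EOL, $bucketName, $objectName, basename($destination));
}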
[1] https://cloud.google.com/storage/docs/how-to
[2] https://cloud.google.com/storage/docs/uploading-objects
[3] https://cloud.google.com/storage/docs/downloading-objects
I'm fairly new to writing in PHP and I've hit a snag while developing a Google Drive connection. I'm trying to create a small widget for a WordPress site wherein users on the site can upload and download files from their Google Drives.
These users are from a closed GSuite Domain and I use a Domain-Wide Service account in order to authenticate and connect to the user's drives.
I am able to connect, retrieve a list of files, and successfully "download" a file's contents, though this download does not actually save a file to the computer; it simply returns the contents.
References:
-Uploading: https://developers.google.com/drive/api/v3/manage-uploads
-Downloading: https://developers.google.com/drive/api/v3/manage-downloads
I need to place a download button on my page which will allow a user to select a file and download it to their computer without saving it on the server. Currently it just returns the content of the file, which I can store in a variable, but I do not know how to have a client request it and download it.
I need an upload button as well but I am working on a solution that I believe will work. I've read many different articles suggesting Ajax, Node, etc...
Ajax:
Calling a PHP function from an HTML form in the same file?
Upload UI:
https://www.htmlgoodies.com/beyond/cms/create-a-file-uploader-in-wordpress.html
In my client initialization file:
$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->addScope(Google_Service_Drive::DRIVE);
$client->setSubject($user_to_impersonate);
$driveService = new Google_Service_Drive($client);
return $driveService;
In the php script that loads the widget on the page:
$drive_connection = getUserGoogleDriveService();
$retrieved_files = array();
$filesList = $drive_connection->files->listFiles();

foreach ($filesList->files as $file) {
    array_push($retrieved_files, array(
        'filename' => $file->name,
        'fileid' => $file->id,
        'file' => $file
    ));
}

$get_specific_file = array_search('example_file_name',
    array_column($retrieved_files, 'filename'));
Edit: I added my code thus far. This code successfully lets me store and retrieve files from my array for further use. How would I tie this into a client-side UI to upload files to the Google Drive "create" function?
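On the download side, a hedged sketch of how the browser could be handed the file without anything being saved on the server: a small endpoint fetches the contents with alt=media and streams them back with a Content-Disposition header, and the download button simply points at that endpoint. The function name here is illustrative, not from the original code, and it assumes $driveService comes from getUserGoogleDriveService():
function stream_drive_file_to_browser($driveService, $fileId)
{
    // Fetch the metadata for the filename, then the raw contents.
    $metadata = $driveService->files->get($fileId);
    $response = $driveService->files->get($fileId, array('alt' => 'media'));
    $contents = $response->getBody()->getContents();

    // Send the bytes straight to the browser as an attachment.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $metadata->getName() . '"');
    header('Content-Length: ' . strlen($contents));
    echo $contents;
    exit;
}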
I am trying to store an image in an S3 bucket. I am using Laravel 5.5, I am new to this, and I am stuck here. What I am trying is:
My Controller:
public function imageUploadPost(Request $request)
{
    $this->validate($request, [
        'image' => 'required|image|mimes:jpeg,png,jpg,gif,svg|max:2048',
    ]);

    $imageName = time().'.'.$request->image->getClientOriginalExtension();
    $image = $request->file('image');

    $t = Storage::disk('s3')->put($imageName, file_get_contents($image), 'public');
    $imageName = Storage::disk('s3')->url($imageName);

    return back()
        ->with('success', 'Image Uploaded successfully.')
        ->with('path', $imageName);
}
My routes:
Route::post('s3-image-upload','S3ImageController#imageUploadPost');
My config/filesystems.php
's3' => [
    'driver' => 's3',
    'key' => env('AccessKeyID'),
    'secret' => env('SecretAccessKey'),
    'region' => env('region'),
    'bucket' => env('mybucket'),
],
And I am getting these values from my .env file, which looks like:
AccessKeyID=xyz
SecretAccessKey=xyz
region=us-east-2
mybucket=spikessales
Now when I upload a file and hit the upload button, it says:
Encountered a permanent redirect while requesting https://spikessales.s3.us-east-2.amazonaws.com/1519812331.jpg. Are you sure you are using the correct region for this bucket?
Here I am confused about how to set my region. I have created the bucket (spikessales), and I don't know what to use for the region; I am currently using the region that appears in the AWS console browser URL, which looks like:
https://s3.console.aws.amazon.com/s3/home?region=us-east-2
I am using the region at the end of this URL (us-east-2), as you can see in my .env file.
However, the region I selected when creating the bucket is US East (N. Virginia). Please tell me how to set the region correctly.
Any help will be highly appreciated!
In your AWS API call, set the region from your AWS S3 settings (it is shown right in the S3 bucket GUI), and do not pay attention to the region shown in the URL.
In my AWS S3 console, for example, the URL also shows region=us-east-2, although I set up the EU (Frankfurt) region in the AWS S3 settings.
To find your S3 bucket region follow these steps from this zappysys.com article
Open your AWS Console by visiting https://console.aws.amazon.com/
From the dashboard, click on the S3 option (or visit https://console.aws.amazon.com/s3/home)
You will see all your buckets in the list on the left side
Click on the desired S3 bucket name
Click on the Properties tab at the top
Now you will see the Region for the selected bucket, along with many other properties.
You can now change the region in your env file based on what you see here.
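In this case, the bucket was created in US East (N. Virginia), which corresponds to the region code us-east-1, so the .env values would presumably look like this (keeping the custom variable names used above):
AccessKeyID=xyz
SecretAccessKey=xyz
region=us-east-1
mybucket=spikessales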
I am in the process of creating a "Content Management System" for a start-up company. I have a Post.php model in my project; the following code snippet is taken from the create method:
if (Request::file('display_image') != null) {
    Storage::disk('s3')->put('/app/images/blog/'.$post->slug.'.jpg', file_get_contents(Request::file('display_image')));

    $bucket = Config::get('filesystems.disks.s3.bucket');
    $s3 = Storage::disk('s3');

    $command = $s3->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
        'Bucket' => Config::get('filesystems.disks.s3.bucket'),
        'Key' => '/app/images/blog/'.$post->slug.'.jpg',
        'ResponseContentDisposition' => 'attachment;'
    ]);

    $request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
    $image_url = (string) $request->getUri();
    $post->display_image = $image_url;
}
The above code checks if there is a "display_image" file input in the request object.
If it finds a file, it uploads it directly to AWS S3 storage. I want to save the link to the file in the database so I can use it later in my views.
Hence I use this piece of code:
$request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
$image_url = (string) $request->getUri();
$post->display_image = $image_url;
I get a URL; the only problem is that whenever I visit the $post->display_image URL I get a 403 Permission Denied. Obviously no authentication takes place when using the URL of the image.
How do I solve this? I need to be able to link all my images/files from Amazon S3 to the front-end interface of the website.
You could open up those S3 URLs to public viewing, but you probably wouldn't want to. You have to pay for the outgoing bandwidth every time someone views one of those images.
You might want to check out Glide, a pretty simple-to-use image library that supports S3. Make sure to reduce the load requirements on your server and wallet by setting caching headers on the images you serve.
Alternatively, you could use a CloudFront distribution as a caching proxy in front of your S3 bucket.
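Whichever route you take, note that the 403 in the question is ultimately caused by storing a presigned URL that expires after 5 minutes. One hedged alternative, assuming the S3 disk is configured as in the question: store only the object key on the post and sign a fresh temporary URL whenever the image is rendered (temporaryUrl() is available on Laravel's S3 driver in recent 5.x releases). A minimal sketch as a model accessor:
use Illuminate\Database\Eloquent\Model;
use Illuminate\Support\Facades\Storage;

class Post extends Model
{
    // ... rest of the model ...

    // Assumes display_image now stores the S3 object key
    // (e.g. "app/images/blog/my-post.jpg") rather than a full URL.
    public function getDisplayImageUrlAttribute()
    {
        return Storage::disk('s3')->temporaryUrl(
            $this->display_image,
            now()->addMinutes(30)
        );
    }
}
In a Blade view you would then use $post->display_image_url, and every page load gets a URL that is valid for the next 30 minutes.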