I'm fairly new to writing PHP and I've hit a snag while developing a Google Drive connection. I'm trying to create a small widget for a WordPress site wherein users on the site can upload and download files from their Google Drives.
These users are from a closed G Suite domain, and I use a domain-wide service account to authenticate and connect to the users' drives.
I am able to connect, retrieve a list of files, and successfully "download" a file's contents. However, this "download" does not actually save a file to the computer; it simply returns the contents.
References:
- Uploading: https://developers.google.com/drive/api/v3/manage-uploads
- Downloading: https://developers.google.com/drive/api/v3/manage-downloads
I need to place a download button on my page that lets a user select a file and download it to their computer without saving it on the server. Currently the call just returns the content of the file, which I can store in a variable, but I do not know how to have the client request it as a download.
I need an upload button as well, but I am working on a solution that I believe will work. I've read many articles suggesting Ajax, Node, etc.
Ajax:
Calling a PHP function from an HTML form in the same file?
Upload UI:
https://www.htmlgoodies.com/beyond/cms/create-a-file-uploader-in-wordpress.html
In my client initialization file:
$client = new Google_Client();
$client->useApplicationDefaultCredentials();    // service account key from GOOGLE_APPLICATION_CREDENTIALS
$client->addScope(Google_Service_Drive::DRIVE);
$client->setSubject($user_to_impersonate);      // domain-wide delegation: act on behalf of this user
$driveService = new Google_Service_Drive($client);
return $driveService;
In the php script that loads the widget on the page:
$drive_connection = getUserGoogleDriveService();
$retrieved_files = array();
$filesList = $drive_connection->files->listFiles();
foreach ($filesList->files as $file) {
    array_push($retrieved_files, array(
        'filename' => $file->name,
        'fileid'   => $file->id,
        'file'     => $file
    ));
}
$get_specific_file = array_search('example_file_name', array_column($retrieved_files, 'filename'));
Edit: I added my code thus far. This code successfully lets me store and retrieve files from my array for further use. How would I tie this into a client-side UI to upload files to the Google Drive "create" function?
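A minimal sketch of the download side, assuming a small endpoint (here called download.php, taking a fileid query parameter) that reuses getUserGoogleDriveService(); streaming the contents straight to the browser with attachment headers avoids saving anything on the server:
$fileId = $_GET['fileid'];
$driveService = getUserGoogleDriveService();
// fetch the filename, then the raw contents ('alt' => 'media')
$meta = $driveService->files->get($fileId, array('fields' => 'name'));
$response = $driveService->files->get($fileId, array('alt' => 'media'));
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $meta->getName() . '"');
echo $response->getBody(); // nothing is written to the server's disk
exit;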
Related
I am using laravel-google-cloud-storage to store images and retrieve them one by one. Is it possible to get all the folders and images from Google Cloud Storage? If so, how do I get this done?
I tried using flysystem-google-cloud-storage to retrieve them, but its examples are similar to the first link I provided.
What I want to achieve is to browse Google Cloud Storage, with all its folders and images, and select an image for my form instead of picking one from my local machine.
UPDATE:
This is what I have tried so far, based on this documentation.
$storageClient = new StorageClient([
'projectId' => 'project-id',
'keyFilePath' => 'myKeyFile.json',
]);
$bucket = $storageClient->bucket('my-bucket');
$buckets = $storageClient->buckets();
Then I tried adding a foreach, which returns nothing, even though I have 6 folders in my bucket.
foreach ($buckets as $bucket) {
    dd($bucket->name()); // dumps the name of the first bucket found, then dies
}
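For reference, buckets() iterates the project's buckets, not the contents of one bucket; the "folders" inside a bucket are just object name prefixes and can be listed off the bucket itself. A sketch, with placeholder names:
$bucket = $storageClient->bucket('my-bucket');
// list every object; "folders" appear as prefixes in the object names
foreach ($bucket->objects() as $object) {
    echo $object->name() . PHP_EOL;
}
// or restrict the listing to one "folder"
foreach ($bucket->objects(['prefix' => 'some-folder/']) as $object) {
    echo $object->name() . PHP_EOL;
}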
It's been a week and my post has not been answered, so I'll just post and share what I did since last week.
I am using Laravel 5.4 at the moment.
So I installed laravel-google-cloud-storage and flysystem-google-cloud-storage in my application.
I created a separate controller, since I am retrieving the images from Google Cloud Storage via Ajax.
All you need to do is get your Google Cloud Storage credentials, which can be found in your Google Cloud dashboard: look for the APIs section, click "Go to APIs overview", then Credentials. Download the credentials, which come as a JSON file, and put it in your project root or wherever you want (I still don't know where this file should properly live). Next, get your Google Cloud Storage project ID, which is also shown on the dashboard.
Then this is the setup in my controller that connects my Laravel application to Google Cloud Storage, through which I am able to upload, retrieve, delete, and copy files.
use Google\Cloud\Storage\StorageClient;
use League\Flysystem\Filesystem;
use League\Flysystem\Plugin\GetWithMetadata;
use Superbalist\Flysystem\GoogleStorage\GoogleStorageAdapter;
class GoogleStorageController extends Controller
{
    public function index() // whatever method serves the Ajax call
    {
        $storageClient = new StorageClient([
            'projectId' => 'YOUR-PROJECT-ID',
            'keyFilePath' => '/path/of/your/keyfile.json',
        ]);

        // name of your bucket
        $bucket = $storageClient->bucket('your-bucket-name');
        $adapter = new GoogleStorageAdapter($storageClient, $bucket);
        $filesystem = new Filesystem($adapter);

        // this line will retrieve all your folders and images
        $contents = $filesystem->listContents();

        // you can get a specific directory and the images inside
        // by passing the directory name as a parameter instead:
        // $contents = $filesystem->listContents('directory-name');

        return response()->json([
            'contents' => $contents
        ]);
    }
}
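As a follow-up, the array that listContents() returns can be filtered before it is sent to the Ajax caller, for example to keep only files and skip the directory entries (the 'type' key is part of Flysystem's listing format):
// keep only files; directories come through as 'type' => 'dir'
$images = array_values(array_filter($contents, function ($item) {
    return $item['type'] === 'file';
}));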
I was wondering if anyone can help with the following issue.
I have an API with Node/Express for image uploads.
server1
var upload = multer({ dest: 'uploads/'})
app.post('/api/upload', upload.single('file'), function(req, res, next) {
if (req.file && req.file.path) {
//used to have imgur upload
//imgur.uploadFile(req.file.path)
// .then(function (json) {
//});
}
});
server2
I have a PHP API for image uploads as well, which I have set up for self-hosting using pictshare. I want to redirect file uploads to the new API by forwarding uploads to the PHP API server.
I have tried multer, multiparty, needle, request and various other methods, but somehow couldn't figure it out.
Is there a way to direct multer to a new destination?
The file is being saved in the uploads/ folder; maybe it would be better to upload/forward that file to the new server and return the new URL from server2?
I've looked around for ways to pipe uploaded images through, without much luck.
Note: the server1 API is being used by a mobile app, so I wouldn't want to release an app update if this can be handled server-side.
To upload a file (image or whatever) you need to have it on your drive first (or in memory, but I will tackle the drive case first). Then you can read it and send it.
APIs that receive files usually expect forms; typically the attachment comes in as the file field of a multipart form.
Here is a working example that uploads an existing file from my drive to an external API using the 'superagent' library for Node.
const agent = require('superagent');

// .attach() accepts a file path (or a Buffer / fs.ReadStream) and turns
// the request into multipart/form-data with a 'file' field; .then() is
// what actually fires the request
agent.post('urlOfApi/uploadFileEndpoint')
    .attach('file', 'path/to/downloaded/file')
    .then(res => console.log(res.status))
    .catch(err => console.error(err));
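Applied to the server1 handler above, that could look roughly like this; the server2 endpoint URL and its response shape ({ url: ... }) are assumptions:
const fs = require('fs');
const agent = require('superagent');

app.post('/api/upload', upload.single('file'), function (req, res) {
    if (!req.file || !req.file.path) {
        return res.status(400).json({ error: 'no file received' });
    }
    agent.post('https://server2.example.com/api/upload') // assumed PHP API endpoint
        .attach('file', req.file.path)
        .then(function (response) {
            fs.unlink(req.file.path, function () {});    // drop the local temp copy
            res.json({ url: response.body.url });        // assumes server2 returns { url: ... }
        })
        .catch(function () {
            res.status(502).json({ error: 'upload to server2 failed' });
        });
});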
I'm using the AWS PHP SDK to save images on S3. Files are saved privately. I then show the image thumbnails in my web application using the S3 file URL, but since the files are private, the images show up as broken.
When the user clicks on the name of a file, a modal opens to show the file at a larger size, but the file is displayed as broken there as well, due to the same issue.
Now, I know there are two ways to make this work: 1. Make the files public. 2. Generate pre-signed URLs for the files. But I cannot go with either option due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it, and serves it. You can then restrict access however you like without making the images public.
Example pass-through script:
$headers = get_headers($realpath); // $realpath being wherever the file really lives
foreach ($headers as $header) {
    header($header);
}

// These lines if it's a download you want to force
// $filename = basename($realpath);
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");

$file = fopen($realpath, 'r');
fpassthru($file);
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
You will need to access the files through a script on your server. That script will do some kind of authentication to make sure the request is valid and that you want the requester to see the file. It then fetches the file from S3 using a valid IAM profile that can access the private files, and outputs the file.
Instead of requesting the file from S3 request it from
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is some pseudocode in fetchimages.php
<?php
// if authorized to get this image
$key = $_GET['key'];
// validate that the key is in the proper format
// get the S3 URL from a database based on $key
// connect to S3 securely and read the file from S3
// output the file
?>
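A concrete sketch of that pseudocode with the AWS SDK for PHP; the bucket name, region, and the key lookup are placeholders:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// ...authenticate the request, then map the key to an S3 object key...
$objectKey = lookup_object_key($_GET['key']); // hypothetical lookup helper

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumed region
    // credentials come from the environment or an IAM instance profile
]);

$result = $s3->getObject([
    'Bucket' => 'my-private-bucket', // assumed bucket name
    'Key'    => $objectKey,
]);

header('Content-Type: ' . $result['ContentType']);
echo $result['Body'];
?>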
As far as I know, you could try to make your S3 bucket a "web server" like this, but then you would probably have to "make the files public". If you have some kind of logic to restrict access, you could create a bucket policy instead.
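For example, a bucket policy can allow reads only when the request carries your site's Referer header; this is a weak restriction (headers can be forged), and the bucket name and domain below are placeholders:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*",
      "Condition": {
        "StringLike": { "aws:Referer": "https://www.yourdomain.com/*" }
      }
    }
  ]
}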
I am using the Google API Client to upload and convert a file to that user's Google Drive. How can I automatically share that file with one specific user?
$file = new Google_Service_Drive_DriveFile();
$file->title = $title;
$chunkSizeBytes = 1 * 1024 * 1024;
// Call the API with the media upload, defer so it doesn't immediately return.
$client->setDefer(true);
$request = $service->files->insert($file, array(
    'convert' => true,
));
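For context, the deferred request above is normally handed to a chunked media upload before there is a file ID to share; a sketch along the lines of the client library's resumable-upload example ($filePath and the MIME type are placeholders):
$media = new Google_Http_MediaFileUpload(
    $client,
    $request,
    'text/plain',     // assumed MIME type of the file being converted
    null,
    true,             // resumable
    $chunkSizeBytes
);
$media->setFileSize(filesize($filePath));

// upload the file chunk by chunk; nextChunk() returns the created
// file once the final chunk is accepted
$status = false;
$handle = fopen($filePath, 'rb');
while (!$status && !feof($handle)) {
    $chunk = fread($handle, $chunkSizeBytes);
    $status = $media->nextChunk($chunk);
}
fclose($handle);
$client->setDefer(false);
$createdFile = $status; // its getId() supplies the file ID for sharing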
The web app is an HTML form that creates a document based on user data. This document later needs to be edited by me, saved in my Google Drive, and deleted from the user's Google Drive.
What is the best/easiest way to achieve this?
To share a file with the Drive API you use Permissions.insert() with the type "user", role of either "reader", "writer", or "commenter", and a value which is their email address. More information about sharing files is available in this developer guide.
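With the PHP client and the v2-style files->insert() call from the question, that looks roughly like this; $fileId and the email address are placeholders:
$permission = new Google_Service_Drive_Permission();
$permission->setType('user');
$permission->setRole('writer');               // or 'reader' / 'commenter'
$permission->setValue('someone@example.com'); // the account to share with
$service->permissions->insert($fileId, $permission);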
I'm writing a web app that at one point allows a user to upload a photo to a Flickr account (mine). I want to do this without saving the intermediate image on the server my web app is on.
What I've got so far is a page that implements phpFlickr and accepts a POST from a simple HTML form. I use $_FILES['file']['tmp_name'] as the path for phpFlickr to use. Here's the code:
<?php
require_once("phpFlickr.php");
session_start(); // needed for the phpFlickr auth redirect below
$f = new phpFlickr("apikey", "secret", true);
$_SESSION['phpFlickr_auth_redirect'] = "post_upload.php";
$myPerms = $f->auth("write");
$token = $f->auth_checkToken();
$phid = $f->sync_upload($_FILES['file']['tmp_name']);
echo "Uploading Photo..." . $phid;
?>
I'm guessing that the tmp file is being lost because of the redirect that happens when $f->auth("write") is called, but I don't know. Is there a way to preserve it? Is there any way to do this without saving the file to the server?
Answer: there is no way to directly upload a file to Flickr without saving it as an intermediate file.
I've moved on to using move_uploaded_file() followed by a Flickr API call, and it's working perfectly.
I've also managed to get it to play nicely with the excellent jQuery Uploadify, which lets me send multiple files to it in one go.
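For anyone after the same thing, a minimal sketch of that approach; the target path and form field name are assumptions:
<?php
require_once("phpFlickr.php");
session_start();

$f = new phpFlickr("apikey", "secret", true);
$f->auth("write"); // token assumed to be cached from an earlier auth round-trip

// move the upload out of PHP's tmp dir so the redirect can't discard it
$target = "/tmp/uploads/" . basename($_FILES['file']['name']);
if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
    $phid = $f->sync_upload($target);
    unlink($target); // drop the intermediate copy once Flickr has it
    echo "Uploaded photo " . $phid;
}
?>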