I am trying to fetch/retrieve files stored on AWS Glacier using PHP, but I am not able to find any method to do so.
All I want is to fetch/retrieve a file from AWS Glacier using PHP. If anyone has an idea about how to do this, please point me in the right direction.
Thanks.
As per the example on GitHub, you can retrieve a file using the following:
// Use the us-west-2 region and latest version of each client.
$sharedConfig = [
'region' => 'us-west-2',
'version' => 'latest'
];
// Create an SDK class used to share configuration across clients.
$sdk = new Aws\Sdk($sharedConfig);
// Create an Amazon Glacier client using the shared configuration data.
$client = $sdk->createGlacier();
//Download our archive from Amazon to our server
$result = $client->getJobOutput(array(
'vaultName' => '<YOUR VAULT>', //The name of the vault
'jobId' => 'XXXX' //the unique ID of the archive-retrieval job (see the sketch after this block)
));
$data = $result->get('body'); //Sets the file data to a variable
$description = $result->get('archiveDescription'); //Sets file description to a variable
//deletes the temp file on our server if it exists
if(file_exists("files/temp")){
unlink("files/temp");
}
$filepath = "files/temp";
$fp = fopen($filepath, "w"); //creates a new temp file on our web server
fwrite($fp, $data); //writes the data in our variable to the temp file
fclose($fp); //close the file handle
//Your archive is now ready for download on your web server
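Note that Glacier retrieval is asynchronous: getJobOutput needs the ID of a completed archive-retrieval job, and such jobs typically take several hours to finish. As a rough sketch (assuming the same $client as above and a placeholder $archiveId that you saved when the archive was uploaded), initiating the job might look like this:
// A minimal sketch of starting an archive-retrieval job; $archiveId is a placeholder
// for the ID returned when the archive was originally uploaded.
$job = $client->initiateJob(array(
    'vaultName'     => '<YOUR VAULT>',
    'jobParameters' => array(
        'Type'      => 'archive-retrieval',
        'ArchiveId' => $archiveId,
    ),
));
$jobId = $job->get('jobId'); // pass this as 'jobId' to getJobOutput once the job has completed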
You can review the PHP Glacier reference documentation for more details.
I am trying to convert a local Excel .xlsx file to a Google Sheet, keeping all of the design, formatting and formulas in my local Excel file. How can I do that using the Google API with PHP?
What I was doing, which is not working, was:
$client = new \Google_Client();
$client->setApplicationName('Name');
$client->setScopes([\Google_Service_Drive::DRIVE]);
$client->setAccessType('offline');
$client->setAuthConfig($_SERVER['DOCUMENT_ROOT'] . '/credentials.json');
$service = new Google_Service_Drive($client);
$fileID = '';
$path = $_SERVER['DOCUMENT_ROOT'] . '/includes/files/';
$fileName = 'MAIN.xlsx';//this is the file I want to convert to Google sheet
$filePathName = $path.$fileName;
$mimeType = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet';
$file->setMimeType($mimeType);
$createdFile = $service->files->copy($file, array(
'data' => $filePathName,
'mimeType' => $mimeType,
'convert' => true,
));
But that is not working. How should I correct it?
I believe your goal is as follows.
You want to upload an XLSX file from your local PC to Google Drive.
When the XLSX file is uploaded, you want to convert it to a Google Spreadsheet.
You want to achieve this using the googleapis client library for PHP.
Modification points:
In your script, the "Files: copy" method is used. This method copies a file that is already on Google Drive, so it cannot be used to achieve your goal. I think this is the reason for your issue.
From your error message, it looks like you are using Drive API v3. This might also be contributing to your issue.
Taking the above points into account, when your script is modified it becomes as follows.
Modified script:
$service = new Google_Service_Drive($client); // Please use your $client here.
$path = $_SERVER['DOCUMENT_ROOT'] . '/includes/files/';
$fileName = 'MAIN.xlsx';//this is the file I want to convert to Google sheet
$filePathName = $path.$fileName;
$file = new Google_Service_Drive_DriveFile();
$file->setName('MAIN.xlsx');
$file->setMimeType('application/vnd.google-apps.spreadsheet');
$data = file_get_contents($filePathName);
$createdFile = $service->files->create($file, array(
'data' => $data,
'uploadType' => 'multipart'
));
printf("%s\n", $createdFile->getId());
Note:
In this modified script, it is assumed that you have already been able to get and put values for Google Drive using the Drive API. Please be careful about this.
With this upload method, the maximum file size is 5 MB. Please be careful about this. When you want to upload a larger file, please use the resumable upload (Ref); a rough sketch follows below.
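As a rough sketch (not an official sample) of what a resumable upload might look like with the PHP client, assuming the same $client, $service, $file and $filePathName as in the modified script above:
// Defer the request so the file can be streamed in chunks.
$client->setDefer(true);
$request = $service->files->create($file);

$chunkSizeBytes = 1 * 1024 * 1024; // 1 MB chunks
$media = new Google_Http_MediaFileUpload(
    $client,
    $request,
    'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
    null,
    true, // resumable
    $chunkSizeBytes
);
$media->setFileSize(filesize($filePathName));

// Upload the file chunk by chunk until the upload completes.
$status = false;
$handle = fopen($filePathName, 'rb');
while (!$status && !feof($handle)) {
    $chunk = fread($handle, $chunkSizeBytes);
    $status = $media->nextChunk($chunk);
}
fclose($handle);
$client->setDefer(false);

if ($status) {
    printf("%s\n", $status->getId());
}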
References:
Upload file data
Create files
I am using laravel-google-cloud-storage to store images and retrieve them one by one. Is it possible to get all the folders and images from Google Cloud Storage? If so, how do I get this done?
I was trying to use flysystem-google-cloud-storage to retrieve them, but it is similar to the first link I have provided.
What I want to achieve is to select an image from Google Cloud Storage, with all the folders and images in it, and put it in my form instead of selecting an image from my local machine.
UPDATE:
This is what I have tried so far, based on this documentation.
$storageClient = new StorageClient([
'projectId' => 'project-id',
'keyFilePath' => 'myKeyFile.json',
]);
$bucket = $storageClient->bucket('my-bucket');
$buckets = $storageClient->buckets();
Then I tried adding a foreach, which returns empty, even though I have 6 folders in my bucket.
foreach ($buckets as $bucket) {
dd($bucket->name());
}
It's been a week and my post has not been answered, so I'll just post and share what I did since last week in case it helps anyone.
I am using Laravel 5.4 at the moment.
So I installed laravel-google-cloud-storage and flysystem-google-cloud-storage in my application.
I created a separate controller, since I am retrieving the images from Google Cloud Storage via Ajax.
All you need to do is get your Google Cloud Storage credentials, which can be found in your Google Cloud Storage Dashboard: look for the APIs card, then click the link labelled "Go to APIs overview" > Credentials. Just download the credentials, which come as a JSON file, and put the file in your project root or anywhere you want (I still don't know where I should properly keep this file). Next, get your Google Cloud Storage project ID, which can also be found in the Dashboard.
This is the setup in my controller that connects my Laravel application to Google Cloud Storage, and with which I am able to upload, retrieve, delete and copy files.
use Google\Cloud\Storage\StorageClient;
use League\Flysystem\Filesystem;
use League\Flysystem\Plugin\GetWithMetadata;
use Superbalist\Flysystem\GoogleStorage\GoogleStorageAdapter;
class GoogleStorageController extends Controller
{
// in my method (shown here wrapped in an example action named listImages)
public function listImages()
{
$storageClient = new StorageClient([
'projectId' => 'YOUR-PROJECT-ID',
'keyFilePath' => '/path/of/your/keyfile.json',
]);
// name of your bucket
$bucket = $storageClient->bucket('your-bucket-name');
$adapter = new GoogleStorageAdapter($storageClient, $bucket);
$filesystem = new Filesystem($adapter);
// this line here will retrieve all your folders and images
$contents = $filesystem->listContents();
// you can get the specific directory and the images inside
// by adding a parameter
$contents = $filesystem->listContents('directory-name');
return response()->json([
'contents' => $contents
]);
}
}
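For completeness, the same $filesystem object is what I use for the upload, delete and copy operations I mentioned; roughly, those calls look like this (the paths and $localPath are just examples):
// Upload a file (for large files, writeStream() with a resource is better)
$filesystem->write('directory-name/new-image.jpg', file_get_contents($localPath));
// Copy a file within the bucket
$filesystem->copy('directory-name/new-image.jpg', 'backup/new-image.jpg');
// Delete a file
$filesystem->delete('directory-name/new-image.jpg');
// Read a single file back
$contents = $filesystem->read('directory-name/new-image.jpg');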
I started working on uploading a file to Akamai NetStorage using PHP and referred to a few APIs on GitHub. I couldn't upload a video file, though I can create files and write contents to them.
<?php
require 'Akamai.php';
$service = new Akamai_Netstorage_Service('******.akamaihd.net');
$service->authorize('key','keyname','version');
$service->upload('/dir-name/test/test.txt','sample text');
?>
I referred to this API. I also referred to a few others, but couldn't find the right way to upload a video/image file. The code I wrote above is working perfectly; now I need to upload a video file instead of writing contents to a text file.
There is a more modern library for Akamai's NetStorage, akamai-open/netstorage, which is built as an adapter for Flysystem.
Once you have it installed, you need to set up the authentication and the HTTP client (based on Guzzle):
$signer = new \Akamai\NetStorage\Authentication();
$signer->setKey($key, $keyName);
$handler = new \Akamai\NetStorage\Handler\Authentication();
$handler->setSigner($signer);
$stack = \GuzzleHttp\HandlerStack::create();
$stack->push($handler, 'netstorage-handler');
$client = new \Akamai\Open\EdgeGrid\Client([
'base_uri' => $host,
'handler' => $stack
]);
$adapter = new \Akamai\NetStorage\FileStoreAdapter($client, $cpCode);
And then you can create the filesystem object, and upload the file:
$fs = new \League\Flysystem\Filesystem($adapter);
// Upload a file:
// cpCode, action, content signature, and request signature are added transparently
// Additionally, all required sub-directories are created transparently
$fs->write('/path/to/write/file/to', $fileContents);
However, because you're uploading a video file, I would suggest you use a stream rather than reading the contents into memory. To do this, use writeStream() instead:
$fs = new \League\Flysystem\Filesystem($adapter);
$stream = fopen('/path/to/local/file', 'r+');
$fs->writeStream('/path/to/write/file/to', $stream);
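Depending on the Flysystem version, the adapter may or may not close the stream for you, so it's worth closing the handle if it's still open; you can also confirm the upload with has():
// Close the local stream if Flysystem hasn't already done so.
if (is_resource($stream)) {
    fclose($stream);
}
// Optionally confirm the file now exists in NetStorage.
if ($fs->has('/path/to/write/file/to')) {
    echo "Upload complete\n";
}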
I'm trying to upload an image to a bucket in Google Cloud Storage via PHP (my server code handles the upload to Google)
The file is uploaded OK, but the content type shown in the browser is application/octet-stream.
Where is my mistake?
$data = file_get_contents('<local file path>.jpg');
$gso = new Google_Service_Storage_StorageObject();
$gso->setName("test8.jpg");
$gso->setContentType('image/jpeg');
$postbody = array('uploadType' => 'multipart', 'data' => $data);
$ret = $service->objects->insert($user_bucket, $gso, $postbody);
I'm trying to upload files to my bucket using a piece of code like this:
$s3 = new AmazonS3();
$bucket = 'host.domain.ext'; // My bucket name matches my host's CNAME
// Open a file resource
$file_resource = fopen('picture.jpg', 'r');
// Upload the file
$response = $s3->create_object($bucket, 'picture.jpg', array(
'fileUpload' => $file_resource,
'acl' => AmazonS3::ACL_PUBLIC,
'headers' => array(
'Cache-Control' => 'public, max-age=86400',
),
));
But I get a "NoSuchBucket" error. The weird thing is that when I query my S3 account to retrieve the list of buckets, I get the exact same name I'm using for uploading: host.domain.ext.
I tried creating a different bucket with no dots in the name and it works perfectly... so yes, my problem is the bucket name, but I need to keep the FQDN convention in order to map it as a static file server on the Internet. Does anyone know whether there is any escaping I can apply to the bucket name before sending it to the API so the dots stop breaking the request? I've already tried regular expressions and got the same result.
I'd try using path-style URLs, as suggested in the comments of a related AWS forum thread:
$s3 = new AmazonS3();
$s3->path_style = true;
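With path_style enabled, the create_object call from the question should then work unchanged against the dotted bucket name; a sketch reusing the same code:
$s3 = new AmazonS3();
$s3->path_style = true; // request URLs become s3.amazonaws.com/host.domain.ext/... instead of host.domain.ext.s3.amazonaws.com/...

$file_resource = fopen('picture.jpg', 'r');
$response = $s3->create_object('host.domain.ext', 'picture.jpg', array(
    'fileUpload' => $file_resource,
    'acl' => AmazonS3::ACL_PUBLIC,
    'headers' => array(
        'Cache-Control' => 'public, max-age=86400',
    ),
));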