Has anyone worked with the Google Drive API who can tell me how to overwrite a file?
I currently use this function, but even when the names are the same, it creates another file.
function uploadFile($fileName, $folderId, $file) {
    $service = getGoogleDriveService();
    $fileMetadata = new Google_Service_Drive_DriveFile(array(
        'name' => $fileName,
        'parents' => array($folderId),
    ));
    try {
        $newFile = $service->files->create(
            $fileMetadata,
            array(
                'data' => file_get_contents($file),
                'mimeType' => mime_content_type($file),
                'uploadType' => 'multipart',
            )
        );
        return $newFile->id;
    } catch (Exception $e) {
        return 'An error occurred: ' . $e->getMessage();
    }
}
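If you want to overwrite rather than create a duplicate, the Drive API will not de-duplicate by name on its own; one approach is to search the target folder for a file with the same name and, if one exists, call files->update on its ID, falling back to files->create otherwise. Below is a minimal sketch along those lines, reusing the getGoogleDriveService() helper from the question; the query string (and its lack of quote escaping) is an assumption, so adapt it to your file names.

function uploadOrOverwriteFile($fileName, $folderId, $file) {
    $service = getGoogleDriveService();
    // Look for an existing file with the same name inside the target folder.
    $existing = $service->files->listFiles(array(
        'q' => sprintf("name = '%s' and '%s' in parents and trashed = false", $fileName, $folderId),
        'fields' => 'files(id)',
        'pageSize' => 1,
    ));
    $media = array(
        'data' => file_get_contents($file),
        'mimeType' => mime_content_type($file),
        'uploadType' => 'multipart',
    );
    if (count($existing->getFiles()) > 0) {
        // Overwrite the existing file's content. Do not set 'parents' in the metadata
        // of an update call; parents are changed via addParents/removeParents instead.
        $fileId = $existing->getFiles()[0]->getId();
        $updated = $service->files->update($fileId, new Google_Service_Drive_DriveFile(), $media);
        return $updated->id;
    }
    $metadata = new Google_Service_Drive_DriveFile(array(
        'name' => $fileName,
        'parents' => array($folderId),
    ));
    return $service->files->create($metadata, $media)->id;
}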
This code uploads files to my regular Google Drive, but I need to upload them into a shared drive. Can anyone help? All I need is to upload my files to my shared drive. Below is my current code:
require __DIR__ . '/vendor/autoload.php';

use Google\Client;
use Google\Service\Drive;

# TODO - PHP client currently chokes on fetching start page token
function uploadBasic()
{
    try {
        $client = new Client();
        putenv('GOOGLE_APPLICATION_CREDENTIALS=./credentials.json');
        $client->useApplicationDefaultCredentials();
        $client->addScope(Drive::DRIVE);
        $driveService = new Drive($client);
        $file = getcwd() . '/637fdc0994855.mp4';
        $filename = basename($file);
        $mimeType = mime_content_type($file);
        $fileMetadata = new Drive\DriveFile(array(
            'name' => $filename,
            'parents' => ['1VBi8C04HBonM6L4CfL-jHWZ0QoQRyrCL']
        ));
        $content = file_get_contents($file);
        $file = $driveService->files->create($fileMetadata, array(
            'data' => $content,
            'mimeType' => $mimeType,
            'uploadType' => 'multipart',
            'fields' => 'id'
        ));
        printf("File ID: %s\n", $file->id);
        return $file->id;
    } catch (Exception $e) {
        echo "Error Message: " . $e;
    }
}

uploadBasic();
In order to upload the file to the shared drive, please modify as follows.
From:
$file = $driveService->files->create($fileMetadata, array(
    'data' => $content,
    'mimeType' => $mimeType,
    'uploadType' => 'multipart',
    'fields' => 'id'
));
To:
$file = $driveService->files->create($fileMetadata, array(
    'data' => $content,
    'mimeType' => $mimeType,
    'uploadType' => 'multipart',
    'fields' => 'id',
    'supportsAllDrives' => true // <--- Added
));
Note:
If your client does not have write permission on the shared drive, an error occurs. Please be careful about this.
Reference:
Files: create
I am looking in the docs and the Oracle SDKs to see whether there is anything we can use to upload to Oracle storage, but I didn't find any PHP SDK from Oracle. Am I missing something?
I have researched a lot; please help me. I want to use a PHP SDK to upload files and folders to Oracle Cloud and serve those file URLs to my application.
For anyone looking for a solution to the same problem: I figured it out and am posting the answer here.
After looking at many online references, I learned that Oracle Object Storage is compatible with the Amazon S3 SDK. So all you need is to use the AWS SDK and get an access key and secret from Oracle, and you are done. Posting some code:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\Exception\S3Exception;

define('ORACLE_ACCESS_KEY', '***************************************');
define('ORACLE_SECRET_KEY', '***************************************');
define('ORACLE_REGION', '***************************************');
define('ORACLE_NAMESPACE', '***************************************');

function get_oracle_client($endpoint)
{
    $endpoint = "https://".ORACLE_NAMESPACE.".compat.objectstorage.".ORACLE_REGION.".oraclecloud.com/{$endpoint}";
    return new Aws\S3\S3Client(array(
        'credentials' => [
            'key' => ORACLE_ACCESS_KEY,
            'secret' => ORACLE_SECRET_KEY,
        ],
        'version' => 'latest',
        'region' => ORACLE_REGION,
        'bucket_endpoint' => true,
        'endpoint' => $endpoint
    ));
}
// Required parameters come before the optional $folder_name.
function upload_file_oracle($bucket_name, $file_name, $folder_name = '')
{
    if (empty(trim($bucket_name))) {
        return array('success' => false, 'message' => 'Please provide a valid bucket name!');
    }
    if (empty(trim($file_name))) {
        return array('success' => false, 'message' => 'Please provide a valid file name!');
    }
    if ($folder_name !== '') {
        $keyname = $folder_name . '/' . $file_name;
        $endpoint = "{$bucket_name}/";
    } else {
        $keyname = $file_name;
        $endpoint = "{$bucket_name}/{$keyname}";
    }
    $s3 = get_oracle_client($endpoint);
    $file_url = "https://objectstorage.".ORACLE_REGION.".oraclecloud.com/n/".ORACLE_NAMESPACE."/b/{$bucket_name}/o/{$keyname}";
    try {
        $s3->putObject(array(
            'Bucket' => $bucket_name,
            'Key' => $keyname,
            'SourceFile' => $file_name,
            'StorageClass' => 'REDUCED_REDUNDANCY'
        ));
        return array('success' => true, 'message' => $file_url);
    } catch (S3Exception $e) {
        return array('success' => false, 'message' => $e->getMessage());
    } catch (Exception $e) {
        return array('success' => false, 'message' => $e->getMessage());
    }
}
function upload_folder_oracle($bucket_name, $folder_name)
{
    if (empty(trim($bucket_name))) {
        return array('success' => false, 'message' => 'Please provide a valid bucket name!');
    }
    if (empty(trim($folder_name))) {
        return array('success' => false, 'message' => 'Please provide a valid folder name!');
    }
    $keyname = $folder_name;
    $endpoint = "{$bucket_name}/{$keyname}";
    $s3 = get_oracle_client($endpoint);
    try {
        // Transfers the contents of the local $keyname directory to the bucket.
        $manager = new \Aws\S3\Transfer($s3, $keyname, 's3://' . $bucket_name . '/' . $keyname);
        $manager->transfer();
        return array('success' => true);
    } catch (S3Exception $e) {
        return array('success' => false, 'message' => $e->getMessage());
    } catch (Exception $e) {
        return array('success' => false, 'message' => $e->getMessage());
    }
}
The above code is working and tested. For more details, please see https://docs.cloud.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm
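For reference, here is a minimal usage sketch of the two helpers above. The bucket, file, and folder names are placeholders, and it assumes the ORACLE_* constants at the top of the script have been filled in with your own values.

// Upload a single file into the "reports" folder of the bucket.
$result = upload_file_oracle('my-bucket', 'report.pdf', 'reports');
if ($result['success']) {
    echo 'Uploaded: ' . $result['message'] . "\n"; // 'message' holds the public object URL
} else {
    echo 'Upload failed: ' . $result['message'] . "\n";
}

// Upload every file inside the local "reports" directory to the bucket.
$result = upload_folder_oracle('my-bucket', 'reports');
echo $result['success'] ? "Folder uploaded\n" : 'Folder upload failed: ' . $result['message'] . "\n";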
I created a new module, "googleapi.module", and from there I'm trying to load the Google API library to create a file in Google Drive, but I receive an error:
[message:protected] => file does not exist
/home/main/public_html/backoffice/sites/all/modules/main/googleapi/library/google-api-php-client-2.2.0/src/Google/Client.php
I think it's something to do with Drupal paths, because I have a simple index.php file that works perfectly and loads the library.
Here is the Drupal code:
function googleapi_permission() {
    return array(
        'access googleapi' => array('title' => t('Access Google API')),
    );
}

function googleapi_menu() {
    $items = array();
    $items['googleapi'] = array(
        'title' => t('Google API'),
        'page callback' => 'googleapi_main',
        'access callback' => 'user_access',
        'access arguments' => array('access googleapi'),
        'type' => MENU_CALLBACK,
    );
    return $items;
}

function googleapi_main() {
    $path = drupal_get_path('module', 'googleapi');
    require_once "./$path/library/google-api-php-client-2.2.0/vendor/autoload.php";
    $config_file = 'Zebraclick_Drive_API-891f9b06f8ae.json';
    $folderId = '0B-XFow04K90UPTRYRFRsZk5HdDA';
    $filename = 'text.' . time();
    $filemime = 'text/plain';
    try {
        $data = 'server test';
        $client = new Google_Client();
        $scopes = ['https://www.googleapis.com/auth/drive', 'https://www.googleapis.com/auth/drive.appdata', 'https://www.googleapis.com/auth/drive.file'];
        $client->setAuthConfig($config_file);
        $client->setScopes($scopes);
        $service = new Google_Service_Drive($client);
        $file = new Google_Service_Drive_DriveFile(array(
            'name' => $filename,
            'parents' => array($folderId),
            'mimeType' => 'application/vnd.google-apps.document'
        ));
        $file->setName($filename);
        $result = $service->files->create($file, array(
            'data' => $data,
            'mimeType' => $filemime,
            'uploadType' => 'multipart'
        ));
        $fileId = $result['id'];
        $publicOriginallink = "https://drive.google.com/open?id=" . $fileId;
        $type = 'anyone';
        $role = 'writer';
        $msg = 'File saved. Please check the folder in Drive.';
    }
    catch (Exception $e) {
        print_r($e);
        $msg = $e->getMessage() . '<br />';
        $msg .= 'Error occurred. Please try again. If this happens again, please contact the developer.';
        return $msg;
    }
    return $msg;
}
Try with:
require_once "$path/library/google-api-php-client-2.2.0/vendor/autoload.php";
or:
require_once __DIR__ . "/library/google-api-php-client-2.2.0/vendor/autoload.php";
If that still doesn't work, check that autoload.php is actually registering the library's namespaces.
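Another option, since drupal_get_path() returns a path relative to the Drupal root, is to build an absolute path from the DRUPAL_ROOT constant (Drupal 7). A minimal sketch, assuming the library location from the question:

// Build an absolute path so the include does not depend on the current working directory.
$path = DRUPAL_ROOT . '/' . drupal_get_path('module', 'googleapi');
require_once $path . '/library/google-api-php-client-2.2.0/vendor/autoload.php';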
I have code that is supposed to upload an 8 GB file to the server. The problem I am running into appears to be memory related, because my server only has 4 GB of RAM. My upload script is:
$s3 = S3Client::factory(array(
    'credentials' => $credentials
));

try {
    // 2. Create a new multipart upload and get the upload ID.
    $response = $s3->createMultipartUpload(array(
        'Bucket' => $bucket,
        'Key' => $object,
        //'Body' => (strlen($body) < 1000 && file_exists($body)) ? Guzzle\Http\EntityBody::factory(fopen($body, 'r+')) : $body,
        'ACL' => $acl,
        'ContentType' => $content_type,
        'curl.options' => array(
            CURLOPT_TIMEOUT => 12000,
        )
    ));
    $uploadId = $response['UploadId'];

    // 3. Upload the file in parts of 10 MB each.
    $file = fopen($body, 'r');
    $parts = array();
    $partNumber = 1;
    while (!feof($file)) {
        $result = $s3->uploadPart(array(
            'Bucket' => $bucket,
            'Key' => $object,
            'UploadId' => $uploadId,
            'PartNumber' => $partNumber,
            'Body' => fread($file, 10 * 1024 * 1024),
        ));
        $parts[] = array(
            'PartNumber' => $partNumber++,
            'ETag' => $result['ETag'],
        );
    }

    // 4. Complete the multipart upload.
    $result = $s3->completeMultipartUpload(array(
        'Bucket' => $bucket,
        'Key' => $object,
        'UploadId' => $uploadId,
        'Parts' => $parts,
    ));
    $url = $result['Location'];
    return true;
} catch (Aws\S3\Exception\S3Exception $e) {
    error_log($e->getMessage() . ' ' . $e->getTraceAsString());
    return false;
} catch (Exception $e) {
    error_log($e->getMessage() . ' ' . $e->getTraceAsString());
    return false;
}
Has anyone come across this problem before? And how do I resolve the memory issues?
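One way to keep memory usage flat is to let the SDK stream the file from disk rather than reading each part into a PHP string. With AWS SDK for PHP v3 (the code above looks like v2, so treating this as an assumption), the Aws\S3\MultipartUploader helper accepts a file path and handles the part bookkeeping itself; a minimal sketch using the same $bucket and $object:

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// Hypothetical client setup; reuse your existing credentials and region.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

try {
    // The uploader streams the file from disk in parts, so the whole 8 GB never sits in RAM.
    $uploader = new MultipartUploader($s3, '/path/to/large-file.bin', [
        'bucket' => $bucket,
        'key'    => $object,
    ]);
    $result = $uploader->upload();
    $url = $result['ObjectURL'];
} catch (MultipartUploadException $e) {
    error_log($e->getMessage());
}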
I've uploaded nearly 25k files (large media files) to an S3 bucket. I used the AWS SDK v2 for PHP (S3Client::putObject) to perform the uploads. Now I need to update the metadata for these files, i.e. change the ContentDisposition to attachment and assign a filename.
Is there a way to do this without re-uploading the files? Please help.
Yes, you can use the copyObject method, where you set the CopySource parameter equal to the Bucket and Key parameters.
Example:
// setup your $s3 connection, and define the bucket and key for your resource.
$s3->copyObject(array(
    'Bucket' => $bucket,
    'CopySource' => "$bucket/$key",
    'Key' => $key,
    'Metadata' => array(
        'ExtraHeader' => 'HEADER VALUE'
    ),
    'MetadataDirective' => 'REPLACE'
));
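For the specific case in the question (forcing a download with a given filename), the copy-in-place would look roughly like the sketch below. The filename and content type are placeholders, and note that with MetadataDirective set to REPLACE you must re-specify any headers you want to keep, such as ContentType.

// Copy the object onto itself, replacing its headers/metadata.
$s3->copyObject(array(
    'Bucket'             => $bucket,
    'CopySource'         => "$bucket/$key",
    'Key'                => $key,
    'ContentType'        => 'video/mp4', // assumed; keep the object's original type
    'ContentDisposition' => 'attachment; filename="my-video.mp4"', // hypothetical filename
    'MetadataDirective'  => 'REPLACE'
));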
Update Cache Control Metadata on S3 Objects
<?php
define('S3_BUCKET', 'bucket-name');
define('S3_ACCESS_KEY', 'your-access-key');
define('S3_SECRET_KEY', 'secret-key');
define('S3_REGION', 'ap-south-1'); // Mumbai

require 'vendors/aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

try {
    $s3 = S3Client::factory(array(
        'version' => 'latest',
        'region' => S3_REGION,
        'credentials' => array(
            'secret' => S3_SECRET_KEY,
            'key' => S3_ACCESS_KEY,
        )
    ));
    $objects = $s3->getIterator('ListObjects', array('Bucket' => S3_BUCKET));
    echo "Keys retrieved!\n";
    foreach ($objects as $object) {
        echo $object['Key'] . "\n";
        // Copy each object onto itself, replacing its cache headers.
        $s3->copyObject(array(
            'Bucket' => S3_BUCKET,
            'CopySource' => S3_BUCKET . '/' . $object['Key'],
            'Key' => $object['Key'],
            'ContentType' => 'image/jpeg',
            'ACL' => 'public-read',
            'StorageClass' => 'REDUCED_REDUNDANCY',
            'CacheControl' => 'max-age=172800',
            'MetadataDirective' => 'REPLACE'
        ));
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
Try this.
To delete an existing object:
$keyname = 'product-file/my-object1.dll';
try {
    $delete = $this->s3->deleteObject([
        'Bucket' => 'belisc',
        'Key' => $keyname
    ]);
    // Note: DeleteMarker is only returned for versioned buckets.
    if ($delete['DeleteMarker']) {
        return true;
    } else {
        return false;
    }
} catch (S3Exception $e) {
    return $e->getAwsErrorMessage();
}
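Since DeleteMarker is only present on versioned buckets, on a non-versioned bucket you could instead confirm the delete with the SDK's waiter, which polls with HEAD requests until the object is gone; a minimal sketch using the same bucket and key:

// Poll until the object no longer exists (throws if the waiter times out).
$this->s3->waitUntil('ObjectNotExists', [
    'Bucket' => 'belisc',
    'Key' => $keyname,
]);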
To check whether the object still exists (returns true if it does):
$keyname = 'product-file/my-object1.dll';
try {
    $this->s3->getObject([
        'Bucket' => 'belisc',
        'Key' => $keyname
    ]);
    return true;
} catch (S3Exception $e) {
    return $e->getAwsErrorMessage();
}
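As a lighter alternative, the SDK's doesObjectExist method issues a HEAD request instead of downloading the object body; a minimal sketch with the same bucket and key:

// HEAD the object instead of fetching its body; returns a boolean.
return $this->s3->doesObjectExist('belisc', 'product-file/my-object1.dll');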
Then you can upload the new one
try {
    return $this->s3->putObject([
        'Bucket' => 'belisc',
        'Key' => 'product-file/MiFlashSetup_eng.rar',
        'SourceFile' => 'c:\MiFlashSetup_eng.rar'
    ]);
} catch (S3Exception $e) {
    die("There was an error uploading the file. " . $e->getAwsErrorMessage());
}