I created a web service to upload files and it works fine, but I would like to compress the uploads with gzip. That part isn't working. How can I upload my files with gzip?
$client = new Google_Client();
$storageService = new Google_Service_Storage($client);
$bucket = $this->bucketName;
$file_name = md5(uniqid(rand(), true)) . '.' . $ext;
$file_content = urldecode($myFile);

try {
    $postbody = array(
        'name' => $file_name,
        'data' => file_get_contents($file_content),
        'uploadType' => "resumable",
        'predefinedAcl' => 'publicRead',
    );
    $options = array('Content-Encoding' => 'gzip');
    $gsso = new Google_Service_Storage_StorageObject();
    $gsso->setName($file_name);
    $gsso->setContentEncoding("gzip");
    $storageService->objects->insert($bucket, $gsso, $postbody, $options);
} catch (Exception $e) {
    $result['error'] = json_decode($e->getMessage());
}
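One likely gap in the snippet: setContentEncoding('gzip') only labels the object, but the request body is never actually compressed. A minimal sketch of compressing the payload first, reusing the variables above; gzencode() is core PHP, and I've switched uploadType to multipart since the body is passed inline (resumable uploads need the client's deferred-request flow):

$raw = file_get_contents($file_content);
$compressed = gzencode($raw, 9); // 9 = best compression

$postbody = array(
    'name' => $file_name,
    'data' => $compressed,           // upload the gzipped bytes
    'uploadType' => 'multipart',     // inline body, so multipart rather than resumable
    'predefinedAcl' => 'publicRead',
);

$gsso = new Google_Service_Storage_StorageObject();
$gsso->setName($file_name);
$gsso->setContentEncoding('gzip');   // the label now matches the stored bytes
$storageService->objects->insert($bucket, $gsso, $postbody);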
This code uploads files to my normal Google Drive, but I need to upload them into a shared drive. Can anyone help? All I need is to upload my files to my shared drive. Below is my current code:
require __DIR__ . '/vendor/autoload.php';

use Google\Client;
use Google\Service\Drive;

# TODO - PHP client currently chokes on fetching start page token
function uploadBasic()
{
    try {
        $client = new Client();
        putenv('GOOGLE_APPLICATION_CREDENTIALS=./credentials.json');
        $client->useApplicationDefaultCredentials();
        $client->addScope(Drive::DRIVE);
        $driveService = new Drive($client);
        $file = getcwd() . '/637fdc0994855.mp4';
        $filename = basename($file);
        $mimeType = mime_content_type($file);
        $fileMetadata = new Drive\DriveFile(array(
            'name' => $filename,
            'parents' => ['1VBi8C04HBonM6L4CfL-jHWZ0QoQRyrCL']
        ));
        $content = file_get_contents('637fdc0994855.mp4');
        $file = $driveService->files->create($fileMetadata, array(
            'data' => $content,
            'mimeType' => $mimeType,
            'uploadType' => 'multipart',
            'fields' => 'id'
        ));
        printf("File ID: %s\n", $file->id);
        return $file->id;
    } catch (Exception $e) {
        echo "Error Message: " . $e;
    }
}

uploadBasic();
In order to upload the file to the shared drive, please modify as follows.
From:
$file = $driveService->files->create($fileMetadata, array(
    'data' => $content,
    'mimeType' => $mimeType,
    'uploadType' => 'multipart',
    'fields' => 'id'
));
To:
$file = $driveService->files->create($fileMetadata, array(
    'data' => $content,
    'mimeType' => $mimeType,
    'uploadType' => 'multipart',
    'fields' => 'id',
    'supportsAllDrives' => true // <--- Added
));
Note:
If your client has no permission to write to the shared drive, an error occurs. Please be careful about this.
Reference:
Files: create
I need to download thousands of images from an S3 bucket. I'm using the PHP SDK for AWS S3, but I can't find a way to free the memory it uses while downloading the images.
use Aws\S3\Exception\S3Exception; // needed for the catch block below

$keyname = 'XXXXXXXXXX';
$bucket = 'XXXXXXXXXX';
$region = 'eu-west-1';
$credentials = new Aws\Credentials\Credentials('XXXXXXXXXX', 'XXXXXXXXXX');
$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region' => $region,
    'credentials' => $credentials,
]);

$idDownloadedImages = array();
do {
    $files = $this->check_new_images();
    if (!empty($files)) {
        foreach ($files as $file) {
            $UrlMedia = basename($file->UrlMedia);
            $pathFile = $pathToPlugin . 'images/' . $UrlMedia;
            $object = $keyname . $UrlMedia;
            try {
                // Get the object, streaming it straight to disk.
                $result = $s3->getObject([
                    'Bucket' => $bucket,
                    'Key' => $object,
                    'SaveAs' => $pathFile,
                ]);
                //gc_collect_cycles();
                //xdebug_debug_zval('result');
                $idDownloadedImages[] = $file->idMedia;
            } catch (S3Exception $e) {
                $errorCode = '(Error Code: ' . $e->getStatusCode() . ') ' . $e->getAwsErrorMessage();
            }
            //unset($result);
            $result['Body']->__destruct();
        }
        $idDownloadedImages = implode(',', $idDownloadedImages);
        $wpdb->query('UPDATE wp_icat_Media
                      SET error=0
                      WHERE idMedia IN (' . $idDownloadedImages . ')');
        $idDownloadedImages = array();
    }
} while ($this->check_new_images());
I have tried forcing the PHP garbage collector every time I download an image, calling unset() on $result and $idDownloadedImages, and even calling $result['Body']->__destruct(), but I still don't know why it keeps the images in memory.
Memory is only freed by forcing gc_collect_cycles() at the end of the script.
From the documentation, Amazon recommends calling gc_collect_cycles on the before_upload event.
Here is the sample code:
use Aws\S3\MultipartUploader;

$uploader = new MultipartUploader($client, $source, [
    'bucket' => 'your-bucket',
    'key' => 'your-key',
    'before_upload' => function (\Aws\Command $command) {
        gc_collect_cycles();
    }
]);
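The question is about downloads rather than uploads, but the same idea can be folded into the download loop; a sketch, assuming the $s3, $files, $bucket, $keyname and $pathFile variables from the question:

foreach ($files as $file) {
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key' => $keyname . basename($file->UrlMedia),
        'SaveAs' => $pathFile, // stream straight to disk instead of buffering in memory
    ]);
    $result['Body']->close(); // release the underlying stream handle
    unset($result);
    gc_collect_cycles();      // reclaim reference cycles the SDK may have created
}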
For example: I can copy only one file, but I need the ability to copy several files at once, because I have problems with the application's speed.
$fileMimeType = $this->service->files->get($googleFileId, array('fields' => 'mimeType'));
$metadata = new \Google_Service_Drive_DriveFile(
    array(
        'name' => uniqid(),
        'uploadType' => 'multipart',
        'mimeType' => isset($fileMimeType->mimeType) ? $fileMimeType->mimeType : false
    )
);
$file = $this->service->files->copy($googleFileId, $metadata, array('fields' => 'id'));
Here is a sample from the google-api-php-client docs:
<?php
$client = new Google_Client();
$client->setApplicationName("Client_Library_Examples");

// Warn if the API key isn't set.
if (!$apiKey = getApiKey()) {
    echo missingApiKeyWarning();
    return;
}
$client->setDeveloperKey($apiKey);

$service = new Google_Service_Books($client);
$client->setUseBatch(true);
$batch = $service->createBatch();

$optParams = array('filter' => 'free-ebooks');
$req1 = $service->volumes->listVolumes('Henry David Thoreau', $optParams);
$batch->add($req1, "thoreau");
$req2 = $service->volumes->listVolumes('George Bernard Shaw', $optParams);
$batch->add($req2, "shaw");

$results = $batch->execute();
In your case, I think it will look like this:
P.S.: Try enabling batch processing when creating your client, as shown above ($client->setUseBatch(true);).
$batch = $this->service->createBatch();

$fileMimeType = $this->service->files->get($googleFileId, array('fields' => 'mimeType'));
$metadata = new \Google_Service_Drive_DriveFile(
    array(
        'name' => uniqid(),
        'uploadType' => 'multipart',
        'mimeType' => isset($fileMimeType->mimeType) ? $fileMimeType->mimeType : false
    )
);
$fileCopy = $this->service->files->copy($googleFileId, $metadata, array('fields' => 'id'));
$batch->add($fileCopy, "copy");

$results = $batch->execute();
Abdelkarim EL AMEL: Thanks, you pointed me toward a solution :)
We can't use this code:
$fileCopy = $this->service->files->copy($googleFileId, $metadata, array('fields' => 'id'));
$batch->add($fileCopy, "copy");
because $batch->add() accepts a Psr\Http\Message\RequestInterface, while $this->service->files->copy() returns a Google_Service_Drive_DriveFile.
But we can create a GuzzleHttp\Psr7\Request ourselves, the same way the copy method builds its request internally.
This code solved my problem:
// Request below is GuzzleHttp\Psr7\Request; add `use GuzzleHttp\Psr7\Request;`
// at the top of the file.
private function copyFileRequest($googleFileId)
{
    // Build the files.copy service URI by hand.
    $url = $this->service->files->createRequestUri('files/{fileId}/copy', [
        'fileId' => [
            'location' => 'path',
            'type' => 'string',
            'value' => $googleFileId,
        ],
    ]);
    $request = new Request('POST', $url, ['content-type' => 'application/json']);
    return $request;
}

public function batchCopy()
{
    $googleFileIdFirst = 'First file';
    $googleFileIdSecond = 'Second file';
    $batch = $this->service->createBatch();
    $batch->add($this->copyFileRequest($googleFileIdFirst));
    $batch->add($this->copyFileRequest($googleFileIdSecond));
    $results = $batch->execute();
    /** @var Response $result */
    foreach ($results as $result) {
        /** @var Stream $body */
        $body = (string) $result->getBody(); // JSON body describing the copied file
    }
}
Alternatively, we can set defer to true on the Google_Client:
$client = new \Google_Client();
$client->useApplicationDefaultCredentials();
$client->setScopes([\Google_Service_Drive::DRIVE]);
// Calls now return a request instead of being executed.
$client->setDefer(true);
$service = new \Google_Service_Drive($client);
All methods on the service then return a Psr\Http\Message\RequestInterface:
$batch = $service->createBatch();
$googleServiceDriveFile = new \Google_Service_Drive_DriveFile(['name' => uniqid()]);
$request = $service->files->copy($googleFileId, $googleServiceDriveFile, ['fields' => 'id']);
$batch->add($request);
$results = $batch->execute();
I'm having some issues when trying to upload an image to AWS S3. It seems to upload the file correctly, but whenever I try to download or preview it, it can't be opened. Currently, this is the upload code I'm using:
<?php
require_once 'classes/amazon.php';
require_once 'includes/aws/aws-autoloader.php';

use Aws\S3\S3Client;

$putdata = file_get_contents("php://input");
$request = json_decode($putdata);

$image_parts = explode(";base64,", $request->image);
$image_type_aux = explode("image/", $image_parts[0]);
$image_type = $image_type_aux[1];
$image_base64 = $image_parts[1];

$dateTime = new DateTime();
$fileName = $dateTime->getTimestamp() . "." . $image_type;

$s3Client = S3Client::factory(array(
    'region' => 'eu-west-1',
    'version' => '2006-03-01',
    'credentials' => array(
        'key' => Amazon::getAccessKey(),
        'secret' => Amazon::getSecretKey(),
    )
));

try {
    $result = $s3Client->putObject(array(
        'Bucket' => Amazon::getBucket(),
        'Key' => 'banners/' . $fileName,
        'Body' => $image_base64,
        'ContentType' => 'image/' . $image_type,
        'ACL' => 'public-read'
    ));
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
So, when I check the console after uploading the image file, it has the expected size, permissions and headers but, as I said, whenever I try to open the file, it fails.
What could be the problem here? Thanks in advance.
The issue here is that you appear to be uploading the base64-encoded version of the image, not its raw bytes. Take $image_base64 and decode it into raw bytes first with base64_decode() (http://php.net/manual/en/function.base64-decode.php). I am sure that if you opened those "images" in a text editor, you would see base64 text rather than binary image data.
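A minimal sketch of that fix, reusing the variable names from the question's code:

// Decode the base64 payload to raw image bytes before uploading.
$result = $s3Client->putObject(array(
    'Bucket' => Amazon::getBucket(),
    'Key' => 'banners/' . $fileName,
    'Body' => base64_decode($image_base64), // raw bytes, not base64 text
    'ContentType' => 'image/' . $image_type,
    'ACL' => 'public-read'
));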
You can also upload "on the fly" by using the $s3Client->upload() function, as in the following example:
<?php
$bucket = 'bucket-name';
$filename = 'image-path.extension';

// Split off the "data:image/...;base64," prefix, then decode.
$parts = explode(",", $base64);
$imageData = base64_decode(end($parts));

$upload = $s3Client->upload($bucket, $filename, $imageData, 'public-read');
$upload->get('ObjectURL'); // URL of the uploaded object
I use Silex PHP 2.0 and found the following code works:
$s3 = $app['aws']->createS3();
$putdata = file_get_contents("php://input");
$data = json_decode($request->getContent(), true);
$data = (object) $data;

$image_parts = explode(";base64,", $data->image);
$image_type_aux = explode("image/", $image_parts[0]);
$image_type = $image_type_aux[1];
$image_base64 = base64_decode($image_parts[1]);

$result = $s3->putObject([
    'ACL' => 'public-read',
    'Body' => $image_base64,
    'Bucket' => 'name-bucket',
    'Key' => 'test_img.jpeg',
    'ContentType' => 'image/' . $image_type,
]);

var_dump($result['ObjectURL']);
I'm trying to upload a picture to my Amazon S3 bucket via the PHP SDK, so I made a little script to do so. However, my script doesn't work, and my exception doesn't send back any error message.
I'm new to AWS; thank you for your help.
Here is the code :
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
// Installing AWS SDK via phar
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory('config.php');

// Upload a file.
try {
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ContentType' => 'text/plain',
        'ACL' => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code, but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me as long as you have the right credentials. Keep in mind that the key name is the name of your file in S3, so if you want the key to have the same name as your file, you have to do something like $keyname = 'image.jpg';. Also, a JPG is generally not a text/plain file type; you can omit the ContentType field, or simply specify image/jpeg.
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
factory() expects the configuration array itself, not a file name; include evaluates config.php and passes along the array that file returns.
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key' => $s3Key,
        'secret' => $s3Secret,
    ],
    'region' => 'us-west-2',
    'version' => "2006-03-01"
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $s3Bucket,
        'Key' => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain this is by showing the curl command and how to build it in PHP - the pragmatic approach.
Please don't stone me for the ugly code; I just thought this example is easy to follow for uploading to Azure from PHP or another language.
// SAS-signed container URL, split into prefix and query string.
$azure1 = 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 = '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';

// Get the file size by parsing the `ls -la` output.
$shellCmd = 'ls -la ' . $outFileName;
$lsOutput = shell_exec($shellCmd);
#print_r($lsOutput);
$exploded = explode(' ', $lsOutput);
#print_r($exploded);
$fileLength = $exploded[7];

// Build and run the PUT request.
$curlAzure1 = "curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading = $curlAzure1 . "'" . $azure1 . $outFileName . $azure3 . "'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl command:
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
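If parsing ls output feels fragile, PHP's built-in filesize() returns the same number; a sketch under that assumption, keeping the rest of the approach unchanged:

// Same upload without shelling out for the size; filesize() is core PHP.
$fileLength = filesize($outFileName);
$cmd = sprintf(
    "curl -v -X PUT -T %s -H 'Content-Length: %d' %s",
    escapeshellarg($outFileName),
    $fileLength,
    escapeshellarg($azure1 . $outFileName . $azure3)
);
shell_exec($cmd);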
Below is the code to upload an image/file to an Amazon S3 bucket.
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;

    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/

    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }

    // Amazon profile_template template js upload URL
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;
    // Caching profile_template template js upload URL
    //$source_profile_template_js_url = dirname(dirname(dirname(__FILE__))).$source_path."/".$file_name;
    // file name
    $template_js_file = $file_name;

    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");
    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }

    return $fileup_flag;
}