I am trying to read the data from a txt file in my Amazon AWS S3 bucket, but the body key in the response array is shown as NULL. My code:
function s3_file_get_contents($path, $private = TRUE, $bucket = '') {
    require_once(CODE_BASE_DIR . '/ds_engine/docSuggest/external/aws-sdk-3/aws-autoloader.php');
    try {
        $s3Client = new Aws\S3\S3Client(array(
            'region' => S3_ENDPOINT_REGION,
            'version' => S3_ENDPOINT_VERSION,
            'credentials' => array(
                'key' => S3_SUGGESTADOC_API_KEY,
                'secret' => S3_SUGGESTADOC_API_SECRET,
            ),
        ));
        $result = $s3Client->getObject(array(
            'Bucket' => $private ? S3_BUCKET_DOCSUGGEST : S3_BUCKET_SUGGESTADOC,
            'Key' => $path,
        ));
    } catch (Exception $e) {
        $error = $e->getMessage();
        log_message('ERROR', '['.__FUNCTION__.'] Exception: '.$error);
    }

    die(print_array($result['body']));
    return $error ? $error : $result['body'];
}
The file contains some text, but nothing is displayed in the console. Rest assured, I have set up the connection properly and there are no issues with that. I am able to download the file, just not read from it.
P.S. - The response metadata has an object URL, and using that the file can be downloaded. So I guess I am hitting the correct path, but still no success.
The data is in $result['Body'], not in $result['body'].
Look at the documentation:
http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html#downloading-objects
Use var_dump($result) to better understand the structure of the response.
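For example, a minimal sketch (the bucket and key here are placeholders): once the key is capitalized correctly, the Body stream can be turned into a plain string.

$result = $s3Client->getObject(array(
    'Bucket' => 'my-bucket',          // placeholder bucket name
    'Key'    => 'path/to/file.txt',   // placeholder object key
));

// In SDK v3, $result['Body'] is a stream object, so cast it to a string
// (or call ->getContents()) to get the file's text.
$contents = (string) $result['Body'];
echo $contents;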
I am working on a process to upload a large number of files to S3, and for my smaller files I am building a list of commands using getCommand to upload them concurrently, like this:
$commands = array();

$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key' => 'filename.ext',
    'Body' => fopen('filepath', 'r'),
));

$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key' => 'filename_2.ext',
    'Body' => fopen('filepath_2', 'r'),
));
etc.
try {
    $pool = new CommandPool($s3Client, $commands, [
        'concurrency' => 5,
        'before' => function (CommandInterface $cmd, $iterKey) {
            // Do stuff before the file starts to upload
        },
        'fulfilled' => function (ResultInterface $result, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff after the file is finished uploading
        },
        'rejected' => function (AwsException $reason, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff if the file fails to upload
        },
    ]);

    // Initiate the pool transfers
    $promise = $pool->promise();

    // Force the pool to complete synchronously
    $promise->wait();

    $promise->then(function() { echo "All the files have finished uploading!"; });
} catch (Exception $e) {
    echo "Exception Thrown: Failed to upload: ".$e->getMessage()."<br>\n";
}
This works fine for the smaller files, but some of my files are large enough that I'd like them to be automatically uploaded in multiple parts. So, instead of using getCommand('PutObject'), which uploads an entire file, I'd like to use something like getCommand('ObjectUploader') so that the larger files can be broken up automatically as needed. However, when I try to use getCommand('ObjectUploader') it throws an error saying that it doesn't know what to do with that. I'm guessing that perhaps the command has a different name, which is why it is throwing the error, but it's also possible that it simply can't be done this way.
If you've worked on something like this in the past, how have you done it? Or even if you haven't worked on it, I'm open to any ideas you might have.
Thanks!
References:
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_commands.html#command-pool
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-multipart-upload.html#object-uploader
I decided to go a different direction with this, and instead of using an array of concurrent commands I am now using a set of concurrent MultipartUploader promises, as shown in an example on this page: https://500.keboola.com/parallel-multipart-uploads-to-s3-in-php-61ff03ffc043
Here are the basics:
// Create an array of your file paths
$files = ['file1', 'file2', ...];

// Create an array to hold the promises
$promises = [];

// Create MultipartUploader objects, and add them to the promises array
foreach ($files as $filePath) {
    $uploader = new \Aws\S3\MultipartUploader($s3Client, $filePath, $uploaderOptions);
    $promises[$filePath] = $uploader->promise();
}

// Process the promises once they are all complete
$results = \GuzzleHttp\Promise\unwrap($promises);
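For reference, $uploaderOptions needs at least the destination bucket and a per-file key, so in practice it is built inside the loop. A minimal sketch (the bucket name is a placeholder, and the optional tuning values are only examples):

foreach ($files as $filePath) {
    // Each file needs its own destination key; the bucket and tuning options can be shared.
    $uploaderOptions = [
        'bucket' => 'mybucket',             // placeholder bucket name
        'key' => basename($filePath),       // destination key for this file
        //'part_size' => 5 * 1024 * 1024,   // optional: bytes per part (5 MB is the minimum)
        //'concurrency' => 3,               // optional: parts uploaded in parallel per file
    ];
    $uploader = new \Aws\S3\MultipartUploader($s3Client, $filePath, $uploaderOptions);
    $promises[$filePath] = $uploader->promise();
}

Note that unwrap() throws the exception of the first rejected promise, so wrapping that call in a try/catch gives roughly the same failure handling as the 'rejected' callback in the CommandPool approach.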
I have an Amazon SES setup that forwards/saves emails to an S3 bucket.
In S3 I have a bucket policy that reads as follows and is supposed to automatically make all objects in the folder public (I know this is not ideal):
{
    "Version": "2012-10-17",
    "Id": "Policy1517940937030",
    "Statement": [
        {
            "Sid": "Stmt1517940933454",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::pipedemail/*"
        }
    ]
}
I then have a PHP script that lists the objects in the S3 bucket and grabs each object so we can put its contents into MySQL.
My issue is that I can list the objects without any problem. However, my script then tries to parse and save the contents of each object into a MySQL table, and it only works if I manually log in to S3 and click "make public" on each of the objects, so my bucket policy does not seem to be working. I know my other option is to use $s3->getObject() to fetch the object, but I am not sure how to use that call in place of PHP's file_get_contents() so that I can get the raw email file to be parsed.
Here is my code. Of course, I've starred out the logins/API keys.
<?php
// mysql connection
$servername = "*****";
$username = "***";
$password = "****";
$databasename = "***";

$con = mysqli_connect("$servername", "$username", "$password", "$databasename");
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}

// load the php mime parse library
require_once __DIR__.'/vendor/autoload.php';
$Parser = new PhpMimeMailParser\Parser();

// Include the AWS SDK using the Composer autoloader.
require 'awssdk/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS Info
$bucketName = 'pipedemail';
$IAM_KEY = '******';
$IAM_SECRET = '*****';

// Connect to AWS
try {
    // You may need to change the region. It will say in the URL when the bucket is open
    // and on creation. us-east-2 is Ohio, us-east-1 is North Virginia
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key' => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version' => 'latest',
            'region' => 'us-east-1'
        )
    );
} catch (Exception $e) {
    // We use a die, so if this fails it stops here. Typically this is a REST call so this would
    // return a json object.
    die("Error: " . $e->getMessage());
}

// Use the high-level iterators (returns ALL of your objects).
$objects = $s3->getIterator('ListObjects', array('Bucket' => $bucketName));

foreach ($objects as $object) {
    $objectkey = $object['Key'];
    $path = "https://s3.amazonaws.com/pipedemail/$objectkey";

    // lets get the raw email file to parse it
    $Parser->setText(file_get_contents($path));

    // Once we've indicated where to find the mail, we can parse out the data
    //$to = $Parser->getHeader('to'); // "test" <test@example.com>, "test2" <test2@example.com>
    //$addressesTo = $Parser->getAddresses('to'); // Returns an array: [[test, test@example.com, false],[test2, test2@example.com, false]]
    $from = $Parser->getHeader('from'); // "test" <test@example.com>
    $addressesFrom = $Parser->getAddresses('from'); // Returns an array: test, test@example.com, false
    $subject = $Parser->getHeader('subject');

    // html of email body
    $html_emailbody = $Parser->getMessageBody('html');

    // insert the above php variables' data into a mysql table
    mysqli_query($con, "INSERT into emails(From_email, email_subject, email_body) VALUES('$from','$subject','$html_emailbody')");

    // now lets delete the object since we already took the email and saved it into mysql
    $s3->deleteObject(array('Bucket' => $bucketName, 'Key' => $objectkey));
}
?>
Yes, you should be using the getObject method as suggested by ceejayoz. You'll need to make sure the IAM user associated with the IAM key/secret you are using has s3:GetObject permissions or this call will fail.
Your code would then look something like:
$objectkey = $object['Key'];

// Get the object
$result = $s3->getObject(array(
    'Bucket' => $bucketName,
    'Key' => $objectkey
));

// lets get the raw email file to parse it
$Parser->setText($result['Body']);
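Two small notes on this: $result['Body'] is a stream object in recent SDK versions, so you may need (string) $result['Body'] when handing it to setText(). And since the question asks specifically about file_get_contents(), the SDK can also register an s3:// stream wrapper so that file_get_contents() reads private objects through your credentials rather than the public URL. A minimal sketch (reusing the bucket name from the question):

// Register the s3:// protocol on this client, then read the object directly.
$s3->registerStreamWrapper();
$rawEmail = file_get_contents("s3://pipedemail/{$objectkey}");
$Parser->setText($rawEmail);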
I can't seem to update a file in Google Drive with the following code. Everything appears to go fine, but the file remains untouched. I am working with the v3 API.
function updateFile($service, $fileId, $data) {
    try {
        $emptyFile = new Google_Service_Drive_DriveFile();
        $file = $service->files->get($fileId);
        $service->files->update($fileId, $emptyFile, array(
            'data' => $data,
            'mimeType' => 'text/csv',
            'uploadType' => 'multipart'
        ));
    } catch (Exception $e) {
        print "An error occurred: " . $e->getMessage();
    }
}
I managed to do it: you have to pass an empty file as the second argument. I'm not sure why, but this post helped me a lot: Google Drive API v3 Migration.
This is the final solution:
function updateFile($service, $fileId, $data) {
    try {
        $emptyFile = new Google_Service_Drive_DriveFile();
        $service->files->update($fileId, $emptyFile, array(
            'data' => $data,
            'mimeType' => 'text/csv',
            'uploadType' => 'multipart'
        ));
    } catch (Exception $e) {
        print "An error occurred: " . $e->getMessage();
    }
}
where $fileId is the ID of the file you are updating, and $data is the new content you are writing to the file.
Don't forget to refresh Google Drive after this, because the preview doesn't change automatically; I lost an hour on that :/. Hope this helps.
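For instance, a hypothetical call (the file ID and CSV path here are placeholders, and $service is assumed to be an authorized Google_Service_Drive instance):

// Read the new CSV content from disk and push it into the existing Drive file.
$newCsv = file_get_contents('/path/to/export.csv');   // placeholder path
updateFile($service, '0BplaceholderFileId', $newCsv); // placeholder file ID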
function updateFile($driveService, $fileId, $newDescription) {
    try {
        // Create an empty file resource to hold the new metadata.
        $emptyFile = new Google_Service_Drive_DriveFile();
        // File's new metadata.
        $emptyFile->setDescription($newDescription);
        // Send the request to the API.
        $driveService->files->update($fileId, $emptyFile, array());
        print 'success';
    } catch (Exception $e) {
        print "An error occurred: " . $e->getMessage();
    }
} // end update
This method is essential if you wish to update stuff like the description. I copied the idea from v2.
// File's new metadata.
$file->setTitle($newTitle);
$file->setDescription($newDescription);
$file->setMimeType($newMimeType);
NB: Also ensure the third parameter of the update function is an array.
Also, as stated in the other helpful answer, make sure you refresh Google Drive to check.
I've got an array of file information that is being looped through, using the AWS PHP SDK 2 to upload the contents of these files to the cloud. It all works brilliantly until I try to add metadata; at that point it adds the metadata to the first object created, but after that I get the following error message:
Fatal error: Uncaught Aws\S3\Exception\SignatureDoesNotMatchException: AWS Error Code: SignatureDoesNotMatch, Status Code: 403, AWS Request ID: 8FC9360F2EB687EE, AWS Error Type: client, AWS Error Message: The request signature we calculated does not match the signature you provided. Check your key and signing method., User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.24.0 PHP/5.3.13 thrown in D:\Inetpub\wwwroot\ThirdParty_Resources\AWS_SDK2\aws\aws-sdk-php\src\Aws\Common\Exception\NamespaceExceptionFactory.php on line 89
I've cropped the code from my loop to highlight the area that is being naughty.
foreach ($aFiles as $aFile) {
    $arr_ObjectMeta = array(
        'OriginalFileName' => $aFile['FileName']
        , 'Description' => $aFile['FileDesc']
        , 'Keywords' => $aFile['FileKeyW']
    );

    // get the file to upload
    $obj_FileUpload = $obj_S3->putObject($sBucket, $sBucketFolder . $sFilenameToSave, $sFile, 'public-read', $arr_ObjectMeta);
    if ($obj_FileUpload) {
        $files_uploaded++;
    } else {
        $files_not_uploaded++;
    }

    // clear the file upload S3 response object
    unset($obj_FileUpload);

    // delete the downloaded file
    unlink($sServerUploadFolder.$sFilenameToSave);
}
So the second time around the loop, it seems to bomb because of the different metadata values. When the metadata is the same, the loop executes without issue. Any help/pointers would be great.
You might be confusing the putObject method with the upload helper method.
The upload helper method is available as of version 2.4 of the SDK. Using the upload method you could do the following:
try {
    $sKey = $sBucketFolder . $sFilenameToSave;
    $obj_FileUpload = $obj_S3->upload($sBucket, $sKey, $sFile, 'public-read', array(
        'Metadata' => $arr_ObjectMeta
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
You can do the same thing with the putObject method as well, it is just slightly more verbose.
try {
    $obj_FileUpload = $obj_S3->putObject(array(
        'Bucket' => $sBucket,
        'Key' => $sBucketFolder . $sFilenameToSave,
        'SourceFile' => $sFile,
        'ACL' => 'public-read',
        'Metadata' => $arr_ObjectMeta
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
I'm trying to upload a local CSV file to Google Drive and display it as a Google Spreadsheet there. However, when I go to my Google Drive and click the link to my file, I can only download it, not view it as a spreadsheet. I've tried using ?convert=true, but the file doesn't get converted. I've also tried using application/vnd.google-apps.spreadsheet as the MIME type, but nothing changes, or I get a 400 Bad Request response.
When I right-click the file, I can choose to open it with Google Spreadsheets, which then displays the file correctly. I can't find anything about this in the current documentation over at Google, and searches haven't helped a whole lot.
What I've done so far is create a new, empty Google Spreadsheet and try to fill it with my CSV file, but that gives me a 500 Internal Error.
$file = new Google_DriveFile();
$file->setTitle('Exported data from ' . $this->event->title);
$file->setDescription('Exported data from ' . $this->event->title);
$file->setMimeType('text/csv');

try {
    $createdFile = $this->drive->files->insert($file, array(
        'data' => $data,
        'mimeType' => 'application/vnd.google-apps.spreadsheet'
    ), array('convert' => true));

    // Uncomment the following line to print the File ID
    // print 'File ID: %s' % $createdFile->getId();

    $additionalParams = array(
        'newRevision' => false,
        'data' => $data,
        'convert' => true // no change if this is true or false
    );

    $newFile = $this->drive->files->get($createdFile->getId());
    $newFile->setMimeType('text/csv');

    $updated = $this->drive->files->update($createdFile->getId(), $newFile, $additionalParams);
    print_r($updated);
} catch (Exception $e) {
    print "An error occurred: " . $e->getMessage();
}
I've looked at the API for Google Drive but haven't found anything useful. I'm wondering whether I should use the Google Spreadsheets API, or whether the Google Drive API alone is the one to use?
Many thanks in advance,
Waldemar
Try the following. It's from the File:insert reference in the Google documentation but I added the convert parameter:
/**
 * Insert new file.
 *
 * @param Google_DriveService $service Drive API service instance.
 * @param string $title Title of the file to insert, including the extension.
 * @param string $description Description of the file to insert.
 * @param string $parentId Parent folder's ID.
 * @param string $mimeType MIME type of the file to insert.
 * @param string $filename Filename of the file to insert.
 * @return Google_DriveFile The file that was inserted. NULL is returned if an API error occurred.
 */
function insertFile($service, $title, $description, $parentId, $mimeType, $filename) {
    $file = new Google_DriveFile();
    $file->setTitle($title);
    $file->setDescription($description);
    $file->setMimeType($mimeType);

    // Set the parent folder.
    if ($parentId != null) {
        $parent = new ParentReference();
        $parent->setId($parentId);
        $file->setParents(array($parent));
    }

    try {
        $data = file_get_contents($filename);

        $createdFile = $service->files->insert($file, array(
            'data' => $data,
            'mimeType' => $mimeType,
            'convert' => true,
        ));

        // Uncomment the following line to print the File ID
        // print 'File ID: %s' % $createdFile->getId();

        return $createdFile;
    } catch (Exception $e) {
        print "An error occurred: " . $e->getMessage();
    }
}
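A hypothetical usage, assuming $service is an authorized Google_DriveService instance (the titles and the local filename are placeholders):

// Upload a local CSV and let Drive convert it to a Google Spreadsheet.
$created = insertFile($service, 'Exported data', 'CSV export', null, 'text/csv', 'export.csv');
if ($created !== null) {
    // The ID of the converted document.
    printf("File ID: %s\n", $created->getId());
}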
I tried the above code but it is not working for me; it just stops after login. Just use:
$file->setMimeType('application/vnd.google-apps.spreadsheet');
$createdFile = $service->files->insert($file, array(
    'data' => $data,
    'mimeType' => 'text/csv',
    'convert' => true,
));
It works for me.
I had to add another parameter to make it work:
'uploadType' => 'multipart'
$createdFile = $service->files->insert($file, array(
    'data' => $data,
    'mimeType' => 'text/csv',
    'convert' => true,
    'uploadType' => 'multipart',
));
Now it's working.