I've got an array of file information that is looped through, using the AWS PHP SDK 2 to upload the contents of these files to the cloud. It all works brilliantly until I try to add metadata: at that point it adds the metadata to the first object created, but after that I get the following error message:
Fatal error: Uncaught Aws\S3\Exception\SignatureDoesNotMatchException: AWS Error Code: SignatureDoesNotMatch, Status Code: 403, AWS Request ID: 8FC9360F2EB687EE, AWS Error Type: client, AWS Error Message: The request signature we calculated does not match the signature you provided. Check your key and signing method., User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.24.0 PHP/5.3.13 thrown in D:\Inetpub\wwwroot\ThirdParty_Resources\AWS_SDK2\aws\aws-sdk-php\src\Aws\Common\Exception\NamespaceExceptionFactory.php on line 89
I've cropped the code from my loop to highlight the area that is being naughty.
foreach ($aFiles as $aFile) {
    $arr_ObjectMeta = array(
        'OriginalFileName' => $aFile['FileName'],
        'Description'      => $aFile['FileDesc'],
        'Keywords'         => $aFile['FileKeyW']
    );
    // get the file to upload
    $obj_FileUpload = $obj_S3->putObject($sBucket, $sBucketFolder . $sFilenameToSave, $sFile, 'public-read', $arr_ObjectMeta);
    if ($obj_FileUpload) {
        $files_uploaded++;
    } else {
        $files_not_uploaded++;
    }
    // clear the file upload S3 response object
    unset($obj_FileUpload);
    // delete the downloaded file
    unlink($sServerUploadFolder . $sFilenameToSave);
}
So the second time around the loop, it seems to bomb because of the different metadata values. When the metadata is the same, the loop executes without issue. Any help/pointers would be great.
You might be confusing the putObject method with the upload helper method.
The upload helper method is available as of version 2.4 of the SDK. Using the upload method you could do the following:
try {
    $sKey = $sBucketFolder . $sFilenameToSave;
    $obj_FileUpload = $obj_S3->upload($sBucket, $sKey, $sFile, 'public-read', array(
        'params' => array('Metadata' => $arr_ObjectMeta)
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
You can do the same thing with the putObject method as well; it is just slightly more verbose.
try {
    $obj_FileUpload = $obj_S3->putObject(array(
        'Bucket'     => $sBucket,
        'Key'        => $sBucketFolder . $sFilenameToSave,
        'SourceFile' => $sFile,
        'ACL'        => 'public-read',
        'Metadata'   => $arr_ObjectMeta
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
When I open the page, the output is:
File ID: 1qG8tteyVhAbB_rbu_VUvaE9ReqnSjEAh...
But on Google Drive no new files are created. I eventually want to run this upload from cron, but for now I only want to fetch test.pdf and upload it.
require_once './google-api-php-client/vendor/autoload.php';

use Google\Client;
use Google\Service\Drive;

function uploadBasic()
{
    try {
        $client = new Client();
        //$client->useApplicationDefaultCredentials();
        $client->setAuthConfig('./google-api-php-client/1710-6c50418be6b2.json');
        $client->addScope(Drive::DRIVE);
        $driveService = new Drive($client);
        $fileMetadata = new Drive\DriveFile(array(
            'parents'  => ['225qhcKKyf8Ot0IhrRxRtqgHNTxLV1LiyI'],
            'name'     => 'test.pdf',
            'mimeType' => 'application/pdf'));
        $mimeType = mime_content_type($fileMetadata);
        $content = file_get_contents('https://example.com/test.pdf');
        $file = $driveService->files->create($fileMetadata, array(
            'data'       => $content,
            'mimeType'   => 'application/pdf',
            'uploadType' => 'multipart',
            'fields'     => 'id'));
        printf("File ID: %s\n", $file->id);
        return $file->id;
    } catch (Exception $e) {
        echo "Error Message: " . $e;
    }
}

uploadBasic();
How do I debug this issue?
The fastest way to debug this is to call files.list. This will tell you whether the file was in fact uploaded.
You are not setting parents in your metadata, so the file will have been uploaded to the root directory.
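For instance, a quick files.list call will show whether the file exists and which folder it landed in. A sketch, assuming the authorized $driveService instance from the question:

// List matching files to confirm the upload and inspect its parents.
$response = $driveService->files->listFiles(array(
    'q'      => "name = 'test.pdf'",        // filter by the uploaded file's name
    'fields' => 'files(id, name, parents)', // include parents to see the folder
));
foreach ($response->getFiles() as $driveFile) {
    printf("%s (%s) parents: %s\n",
        $driveFile->getName(),
        $driveFile->getId(),
        implode(',', (array) $driveFile->getParents()));
}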
service account
Remember, if you are using a service account, the files are uploaded into the service account's Google Drive account, not your personal drive account.
To upload to your personal drive account, you would need to create a directory in your drive account and share that directory with the service account, using the service account's email address. The service account email address can be found in the JSON key file; it's the only one with an @.
Then set parents in the metadata to the folder on your drive account:
$fileMetadata = new Drive\DriveFile(array(
    'parents' => array('FolderId'),
    'name'    => 'ASB-Background-3.png'));
File 0 size error after edit
You edited your question. It originally stated you were doing this:
$content = file_get_contents('./google-api-php-client/ASB-Background-3.png');
It is bad practice to update your question and change your code; it changes the answer to your question, and in this case your error message.
That being said, from the documentation for file_get_contents:
file_get_contents — Reads entire file into a string
There is nothing in the documentation that states that this method can load a file from a URL, so your edit switching to a URL is probably not going to work.
file_get_contents('https://example.com/test.pdf');
Your file is uploading with size 0 because you're not giving it a file. Download the file onto the machine running the code and then send it, or write your own method which accepts a URL and returns the file contents as a string.
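If you do want to fetch the file from a URL first, a minimal cURL-based helper could do it (the function name and error handling here are my own):

// Hypothetical helper: fetch a URL and return the body as a string, or false on failure.
function fetch_url_contents($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return ($body !== false && $status === 200) ? $body : false;
}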
upload image
Files are uploaded in two parts: first the file metadata, then the file itself.
The MIME type must be set to match the file you are uploading, and file_get_contents will only work on a file that is currently accessible to your code.
If the file size is 0, make sure to:
check the most recently uploaded file; every create call creates a new file.
ensure that your code has access to the file you are uploading.
confirm that the mimeType is correct (a detection sketch follows).
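On that last point, the MIME type can be detected from the local file rather than hard-coded. A small sketch using PHP's mime_content_type (the path is a placeholder):

$path = '../files/photo.jpg';         // placeholder local path
$detected = mime_content_type($path); // e.g. "image/jpeg", or false on failure
if ($detected === false) {
    die("Cannot read $path - check the path and permissions.\n");
}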
Sample.
try {
    $client = new Client();
    $client->useApplicationDefaultCredentials();
    $client->addScope(Drive::DRIVE);
    $driveService = new Drive($client);
    $fileMetadata = new Drive\DriveFile(array(
        'name' => 'photo.jpg'));
    $content = file_get_contents('../files/photo.jpg');
    $file = $driveService->files->create($fileMetadata, array(
        'data'       => $content,
        'mimeType'   => 'image/jpeg',
        'uploadType' => 'multipart',
        'fields'     => 'id'));
    printf("File ID: %s\n", $file->id);
    return $file->id;
} catch (Exception $e) {
    echo "Error Message: " . $e;
}
I am working on a process to upload a large number of files to S3, and for my smaller files I am building a list of commands using getCommand to upload them concurrently, like this:
$commands = array();

$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key'    => 'filename.ext',
    'Body'   => fopen('filepath', 'r'),
));

$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key'    => 'filename_2.ext',
    'Body'   => fopen('filepath_2', 'r'),
));
etc.
try {
    $pool = new CommandPool($s3Client, $commands, [
        'concurrency' => 5,
        'before' => function (CommandInterface $cmd, $iterKey) {
            // Do stuff before the file starts to upload
        },
        'fulfilled' => function (ResultInterface $result, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff after the file is finished uploading
        },
        'rejected' => function (AwsException $reason, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff if the file fails to upload
        },
    ]);

    // Initiate the pool transfers
    $promise = $pool->promise();

    // Force the pool to complete synchronously
    $promise->wait();

    $promise->then(function () { echo "All the files have finished uploading!"; });
} catch (Exception $e) {
    echo "Exception Thrown: Failed to upload: " . $e->getMessage() . "<br>\n";
}
This works fine for the smaller files, but some of my files are large enough that I'd like them to automatically be uploaded in multiple parts. So, instead of using getCommand('PutObject'), which uploads an entire file, I'd like to use something like getCommand('ObjectUploader') so that the larger files can be automatically broken up as needed. However, when I try to use getCommand('ObjectUploader') it throws an error and says that it doesn't know what to do with that. I'm guessing that perhaps the command has a different name, which is why it is throwing the error. But, it's also possible that it's not possible to do it like this.
If you've worked on something like this in the past, how have you done it? Or even if you haven't worked on it, I'm open to any ideas you might have.
Thanks!
References:
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_commands.html#command-pool
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-multipart-upload.html#object-uploader
I decided to go a different direction with this, and instead of using an array of concurrent commands I am now using a set of concurrent MultipartUploader promises, as shown in an example on this page: https://500.keboola.com/parallel-multipart-uploads-to-s3-in-php-61ff03ffc043
Here are the basics:
// Create an array of your file paths
$files = ['file1', 'file2', ...];

// Create an array to hold the promises
$promises = [];

// Create MultipartUploader objects, and add them to the promises array
foreach ($files as $filePath) {
    $uploader = new \Aws\S3\MultipartUploader($s3Client, $filePath, $uploaderOptions);
    $promises[$filePath] = $uploader->promise();
}

// Process the promises once they are all complete
$results = \GuzzleHttp\Promise\unwrap($promises);
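Note that $uploaderOptions (not defined above) needs at least the destination bucket and key; since the key normally differs per file, you would typically build the options inside the loop. A sketch with placeholder values:

// Assumed per-file options for MultipartUploader; 'bucket' and 'key' are required.
$uploaderOptions = [
    'bucket'    => 'mybucket',          // placeholder bucket name
    'key'       => basename($filePath), // placeholder: derive the key from the file path
    'part_size' => 5 * 1024 * 1024,     // optional: 5 MB parts (the minimum allowed)
];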
I am trying to read the data from a txt file in my Amazon S3 bucket, but the body key in the response array is NULL. My code:
function s3_file_get_contents($path, $private = TRUE, $bucket = '') {
    require_once(CODE_BASE_DIR . '/ds_engine/docSuggest/external/aws-sdk-3/aws-autoloader.php');

    try {
        $s3Client = new Aws\S3\S3Client(array(
            'region'      => S3_ENDPOINT_REGION,
            'version'     => S3_ENDPOINT_VERSION,
            'credentials' => array(
                'key'    => S3_SUGGESTADOC_API_KEY,
                'secret' => S3_SUGGESTADOC_API_SECRET,
            ),
        ));

        $result = $s3Client->getObject(array(
            'Bucket' => $private ? S3_BUCKET_DOCSUGGEST : S3_BUCKET_SUGGESTADOC,
            'Key'    => $path,
        ));
    } catch (Exception $e) {
        $error = $e->getMessage();
        log_message('ERROR', '[' . __FUNCTION__ . '] Exception: ' . $error);
    }

    die(print_array($result['body']));

    return $error ? $error : $result['body'];
}
The file contains some text, but nothing is displayed in the console. Rest assured, I have set up the connection properly and there are no issues with that. I am able to download the file, just not read from it.
P.S. - The response metadata has an object URL, and using that the file can be downloaded. So I guess I am hitting the correct path, but still no success.
The data is in $result['Body'], not in $result['body'].
Look at the documentation:
http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html#downloading-objects
Use var_dump($result) to better understand the structure of the response.
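Also note that Body is returned as a stream object rather than a plain string, so cast it (or call its getContents() method) to read the text. A minimal sketch reusing the question's client:

$result = $s3Client->getObject(array(
    'Bucket' => S3_BUCKET_DOCSUGGEST,
    'Key'    => $path,
));
// Body is a stream; casting it to string yields the file contents.
$contents = (string) $result['Body']; // or $result['Body']->getContents()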
I have followed this tutorial http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/ to upload images to Amazon S3 with Fine Uploader, and uploading works fine. The problem is when I want to view a picture I have uploaded to S3.
I get the error
[06-Jan-2015 12:30:19 Europe/Berlin] PHP Fatal error: Uncaught Aws\S3\Exception\AccessDeniedException: AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: 6F9935EA1BE9F4F5, AWS Error Type: client, AWS Error Message: Access Denied, User-Agent: aws-sdk-php2/2.7.12 Guzzle/3.9.2 curl/7.24.0 PHP/5.3.28 ITR
thrown in /home/connecti/public_html/aws/Aws/Common/Exception/NamespaceExceptionFactory.php on line 91
When I run this test example:
<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

// Instantiate the S3 client with your AWS credentials
$s3 = S3Client::factory(array(
    'key'    => 'MY_KEY',
    'secret' => 'MY_SECRET_KEY',
));

$bucket = 'MY_BUCKET';

// Use the high-level iterators (returns ALL of your objects).
$objects = $s3->getIterator('ListObjects', array('Bucket' => $bucket));

echo "Keys retrieved!\n";
foreach ($objects as $object) {
    echo $object['Key'] . "\n";
}

// Use the plain API (returns ONLY up to 1000 of your objects).
$result = $s3->listObjects(array('Bucket' => $bucket));

echo "Keys retrieved!\n";
foreach ($result['Contents'] as $object) {
    echo $object['Key'] . "\n";
}
?>
My key, secret key, and bucket are correct!
The same happens with other examples.
What do I need to do? Can anyone give me an example of how to display an image uploaded by Fine Uploader, and tell me whether I have to change any settings on Amazon (in addition to what I have done from the Fine Uploader blog)?
The error message suggests that the server-side key does not have proper permissions to make a ListObjects or some related call on the bucket in question. You'll need to re-evaluate the IAM user/group associated with your server-side key and ensure it has all required assigned permissions.
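Once the permissions are in place, one way to display a privately stored image is a time-limited signed URL. A sketch using SDK 2's getObjectUrl (the bucket and key are placeholders):

// Generate a pre-signed URL that is valid for 10 minutes (SDK 2).
$signedUrl = $s3->getObjectUrl('MY_BUCKET', 'uploads/image.jpg', '+10 minutes');
echo '<img src="' . htmlspecialchars($signedUrl) . '" alt="uploaded image">';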
Below is the code I am using in my data upload program.
if (isset($_POST['upload'])) {
    try {
        $dailyUploadsFile = $analytics->management_dailyUploads->upload(
            $_REQUEST['accountId'],          // your account ID
            $_REQUEST['webPropertyId'],      // your web property ID
            $_REQUEST['customDataSourceId'], // your custom data source UID
            $_REQUEST['datepicker'],         // date
            $_REQUEST['appendNumber'],       // append number
            'cost',                          // type of data
            array(
                "reset"      => $_REQUEST['reset'],
                "data"       => file_get_contents_curl($_REQUEST['csvFile']),
                "mimeType"   => 'application/octet-stream',
                "uploadType" => 'media'));
    } catch (Exception $e) {
        die('An error occurred: ' . $e->getMessage() . "\n");
    }
}
Here is the error I am getting when I hit submit:
An error occured: Error calling POST https://www.googleapis.com/upload/analytics/v3/management/accounts/34620205/webproperties/UA-34620205-1/customDataSources/P4Zlk69kSCOtVVIu7iFjqw/dailyUploads/2013-03-09/uploads?appendNumber=1&type=cost&reset=true&uploadType=media&key=AIzaSyDzvHpTNC_CKAnpyfnc1Vjwl_joE5hgBhc: (400) Media type 'application/x-www-form-urlencoded' is not supported. Valid media types: [application/octet-stream]
Please help.
The dailyUploads resource of the Analytics API has been deprecated. It is recommended that you use the uploads resource instead.
The following example should work for PHP, given that you have an authorized Analytics service object:
$analytics->management_uploads->uploadData(
    '123456',           // account ID
    'UA-123456-1',      // web property ID
    '122333444455555',  // custom data source ID
    array(
        'data'       => file_get_contents('example.csv'),
        'mimeType'   => 'application/octet-stream',
        'uploadType' => 'media'));
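To verify that an upload was accepted, you can also list the uploads for the data source. A sketch with the same placeholder IDs, assuming the same authorized $analytics service object:

// List recent uploads for the custom data source and print their status.
$uploads = $analytics->management_uploads->listManagementUploads(
    '123456',           // account ID
    'UA-123456-1',      // web property ID
    '122333444455555'); // custom data source ID
foreach ($uploads->getItems() as $upload) {
    printf("%s: %s\n", $upload->getId(), $upload->getStatus());
}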