I'm using the AWS SDK to connect to an S3 bucket. I have built an IAM policy that allows read access using "s3:GetObject", "s3:GetObjectAcl" and "s3:ListBucket", and I can use the AWS CLI to view objects and list files (including with ListObjectsV2). When I use:
$file = $s3client->getObject([
    'Bucket' => $bucket,
    'Key' => $key,
]);
$body = $file->get('Body');
$body->rewind();
echo "Downloaded the file and it begins with: {$body->read(26)}.\n";
I can view file contents but when I try:
$contents = $s3client->listObjectsV2([
    'Bucket' => $bucket
]);
echo "The contents of your bucket are: \n";
foreach ($contents['Contents'] as $content) {
    echo $content['Key'] . "\n";
}
I receive a super helpful error:
Fatal error: Uncaught Error: Class 'SimpleXMLElement' not found in /var/www/html/vendor/aws/aws-sdk-php/src/Api/Parser/PayloadParserTrait.php:39 Stack trace: ....
It goes on but I figure it's all trash so not continuing.
I grabbed the code straight from: https://docs.aws.amazon.com/code-samples/latest/catalog/php-s3-s3_basics-GettingStartedWithS3.php.html
Any suggestions on what is wrong?
sudo apt-get install php-xml
doh
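For the record: the SDK's response parser relies on PHP's SimpleXML extension (provided by the php-xml package) to decode S3's XML responses, which is why listObjectsV2 blew up while getObject, which just streams bytes back, worked fine. A quick sanity check after installing:

<?php
// Should print bool(true) once php-xml is installed and PHP reloaded.
var_dump(class_exists('SimpleXMLElement'));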
Related
I can connect to my OVH storage:
$s3Client = new S3Client([
    'profile' => 'default',
    'endpoint' => 'http://storage.sbg.cloud.ovh.net',
    'region' => 'SBG',
    'version' => 'latest',
    'credentials' => array(
        'key' => $key,
        'secret' => $secret,
    ),
]);
and list all my containers:
$result = $s3Client->listBuckets();
var_dump($result);
But I cannot create a new one:
$s3Client->createBucket(array('Bucket' => 'cont1',));
So I created it in the OVH web interface: Swift, SBG, private, cont1.
When I try to upload a file into this new container:
$result = $s3Client->upload('cont1', 'test.123', fopen($file_name, 'rb'), 'public-read');
I get:
PHP Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing "PutObject" on "http://moncontainer.storage.sbg.cloud.ovh.net/test.zip";
AWS HTTP error: Client error: `PUT http://moncontainer.storage.sbg.cloud.ovh.net/test.zip` resulted in a `400 Bad Request` response:
MalformedXML: The XML you provided was not well-formed (truncated...)
MalformedXML (client): The XML you provided was not well-formed or did not validate against our published schema -
MalformedXML: The XML you provided was not well-formed or did not validate against our published schema tx7bd76ec0ba114f32951bb-0061d553da'
GuzzleHttp\Exception\ClientException: Client error: `PUT http://moncontainer.storage.sbg.cloud.ovh.net/test.zip` resulted in a `400 Bad Request` response:
MalformedXML: The ... in /home/SrvWeb/BackupOVH/Proc/JSR/aws/Aws/WrappedHttpHandler.php on line 195
but if I use 'cont1/' (with a trailing slash):
$result = $s3Client->upload('cont1/', 'test.123', fopen($file_name, 'rb'), 'public-read');
an object is created in 'cont1', but it is named '/test.123'.
What is the correct way to create a container, and how do I upload a file with the right name?
Good news... part of the answer:
To push a file into a container with the correct filename:
$containername = 'moncontainer';
$result = $s3Client->upload($containername, $containername.'/'.basename($file_Path), fopen($file_Path, 'rb'), 'public-read');
Still investigating how to create the container :(
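For what it's worth, a hedged guess at the createBucket failure: S3-compatible gateways can be picky about the CreateBucketConfiguration body, so explicitly naming a LocationConstraint is worth a try. A sketch, where the 'SBG' value just mirrors the region configured on the client above and is an assumption to verify, not a confirmed OVH requirement:

$s3Client->createBucket(array(
    'Bucket' => 'cont1',
    // Assumed value: 'SBG' copies the client's region; OVH's gateway
    // may expect a different constraint string.
    'CreateBucketConfiguration' => array(
        'LocationConstraint' => 'SBG',
    ),
));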
I have followed this tutorial http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/ to upload images to Amazon S3 with Fine Uploader, and uploading works fine. The problem is when I want to view a picture I have uploaded to S3.
I get the error
[06-Jan-2015 12:30:19 Europe/Berlin] PHP Fatal error: Uncaught Aws\S3\Exception\AccessDeniedException: AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: 6F9935EA1BE9F4F5, AWS Error Type: client, AWS Error Message: Access Denied, User-Agent: aws-sdk-php2/2.7.12 Guzzle/3.9.2 curl/7.24.0 PHP/5.3.28 ITR
thrown in /home/connecti/public_html/aws/Aws/Common/Exception/NamespaceExceptionFactory.php on line 91
When I run this test example:
<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

// Instantiate the S3 client with your AWS credentials
$s3 = S3Client::factory(array(
    'key'    => 'MY_KEY',
    'secret' => 'MY_SECRET_KEY',
));

$bucket = 'MY_BUCKET';

// Use the high-level iterators (returns ALL of your objects).
$objects = $s3->getIterator('ListObjects', array('Bucket' => $bucket));
echo "Keys retrieved!\n";
foreach ($objects as $object) {
    echo $object['Key'] . "\n";
}

// Use the plain API (returns ONLY up to 1000 of your objects).
$result = $s3->listObjects(array('Bucket' => $bucket));
echo "Keys retrieved!\n";
foreach ($result['Contents'] as $object) {
    echo $object['Key'] . "\n";
}
?>
My key, secret key and bucket are correct!
Same with other examples.
What do I need to do? Can anyone give me an example of how to display an image uploaded by Fine Uploader, and tell me whether I have to change any settings on Amazon (in addition to what I have done from the Fine Uploader blog)?
The error message suggests that the server-side key does not have the permissions needed to make a ListObjects (or related) call on the bucket in question. You'll need to re-evaluate the IAM user/group associated with your server-side key and ensure it has all the required permissions.
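For reference, a minimal IAM policy that would satisfy those calls might look like the following (MY_BUCKET is a placeholder; note that s3:ListBucket attaches to the bucket ARN itself, while s3:GetObject attaches to the objects under it):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::MY_BUCKET"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::MY_BUCKET/*"
    }
  ]
}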
I have been tasked with connecting to an S3 bucket and, following the documentation, have the following:
<?php
define('AWS_KEY', 'key in here');
define('AWS_SECRET_KEY', 'key in here');
define('HOST', 'https://console.aws.amazon.com/s3/home?region=us-east-1#');

use Aws\S3\S3Client;

// Establish connection with DreamObjects with an S3 client.
$client = S3Client::factory(array(
    'base_url' => HOST,
    'key'      => AWS_KEY,
    'secret'   => AWS_SECRET_KEY
));

// list owned buckets
$blist = $client->listBuckets();
echo " Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
foreach ($blist['Buckets'] as $b) {
    echo "{$b['Name']}\t{$b['CreationDate']}\n";
}

// list Bucket contents
$o_iter = $client->getIterator('ListObjects', array(
    'Bucket' => $bucketname
));
foreach ($o_iter as $o) {
    echo "{$o['Key']}\t{$o['Size']}\t{$o['LastModified']}\n";
}
but I get the error in the title. Any ideas? I have my access keys, but I am confused about how to fix this issue.
It is probably a bad idea to hardcode your access key and secret key, or to pass them through environment variables.
A better design pattern would be to leverage an EC2 role or to use the SDK configuration file (see http://docs.aws.amazon.com/aws-sdk-php/guide/latest/credentials.html for details).
The base_url argument you're using is invalid: it is the URL of the console, not of the service. You can simply omit this parameter (as per http://docs.aws.amazon.com/aws-sdk-php/guide/latest/configuration.html#client-configuration-options) and the SDK will build the endpoint for you automatically.
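Putting both points together, a minimal sketch of the corrected setup (SDK v2 style to match the question; the region value is an assumption, set it to wherever your bucket lives):

<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

// No base_url: the SDK derives the service endpoint from the region.
// Inline keys are shown only to mirror the question; prefer an EC2
// role or the SDK credentials file as noted above.
$client = S3Client::factory(array(
    'key'    => AWS_KEY,
    'secret' => AWS_SECRET_KEY,
    'region' => 'us-east-1', // assumption: your bucket's region
));

$blist = $client->listBuckets();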
I've got an array of file information that is being looped through, using the AWS PHP 2 SDK to upload the contents of these files to the cloud. It all works brilliantly until I try to add metadata: at that point it adds the metadata to the first object created, but after that I get the following error message:
Fatal error: Uncaught Aws\S3\Exception\SignatureDoesNotMatchException: AWS Error Code: SignatureDoesNotMatch, Status Code: 403, AWS Request ID: 8FC9360F2EB687EE, AWS Error Type: client, AWS Error Message: The request signature we calculated does not match the signature you provided. Check your key and signing method., User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.24.0 PHP/5.3.13 thrown in D:\Inetpub\wwwroot\ThirdParty_Resources\AWS_SDK2\aws\aws-sdk-php\src\Aws\Common\Exception\NamespaceExceptionFactory.php on line 89
I've cropped the code from my loop to highlight the area that is being naughty.
foreach ($aFiles as $aFile) {
    $arr_ObjectMeta = array(
        'OriginalFileName' => $aFile['FileName'],
        'Description' => $aFile['FileDesc'],
        'Keywords' => $aFile['FileKeyW']
    );

    // get the file to upload
    $obj_FileUpload = $obj_S3->putObject($sBucket, $sBucketFolder . $sFilenameToSave, $sFile, 'public-read', $arr_ObjectMeta);

    if ($obj_FileUpload) {
        $files_uploaded++;
    } else {
        $files_not_uploaded++;
    }

    // clear the file upload S3 response object
    unset($obj_FileUpload);

    // delete the downloaded file
    unlink($sServerUploadFolder . $sFilenameToSave);
}
So the second time around the loop, it seems to bomb because of the different metadata values. When the metadata is the same, the loop executes without issue. Any help/pointers would be great.
You might be confusing the putObject method with the upload helper method.
The upload helper method is available as of version 2.4 of the SDK. Using the upload method you could do the following:
try {
    $sKey = $sBucketFolder . $sFilenameToSave;
    // The upload helper takes extra PutObject parameters under 'params'.
    $obj_FileUpload = $obj_S3->upload($sBucket, $sKey, $sFile, 'public-read', array(
        'params' => array('Metadata' => $arr_ObjectMeta)
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
You can do the same thing with the putObject method as well, it is just slightly more verbose.
try {
    $obj_FileUpload = $obj_S3->putObject(array(
        'Bucket' => $sBucket,
        'Key' => $sBucketFolder . $sFilenameToSave,
        'SourceFile' => $sFile,
        'ACL' => 'public-read',
        'Metadata' => $arr_ObjectMeta
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
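One more thing worth checking if the failures track particular metadata values: S3 transmits user metadata as x-amz-meta-* HTTP headers, and those headers are included in the request signature, so a value containing a newline or other non-ASCII characters can trigger SignatureDoesNotMatch for just those files. A defensive sketch (the printable-ASCII restriction is an assumption about what your values should contain):

// Reduce metadata values to trimmed, printable ASCII before uploading.
$arr_ObjectMeta = array_map(function ($value) {
    return trim(preg_replace('/[^\x20-\x7E]/', '', (string) $value));
}, $arr_ObjectMeta);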
I have been testing some PHP scripts that use the aws-sdk-php to upload files to S3 storage. These scripts seem to work nicely when they are executed directly from the browser, but fail when trying to use them through an API class in Luracast Restler 3.0.
A sample script that uploads a dummy file looks like the following:
<?php
require_once (dirname(__FILE__) . "/../lib/aws/aws.phar");

use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;
use Aws\Common\Enum\Region;

function test() {
    // Instantiate an S3 client
    $s3 = Aws::factory(array(
        'key'    => 'key',
        'secret' => 'secret',
        'region' => Region::EU_WEST_1
    ))->get('s3');

    try {
        $s3->putObject(array(
            'Bucket' => 'my-bucket',
            'Key' => '/test/test.txt',
            'Body' => 'example file uploaded from php!',
            'ACL' => CannedAcl::PUBLIC_READ
        ));
    } catch (S3Exception $e) {
        echo "There was an error uploading the file.\n";
    }
}
This script is placed in the folder /util/test.php, while the aws-php-sdk is at /lib/aws/aws.phar
To verify that this test method is well written, I have created another PHP script at /test/upload.php with the following code:
<?php
require_once (dirname(__FILE__) . '/../util/test.php');
test();
So I can open http://mydomain.com/test/upload.php in the browser, and all works as expected: the file is uploaded to S3 storage.
However when I call the function test() from an API class with the Restler framework, I have an error that says that Aws cannot be found:
Fatal error: Class 'Aws\Common\Aws' not found in /var/app/current/util/test.php on line 12
However, the code is exactly the same code that works perfectly when called from upload.php. I have wasted hours trying to figure out what is happening here, but I cannot reach any conclusion. It is as if the aws-php-sdk autoloader does not work properly under some circumstances.
Any hints?
This held me back for quite some time.
The issue is caused by Restler's autoloader, which calls spl_autoload_unregister, as described here:
https://github.com/Luracast/Restler/issues/72
One way to get around the problem is to comment out the relevant lines in vendor/Luracast/Restler/AutoLoader.php:
public static function thereCanBeOnlyOne() {
    if (static::$perfectLoaders === spl_autoload_functions())
        return static::$instance;

    /*
    if (false !== $loaders = spl_autoload_functions())
        if (0 < $count = count($loaders))
            for ($i = 0, static::$rogueLoaders += $loaders;
                 $i < $count && false != ($loader = $loaders[$i]);
                 $i++)
                if ($loader !== static::$perfectLoaders[0])
                    spl_autoload_unregister($loader);
    */

    return static::$instance;
}
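If you would rather not patch vendor code, a quick diagnostic to confirm this is what is happening is to dump the autoloader stack before and after Restler boots (the paths and bootstrap filename below are illustrative, not Restler's actual layout):

<?php
require_once __DIR__ . '/../lib/aws/aws.phar';
var_dump(spl_autoload_functions()); // the AWS loader should be registered here

require_once __DIR__ . '/vendor/restler.php'; // hypothetical Restler bootstrap
var_dump(spl_autoload_functions()); // check whether the AWS loader survived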