I have followed this tutorial http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/ to upload images to Amazon S3 with Fine Uploader, and uploading works fine. The problem is when I want to see a picture I have uploaded to S3.
I get this error:
[06-Jan-2015 12:30:19 Europe/Berlin] PHP Fatal error: Uncaught Aws\S3\Exception\AccessDeniedException: AWS Error Code: AccessDenied, Status Code: 403, AWS Request ID: 6F9935EA1BE9F4F5, AWS Error Type: client, AWS Error Message: Access Denied, User-Agent: aws-sdk-php2/2.7.12 Guzzle/3.9.2 curl/7.24.0 PHP/5.3.28 ITR
thrown in /home/connecti/public_html/aws/Aws/Common/Exception/NamespaceExceptionFactory.php on line 91
When I run this test example:
<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

// Instantiate the S3 client with your AWS credentials
$s3 = S3Client::factory(array(
    'key'    => 'MY_KEY',
    'secret' => 'MY_SECRET_KEY',
));

$bucket = 'MY_BUCKET';

// Use the high-level iterators (returns ALL of your objects).
$objects = $s3->getIterator('ListObjects', array('Bucket' => $bucket));
echo "Keys retrieved!\n";
foreach ($objects as $object) {
    echo $object['Key'] . "\n";
}

// Use the plain API (returns ONLY up to 1000 of your objects).
$result = $s3->listObjects(array('Bucket' => $bucket));
echo "Keys retrieved!\n";
foreach ($result['Contents'] as $object) {
    echo $object['Key'] . "\n";
}
?>
My key, secret key and bucket are correct!
I get the same error with other examples.
What do I need to do? Can anyone give me an example of how to show an image uploaded by Fine Uploader, and tell me whether I have to change any settings on Amazon (in addition to what I have done from the Fine Uploader blog)?
The error message suggests that the server-side key does not have the permissions needed to make a ListObjects (or some related) call on the bucket in question; ListObjects in particular requires the s3:ListBucket permission on the bucket itself. You'll need to re-evaluate the IAM user/group associated with your server-side key and ensure it has all of the required permissions.
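Once the permissions are sorted out, one way to show an uploaded image is to generate a presigned URL for it and put that in an img tag. Below is a minimal sketch using SDK v2's getObjectUrl; it assumes you stored the object key (for example the UUID/filename Fine Uploader reported) when the upload finished, and the key shown here is only a placeholder:

<?php
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'key'    => 'MY_KEY',
    'secret' => 'MY_SECRET_KEY',
));

// The key you saved when Fine Uploader finished uploading (placeholder value).
$key = 'some-uuid/myphoto.jpg';

// Presigned URL, valid for 10 minutes; works even if the object is private.
$url = $s3->getObjectUrl('MY_BUCKET', $key, '+10 minutes');

echo '<img src="' . htmlspecialchars($url) . '" alt="uploaded image">';

If the objects are uploaded with a public-read ACL instead, calling getObjectUrl without the expiry argument returns a plain, unsigned URL.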
Related
I am trying an S3-compatible object storage from IDrive (e2), and I installed the SDK with composer require aws/aws-sdk-php.
But it shows an error in my code window when I run the code below.
<?php
require "../vendor/autoload.php";

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Create an S3 client for IDrive e2
$profile_credentials = [
    "profile" => "default",
    "endpoint" => "l4g4.ch11.idrivee2-2.com",
    "region" => "Chicago",
    "version" => "latest",
    "use_path_style_endpoint" => true,
];
$s3 = S3Client::factory($profile_credentials);

// Get list of buckets
try {
    $buckets = $s3->listBuckets();
    // print bucket names
    foreach ($buckets["Buckets"] as $bucket) {
        echo "{$bucket["Name"]}\n";
    }
} catch (AwsException $e) {
    echo "Error: {$e->getMessage()}" . PHP_EOL;
}
I just want a working example for this S3 object storage. I have also tried the examples in the Amazon AWS documentation, but I haven't found a solution and the error keeps coming.
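For what it's worth, here is a minimal sketch of a client configuration that generally works against S3-compatible endpoints with SDK v3. The concrete values are assumptions on my part: the endpoint needs the https:// scheme, the credentials are placeholders, and the region is only a signing placeholder, so check IDrive e2's documentation for the exact endpoint and region it expects:

<?php
require "../vendor/autoload.php";

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Sketch: S3 client pointed at a custom, S3-compatible endpoint.
$s3 = new S3Client([
    "version" => "latest",
    "region" => "us-east-1",                          // placeholder region used for request signing
    "endpoint" => "https://l4g4.ch11.idrivee2-2.com", // the scheme is required
    "use_path_style_endpoint" => true,
    "credentials" => [
        "key" => "YOUR_ACCESS_KEY",
        "secret" => "YOUR_SECRET_KEY",
    ],
]);

try {
    $buckets = $s3->listBuckets();
    foreach ($buckets["Buckets"] as $bucket) {
        echo "{$bucket["Name"]}\n";
    }
} catch (AwsException $e) {
    echo "Error: {$e->getMessage()}" . PHP_EOL;
}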
I'm using the AWS SDK to connect to an S3 bucket. I have built an IAM policy to allow read access using "s3:GetObject", "s3:GetObjectAcl" and "s3:ListBucket", and I can use the AWS CLI to view objects and list files (including with list-objects-v2). When I use:
$file = $s3client->getObject([
    'Bucket' => $bucket,
    'Key' => $key,
]);
$body = $file->get('Body');
$body->rewind();
echo "Downloaded the file and it begins with: {$body->read(26)}.\n";
I can view file contents but when I try:
$contents = $s3client->listObjectsV2([
    'Bucket' => $bucket
]);
echo "The contents of your bucket are: \n";
foreach ($contents['Contents'] as $content) {
    echo $content['Key'] . "\n";
}
I receive a super helpful error:
Fatal error: Uncaught Error: Class 'SimpleXMLElement' not found in /var/www/html/vendor/aws/aws-sdk-php/src/Api/Parser/PayloadParserTrait.php:39 Stack trace: ....
It goes on but I figure it's all trash so not continuing.
I grabbed the code straight from: https://docs.aws.amazon.com/code-samples/latest/catalog/php-s3-s3_basics-GettingStartedWithS3.php.html
Any suggestions on what is wrong?
sudo apt-get install php-xml
doh. The SDK's response parser needs the SimpleXMLElement class, which the php-xml package provides.
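If it helps, here is a quick way to confirm the extension is actually available to the PHP runtime you are using (restart PHP-FPM or Apache first if the code runs through a web server):

<?php
// Both should print true once php-xml is installed and PHP has been restarted.
var_dump(extension_loaded('simplexml'));
var_dump(class_exists('SimpleXMLElement'));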
I have a web application in PHP up and running. I want this app to be capable of uploading images to an AWS S3 bucket. I checked the documentation at AWS, but found at least three different sets of documentation for this purpose. I am still not clear: is it possible for my web app, hosted with a different hosting provider, to upload files to AWS?
If yes, which is the best option?
You should be able to upload from outside of the AWS network.
Use the AWS PHP SDK at https://aws.amazon.com/sdk-for-php/
Then use the following code:
<?php
require 'vendor/autoload.php';

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1'
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key' => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
?>
Edit the bucket name, key name, region and the path of the file to upload. The client picks up credentials automatically from the environment, the shared credentials file or an instance role, so they don't have to be hardcoded.
This is the multipart upload style, so you can upload huge files.
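For small files such as images, a plain PutObject call is usually enough. Here is a minimal sketch using the same SDK v3 client setup; the bucket name, key and file path are placeholders:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1'
]);

try {
    $result = $s3->putObject([
        'Bucket'      => 'your-bucket-name',
        'Key'         => 'images/photo.jpg',
        'SourceFile'  => '/path/to/photo.jpg',  // read the file from disk
        'ContentType' => 'image/jpeg'           // optional, helps browsers render it
    ]);
    echo "Uploaded: {$result['ObjectURL']}" . PHP_EOL;
} catch (AwsException $e) {
    echo $e->getMessage() . PHP_EOL;
}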
I have been tasked with connecting to an S3 bucket and, using the documentation, I have the following:
<?php
define('AWS_KEY', 'key in here');
define('AWS_SECRET_KEY', 'key in here');
define('HOST', 'https://console.aws.amazon.com/s3/home?region=us-east-1#');

use Aws\S3\S3Client;

// Establish connection with DreamObjects with an S3 client.
$client = S3Client::factory(array(
    'base_url' => HOST,
    'key' => AWS_KEY,
    'secret' => AWS_SECRET_KEY
));

// list owned buckets
$blist = $client->listBuckets();
echo " Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
foreach ($blist['Buckets'] as $b) {
    echo "{$b['Name']}\t{$b['CreationDate']}\n";
}

// list Bucket contents
$o_iter = $client->getIterator('ListObjects', array(
    'Bucket' => $bucketname
));
foreach ($o_iter as $o) {
    echo "{$o['Key']}\t{$o['Size']}\t{$o['LastModified']}\n";
}
but I get the error in the title. Any ideas? I have my access keys, but I am confused about how to fix this issue.
It is probably a bad idea to hardcode your secret key and access key, or to pass them through environment variables.
A better design pattern would be to leverage an EC2 IAM role or to use the SDK configuration file (see http://docs.aws.amazon.com/aws-sdk-php/guide/latest/credentials.html for details).
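For reference, the shared credentials file lives at ~/.aws/credentials and looks like this (placeholder values):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY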
Also, the base_url argument you're using is invalid: it is the URL of the AWS console, not of the S3 service itself. You can simply omit this parameter (as per http://docs.aws.amazon.com/aws-sdk-php/guide/latest/configuration.html#client-configuration-options); the SDK will build the endpoint for you automatically.
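As a rough sketch (kept in the SDK v2 style of the code above; the region value and the reliance on the credentials file are assumptions), dropping base_url leaves something like this:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Credentials are resolved from the shared credentials file or the environment,
// so no key or secret appears in the code.
$client = S3Client::factory(array(
    'region' => 'us-east-1'
));

$blist = $client->listBuckets();
echo "Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
foreach ($blist['Buckets'] as $b) {
    echo "{$b['Name']}\t{$b['CreationDate']}\n";
}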
I've got an array of file information that is being looped through, using the AWS PHP SDK 2 to upload the contents of these files to the cloud. It all works brilliantly until I try to add metadata: at that point it adds the metadata to the first object created, but after that I get the following error message:
Fatal error: Uncaught Aws\S3\Exception\SignatureDoesNotMatchException: AWS Error Code: SignatureDoesNotMatch, Status Code: 403, AWS Request ID: 8FC9360F2EB687EE, AWS Error Type: client, AWS Error Message: The request signature we calculated does not match the signature you provided. Check your key and signing method., User-Agent: aws-sdk-php2/2.2.1 Guzzle/3.3.1 curl/7.24.0 PHP/5.3.13 thrown in D:\Inetpub\wwwroot\ThirdParty_Resources\AWS_SDK2\aws\aws-sdk-php\src\Aws\Common\Exception\NamespaceExceptionFactory.php on line 89
I've cropped the code from my loop to highlight the area that is being naughty.
foreach ($aFiles as $aFile) {
    $arr_ObjectMeta = array(
        'OriginalFileName' => $aFile['FileName']
        , 'Description' => $aFile['FileDesc']
        , 'Keywords' => $aFile['FileKeyW']
    );

    // get the file to upload
    $obj_FileUpload = $obj_S3->putObject($sBucket, $sBucketFolder . $sFilenameToSave, $sFile, 'public-read', $arr_ObjectMeta);
    if ($obj_FileUpload) {
        $files_uploaded++;
    } else {
        $files_not_uploaded++;
    }

    // clear the file upload S3 response object
    unset($obj_FileUpload);

    // delete the downloaded file
    unlink($sServerUploadFolder . $sFilenameToSave);
}
So the second time around the loop, it seems to bomb because of the different metadata values. When the metadata is the same, the loop executes without issue. Any help/pointers would be great.
You might be confusing the putObject method with the upload helper method.
The upload helper method is available as of version 2.4 of the SDK. Using the upload method you could do the following:
try {
    $sKey = $sBucketFolder . $sFilenameToSave;
    // Extra PutObject parameters such as Metadata go under the 'params' option of the upload helper.
    $obj_FileUpload = $obj_S3->upload($sBucket, $sKey, $sFile, 'public-read', array(
        'params' => array('Metadata' => $arr_ObjectMeta)
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}
You can do the same thing with the putObject method as well; it is just slightly more verbose.
try {
    $obj_FileUpload = $obj_S3->putObject(array(
        'Bucket' => $sBucket,
        'Key' => $sBucketFolder . $sFilenameToSave,
        'SourceFile' => $sFile,
        'ACL' => 'public-read',
        'Metadata' => $arr_ObjectMeta
    ));
    $files_uploaded++;
} catch (\Aws\S3\Exception\S3Exception $e) {
    $files_not_uploaded++;
}