EDIT: Problem solved (thanks to this post)! I just needed to install the cURL extension:
sudo apt-get install php5-curl
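Before touching the SDK, you can confirm from PHP itself which required extensions are missing. A minimal sketch (the list of extensions to check is my own assumption; the SDK needs at least curl):

```php
<?php
// Return the subset of $required extensions that is not loaded in this
// PHP installation. The cURL notices above come from a missing 'curl'
// extension.
function missing_extensions(array $required)
{
    $missing = array();
    foreach ($required as $ext) {
        if (!extension_loaded($ext)) {
            $missing[] = $ext;
        }
    }
    return $missing;
}

$missing = missing_extensions(array('curl', 'json'));
if ($missing) {
    echo 'Missing extensions: ' . implode(', ', $missing) . "\n";
} else {
    echo "All required extensions are loaded.\n";
}
```

Run it through the same SAPI that serves the page: the CLI and Apache can read different php.ini files, so a check from the terminal may not match what the web server sees.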
I am working on a tool to upload images to an AWS S3 bucket from a browser. I am using the PHP SDK provided by Amazon.
I tried the code provided in the documentation, but it does not work for me:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
// $filepath should be absolute path to a file on disk
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Upload a file.
$result = $s3->putObject(array(
    'Bucket'       => $bucket,
    'Key'          => $keyname,
    'SourceFile'   => $filepath,
    'ContentType'  => 'text/plain',
    'ACL'          => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY',
    'Metadata'     => array(
        'param1' => 'value 1',
        'param2' => 'value 2'
    )
));

echo $result['ObjectURL'];
Even if I keep only these 2 lines:
use Aws\S3\S3Client;
$s3 = S3Client::factory();
...I get these errors:
Notice: Use of undefined constant CURLE_COULDNT_RESOLVE_HOST - assumed 'CURLE_COULDNT_RESOLVE_HOST' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_COULDNT_CONNECT - assumed 'CURLE_COULDNT_CONNECT' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_PARTIAL_FILE - assumed 'CURLE_PARTIAL_FILE' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_WRITE_ERROR - assumed 'CURLE_WRITE_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_READ_ERROR - assumed 'CURLE_READ_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_OPERATION_TIMEOUTED - assumed 'CURLE_OPERATION_TIMEOUTED' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_SSL_CONNECT_ERROR - assumed 'CURLE_SSL_CONNECT_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_HTTP_PORT_FAILED - assumed 'CURLE_HTTP_PORT_FAILED' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_GOT_NOTHING - assumed 'CURLE_GOT_NOTHING' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_SEND_ERROR - assumed 'CURLE_SEND_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_RECV_ERROR - assumed 'CURLE_RECV_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Fatal error: Uncaught exception 'Guzzle\Common\Exception\RuntimeException' with message 'The PHP cURL extension must be installed to use Guzzle.' in phar:///var/www/aws.phar/Guzzle/Http/Client.php:72
Stack trace:
#0 phar:///var/www/aws.phar/Aws/Common/Client/AbstractClient.php(79): Guzzle\Http\Client->__construct('https://s3.amaz...', Object(Guzzle\Common\Collection))
#1 phar:///var/www/aws.phar/Aws/Common/Client/ClientBuilder.php(249): Aws\Common\Client\AbstractClient->__construct(Object(Aws\Common\Credentials\RefreshableInstanceProfileCredentials), Object(Aws\S3\S3Signature), Object(Guzzle\Common\Collection))
#2 phar:///var/www/aws.phar/Aws/S3/S3Client.php(207): Aws\Common\Client\ClientBuilder->build()
#3 /var/www/response.php(30): Aws\S3\S3Client::factory()
#4 {main}
thrown in phar:///var/www/aws.phar/Guzzle/Http/Client.php on line 72
I just installed the AWS PHP SDK on my local machine, without any AWS registration, and after several tries got this working code:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
// $filepath should be absolute path to a file on disk
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

try {
    // Upload a file.
    $result = $s3->putObject(array(
        'Bucket'       => $bucket,
        'Key'          => $keyname,
        'SourceFile'   => $filepath,
        'ContentType'  => 'text/plain',
        'ACL'          => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY',
        'Metadata'     => array(
            'param1' => 'value 1',
            'param2' => 'value 2'
        )
    ));
    echo $result['ObjectURL'];
} catch (Exception $e) {
    echo $e->getMessage() . "\n";
}
This code returns an AWS exception message:
You must specify a non-null value for the Body or SourceFile parameters.
If I change $filepath to the path of any real file, the code outputs:
Error retrieving credentials from the instance profile metadata server. When you are not running inside of Amazon EC2, you must provide your AWS access key ID and secret access key in the "key" and "secret" options when creating a client or provide an instantiated Aws\Common\Credentials\CredentialsInterface object. ([curl] 28: Connection timed out after 5000 milliseconds [url] http://XXX.xxx.xxx.xxx/latest/meta-data/iam/security-credentials/)
So you're very welcome to ask if you have any questions, but it should work.
By the way, if I remove the catch section from the code, I get the same error messages from PHP on my broken page.
Hope this helps in your case.
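For completeness, the credentials error above goes away once keys are passed explicitly, so the SDK never falls back to the EC2 instance-metadata lookup that times out. A sketch for SDK v2 — the key, secret, and region values are placeholders you must replace:

```php
<?php
require 'vendor/autoload.php'; // or the aws.phar autoloader

use Aws\S3\S3Client;

// Explicit credentials stop the SDK from querying the EC2 instance
// metadata server (the source of the 5000 ms timeout above).
$s3 = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY_ID',     // placeholder
    'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY', // placeholder
    'region' => 'eu-west-1',                  // your bucket's region
));
```

In production, prefer loading these from environment variables or an IAM role rather than hard-coding them.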
I'm trying to create an s3 bucket and then upload a file to it. However, when I run the code I get this ugly error message:
Fatal error: Uncaught InvalidArgumentException: Found 1 error while validating the input provided for the PutObject operation: [Body] must be an fopen resource, a GuzzleHttp\Stream\StreamInterface object, or something that can be cast to a string. Found bool(false) in /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php:65
Stack trace:
#0 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Middleware.php(78): Aws\Api\Validator->validate('PutObject', Object(Aws\Api\StructureShape), Array)
#1 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(541): Aws\Middleware::Aws\{closure}(Object(Aws\Command), NULL)
#2 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(564): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#3 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(498): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#4 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(517) in /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php on line 65
Here is my PHP code:
<?php
// Require the Composer autoloader.
require '/Users/ddripz/Downloads/vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => 'API KEY',
        'secret' => 'SECRET KEY'
    ]
]);

$bucketname = 'we-sign-file-manager';
$file_path = '/Users/DennisWarfield/Desktop/wesign/uploads/5f31fc30410c17.68431957.jpg';
$key = basename($file_path);

try {
    $s3->putObject([
        'Bucket' => '',
        'Key' => 'my-object',
        'Body' => fopen('/path/to/file', 'r'),
        'ACL' => 'public-read',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
?>
Any idea why this is happening?
Also, I'm confused as to why the error is occurring on line 65 when the max number of lines in my php file is 32.
Is my autoload.php path incorrect because the file is in downloads?
Change this line
'Body' => fopen('/path/to/file', 'r'),
to
'Body' => fopen($file_path, 'r'),
(Likewise, 'Bucket' => '' should be 'Bucket' => $bucketname, and 'Key' => 'my-object' can be the $key you already computed.)
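To see why the original call failed: fopen() returns false when the path doesn't exist, and that false ends up as the Body. A small guard makes the failure explicit instead (the helper name open_body is my own):

```php
<?php
// fopen() returns false on failure; passing that false as 'Body' is what
// produced "Found bool(false)" in the validator error above.
function open_body($path)
{
    if (!is_readable($path)) {
        return null; // missing file or bad permissions
    }
    $handle = fopen($path, 'r');
    return $handle === false ? null : $handle;
}

// Demo: a script can always open its own source file.
$body = open_body(__FILE__);
if ($body === null) {
    die("Could not open the upload file.\n");
}
// $body is now safe to pass as the 'Body' of putObject().
fclose($body);
```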
Please check your config/filesystems.php and make sure all the s3 settings and their paths are correct; if not, correct them.
I am attempting to upload a file to an S3 bucket via PHP. This worked in the past, and I believe it now fails due to a PHP version change, but I'm unsure. I have the GuzzleHttp and Aws subfolders. I now receive the following error: GuzzleHttp/Psr7/functions.php, errline: 417, errstr: Uncaught Error: Call to undefined function GuzzleHttp\Psr7\hash_init().
I did find that there were some changes to hash_init in PHP 7.2, so I rolled back to 7.1 and still got the error.
<?php
require 'aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

function image_to_s3($fileName) {
    // Connect to AWS.
    try {
        // You may need to change the region. It is shown in the URL when the
        // bucket is open, and on creation.
        $s3 = S3Client::factory(
            array(
                'credentials' => array(
                    'key' => 'KEY',
                    'secret' => 'SECRET'
                ),
                'version' => 'latest',
                'region' => 'us-east-2'
            )
        );
    } catch (Exception $e) {
        // We return here, so if this fails it stops here. Typically this is
        // a REST call, so this would return a JSON object.
        return $e->getMessage();
    }

    // Prep the AWS S3 bucket.
    $bucket = 'BUCKETNAME';
    $keyname = $fileName;
    $filepath = 'SUBDIRECTORY/' . $fileName;
    echo $filepath;

    try {
        // Upload a file.
        $result = $s3->putObject(array(
            'Bucket'       => $bucket,
            'Key'          => $keyname,
            'SourceFile'   => $filepath,
            'ContentType'  => 'text/plain',
            'ACL'          => 'public-read',
            'StorageClass' => 'REDUCED_REDUNDANCY',
            'Metadata'     => array(
                'param1' => 'value 1',
                'param2' => 'value 2'
            )
        ));
    } catch (S3Exception $e) {
        return $e->getMessage();
    } catch (Exception $e) {
        return $e->getMessage();
    }

    return '';
}
?>
Full stack trace...
Fatal error: errno: 1, errfile: /home/USERNAME/GuzzleHttp/Psr7/functions.php, errline: 417, errstr: Uncaught Error: Call to undefined function GuzzleHttp\Psr7\hash_init() in /home/USERNAME/GuzzleHttp/Psr7/functions.php:417
Stack trace:
#0 /home/USERNAME/Aws/Signature/SignatureV4.php(164): GuzzleHttp\Psr7\hash(Object(GuzzleHttp\Psr7\LazyOpenStream), 'sha256')
#1 /home/USERNAME/Aws/Signature/S3SignatureV4.php(22): Aws\Signature\SignatureV4->getPayload(Object(GuzzleHttp\Psr7\Request))
#2 /home/USERNAME/Aws/Middleware.php(126): Aws\Signature\S3SignatureV4->signRequest(Object(GuzzleHttp\Psr7\Request), Object(Aws\Credentials\Credentials))
#3 /home/USERNAME/GuzzleHttp/Promise/FulfilledPromise.php(39): Aws\Middleware::Aws\{closure}(Object(Aws\Credentials\Credentials))
#4 /home/USERNAME/GuzzleHttp/Promise/TaskQueue.php(47): GuzzleHttp\Promise\FulfilledPromise::GuzzleHttp\Promise\{closure}()
#5 /home/USERNAME/GuzzleHttp/Promise/Promise.php(246): GuzzleHttp\Promise\TaskQueue->run(
MatsLindh led me down the correct path. Hash is built into PHP now, but the hosting provider for my test environment (DreamHost) set enable-hash to shared, which means each user needs to manually enable the extension by importing the shared-object file. I did this by adding the following line to my php.ini:
extension = hash.so
You can tell as soon as the extension is loading correctly, because a whole new section for hash appears in phpinfo().
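A quicker way to confirm the fix took effect, without digging through phpinfo(), is to check for hash_init() directly; run this through the same SAPI that produced the error:

```php
<?php
// GuzzleHttp\Psr7\hash() calls hash_init()/hash_update()/hash_final(),
// so this one check tells you whether the extension is usable.
if (!function_exists('hash_init')) {
    echo "hash extension not loaded - add 'extension = hash.so' to php.ini\n";
} else {
    // SHA-256 of the empty string, as a quick smoke test:
    echo 'hash OK: ' . hash('sha256', '') . "\n";
}
```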
Having an issue with the S3 driver for Laravel 5.2. The error I'm getting is this:
Found 1 error while validating the input provided for the HeadObject operation:
[Key] must be at least 1 characters long. Value provided is 0 characters long.
I'm using league/flysystem v3, as stated in the Laravel docs. When I follow the stack trace and start dumping out the vars, the 'Key' value is always empty, even though I've set it all up in my config file.
Here are the top lines from my stack trace
in Validator.php line 38
at Validator->validate('HeadObject', object(StructureShape), array('Bucket' => 'monstervsl', 'Key' => '', '#http' => array())) in Middleware.php line 77
at Middleware::Aws\{closure}(object(Command), null) in S3Client.php line 710
at S3Client::Aws\S3\{closure}(object(Command), null) in S3Client.php line 729
at S3Client::Aws\S3\{closure}(object(Command), null) in Middleware.php line 53
at Middleware::Aws\{closure}(object(Command), null) in SSECMiddleware.php line 59
at SSECMiddleware->__invoke(object(Command)) in AwsClient.php line 208
As you can see, it's getting the bucket from my config but not the key; it's empty.
Here is my filesystem.php file
's3' => [
    'driver' => 's3',
    // 'key' => env('S3_KEY'),
    // 'secret' => env('S3_SECRET'),
    'key' => '8tfnxo8abgn7voaex8rgv', // <- Not my real key
    'secret' => 'aw7btx49wXNF7AGWV', // <- not my real secret
    'region' => 'eu-west-1',
    'bucket' => 'monstervsl',
],
Here is my controller. It's fairly straightforward; I don't think the file_put_contents stuff is relevant, but I've added it anyway.
// Write the contents to a new file on disk
$view = view('iframe')->with('json', $video->toJson());
$contents = $view->render();
$token = '12345';
$filePath = public_path() .'/iframes/' . $token . '.html';
file_put_contents($filePath, $contents);
Storage::disk('s3')->put('/', file_get_contents($filePath));
You have to give the destination path/name where you want to save the file in the S3 bucket as the first argument of the put function. Currently you are trying to save the file at the root of the bucket without any name. Try something like this:
Storage::disk('s3')->put('filename.html', file_get_contents($filePath));
The full path of the file is the Key in this context, and that is what's missing in your original request.
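Applied to the controller above, the Key can be derived from the local file path. A tiny helper shows the idea (the helper name and the 'iframes' prefix are my own choices, not part of the original code):

```php
<?php
// Build the S3 object key (the destination path inside the bucket) from
// the local file's basename plus an arbitrary prefix.
function s3_key_for($localPath, $prefix = 'iframes')
{
    return $prefix . '/' . basename($localPath);
}

// With the controller's $filePath, the upload would then look like:
// Storage::disk('s3')->put(s3_key_for($filePath), file_get_contents($filePath));
echo s3_key_for('/var/www/public/iframes/12345.html') . "\n"; // prints "iframes/12345.html"
```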
I get the following errors when I try to use the AWS PHP SDK:
PHP Warning: Illegal string offset 'client.backoff' in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 172
PHP Catchable fatal error: Object of class Guzzle\Plugin\Backoff\BackoffPlugin could not be converted to string in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 172
PHP Warning: Illegal string offset 'signature' in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 175
PHP Catchable fatal error: Object of class Aws\S3\S3Signature could not be converted to string in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 175
They originate from the following code inside the S3Client.php file, part of the AWS SDK.
public static function factory($config = array())
{
    $exceptionParser = new S3ExceptionParser();

    // Configure the custom exponential backoff plugin for retrying S3 specific errors
    if (!isset($config[Options::BACKOFF])) {
        $config[Options::BACKOFF] = static::createBackoffPlugin($exceptionParser);
    }

    $config[Options::SIGNATURE] = $signature = static::createSignature($config);
    ...
The Options class is Aws\Common\Enum\ClientOptions. If you look at it, it defines a lot of constants like this:
const SIGNATURE = 'signature';
const BACKOFF = 'client.backoff';
I call the factory function in the following way:
$s3 = S3Client::factory(_PS_ROOT_DIR_.'/override/aws/aws-config.php');
My aws-config.php file looks like this:
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'XXXXXXXXXXX',
                'secret' => 'XXXXXXXXXXX',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Any ideas? I installed the PHP SDK with Composer, so I'd expect any dependencies to be installed.
The argument to S3Client::factory() is supposed to be an array. You're giving it a filename that contains PHP code to return the array, but S3Client doesn't run the file. Try changing the file to:
<?php
$s3options = array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'XXXXXXXXXXX',
                'secret' => 'XXXXXXXXXXX',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Then your main program can do:
require(_PS_ROOT_DIR_.'/override/aws/aws-config.php');
$s3 = S3Client::factory($s3options);
This is the PHP file named upload.php on the EC2 server:
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'aws-secret-key',
    'secret' => 'aws-very-secret-pass',
));

$dir = '/home/user/movies/history';
$bucket = 'my-unique-bucket';
$keyPrefix = '';

$options = array(
    'params'      => array('ACL' => 'public-read'),
    'concurrency' => 20,
    'debug'       => true
);

$client->uploadDirectory($dir, $bucket, $keyPrefix, $options);
When I execute the upload.php file in the terminal, it returns a fatal error like this:
PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/home/kaya/Resimler/transferet/): failed to open dir: No such file or directory' in /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php:47
Stack trace:
#0 /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php(47): RecursiveDirectoryIterator->__construct('/home/user/movie...', 12800)
#1 /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/S3Client.php(557): Aws\S3\Sync\UploadSyncBuilder->uploadFromDirectory('/home/user/movie...')
#2 /var/www/html/upload_dir.php(21): Aws\S3\S3Client->uploadDirectory('/home/user/movie...', 'my-unique-bucket', '', Array)
#3 {main}
thrown in /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php on line 47
Normally I can upload files fine with the PHP SDK; only the uploadDirectory function fails. I couldn't find what was wrong. My PHP SDK version is 2.7.
I figured it out. It works on a local server like XAMPP, but not on the remote server: the source directory in the error (/home/kaya/Resimler/transferet/) only exists on my local machine, so $dir has to point to a directory that exists on the machine running the script.
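To make that conclusion concrete: the uncaught exception comes from RecursiveDirectoryIterator when the source directory doesn't exist on the machine running the script. A guard before uploadDirectory() turns it into a readable message (the helper name is my own):

```php
<?php
// Verify the local source directory before handing it to
// $client->uploadDirectory(); returns an error string, or null if OK.
function check_upload_dir($dir)
{
    if (!is_dir($dir)) {
        return "Directory not found on this server: $dir";
    }
    if (!is_readable($dir)) {
        return "Directory not readable: $dir";
    }
    return null;
}

$error = check_upload_dir('/home/user/movies/history');
echo ($error !== null ? $error : 'Directory OK') . "\n";
```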