Fatal error: Uncaught InvalidArgumentException when trying to upload file to AWS - php

I'm trying to create an S3 bucket and then upload a file to it. However, when I run the code I get this ugly error message:
Fatal error: Uncaught InvalidArgumentException: Found 1 error while
validating the input provided for the PutObject operation: [Body] must
be an fopen resource, a GuzzleHttp\Stream\StreamInterface object, or
something that can be cast to a string. Found bool(false) in
/Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php:65
Stack trace:
#0 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Middleware.php(78): Aws\Api\Validator->validate('PutObject', Object(Aws\Api\StructureShape), Array)
#1 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(541): Aws\Middleware::Aws\{closure}(Object(Aws\Command), NULL)
#2 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(564): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#3 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(498): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#4 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(517) in /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php on line 65
Here is my PHP code:
<?php
// Require the Composer autoloader.
require '/Users/ddripz/Downloads/vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => 'API KEY',
        'secret' => 'SECRET KEY'
    ]
]);

$bucketname = 'we-sign-file-manager';
$file_path = '/Users/DennisWarfield/Desktop/wesign/uploads/5f31fc30410c17.68431957.jpg';
$key = basename($file_path);

try {
    $s3->putObject([
        'Bucket' => '',
        'Key' => 'my-object',
        'Body' => fopen('/path/to/file', 'r'),
        'ACL' => 'public-read',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
?>
Any idea why this is happening?
Also, I'm confused as to why the error says line 65 when my PHP file is only 32 lines long.
Is my autoload.php path incorrect because the file is in Downloads?

Change this line
'Body' => fopen('/path/to/file', 'r'),
to
'Body' => fopen($file_path, 'r'),
fopen() returns false when the path does not exist, and false is not a valid Body; that is exactly what the validator means by Found bool(false). (Line 65 in the error refers to the SDK's Validator.php, not your script.) Also set 'Bucket' => $bucketname and 'Key' => $key so the request actually uses the bucket and key you defined.
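Put together, a corrected upload looks like this (a sketch reusing the variables already defined in the question, with a guard so a bad path fails loudly instead of handing false to the SDK):

$stream = fopen($file_path, 'r');
if ($stream === false) {
    die("Could not open {$file_path} for reading.\n");
}

try {
    $s3->putObject([
        'Bucket' => $bucketname, // the bucket defined earlier, not ''
        'Key'    => $key,        // basename of the file
        'Body'   => $stream,
        'ACL'    => 'public-read',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}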

If you are using Laravel, please check your config/filesystems.php and make sure all of the s3 disk settings and paths are correct; if not, correct them.
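For reference, a typical s3 disk entry in config/filesystems.php looks like the sketch below; the exact key names can vary slightly between Laravel versions, and the env variable names shown are the conventional ones:

's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],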

Related

I am not able to get a signed URL in CloudFront; I am getting a fatal error. The code I am trying is below.

<?php
require '../aws-autoloader.php';

use Aws\CloudFront\CloudFrontClient;
use Aws\Exception\AwsException;

// Create a CloudFront client
$client = new CloudFrontClient([
    'profile' => 'default',
    'version' => 'latest',
    'region' => 'us-east-1',
]);

// Set up parameter values for the resource
$resourceKey = 'https://example.cloudfront.net/b20cbfe5-a8df-47a5-94c4-aeadea20759f/dash/videoplayback.mpd';
$expires = time() + 300;

// Create a signed URL for the resource using the canned policy
$signedUrlCannedPolicy = $client->getSignedUrl([
    'url' => $resourceKey,
    'expires' => $expires,
    'private_key' => 'pk.pem',
    'key_pair_id' => 'keyid'
]);
I am getting this error:
Fatal error: Uncaught InvalidArgumentException: error:0906D06C:PEM routines:PEM_read_bio:no start line in C:\xampp\htdocs\aws\Aws\CloudFront\Signer.php:40 Stack trace: #0 C:\xampp\htdocs\aws\Aws\CloudFront\UrlSigner.php(24): Aws\CloudFront\Signer->__construct('APKAJYH2L6BGHLW...', 'pk-APKAJYH2L6BG...') #1 C:\xampp\htdocs\aws\Aws\CloudFront\CloudFrontClient.php(138): Aws\CloudFront\UrlSigner->__construct('APKAJYH2L6BGHLW...', 'pk-APKAJYH2L6BG...') #2 C:\xampp\htdocs\aws\app\stream.php(26): Aws\CloudFront\CloudFrontClient->getSignedUrl(Array) #3 {main} thrown in C:\xampp\htdocs\aws\Aws\CloudFront\Signer.php on line 40
I have resolved this issue. You need to give an absolute path to the private key, like this:
'private_key' => $_SERVER['DOCUMENT_ROOT'] . '/' . 'pk.pem',
The PEM "no start line" error typically means OpenSSL could not load a valid key from the given (relative) path. Let me know whether it works for you.
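A quick way to confirm the path is the problem (a sketch; adjust the path to wherever your pk.pem actually lives):

$privateKeyPath = $_SERVER['DOCUMENT_ROOT'] . '/pk.pem';
if (!is_readable($privateKeyPath)) {
    die("Private key not found or not readable at {$privateKeyPath}\n");
}

$signedUrlCannedPolicy = $client->getSignedUrl([
    'url'         => $resourceKey,
    'expires'     => $expires,
    'private_key' => $privateKeyPath,
    'key_pair_id' => 'keyid'
]);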

AWS S3Client resolves the wrong url when using getIterator with an IP on PHP

I am unable to use the getIterator function of S3Client because it somehow rewrites the URL.
Instead of looking for http://192.168.120.70/bucket it returns this:
Could not resolve host: bucket.192.168.120.70
I'm sure there is something simple I'm overlooking.
<?php
require '/Applications/MAMP/htdocs/lab/aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$bucketName = 'bucket';
$IAM_KEY = 'MY-KEY';
$IAM_SECRET = 'MY-SECRET';

// Connect to AWS
try {
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key' => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version' => 'latest',
            'region' => 'eu-west-1',
            'endpoint' => 'http://192.168.120.70/',
            'profile' => 'MY-PROFILE'
        )
    );
} catch (Exception $e) {
    die("Error: " . $e->getMessage());
}

$buckets = $s3->listBuckets();
foreach ($buckets['Buckets'] as $bucket) {
    echo $bucket['Name'] . "\n";
}
// returns -> bucket

$obj = $s3->getIterator('ListObjects', array('Bucket' => 'bucket'));
foreach ($obj as $object) {
    var_dump($object);
}
// Error -> Could not resolve host: bucket.192.168.120.70
?>
The full error:
Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing
"ListObjects" on "http://bucket.192.168.120.70/?encoding-type=url";
AWS HTTP error: cURL error 6: Could not resolve host: bucket.192.168.120.70 (see
https://curl.haxx.se/libcurl/c/libcurl-errors.html)' GuzzleHttp\Exception\ConnectException: cURL
error 6: Could not resolve host: bucket.192.168.120.70
(see https://curl.haxx.se/libcurl/c/libcurl-errors.html) in
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php:200 Stack trace: #0
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(155):
GuzzleHttp\Handler\CurlFactory::createRejection(Object(GuzzleHttp\Handler\EasyHandle), Array) #1
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(105):
GuzzleHttp\Handler\CurlFactory::finishError(Object(GuzzleHttp\Handler\CurlMultiHandler),
Object(GuzzleHttp\Handler\EasyHandle), Object(GuzzleHttp\Handler\CurlFactory)) #2
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Han in
/Applications/MAMP/htdocs/lab/aws/Aws/WrappedHttpHandler.php on line 195
This is all you would need to accomplish your goal of listing buckets. The endpoint option is used for specific services and is not needed in this scenario:
$client = S3Client::factory(
    array(
        'credentials' => array(
            'key' => $IAM_KEY,
            'secret' => $IAM_SECRET,
        ),
        'region' => 'eu-west-1',
        'version' => 'latest'
    )
);
It seems it was an issue in my S3Client creation; after this it worked.
Strangely, with my previous code it was only the first bucket that wasn't working.
// Connect to AWS
try {
    // You may need to change the region. It will say in the URL when the bucket is open
    // and on creation.
    $s3 = S3Client::factory(
        array(
            'version' => 'latest',
            'region' => 'Standard',
            'endpoint' => 'http://192.167.120.60/', // Needed in my case
            'use_path_style_endpoint' => true, // Needed in my case
            'credentials' => array(
                'key' => $IAM_KEY,
                'secret' => $IAM_SECRET
            )
        )
    );
} catch (Exception $e) {
    // We use a die, so if this fails it stops here. Typically this is a REST call,
    // so this would return a JSON object.
    die("Error: " . $e->getMessage());
}
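The key line is 'use_path_style_endpoint' => true. By default the SDK builds virtual-hosted-style URLs, where the bucket name becomes part of the hostname; with a bare IP endpoint that hostname can never resolve. A minimal sketch of the difference, reusing the question's IP (hosts illustrative):

// Virtual-hosted style (SDK default): the bucket becomes part of the host.
//   http://bucket.192.168.120.70/?encoding-type=url  <- DNS cannot resolve this
// Path style: the bucket stays in the path.
//   http://192.168.120.70/bucket/?encoding-type=url  <- works with an IP endpoint
$s3 = S3Client::factory(array(
    'version' => 'latest',
    'region' => 'eu-west-1',
    'endpoint' => 'http://192.168.120.70/',
    'use_path_style_endpoint' => true,
    'credentials' => array('key' => $IAM_KEY, 'secret' => $IAM_SECRET)
));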

S3 file upload from PHP error: "Call to undefined function GuzzleHttp\Psr7\hash_init()"

I am attempting to upload a file to an S3 bucket via PHP. This has been working in the past, and I believe it is now failing due to PHP versioning, but I am unsure. I have the GuzzleHTTP and AWS subfolders. I now receive the following error: GuzzleHttp/Psr7/functions.php, errline: 417, errstr: Uncaught Error: Call to undefined function GuzzleHttp\Psr7\hash_init().
I did find that there were some changes to hash_init in PHP 7.2, so I rolled back to 7.1 and still got the error.
<?php
require 'aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

function image_to_s3($fileName) {
    // Connect to AWS
    try {
        // You may need to change the region. It will say in the URL when the bucket is open
        // and on creation.
        $s3 = S3Client::factory(
            array(
                'credentials' => array(
                    'key' => 'KEY',
                    'secret' => 'SECRET'
                ),
                'version' => 'latest',
                'region' => 'us-east-2'
            )
        );
    } catch (Exception $e) {
        // We use a die, so if this fails it stops here. Typically this is a REST call,
        // so this would return a JSON object.
        return $e->getMessage();
    }

    // prep the aws s3 bucket
    $bucket = 'BUCKETNAME';
    $keyname = $fileName;
    $filepath = 'SUBDIRECTORY/' . $fileName;
    echo $filepath; // note: was $filePath, an undefined variable

    try {
        // Upload a file.
        $result = $s3->putObject(array(
            'Bucket' => $bucket,
            'Key' => $keyname,
            'SourceFile' => $filepath,
            'ContentType' => 'text/plain',
            'ACL' => 'public-read',
            'StorageClass' => 'REDUCED_REDUNDANCY',
            'Metadata' => array(
                'param1' => 'value 1',
                'param2' => 'value 2'
            )
        ));
    } catch (S3Exception $e) {
        return $e->getMessage();
    } catch (Exception $e) {
        return $e->getMessage();
    }
    return '';
}
?>
Full stack trace...
Fatal error: errno: 1, errfile: /home/USERNAME/GuzzleHttp/Psr7/functions.php, errline: 417, errstr: Uncaught Error: Call to undefined function GuzzleHttp\Psr7\hash_init() in /home/USERNAME/GuzzleHttp/Psr7/functions.php:417
Stack trace:
#0 /home/USERNAME/Aws/Signature/SignatureV4.php(164): GuzzleHttp\Psr7\hash(Object(GuzzleHttp\Psr7\LazyOpenStream), 'sha256')
#1 /home/USERNAME/Aws/Signature/S3SignatureV4.php(22): Aws\Signature\SignatureV4->getPayload(Object(GuzzleHttp\Psr7\Request))
#2 /home/USERNAME/Aws/Middleware.php(126): Aws\Signature\S3SignatureV4->signRequest(Object(GuzzleHttp\Psr7\Request), Object(Aws\Credentials\Credentials))
#3 /home/USERNAME/GuzzleHttp/Promise/FulfilledPromise.php(39): Aws\Middleware::Aws\{closure}(Object(Aws\Credentials\Credentials))
#4 /home/USERNAME/GuzzleHttp/Promise/TaskQueue.php(47): GuzzleHttp\Promise\FulfilledPromise::GuzzleHttp\Promise\{closure}()
#5 /home/USERNAME/GuzzleHttp/Promise/Promise.php(246): GuzzleHttp\Promise\TaskQueue->run(
MatsLindh led me down the correct path. Hash is built into PHP now, but my hosting provider for my test environment (DreamHost) builds the hash extension as shared, which means each user needs to enable it manually by loading the shared object file. I did this by adding the following line to my php.ini:
extension = hash.so
You can tell the extension is loading correctly because a whole new "hash" section appears in phpinfo().
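A quick way to verify before and after the php.ini change (plain PHP, nothing DreamHost-specific; you can also run php -m on the command line and look for hash):

<?php
// true once the hash extension is loaded
var_dump(extension_loaded('hash'));
// the function the SDK's Guzzle dependency needs
var_dump(function_exists('hash_init'));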

aws s3 uploading a directory to a bucket with php sdk

This is the PHP file named upload.php on an EC2 server:
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key' => 'aws-secret-key',
    'secret' => 'aws-very-secret-pass',
));

$dir = '/home/user/movies/history';
$bucket = 'my-unique-bucket';
$keyPrefix = '';
$options = array(
    'params' => array('ACL' => 'public-read'),
    'concurrency' => 20,
    'debug' => true
);

$client->uploadDirectory($dir, $bucket, $keyPrefix, $options);
When I execute upload.php in the terminal, it returns a fatal error like this:
PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/home/kaya/Resimler/transferet/): failed to open dir: No such file or directory' in /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php:47
Stack trace:
#0 /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php(47): RecursiveDirectoryIterator->__construct('/home/user/movie...', 12800)
#1 /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/S3Client.php(557): Aws\S3\Sync\UploadSyncBuilder->uploadFromDirectory('/home/user/movie...')
#2 /var/www/html/upload_dir.php(21): Aws\S3\S3Client->uploadDirectory('/home/user/movie...', 'my-unique-bucket', '', Array)
#3 {main}
thrown in /var/www/html/vendor/aws/aws-sdk-php/src/Aws/S3/Sync/UploadSyncBuilder.php on line 47
Normally I can upload files fine with the PHP SDK; it's only the upload-directory function that fails. I couldn't find what is wrong. My PHP SDK version is 2.7.
I figured it out. It works on a local server like XAMPP but not on the remote server: the directory path in $dir exists on my local machine, not on the server, which is why RecursiveDirectoryIterator fails with "No such file or directory".
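A cheap guard that would have made this failure obvious (plain PHP, nothing SDK-specific):

if (!is_dir($dir)) {
    die("Directory {$dir} does not exist on this server.\n");
}
$client->uploadDirectory($dir, $bucket, $keyPrefix, $options);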

AWS dynamodb PHP 'Command was not found matching Update_table'

I am working on a web application, and since a lot of read and write actions are happening on the Amazon DynamoDB table, I am facing a ProvisionedThroughput error. Thinking of using UpdateTable, I used the code below, but I am facing an error. This error is not mentioned in DynamoDB's error-handling chart.
<?php
use Aws\DynamoDb\DynamoDbClient;

$dynamoDB = DynamoDbClient::factory(array(
    'key' => '',
    'secret' => '',
    'region' => Region::US_WEST_1
));

####################################################################
# Updating the table

// $dynamodb = new AmazonDynamoDB();
echo PHP_EOL . PHP_EOL;
echo "# Updating the \"${dynamo_db_table}\" table..." . PHP_EOL;

$up = $dynamoDB->Update_table(array(
    'TableName' => $dynamo_db_table,
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 39,
        'WriteCapacityUnits' => 37
    )
));

$table_status = $dynamoDB->describe_table(array(
    'TableName' => $dynamo_db_table
));

// Check for success...
if ($table_status->isOK())
{
    print_r($table_status->body->Table->ProvisionedThroughput->to_array()->getArrayCopy());
}
else
{
    print_r($table_status);
}

$count = 0;
do {
    sleep(5);
    $count += 5;
    $response = $dynamoDB->describe_table(array(
        'TableName' => $table_name
    ));
}
while ((string) $response->body->Table->TableStatus !== 'ACTIVE');

echo "The table \"${table_name}\" has been updated (slept ${count} seconds)." . PHP_EOL;
?>
I am facing the error below:
# Updating the "tablename" table...
Fatal error: Uncaught exception 'Guzzle\Common\Exception\InvalidArgumentException' with message 'Command was not found matching Update_table' in /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php:117 Stack trace: #0 /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php(94): Guzzle\Service\Client->getCommand('Update_table', Array) #1 /home/phretscl/public_html/RETS/vendor/aws/aws-sdk-php/src/Aws/Common/Client/AbstractClient.php(103): Guzzle\Service\Client->__call('Update_table', Array) #2 /home/phretscl/public_html/RETS/file.php(97): Aws\Common\Client\AbstractClient->__call('Update_table', Array) #3 /home/phretscl/public_html/RETS/file.php(97): Aws\DynamoDb\DynamoDbClient->Update_table(Array) #4 {main} thrown in /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 117
The code you are using is not correct. From the error message, I can tell you are using version 2.x of the AWS SDK for PHP, but the code from the #### line down looks like it is meant for version 1.x of the SDK. The error is being thrown because you are not calling the DynamoDbClient::updateTable() method correctly.
You should check out the AWS SDK for PHP User Guide, particularly the page about DynamoDB, which has a code sample for UpdateTable.
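For reference, the v2-style call is camelCase; a minimal sketch adapted to the question's variables and throughput numbers:

$dynamoDB->updateTable(array(
    'TableName' => $dynamo_db_table,
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 39,
        'WriteCapacityUnits' => 37
    )
));

// Poll until the table is ACTIVE again, mirroring the question's loop.
// v2 results support array access, so no ->body-> style chaining is needed.
do {
    sleep(5);
    $result = $dynamoDB->describeTable(array('TableName' => $dynamo_db_table));
} while ($result['Table']['TableStatus'] !== 'ACTIVE');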
EDIT with regard to the comments: If this is a long-running process, you should use the Aws\Common\Aws way of instantiating the client, since it retains and reuses the client objects it creates. Replace your DynamoDbClient::factory(...) code with this:
use Aws\Common\Aws;

$aws = Aws::factory(array(
    'key' => '...',
    'secret' => '...',
    'region' => Region::US_WEST_1
));

$dynamoDB = $aws->get('dynamodb');
You should provide the credentials like this. Also, the region should be given as a plain string, e.g. 'region' => 'us-east-1':
$DDBClient = DynamoDbClient::factory([
    'region' => 'us-east-1',
    'version' => 'latest',
    'credentials' => [
        'key' => 'XXXXXXXXXX',
        'secret' => 'XXXXXXXXXXXXX'
    ],
    'scheme' => 'http' // Use this if you don't have HTTPS
    //, 'debug' => true
]);
