I'm having an issue with the s3 driver for Laravel 5.2. The error I'm getting is this:
Found 1 error while validating the input provided for the HeadObject operation:
[Key] must be at least 1 characters long. Value provided is 0 characters long.
I'm using the league/flysystem-aws-s3-v3 adapter, as stated in the Laravel docs. When I follow the stack trace and start dumping out the vars, the 'Key' value is always empty, but I've set it all up in my config file.
Here are the top lines from my stack trace
in Validator.php line 38
at Validator->validate('HeadObject', object(StructureShape), array('Bucket' => 'monstervsl', 'Key' => '', '#http' => array())) in Middleware.php line 77
at Middleware::Aws\{closure}(object(Command), null) in S3Client.php line 710
at S3Client::Aws\S3\{closure}(object(Command), null) in S3Client.php line 729
at S3Client::Aws\S3\{closure}(object(Command), null) in Middleware.php line 53
at Middleware::Aws\{closure}(object(Command), null) in SSECMiddleware.php line 59
at SSECMiddleware->__invoke(object(Command)) in AwsClient.php line 208
As you can see, it's getting the bucket from my config, but the key is empty.
Here is my config/filesystems.php file:
's3' => [
    'driver' => 's3',
    // 'key' => env('S3_KEY'),
    // 'secret' => env('S3_SECRET'),
    'key' => '8tfnxo8abgn7voaex8rgv', // <- Not my real key
    'secret' => 'aw7btx49wXNF7AGWV', // <- not my real secret
    'region' => 'eu-west-1',
    'bucket' => 'monstervsl',
],
Here is my controller. It's fairly straightforward; I don't think the put-contents stuff is relevant, but I added it anyway:
// Write the contents to a new file on disk
$view = view('iframe')->with('json', $video->toJson());
$contents = $view->render();
$token = '12345';
$filePath = public_path() .'/iframes/' . $token . '.html';
file_put_contents($filePath, $contents);
Storage::disk('s3')->put('/', file_get_contents($filePath));
You have to give the destination path/name where you want to save the file in the S3 bucket as the first argument of the put function. Currently you are trying to save the file at the root of the bucket without any name. Try something like this:
Storage::disk('s3')->put('filename.html', file_get_contents($filePath));
The full path of the file is the Key in this context, and that is what's missing in your original request.
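For example, reusing the $token from the controller above (a sketch; any non-empty destination path will do):

// The first argument is the object key: the destination path inside the bucket.
Storage::disk('s3')->put('iframes/' . $token . '.html', file_get_contents($filePath));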
I'm trying to create an s3 bucket and then upload a file to it. However, when I run the code I get this ugly error message:
Fatal error: Uncaught InvalidArgumentException: Found 1 error while
validating the input provided for the PutObject operation: [Body] must
be an fopen resource, a GuzzleHttp\Stream\StreamInterface object, or
something that can be cast to a string. Found bool(false) in
/Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php:65
Stack trace:
#0 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Middleware.php(78): Aws\Api\Validator->validate('PutObject', Object(Aws\Api\StructureShape), Array)
#1 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(541): Aws\Middleware::Aws\{closure}(Object(Aws\Command), NULL)
#2 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(564): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#3 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(498): Aws\S3\S3Client::Aws\S3\{closure}(Object(Aws\Command), NULL)
#4 /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/S3/S3Client.php(517) in /Users/ddripz/Downloads/vendor/aws/aws-sdk-php/src/Api/Validator.php on line 65
Here is my PHP code:
<?php
// Require the Composer autoloader.
require '/Users/ddripz/Downloads/vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => 'API KEY',
        'secret' => 'SECRET KEY'
    ]
]);

$bucketname = 'we-sign-file-manager';
$file_path = '/Users/DennisWarfield/Desktop/wesign/uploads/5f31fc30410c17.68431957.jpg';
$key = basename($file_path);

try {
    $s3->putObject([
        'Bucket' => '',
        'Key' => 'my-object',
        'Body' => fopen('/path/to/file', 'r'),
        'ACL' => 'public-read',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
?>
Any idea why this is happening?
Also, I'm confused as to why the error refers to line 65 when my PHP file is only 32 lines long.
Is my autoload.php path incorrect because the file is in Downloads?
Change this line:
'Body' => fopen('/path/to/file', 'r'),
to
'Body' => fopen($file_path, 'r'),
fopen() returns false when the path doesn't exist, and that bool(false) is what the validator is rejecting. (The "line 65" in the error refers to the SDK's Validator.php, not your script, so your autoload path is fine.)
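Wired up with the variables already defined in your script, the call would look something like this (a sketch; note it also fills in the empty 'Bucket' value, which would be the next validation failure):

try {
    $s3->putObject([
        'Bucket' => $bucketname,            // was '' in the original
        'Key'    => $key,                   // basename of the file on disk
        'Body'   => fopen($file_path, 'r'), // a real, existing path
        'ACL'    => 'public-read',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}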
Please check your config/filesystems.php and make sure all the S3 data and paths are correct; if not, correct them.
I'm trying to use the google/cloud-translate library (v ^1.5) in Laravel (v ^6.0).
In GoogleController.php:
public function translate(Request $request) {
    $request->validate([
        'source' => 'required|string|min:2|max:5',
        'target' => 'required|string|min:2|max:5',
        'q' => 'required|string',
    ]);

    $translate = new TranslateClient([
        'keyFile' => base_path(config('services.google.json_path')),
        'projectId' => config('services.google.project_id'),
        'suppressKeyFileNotice' => true,
    ]);

    // Translate text from english to french.
    $result = $translate->translate($request->q, [
        'target' => explode($request->target, '-')[0],
        'source' => explode($request->source, '-')[0],
    ]);

    return $result;
}
But calling the route in Postman gives me the error:
Argument 2 passed to Google\Auth\CredentialsLoader::makeCredentials() must be of the type array, string given, called in /[...]/vendor/google/cloud-core/src/RequestWrapperTrait.php on line 155
I've checked that the projectId and the path to the keyFile are correct. Can anyone shed some light on how to get past this error?
You're specifying the path to the key file, so you should use the keyFilePath parameter instead.
Try this:
$translate = new TranslateClient([
    'keyFilePath' => base_path(config('services.google.json_path')),
    ...
]);
From the TranslateClient.__construct docs:
keyFile: The contents of the service account credentials .json file retrieved from the Google Developer's Console. Ex: json_decode(file_get_contents($path), true).
keyFilePath: The full path to your service account credentials .json file retrieved from the Google Developers Console.
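For completeness, a sketch of the corrected call. It also fixes an unrelated bug in the question's code: explode() takes the delimiter as its first argument, so the target/source lines have their arguments reversed.

use Google\Cloud\Translate\TranslateClient;

$translate = new TranslateClient([
    'keyFilePath' => base_path(config('services.google.json_path')),
    'projectId' => config('services.google.project_id'),
    'suppressKeyFileNotice' => true,
]);

// explode($delimiter, $string): the delimiter comes first.
$result = $translate->translate($request->q, [
    'target' => explode('-', $request->target)[0],
    'source' => explode('-', $request->source)[0],
]);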
I'm receiving files (up to 4 GB): the file content is streamed to me in the body of a POST request.
I want to upload this stream directly to an S3 bucket, without saving it locally first.
I've already tried different approaches, which failed for different reasons.
My current approach:
use GuzzleHttp\Psr7\Stream;
use Aws\S3\S3Client;

$s3 = new \Aws\S3\S3Client([
    'version' => 'latest',
    'region' => 'eu-west-1',
    'credentials' => [
        'key' => 'abc',
        'secret' => '123'
    ]
]);

$stream = new \GuzzleHttp\Psr7\Stream(fopen('php://input', 'r'));

$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'ContentLength' => (int)$_SERVER['CONTENT_LENGTH'],
    'Body' => $stream->getContents(),
    'ACL' => 'private',
    'StorageClass' => 'STANDARD_IA',
));
The following error occurs while trying to stream a 80 MB file:
PHP message: PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 78847383 bytes) in /var/www/slimapi/vendor/slim/slim/Slim/Http/Stream.php on line 403
Line 403 of Stream.php is:
if (!$this->isReadable() || ($contents = stream_get_contents($this->stream)) === false) {
So the error is probably caused by trying to load the whole content of the stream into a string, which exceeds the memory limit.
(It's puzzling that the error occurs within Slim's Stream, as I'm trying to use Guzzle's Stream.)
So my question is:
How can I stream the incoming POST data directly to an S3 bucket without buffering issues leading to memory problems?
I already tried:
$stream = Psr7\stream_for(fopen('php://input', 'r'));
$stream = fopen('php://input', 'r');
and, within putObject(): 'Body' => Stream::factory(fopen('php://input', 'r')),
I know this is an old topic, but it's not marked as solved, so...
The PHP SDK does support a stream source, as you can see in the SDK specs (https://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject) under Parameter syntax:
$result = $client->putObject([
    // ...
    'Body' => <string || resource || Psr\Http\Message\StreamInterface>,
    // ...
]);
This means your code is almost OK; the only thing is that you should pass $stream instead of $stream->getContents():
$stream = new \GuzzleHttp\Psr7\Stream(fopen('php://input', 'r'));

$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'ContentLength' => (int)$_SERVER['CONTENT_LENGTH'],
    'Body' => $stream,
    'ACL' => 'private',
    'StorageClass' => 'STANDARD_IA',
));
As simple as that.
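If you'd rather not depend on ContentLength being correct, SDK v3 also ships a multipart uploader that reads the stream in parts. A minimal sketch under the same assumptions ($s3, $bucket and $keyname as above):

use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// Reads the source in parts, so the whole body is never held in memory at once.
$uploader = new MultipartUploader($s3, fopen('php://input', 'r'), [
    'bucket' => $bucket,
    'key'    => $keyname,
]);

try {
    $result = $uploader->upload();
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}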
That PHP SDK call does not support directly reading a stream. What appears to be happening is that PHP exhausts memory as it loads the entire object from the stream into a variable, before it actually calls the SDK to PUT that string of data to the object.
You'll want to consider using the S3 Stream Wrapper.
This example seems most appropriate, but you'll need to pass the data between both streams. While the S3 Stream Wrapper supports creating a stream from, say, a local file, I didn't see a direct example of passing an existing stream to it.
In the sketch below, we read up to 4096 bytes from the source (or fewer if 4096 aren't available); if the returned value is non-empty, we write it to the S3 object. We continue until the source reaches EOF (so the source must support EOF detection).
$client = new Aws\S3\S3Client([/** options **/]);

// Register the stream wrapper from an S3Client object
$client->registerStreamWrapper();

// The incoming POST body as the source stream
$source = fopen('php://input', 'r');

$stream = fopen('s3://bucket/key', 'w');
while (!feof($source)) {
    $string = fread($source, 4096);
    if (!empty($string)) {
        fwrite($stream, $string);
    }
}
fclose($stream);
fclose($source);
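A shorter variant under the same assumptions: once the wrapper is registered, PHP's built-in stream copy can do the read/write loop for you.

$client->registerStreamWrapper();

$source = fopen('php://input', 'r');
$target = fopen('s3://bucket/key', 'w');

// Copies in chunks internally, so the whole body is never buffered in memory.
stream_copy_to_stream($source, $target);

fclose($target);
fclose($source);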
I don't understand why this doesn't work, and I have scoured the Internet and can't find anything matching the specific command I'm using.
I am basically trying to generate a presigned URL from Amazon S3, and I am following the directions in the docs to a T, but it's not working. The docs say to build the array like this: ['Key' => 'Value']. I saw another question here where the accepted answer was to build it using array() instead, but that doesn't change anything.
It still gives this error:
[01-Jan-2016 13:28:56 America/Los_Angeles] PHP Catchable fatal error: Argument 2 passed to Guzzle\Service\Client::getCommand() must be of the type array, object given, called in /Users/alex/Development/theshrineofdionysus-com/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 76 and defined in /Users/alex/Development/theshrineofdionysus-com/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 79
This is the code I am using for the S3 part of it. Trust me when I say the constants for the keys, region, and bucket are correct, as I have other S3 code using them elsewhere that works flawlessly.
<?php
$s3 = Aws\S3\S3Client::factory(array(
    'key' => AWS_ACCESS_KEY,
    'secret' => AWS_SECRET_KEY,
    'region' => AWS_REGION,
));

$cmd = $s3->getCommand('GetObject', array(
    'Bucket' => AWS_BUCKET,
    'Key' => $row['video_id']
));

$request = $s3->createPresignedRequest($cmd, '+120 minutes');
$url = (string) $request->getUri();
?>
I also know that $row['video_id'] matches an existing filename, because when I echo it out it is the correct filename.
Here is my composer.json:
{
    "require": {
        "aws/aws-sdk-php": "2.*",
        "php": ">=5.2.0"
    }
}
This is my Amazon code on the other page that works fine:
$s3 = Aws\S3\S3Client::factory(array(
    'key' => AWS_ACCESS_KEY,
    'secret' => AWS_SECRET_KEY,
    'region' => AWS_REGION
));

$objects = $s3->getIterator('ListObjects', array('Bucket' => AWS_BUCKET));

foreach ($objects as $object) {
    echo '<option value="' . $object['Key'] . '">' . $object['Key'] . '</option>' . PHP_EOL;
}
It looks like you're following the guide for v3 but have v2 installed. You can create a presigned URL in v2 by calling: $url = $s3->getObjectUrl(AWS_BUCKET, $row['video_id'], '+120 minutes');
A full guide can be found here.
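Alternatively, if you want to keep the v3-style getCommand / createPresignedRequest code from your question, change "aws/aws-sdk-php": "2.*" to "3.*" in composer.json (v3 needs a newer PHP than the >=5.2.0 you have pinned) and adjust the client construction. A sketch of the v3 equivalent, noting that 'version' becomes required and the credentials move under a 'credentials' key:

$s3 = new Aws\S3\S3Client(array(
    'version' => 'latest',
    'region'  => AWS_REGION,
    'credentials' => array(
        'key'    => AWS_ACCESS_KEY,
        'secret' => AWS_SECRET_KEY,
    ),
));

$cmd = $s3->getCommand('GetObject', array(
    'Bucket' => AWS_BUCKET,
    'Key'    => $row['video_id'],
));

$request = $s3->createPresignedRequest($cmd, '+120 minutes');
$url = (string) $request->getUri();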
EDIT: Problem solved (thanks to this post)! I just needed to install cURL:
sudo apt-get install php5-curl
I am working on a tool to upload images to an AWS S3 bucket from a browser. I am using the PHP SDK provided by Amazon.
I tried the code provided in the documentation, but it does not work for me:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
// $filepath should be absolute path to a file on disk
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Upload a file.
$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'SourceFile' => $filepath,
    'ContentType' => 'text/plain',
    'ACL' => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY',
    'Metadata' => array(
        'param1' => 'value 1',
        'param2' => 'value 2'
    )
));

echo $result['ObjectURL'];
Even if I keep only these 2 lines:
use Aws\S3\S3Client;
$s3 = S3Client::factory();
...I get these errors:
Notice: Use of undefined constant CURLE_COULDNT_RESOLVE_HOST - assumed
'CURLE_COULDNT_RESOLVE_HOST' in
phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_COULDNT_CONNECT - assumed
'CURLE_COULDNT_CONNECT' in
phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_PARTIAL_FILE - assumed
'CURLE_PARTIAL_FILE' in phar:///var/www/aws.phar/Aws/S3/S3Client.php
on line 244
Notice: Use of undefined constant CURLE_WRITE_ERROR - assumed
'CURLE_WRITE_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on
line 244
Notice: Use of undefined constant CURLE_READ_ERROR - assumed
'CURLE_READ_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on
line 244
Notice: Use of undefined constant CURLE_OPERATION_TIMEOUTED - assumed
'CURLE_OPERATION_TIMEOUTED' in
phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_SSL_CONNECT_ERROR - assumed
'CURLE_SSL_CONNECT_ERROR' in
phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_HTTP_PORT_FAILED - assumed
'CURLE_HTTP_PORT_FAILED' in
phar:///var/www/aws.phar/Aws/S3/S3Client.php on line 244
Notice: Use of undefined constant CURLE_GOT_NOTHING - assumed
'CURLE_GOT_NOTHING' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on
line 244
Notice: Use of undefined constant CURLE_SEND_ERROR - assumed
'CURLE_SEND_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on
line 244
Notice: Use of undefined constant CURLE_RECV_ERROR - assumed
'CURLE_RECV_ERROR' in phar:///var/www/aws.phar/Aws/S3/S3Client.php on
line 244
Fatal error: Uncaught exception
'Guzzle\Common\Exception\RuntimeException' with message 'The PHP cURL
extension must be installed to use Guzzle.' in
phar:///var/www/aws.phar/Guzzle/Http/Client.php:72 Stack trace: #0
phar:///var/www/aws.phar/Aws/Common/Client/AbstractClient.php(79):
Guzzle\Http\Client->__construct('https://s3.amaz...',
Object(Guzzle\Common\Collection)) #1
phar:///var/www/aws.phar/Aws/Common/Client/ClientBuilder.php(249):
Aws\Common\Client\AbstractClient->__construct(Object(Aws\Common\Credentials\RefreshableInstanceProfileCredentials),
Object(Aws\S3\S3Signature), Object(Guzzle\Common\Collection)) #2
phar:///var/www/aws.phar/Aws/S3/S3Client.php(207):
Aws\Common\Client\ClientBuilder->build() #3 /var/www/response.php(30):
Aws\S3\S3Client::factory() #4 {main} thrown in
phar:///var/www/aws.phar/Guzzle/Http/Client.php on line 72
I just installed the AWS PHP SDK on my local machine, without any AWS registration; after several tries I got this working code:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
// $filepath should be absolute path to a file on disk
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

try {
    // Upload a file.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ContentType' => 'text/plain',
        'ACL' => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY',
        'Metadata' => array(
            'param1' => 'value 1',
            'param2' => 'value 2'
        )
    ));
    echo $result['ObjectURL'];
} catch (Exception $e) {
    echo $e->getMessage() . "\n";
}
This code returns an AWS exception message:
You must specify a non-null value for the Body or SourceFile parameters.
If I change $filepath to the path of any real file, this code outputs:
Error retrieving credentials from the instance profile metadata server. When you are not running inside of Amazon EC2, you must provide your AWS access key ID and secret access key in the "key" and "secret" options when creating a client or provide an instantiated Aws\Common\Credentials\CredentialsInterface object. ([curl] 28: Connection timed out after 5000 milliseconds [url] http://XXX.xxx.xxx.xxx/latest/meta-data/iam/security-credentials/)
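The error message itself points at the fix: outside EC2 you must pass explicit credentials when creating the client. A minimal sketch of the v2 factory call (the keys and region here are placeholders; use your own):

use Aws\S3\S3Client;

// Explicit credentials instead of the EC2 instance-profile
// metadata lookup that was timing out.
$s3 = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY',  // placeholder
    'secret' => 'YOUR_AWS_SECRET_KEY',  // placeholder
    'region' => 'us-west-2',            // assumption: use your bucket's region
));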
So if you have any questions you are very welcome to ask, but it should work.
By the way, if I remove the catch section from the code, I get the same error messages from PHP on my broken page.
Hope this helps in your case.