Directus Storage Adapter AWS S3 error: No permission to write - PHP

I decided to use Directus as a headless CMS in one of our projects. For storing media files we decided to use AWS S3 as the storage adapter. I have followed all the instructions from the Directus documentation on how to set up the configuration for S3, but I keep getting this error:
message: "No permission to write: 122d303f-b69a-48f8-8138-8f3404b3c992.jpg"
I have tested the AWS S3 bucket from the CLI: I can upload files there and list them, but from Directus I keep getting the no-permission-to-write error.
The storage configuration in Directus looks like this:
'storage' => [
    'adapter' => 's3', // What storage adapter to use for files
    // Defaults to the local filesystem. Other natively supported
    // options include: Amazon S3, Aliyun OSS, Azure
    // You'll need to require the correct flysystem adapters through Composer
    // See https://docs.directus.io/extensions/storage-adapters.html#using-aws-s3

    'root' => '/originals', // Where files are stored on disk
    'thumb_root' => '/thumbnails', // Where thumbnails are stored on disk
    'root_url' => '##S3_URL##/originals', // Where files are accessed over the web
    // 'proxy_downloads' => false, // Use an internal proxy for downloading all files

    // S3
    ////////////////////////////////////////
    'key' => '##S3_KEY##',
    'secret' => '##S3_SECRET##',
    'region' => '##S3_REGION##',
    'version' => 'latest',
    'bucket' => '##S3_BUCKET##',
    'options' => [
        'ACL' => 'public-read',
        'CacheControl' => 'max-age=604800'
    ],
],
Can anyone give any insight on this?
Thanks in advance.

You need to add the following:
'options' => [
    'ACL' => 'private',
],
I found that I did not need the bucket policy update; I just needed to add this to config.php when trying to upload to a private bucket.

I had the same issue.
In my case, I allowed "s3:PutObjectAcl" on the bucket and it started working :)
It seems that this is a Flysystem issue.
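For reference, here is a sketch of the kind of policy statement that grants the upload-related actions; the bucket name is a placeholder, and the exact set of actions your setup needs may differ:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*"
            ]
        }
    ]
}
Since the Directus config above passes an ACL option on every upload, s3:PutObject alone appears not to be enough; s3:PutObjectAcl must be allowed as well.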

Related

How is it possible to use the current EC2 instance's IAM role in the AWS SDK for PHP?

I would like to use SNS/SQS in my PHP application, which is running on an EC2 instance.
I was able to find the guide about credentials for the AWS SDK for PHP version 3, but I still have no idea how to simply use the IAM role associated with the EC2 instance my app is running on.
This is what I tried:
$SnSclient = new SnsClient([
    'profile' => 'default',
    'region' => 'eu-central-1',
    'version' => '2010-03-31',
    'credentials' => \Aws\Credentials\CredentialProvider::defaultProvider()
]);
But I get this error:
PHP Fatal error: Uncaught Aws\Exception\CredentialsException: Cannot read credentials from /home/ec2-user/.aws/credentials in phar:///var/www/html/aws.phar/Aws/Credentials/CredentialProvider.php:874
I would like to avoid uploading my pem file to the instance; since it is already running in AWS, as far as I know it should be able to use its IAM role.
The 'profile' => 'default', line had to be removed, as suggested in this answer.
It does not seem to be well documented, but if you are using the IAM role of the instance, then no profile or credentials attributes are necessary.
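In other words, a minimal sketch of the working client, assuming the instance has the IAM role attached:
use Aws\Sns\SnsClient;

// With no 'profile' and no 'credentials' key, the SDK's default provider
// chain falls through to the EC2 instance metadata service and uses the
// temporary credentials of the attached IAM role.
$SnSclient = new SnsClient([
    'region' => 'eu-central-1',
    'version' => '2010-03-31',
]);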

Laravel Vapor and Laravel Excel queued imports failing due to temp storage not existing (Laravel 8, Vapor)

When I try to run an import that implements chunk reading and queues, I receive the following error from the Vapor queue:
ErrorException: touch(): Unable to create file /tmp/storage/framework/laravel-excel/laravel-excel-7IEEz0rP7NORtp7N4NeOxuH0hlbM9JPR.csv because No such file or directory in /var/task/vendor/maatwebsite/excel/src/Files/RemoteTemporaryFile.php:97
The Laravel Excel documentation says to set these values in config/excel.php:
https://docs.laravel-excel.com/3.1/imports/queued.html#multi-server-setup
'temporary_files' => [
    'local_path' => storage_path('framework/laravel-excel'),
    'remote_disk' => 's3',
    'force_resync_remote' => true,
],
My vapor.yml file has the following storage set up, all working fine with correct env values for Vapor and local development:
storage: **correct bucket name**
Has anyone managed to get queued imports working with Laravel Excel and Vapor, and if so, how did you manage it? The documentation doesn't really explain what values I should use.
Try setting local_path to the system temp dir in the config file; on Vapor the code runs on AWS Lambda, where /tmp is the only writable path:
<?php

return [
    'temporary_files' => [
        'local_path' => sys_get_temp_dir(),
        ...
    ],
];

How to use AWS PHP SDK v3.0 without a credentials file

Hi, I am using the AWS SDK for PHP version 3 to upload files to S3.
I need to get rid of the credentials file (.aws/credentials) because it's causing issues on my production server.
The hard-coded credentials method isn't working in my code; the link is pasted below.
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_credentials.html#hardcoded-credentials
Kindly provide a valid and working solution for how to use hard-coded credentials.
Please note that if I use the credentials file, everything works OK, so the problem is with the credentials code.
Here is my code where I instantiate my S3 object:
$s3Client = new S3Client([
    'profile' => 'default',
    'region' => 'us-west-2',
    'version' => '2006-03-01',
    'scheme' => 'http',
    'credentials' => [
        'key' => KEY,
        'secret' => SECRET
    ]
]);
You just need to remove the 'profile' => 'default', line, which has the effect of overriding your hard-coded credentials.
I've been dealing with your same problem with much frustration today, and finally solved it. See related answer here for the same problem on a different Amazon service.
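In other words, the working construction is the question's array with the 'profile' line dropped (a sketch reusing the same constants):
// 'profile' removed: the hard-coded credentials below now take effect
$s3Client = new S3Client([
    'region' => 'us-west-2',
    'version' => '2006-03-01',
    'scheme' => 'http',
    'credentials' => [
        'key' => KEY,
        'secret' => SECRET
    ]
]);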
Per the AWS documentation (https://docs.aws.amazon.com/aws-sdk-php/v2/guide/credentials.html):
If you do not provide credentials to a client object at the time of its instantiation (e.g., via the client's factory method or via a service builder configuration), the SDK will attempt to find credentials in your environment when you call your first operation. The SDK will use the $_SERVER superglobal and/or getenv() function to look for the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. These credentials are referred to as environment credentials.
The v3 docs are here: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_credentials.html
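As a sketch, if AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are exported in the web server's environment (region and version below are placeholders), the client can be constructed with no credentials at all:
// The SDK's default provider chain picks up the environment variables
// when neither a 'credentials' nor a 'profile' key is given.
$s3Client = new \Aws\S3\S3Client([
    'region' => 'us-west-2',
    'version' => '2006-03-01',
]);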
In my case I am using an IAM role on the machines that host the app; it is easier to manage permissions from the IAM dashboard, and you avoid hardcoding credentials or keeping them in a config file.

Uploading files to S3 using Laravel 5.1 fails

I'm trying to upload some files to S3 using Laravel and its Filesystem API.
I'm getting the following error:
S3Exception in WrappedHttpHandler.php line 159:
Error executing "PutObject" on "https://s3..amazonaws.com/example-bucket/433922_1448096894943.png"; AWS HTTP error: cURL error 6: Could not resolve host: s3.global.amazonaws.com (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
Here are my settings in the /config/filesystems.php file:
's3' => [
    'driver' => 's3',
    'key' => 'apiKey',
    'secret' => 'secretApiKey',
    'region' => 'global',
    'bucket' => 'example-bucket',
],
As noted in the error, the Filesystem API can't resolve the following domain name:
s3.global.amazonaws.com
The "global" part stands for the region, but as far as I understand, S3 no longer requires a region.
I think the actual domain name should look like this:
example-bucket.s3.amazonaws.com/
Here is the code I am using to invoke the Filesystem API:
Storage::disk('s3')->put('new-name.png', file_get_contents('path/to/file'))
I had to set the correct S3 region, which in my case was eu-west-1, and it all started working.
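So the working disk configuration would look something like this (a sketch; eu-west-1 was my region, yours may differ):
's3' => [
    'driver' => 's3',
    'key' => 'apiKey',
    'secret' => 'secretApiKey',
    'region' => 'eu-west-1', // a real AWS region instead of 'global'
    'bucket' => 'example-bucket',
],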

AWS SDK for PHP: Error retrieving credentials from the instance profile metadata server

I am trying to send SNS messages to Android through a web API.
I downloaded and installed the SDK from http://aws.amazon.com/developers/getting-started/php/
I got the following error while running sample.php:
Fatal error: Uncaught exception 'Aws\Common\Exception\InstanceProfileCredentialsException' with message 'Error retrieving credentials from the instance profile metadata server. When you are not running inside of Amazon EC2, you must provide your AWS access key ID and secret access key in the "key" and "secret" options when creating a client or provide an instantiated Aws\Common\Credentials\CredentialsInterface object. ([curl] 28: Connection timed out after 5016 milliseconds [url] http://169.254.169.254/latest/meta-data/iam/security-credentials/)' in C:\xampp\htdocs\aws-php\vendor\aws\aws-sdk-php\src\Aws\Common\InstanceMetadata\InstanceMetadataClient.php:85 Stack trace: #0 C:\xampp\htdocs\aws-php\vendor\aws\aws-sdk-php\src\Aws\Common\Credentials\RefreshableInstanceProfileCredentials.php(52): Aws\Common\InstanceMetadata\InstanceMetadataClient->getInstanceProfileCredentials() #1 C:\xampp\htdocs\aws-php\vendor\aws\aws-sdk-php\src\Aws\Common\Credentials\AbstractRefreshableCredentials.php(54): Aws\Common\Credentials\Refreshable in C:\xampp\htdocs\aws-php\vendor\aws\aws-sdk-php\src\Aws\Common\InstanceMetadata\InstanceMetadataClient.php on line 85
A little guidance on this topic would help me a lot.
In my case, I was using
return DynamoDbClient::factory(array(
    'version' => 'latest',
    'region' => AWS_REGION,
    'key' => AWS_KEY,
    'secret' => AWS_SECRET
));
which used to be OK with aws/aws-sdk-php version 2.8.5, but when Composer automatically installed version 3.2.0, I got the error above. The problem is simply that I should have changed the call to:
return DynamoDbClient::factory(array(
    'version' => 'latest',
    'region' => AWS_REGION,
    'credentials' => array(
        'key' => AWS_KEY,
        'secret' => AWS_SECRET,
    )
));
as documented here. Without changing the call, Apache's PHP was falling back to looking for the ~/.aws/credentials file using the HOME environment variable, which was empty. You can check its value by running php -r 'var_dump(getenv("HOME"));'.
This is a related post.
In my case I had to use hard-coded credentials:
$s3Client = new S3Client([
    'region' => REGION,
    'version' => '2006-03-01',
    'credentials' => [
        'key' => S3_KEY,
        'secret' => S3_SECRETE,
    ],
]);
See more details here:
You have to place the .aws/credentials file with your configuration in the home directory of the web service (usually /var/www), not in the home directory of the logged-in user.
You can find out which home directory your web service is using by running echo getenv('HOME'); in a PHP file on your server.
I was trying to use a credentials file and got the same error; this guy on GitHub pretty much nailed it:
The credentials file should be in ini format but not have a .ini extension. It should have a 'default' section defined with your key and secret:
$ less ~/.aws/credentials
[default]
aws_access_key_id = key
aws_secret_access_key = secret
If you specified a different section name instead of default, just add a profile key to the S3Client parameters:
[example]
aws_access_key_id = key
aws_secret_access_key = secret
$s3Client = new \Aws\S3\S3Client([
    'version' => '2006-03-01',
    'region' => $yourPreferredRegion,
    'profile' => 'example',
]);
Using a credentials file or environment variables is the recommended way of providing credentials on your own server.
And @Anti's answer also helped me a lot!
If you prefer the hard-coded way, just follow @shadi's answer.
Assuming that the server is located on AWS EC2 (probably the same for ECS and Elastic Beanstalk), the "correct" way to handle this issue is not to store credentials at all.
Instead, do this:
Create an IAM role (https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html)
Add the relevant permissions to the role policy (in this case, sending SNS messages)
Assign the role to the EC2 instance (Instance Settings => Attach/Replace IAM Role)
This way you don't leave any sensitive data in your code, and the client can be constructed without credentials, as sketched below.
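A minimal sketch (the region is a placeholder):
use Aws\Sns\SnsClient;

// The default credential provider chain resolves temporary credentials
// from the instance metadata service when an IAM role is attached.
$snsClient = new SnsClient([
    'region' => 'us-east-1', // placeholder
    'version' => '2010-03-31',
]);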
Here are the steps:
Type cd ~ to go into your home directory.
mkdir .aws
sudo vi .aws/credentials
Write the following lines and save the file:
[default]
aws_access_key_id = Your AWS Access Key
aws_secret_access_key = Your AWS Secret Access Key
If it is Laravel with the aws/aws-sdk-php-laravel SDK, then after configuring all the steps and defining the keys in the .env file, you have to drop the config cache and rebuild it with the following commands:
php artisan config:cache;
composer dump-autoload;
This might be because the config file hasn't been published.
Be sure to publish the config file:
php artisan vendor:publish --provider="Aws\Laravel\AwsServiceProvider"
To test whether this is the issue, just clear the config:
php artisan config:clear
If it works with the cache cleared, then this will be the issue.
You can try these lines:
$credentials = new Aws\Credentials\Credentials('key', 'secret-key');
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'ap-south-1',
    'credentials' => $credentials,
]);
