Is there a Riak CS S3 PHP client library out there? The best I could find was the s3cmd command-line client.
I've also seen the Riak PHP Client, but it looks like there is nothing related to S3.
I've installed aws-sdk-php-laravel and used the same credentials as for Riak CS S3, but it doesn't seem to work. Error message below:
The AWS Access Key Id you provided does not exist in our records.
Thank you for any guidance or advice.
Actually, if you are using Riak CS, it isn't a proxy; it's a completely different endpoint. So you should do it this way, with the base_url option:
$s3 = S3Client::factory([
    'base_url' => 'http://127.0.0.1:8080',
    'region' => 'my-region',
    'key' => 'my-key',
    'secret' => 'my-secret',
    'command.params' => ['PathStyle' => true]
]);
Using 'command.params' allows you to set a parameter that is applied to every operation. You need the 'PathStyle' option on every request to make sure the SDK does not move your bucket into the host part of the URL, as it is supposed to do for Amazon S3.
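To make the difference concrete, here is a plain-PHP sketch of the two URL styles (the bucket name, key, and endpoint are made-up placeholders):

```php
<?php
// Illustration only: path-style keeps the bucket in the URL path (what a
// Riak CS endpoint needs), while virtual-hosted style moves the bucket into
// the hostname (Amazon S3's default behavior).
$bucket = 'my-bucket';
$key = 'photos/cat.png';

$pathStyle     = "http://127.0.0.1:8080/{$bucket}/{$key}";
$virtualHosted = "http://{$bucket}.s3.amazonaws.com/{$key}";

echo $pathStyle . PHP_EOL;      // http://127.0.0.1:8080/my-bucket/photos/cat.png
echo $virtualHosted . PHP_EOL;  // http://my-bucket.s3.amazonaws.com/photos/cat.png
```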
This was all discussed in an issue on GitHub.
aws-sdk-php-laravel uses aws-sdk-php, which is hard-coded to use Amazon's URLs. If you want to use that library with Riak CS, you'll need to configure it to use your node as a proxy. According to the config docs, that would be set using:
use Aws\S3\S3Client;
$s3 = S3Client::factory(array(
    'request.options' => array(
        'proxy' => '127.0.0.1:8080'
    )
));
I haven't used Laravel, so I'm not sure where to put that so the setting gets passed along to aws-sdk-php.
How to upload an image to Scaleway storage via Laravel or PHP?
Laravel uses Flysystem under the hood to abstract file storage. It provides several drivers out of the box, including S3, Rackspace, and FTP.
If you want to support Scaleway, you would need to write a custom driver, which you can read more about here.
Edit: It seems from Scaleway's documentation that it supports AWS CLI clients, which means this should be quite easy to support in Flysystem. I tried the following and it worked.
I added a new driver in config/filesystems.php as follows:
'scaleway' => [
    'driver' => 's3',
    'key' => '####',
    'secret' => '#####',
    'region' => 'nl-ams',
    'bucket' => 'test-bucket-name',
    'endpoint' => 'https://s3.nl-ams.scw.cloud',
]
and then, to use the disk, I did the following:
\Storage::disk('scaleway')->put('file.txt', 'Contents');
My file was uploaded.
EDIT: I also made a PR to get Scaleway accepted into the list of adapters for League's Flysystem. It got merged; you can see it live here.
I've looked at every answer on here and it seems my problem is a little different or there hasn't been a proper solution. I'm doing the following in my PHP file:
use Aws\Route53\Route53Client;
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region' => 'us-east-1',
    'version' => '2013-04-01'
));
Getting this error:
Fatal error: Uncaught Aws\Exception\CredentialsException: Cannot read credentials from /.aws/credentials
The easy fix would seem to be making sure the HOME directory is the right one; indeed, it already is. The files are readable and my ec2-user is already the owner. The key and secret are already in the 'credentials' file, and the profile name is set to 'default'. I tried copying /.aws to other directories such as the root, /home, etc., and changed permissions (chmod and all of the above). Still nothing.
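For reference, a minimal sketch of how the default credentials path is derived; the SDK builds it from the HOME environment variable, so whatever PHP sees as HOME decides where it looks ('/home/ec2-user' below is just an example value):

```php
<?php
// Sketch: the default shared-credentials path is derived from HOME.
// The putenv() call pins HOME to an example value for illustration.
putenv('HOME=/home/ec2-user');

$home = getenv('HOME');
$path = $home . '/.aws/credentials';

echo $path . PHP_EOL; // /home/ec2-user/.aws/credentials
```

If the web server (or PHP-FPM) runs without HOME set, this is exactly how you end up with the SDK looking in /.aws/credentials at the filesystem root.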
Then I tried hard-coding the credentials (I know, not recommended) just to give it a little kick, and it completely ignored that I did this:
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region' => 'us-east-1',
    'version' => '2013-04-01',
    'credentials' => [
        'key' => $key,
        'secret' => $secret,
    ]
));
As a last resort, I even tried including the CredentialProvider class and passing this into my array; still nothing:
'credentials' => CredentialProvider::ini('default', '/home/ec2-user/.aws/credentials'),
What on earth am I doing wrong?
Just remove 'profile' => 'default', and it should work fine:
$client = Route53Client::factory(array(
    'region' => 'us-east-1',
    'version' => 'latest',
    'credentials' => [
        'key' => $key,
        'secret' => $secret,
    ]
));
Running on AWS CentOS 7, I tried everything (chmod/chown, /root, /home/user, env, .bashrc, etc.) to get /.aws/credentials to work outside the Apache /var/www directory. The SDK reported that it could not read the credentials file.
I looked at PHP to see if I could set/override the HOME variable, and it still did not read the credentials file until I placed the .aws folder in /var/www and set the HOME variable in my PHP file like so:
<?php
putenv('HOME=/var/www');
// The ZIP-file SDK install requires aws-autoloader
require 'aws-autoloader.php';
// Your PHP code below
Facing this issue, here was my exact approach:
PHP version: 7.2.24, AWS PHP SDK version: 3.180.4
First, copy your existing .aws folder to the filesystem root:
sudo cp -r ~/.aws /
Then your code should look like:
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region' => 'us-east-1',
    'version' => '2013-04-01'
));
In my case, it was interesting to realize that the PHP SDK looks for the credentials file in the root folder and not in the current user's home directory. That's the most plausible explanation for why my approach worked.
However, you may want to find a more general place for your local configs and use the following approach to load them:
use Aws\Credentials\CredentialProvider;
use Aws\Route53\Route53Client;

$path = '/my/config/folder/.aws/credentials';
$provider = CredentialProvider::ini('default', $path);
$provider = CredentialProvider::memoize($provider);

$client = Route53Client::factory(array(
    'region' => 'us-east-1',
    'version' => '2013-04-01',
    'credentials' => $provider
));
Hopefully this sheds more light for the AWS PHP community. It's really important to get this configuration right in order to build secure PHP applications.
Here is what I ended up doing for purposes of this question, although EJ's answer above is actually the right answer. Hopefully this helps someone to get their credentials file to be read:
use Aws\Credentials\CredentialProvider;
use Aws\Route53\Route53Client;
$profile = 'default';
$path = '/var/www/html/.aws/credentials';
$provider = CredentialProvider::ini($profile, $path);
$provider = CredentialProvider::memoize($provider);
$client = Route53Client::factory(array(
    'region' => 'us-east-1',
    'version' => '2013-04-01',
    'credentials' => $provider
));
Not sure what you are doing wrong, but I'd suggest bypassing the problem altogether by assigning an EC2 instance role to the VM in question; then you won't have to worry about it, and it's a better, more secure solution.
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
I think the AWS manual is a bit confusing.
I created the .aws directory at the filesystem root (/), not in the /root or /home dir, and everything worked.
/.aws/credentials
I upgraded from PHP 8.0 to PHP 8.1 and PHP suddenly complained it couldn't find the credential file. The xdebug error trace showed me the expected location, which was one level below my public html directory. I don't know why it changed, but I simply ran this command in that directory:
ln -s /.aws/ .aws
The symlink I created works fine to provide the credentials. I'm back up and running.
Check the permissions of the .aws/* files using "ls -l", change them so they are readable by the PHP process (e.g. "sudo chmod 644 .aws/*"; avoid 777, since a credentials file should never be world-writable), and rerun the code.
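Before loosening permissions at all, it can help to confirm from PHP itself whether the process can actually read the file. A small sketch, using a temp file as a stand-in for the real credentials file:

```php
<?php
// Stand-in for ~/.aws/credentials: create a temp file, restrict it to the
// owner, and check readability from the current process.
$path = sys_get_temp_dir() . '/demo-credentials';
file_put_contents($path, "[default]\naws_access_key_id = AKIA...\n");
chmod($path, 0600); // owner read/write only

$readable = is_readable($path);
echo $readable ? "readable\n" : "NOT readable\n";
unlink($path);
```

Run the same check as the web server user (not your shell user) to see what the SDK sees.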
We're using DynamoDB to synchronize sessions between more than one EC2 machine behind ELBs.
We noticed that this method slows the scripts down a lot.
Specifically, I made a JS harness that calls 3 different PHP scripts on the server 10 times each.
1) The first one is just an echo timestamp(); and takes about 50 ms round trip.
2) The second one is a PHP script that connects through mysqli to RDS MySQL and takes about the same time (50-60 ms).
3) The third script uses the DynamoDB session-keeping method described in the official AWS documentation and takes about 150 ms (3 times slower!).
I'm cleaning the garbage every night (as the documentation says) and the DynamoDB metrics seem OK (attached below).
The code I use is this:
use Aws\DynamoDb\DynamoDbClient;
use Aws\DynamoDb\Session\SessionHandler;

require 'aws.phar';

ini_set('session.entropy_file', '/dev/urandom');
ini_set('session.entropy_length', '512');
ini_set('session.gc_probability', 0);

$dynamoDb = DynamoDbClient::factory(array(
    'key'    => 'XXXXXX',
    'secret' => 'YYYYYY',
    'region' => 'eu-west-1'
));

$sessionHandler = SessionHandler::factory(array(
    'dynamodb_client'          => $dynamoDb,
    'table_name'               => 'sessions',
    'session_lifetime'         => 259200,
    'consistent_read'          => true,
    'locking_strategy'         => null,
    'automatic_gc'             => 0,
    'gc_batch_size'            => 25,
    'max_lock_wait_time'       => 15,
    'min_lock_retry_microtime' => 5000,
    'max_lock_retry_microtime' => 50000,
));
$sessionHandler->register();
session_start();
Am I doing something wrong, or is it normal all that time to retrieve the session?
Thanks.
Copying correspondence from an AWS engineer in AWS forums: https://forums.aws.amazon.com/thread.jspa?messageID=597493
Here are a couple of things to check:
Are you running your application on EC2 in the same region as your DynamoDB table?
Have you enabled OPcode caching to ensure that the classes used by the SDK do not need to be loaded from disk and parsed each time your script is run?
Using a web server like Apache and connecting to DynamoDB for sessions will require a new SSL connection to be established on each request. This is because PHP doesn't (currently) allow you to reuse cURL connection handles between requests. Some database drivers do allow persistent connections between requests, which could account for the performance difference.
If you follow up on the AWS forums thread, an AWS engineer should be able to help you with your issue. This thread is also monitored if you want to keep it open.
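To pin down where the time actually goes, one simple approach is to wrap session_start() (with each handler registered in turn) in a timer and compare the file-based handler against the DynamoDB handler. A sketch, with usleep() standing in for the call being measured:

```php
<?php
// timeMs() is a small helper defined here, not an SDK function.
function timeMs(callable $fn): float {
    $t0 = microtime(true);
    $fn();
    return (microtime(true) - $t0) * 1000.0;
}

// usleep() is a stand-in for session_start() with a given handler
// registered; swap it for the real call when profiling.
$ms = timeMs(function () { usleep(2000); });
printf("%.1f ms\n", $ms);
```

Comparing the two numbers tells you whether the ~100 ms gap is really the DynamoDB round trip (plus SSL handshake) or something else in the request.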
I want to read my own tweets in a little localhost application in JS + PHP.
I know how to read the JSON at api.twitter.com/1/statuses/user_timeline.json?screen_name=myName, but due to the rate limit I need to use a User Stream (https://dev.twitter.com/docs/streaming-api/user-streams).
I have my 4 keys from creating a dev account:
'consumer_key' => '*****',
'consumer_secret' => '*****',
'user_token' => '*******',
'user_secret' => '******',
So I tried with this: https://github.com/themattharris/tmhOAuth/blob/master/examples/userstream.php
downloaded the lib
ran my MAMP (or WAMP or LAMP)
opened the example, put in my keys
went to the page
and nothing, except the browser loader.
Why does this happen?
Is it due to localhost?
Or missing params?
Or a new Twitter restriction?
The Streaming API is not the right tool for a little application; you are better off with the plain REST API.
A Streaming API application is not supposed to run in a browser. Don't forget to set_time_limit(0) and start your .php script on the command line, where it will run forever (you should save the tweets in a database so your normal browser scripts can display them).
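A minimal sketch of the shape of such a command-line consumer. The array below stands in for the live stream; in a real script you would read one JSON line at a time from the open streaming connection and insert into your database:

```php
<?php
set_time_limit(0); // a streaming consumer should never time out

// Stand-in for the live stream: in reality this loop would block on the
// open HTTP connection, receiving one JSON-encoded tweet per line.
$stream = ['{"text":"tweet one"}', '{"text":"tweet two"}'];

$saved = [];
foreach ($stream as $line) {
    $tweet = json_decode($line, true);
    $saved[] = $tweet['text']; // in reality: INSERT into your database
}

echo count($saved) . " tweets saved\n"; // 2 tweets saved
```

Your browser-facing pages then just read from the database instead of talking to Twitter directly.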
There are some similar questions, but none have a good answer on how to upload files directly to S3 using PHP with a progress bar. Is it even possible to add a progress bar without using Flash?
NOTE: I am referring to uploading from the client browser directly to S3.
I've done this in our project. You can't upload directly to S3 using AJAX because of standard cross-domain security policies; instead, you need to use either a regular form POST or Flash. You'll need to send the security policy and signature in a relatively complex process, as explained in the S3 docs.
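As a rough sketch of that signing process (the SigV2-era browser POST: a base64-encoded policy document signed with HMAC-SHA1; the bucket name and secret below are placeholders, not real values):

```php
<?php
// Placeholder policy: the real one lists your bucket, key prefix, ACL,
// content-length limits, etc., per the S3 browser-POST documentation.
$policyDocument = [
    'expiration' => '2030-01-01T00:00:00Z',
    'conditions' => [
        ['bucket' => 'my-bucket'],
        ['acl' => 'private'],
    ],
];

// The HTML form posts the base64 policy plus its HMAC-SHA1 signature,
// which S3 verifies against your secret key.
$policy    = base64_encode(json_encode($policyDocument));
$secret    = 'my-secret-key';
$signature = base64_encode(hash_hmac('sha1', $policy, $secret, true));

echo strlen($signature) > 0 ? "signed\n" : "failed\n"; // signed
```

The `$policy` and `$signature` values go into hidden form fields alongside the file input, so the browser talks to S3 directly without your secret ever leaving the server.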
YES, it is possible to do this in PHP SDK v3.
$client = new S3Client(/* config */);

$result = $client->putObject([
    'Bucket' => 'bucket-name',
    'Key' => 'bucket-name/file.ext',
    'SourceFile' => 'local-file.ext',
    'ContentType' => 'application/pdf',
    '@http' => [
        'progress' => function ($downloadTotalSize, $downloadSizeSoFar, $uploadTotalSize, $uploadSizeSoFar) {
            // handle your progress bar percentage
            printf(
                "%s of %s downloaded, %s of %s uploaded.\n",
                $downloadSizeSoFar,
                $downloadTotalSize,
                $uploadSizeSoFar,
                $uploadTotalSize
            );
        }
    ]
]);
This is explained in the AWS docs' S3 configuration section. It works by exposing Guzzle's progress callable, as explained in this SO answer.
Technically speaking, with PHP you cannot go from client --> S3. Your solution, if you want to use PHP would either have to be designed as follows:
Client -> Web Server (PHP) -> Amazon S3
Client with PHP server embedded -> Amazon S3
The AWS PHP SDK: http://aws.amazon.com/sdkforphp/ is very well written and contains a specific example on how to send a file from a Client --> Server --> S3
With respect to the progress bar, there are many options available. A quick search of stackoverflow.com shows questions answered nearly identically to this one:
Upload File Directly to S3 with Progress Bar
'pass through' php upload to amazon's s3?
It is possible to upload directly, but a progress bar is impossible:
http://undesigned.org.za/2007/10/22/amazon-s3-php-class/
See example_form in the downloads: direct upload from browser to S3.