I have this code:
<?php
require('aws/aws-autoloader.php');
echo "1";
use Aws\S3\S3Client;
echo "2";
$s3Client = S3Client::factory(array(
    'key'    => 'mykey',
    'secret' => 'mysecret',
));
echo "3";
echo "OK!";
?>
On my machine the output is "123OK!" (as expected), but after uploading it to the server I only get "12" (meaning the creation of the client object fails?).
My local machine is running PHP 5.3.27 while the server is running 5.5.5-1chl1~precise1
Update:
The error I'm getting:
Fatal error: Uncaught exception 'Guzzle\Common\Exception\RuntimeException' with message 'The PHP cURL extension must be installed to use Guzzle.' in /var/www/api/1.0/aws/Guzzle/Http/Client.php:70 Stack trace: #0 /var/www/api/1.0/aws/Aws/Common/Client/AbstractClient.php(78): Guzzle\Http\Client->__construct('https://s3.amaz...', Object(Guzzle\Common\Collection)) #1
How do I install what is needed on Linux on EC2?
sudo apt-get install php5-curl
sudo service apache2 restart
The error message says:
The PHP cURL extension must be installed to use Guzzle.
So… you need to install the PHP cURL extension.
How do I install it on a Linux server hosted on Amazon?
It depends on the OS. Installing in Ubuntu is different from installing in Amazon Linux.
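Whichever package manager applies, once the extension is installed and the web server restarted, a quick sanity check from PHP itself (a small diagnostic sketch, not part of the original answers) can confirm that the extension is visible to the SAPI serving your page:
<?php
// Diagnostic sketch: confirm the cURL extension is loaded in the PHP
// environment (SAPI) that actually serves this script.
if (extension_loaded('curl')) {
    $cv = curl_version();
    echo "cURL extension loaded (libcurl " . $cv['version'] . ") under " . PHP_SAPI;
} else {
    echo "cURL extension NOT loaded under " . PHP_SAPI;
}
?>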
How about trying this:
<?php
require('aws/aws-autoloader.php');
use Aws\Common\Aws;
$aws_access_key = ''; // AWS Access Key
$aws_access_security = ''; // AWS Secret Key
$aws_default_region = 'ap-southeast-1'; // Your default region
$aws_default_schema = 'http'; // Default protocol scheme
// Instantiate the AWS client with your AWS credentials
$aws = Aws::factory(array(
    'key'    => $aws_access_key,
    'secret' => $aws_access_security,
    'region' => $aws_default_region,
    'scheme' => $aws_default_schema,
));
// Define S3client Object
$s3Client = $aws->get('s3');
// Test
var_dump($s3Client);
?>
That should work. But I think you should use the Composer method of installing the AWS SDK.
If you need a guideline or a script to work with S3, you can use my code on GitHub: https://github.com/arizawan/aiss3clientphp
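As a follow-up to the snippet above, a minimal end-to-end test could also upload a small object. This is a sketch only; the credentials, region, bucket name, and body below are placeholders rather than values from the question, and it assumes the same SDK v2-style Aws::factory used in the answer:
<?php
require('aws/aws-autoloader.php');
use Aws\Common\Aws;

// Sketch only: credentials, region and bucket name are placeholders.
$aws = Aws::factory(array(
    'key'    => 'mykey',
    'secret' => 'mysecret',
    'region' => 'ap-southeast-1',
));
$s3Client = $aws->get('s3');

// Upload a tiny test object and print the URL the SDK reports for it.
$result = $s3Client->putObject(array(
    'Bucket' => 'my-test-bucket',
    'Key'    => 'test.txt',
    'Body'   => 'hello from the SDK',
));
echo $result['ObjectURL'];
?>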
Related
I'm using PHP to try out Google Cloud Spanner. I've already done the gcloud setup and everything on that side is fine. Now I need to connect from PHP to do CRUD against the database in Spanner, but the code below always returns the error:
PHP Fatal error: Undefined constant 'Grpc\STATUS_UNKNOWN' in
/xxx/xxxx/www/vendor/google/cloud-spanner/Connection/Grpc.php on line
129
The code I have is:
<?php
require 'vendor/autoload.php';
use Google\Cloud\Spanner\SpannerClient;
/* Error start here */
$spanner = new SpannerClient([
    'projectId' => 'my-project-id'
]);
$db = $spanner->connect('instance', 'database');
// Spanner query parameters are referenced with @, not #
$userQuery = $db->execute('SELECT * FROM usuario WHERE login = @login', [
    'parameters' => [
        'login' => 'devteam'
    ]
]);
$user = $userQuery->rows()->current();
echo 'Hello ' . $user['login'];
The requirements I use in the composer are:
"require": {
"google/cloud": "^0.32.1",
"google/cloud-spanner": "^0.2.2"
}
I noticed that if I access it through the browser, the error shown above still appears, but if I run php teste.php on the terminal, the script runs correctly; i.e., the terminal works and the browser does not.
Google Cloud PHP's spanner client is gRPC only. This means to use it you will need to install the gRPC PHP extension:
pecl install grpc
Once you have done that, add google/proto-client-php and google/gax to your composer.json and run composer update. After this is done, the error will be resolved.
For those wanting more detailed instructions, see this page for installing and enabling gRPC for PHP!
Since you mentioned that it works on CLI but not on browser, I can say that you need to enable the grpc extension on your php web server config.
E.g., add extension=grpc.so to your /etc/php/5.6/apache2/php.ini
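A small diagnostic sketch (check_grpc.php is a hypothetical filename, not from the original answer) that can be run once in the browser and once on the CLI to see which php.ini each SAPI loads and whether grpc is enabled there:
<?php
// Diagnostic sketch: compare the web SAPI's configuration with the CLI's.
// Run it in the browser and then via `php check_grpc.php`.
echo 'SAPI: ' . PHP_SAPI . "\n";
echo 'Loaded php.ini: ' . php_ini_loaded_file() . "\n";
echo 'grpc loaded: ' . (extension_loaded('grpc') ? 'yes' : 'no') . "\n";
?>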
I have been transferring my PHP server files from Windows to Ubuntu 14.04 and my script broke.
I'm getting the error below when downloading from S3.
$result = $s3->getObject(array(
    'Bucket' => AWS_S3_BUCKET,
    'Key'    => $imgName,
    'SaveAs' => $targetPath
)); // the exception is thrown here!
The exception message:
Error executing GetObject on http://s3.amazonaws.com/bucket/file AWS
HTTP error: Unable to open ../../downloads/file using mode r+:
fopen(../../downloads/file).
sudo chmod -R 777 wwwfolder didn't work
Any ideas?
Update:
Found a hack/workaround: adding touch($targetPath) before the $s3->getObject() call fixes the problem... but why? I tried on a different Ubuntu server (same version) and there it works without touch (and I don't know how that server was configured).
What about now? Any ideas?
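For reference, the workaround from the update looks roughly like this; the comment reflects the most likely explanation (fopen()'s 'r+' mode, which the error message shows, requires the file to already exist), which is an inference rather than a confirmed answer:
// Workaround sketch: pre-create the target file so the SDK's
// fopen($targetPath, 'r+') has an existing file to open.
touch($targetPath);

$result = $s3->getObject(array(
    'Bucket' => AWS_S3_BUCKET,
    'Key'    => $imgName,
    'SaveAs' => $targetPath
));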
For a custom application, I am trying to integrate Rackspace cloud files using php-opencloud library.
This is the link I followed for setup: https://github.com/srijanaravali/php-opencloud/blob/master/docs/getting-started.md
# Install Composer
curl -sS https://getcomposer.org/installer | php
# Require php-opencloud as a dependency
php composer.phar require rackspace/php-opencloud:dev-master
However, when I try to instantiate a client object, it throws an error:
Fatal error: Class 'OpenCloud\Rackspace' not found in /var/www/example/Project/sites/all/libraries/php-opencloud/test.php on line 7
Here is the code snippet:
<?php
require 'vendor/autoload.php';
use OpenCloud\Rackspace;
// 1. Instantiate a Rackspace client.
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => getenv('Axxxxxxx'),
    'apiKey'   => getenv('abcxxxxxxxxxxxxxxxxxxxx')
));
print_r($client); die('!!');
Any pointers about what's missing?
Got it working. For some strange reason, the php-opencloud library was empty under vendor/rackspace/php-opencloud.
I cloned a copy from GitHub and created a symlink to it from the above directory. It is working fine now.
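A quick way to verify that the package really landed in the vendor directory and that the autoloader can resolve the class (a small diagnostic sketch; the paths assume the layout from the question):
<?php
require 'vendor/autoload.php';

// class_exists() triggers the Composer autoloader, so this shows whether
// vendor/rackspace/php-opencloud actually contains the library.
var_dump(class_exists('OpenCloud\Rackspace'));
var_dump(is_dir(__DIR__ . '/vendor/rackspace/php-opencloud'));
?>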
I am trying to install Amazon AWS SDK StreamWrapper on cPanel.
I have this code in my s3_Upload.php file:
<?php
require 'aws/aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$s3Client = S3Client::factory(array(
    'key'    => "<key>",
    'secret' => "<secret_key>"
));
$s3Client->registerStreamWrapper();
$s3Bucket = '<bucket_name>';
$s3Path = 's3://'.$s3Bucket;
if (file_exists($s3Path.'/<folder>/clip.mp4')) {
    echo 'Clip exists!';
} else {
    echo "Clip doesn't exist!";
}
?>
and I have both the aws.phar file and extracted version of aws-sdk-php-master.zip.
Issue:
Whenever I try to go to www.example.com/s3_Upload.php it writes this error:
Fatal error: Class 'Phar' not found in /home/<user>/public_html/aws/aws.phar on line 17
It seems your PHP environment is missing PHP/PECL's Phar class. This is odd, because, according to the Phar Installation page, "The Phar extension is built into PHP as of PHP version 5.3.0.".
So either you need to make sure the Phar extension is installed, or you need to use the aws.zip.
Note: The aws.zip is not the same as the aws-sdk-php-master.zip you've mentioned. I assume what you have is just a download of the SDK source from GitHub; that zip does not contain the SDK's dependencies the way aws.phar and aws.zip do. To use the official aws.zip, please see Installing from the Zip in the AWS SDK for PHP User Guide.
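If you go the zip route, the only change needed in s3_Upload.php should be the require line, since the official aws.zip ships an autoloader rather than a phar (a sketch, assuming aws.zip is extracted into the same aws/ directory):
<?php
// Sketch: with the official aws.zip, use the bundled autoloader
// instead of requiring the phar.
require 'aws/aws-autoloader.php';

use Aws\S3\S3Client;

$s3Client = S3Client::factory(array(
    'key'    => "<key>",
    'secret' => "<secret_key>"
));
$s3Client->registerStreamWrapper();
?>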
From this code I'm getting the error below
require "vendor/autoload.php";
use Aws\Common\Aws;
use Aws\DynamoDb\DynamoDbClient;
use Aws\DynamoDb\Enum\ComparisonOperator;
use Aws\DynamoDb\Enum\KeyType;
use Aws\DynamoDb\Enum\Type;
$aws = Aws::factory(array(
'key' => '[clipped]',
'secret' => '[clipped]',
'region' => Region::US_WEST_1
));
$client = $aws->get("dynamodb");
$tableName = "ExampleTable";
$result = $client->createTable(array(
    "TableName" => $tableName,
    "AttributeDefinitions" => array(
        array(
            "AttributeName" => "Id",
            "AttributeType" => Type::NUMBER
        )
    ),
    "KeySchema" => array(
        array(
            "AttributeName" => "Id",
            "KeyType" => KeyType::HASH
        )
    ),
    "ProvisionedThroughput" => array(
        "ReadCapacityUnits" => 5,
        "WriteCapacityUnits" => 6
    )
));
print_r($result->getPath('TableDescription'));
I'm getting the following error when trying to add a table into AWS's DynamoDB.
PHP Fatal error: Uncaught Aws\\DynamoDb\\Exception\\DynamoDbException: AWS Error Code:
InvalidSignatureException,
Status Code: 400,
AWS Request ID: [clipped],
AWS Error Type: client,
AWS Error Message: Signature expired: 20130818T021159Z is now earlier than
20130818T021432Z (20130818T022932Z - 15 min.),
User-Agent: aws-sdk-php2/2.4.3 Guzzle/3.7.2 curl/7.21.6 PHP/5.3.6-13ubuntu3.9\n thrown in
/var/www/vendor/aws/aws-sdk-php/src/Aws/Common/Exception/NamespaceExceptionFactory.php on
line 91
So far I've:
Checked to see if Authentication Key and Secret Key were correct, they were.
Updated cURL
When I put false authentication permissions in, the error didn't change.
It seems that your local system time might be incorrect. I've had a similar problem with AWS S3, where my system clock was skewed by 30 mins.
If you're running ubuntu, try updating your system time:
sudo ntpdate ntp.ubuntu.com
You can also restart your date service to solve the problem if you've already got ntpdate installed.
sudo service ntpdate stop
sudo service ntpdate start
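Before and after syncing, you can also compare the server's idea of UTC with the timestamps in the error message directly from PHP (a minimal sketch, not part of the original answer):
<?php
// Print the server's current UTC time in the same compact format the
// signature error uses (e.g. 20130818T021159Z) to spot clock skew.
echo gmdate('Ymd\THis\Z') . "\n";
?>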
If you are using docker-machine on Mac, you can resolve with this command:
docker-machine ssh default 'sudo ntpclient -s -h pool.ntp.org'
Quick note for vagrant projects: this is usually resolved by vagrant reload.
Not exactly the OP's question, but this is the top Google result for "InvalidSignatureException DynamoDB", which has many underlying causes.
For me, it was because my request body contained emoji (100% reproducible). I worked around it by encoding the body (in my case, stringified JSON) using encodeURIComponent.