I am using the AWS facade for Laravel and I can instantiate a CloudSearchDomainClient object like this:
$c = AWS::createClient('cloudsearchdomain', ['endpoint' => '{our-endpoint}']);
But when I attempt to search like so:
$c->search(['query' => 'test']);
I get this error: Aws\CloudSearchDomain\Exception\CloudSearchDomainException with message 'Error executing "Search" on "2013-01-01/search"; AWS HTTP error: cURL error 6: Could not resolve host: 2013-01-01
It thinks the version is the endpoint.
I have the proper .env vars, e.g. AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION. I am able to use other AWS services, but CloudSearch specifically is a problem. What am I doing incorrectly?
I think you are mixing two things. An endpoint is a domain and an API version is a date. So it should be more like this:
$c = AWS::createClient(
    'cloudsearchdomain',
    [
        'endpoint' => 'https://example.com',
        'apiVersion' => '2013-01-01',
    ]
);
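With the endpoint pointed at your domain's actual search endpoint, the original call should then resolve correctly. A minimal sketch (the hostname below is a placeholder for your CloudSearch domain's search endpoint):
$c = AWS::createClient('cloudsearchdomain', [
    'endpoint' => 'https://search-{your-domain}-xxxxxxxxxx.us-east-1.cloudsearch.amazonaws.com',
    'apiVersion' => '2013-01-01',
]);
$result = $c->search(['query' => 'test']);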
I am trying to upgrade an S3 multipart uploader from Laravel 8 to Laravel 9. I have upgraded to Flysystem 3 as outlined in the documentation (https://laravel.com/docs/9.x/upgrade#flysystem-3) and have no dependency errors.
I am having trouble getting access to the underlying S3Client to create a Multipart upload.
// Working in Laravel 8
// Laravel 9 throws exception on getAdapter()
$client = Storage::disk('s3')->getDriver()->getAdapter()->getClient();

// Underlying S3Client is used to create the multipart uploader as below
$bucket = config('filesystems.disks.s3.bucket');
$result = $client->createMultipartUpload([
    'Bucket' => $bucket,
    'Key' => $key,
    'ContentType' => 'image/jpeg',
    'ContentDisposition' => 'inline',
]);

return response()
    ->json([
        'uploadId' => $result['UploadId'],
        'key' => $result['Key'],
    ]);
Laravel 9, however, throws an exception Call to undefined method League\Flysystem\Filesystem::getAdapter().
I've looked over the source for League\Flysystem and updates to Laravel but can't seem to figure out the right way to work with the updates and get access to the underlying Aws\S3\S3Client.
My larger project is using a forked laravel-uppy-s3-multipart-upload library, which can be seen here:
https://github.com/kerkness/laravel-uppy-s3-multipart-upload/tree/laravel9
This was discussed in this Flysystem AWS adapter github issue:
https://github.com/thephpleague/flysystem-aws-s3-v3/issues/284
A method is being added in Laravel, and will be released next Tuesday (February 22, 2022):
https://github.com/laravel/framework/pull/41079
Workaround
The Laravel FilesystemAdapter base class is macroable, which means you could do this in your AppServiceProvider:
Illuminate\Filesystem\AwsS3V3Adapter::macro('getClient', fn() => $this->client);
Now you can call...
Storage::disk('s3')->getClient();
... and you will have an instance of the S3Client. (I love macros!)
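For reference, a minimal sketch of registering the macro in a service provider (the class body is illustrative; the macro closure is bound to the adapter instance, which is how $this->client reaches the underlying Aws\S3\S3Client):
// app/Providers/AppServiceProvider.php
use Illuminate\Filesystem\AwsS3V3Adapter;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        AwsS3V3Adapter::macro('getClient', fn () => $this->client);
    }
}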
You can remove this macro once next Tuesday's release is available.
How do I get / fetch an object (image, PDF, video) from S3 in Laravel 9, or display files from S3 in Laravel?
Answer:
There are only three steps to fetch any kind of attachment (image, PDF, video) from S3 in Laravel 9.
Go to the project directory.
Step #1: composer require league/flysystem-aws-s3-v3
Step #2: Go to .env and write this:
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
AWS_DEFAULT_REGION=your_aws_region
AWS_BUCKET=your_bucket_name
AWS_USE_PATH_STYLE_ENDPOINT=false
Step #3: Go to any controller and write this in the function:
$source = Storage::disk('s3')->temporaryUrl($item->path, now()->addMinutes(30));
Then you can pass the $source variable wherever you want.
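For example, a minimal controller sketch (the Item model and its path attribute are hypothetical stand-ins for wherever you store the S3 object key):
use Illuminate\Support\Facades\Storage;

public function show($id)
{
    // Hypothetical model whose 'path' column holds the S3 object key
    $item = Item::findOrFail($id);

    // Signed URL valid for 30 minutes
    $source = Storage::disk('s3')->temporaryUrl($item->path, now()->addMinutes(30));

    return view('items.show', ['source' => $source]);
}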
Thanks
Jahanzaib
So I'm running CakePHP 4 on an EC2 instance with AWS ES 7, and I've set up the ElasticSearch plugin in CakePHP.
composer require cakephp/elastic-search "^3.0"
I've added the elastic datasource connection in config/app.php
'elastic' => [
    'className' => 'Cake\ElasticSearch\Datasource\Connection',
    'driver' => 'Cake\ElasticSearch\Datasource\Connection',
    'host' => 'search-DOMAIN.REGION.es.amazonaws.com',
    'port' => 443,
    'transport' => "AwsAuthV4",
    'aws_access_key_id' => "KEY",
    'aws_secret_access_key' => "SECRET",
    'aws_region' => "REGION",
    'ssl' => 'true',
],
... and I've activated the plugin:
use Cake\ElasticSearch\Plugin as ElasticSearchPlugin;

class Application extends BaseApplication
{
    public function bootstrap()
    {
        $this->addPlugin(ElasticSearchPlugin::class);
        // ...
    }
}
I've manually added one record to the index in ES via curl from the EC2 instance, so I know the communication between EC2 and ES works.
curl -XPUT -u 'KEY:SECRET' 'https://search-DOMAIN.REGION.es.amazonaws.com/movies?pretty' -d '{"director": "Burton, Tim", "genre": ["Comedy","Sci-Fi"], "year": 1996, "actor": ["Jack Nicholson","Pierce Brosnan","Sarah Jessica Parker"], "title": "Mars Attacks!"}' -H 'Content-Type: application/json'
I also managed to search for this record via curl without any problems.
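For example, a URI search along these lines (hypothetical query, same credentials as above) returns the document:
curl -XGET -u 'KEY:SECRET' 'https://search-DOMAIN.REGION.es.amazonaws.com/movies/_search?q=title:Mars&pretty'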
In AppController.php I tried this simple search just to see if the plugin works, and for the life of me I can't get it to work.
# /src/Controller/AppController.php
...
use Cake\ElasticSearch\IndexRegistry;

class AppController extends Controller
{
    public function initialize(): void
    {
        parent::initialize();
        $this->loadModel('movies', 'Elastic');
        $query = $this->movies->find('all');
        $results = $query->toArray();
    }
}
I'm getting the following error:
Client error: POST https://search-DOMAIN.REGION.es.amazonaws.com/movies/movies/_search resulted in a 403 Forbidden response: {"message":"The security token included in the request is invalid."}
Elastica\Exception\Connection\GuzzleException
It seems like the plugin adds the index name twice for some reason. I looked everywhere for a setting that I might have missed. If I copy the above URL, remove the duplicate index name, and paste it into a browser, it works fine:
https://search-DOMAIN.REGION.es.amazonaws.com/movies/_search
Am I missing something here?
I've even tried this method, and I get the same problem with duplicated index values in the URL.
$Movies = IndexRegistry::get('movies');
$query = $Movies->find('all');
$results = $query->toArray();
I've tried a new/clean CakePHP instance and I get the same problem. Is there something wrong with the plugin? Is there a better approach to communicating with ES via CakePHP?
I'm not familiar with the plugin and Elasticsearch, but as far as I understand, one of the movies is the index name and the other is the type name, where the type name, at least according to the documentation, should be singular, i.e. the path would instead be expected to look like:
/movies/movie/_search
Furthermore, Index classes assume that the type mapping has the singular name of the index. For example the articles index has a type mapping of article.
https://book.cakephp.org/elasticsearch/3/en/3-0-upgrade-guide.html#types-renamed-to-indexes
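If your plugin version still derives the type from the index name, a hedged sketch of overriding it (assuming the plugin's Index class exposes a getType() method; verify against the API of your installed version):
# /src/Model/Index/MoviesIndex.php
namespace App\Model\Index;

use Cake\ElasticSearch\Index;

class MoviesIndex extends Index
{
    // Assumption: getType() controls the type segment of the request path
    public function getType()
    {
        return 'movie';
    }
}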
Whether that would be the correct path with respect to what the Elasticsearch version in use actually supports might be a different question.
You may want to open an issue over at GitHub.
I created a PHP script to retrieve some data from my Google Cloud Platform account. Here is what I did:
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

putenv('GOOGLE_APPLICATION_CREDENTIALS=key.json');

$projectId = 'xxxxx';
$datasetId = 'xxxxxx';
$table = 'xxxxx';

$bigQuery = new BigQueryClient([
    'projectId' => $projectId
]);
// etc...
Everything works fine on my local computer (WAMP), but when I migrate my script to my company's production environment, there is a problem:
Fatal error: Uncaught exception 'Google\Cloud\Core\Exception\ServiceException' with message 'cURL error 6: Couldn't resolve host 'www.googleapis.com'
In fact I was expecting this message, because every time I use cURL I need to set our company proxy info:
<?php
curl_setopt($curl, CURLOPT_HTTPPROXYTUNNEL, true);
curl_setopt($curl, CURLOPT_PROXY, 'xxx.xxx.xxx.xxx');
By the way, I'm 100% sure that googleapis.com is white-listed by our proxies... but how do I do this with the BigQueryClient? I searched the official documentation and found no way to use a proxy.
I would try blindly passing these options to one of the connection builder classes, hoping one picks them up:
'restOptions' => [
    'proxy' => 'xxx.xxx.xxx.xxx'
]
On the other hand, if you have a trace log, you could see whether Guzzle or something else is used. Consider opening an issue at: https://github.com/GoogleCloudPlatform/google-cloud-php/issues
BigQuery uses www.googleapis.com as its endpoint. The command-line tools have global flags to specify a proxy address, port, username, and password; for the client libraries, however, you'll need to verify access with your infrastructure team.
Use PHP's putenv() to set the proxy before using BigQueryClient.
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;
use GuzzleHttp\Client;
use Psr\Http\Message\RequestInterface;

putenv('HTTPS_PROXY=192.168.1.1:8080');

$guzzleClient = new Client();

$config = [
    'projectId' => 'xxx',
    'keyFilePath' => 'key_path',
    'restOptions' => [
        'proxy' => 'xxx.xxx.xxx.xxx:xx'
    ],
    'authHttpHandler' => function (RequestInterface $request, array $options = []) use ($guzzleClient) {
        return $guzzleClient->send(
            $request,
            $options + [
                'proxy' => 'xxx.xxx.xxx.xxx:xx'
            ]
        );
    },
];

$bigQueryClient = new BigQueryClient($config);
This works for me with the PHP library google/cloud-bigquery version 1.8.0.
Simple question, but I cannot get it to work.
I created an IAM Role for EC2 with full access to CloudWatch.
I launched a new EC2 instance with this IAM Role attached.
I wrote a simple PHP application on this EC2 instance which tries to publish metrics to CloudWatch.
I am getting this error in nginx logs:
2017/08/23 11:44:06 [error] 32142#32142: *5 FastCGI sent in stderr: "PHP message: PHP Fatal error: Uncaught Aws\Exception\CredentialsException: Cannot read credentials from /var/www/.aws/credentials in /var/www/app/vendor/aws/aws-sdk-php/src/Credentials/CredentialProvider.php:394
From that same EC2 instance, the command:
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/<role-attached-to-ec2-instance>
returns 200 OK with the Access Key and Secret in the response.
This is my PHP code that tries to write CloudWatch metrics:
<?php
require 'vendor/autoload.php';

use Aws\CloudWatch\CloudWatchClient;
use Aws\Exception\AwsException;

$count = $_GET["count"];
publishMetric($count);

function publishMetric($count) {
    $client = new CloudWatchClient([
        'profile' => 'default',
        'region' => 'us-east-1',
        'version' => '2010-08-01'
    ]);

    try {
        $result = $client->putMetricData(array(
            'Namespace' => 'com.mynamespace',
            'MetricData' => array(
                array(
                    'MetricName' => 'Count',
                    // Timestamp: mixed type: string (date format) | int (unix timestamp) | \DateTime
                    'Timestamp' => time(),
                    'Value' => $count,
                    'Unit' => 'Number'
                )
            )
        ));
        var_dump($result);
        echo 'Done publishing metrics';
    } catch (AwsException $e) {
        // output error message if it fails
        error_log($e->getMessage());
        echo 'Failure to publish metrics';
    }
}
Any idea what is missing in this setup?
I know this is late. I had the same issue and resolved it by removing the 'profile' => 'default' line when initializing the client. If you do not provide credentials or a profile, the SDK will try to retrieve instance-profile credentials from the metadata server.
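For illustration, the initialization would then look like this (same region and version as in the question, just without the 'profile' key):
$client = new CloudWatchClient([
    'region' => 'us-east-1',
    'version' => '2010-08-01'
]);
// With no explicit credentials or profile, the SDK walks its default provider
// chain, which ends at the instance-profile credentials served by the
// metadata endpoint (http://169.254.169.254).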
Authentication of an EC2 instance while accessing other AWS services can be done in multiple ways:
Assigning a role to the EC2 instance. Used when you have to give an "EC2 instance" a permission.
Not assigning a role, but using an access key which has all the required permissions. Used when you give permission to a "User".
These are independent authentication mechanisms. If you have already assigned a role to your server, you do not have to write any code in your application (CredentialProvider.php) to authenticate.
Your current code can also work by creating a file /var/www/.aws/credentials, which would look something like this:
[default]
aws_access_key_id = AKIAIB6FA52IMGLREIIB
aws_secret_access_key = NQjJWKT+WZOUOrQ2Pr/WcRey3PnQFaGMJ8nRoaVU
I am trying to call a courier API with a specified method. I am able to connect to the API using SoapClient, but I'm getting the following error:
Object reference not set to an instance of an object
I am using the following code and data:
$proxy = new SoapClient($my_api_url);

$params = array(
    "UserName" => '****',
    "Password" => '****',
    "OrderNumber" => '41111',
    "ClientName" => 'My Name',
    "ContactNumber1" => '123456789',
    "EmailAddress" => 'testapi#rohitdhiman.in',
    "ShippingAddress1" => 'site 15'
);

$result = $proxy->BayOneAddOrder($params);
print_r($result);
If it works using SoapUI, then you can try a PHP tool like https://providr.io, as it will give you the exact PHP request using an OOP approach.
If you do not want to use the online tool, you can generate your own PHP package from your WSDL using PackageGenerator, so you can still send requests easily using an OOP approach.
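As a quick first step, you could also enable tracing on the SoapClient and compare the XML you send with what SoapUI sends; an "Object reference not set to an instance of an object" fault from a .NET service often means the parameter structure doesn't match what the WSDL expects. A debugging sketch using standard SoapClient options:
$proxy = new SoapClient($my_api_url, ['trace' => true, 'exceptions' => true]);

try {
    $result = $proxy->BayOneAddOrder($params);
    print_r($result);
} catch (SoapFault $e) {
    // Inspect the raw XML that was actually sent and received
    echo $proxy->__getLastRequest();
    echo $proxy->__getLastResponse();
}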