I am trying to call the Google CSE API from my localhost Docker container, and apparently the calls fail because of that.
I have set CURLOPT_SSL_VERIFYPEER to false to disable SSL certificate verification, but with no success.
If anyone has any thought on this, help would be appreciated.
My code:
// Create the google api client
$googleClient = new Google_Client();
// Set the google developer api key
$googleClient->setApplicationName('GoogleApi_Search');
$googleClient->setDeveloperKey(self::GOOGLE_SEARCH_API_KEY);
// For development purposes: skip peer verification and send a Referer header
$guzzleConfig = [
    'curl' => [CURLOPT_SSL_VERIFYPEER => false],
    'headers' => ['Referer' => 'localhost:8080'],
];
$guzzleClient = new Client($guzzleConfig); // GuzzleHttp\Client
$googleClient->setHttpClient($guzzleClient);
// The google custom search service client
$this->googleService = new Google_Service_Customsearch($googleClient);
// Define the search parameters
$this->searchParams = [
'cx' => self::GOOGLE_SEARCH_ENGINE_ID, // Custom search engine identifier
'gl' => 'en', // Location of results
'lr' => 'lang_en', // Language of results
'num' => 10, // Number of results (max. 10)
'start' => 0, // The current index (max. 10)
];
I solved my issue by setting the start parameter to 1 instead of 0. Apparently, setting it to 0 triggers a fatal error on the server side, which produces the 400 Invalid Value error and no other information.
Strange, but it works.
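For reference, the working parameter array only differs in start, which the API treats as 1-based:
// Corrected search parameters: 'start' must be at least 1
$this->searchParams = [
    'cx' => self::GOOGLE_SEARCH_ENGINE_ID, // Custom search engine identifier
    'gl' => 'en', // Location of results
    'lr' => 'lang_en', // Language of results
    'num' => 10, // Number of results (max. 10)
    'start' => 1, // Index of the first result (1-based)
];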
Related
I am trying to use the Google Calendar API from PHP running server side.
I created a service account and use its credentials (via the JSON file). I am NOT using G Suite.
I am now able to list the events of my calendar, but I get a Forbidden error when I try to create a new event.
I shared my calendar with the service account's email address and gave it admin rights for the calendar. I am also using the scope Google_Service_Calendar::CALENDAR. I don't know what I am doing wrong; I just can't get it working.
The exception: 403 - Forbidden
I have no idea how to debug this problem and how to proceed further.
Can you help me?
Edit: here is my calling code:
putenv('GOOGLE_APPLICATION_CREDENTIALS=../google-service-account.json');
$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->setScopes([Google_Service_Calendar::CALENDAR]);
$client->setRedirectUri('urn:ietf:wg:oauth:2.0:oob');
$service = new Google_Service_Calendar($client);
$event = new Google_Service_Calendar_Event([
'summary' => $name,
'location' => $location,
'description' => $desc,
'start' => [
'dateTime' => $start,
'timeZone' => $timezone,
],
'endTimeUnspecified' => true,
]);
$calendar = $service->calendars->get($calendarId); // works
$settings = $service->settings->listSettings(); // works
$list = $service->acl->listAcl($calendarId); // works
$event = $service->events->insert($calendarId, $event); // 403 Forbidden
The first three service calls ($calendar = ..., $settings = ... and $list = ...) work fine, and in the ACL I can also see that the email address of my service account has all rights (role: owner). But the last line, $event = ..., gives the Forbidden error...
'endTimeUnspecified' => true,
is causing your problem (it's Google's fault, but you can't tell them that).
If you change it to
'end' => [
'dateTime' => $start2,
'timeZone' => $timezone,
],
it will work. I know it is not what you want, and maybe you can figure out a way to adjust $start2 to give you the end time you want, but endTimeUnspecified is what is causing it to say 403 Forbidden.
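Putting that back into the question's code, the insert that works looks roughly like this ($start2 is just the placeholder end time from the snippet above):
// Same event as before, but with an explicit 'end' instead of 'endTimeUnspecified'
$event = new Google_Service_Calendar_Event([
    'summary' => $name,
    'location' => $location,
    'description' => $desc,
    'start' => [
        'dateTime' => $start,
        'timeZone' => $timezone,
    ],
    'end' => [
        'dateTime' => $start2,
        'timeZone' => $timezone,
    ],
]);
$event = $service->events->insert($calendarId, $event);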
I am having difficulties authorizing a PHP SoapClient with MS Dynamics Great Plains. I can connect through SoapUI, but it only connects successfully on the third attempt, and the auth token progressively gets longer. See the pastebin link below.
I made use of the following package (https://github.com/mlabrum/NTLMSoap) to set up an NTLM stream, but it doesn't seem to be sending a correct token: the token is shorter than what is sent through SoapUI.
$wsdlUrl = 'http://example.org:48620/Metadata/Legacy/Full/DynamicsGP.wsdl';
$options = [
'ntlm_username' => 'Domain\username',
'ntlm_password' => 'password'
];
$soapClient = new \NTLMSoap\Client($wsdlUrl, $options);
$params = array(
    'criteria' => array(
        'ModifiedDate' => array(
            'GreaterThan' => '2016-04-18',
            'LessThan' => '2016-04-19'
        )
    ),
    'context' => array(
        'OrganizationKey' => array(
            'type' => 'CompanyKey',
            'Id' => $companyKeyId // the actual key value is omitted in the original post
        )
    )
);
$soapClient->__setLocation('http://example.org:48620/DynamicsGPWebServices/DynamicsGPService.asmx');
$response = $soapClient->GetPurchaseOrderList(array($params));
I had to use __setLocation() because the client was being forwarded to http://localmachine:48620/DynamicsGPWebServices/DynamicsGPService.asmx
I have been trying to get Charles Web Proxy working so I can see the actual request/response, but it keeps failing on me.
This is the SoapUI output. http://pastebin.com/7zg4E3qD
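One thing that might help narrow this down (my suggestion, not something from the original post): check whether plain cURL can complete the NTLM handshake against the endpoint at all, independent of the SOAP layer, using the URL and credentials from the question.
// Transport-level check: can cURL alone negotiate NTLM against the service?
$ch = curl_init('http://example.org:48620/DynamicsGPWebServices/DynamicsGPService.asmx');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPAUTH => CURLAUTH_NTLM,
    CURLOPT_USERPWD => 'Domain\username:password',
    CURLOPT_VERBOSE => true, // dumps the request headers, including the NTLM tokens, to stderr
]);
curl_exec($ch);
echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
curl_close($ch);
If this also fails, the problem is with the NTLM negotiation itself rather than with the NTLMSoap package.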
I am running into an error attempting to authenticate and create a new Route53 record set using Amazon's PHP SDK and changeResourceRecordSets. Here's what I have attempted so far:
Installed the AWS SDK for Laravel
Used Amazon's IAM to create a new user and group and applied the FullAdministrator policy to the group.
Stored the new user credentials and other AWS variables in my .env file like so:
Code below:
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=XXYYZZ
AWS_SECRET_ACCESS_KEY=112233
AWS_ZONE_ID=UHUHUHUH
Confirmed that my Laravel environment is configured correctly and my controller works by testing the following:
Code below:
$s3 = AWS::createClient('s3');
$s3->putObject(array(
'Bucket' => 'mydomain.com',
'Key' => 'new.pdf',
'SourceFile' => storage_path('app/old.pdf'),
));
Once I confirmed that my credentials worked against S3, I closely followed this SO answer and code to create a new Route53 client and create a new Record Set in my Route53 Hosted Zone. Here's my slightly modified code:
Code below:
$client = AWS::createClient('Route53');
//dd($client); $client object returned, this works
$result = $client->changeResourceRecordSets(array(
    'HostedZoneId' => env('AWS_ZONE_ID'),
    'ChangeBatch' => array(
        'Comment' => 'just testing',
        'Changes' => array(
            array(
                'Action' => 'CREATE',
                'ResourceRecordSet' => array(
                    'Name' => 'test.mydomain.com.',
                    'Type' => 'A',
                    'TTL' => 600,
                    'ResourceRecords' => array(
                        array(
                            'Value' => '52.52.52.52', // my AWS IP address
                        ),
                    ),
                ),
            ),
        ),
    ),
));
The resulting error is as follows:
Client error: POST
https://route53.amazonaws.com/2013-04-01/hostedzone/MYZONE/rrset/
resulted in a 403 Forbidden response:
Sender
And more from the error...
SignatureDoesNotMatch (client): Signature expired: 20160225T215502Z is now earlier than 20160225T220842Z (20160225T221342Z - 5 min.)
Any suggestions are appreciated.
I should have added that I'm running in a Homestead/VirtualBox environment, and the real problem was that the clock on my VM was woefully off.
Simply running sudo ntpdate -s time.nist.gov fixed the problem.
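If you want a quick way to spot this kind of clock skew from PHP before poking at the VM (my addition, not part of the original fix), you can compare the local clock against the Date header returned by the Route 53 endpoint:
// Rough skew check: compare local time with the Date header from the AWS endpoint
$headers = get_headers('https://route53.amazonaws.com', 1);
$skew = time() - strtotime($headers['Date']);
echo "Local clock is off by {$skew} seconds\n"; // request signing rejects skew beyond about 5 minutes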
I am trying to use thujohn's twitter-laravel package to dynamically set the access tokens and then get the user's home timeline. It says that the method doesn't exist, but I know for sure the method exists in the Twitter class. It is giving me this error.
Error:
Call to undefined method
Thujohn\Twitter\TwitterFacade::getHomeTimeline()
My code:
$config = array(
'ACCESS_TOKEN' => 'xxx',
'ACCESS_TOKEN_SECRET' => 'xxxx'
);
$tw = new Twitter($config);
return $tw->getHomeTimeline(array('count' => 20, 'format' => 'json'));
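A quick sanity check that may help here (my suggestion, not from the original post): confirm which class is actually being instantiated and whether the method is visible on it, since the error message complains about the facade rather than the Twitter class itself.
// Debugging sketch: inspect the object the call is actually made on
var_dump(get_class($tw)); // which Twitter class ended up being instantiated?
var_dump(method_exists($tw, 'getHomeTimeline')); // false here would explain "Call to undefined method"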
Just wondering if anyone knows why Amazon's AWS would be telling me "The X509 Certificate you provided does not exist in our records."
Here's the code I'm using...
$sqs = new AmazonSQS();
$queue_url = 'my_url';
$options = array(
'MaxNumberOfMessages' => 10,
);
$resp = $sqs->receive_message($queue_url, $options);
print_r($resp);
Here's the response I get...
[Type] => Sender
[Code] => InvalidClientTokenId
[Message] => The X509 Certificate you provided does not exist in our records.
[Detail] => CFSimpleXML Object
Here's the CFCredentials array I'm using inside config.inc.php...
'#default' => array(
'key' => 'my-key',
'secret' => 'my-secret',
'default_cache_config' => 'cache',
'certificate_authority' => FALSE
)
In order to use Amazon SQS, you have to specifically sign up for Amazon SQS; it looks like you have not signed up. You can do so by visiting this page and clicking the button labeled "Sign up for Amazon SQS".
The reason I was getting this error was that I am using MAMP PRO, which ships a cURL build without SSL support.
To get around this so I could test from my local machine, I used the code below. Note the "#" on the second line; I used it to suppress the warning given out by the disable_ssl() method.
$s3 = new AmazonS3();
#$s3->disable_ssl();
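Before disabling SSL, it may be worth confirming that diagnosis; a small sketch (my addition, not part of the original answer) that reports whether the local cURL build actually has SSL support:
// Report whether this PHP's cURL build was compiled with SSL support
$info = curl_version();
echo 'cURL ' . $info['version'] . ': ';
echo ($info['features'] & CURL_VERSION_SSL) ? "built with SSL support\n" : "built WITHOUT SSL support\n";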