PHP: waiting for a response to change status

I'm trying to implement the SkyScanner API...
I need to call:
"http://partners.api.skyscanner.net/apiservices/pricing/uk1/v1.0/
{SessionKey}?apiKey={apiKey}
&pageIndex=0
&pageSize=10"
so I write:
$res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
$res1 = json_decode($res1->getBody()->getContents(), true);
$res1 = collect($res1);
and I need to wait for the response to change Status from UpdatesPending to UpdatesComplete
API docs:
Keep requesting page 0 until you get UpdatesComplete with pageIndex=0
at half a second to one second interval. Once you get UpdatesComplete
you may request any page and page size.
While the status is UPDATESPENDING, you should request only page 0
because the contents of each page are liable to change until updates
are complete.
How to wait for response to change status...
I try:
while ($res1['Status'] == 'UpdatesPending') {
    echo 'waiting';
}
dd($res1);
but the loop never ends...
How to wait for a response to change status?

You can echo 'waiting' before your code, then echo 'completed' on the following line once it has completed:
echo 'waiting...';
$res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
$res1 = json_decode($res1->getBody()->getContents(), true);
$res1 = collect($res1);
echo 'completed';

You need to create a do/while loop to check the status and break out of it when the status changes:
do {
    $res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
    $res1 = json_decode($res1->getBody()->getContents(), true);
    $res1 = collect($res1);
    if ($res1['Status'] == 'UpdatesPending') echo "waiting...<br />\n";
    sleep(5); // Make it sleep 5 seconds so as to not spam the server
} while ($res1['Status'] == 'UpdatesPending');
echo "Done!";
Note that you won't get any actual feedback until the whole process has finished.
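If you do want the "waiting..." lines to appear while the loop is still running, you can try flushing PHP's output buffers after each echo. A sketch (whether the browser actually renders it mid-request depends on your server's buffering):
if ($res1['Status'] == 'UpdatesPending') {
    echo "waiting...<br />\n";
    // push any buffered output to the client immediately
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}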

If you use Guzzle as your PHP HTTP client, you just need to define a retry decider and a retry delay:
// These imports are needed at the top of the file:
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Psr7\Response;
use GuzzleHttp\Exception\RequestException;

public function retryDecider()
{
    return function ($retries, Request $request, Response $response = null, RequestException $exception = null) {
        // maximum retries is three times
        if ($retries >= 3) {
            return false;
        }
        if ($response) {
            $content = $response->getBody()->getContents();
            // rewind the body so later consumers can still read it
            $response->getBody()->rewind();
            $res1 = json_decode($content, true);
            $res1 = collect($res1);
            // maybe you need to change this one to 'substr' or regex or something else
            if ($res1['Status'] == 'UpdatesComplete') {
                return false;
            }
        }
        return true; // we will request again
    };
}

public function retryDelay()
{
    return function ($numberOfRetries) {
        return 1000; // delay in milliseconds, i.e. 1 second
    };
}
and use that decider and delay function
// Imports needed at the top of the file:
use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Handler\CurlHandler;
use GuzzleHttp\Middleware;

$handlerStack = HandlerStack::create(new CurlHandler());
$handlerStack->push(Middleware::retry($this->retryDecider(), $this->retryDelay()));
$client = new Client(array('handler' => $handlerStack));
and get your response
$url = 'http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10';
$response = $client->request('GET', $url)->getBody()->getContents();
You can decode the last $response to decide what you want to do.
With this kind of approach, you can decide which response is valid and retry the request as many times as you like.
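For example, a minimal sketch of checking the final decoded status (reusing the collect() helper from the question):
$res1 = collect(json_decode($response, true));

if ($res1['Status'] === 'UpdatesComplete') {
    // updates are complete: any page and page size may now be requested
    dd($res1);
}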

Related

Why is my variable value set to false even after it's set to true?

I am trying to implement Google reCAPTCHA. The issue is that even when ($Return->success == true && $Return->score >= 0.5) evaluates to true, so I should be getting $isOK = true, I get $isOK = false.
I have tried a lot, but I am still getting this value set to false. Is there something I am doing wrong here?
$captchaVerified = false;
if (isset($_POST['tokenVal'])) {
    function getCaptcha($tokenV, $secretKey)
    {
        $Response = file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret=".urlencode($secretKey)."&response=".urlencode($tokenV));
        $Return = json_decode($Response);
        /* global $captchaVerified; */
        if ($Return->success == true && $Return->score >= 0.5) {
            $captchaVerified = true;
        } else {
            $captchaVerified = false;
        }
        return $captchaVerified;
    }
    $isOK = getCaptcha($_POST['tokenVal'], $secretKey);
}
The given code is performing a GET request to the API server, but the API server expects a POST request:
$Response = file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret=".urlencode($secretKey)."&response=".urlencode($tokenV));
To make a POST request to the server, you can follow these steps.
The overall reference has been taken from verify recaptcha
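For reference, a minimal sketch of the POST variant using file_get_contents with a stream context (variable names follow the question's snippet):
$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query([
            'secret'   => $secretKey,
            'response' => $tokenV,
        ]),
    ],
]);

$Response = file_get_contents('https://www.google.com/recaptcha/api/siteverify', false, $context);
$Return = json_decode($Response);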

Wordpress - file_get_contents loop turns down homepage for a while - alternative?

I have the following problem:
I pass an array of at least 700 names into a function and get back an array with all the information about their releases from the last 10 days.
The function fetches a JSON response via the iTunes API, which I want to use for further analysis.
Problem:
- while executing the function, it takes about 3 minutes to finish.
- the homepage is not reachable for others while I execute it:
(Error on server: (70007) The timeout specified has expired: AH01075: Error dispatching request to : (polling)) --> running out of memory?
Questions:
- how can I code this function more efficiently?
- how can I code this function without using too much memory? Should I use unset(...)?
Code:
function getreleases($artists) {
    # print_r($artists);
    $releases = array();
    foreach ($artists as $artist) {
        $artist = str_replace(" ", "%20", $artist);
        $ituneslink = "https://itunes.apple.com/search?term=".$artist."&media=music&entity=album&limit=2&country=DE";
        $itunesstring = file_get_contents($ituneslink);
        $itunesstring = json_decode($itunesstring);
        /* Results are decoded from JSON into an object */
        if (($itunesstring->resultCount) > 0) {
            foreach ($itunesstring->results as $value) {
                if ((date_diff(date_create('now'), date_create($value->releaseDate))->format('%a')) < 10) {
                    # echo '<br>Gefunden: ' . $artist;
                    $releases[] = $value;
                }
            }
        } else {
            echo '<br><span style="color:red">Nicht gefunden bei iTunes: ' . $artist . '</span>';
        }
        unset($ituneslink);
        unset($itunesstring);
        unset($itunesstring2);
    }
    return $releases;
}
The problem lies in the fact that every time that function is executed, your server needs to make 700+ API Calls, parse the data, and work your logic on it.
One potential solution is to use Wordpress's transients to 'cache' the value (or perhaps even the whole output), this way, it won't have to execute that strenuous function on every connection, it'll just pull the data from the transient. You can set an expiry date for a transient, so you can have it refetch the information every X days/hours.
Take a look at this article from CSS Tricks that walks you through a simple example using transients.
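A minimal sketch of that pattern (the transient key and the 12-hour expiry are placeholders):
$releases = get_transient('itunes_releases');
if (false === $releases) {
    // only run the expensive 700-request function when the cache is empty or expired
    $releases = getreleases($artists);
    set_transient('itunes_releases', $releases, 12 * HOUR_IN_SECONDS);
}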
But the problem is not fixed. While updating the data and getting the 700 items from the iTunes API inside the loop, the homepage runs out of memory and is not reachable from my computer. I tried a "timeout" or "sleep" so that the script only fetches every few seconds, but it doesn't change anything.
One improvement I made: I changed "foreach" to "for" for memory reasons, so variables are not being copied. Are there more problems?
I've got two loops in there. Maybe $itunesstring is being copied?
if (!function_exists('get_contents')) {
    function get_contents(&$url) {
        // if cURL is available, use it...
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        $cache = curl_exec($ch);
        curl_close($ch);
        return $cache;
    }
}

function getfromituneslink(&$link, &$name) {
    $name = str_replace("'", "", $name);
    $name = substr($name, 0, 14);
    $result = get_transient("getlink_itunes_{$name}");
    if (false === $result) {
        $result = get_contents($link);
        set_transient("getlink_itunes_{$name}", $result, 12 * HOUR_IN_SECONDS);
    }
    return $result;
}

function getreleases(&$artists) {
    $releases = array();
    while (0 < count($artists)) {
        $itunesstring = array();
        $artist = array_shift($artists);
        $artist = str_replace(" ", "%20", $artist);
        $ituneslink = "https://itunes.apple.com/search?term=".$artist."&media=music&entity=album&limit=2&country=DE";
        $itunesstring = getfromituneslink($ituneslink, $artist);
        unset($ituneslink);
        $itunesstring = json_decode($itunesstring);
        if (($itunesstring->resultCount) > 0) {
            # for ($i = 0; $i < (count($itunesstring->results)) - 1; ++$i)
            while (0 < count(($itunesstring->results))) {
                $value = array_shift($itunesstring->results);
                # $value = &$itunesstring[results][$i];
                # foreach ($itunesstring->results as $value)
                if ((date_diff(date_create('now'), date_create($value->releaseDate))->format('%a')) < 6) {
                    $releases[] = array($value->artistName, $value->collectionName, $value->releaseDate, str_replace("?uo=4", "", $value->collectionViewUrl));
                    unset($value);
                }
            }
        } else {
            echo '<br><span style="color:red">Nicht gefunden bei iTunes: ' . str_replace("%20", " ", $artist) . '</span>';
        }
        unset($ituneslink);
        unset($itunesstring);
    }
    return $releases;
}
I don't know where the problem is. :-(
Is there any other possibility to let the function fetch the information one piece at a time?

Best approach to process batch API requests in Laravel 5

EDIT: I have already fixed the error Creating default object from empty value. My question now is just about suggestions on how to make this process better.
I am making a tool that sends batch API requests to a certain URL. I am using https://github.com/stil/curl-easy for the batch API requests. The API validates the phone numbers I send and returns a result. It supports a maximum of 20 requests per second; suppose I have 50,000 API requests, what is the best and fastest way to process them?
The current process is: I query my database for the number of records I have, which is the number of phone numbers. Then I loop over that count. Inside a single iteration I query 13 records from the database, pass them to the queue, and process them; when the processing is finished I query the database again and update the fields for each phone number based on the API response. My problem is that I think I have a logic error in my loop: when I run this, the processing stops for some time and gives me:
Creating default object from empty value
I guess that is because of my loop. Here's my code.
public function getParallelApi()
{
    $numItems = DB::connection('mysql')->select("SELECT COUNT(ID) AS cnt FROM phones WHERE status IS NULL");
    for ($y = 0; $y < $numItems[0]->cnt; $y++) {
        $queue = new \cURL\RequestsQueue;
        // Set default options for all requests in queue
        $queue->getDefaultOptions()
            ->set(CURLOPT_TIMEOUT, 5)
            ->set(CURLOPT_RETURNTRANSFER, true)
            ->set(CURLOPT_SSL_VERIFYPEER, false);
        // This is the function that is called for every completed request
        $queue->addListener('complete', function (\cURL\Event $event) {
            $response = $event->response;
            $json = $response->getContent(); // Returns content of response
            $feed = json_decode($json, true);
            $phone_number = $feed['msisdn'];
            if (isset($phone_number)) {
                $status = ""; $remarks = ""; $network = ""; $country = ""; $country_prefix = "";
                if (isset($feed['status'])) {
                    $status = $feed['status'];
                }
                if (isset($feed['network'])) {
                    $network = $feed['network'];
                }
                if (isset($feed['country'])) {
                    $country = $feed['country'];
                }
                if (isset($feed['country_prefix'])) {
                    $country_prefix = $feed['country_prefix'];
                }
                if (isset($feed['remark'])) {
                    $remarks = $feed['remark'];
                }
                $update = Phone::find($phone_number);
                $update->network = $network;
                $update->country = $country;
                $update->country_prefix = $country_prefix;
                $update->status = $status;
                $update->remarks = $remarks;
                $update->save();
            }
        });
        // Get 13 records
        $phone = DB::connection('mysql')->select("SELECT * FROM phones WHERE status IS NULL LIMIT 13");
        foreach ($phone as $key => $value) {
            $request = new \cURL\Request($this->createCurlUrl($value->msisdn));
            // var_dump($this->createCurlUrl($value->msisdn)); exit;
            // Add request to queue
            $queue->attach($request);
        }
        // Process the queue
        while ($queue->socketPerform()) {
            $queue->socketSelect();
        }
        sleep(2);
        $y += 13; // Move to the next 13 records
    }
    Session::flash('alert-info', 'Data Updated Successfully!');
    return Redirect::to('upload');
}
Since the maximum is 20 requests per second, I'm only doing 13 requests per second just to be sure I won't clog their server. I am adding sleep(2); to pause a bit so that the queue is fully processed before moving on.
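As a side note, the counter arithmetic in the loop is fragile: $y++ together with $y += 13 advances $y by 14 per pass. A sketch of an alternative (keeping the queue-building code as above) is to loop until the SELECT returns no unprocessed rows:
do {
    // fetch the next batch of unprocessed numbers
    $phones = DB::connection('mysql')
        ->select("SELECT * FROM phones WHERE status IS NULL LIMIT 13");

    if (count($phones) === 0) {
        break; // nothing left to process
    }

    // ... build the queue, attach one request per row, socketPerform/socketSelect ...

    sleep(2); // stay under the 20 requests/second limit
} while (true);
// Note: rows whose callback never sets `status` would be re-selected forever;
// a production version should guard against that.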

How to perform multiple Guzzle requests at the same time?

I can perform single requests using Guzzle and I'm very pleased with Guzzle's performance so far. However, I read in the Guzzle API something about MultiCurl and batching.
Could someone explain to me how to make multiple requests at the same time? Async if possible. I don't know if that is what they mean by MultiCurl. Sync would also not be a problem. I just want to do multiple requests at the same time, or very close together (within a short space of time).
From the docs:
http://guzzle3.readthedocs.org/http-client/client.html#sending-requests-in-parallel
For an easy to use solution that returns a hash of request objects mapping to a response or error, see http://guzzle3.readthedocs.org/batching/batching.html#batching
Short example:
<?php
$client->send(array(
    $client->get('http://www.example.com/foo'),
    $client->get('http://www.example.com/baz'),
    $client->get('http://www.example.com/bar')
));
An update related to the new GuzzleHttp (guzzlehttp/guzzle):
Concurrent/parallel calls are now run through a few different methods, including promises; see Concurrent Requests.
The old way of passing an array of RequestInterfaces will not work anymore.
See the example here:
$newClient = new \GuzzleHttp\Client(['base_uri' => $base]);
foreach ($documents->documents as $doc) {
    $params = [
        'language' => 'eng',
        'text'     => $doc->summary,
        'apikey'   => $key
    ];
    $requestArr[$doc->reference] = $newClient->getAsync('/1/api/sync/analyze/v1?' . http_build_query($params));
}
$time_start = microtime(true);
$responses = \GuzzleHttp\Promise\unwrap($requestArr); // $newClient->send($requestArr);
$time_end = microtime(true);
$this->get('logger')->error(' NewsPerf Dev: took ' . ($time_end - $time_start));
Update:
As suggested in the comments and asked by @sankalp-tambe, you can also use a different approach, so that a set of concurrent requests with a failure still returns all the responses.
While the option suggested with Pool is feasible, I still prefer promises.
An example with promises is to use the settle and wait methods instead of unwrap.
The difference from the example above would be:
$responses = \GuzzleHttp\Promise\settle($requestArr)->wait();
I have created a full example below for reference on how to handle the $responses too.
require __DIR__ . '/vendor/autoload.php';

use GuzzleHttp\Client as GuzzleClient;
use GuzzleHttp\Promise as GuzzlePromise;

$client = new GuzzleClient(['timeout' => 12.0]); // see how I set a timeout
$requestPromises = [];
$sitesArray = SiteEntity::getAll(); // returns an array with objects that contain a domain
foreach ($sitesArray as $site) {
    $requestPromises[$site->getDomain()] = $client->getAsync('http://' . $site->getDomain());
}
$results = GuzzlePromise\settle($requestPromises)->wait();
foreach ($results as $domain => $result) {
    $site = $sitesArray[$domain];
    $this->logger->info('Crawler FetchHomePages: domain check ' . $domain);
    if ($result['state'] === 'fulfilled') {
        $response = $result['value'];
        if ($response->getStatusCode() == 200) {
            $site->setHtml($response->getBody());
        } else {
            $site->setHtml($response->getStatusCode());
        }
    } else if ($result['state'] === 'rejected') {
        // notice that if a call fails, Guzzle returns it as state 'rejected' with a reason
        $site->setHtml('ERR: ' . $result['reason']);
    } else {
        $site->setHtml('ERR: unknown exception ');
        $this->logger->err('Crawler FetchHomePages: unknown fetch fail domain: ' . $domain);
    }
    $this->entityManager->persist($site); // this is a call to Doctrine's entity manager
}
This example code was originally posted here.
Guzzle 6.0 has made sending multiple async requests very easy.
There are multiple ways to do it.
You can create the async requests and add the resultant promises to a single array, and get the result using the settle() method like this:
$promise1 = $client->getAsync('http://www.example.com/foo1');
$promise2 = $client->getAsync('http://www.example.com/foo2');
$promises = [$promise1, $promise2];
$results = GuzzleHttp\Promise\settle($promises)->wait();
You can now loop through these results and fetch the responses using GuzzleHttp\Promise\all or GuzzleHttp\Promise\each. Refer to this article for further details.
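As a quick illustration of the difference, using the two promises from above (not from the original answer): all() rejects the whole batch as soon as any request fails, while settle() always resolves with a state/value pair per request:
// all(): throws if any of the promises is rejected
$responses = \GuzzleHttp\Promise\all($promises)->wait();
echo $responses[0]->getStatusCode();

// settle(): resolves regardless; inspect each result's state instead
$results = \GuzzleHttp\Promise\settle($promises)->wait();
echo $results[0]['state']; // 'fulfilled' or 'rejected'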
In case you have an indeterminate number of requests to send (say 5 here), you can use GuzzleHttp\Pool::batch().
Here is an example:
// Imports needed at the top of the file:
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Exception\RequestException;

$client = new Client();

// Create the requests
$requests = function ($total) use ($client) {
    for ($i = 1; $i <= $total; $i++) {
        yield new Request('GET', 'http://www.example.com/foo' . $i);
    }
};

// Use Pool::batch()
$pool_batch = Pool::batch($client, $requests(5));
foreach ($pool_batch as $pool => $res) {
    if ($res instanceof RequestException) {
        // Do sth
        continue;
    }
    // Do sth
}
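If you need to cap how many of those requests are in flight at once, the Pool constructor itself takes a concurrency option. A short sketch (the callbacks are placeholders):
$pool = new Pool($client, $requests(100), [
    'concurrency' => 5, // at most 5 requests running at a time
    'fulfilled' => function ($response, $index) {
        // handle each successful response
    },
    'rejected' => function ($reason, $index) {
        // handle each failed request
    },
]);

// Initiate the transfers and wait for the pool to complete
$pool->promise()->wait();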

Waiting for an error from APNS with the enhanced format in PHP

We're using a modified version of easy-apns for sending our push messages. With the enhanced push message format, the Apple server responds if an error occurs and does nothing if everything goes well.
The problem is that we have to wait for an error a certain amount of time after each message has been sent. For example if we receive no response after 1 second, we assume everything went ok.
With 20000 push messages, this takes far too long. Is there any way I can listen for errors in a faster way? For example sending to 1000 devices and then listen for errors? What happens if the connection gets closed, can I still read the error response?
Ideal would be some kind of asynchronous writing and reading, but I think that's not possible.
Here's the corresponding code:
$fp = $this->connect();
$expiry = time() + 60*60;
// construct message
$msg = chr(1).pack("N", $batchid).pack("N", $expiry).pack("n", 32).pack('H*', $devicetoken).pack("n", strlen($payload)).$payload;
// send message to Apple
$fwrite = fwrite($fp, $msg);
if (!$fwrite) {
    // connection has been closed
    $this->disconnect();
    throw new Exception("Connection closed");
} else {
    // read response from Apple
    // Timeout. 1 million microseconds = 1 second
    $tv_sec = 1;
    $tv_usec = 0;
    $r = array($fp);
    $we = null; // Temporaries. "Only variables can be passed as reference."
    // PROBLEM: this method waits for $tv_sec seconds for a response
    $numChanged = stream_select($r, $we, $we, $tv_sec, $tv_usec);
    if ($numChanged === false) {
        throw new Exception("Failed selecting stream to read.");
    } elseif ($numChanged > 0) {
        $command = ord(fread($fp, 1));
        $status = ord(fread($fp, 1));
        $identifier = implode('', unpack("N", fread($fp, 4)));
        if ($status > 0) {
            // The socket has also been closed. Cause reopening in the loop outside.
            $this->disconnect();
            throw new MessageException("APNS responded with status $status: {$this->statusDesc[$status]} ($devicetoken).".microtime(), $status);
        } else {
            // unknown response, assume ok
        }
    } else {
        // no response, assume ok
    }
}
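One approach in the direction the question suggests (a sketch, not from the original post): write a batch of messages, then poll the socket with a zero timeout so the check returns immediately when Apple has sent nothing. Note that after an error APNS closes the connection and drops every frame after the failing identifier, so you still have to track identifiers and resend from there:
$batchSize = 1000;
$sent = 0;

foreach ($messages as $identifier => $msg) { // $messages is a hypothetical identifier => frame map
    fwrite($fp, $msg);
    $sent++;

    if ($sent % $batchSize === 0) {
        $r = array($fp);
        $we = null;
        // zero timeout: stream_select() returns immediately instead of blocking for 1 second
        if (stream_select($r, $we, $we, 0, 0) > 0) {
            $command = ord(fread($fp, 1));
            $status = ord(fread($fp, 1));
            $failedId = implode('', unpack('N', fread($fp, 4)));
            // reconnect and resend every message after $failedId
        }
    }
}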
