Best approach to process batch API requests in Laravel 5 - PHP

EDIT: I have already fixed the "Creating default object from empty value" error. My question now is just a request for suggestions on how to make this process better.
I am making a tool that sends batch API requests to a certain URL, using https://github.com/stil/curl-easy for the batch requests. The API validates the phone numbers I send and returns a result. It can support a maximum of 20 requests per second. Suppose I have 50,000 API requests; what is the best and fastest way to process them? (At 20 requests per second, 50,000 requests take at least 50,000 / 20 = 2,500 seconds, roughly 42 minutes, so the rate limit sets a hard floor on total time.)
The current process is: I query my database for the number of records I have, which is the number of phone numbers. Then I loop over that count. Inside a single iteration, I query 13 records from the database, pass them to the queue, and process them; when the processing is finished, I query the database again and update the fields for each phone number based on the API response. My problem is that I think I have a logic error in my loop: when I run this, the processing stops for some time and gives me:
Creating default object from empty value
I guess that is because of my loop. Here's my code:
public function getParallelApi()
{
    $numItems = DB::connection('mysql')->select("SELECT COUNT(ID) AS cnt FROM phones WHERE status IS NULL");
    // Advance by the batch size of 13; the old extra $y += 13 on top of $y++ skipped one row per pass
    for ($y = 0; $y < $numItems[0]->cnt; $y += 13) {
        $queue = new \cURL\RequestsQueue;
        // Set default options for all requests in queue
        $queue->getDefaultOptions()
            ->set(CURLOPT_TIMEOUT, 5)
            ->set(CURLOPT_RETURNTRANSFER, true)
            ->set(CURLOPT_SSL_VERIFYPEER, false);
        // This is the function that is called for every completed request
        $queue->addListener('complete', function (\cURL\Event $event) {
            $response = $event->response;
            $json = $response->getContent(); // Returns content of response
            $feed = json_decode($json, true);
            $phone_number = isset($feed['msisdn']) ? $feed['msisdn'] : null;
            if (isset($phone_number)) {
                $status = ""; $remarks = ""; $network = ""; $country = ""; $country_prefix = "";
                if (isset($feed['status'])) {
                    $status = $feed['status'];
                }
                if (isset($feed['network'])) {
                    $network = $feed['network'];
                }
                if (isset($feed['country'])) {
                    $country = $feed['country'];
                }
                if (isset($feed['country_prefix'])) {
                    $country_prefix = $feed['country_prefix'];
                }
                if (isset($feed['remark'])) {
                    $remarks = $feed['remark'];
                }
                $update = Phone::find($phone_number);
                // Guard against a missing record; writing to a null model is what
                // caused the "Creating default object from empty value" error
                if ($update !== null) {
                    $update->network = $network;
                    $update->country = $country;
                    $update->country_prefix = $country_prefix;
                    $update->status = $status;
                    $update->remarks = $remarks;
                    $update->save();
                }
            }
        });
        // Get 13 records
        $phone = DB::connection('mysql')->select("SELECT * FROM phones WHERE status IS NULL LIMIT 13");
        foreach ($phone as $key => $value) {
            $request = new \cURL\Request($this->createCurlUrl($value->msisdn));
            // Add request to queue
            $queue->attach($request);
        }
        // Process the queue
        while ($queue->socketPerform()) {
            $queue->socketSelect();
        }
        sleep(2);
    }
    Session::flash('alert-info', 'Data Updated Successfully!');
    return Redirect::to('upload');
}
Since the maximum is 20 requests per second, I'm only doing 13 requests per batch just to be sure I won't clog their server. I added sleep(2); to pause a bit so I can make sure the queue is fully processed before moving on.
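For what it's worth, one way to simplify this (a sketch, not a drop-in replacement): since every processed row gets its status set, you can keep selecting WHERE status IS NULL until the query comes back empty, instead of counting rows and stepping a counter. This assumes every row that goes through the queue ends up with a non-NULL status, even on a failed request; otherwise the loop below never terminates. The table, columns, and stil/curl-easy calls are the same as above.
public function getParallelApi()
{
    // Build the queue and its listener once; they can be reused across batches.
    $queue = new \cURL\RequestsQueue;
    $queue->getDefaultOptions()
        ->set(CURLOPT_TIMEOUT, 5)
        ->set(CURLOPT_RETURNTRANSFER, true)
        ->set(CURLOPT_SSL_VERIFYPEER, false);
    $queue->addListener('complete', function (\cURL\Event $event) {
        // ... same response handling and Phone update as above ...
    });

    // Keep fetching unprocessed rows until none are left.
    do {
        $batch = DB::connection('mysql')
            ->select("SELECT * FROM phones WHERE status IS NULL LIMIT 13");
        foreach ($batch as $row) {
            $queue->attach(new \cURL\Request($this->createCurlUrl($row->msisdn)));
        }
        // Process the whole batch before fetching the next one.
        while ($queue->socketPerform()) {
            $queue->socketSelect();
        }
        sleep(1); // 13 requests per ~second stays under the 20/s cap
    } while (count($batch) > 0);

    Session::flash('alert-info', 'Data Updated Successfully!');
    return Redirect::to('upload');
}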

Related

PHP waiting for response to change status

I'm trying to implement the SkyScanner API.
I need to call:
"http://partners.api.skyscanner.net/apiservices/pricing/uk1/v1.0/
{SessionKey}?apiKey={apiKey}
&pageIndex=0
&pageSize=10"
so I write:
$res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
$res1 = json_decode($res1->getBody()->getContents(), true);
$res1 = collect($res1);
and I need to wait for the response Status to change from UpdatesPending to UpdatesComplete.
API docs:
Keep requesting page 0 until you get UpdatesComplete with pageIndex=0
at half a second to one second interval. Once you get UpdatesComplete
you may request any page and page size.
While the status is UPDATESPENDING, you should request only page 0
because the contents of each page are liable to change until updates
are complete.
How do I wait for the response to change status? I tried:
while ($res1['Status'] == 'UpdatesPending') {
    echo 'waiting';
}
dd($res1);
but there is no end to it (the loop keeps re-checking the same decoded response). How do I wait for the response status to change?
You can echo 'waiting' before your code; then, when it completes, echo 'completed' on the next line:
echo 'waiting...';
$res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
$res1 = json_decode($res1->getBody()->getContents(), true);
$res1 = collect($res1);
echo 'completed';
You need to create a do/while loop to check the status and break out of it when the status changes:
do {
    $res1 = $client1->get('http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10');
    $res1 = json_decode($res1->getBody()->getContents(), true);
    $res1 = collect($res1);
    if ($res1['Status'] == 'UpdatesPending') echo "waiting...<br />\n";
    sleep(5); // Make it sleep 5 seconds so as to not spam the server
} while ($res1['Status'] == 'UpdatesPending');
echo "Done!";
Note that you won't get any actual feedback until the whole process has finished.
If you use Guzzle as your PHP HTTP client, then you just need to define a retry decider and a retry delay:
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Psr7\Response;
use GuzzleHttp\Exception\RequestException;

public function retryDecider()
{
    return function ($retries, Request $request, Response $response = null, RequestException $exception = null) {
        // maximum retries is three times
        if ($retries >= 3) {
            return false;
        }
        if ($response) {
            $content = $response->getBody()->getContents();
            $res1 = json_decode($content, true);
            $res1 = collect($res1);
            // maybe you need to change this one to 'substr' or regex or something else
            if ($res1['Status'] == 'UpdatesComplete') {
                return false;
            }
        }
        return true; // we will request again
    };
}

public function retryDelay()
{
    return function ($numberOfRetries) {
        return 1000; // 1 second (the delay is in milliseconds)
    };
}
Then use that decider and delay function:
use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Handler\CurlHandler;
use GuzzleHttp\Middleware;

$handlerStack = HandlerStack::create(new CurlHandler());
$handlerStack->push(Middleware::retry($this->retryDecider(), $this->retryDelay()));
$client = new Client(array('handler' => $handlerStack));
and get your response:
$url = 'http://partners.api.skyscanner.net/apiservices/pricing/uk2/v1.0/'.$session_id.'?apikey=APIKEY&pageIndex=0&pageSize=10';
$response = $client->request('GET', $url)->getBody()->getContents();
You can decode the last $response to decide what you want to do. With this kind of approach, you can decide which response is valid and retry the request as many times as you like.
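For instance, a minimal sketch of that last step, reusing $response from above and the Status values from the SkyScanner docs:
$res = collect(json_decode($response, true));
if ($res['Status'] === 'UpdatesComplete') {
    // updates are done; it is now safe to request any page and page size
    dd($res);
}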

Tranzila payment gateway : Not Authorized error message in response

I am trying to integrate the Tranzila payment gateway in my PHP project and testing with dummy credit card numbers on localhost before going live. I got the PHP code from the official Tranzila documentation. The code is given below:
$tranzila_api_host = 'secure5.tranzila.com';
$tranzila_api_path = '/cgi-bin/tranzila71u.cgi';

// Prepare transaction parameters
$query_parameters['supplier'] = 'TERMINAL_NAME'; // 'TERMINAL_NAME' should be replaced by the actual terminal name
$query_parameters['sum'] = '45'; // Transaction sum
$query_parameters['currency'] = '1'; // Type of currency: 1 NIS, 2 USD, 978 EUR, 826 GBP, 392 JPY
$query_parameters['ccno'] = '12312312'; // Test card number
$query_parameters['expdate'] = '0820'; // Card expiry date: mmyy
$query_parameters['myid'] = '12312312'; // ID number if required
$query_parameters['mycvv'] = '123'; // CVV number if required
$query_parameters['cred_type'] = '1'; // Type of transaction: 1 - normal transaction, 6 - credit, 8 - payments
// $query_parameters['TranzilaPW'] = 'TranzilaPW';

$query_string = '';
foreach ($query_parameters as $name => $value) {
    $query_string .= $name . '=' . $value . '&';
}
$query_string = substr($query_string, 0, -1); // Remove trailing '&'

// Initiate cURL
$cr = curl_init();
curl_setopt($cr, CURLOPT_URL, "https://$tranzila_api_host$tranzila_api_path");
curl_setopt($cr, CURLOPT_POST, 1);
curl_setopt($cr, CURLOPT_FAILONERROR, true);
curl_setopt($cr, CURLOPT_POSTFIELDS, $query_string);
curl_setopt($cr, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($cr, CURLOPT_SSL_VERIFYPEER, 0);

// Execute request
$result = curl_exec($cr);
$error = curl_error($cr);
if (!empty($error)) {
    die($error);
}
curl_close($cr);

// Prepare an associative array with the response data
$response_array = explode('&', $result);
$response_assoc = array();
if (count($response_array) > 1) {
    foreach ($response_array as $value) {
        $tmp = explode('=', $value);
        if (count($tmp) > 1) {
            $response_assoc[$tmp[0]] = $tmp[1];
        }
    }
}

// Analyze the result string
if (!isset($response_assoc['Response'])) {
    /**
     * When there is no 'Response' parameter it either means
     * that some pre-transaction error happened (like authentication
     * problems), in which case the result string will be in HTML format
     * explaining the error, or the request was made to generate a token only
     * (in this case the response string will only contain the 'TranzilaTK'
     * parameter)
     */
    die($result . "\n");
} else if ($response_assoc['Response'] !== '000') {
    // Any code other than '000' means transaction failure
    // (bad card, expiry, etc.)
    die($response_assoc['Response'] . "\n");
} else {
    die("Success \n");
}
Here I replaced the supplier with my actual supplier name, which I can't show here for security reasons. When I run this code with the actual supplier, I get a 'Not Authorized' error.
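As an aside, both manual loops above can be replaced with PHP built-ins that do the same encoding and decoding (a sketch; the behaviour is equivalent for these simple values, though the built-ins also URL-encode and URL-decode):
// Build the key=value&key=value body from the parameters array
$query_string = http_build_query($query_parameters);

// ... after curl_exec() ...

// Parse the key=value&key=value response into an associative array
parse_str($result, $response_assoc);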

Wordpress - file_get_contents loop turns down homepage for a while - alternative?

I have the following problem: in a function, I put in an array with at least 700 artist names, and I get out an array with all the information about their releases from the last 10 days. The function fetches a JSON response via the iTunes API, which I want to use for further analysis.
Problems:
- While the function executes, it takes about 3 minutes to finish.
- The homepage is not reachable for others while I execute it: (Error on server: (70007) The timeout specified has expired: AH01075: Error dispatching request to : (polling)). Am I running out of memory?
Questions:
- How can I code this function more efficiently?
- How can I code this function without using too much memory? Should I use unset(...)?
Code:
function getreleases($artists) {
    $releases = array();
    foreach ($artists as $artist) {
        $artist = str_replace(" ", "%20", $artist);
        $ituneslink = "https://itunes.apple.com/search?term=" . $artist . "&media=music&entity=album&limit=2&country=DE";
        $itunesstring = file_get_contents($ituneslink);
        $itunesstring = json_decode($itunesstring);
        /* Results decoded from JSON into an object */
        if (($itunesstring->resultCount) > 0) {
            foreach ($itunesstring->results as $value) {
                if ((date_diff(date_create('now'), date_create($value->releaseDate))->format('%a')) < 10) {
                    #echo '<br>Found: ' . $artist;
                    $releases[] = $value;
                }
            }
        } else {
            echo '<br><span style="color:red">Not found on iTunes: ' . $artist . '</span>';
        }
        unset($ituneslink);
        unset($itunesstring);
    }
    return $releases;
}
The problem lies in the fact that every time that function is executed, your server needs to make 700+ API calls, parse the data, and run your logic on it.
One potential solution is to use WordPress's transients to 'cache' the value (or perhaps even the whole output); this way, it won't have to execute that strenuous function on every connection, it'll just pull the data from the transient. You can set an expiry time for a transient, so you can have it refetch the information every X days/hours.
Take a look at this article from CSS Tricks that walks you through a simple example using transients.
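A minimal sketch of that pattern around the function from the question (the cache key name is made up here; get_transient() and set_transient() are the standard WordPress calls):
// Try the cached copy first; get_transient() returns false when it is missing or expired
$releases = get_transient('artist_releases'); // hypothetical cache key
if (false === $releases) {
    // Expensive path: hits the iTunes API 700+ times
    $releases = getreleases($artists);
    // Cache the result for 12 hours before refetching
    set_transient('artist_releases', $releases, 12 * HOUR_IN_SECONDS);
}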
But the problem is not fixed. While updating the data and fetching the 700 items from the iTunes API inside the loop, the site runs out of memory, and the homepage is not reachable from my computer. I tried a 'timeout' or 'sleep' so that the script only searches for data every few seconds, but it doesn't change anything. I also reworked the loops for memory reasons, so variables are no longer copied. Are there more problems? I have two loops in there; maybe $itunesstring is being copied?
if (!function_exists('get_contents')) {
    function get_contents(&$url) {
        // use cURL instead of file_get_contents
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        $cache = curl_exec($ch);
        curl_close($ch);
        return $cache;
    }
}

function getfromituneslink(&$link, &$name) {
    $name = str_replace("'", "", $name);
    $name = substr($name, 0, 14);
    $result = get_transient("getlink_itunes_{$name}");
    if (false === $result) {
        $result = get_contents($link);
        set_transient("getlink_itunes_{$name}", $result, 12 * HOUR_IN_SECONDS);
    }
    return $result;
}
function getreleases(&$artists) {
    $releases = array();
    while (0 < count($artists)) {
        $artist = array_shift($artists);
        $artist = str_replace(" ", "%20", $artist);
        $ituneslink = "https://itunes.apple.com/search?term=" . $artist . "&media=music&entity=album&limit=2&country=DE";
        $itunesstring = getfromituneslink($ituneslink, $artist);
        unset($ituneslink);
        $itunesstring = json_decode($itunesstring);
        if (($itunesstring->resultCount) > 0) {
            while (0 < count($itunesstring->results)) {
                $value = array_shift($itunesstring->results);
                if ((date_diff(date_create('now'), date_create($value->releaseDate))->format('%a')) < 6) {
                    $releases[] = array($value->artistName, $value->collectionName, $value->releaseDate, str_replace("?uo=4", "", $value->collectionViewUrl));
                    unset($value);
                }
            }
        } else {
            echo '<br><span style="color:red">Not found on iTunes: ' . str_replace("%20", " ", $artist) . '</span>';
        }
        unset($itunesstring);
    }
    return $releases;
}
I don't know where the problem is. :-( Is there any other possibility to let the function fetch the information one item at a time?

ElasticSearch PHP error 503

I'm quite new to ElasticSearch and just figuring some things out for now, but here is my problem. After all the trouble of changing the server, we migrated the nodes, but some data is missing in one of them. We have two nodes that have records with the same ID in both of them. In the first one everything is OK, but in the second some records seem to be missing. For example, if I send a request like this:
GET /fnh_detailbeta/main/61UkVww8QGu093RirFG84A
I get the following error (in Marvel - Sense):
{
    "error": "NoShardAvailableActionException[[fnh_detailbeta][0] null]",
    "status": 503
}
So, what I'm trying to do now is compare all the records in Node 1 with the records in Node 2 via ElasticSearch PHP. Here is the PHP I'm trying to run:
<?php
require '../connector/vendor/autoload.php';

$number = 1;
$params = array();
$params['hosts'] = array(
    '146.185.164.121:9200', // IP + Port
);
$client = new Elasticsearch\Client($params);

// REQUEST AMOUNT
$getParams = array();
$getParams['index'] = 'fnh_mainbeta';
$getParams['type'] = 'main';
$json = '{"query": {"match_all": {}},"size": 1,"from":' . $number . '}';
$getParams['body'] = $json;
$retDoc = $client->search($getParams);
$total = $retDoc['hits']['total'];
// END OF REQUEST AMOUNT

// REQUEST ID
for ($i = 1; $i <= $total; $i++) {
    $json = '{"query": {"match_all": {}},"size": 1,"from":' . $number . '}';
    $getParams = array();
    $getParams['index'] = 'fnh_mainbeta';
    $getParams['type'] = 'main';
    $getParams['body'] = $json;
    // Pass both status codes in one array; assigning 'ignore' twice overwrote the 503
    $getParams['ignore'] = array(503, 404);
    $retDoc = $client->search($getParams);
    $found_id = $retDoc['hits']['hits'][0]['_id'];
    echo $found_id;

    // REQUEST SECOND DB
    $getParams = array();
    $getParams['index'] = 'fnh_detailbeta';
    $getParams['type'] = 'main';
    $getParams['id'] = '61UkVww8QGu093RirFG84A';
    $retDoc = $client->get($getParams);
    $found_id_two = $retDoc['_id'];
    echo $found_id_two;
    // END OF REQUEST SECOND DB

    $number++;
    sleep(1);
}
The script is not finished yet, but I get a fatal error that exits it:
Fatal error: Uncaught exception 'Guzzle\Http\Exception\ServerErrorResponseException' with message 'Server error response
[status code] 503
[reason phrase] Service Unavailable
This error appears whenever the NoShardAvailableActionException[[fnh_detailbeta][0] null] case occurs. So, the question is: what is the best way to catch this error (something like IF error THEN echo "error") without stopping the PHP execution? In other words, I want to detect this error, delete the record, and go on comparing the nodes.
Any help will be appreciated!
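One way to get the "IF error THEN echo" behaviour described above is to wrap the get() call in a try/catch for the exception class named in the fatal error (a sketch; the null marker is just one way to flag the record for later deletion):
try {
    $retDoc = $client->get($getParams);
    $found_id_two = $retDoc['_id'];
    echo $found_id_two;
} catch (\Guzzle\Http\Exception\ServerErrorResponseException $e) {
    // the 503 NoShardAvailable case: report it and keep looping
    echo 'error: ' . $e->getMessage();
    $found_id_two = null; // flag this record as missing so it can be deleted later
}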

How to perform multiple Guzzle requests at the same time?

I can perform single requests using Guzzle and I'm very pleased with Guzzle's performance so far. However, I read in the Guzzle API something about MultiCurl and batching.
Could someone explain to me how to make multiple requests at the same time? Async if possible. I don't know if that is what they mean by MultiCurl. Sync would also not be a problem. I just want to do multiple requests at the same time or very close together (in a short space of time).
From the docs:
http://guzzle3.readthedocs.org/http-client/client.html#sending-requests-in-parallel
For an easy to use solution that returns a hash of request objects mapping to a response or error, see http://guzzle3.readthedocs.org/batching/batching.html#batching
Short example:
<?php
$client->send(array(
    $client->get('http://www.example.com/foo'),
    $client->get('http://www.example.com/baz'),
    $client->get('http://www.example.com/bar')
));
An update related to the new GuzzleHttp guzzlehttp/guzzle: concurrent/parallel calls are now run through a few different methods, including promises; see Concurrent Requests.
The old way of passing an array of RequestInterfaces will not work anymore.
See the example here:
$newClient = new \GuzzleHttp\Client(['base_uri' => $base]);
foreach ($documents->documents as $doc) {
    $params = [
        'language' => 'eng',
        'text' => $doc->summary,
        'apikey' => $key
    ];
    $requestArr[$doc->reference] = $newClient->getAsync('/1/api/sync/analyze/v1?' . http_build_query($params));
}
$time_start = microtime(true);
$responses = \GuzzleHttp\Promise\unwrap($requestArr); // $newClient->send($requestArr);
$time_end = microtime(true);
$this->get('logger')->error(' NewsPerf Dev: took ' . ($time_end - $time_start));
Update: as suggested in the comments and asked by @sankalp-tambe, you can also use a different approach so that a set of concurrent requests with a failure still returns all the responses.
While the suggested option with Pool is feasible, I still prefer promises.
An example with promises is to use the settle and wait methods instead of unwrap.
The difference from the example above would be:
$responses = \GuzzleHttp\Promise\settle($requestArr)->wait();
I have created a full example below for reference on how to handle the $responses too.
require __DIR__ . '/vendor/autoload.php';

use GuzzleHttp\Client as GuzzleClient;
use GuzzleHttp\Promise as GuzzlePromise;

$client = new GuzzleClient(['timeout' => 12.0]); // note how I set a timeout

$requestPromises = [];
$sitesArray = SiteEntity::getAll(); // returns an array of objects that each contain a domain

foreach ($sitesArray as $site) {
    $requestPromises[$site->getDomain()] = $client->getAsync('http://' . $site->getDomain());
    $sitesByDomain[$site->getDomain()] = $site; // key the sites by domain for the result loop below
}

$results = GuzzlePromise\settle($requestPromises)->wait();

foreach ($results as $domain => $result) {
    $site = $sitesByDomain[$domain];
    $this->logger->info('Crawler FetchHomePages: domain check ' . $domain);
    if ($result['state'] === 'fulfilled') {
        $response = $result['value'];
        if ($response->getStatusCode() == 200) {
            $site->setHtml($response->getBody());
        } else {
            $site->setHtml($response->getStatusCode());
        }
    } else if ($result['state'] === 'rejected') {
        // notice that if a call fails, Guzzle returns it as state 'rejected' with a reason
        $site->setHtml('ERR: ' . $result['reason']);
    } else {
        $site->setHtml('ERR: unknown exception');
        $this->logger->err('Crawler FetchHomePages: unknown fetch fail, domain: ' . $domain);
    }
    $this->entityManager->persist($site); // this is a call to Doctrine's entity manager
}
This example code was originally posted here.
Guzzle 6.0 has made sending multiple async requests very easy.
There are multiple ways to do it.
You can create the async requests, add the resultant promises to a single array, and get the results using the settle() method, like this:
$promise1 = $client->getAsync('http://www.example.com/foo1');
$promise2 = $client->getAsync('http://www.example.com/foo2');
$promises = [$promise1, $promise2];
$results = GuzzleHttp\Promise\settle($promises)->wait();
You can now loop through these results and fetch the responses using GuzzleHttp\Promise\all or GuzzleHttp\Promise\each. Refer to this article for further details.
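For example (a sketch; each settled entry is an array whose 'state' is 'fulfilled' or 'rejected', with the response under 'value' or the error under 'reason'):
foreach ($results as $i => $result) {
    if ($result['state'] === 'fulfilled') {
        echo $i . ': ' . $result['value']->getStatusCode() . PHP_EOL;
    } else {
        echo $i . ' failed: ' . $result['reason'] . PHP_EOL;
    }
}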
In case you have an indeterminate number of requests to send (say 5 here), you can use GuzzleHttp\Pool::batch().
Here is an example:
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use GuzzleHttp\Exception\RequestException;

$client = new Client();

// Create the requests
$requests = function ($total) use ($client) {
    for ($i = 1; $i <= $total; $i++) {
        yield new Request('GET', 'http://www.example.com/foo' . $i);
    }
};

// Use Pool::batch()
$pool_batch = Pool::batch($client, $requests(5));
foreach ($pool_batch as $pool => $res) {
    if ($res instanceof RequestException) {
        // handle the failed request
        continue;
    }
    // handle the successful response
}
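If you also need to cap how many of those requests are in flight at once, the non-batch Pool form takes a 'concurrency' option plus fulfilled/rejected callbacks (a sketch reusing the $requests generator above):
$pool = new Pool($client, $requests(100), [
    'concurrency' => 5, // at most 5 requests running at any moment
    'fulfilled' => function ($response, $index) {
        // called for each successful response
    },
    'rejected' => function ($reason, $index) {
        // called for each failed request
    },
]);
$pool->promise()->wait(); // block until every request has settled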
