I have some PHP code whereby I post data to an API and get some response data. When a user pays, the response status from the API should change. I am trying to delay execution for 15 seconds before fetching the payment status from the API again, but it is not working as expected. I have no idea how to do that in plain PHP, or rather with Laravel queues, since I am using Laravel 5.4. Kindly assist?
$data = array(
    'payment_reference' => $checkID,
    'payment_type'      => $type
);

// Post request to the API and store the status in a variable
$paySt = $this->global_Curl($data, 'api/payment/status')->data;

sleep(15);

// Second call to the API after the sleep to check if the status has changed
$payStat = $this->global_Curl($data, 'api/payment/status')->data;

if ($payStat->status === '1') {
    return 'true';
}
The sleep() function is used to delay the current execution; check the link:
https://www.w3schools.com/php/showphp.asp?filename=demo_func_misc_sleep
In your case, you need to call the payment API again after a 4-5 second delay.
For example:
# Payment process hit code
# sleep(4); // delay/wait for 4 seconds
# Again run the code to check the payment status.
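If you would rather not block the request with sleep() and use Laravel 5.4 queues as you mentioned, a delayed job is one option. This is only a minimal sketch, assuming the status check can be moved into the job's handle() method and that a real queue driver is configured (the sync driver ignores delays); the class name and its body are illustrative, not your existing code:

// app/Jobs/CheckPaymentStatus.php (hypothetical job class)
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;

class CheckPaymentStatus implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    private $data;

    public function __construct(array $data)
    {
        $this->data = $data;
    }

    public function handle()
    {
        // Re-check the payment status here (the same call you make with global_Curl)
        // and update the order once the status has changed.
    }
}

// Where you currently call sleep(15), dispatch the job with a 15-second delay instead:
dispatch((new \App\Jobs\CheckPaymentStatus($data))->delay(15));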
I'm using an API that provides the score of a game. To show the score in real time, I'm hitting the API every 2-3 seconds to see if the score has changed, but I think this method isn't very efficient; it would exceed my API request quota very quickly. Is there any other way to achieve this (hitting the API only when there is a change in the score)? Or is this something that isn't possible with PHP?
This is what I'm currently doing (just an example):
<?php
while (true) {
    $api_url = "here goes url that returns response in json";
    $score_api = file_get_contents($api_url);
    $score_array = json_decode($score_api, true);

    $score = $score_array['result']['score'];

    sleep(2); // hit API again after 2 seconds
}
?>
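For what it's worth, a small tweak under the same polling approach (the endpoint shape is taken from the snippet above) is to remember the last score and only do the downstream work when it actually changes. It does not reduce the number of API calls; only the provider pushing updates to you (for example via a webhook, if they offer one) would do that.

<?php
$last_score = null;

while (true) {
    $api_url = "here goes url that returns response in json";
    $score_array = json_decode(file_get_contents($api_url), true);
    $score = $score_array['result']['score'];

    if ($score !== $last_score) {
        // The score changed: update your page/database/cache here.
        $last_score = $score;
    }

    sleep(2);
}
?>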
I want to run more than 2500 calls at the same time, so I have created batches of 100 (2500/100 = 25 batches in total).
// REQUEST_BATCH_LIMIT = 100
$insert_chunks = array_chunk(['array', 'i want', 'to', 'insert'], REQUEST_BATCH_LIMIT);

$mh = $running = $ch = [];

foreach ($insert_chunks as $chunk_key => $insert_chunk) {
    $mh[$chunk_key] = curl_multi_init();
    $ch[$chunk_key] = [];

    foreach ($insert_chunk as $ch_key => $_POST) {
        $ch[$chunk_key][$ch_key] = curl_init('[Dynamic path of API]');
        curl_setopt($ch[$chunk_key][$ch_key], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }

    do {
        curl_multi_exec($mh[$chunk_key], $running[$chunk_key]);
        curl_multi_select($mh[$chunk_key]);
    } while ($running[$chunk_key] > 0);

    foreach (array_keys($ch[$chunk_key]) as $ch_key) {
        $response = curl_getinfo($ch[$chunk_key][$ch_key]);
        $returned_data = curl_multi_getcontent($ch[$chunk_key][$ch_key]);
        curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }

    curl_multi_close($mh[$chunk_key]);
}
When I run this locally, the system hangs completely.
But the batch size that works (100, 500, etc.) is not the same on different devices and servers, so what determines it? And what changes should I make to increase it?
If I am adding 1000 records with a batch size of 50, every batch should insert 50 records, but the number actually inserted per batch varies randomly (40, 42, 48, etc.). Why are these calls skipped? (If I send one record at a time with a simple cURL loop, it works fine.)
P.S. I am using this code for the BigCommerce API.
The BigCommerce API definitely throttles requests. The limits are different depending on which plan you are on.
https://support.bigcommerce.com/s/article/Platform-Limits
The "Standard Plan" is 20,000 per hour. I'm not sure how that is really implemented, though, because in my own experience, I've been throttled before hitting 20,000 requests in an hour.
As Nico Haase suggests, the key is for you to log every response you get from the BigCommerce API. While not a perfect system, they do usually provide a response that is helpful to understand the failure.
I run a process that makes thousands of API requests every day. I do sometimes have requests that fail as if the BigCommerce API simply dropped the connection.
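As a rough sketch of what that logging could look like inside the batch loop from the question (the log file name and the "anything that isn't 2xx" check are my assumptions), you can read each handle's HTTP status after the batch finishes and record the failures, which makes throttling (429) and dropped connections visible:

foreach (array_keys($ch[$chunk_key]) as $ch_key) {
    $http_code = curl_getinfo($ch[$chunk_key][$ch_key], CURLINFO_HTTP_CODE);
    $body      = curl_multi_getcontent($ch[$chunk_key][$ch_key]);
    $error     = curl_error($ch[$chunk_key][$ch_key]);

    if ($http_code < 200 || $http_code >= 300 || $error !== '') {
        // Log the failure so you can see whether BigCommerce throttled (429)
        // or dropped the connection (code 0 / cURL error).
        file_put_contents(
            __DIR__ . '/bigcommerce_failures.log',
            date('c') . " code={$http_code} error={$error} body={$body}\n",
            FILE_APPEND
        );
        // These are the requests you would re-queue for a retry after a pause.
    }

    curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
}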
This actually follows on from a previous question I had that, unfortunately, did not receive any answers, so I'm not exactly holding my breath for a response, but I understand this can be a bit of a tricky issue to solve.
I am currently trying to implement rate limiting on outgoing requests to an external API to match the limit on their end. I have tried to implement a token bucket library (https://github.com/bandwidth-throttle/token-bucket) into the class we are using to manage Guzzle requests for this particular API.
Initially, this seemed to be working as intended but we have now started seeing 429 responses from the API as it no longer seems to be correctly rate limiting the requests.
I have a feeling what is happening is that the number of tokens in the bucket is now being reset every time the API is called due to how Symfony handles services.
I am currently setting up the bucket location, rate and starting amount in the service's constructor:
public function __construct()
{
    $storage = new FileStorage(__DIR__ . "/api.bucket");
    $rate = new Rate(50, Rate::MINUTE);
    $bucket = new TokenBucket(50, $rate, $storage);
    $this->consumer = new BlockingConsumer($bucket);
    $bucket->bootstrap(50);
}
I'm then attempting to consume a token before each request:
public function fetch(): array
{
    try {
        $this->consumer->consume(1);
        $response = $this->client->request(
            'GET', $this->buildQuery(), [
                'query'   => array_merge($this->params, ['api_key' => $this->apiKey]),
                'headers' => ['Content-type' => 'application/json']
            ]
        );
    } catch (ServerException $e) {
        // Process Server Exception
    } catch (ClientException $e) {
        // Process Client Exception
    }

    return $this->checkResponse($response);
}
I can't see anything obvious in that which would allow it to request more than 50 times per minute, unless the number of available tokens is being reset on each request.
This is being supplied to a set of repository services that handle converting the data from each endpoint into objects used within the system. Consumers use the appropriate repository to request the data needed to complete their process.
If the amount of tokens is being reset because the bootstrap call sits in the service constructor, where should it be moved within the Symfony framework so that it still works with the consumers?
I assume that it should work, but maybe try moving the ->bootstrap(50) call out of the constructor so it doesn't run on every request? I'm not sure, but that could be the reason.
In any case, it's better to do that only once, as part of your deployment (every time you deploy a new version). It doesn't really have anything to do with Symfony, because the framework doesn't impose any restrictions on the deployment procedure, so it depends on how you do your deployments.
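For example, a one-off console command run at deploy time could do the bootstrapping. This is only a sketch; the command name and the shared bucket path are assumptions, and the rate/capacity values are copied from your constructor (the imports are the ones from the token-bucket library's documentation):

// src/AppBundle/Command/BootstrapTokenBucketCommand.php (hypothetical)
use bandwidthThrottle\tokenBucket\Rate;
use bandwidthThrottle\tokenBucket\TokenBucket;
use bandwidthThrottle\tokenBucket\storage\FileStorage;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class BootstrapTokenBucketCommand extends Command
{
    protected function configure()
    {
        $this->setName('app:bootstrap-token-bucket');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $storage = new FileStorage('/path/to/shared/api.bucket');
        $rate    = new Rate(50, Rate::MINUTE);
        $bucket  = new TokenBucket(50, $rate, $storage);

        // Bootstrapping here, once per deploy, keeps it out of the request cycle,
        // which is what the library's docs recommend.
        $bucket->bootstrap(50);

        $output->writeln('Token bucket bootstrapped.');
    }
}

Your service's constructor would then only build the FileStorage/TokenBucket/BlockingConsumer and never call bootstrap() itself.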
P.S. Have you considered just handling 429 errors from the server? IMO you can simply wait (that's what BlockingConsumer does internally) when you receive a 429. It's simpler and doesn't require an additional layer in your system.
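A minimal sketch of that idea with Guzzle, assuming you keep the fetch() method shown above; the retry-once behaviour, the $options variable and the use of the Retry-After header are my additions, not part of your code:

try {
    $response = $this->client->request('GET', $this->buildQuery(), $options);
} catch (ClientException $e) {
    if ($e->getResponse() !== null && $e->getResponse()->getStatusCode() === 429) {
        // The API says we're over the limit: wait and retry once.
        $retryAfter = (int) $e->getResponse()->getHeaderLine('Retry-After');
        sleep($retryAfter > 0 ? $retryAfter : 60);
        $response = $this->client->request('GET', $this->buildQuery(), $options);
    } else {
        throw $e;
    }
}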
BTW, have you considered nginx's ngx_http_limit_req_module as an alternative solution? It usually ships with nginx by default, so no additional installation is needed, only a small amount of configuration.
You can place an nginx proxy between your code and the target web service and enable limits on it. Then in your code you handle 429s as usual, but the requests are throttled by your local nginx proxy rather than by the external web service, so the final destination only ever receives a limited number of requests.
I found a trick using the Guzzle bundle for Symfony.
I had to improve a sequential program that sends GET requests to a Google API; in the code example, it is a PageSpeed URL.
To respect the rate limit, there is an option to delay the requests before they are sent asynchronously.
The PageSpeed rate limit is 200 requests per minute.
A quick calculation gives 60/200 = 0.3 s per request.
Here is the code I tested on 300 URLs, getting a fantastic result of no errors, except when the URL passed as a parameter in the GET request returns a 400 HTTP error (Bad Request).
I put a delay of 0.4 s between requests and the average result time is less than 0.2 s, whereas it took more than a minute with the sequential program.
use GuzzleHttp;
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\Exception\ClientException;

// ... Now inside class code ... //

$client = new GuzzleHttp\Client();
$promises = [];

foreach ($requetes as $i => $google_request) {
    // delay is the trick not to exceed the rate limit (in ms)
    $promises[] = $client->requestAsync('GET', $google_request, ['delay' => 0.4 * $i * 1000]);
}

GuzzleHttp\Promise\each_limit(
    $promises,
    function () {
        // Function returning the number of concurrent requests:
        // 1 or 100 concurrent request(s) don't really change execution time
        return 100;
    },
    // Fulfilled function
    function ($response, $index) use ($urls, $fp) {
        // $urls is used to get the url passed as a parameter in the GET request
        // and $fp is a csv file pointer
        $feed = json_decode($response->getBody(), true); // Get array of results
        $this->write_to_csv($feed, $fp, $urls[$index]);  // Write to csv
    },
    // Rejected function
    function ($reason, $index) {
        if ($reason instanceof GuzzleHttp\Exception\ClientException) {
            $message = $reason->getMessage();
            // You could write the errors to a file or database too
            var_dump(array("error" => "error", "id" => $index, "message" => $message));
        }
    }
)->wait();
I have a 3rd-party website with webhooks that send to a specific URL. I set it up to send to a blank page on my site (example: www.mysite.com/webhook.php).
I have a var_dump set up in webhook.php that should display any information in the POST or GET. I am new to webhooks and might just not understand how they work. I assume that I can have var_dump($_POST) in my PHP file to display an array of the HTTP request that is coming to my site.
I cannot see any requests on my site from the hook after sending test data. Any ideas?
I would do this to test the webhook.
<?php
$fWrite = fopen("log.txt", "a");
// Note: var_dump() prints instead of returning a string, so use print_r() with its second argument set to true
$wrote = fwrite($fWrite, print_r($_POST, true));
fclose($fWrite);
?>
This will write the dumped data to the log.txt file, since, as rickdenhaan said, your current webhook.php returns its output to the webhook caller, not to your browser.
You may need to create log.txt manually if the script doesn't have write permission on the directory (755).
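If the payload turns out not to be form-encoded (many webhooks send raw JSON in the request body), a variant that also captures the raw body can help; this is only a sketch, and the log file name is an assumption:

<?php
// Log both the form fields and the raw request body with a timestamp.
$entry = date('c') . "\n"
       . "POST: " . print_r($_POST, true) . "\n"
       . "RAW: " . file_get_contents('php://input') . "\n\n";
file_put_contents("webhook-log.txt", $entry, FILE_APPEND);
?>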
I'm currently working with a payment API that uses webhooks. The webhook sends data to a URL of yours, you perform your action, and you return a status code to the webhook. So webhook.php is the place where I process whether my order is paid or not. Here is what I did:
if ($payment->isPaid() == TRUE)
{
    /*
     * At this point you'd probably want to start the process of delivering the product to the customer.
     */
    $con->query("UPDATE orders SET bankno = '$bankno', status = 'paid' WHERE ordertr = '$ids'");
}
elseif ($payment->isOpen() == FALSE)
{
    /*
     * The payment isn't paid and isn't open anymore. We can assume it was aborted.
     */
    $con->query("UPDATE orders SET bankno = '$bankno', status = 'closed' WHERE ordertr = '$ids'");
}
So if the order is paid, it is marked as paid in the database; if not, it is marked as closed. This shows basic webhook usage: performing an action according to what the webhook data says.
Here is how I get the content sent to my app:
$varname = json_decode(file_get_contents('php://input'));
My content is JSON encoded; if yours isn't, this would be fine:
$varname = file_get_contents('php://input');
Long story short, I believe I've implemented the flow correctly, but on the final DoExpressCheckoutPayment I am getting:
ACK => SuccessWithWarning
L_ERRORCODE0 => 11607
L_SHORTMESSAGE0 => Duplicate Request
L_LONGMESSAGE0 => A successful transaction has already been completed for this token
Is this simply because I'm doing a GetExpressCheckoutDetails request before this? (the GetExpressCheckoutDetails ACK is "Success")
Note that the other data returned from DoExpressCheckoutPayment looks good:
PAYMENTINFO_0_PAYMENTSTATUS => Completed
PAYMENTINFO_0_ACK => Success
Should I just look for PAYMENTINFO_0_ACK and ignore the rest?
Sidenote: in case it's of interest, I'm using the PHP lib at https://github.com/thenbrent/paypal-digital-goods, though I changed the code in the example's return.php to call GetExpressCheckoutDetails on a new class since, of course, it made no sense to use the same purchase data every time and it has to be dynamic.
EDIT: Okay I'm baffled. If I only call the GetExpressCheckoutDetails, then the response is:
CHECKOUTSTATUS => PaymentActionNotInitiated
However, if I call GetExpressCheckoutDetails and then DoExpressCheckoutPayment, the response of the preceding GetExpressCheckoutDetails becomes:
CHECKOUTSTATUS => PaymentActionCompleted (and it follows that the result of the subsequent DoExpressCheckoutPayment has the error of Duplicate Request)
How does that even make sense?! Did vanilla PHP just become asynchronous? Has paypal allocated enough money to buy a time machine? I'm probably missing something very basic, but I really don't see it yet :\
EDIT 2: Some sample code (I didn't strip it down to make it 100% vanilla, but it should be pretty straightforward):
public static function completePaypalPurchase() {
    self::configurePaypal(''); // Not relevant, just some setting of API keys and stuff

    $paypalAPI = new PayPal_Purchase(); // Just to get purchase info so we can form the real purchase request
    $response = $paypalAPI->get_checkout_details(); // Uses token from GET automatically

    echo("RESPONSE FROM GET CHECKOUT");
    print_r($response);

    $ack = strtoupper($response['ACK']);
    $userID = (int)$response['CUSTOM'];     // This was passed earlier and is retrieved correctly
    $numCredits = (int)$response['L_QTY0'];

    // NOTE: If I comment out the below, then the $response above has CHECKOUTSTATUS => PaymentActionNotInitiated
    // BUT if I do not comment it out, leaving it as-is, then the $response above has CHECKOUTSTATUS => PaymentActionCompleted
    // That's the core of the problem and where I'm stuck
    if ($ack == "SUCCESS" && $numCredits && $userID && $userID == $loggedInUserID) {
        // This creates a new PayPal_Purchase() with this info. In fact, it's the same method and therefore
        // should return the same sort of object as the one used at the beginning of the flow
        $paypalAPI = self::getPaypalPurchaseCredits($userID, $numCredits);
        $response = $paypalAPI->process_payment();
        $ack = strtoupper($response['ACK']);

        echo("RESPONSE FROM DO PAYMENT");
        print_r($response);

        if (isset($response['PAYMENTINFO_0_TRANSACTIONID']) && $ack == "SUCCESS") {
            $transactionID = $response['PAYMENTINFO_0_TRANSACTIONID'];
            return(new APIReturn(true, array('ack' => $ack, 'userid' => $userID, 'numcredits' => $numCredits, 'transactionid' => $transactionID)));
        }
    }

    return(new APIReturn(false, self::ERROR_NORESULT));
}
The correct order of the calls is SetExpressCheckout, GetExpressCheckoutDetails, and then DoExpressCheckoutPayment. If you're getting a duplicate order error, then you must be calling DECP twice somehow. You need to step through your code and see exactly how that's happening. It may be something in the class that you're using.
On that note, you may be interested in taking a look at my class instead. It makes everything very simple as it turns it all into PHP arrays and handles the gritty work for you.
If you don't want to start over with a new class, though, then again, you need to step through what's happening with your code and through the class methods to see where it's getting posted twice.
Another thing I notice is that you're only checking for ACK = Success. That means that when ACK = SuccessWithWarning, it'll be treated as a failure. You need to handle both Success and SuccessWithWarning (which a decent class library would handle for you).
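For example, a small adjustment to the check in your code above (a sketch against the same $response array; the exact set of accepted ACK values and the suggestion to log the warning fields are my choices):

$ack = strtoupper($response['ACK']);
$isSuccess = ($ack === 'SUCCESS' || $ack === 'SUCCESSWITHWARNING');

if (isset($response['PAYMENTINFO_0_TRANSACTIONID']) && $isSuccess) {
    // Treat SuccessWithWarning like Success, but log L_SHORTMESSAGE0/L_LONGMESSAGE0
    // so warnings such as 11607 don't go unnoticed.
    // ... proceed as before ...
}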
Sorry I don't have a more definitive answer, but again, somewhere, either in your code or in the library, it must be getting posted twice. Are you logging the raw API requests and responses along the way? If so, you'd be able to confirm it's getting hit twice because you'd have two sets of DECP requests and responses logged.
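A very simple way to get that kind of logging (just a sketch; the function name, file path and the idea of serializing the NVP arrays are assumptions, not part of the library):

function logPaypalCall($method, array $request, array $response) {
    // Append each NVP request/response pair so duplicate DECP calls show up clearly.
    $entry = date('c') . " {$method}\n"
           . "REQUEST: "  . print_r($request, true)
           . "RESPONSE: " . print_r($response, true) . "\n";
    file_put_contents(__DIR__ . '/paypal-api.log', $entry, FILE_APPEND);
}

Call it right after each SetExpressCheckout, GetExpressCheckoutDetails and DoExpressCheckoutPayment request; if DECP appears twice in the log for one token, you've found your duplicate.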