Using Guzzle promises asynchronously - php

I am attempting to use Guzzle promises to make some HTTP calls. To illustrate what I have, I made this simple example where a fake HTTP request takes 5 seconds:
$then = microtime(true);

$promise = new Promise(
    function () use (&$promise) {
        // Make a request to an http server
        $httpResponse = 200;
        sleep(5);
        $promise->resolve($httpResponse);
    });

$promise2 = new Promise(
    function () use (&$promise2) {
        // Make a request to an http server
        $httpResponse = 200;
        sleep(5);
        $promise2->resolve($httpResponse);
    });

echo 'PROMISE_1 ' . $promise->wait();
echo 'PROMISE_2 ' . $promise2->wait();
echo 'Took: ' . (microtime(true) - $then);
Now what I would like to do is start both of them and then have both echoes wait for the response. What actually happens is that promise 1 fires and waits 5 seconds, then promise 2 fires and waits another 5 seconds.
From my understanding I should perhaps be using a promise's ->resolve() method to make it start, but I don't know how to pass resolve() a function in which I would make an HTTP call.

By using wait() you're forcing the promise to be resolved synchronously: https://github.com/guzzle/promises#synchronous-wait
According to the Guzzle FAQ you should use requestAsync() with your RESTful calls:
Can Guzzle send asynchronous requests?
Yes. You can use the requestAsync, sendAsync, getAsync, headAsync, putAsync, postAsync, deleteAsync, and patchAsync methods of a client to send an asynchronous request. The client will return a GuzzleHttp\Promise\PromiseInterface object. You can chain then functions off of the promise.
$promise = $client->requestAsync('GET', 'http://httpbin.org/get');
$promise->then(function ($response) {
    echo 'Got a response! ' . $response->getStatusCode();
});
You can force an asynchronous response to complete using the wait() method of the returned promise.
$promise = $client->requestAsync('GET', 'http://httpbin.org/get');
$response = $promise->wait();
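Applied to the original example, a minimal sketch (assuming guzzlehttp/guzzle is installed and using httpbin.org's delay endpoint as a stand-in for the real server) would start both requests before waiting on either of them; with the default curl multi handler the second transfer progresses while the first wait() runs, so the total time is roughly that of the slowest request:
$then = microtime(true);
$client = new \GuzzleHttp\Client();

// start both transfers up front; the promises are pending at this point
$promise = $client->requestAsync('GET', 'http://httpbin.org/delay/5');
$promise2 = $client->requestAsync('GET', 'http://httpbin.org/delay/5');

// waiting on the first promise ticks the shared curl multi handle,
// so the second transfer runs concurrently in the background
echo 'PROMISE_1 ' . $promise->wait()->getStatusCode() . PHP_EOL;
echo 'PROMISE_2 ' . $promise2->wait()->getStatusCode() . PHP_EOL;

echo 'Took: ' . (microtime(true) - $then); // roughly 5 seconds instead of 10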

This question is a little old but I see no answer, so I'll give it a shot, maybe someone will find it helpful.
You can use the function all($promises).
I can't find documentation about this function but you can find its implementation here.
The comment above this function starts like this:
Given an array of promises, return a promise that is fulfilled when all the items in the array are fulfilled.
Sounds like what you are looking for, so you can do something like this:
use GuzzleHttp\Promise\Promise;
use function GuzzleHttp\Promise\all;

$then = microtime(true);

$promises = [];
$promises[] = $promise = new Promise(
    function () use (&$promise) {
        // Make a request to an http server
        $httpResponse = 200;
        sleep(5);
        $promise->resolve($httpResponse);
    });
$promises[] = $promise2 = new Promise(
    function () use (&$promise2) {
        // Make a request to an http server
        $httpResponse = 200;
        sleep(5);
        $promise2->resolve($httpResponse);
    });

all($promises)->wait();
echo 'Took: ' . (microtime(true) - $then);
If this function isn't the one that solves your problem, there are other interesting functions in that file, such as some($count, $promises), any($promises) or settle($promises).
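For example, settle() is handy when some of the promises may be rejected but you still want every outcome; a minimal sketch, assuming the guzzlehttp/promises 1.x function helpers:
use function GuzzleHttp\Promise\settle;

// wait for every promise and inspect each outcome individually
$results = settle($promises)->wait();

foreach ($results as $i => $result) {
    if ($result['state'] === 'fulfilled') {
        echo "Promise $i fulfilled with: " . $result['value'] . PHP_EOL;
    } else {
        echo "Promise $i rejected with: " . $result['reason'] . PHP_EOL;
    }
}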

You can use Utils::all($promises)->wait().
Here is a code example for "guzzlehttp/promises": "^1.4":
$promises = [];
$key = 0;

foreach (something...) {
    $key++;
    $promises[$key] = new Promise(
        function () use (&$promises, $key) {
            // here you can call some sort of async operation
            // ...
            // at the end call the ->resolve() method
            $promises[$key]->resolve('bingo');
        }
    );
}

$res = Utils::all($promises)->wait();
It is important that the operation inside each promise is non-blocking if you want a concurrent workflow. For example, sleep(1) is a blocking operation, so 10 promises with sleep(1) will still take about 10 seconds in total.
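To illustrate the difference, a minimal sketch (assuming guzzlehttp/guzzle is installed and httpbin.org is reachable): Guzzle's own async requests are non-blocking, so Utils::all() really does run them concurrently, and three 2-second requests finish in roughly 2 seconds rather than 6:
use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client();

// each getAsync() call returns a pending promise; the transfers overlap
$promises = [
    $client->getAsync('https://httpbin.org/delay/2'),
    $client->getAsync('https://httpbin.org/delay/2'),
    $client->getAsync('https://httpbin.org/delay/2'),
];

$responses = Utils::all($promises)->wait();

foreach ($responses as $response) {
    echo $response->getStatusCode() . PHP_EOL;
}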

Related

Sending requests in parallel - passing bigger data into async worker functions

I need to send a few requests, passing file_get_contents() output in each of them. Since they take a few seconds and execute synchronously, it takes a long time to finish. I was thinking of using some library to make this concurrent/parallel, and I found spatie/async, installable via Composer:
$pool = Pool::create();

$image = @file_get_contents("http://cdn-link.com/img.jpg");

$pool[] = async(function () use ($image, $request) {
    return $request->send();
})->then(function ($response) {
    $this->response = $response;
});

$pool[] = async(function () use ($image, $differentRequest) {
    return $differentRequest->send();
})->then(function ($response) {
    $this->differentResponse = $response;
});

await($pool);
The problem, however, is that if the $image variable is too big (usually when the image is larger than 100 kB), I get serialization errors from passing it through the use statement. It's the same issue when I try to pass it via the $this object. Is there a way to pass data like this to async workers? Or is there a better way to make such operations parallel?
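One possible workaround, sketched here as an assumption rather than a tested fix: pass only the URL into the worker and download the file inside the closure, so the large image bytes never have to be serialized between processes. The $imageUrl variable and the handling code are placeholders for whatever the real requests need.
use Spatie\Async\Pool;

$pool = Pool::create();

$imageUrl = "http://cdn-link.com/img.jpg"; // a small string, cheap to serialize

$pool[] = async(function () use ($imageUrl) {
    // fetch the image inside the worker instead of in the parent process
    return @file_get_contents($imageUrl);
})->then(function ($contents) {
    // handle the downloaded bytes here, e.g. attach them to a request
});

await($pool);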

How do I "extract" the data from a ReactPHP\Promise\Promise?

Disclaimer: This is my first time working with ReactPHP and "php promises", so the solution might just be staring me in the face 🙄
I'm currently working on a little project where I need to create a Slack bot. I decided to use the Botman package and its Slack RTM driver. This driver uses ReactPHP's promises to communicate with the RTM API.
My problem:
When I make the bot reply to a command, I want to retrieve the response from the RTM API so I can cache the ID of the posted message.
The problem is that the response is returned inside one of these ReactPHP\Promise\Promise objects, and I simply can't figure out how to retrieve the data.
What I'm doing:
So when a command is triggered, the bot sends a reply to Slack:
$response = $bot->reply($response)->then(function (Payload $item) {
    return $this->reply = $item;
});
But then $response consists of an (empty?) ReactPHP\Promise\Promise:
React\Promise\Promise^ {#887
    -canceller: null
    -result: null
    -handlers: []
    -progressHandlers: & []
    -requiredCancelRequests: 0
    -cancelRequests: 0
}
I've also tried using done() instead of then(), which is what (as far as I can understand) the official ReactPHP docs suggest you should use to retrieve data from a promise:
$response = $bot->reply($response)->done(function (Payload $item) {
    return $this->reply = $item;
});
But then $response returns as null.
The funny thing is that during debugging I tried to var_dump($item) inside the then(), but had forgotten to remove a call to a non-existent method on the promise. And then the var_dump actually returned the data 🤯
$response = $bot->reply($response)->then(function (Payload $item) {
    var_dump($item);
    return $this->reply = $item;
})->resolve();
So from what I can fathom, it's like I somehow need to "execute" the promise again, even though it has been resolved before being returned.
Inside the Bot's reply method, this is what's going on and how the ReactPHP promise is being generated:
public function apiCall($method, array $args = [], $multipart = false, $callDeferred = true)
{
    // create the request url
    $requestUrl = self::BASE_URL . $method;

    // set the api token
    $args['token'] = $this->token;

    // send a post request with all arguments
    $requestType = $multipart ? 'multipart' : 'form_params';
    $requestData = $multipart ? $this->convertToMultipartArray($args) : $args;

    $promise = $this->httpClient->postAsync($requestUrl, [
        $requestType => $requestData,
    ]);

    // Add requests to the event loop to be handled at a later date.
    if ($callDeferred) {
        $this->loop->futureTick(function () use ($promise) {
            $promise->wait();
        });
    } else {
        $promise->wait();
    }

    // When the response has arrived, parse it and resolve. Note that our
    // promises aren't pretty; Guzzle promises are not compatible with React
    // promises, so the only Guzzle promises ever used die in here and it is
    // React from here on out.
    $deferred = new Deferred();

    $promise->then(function (ResponseInterface $response) use ($deferred) {
        // get the response as a json object
        $payload = Payload::fromJson((string) $response->getBody());

        // check if there was an error
        if (isset($payload['ok']) && $payload['ok'] === true) {
            $deferred->resolve($payload);
        } else {
            // make a nice-looking error message and throw an exception
            $niceMessage = ucfirst(str_replace('_', ' ', $payload['error']));
            $deferred->reject(new ApiException($niceMessage));
        }
    });

    return $deferred->promise();
}
You can see the full source of it here.
Please just point me in some kind of direction. I feel like I tried everything, but obviously I'm missing something or doing something wrong.
ReactPHP core team member here. There are a few options and things going on here.
First off, then will never return the value from a promise; it returns a new promise so you can build a promise chain. As a result, you perform a new async operation in each then, which takes in the result from the previous one.
Secondly, done never returns a result value either; it works pretty much like then but will throw any uncaught exceptions from the previous promise in the chain.
The thing with both then and done is that they are your resolution methods. A promise merely represents the result of an operation that isn't done yet. It will call the callable you hand to then/done once the operation is ready and resolves the promise. So ultimately, in the broadest sense, all your operations happen inside a callable one way or another. (That callable can also be an __invoke method on a class, depending on how you set it up, which is also why I'm so excited about short closures coming in PHP 7.4.)
You have two options here:
Run all your operations inside callables (see the sketch below)
Use RecoilPHP
The former means a lot more mind mapping and learning how async works and how to wrap your head around that. The latter makes it easier but requires you to run each path in a coroutine (a callable with some cool magic).
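As a rough sketch of the first option applied to the reply from the question (the cacheMessageId() helper and the 'ts' payload key are hypothetical placeholders): do the caching inside the then() callback, where the resolved Payload is actually available, instead of trying to pull it back out of the promise.
$bot->reply($response)->then(function (Payload $item) {
    // the data only exists inside this callback, once the promise resolves
    $this->reply = $item;
    $this->cacheMessageId($item['ts'] ?? null); // hypothetical helper and key
}, function (\Exception $e) {
    // handle a rejected reply here
});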

json_decode - how to speed it up in a GuzzleHttp async request

In my application I'm using the GuzzleHttp library. It's probably not the cause of the problem, but it's worth mentioning.
Every minute (via cron) I need to get data from 40+ addresses, so I chose the GuzzleHttp library to be as fast as possible.
Guzzle code:
$client = new Client();
$rectangles = $this->db->query("SELECT * FROM rectangles");

$requests = function ($rectangles)
{
    foreach ($rectangles as $rectangle)
    {
        // some GEO coords (not important)
        $left = $rectangle["lft"];
        $right = $rectangle["rgt"];
        $top = $rectangle["top"];
        $bottom = $rectangle["bottom"];

        $this->indexes[] = $rectangle;

        $uri = "https://example.com/?left=$left&top=$top&right=$right&bottom=$bottom";
        yield new Request("GET", $uri);
    }
};

$pool = new Pool($client, $requests($rectangles), [
    'concurrency' => 5,
    'fulfilled' => function ($response, $index) {
        $resp = $response->getBody();
        $carray = json_decode($resp, true);

        if ($carray["data"] != null)
        {
            $alerts = array_filter($carray["data"], function ($alert) {
                return $alert["type"] == 'xxx';
            });
            $this->data = array_merge($this->data, $alerts);
            $this->total_count += count($alerts);
        }
    },
    'rejected' => function ($reason, $index) {},
]);

$promise = $pool->promise();
$promise->wait();

return $this->data;
Of course I benchmarked this:
1. getting data from another server: 0.000xx sec
2. json_decode: 0.001-0.0100 sec (this is probably the problem :-()
The entire code takes about 6-8 seconds; it depends on the amount of data on the remote server.
The whole time I assumed Guzzle performs the requests asynchronously, so the total would take about as long as the longest request (slowest request = 200 ms == all requests = 200 ms). But this is probably not true, or I am doing something wrong.
I pass true to json_decode to get an associative array (I feel that this saves about 1 second, but I'm not sure).
My question is: can I optimize this code further and speed it up? I would like it to be as fast as the slowest single request (0.200 sec).
PS: The data I'm getting from the URLs is just long JSON. Thanks!
EDIT: I changed the 'concurrency' => 5 to 'concurrency' => 100 and now the duration is about 2-4 sec
To start, increase the concurrency value in the Pool config to the total number of requests you need to send. This should be fine and may in fact make things even faster.
As for speeding up json_decode by milliseconds, that probably depends on many factors, including the hardware of the server that processes the JSON and the varying sizes of the JSON data. I don't think there is anything you could do programmatically in PHP to speed up that core function. I could be wrong, though.
Another part of your code to look at is: $this->data = array_merge($this->data, $alerts); You could try using a loop instead.
You are also doing double work: array_filter iterates over the array internally before array_merge iterates over it again.
So, instead of:
if ($carray["data"] != null) {
    $alerts = array_filter($carray["data"], function ($alert) {
        return $alert["type"] == 'xxx';
    });
    $this->data = array_merge($this->data, $alerts);
    $this->total_count += count($alerts);
}
Maybe try this:
if ($carray["data"] != null) {
    foreach ($carray["data"] as $cdata) {
        if ($cdata["type"] == 'xxx') {
            $this->data[] = $cdata;
            $this->total_count++;
        }
    }
}

Return synchronously when a React/Promise is resolved

I need to return from a function call once a React/Promise has been resolved. The basic idea is to fake a synchronous call from an asynchronous one. This means that the outer function must return a value once a promise has been resolved or rejected.
This is to create a driver for RedBeanPHP using React/Mysql. I am aware that this will likely lead to CPU starvation in the React event loop.
My initial idea was to use a generator and call yield inside a \React\Promise\Deferred::then callback.
function synchronous()
{
    $result = asynchronous();
}

function asynchronous()
{
    $deferred = new \React\Promise\Deferred;

    $sleep = function () use ($deferred)
    {
        sleep(5);
        $deferred->resolve(true);
    };

    $deferred->then(function ($ret) {
        yield $ret;
    });

    $sleep();
}
The PHP Generator class, AFAICT, is only directly constructable by the PHP engine itself. The then callback would need to directly invoke send() on the generator of the asynchronous function for this to work.
PHP lacks both continuations and generator delegation, which would make it possible to call yield from inside a nested callback, so this is entirely impossible to achieve for the moment.
ReactPHP offers the Async tools package, which has an await function.
Code can then become:
function synchronous()
{
    $result = \React\Async\await(asynchronous());
}

function asynchronous()
{
    $deferred = new \React\Promise\Deferred;

    $sleep = function () use ($deferred)
    {
        sleep(5);
        $deferred->resolve(true);
    };

    $sleep();

    return $deferred->promise();
}

Enabling pecl_http request pool cookie persistence

I'm trying to enable cookie persistence for a set of pecl_http HttpRequest objects sent using the same HttpRequestPool object (if that matters); unfortunately the documentation is quite scarce, and despite all my attempts I don't think things are working properly.
I have tried both using HttpRequestDataShare (although the documentation here is very scarce) and using the 'cookiestore' request option to point to a file. I still do not see cookies sent back to the server(s) in consecutive requests.
To be clear, by "cookie persistence" I mean that cookies set by the server are automatically stored and re-sent by pecl_http on consecutive requests, without me having to manually handle that (if it comes to it I will, but I am hoping I don't have to do that).
Can anyone point me to a working code sample or application that sends multiple HttpRequest objects to the same server and utilizes pecl_http's cookie persistence?
Thanks!
Mind that the request pool tries to send all requests in parallel, so of course they cannot know about cookies that have not yet been received. E.g.:
<?php

$url = "http://dev.iworks.at/ext-http/.cookie.php";

function cc($a) { return array_map("current", array_map("current", $a)); }

$single_req = new HttpRequest($url);

printf("1st single request cookies:\n");
$single_req->send();
print_r(cc($single_req->getResponseCookies()));

printf("waiting 1 second...\n");
sleep(1);

printf("2nd single request cookies:\n");
$single_req->send();
print_r(cc($single_req->getResponseCookies()));

printf("1st pooled request cookies:\n");
$pooled_req = new HttpRequestPool(new HttpRequest($url), new HttpRequest($url));
$pooled_req->send();
foreach ($pooled_req as $req) {
    print_r(cc($req->getResponseCookies()));
}

printf("waiting 1 second...\n");
sleep(1);

printf("2nd pooled request cookies:\n");
$pooled_req = new HttpRequestPool(new HttpRequest($url), new HttpRequest($url));
$pooled_req->send();
foreach ($pooled_req as $req) {
    print_r(cc($req->getResponseCookies()));
}

printf("waiting 1 second...\n");
sleep(1);

printf("now creating a request datashare\n");
$pooled_req = new HttpRequestPool(new HttpRequest($url), new HttpRequest($url));
$s = new HttpRequestDataShare();
$s->cookie = true;
foreach ($pooled_req as $req) {
    $s->attach($req);
}

printf("1st pooled request cookies:\n");
$pooled_req->send();
foreach ($pooled_req as $req) {
    print_r(cc($req->getResponseCookies()));
}

printf("waiting 1 second...\n");
sleep(1);

printf("2nd pooled request cookies:\n");
$pooled_req->send();
foreach ($pooled_req as $req) {
    print_r(cc($req->getResponseCookies()));
}
