I am using Guzzle to consume a SOAP API. I have to make 6 requests, but in the future this might even be an indeterminate number of requests.
The problem is that the requests are being sent synchronously instead of asynchronously. Every request on its own takes about 2.5s. When I send all 6 requests in parallel (at least that's what I am trying to do) it takes about 15s.
I tried all the examples from the Guzzle docs: the one with a fixed array of $promises, and even the pool (which I will need eventually). When I put everything in one file (procedural) I manage to get the total time down to 5-6s (which is OK, right?). But when I split everything into objects and methods, somehow I do something that makes Guzzle run them synchronously again.
Checks.php:
public function request()
{
    $promises = [];
    $promises['requestOne'] = $this->requestOne();
    $promises['requestTwo'] = $this->requestTwo();
    $promises['requestThree'] = $this->requestThree();
    // etc.

    // Wait for all requests to complete
    $results = \GuzzleHttp\Promise\settle($promises)->wait();

    // Return results
    return $results;
}

public function requestOne()
{
    $promise = (new API\GetProposition())
        ->requestAsync();

    return $promise;
}

// requestTwo, requestThree, etc.
API\GetProposition.php
public function requestAsync()
{
    $webservice = new Webservice();
    $xmlBody = '<some-xml></some-xml>';

    return $webservice->requestAsync($xmlBody, 'GetProposition');
}
Webservice.php
public function requestAsync($xmlBody, $soapAction)
{
    $client = new Client([
        'base_uri' => 'some_url',
        'timeout'  => 5.0,
    ]);

    $xml = '<soapenv:Envelope>
                <soapenv:Body>
                    '.$xmlBody.'
                </soapenv:Body>
            </soapenv:Envelope>';

    $promise = $client->requestAsync('POST', 'NameOfService', [
        'body' => $xml,
        'headers' => [
            'Content-Type' => 'text/xml',
            'SOAPAction' => $soapAction, // SOAP method to post to
        ],
    ]);

    return $promise;
}
I changed the XML and some of the parameters for brevity. The structure is like this because I eventually have to talk to multiple APIs, which is why I have a Webservice class in between that does all the preparation needed for each API. Most APIs have multiple methods/actions that you can call, which is why I have something like API\GetProposition.
Before the ->wait() statement I can see all $promises pending, so it looks like they are being sent asynchronously. After ->wait() they have all been fulfilled.
It's all working, except for the performance. All 6 requests together shouldn't take more than 2.5 to 3s.
Hope someone can help.
Nick
The problem was that a new $client object was created for every request. Because each client gets its own curl multi handler, the requests were not sharing one handler and therefore did not run in parallel.
Found the answer via https://stackoverflow.com/a/46019201/7924519.
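For anyone hitting the same thing, here is a minimal sketch of the fix, assuming the Webservice class can take the shared client via its constructor (the constructor injection is my illustration, not the original code):

use GuzzleHttp\Client;

class Webservice
{
    private $client;

    public function __construct(Client $client = null)
    {
        // One shared client (and thus one shared curl multi handler) for all requests
        $this->client = $client ?: new Client([
            'base_uri' => 'some_url',
            'timeout'  => 5.0,
        ]);
    }

    public function requestAsync($xmlBody, $soapAction)
    {
        $xml = '<soapenv:Envelope><soapenv:Body>'.$xmlBody.'</soapenv:Body></soapenv:Envelope>';

        return $this->client->requestAsync('POST', 'NameOfService', [
            'body' => $xml,
            'headers' => [
                'Content-Type' => 'text/xml',
                'SOAPAction'   => $soapAction,
            ],
        ]);
    }
}

The Checks and API\* classes then pass around that one Webservice (or Client) instance instead of constructing a new one per request, which lets Guzzle run all six requests over a single curl multi handler.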
I have this code with a while loop that works around the API's limit of 1000 records per request by merging all the response arrays into one array and passing it to the view. But the waiting time is too long. Is there a better way to do this and speed up the process?
This is my code:
public function guzzleGet()
{
    $aData = [];
    $sCursor = null;

    while($aResponse = $this->guzzleGetData($sCursor))
    {
        if(empty($aResponse['data']))
        {
            break;
        }
        else
        {
            $aData = array_merge($aData, $aResponse['data']);

            if(empty($aResponse['meta']['next_cursor']))
            {
                break;
            }
            else
            {
                $sCursor = $aResponse['meta']['next_cursor'];
            }
        }
    }

    $user = Auth::user()->name;

    return view("".$user."/home")->with(['data' => json_encode($aData)]);
}
protected function guzzleGetData($sCursor = null)
{
    $client = new \GuzzleHttp\Client();
    $token = 'token';

    $response = $client->request('GET', 'https://data.beneath.dev/v1/user/project/table', [
        'headers' => [
            'Authorization' => 'Bearer '.$token,
        ],
        'query' => [
            'limit' => 1000,
            'cursor' => $sCursor
        ]
    ]);

    if($response->getBody())
    {
        return json_decode($response->getBody(), true) ?: [];
    }

    return [];
}
You would have to debug where the bottleneck is. If the reason it's slow is your network/bandwidth, then that is the real issue. It could also be that the API is limiting your download speed.
Another bottleneck could be the download speed of the client. Since you are building up a big array, when your server sends it to the client, the client needs to download all of it, which can take time.
You could potentially increase speed a little by reusing the same curl handler or Guzzle client. Another area you could improve is the array_merge; you can use your own custom merge logic as explained here: https://stackoverflow.com/a/23348715/8485567.
If you have control of the external API, make sure to use gzip and HTTP/2, or possibly even gRPC instead of HTTP.
However, I would recommend you do this on the client side using JS; that way you avoid the additional bandwidth it takes for the client to download the merged result from your server. You could use the same approach with a limit of 1000, or even stream the response as it comes in and render it.
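Picking up the client-reuse point above: a minimal sketch of the same two methods with the Guzzle client created once (in the constructor) instead of on every call. The $client property name and the hard-coded token are placeholders:

public function __construct()
{
    $token = 'token';

    // Create the Guzzle client once and reuse it for every page,
    // so the connection can be kept alive between requests
    $this->client = new \GuzzleHttp\Client([
        'base_uri' => 'https://data.beneath.dev/',
        'headers'  => ['Authorization' => 'Bearer '.$token],
    ]);
}

protected function guzzleGetData($sCursor = null)
{
    $response = $this->client->request('GET', 'v1/user/project/table', [
        'query' => [
            'limit'  => 1000,
            'cursor' => $sCursor,
        ],
    ]);

    return json_decode((string) $response->getBody(), true) ?: [];
}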
I created a simple API in Lumen (application A) which:
receives a PSR-7 request (ServerRequestInterface)
replaces the URI of the request with the URI of application B
and sends the request through Guzzle.
public function apiMethod(ServerRequestInterface $psrRequest)
{
    $url = $this->getUri();
    $psrRequest = $psrRequest->withUri($url);

    $response = $this->httpClient->send($psrRequest);

    return response($response->getBody(), $response->getStatusCode(), $response->getHeaders());
}
The above code passes data to application B for query params, x-www-form-urlencoded, and JSON content types. However, it fails to pass multipart/form-data. (The file is available in application A via $psrRequest->getUploadedFiles().)
Edit 1
I tried replacing the Guzzle invocation with Buzz:
$psr18Client = new Browser(new Curl(new Psr17Factory()), new Psr17Factory());
$response = $psr18Client->sendRequest($psrRequest);
but still, it does not make a difference.
Edit 2
Instances of ServerRequestInterface represent a request on the server side. Guzzle and Buzz use an instance of RequestInterface to send data. RequestInterface is missing the abstraction over uploaded files, so files have to be added manually: http://docs.guzzlephp.org/en/stable/request-options.html#multipart
$options = [];

/** @var UploadedFileInterface $uploadedFile */
foreach ($psrRequest->getUploadedFiles() as $uploadedFile) {
    $options['multipart'][] = [
        'name' => 'file',
        'fileName' => $uploadedFile->getClientFilename(),
        'contents' => $uploadedFile->getStream()->getContents()
    ];
}

$response = $this->httpClient->send($psrRequest, $options);
But still no luck with that.
What am I missing? How can I change the request so the files are sent properly?
It seems that $options['multipart'] is only taken into account when using the post method from Guzzle. So changing the code to $response = $this->httpClient->post($psrRequest->getUri(), $options); solves the problem.
Also, it is important not to attach the Content-Type header yourself; Guzzle generates the multipart Content-Type (including the boundary) on its own.
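Putting that together, a sketch of how the forwarding method could look. The loop that forwards ordinary form fields is my own addition and assumes a standard parsed body; the rest mirrors the code from the question:

public function apiMethod(ServerRequestInterface $psrRequest)
{
    $url = $this->getUri();

    $options = [];

    /** @var UploadedFileInterface $uploadedFile */
    foreach ($psrRequest->getUploadedFiles() as $uploadedFile) {
        $options['multipart'][] = [
            'name'     => 'file',
            'filename' => $uploadedFile->getClientFilename(), // note: Guzzle expects the lowercase 'filename' key
            'contents' => $uploadedFile->getStream(),
        ];
    }

    // Forward the remaining form fields as additional multipart parts
    foreach ((array) $psrRequest->getParsedBody() as $name => $value) {
        $options['multipart'][] = ['name' => $name, 'contents' => (string) $value];
    }

    // Do not set the Content-Type header; Guzzle builds it (with the boundary) itself
    $response = $this->httpClient->post($url, $options);

    return response($response->getBody(), $response->getStatusCode(), $response->getHeaders());
}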
I'm trying to send a request to an endpoint, but I don't want to wait for it to respond, as I don't need the response. So I'm using Guzzle, like this:
$url = 'http://example.com';
$client = new \GuzzleHttp\Client();

$promise = $client->postAsync($url, [
    'headers' => ['Some headers and authorization'],
    'query' => [
        'params' => 'params',
    ]
])->then(function ($result) {
    // I don't need the result. So I just leave it here.
});

$promise->wait();
As I understood it, I have to call the wait method on the promise in order to actually send the request. But that totally negates the request being "async", because if the URL is not accessible or the server is down, the application will wait for a timeout or some other error.
So, the question here is: what does Guzzle mean by "async" when you have to wait for the response anyway? And how can I make a truly async request with PHP?
Thanks
What you can do is:
$url = 'http://example.com';
$client = new \GuzzleHttp\Client();

$promise = $client->postAsync($url, [
    'headers' => ['Some headers and authorization'],
    'query' => [
        'params' => 'params',
    ]
])->then(function ($result) {
    return $result->getStatusCode();
})
->wait();

// $promise now holds the status code returned from the then() callback
echo $promise;
You need wait() to be called at the end so you get the result that comes from your promise.
In this case it will return just the status code.
As mentioned on GitHub, Guzzle is not able to "fire and forget", so I think what you are trying to achieve (a promise that completes on its own, like in Vue or React) won't work here the way you want it to.
Another approach, and what I do personally, is to wrap Guzzle requests in a try-catch, so if there is a Guzzle error you catch it and handle it yourself, as in the sketch below.
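A minimal sketch of that try-catch approach (the URL and what you do in the catch block are illustrations only):

use GuzzleHttp\Client;
use GuzzleHttp\Exception\GuzzleException;

$client = new Client();

try {
    $client->post('http://example.com', [
        'query' => ['params' => 'params'],
    ]);
} catch (GuzzleException $e) {
    // Log (or rethrow) instead of letting the failure bubble up unhandled
    error_log($e->getMessage());
}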
Call the then() method if you don't want to wait for the result:
$client = new GuzzleClient();
$promise = $client->getAsync($url);
$promise->then();
An empty then() call will make the HTTP request without waiting for the result, very similar to
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false)
use Illuminate\Support\Facades\Http;

// ...some code here

$prom = Http::timeout(1)->async()->post($URL_STRING, $ARRAY_DATA)->wait();

// ...some more important code here

return "Request sent"; // or whatever you need to return
This works for me, as I don't always need to know the response.
It still uses wait() but because of the small timeout value it doesn't truly wait for the response.
Hope this helps others.
I am using Guzzle to get the posts of a single page and it is working fine. But now the problem is that the page is paginated, with 20 posts on each page. I want to get all the posts. How can I do this with Guzzle?
Here is my code:
public function __construct()
{
    $this->client = new Client([
        'base_uri' => 'https://xxxxxx.com/',
        'Content-Type' => 'application/json',
    ]);
}

public function post($post)
{
    $response = $this->client->request('GET', $post);
    $output = $response->getBody()->getContents();
    $data = $this->getData($output);

    return $data;
}
There is no general way to do it. HTTP as a protocol doesn't specify anything about pagination, so it depends on the server you are working with. Usually the response contains something like
{
    "page": 5,
    "total": 631
}
Based on this info you can create a URL for the next page by adding ?page=6 (this also depends on the server) and request it, for example in a loop like the sketch below.
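As an illustration only, a sketch of such a loop, assuming a hypothetical page query parameter and a JSON response whose posts live under a data key (the real parameter and field names depend entirely on the API):

$client = new \GuzzleHttp\Client(['base_uri' => 'https://xxxxxx.com/']);

$allPosts = [];
$page = 1;

do {
    $response = $client->request('GET', 'posts', [
        'query' => ['page' => $page],
    ]);

    $json  = json_decode((string) $response->getBody(), true);
    $posts = $json['data'] ?? []; // assumed field name

    $allPosts = array_merge($allPosts, $posts);
    $page++;
} while (!empty($posts)); // stop as soon as a page comes back empty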
How can I call methods asynchronously in Symfony 2.7?
I have to retrieve data by making 4 different API connections. The problem is a slow response from my application, since PHP is synchronous and has to wait for the responses from all the APIs before rendering the data. For example:
class MainController
{
    public function IndexAction()
    {
        // Make asynchronous calls to GetFirstAPIData(), GetSecondAPIData(), GetThirdAPIData()
    }

    public function GetFirstAPIData()
    {
        // Get data
    }

    public function GetSecondAPIData()
    {
        // Get data
    }

    public function GetThirdAPIData()
    {
        // Get data
    }
}
You can use Guzzle for that, especially when we're talking about HTTP-based APIs. Guzzle is an HTTP client which has async calls built in.
The code would look somewhat like this (taken from the docs):
use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail
$results = Promise\unwrap($promises);

// Wait for the requests to complete, even if some of them fail
$results = Promise\settle($promises)->wait();

// You can access each result using the key provided to the unwrap
// function.
echo $results['image']->getHeader('Content-Length');
echo $results['png']->getHeader('Content-Length');
In this example all 4 requests are executed in parallel. Note: only the IO is async, not the handling of the results. But that's probably what you want.
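Applied to the controller from the question, a sketch could look like the following; the base URI and endpoint paths are placeholders, and the per-API helper methods now return promises instead of data:

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

class MainController
{
    private $client;

    public function __construct()
    {
        $this->client = new Client(['base_uri' => 'http://api.example.com/']);
    }

    public function IndexAction()
    {
        // Start all requests without blocking, then wait for them together
        $promises = [
            'first'  => $this->getFirstApiData(),
            'second' => $this->getSecondApiData(),
            'third'  => $this->getThirdApiData(),
        ];

        $results = Promise\settle($promises)->wait();

        // Each entry in $results has a 'state' and a 'value' (or 'reason' on failure);
        // render the view with whatever succeeded
    }

    private function getFirstApiData()
    {
        return $this->client->getAsync('/first-endpoint');
    }

    private function getSecondApiData()
    {
        return $this->client->getAsync('/second-endpoint');
    }

    private function getThirdApiData()
    {
        return $this->client->getAsync('/third-endpoint');
    }
}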