I am using Symfony 4.4 and PHP 7.4.
I would like to use Guzzle Pool with multipart requests, and I need to add a proxy.
$requests = function ($total) {
    $uri = $this->parameters['endpoint'] . self::EMAIL_ENDPOINT;
    for ($i = 0; $i < $total; $i++) {
        yield new Request(
            'POST', $uri, [], new MultipartStream($this->multiPartConfiguration->getMultipart())
        );
    }
};

$pool = new Pool($this->client, $requests(4185), [
    //'concurrency' => 5,
    'fulfilled' => function (Response $response, $index) {
        $data = json_decode((string) $response->getBody(), true);
        dump($data);
    },
    'rejected' => function (RequestException $reason, $index) {
        dump('Reason : ' . $reason->getMessage());
    },
]);
I don't know where I can add the proxy option.
Thanks
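I assume it would be something like the sketch below, where request options such as 'proxy' go through the Pool's 'options' key (or directly into the client's constructor config), but I'm not sure; the proxy URL here is only a placeholder:

// Assumption: the Pool 'options' key forwards request options (like 'proxy')
// to every request in the pool. The proxy URL is a placeholder.
$pool = new Pool($this->client, $requests(4185), [
    'concurrency' => 5,
    'options' => [
        'proxy' => 'http://proxy.example.com:3128', // placeholder proxy
    ],
    'fulfilled' => function (Response $response, $index) {
        $data = json_decode((string) $response->getBody(), true);
        dump($data);
    },
    'rejected' => function (RequestException $reason, $index) {
        dump('Reason : ' . $reason->getMessage());
    },
]);
$pool->promise()->wait();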
I'm using Guzzle in a loop to get an array of promises. This line is in the loop:
$promises[] = $this->client->getAsync($url, $options);
Next, I do:
$res = collect(Promise\settle($promises)->wait());
One of the items from the result is just an array with a string 'state' field and a GuzzleHttp\Psr7\Response object. So how can I get the requested URL from this structure?
Thank you for any help!
I ran into the same issue.
My use statements:
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\Psr7\Response;
use Requests;
Extracted function from class:
public function getUrlsInParallelRememberingSource($urls,
$numberThreads = 10)
{
$client = new Client();
$responsesModified = [];
$promises = (function () use ($urls, $client, &$responsesModified)
{
foreach ($urls as $url) {
yield $client->getAsync($url)->then(function($response) use ($url, &$responsesModified)
{
$data = [
'url' => $url,
'body' => 'res' // pass here whatever you want
];
$responsesModified[] = $data;
return $response;
});
}
})();
$eachPromise = new EachPromise($promises,
[
'concurrency' => $numberThreads,
'fulfilled' => function (Response $response)
{
},
'rejected' => function ($reason)
{
}
]);
$eachPromise->promise()->wait();
return $responsesModified;
}
This gives the following result:
http://i.kagda.ru/5001133750442_01-11-2020-00:34:55_5001.png
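A simpler variant should also do the trick for the original question (my own sketch, not part of the code above): settle() keeps the array keys of the promises you pass in, so you can key each promise by its URL and read the URL back from the result key. Note that, unlike EachPromise, this does not limit concurrency.

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client();
$promises = [];
foreach ($urls as $url) {
    // Key the promise by its URL so the key survives into the settled results.
    $promises[$url] = $client->getAsync($url);
}

// settle() resolves to an array with the same keys as $promises;
// each item is ['state' => 'fulfilled'|'rejected', 'value'|'reason' => ...].
$results = Promise\settle($promises)->wait();

foreach ($results as $url => $result) {
    if ($result['state'] === 'fulfilled') {
        $response = $result['value']; // GuzzleHttp\Psr7\Response
        // $url is the URL that was requested
    }
}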
I am using Guzzle promises to send concurrent requests, but I want to control the concurrency, which is why I want to use a Guzzle pool. How can I transform the Guzzle promises into a Guzzle pool? Here is my code:
public function getDispenceryforAllPage($dispencery)
{
$GetAllproducts = [];
$promiseGetPagination = $this->client->getAsync($dispencery)
->then(function ($response) {
return $this->getPaginationNumber($response->getBody()->getContents());
});
$Pagination = $promiseGetPagination->wait();
$pagearray = array();
for($i=1;$i<=$Pagination; $i++){
$pagearray[] = $i;
}
foreach($pagearray as $page_no) {
$GetAllproducts[] = $this->client->getAsync($dispencery.'?page='.$page_no)
->then(function ($response) {
$promise = $this->getData($response->getBody()->getContents());
return $promise;
});
}
$results = GuzzleHttp\Promise\settle($GetAllproducts)->wait();
return $results;
}
I have a working example below for Guzzle 6. It uses postAsync and Pool.
function postInBulk($inputs)
{
$client = new Client([
'base_uri' => 'https://a.b.com'
]);
$headers = [
'Authorization' => 'Bearer token_from_directus_user'
];
$requests = function ($a) use ($client, $headers) {
for ($i = 0; $i < count($a); $i++) {
yield function() use ($client, $headers) {
return $client->postAsync('https://a.com/project/items/collection', [
'headers' => $headers,
'json' => [
"snippet" => "snippet",
"rank" => "1",
"status" => "published"
]
]);
};
}
};
$pool = new Pool($client, $requests($inputs),[
'concurrency' => 5,
'fulfilled' => function (Response $response, $index) {
// this is delivered each successful response
},
'rejected' => function (RequestException $reason, $index) {
// this is delivered each failed request
},
]);
$pool->promise()->wait();
}
Just use each_limit() or each_limit_all() (instead of settle()) with a generator.
function getDispenceryforAllPage($dispencery)
{
$promiseGetPagination = $this->client->getAsync($dispencery)
->then(function ($response) {
return $this->getPaginationNumber($response->getBody()->getContents());
});
$Pagination = $promiseGetPagination->wait();
$pagearray = range(1, $Pagination);
$requestGenerator = function () use ($dispencery, $pagearray) {
foreach ($pagearray as $page_no) {
yield $this->client->getAsync($dispencery . '?page=' . $page_no)
->then(function ($response) {
return $this->getData($response->getBody()->getContents());
});
}
};
// Max 5 concurrent requests
$results = GuzzleHttp\Promise\each_limit_all($requestGenerator(), 5)->wait();
return $results;
}
I have modified your code to support pool.
class GuzzleTest
{
private $client;
public function __construct($baseUrl)
{
$this->client = new \GuzzleHttp\Client([
    // Base URI is used with relative requests
    'base_uri' => $baseUrl,
    // You can set any number of default request options.
    'timeout' => 2.0,
]);
}
public function getDispenceryforAllPage($dispencery)
{
$GetAllproducts = [];
$promiseGetPagination = $this->client->getAsync($dispencery)
->then(function ($response) {
return $this->getPaginationNumber($response->getBody()->getContents());
});
$Pagination = $promiseGetPagination->wait();
$pagearray = array();
for ($i = 1; $i <= $Pagination; $i++) {
$pagearray[] = $i;
}
$pool = new \GuzzleHttp\Pool($this->client, $this->_yieldRequest($pagearray, $dispencery), [
'concurrency' => 5,
'fulfilled' => function ($response, $index) {
// this is delivered each successful response
},
'rejected' => function ($reason, $index) {
// this is delivered each failed request
},
]);
// Initiate the transfers and create a promise
$poolPromise = $pool->promise();
// Force the pool of requests to complete.
$results = $poolPromise->wait();
return $results;
}
private function _yieldRequest($pagearray, $dispencery){
foreach ($pagearray as $page_no) {
$uri = $dispencery . '?page=' . $page_no;
yield function() use ($uri) {
return $this->client->getAsync($uri);
};
}
}
}
For testing purposes, I have an array of 2000 image URIs (strings) which I download asynchronously with these functions. After some googling, testing and trying, I've come up with two functions that both work (well, to be honest, downloadFilesAsync2 throws an InvalidArgumentException at the last line).
The function downloadFilesAsync2 is based on the GuzzleHttp\Pool class and downloadFilesAsync1 is based on the GuzzleHttp\Promise\EachPromise class.
Both functions download the 2000 files asynchronously just fine, with a limit of 10 simultaneous requests.
I know that they work, but nothing else. I wonder if someone could explain both approaches, whether one is better than the other, the implications, etc.
// for the purpose of this question i've reduced the array to 5 files!
$uris = array(
"https://cdn.enchufix.com/media/catalog/product/u/n/unix-48120.jpg",
"https://cdn.enchufix.com/media/catalog/product/u/n/unix-48120-01.jpg",
"https://cdn.enchufix.com/media/catalog/product/u/n/unix-48120-02.jpg",
"https://cdn.enchufix.com/media/catalog/product/u/n/unix-48120-03.jpg",
"https://cdn.enchufix.com/media/catalog/product/u/n/unix-48120-04.jpg",
);
function downloadFilesAsync2(array $uris, string $dir, $overwrite=true) {
$client = new \GuzzleHttp\Client();
$requests = array();
foreach ($uris as $i => $uri) {
$loc = $dir . DIRECTORY_SEPARATOR . basename($uri);
if ($overwrite && file_exists($loc)) unlink($loc);
$requests[] = $client->request('GET', $uri, ['sink' => $loc]);
echo "Downloading $uri to $loc" . PHP_EOL;
}
$pool = new \GuzzleHttp\Pool($client, $requests, [
'concurrency' => 10,
'fulfilled' => function (\Psr\Http\Message\ResponseInterface $response, $index) {
// this is delivered each successful response
echo 'success: '.$response->getStatusCode().PHP_EOL;
},
'rejected' => function ($reason, $index) {
// this is delivered each failed request
echo 'failed: '.$reason.PHP_EOL;
},
]);
$promise = $pool->promise(); // Start transfers and create a promise
$promise->wait(); // Force the pool of requests to complete.
}
function downloadFilesAsync1(array $uris, string $dir, $overwrite=true) {
$client = new \GuzzleHttp\Client();
$promises = (function () use ($client, $uris, $dir, $overwrite) {
foreach ($uris as $uri) {
$loc = $dir . DIRECTORY_SEPARATOR . basename($uri);
if ($overwrite && file_exists($loc)) unlink($loc);
yield $client->requestAsync('GET', $uri, ['sink' => $loc]);
echo "Downloading $uri to $loc" . PHP_EOL;
}
})();
(new \GuzzleHttp\Promise\EachPromise(
$promises, [
'concurrency' => 10,
'fulfilled' => function (\Psr\Http\Message\ResponseInterface $response) {
// echo "\t=>\tDONE! status:" . $response->getStatusCode() . PHP_EOL;
},
'rejected' => function ($reason, $index) {
echo 'ERROR => ' . strtok($reason->getMessage(), "\n") . PHP_EOL;
},
])
)->promise()->wait();
}
First, I will address the InvalidArgumentException within the downloadFilesAsync2 method. There are actually a pair of issues with this method. Both relate to this:
$requests[] = $client->request('GET', $uri, ['sink' => $loc]);
The first issue is that Client::request() is a synchronous utility method which wraps $client->requestAsync()->wait(). $client->request() returns an instance of Psr\Http\Message\ResponseInterface, so $requests[] actually ends up populated with ResponseInterface implementations. This is what ultimately causes the InvalidArgumentException: $requests does not contain any Psr\Http\Message\RequestInterface instances, and the exception is thrown from within Pool::__construct().
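To illustrate the difference (my own example, not taken from the code above):

// Synchronous: sends the request immediately, blocks until it finishes,
// and returns a Psr\Http\Message\ResponseInterface.
$response = $client->request('GET', $uri, ['sink' => $loc]);

// Asynchronous: returns a GuzzleHttp\Promise\PromiseInterface right away;
// the transfer only completes once the promise is driven, e.g. when
// Pool/EachPromise wait on it.
$promise = $client->requestAsync('GET', $uri, ['sink' => $loc]);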
A corrected version of this method should contain code which looks more like:
$requests = [
new Request('GET', 'www.google.com', [], null, 1.1),
new Request('GET', 'www.ebay.com', [], null, 1.1),
new Request('GET', 'www.cnn.com', [], null, 1.1),
new Request('GET', 'www.red.com', [], null, 1.1),
];
$pool = new Pool($client, $requests, [
'concurrency' => 10,
'fulfilled' => function(ResponseInterface $response) {
// do something
},
'rejected' => function($reason, $index) {
// do something error handling
},
'options' => ['sink' => $some_location,],
]);
$promise = $pool->promise();
$promise->wait();
To answer your second question, "What is the difference between these two methods?", the answer is simply: there is none. To explain this, let me copy and paste Pool::__construct():
/**
 * @param ClientInterface $client   Client used to send the requests.
 * @param array|\Iterator $requests Requests or functions that return
 *                                  requests to send concurrently.
 * @param array $config Associative array of options
 *     - concurrency: (int) Maximum number of requests to send concurrently
 *     - options: Array of request options to apply to each request.
 *     - fulfilled: (callable) Function to invoke when a request completes.
 *     - rejected: (callable) Function to invoke when a request is rejected.
 */
public function __construct(
ClientInterface $client,
$requests,
array $config = []
) {
// Backwards compatibility.
if (isset($config['pool_size'])) {
$config['concurrency'] = $config['pool_size'];
} elseif (!isset($config['concurrency'])) {
$config['concurrency'] = 25;
}
if (isset($config['options'])) {
$opts = $config['options'];
unset($config['options']);
} else {
$opts = [];
}
$iterable = \GuzzleHttp\Promise\iter_for($requests);
$requests = function () use ($iterable, $client, $opts) {
foreach ($iterable as $key => $rfn) {
if ($rfn instanceof RequestInterface) {
yield $key => $client->sendAsync($rfn, $opts);
} elseif (is_callable($rfn)) {
yield $key => $rfn($opts);
} else {
throw new \InvalidArgumentException('Each value yielded by '
. 'the iterator must be a Psr7\Http\Message\RequestInterface '
. 'or a callable that returns a promise that fulfills '
. 'with a Psr7\Message\Http\ResponseInterface object.');
}
}
};
$this->each = new EachPromise($requests(), $config);
}
Now, if we compare that to a simplified version of the code within the downloadFilesAsync1 method:
$promises = (function () use ($client, $uris) {
foreach ($uris as $uri) {
yield $client->requestAsync('GET', $uri, ['sink' => $some_location]);
}
})();
(new \GuzzleHttp\Promise\EachPromise(
$promises, [
'concurrency' => 10,
'fulfilled' => function (\Psr\Http\Message\ResponseInterface $response) {
// do something
},
'rejected' => function ($reason, $index) {
// do something
},
])
)->promise()->wait();
In both examples, there is a generator which yields promises that resolve to instances of ResponseInterface, and that generator, along with the configuration array (fulfilled callable, rejected callable, concurrency), is fed into a new instance of EachPromise.
In summary:
downloadFilesAsync1 is functionally the same thing as using Pool only without the error checking that has been built into Pool::__construct().
There are a few errors within downloadFilesAsync2 which will cause the files to be downloaded in a synchronous fashion prior to receiving an InvalidArgumentException when the Pool is instantiated.
My only recommendation is: use whichever feels more intuitive for you to use.
I want to get the transfer time for each request.
How can I use the on_stats option for async requests?
http://docs.guzzlephp.org/en/latest/request-options.html#on-stats
My code:
<?php
use GuzzleHttp\{Pool, Client};
use GuzzleHttp\Psr7\{
Request, Response
};
$httpClient = new Client();
foreach ($items as $request) {
$requests[] = new Request(...);
}
$responses = Pool::batch($httpClient, $requests, [
    'fulfilled' => function ($response, $index) {
    },
]);
Solution:
$responses = Pool::batch($httpClient, $requests, [
    'fulfilled' => function ($response, $index) {
    },
    'options' => [
        'on_stats' => function (TransferStats $stats) {
            //..
        },
    ],
]);
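To actually read the transfer time inside that callback, something like the following should work (my sketch; note that GuzzleHttp\TransferStats has to be imported as well):

use GuzzleHttp\TransferStats;

$responses = Pool::batch($httpClient, $requests, [
    'fulfilled' => function ($response, $index) {
        // handle the response
    },
    'options' => [
        'on_stats' => function (TransferStats $stats) {
            // getTransferTime() returns the total transfer time in seconds (or null),
            // getEffectiveUri() returns the URI these stats belong to.
            echo $stats->getEffectiveUri() . ' took ' . $stats->getTransferTime() . " s\n";
        },
    ],
]);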