PHP worker (Multiple processes and/or threads) - php

I'm trying to fetch statistical data from a web service. Each request has a response time of 1-2 seconds, and I have to submit a request for thousands of IDs, one at a time. Because of the server's response time, all the requests together would take a few hours.
I want to parallelize as many requests as possible (the server can handle it). I've installed PHP 7 and pthreads (CLI only), but the maximum number of threads is limited (20 in the Windows PHP CLI), so I have to start multiple processes.
Is there any simple PHP-based framework/library for multi-process/pthreads and job-queue handling? I don't need a large framework like Symfony or Laravel.

Workers
You could look into using php-resque which doesn't require pthreads.
You will have to run a Redis server though (it can be local or remote). I believe you can run Redis on Windows, according to this SO answer.
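For orientation, here is a minimal sketch based on the php-resque README: the enqueueing script pushes one job per ID into Redis, and one or more separate worker processes pick them up. The queue name, job class and $ids variable are illustrative, not part of your code.
require 'vendor/autoload.php';

// Enqueue one job per ID; worker processes consume the 'stats' queue in parallel.
Resque::setBackend('127.0.0.1:6379');
foreach ($ids as $id) {
    Resque::enqueue('stats', 'FetchStatsJob', ['id' => $id]);
}

// FetchStatsJob.php - loaded and executed by the worker process
class FetchStatsJob
{
    public function perform()
    {
        $id = $this->args['id'];
        // ... call the web service for this ID and store the result ...
    }
}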
Concurrent Requests
You may also want to look into sending concurrent requests using something like GuzzleHttp; you can find examples of how to use it here.
From the Docs:
You can send multiple requests concurrently using promises and asynchronous requests.
use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail
$results = Promise\unwrap($promises);

// Wait for the requests to complete, even if some of them fail
$results = Promise\settle($promises)->wait();

// You can access each result using the key provided to the unwrap
// function.
echo $results['image']->getHeader('Content-Length');
echo $results['png']->getHeader('Content-Length');
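For thousands of IDs, a request pool may be a better fit than building one big promise array, since it caps how many requests are in flight at once. A rough sketch for Guzzle 6; the base URI, URL pattern and concurrency of 20 are illustrative:
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['base_uri' => 'http://example.com/']);

// Lazily generate one GET request per ID.
$requests = function (array $ids) {
    foreach ($ids as $id) {
        yield new Request('GET', "stats/{$id}");
    }
};

$pool = new Pool($client, $requests($ids), [
    'concurrency' => 20, // at most 20 requests in flight at a time
    'fulfilled' => function ($response, $index) {
        // store the result for this ID
    },
    'rejected' => function ($reason, $index) {
        // log the failure and move on
    },
]);

// Start the transfers and wait for the pool to drain.
$pool->promise()->wait();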

Related

PHP AJAX, return multiple values in realtime in a for loop

What I have:
foreach ($contacts as $contact) {
    $this->StocklistMailer($contact, $weekOrDay, $data, $content, $itemGroup);
}
return new Response('completed', 204);
What I would like is:
foreach ($contacts as $contact) {
    $this->StocklistMailer($contact, $weekOrDay, $data, $content, $itemGroup);
    return new Response($contact->getEmail, 204);
}
return new Response('completed', 204);
And it returns to an AJAX call on the page. The reason I want to accomplish this is that I want real-time feedback about whom an email has been sent to.
You can't send multiple responses from your application; the whole idea is that you only generate one response.
However, you can put all the information you require in one response.
$emails = [];
foreach ($contacts as $contact) {
    $this->StocklistMailer($contact, $weekOrDay, $data, $content, $itemGroup);
    $emails[] = $contact->getEmail();
}
return new Response(json_encode($emails), 200);
Note that I changed 204 (No content) to 200 (OK).
You would have to break it into multiple calls:
The first call returns the array of contacts, which you store in a JavaScript array (assuming you are using client-side JS/Ajax to call this PHP file).
Then loop through the array and make one call to PHP per contact, passing one contact at a time.
You can show a fancy progress bar as you loop through the array :)
You can do that, but not from Symfony. Look into ReactPHP, Ratchet and related technologies.
https://blog.wyrihaximus.net/2015/03/reactphp-sockets/
http://socketo.me/docs/hello-world
You can create a websockets server that would listen on localhost for messages from your Symfony application and will be redirecting them using websockets to the browser.
The client would open a websocket connection to your websockets server and send the request to your application. While the application is processing, it sends the progress to the websocket server using a socket on the local machine. The client gets the progress in real time from the websocket and displays it.
This way, you get a realtime interactive interface, with a long-running process.
Even better would be creating a RabbitMQ worker that sends the emails and reports the progress to the websockets server. You would create the task for the worker from your Symfony application, and therefore you wouldn't be limited by the execution time limit for PHP requests. Another win with the RabbitMQ worker is that you can have only one (or as many as you like), so the tasks will queue and you won't be burning server resources with 50 processes generating and sending emails at once.
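As a rough illustration of the "progress over a local socket" part of this setup, a sketch of the application side; the port, message format and use of stream_socket_client are assumptions, not a prescribed API:
// Inside the mailing loop in the Symfony application: push a progress
// message to the local socket the websockets server is listening on.
$socket = @stream_socket_client('tcp://127.0.0.1:5555', $errno, $errstr, 1);
foreach ($contacts as $i => $contact) {
    $this->StocklistMailer($contact, $weekOrDay, $data, $content, $itemGroup);
    if ($socket) {
        fwrite($socket, json_encode([
            'sent'  => $i + 1,
            'total' => count($contacts),
            'email' => $contact->getEmail(),
        ]) . "\n");
    }
}
if ($socket) {
    fclose($socket);
}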

PHP's SoapClient times out after twice the time of default_socket_timeout when fetching WSDL

I'm using PHP's built-in SoapClient, and it needs to time out after 5 seconds. Therefore I'm using ini_set("default_socket_timeout", 5);. However, when fetching the WSDL it times out after 10 seconds instead. If I change the default socket timeout to, e.g., 1 second, it times out after 2, and so on. The actual SOAP calls time out after 5 seconds as expected, though.
Is this a bug in PHP, expected behaviour or is there some additional setting I need to change? Using PHP 5.6.28.
Thanks!
There is an additional timeout setting that you will want to change: connection_timeout, which is passed to the SoapClient in the $options array parameter. This timeout controls how long the SoapClient waits (in seconds) to connect to the SOAP service, which is what you are looking for.
Usage:
$soapclient = new SoapClient($uri, array('connection_timeout' => 5));
Once connected, "default_socket_timeout" comes into play and controls how long the SoapClient waits for a response.
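Combining both settings, a minimal sketch (the WSDL URL is a placeholder):
// How long to wait for a response once connected.
ini_set('default_socket_timeout', 5);

// How long the SoapClient waits (in seconds) to connect to the SOAP service.
$client = new SoapClient('http://example.com/service?wsdl', [
    'connection_timeout' => 5,
]);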

How to use Guzzle to detect that we are going to get a huge HTTP response, before we even start to download it

We are still using legacy Guzzle 3.x.
From time to time, we need to send an HTTP GET request to an affiliate pixel-fire URL.
Most of the time, we expect a text response of a few KB.
However, some affiliates will send us a huge binary file, which we are not interested in.
We would like to detect such a scenario early, before we even start wasting time downloading the unnecessary binary file.
$url = 'http://speedtest.ftp.otenet.gr/files/test1Gb.db';
$client = new \Guzzle\Http\Client();

echo "1) Grab 1GB file...\n";
$s = $client->get($url,
    array(
        'timeout' => 5,         // Response timeout
        'connect_timeout' => 5, // Connection timeout
    )
);

echo "2) Grab 1GB file...\n";
// Code will "hang" here, waiting for the 1GB file to finish downloading.
$response = $s->send();
echo "Grab 1GB file done\n";

// Check if a header exists.
if ($response->hasHeader('Content-Length')) {
    $content_length = $response->getHeader('Content-Length');
    // If the response is too large, we will reject it.
}
So, our code will spend a huge amount of time executing
$response = $s->send();
If we can know that this affiliate is going to send us an unnecessary 1 GB binary file, we can give up earlier.
Is there any way to know the response size before we spend time downloading the unnecessarily huge response?
It's not a direct answer, but if you migrate to Guzzle 6, you will be able to use streaming responses.
$client = new GuzzleHttp\Client();
$response = $client->request('GET', 'URL', ['stream' => true]);
// Only headers are downloaded here.
$response->getHeader('Content-Length');
I think that Guzzle 3 supports something like this, but I'm not sure.
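A rough sketch of how the Guzzle 6 streaming approach could be used to bail out early; the 10 MB limit and $url are illustrative:
$client = new GuzzleHttp\Client();

// With 'stream' => true, only the headers are read at this point.
$response = $client->request('GET', $url, ['stream' => true]);

$length = $response->getHeaderLine('Content-Length');
if ($length !== '' && (int) $length > 10 * 1024 * 1024) {
    // Too large for our purposes: close the stream without downloading the body.
    $response->getBody()->close();
} else {
    $body = (string) $response->getBody(); // small enough, read it
}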

Gearman client NON_BLOCKING mode vs doBackground

I'm trying to find out what the GEARMAN_CLIENT_NON_BLOCKING option does.
Using this example:
<?php
$client = new GearmanClient();
$client->setOptions(GEARMAN_CLIENT_NON_BLOCKING);
$client->addServer('127.0.0.1', 4730);
var_dump("before");
$client->doNormal('queue', 'data');
var_dump("after");
The script never prints "after" if no worker listens on "queue" function.
Reading the documentation (http://gearman.info/libgearman/gearman_client_options.html), this option should allow the request to be performed in "non-blocking" mode.
I know that if I want to send a job without waiting for the response from the worker, I should use the "doBackground" method.
So, what does "non-blocking" mode mean on the client side?
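For comparison, the fire-and-forget variant mentioned in the question would look roughly like this; doBackground returns a job handle immediately, whether or not a worker is listening:
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

var_dump("before");
// Queue the job and return immediately with a job handle.
$handle = $client->doBackground('queue', 'data');
var_dump("after"); // printed right away, even with no worker on "queue"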

Guzzle asynch request not working

I'm using Guzzle that I installed via composer and failing to do something relatively straightforward.
I might be misunderstanding the documentation, but essentially what I want to do is send a POST request to a server and continue executing code without waiting for a response. Here's what I have:
$client = new \GuzzleHttp\Client(/*baseUrl, and auth credentials here*/);
$client->post('runtime/process-instances', [
    'future' => true,
    'json' => $data // is an array
]);
die("I'm done with the call");
Now let's say runtime/process-instances takes about 5 minutes to run; I won't get the die message before those 5 minutes are up, when instead I want it right after the request is sent to the server.
I don't have access to the server, so I can't have it respond before running the execution. I just need to ignore the response.
Any help is appreciated.
Things I've tried:
$client->post(/*blabla*/)->then(function ($response) {});
It is not possible in Guzzle to send a request and immediately exit. Asynchronous requests require that you wait for them to complete. If you do not, the request will not get sent.
Also note that you are using post instead of postAsync; the former is a synchronous (blocking) request. To send a POST request asynchronously, use the latter. In your code example, if you change post to postAsync the process will exit before the request is complete, but the target will not receive that request.
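A sketch of the asynchronous variant described above (the endpoint and $data come from the question):
$client = new \GuzzleHttp\Client(/*baseUrl, and auth credentials here*/);

// Returns a promise; the request is only guaranteed to have been sent
// once the promise is waited on (or otherwise resolved).
$promise = $client->postAsync('runtime/process-instances', [
    'json' => $data, // is an array
]);

// ... do other work here ...

$promise->wait(); // blocks until the request has completed
die("I'm done with the call");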
Have you tried setting a low timeout?
