I have a php application that gets requests for part numbers from our server. At that moment, we reach out to a third party API to gather pricing information to make sure we have the latest pricing for that particular request. Sometimes the third party API is slow or it might be down, so we have a database that stores the latest pricing requests for each particular part number that we can use as a fallback. I'd like to run the request to the third party API and the database in parallel using Gearman. Here is the idea:
Receive request
Through gearman, create two jobs:
Request to third party API
MySQL database lookup
Wait in a loop and return the results based on the following conditions:
If the third party API has completed return that result, return that result immediately
If an elapsed time has passed, (e.g. 2 seconds) and the third party API hasn't responded, return the MySQL lookup data
Using gearman, my thoughts were to either run the two tasks in the foreground and break out of runTasks() within the setCompleteCallback() call, or to run them in the background and check in on the two tasks within a separate loop and check in on the tasks using jobStatus().
Unfortunately, I can't get either route to work for me while still getting access to the resulting data. Is there a better way, or are there some existing examples of how someone has made this work?
I think you've described a single blocking problem, namely waiting on the results of a 3rd-party API lookup. There are two ways you can handle this from my point of view: either you abort the attempt altogether once you decide you've run out of time, or you report back to the client that you ran out of time but continue the lookup anyway, just to update your local cache in case the API happens to respond slower than you would like. I'll describe how I would go about the former, because that is easier.
From the client side:
$request = array(
    'productId' => 5,
);

$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

$results = json_decode($client->doNormal('apiPriceLookup', json_encode($request)));

if ($results && property_exists($results, 'success') && $results->success) {
    // Use the fresh data from the API lookup
} else {
    // Fall back to the local data
}
This will create a job on the job server with a function name of 'apiPriceLookup' and pass it workload data containing a product id of 5. It will wait for the results to come back and check for a success property. If it exists and is true, then the API lookup was successful.
The idea is then to set the timeout condition in the worker task, which depends entirely on how you're implementing the API lookup. If you're using cURL (or some wrapper around cURL), you can see the answer to how to detect a timeout here.
From the worker side:
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction("apiPriceLookup", "apiPriceLookup");

while ($worker->work());

function apiPriceLookup($job) {
    $payload = json_decode($job->workload());

    try {
        $results = [
            'data' => PerformApiLookupForProductId($payload->productId),
            'success' => true,
        ];
    } catch (Exception $e) {
        $results = ['success' => false];
    }

    return json_encode($results);
}
This just creates a GearmanWorker object and subscribes it to the apiPriceLookup function. It will call apiPriceLookup whenever a client submits a task to the job server. That function calls out to another function, PerformApiLookupForProductId, which should be written so as to throw an exception whenever a timeout condition occurs.
I don't think this would be considered using exceptions to control logic flow; timeout conditions generally are (or should be) exceptional events. For instance, Guzzle will throw a GuzzleHttp\Exception\RequestException when it has decided to time out.
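For illustration, here is a minimal sketch of what PerformApiLookupForProductId might look like using raw cURL. The endpoint URL and the 2-second timeout are assumptions, not part of the original setup:

// Hypothetical lookup: the endpoint URL and timeout value are assumptions.
function PerformApiLookupForProductId($productId)
{
    $ch = curl_init(sprintf('https://pricing.example.com/parts/%d', $productId));
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 2, // give up after 2 seconds
    ));

    $body  = curl_exec($ch);
    $errno = curl_errno($ch);
    $error = curl_error($ch);
    curl_close($ch);

    if ($body === false) {
        if ($errno === CURLE_OPERATION_TIMEOUTED) { // cURL error 28
            throw new Exception('Pricing API timed out');
        }
        throw new Exception('Pricing API request failed: ' . $error);
    }

    return json_decode($body, true);
}

With that in place, the worker's try/catch above returns success => false on a timeout, and the client falls back to the local data.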
Related
I'm working on a process where I have a Queue, and I start with a known unit of work. As I process the unit of work, it will result in zero-or-more (unknown) units of work that gets added to the Queue. I continue to process the queue until there's no more work to perform.
I'm working on a proof-of-concept using Guzzle where I accept a first URL to seed the queue, then process the body of the response which may result in more URLs that need to be processed. My goal is to add them to the queue and have Guzzle continue processing them until there's nothing left in the queue.
In other cases, I can define a variable as the queue, and pass it by-reference into a function so that it gets updated with new work. But in the case of Guzzle Async Pools (which I think is the most efficient way to handle this), there doesn't seem to be a clear way to update the queue in-process and have the Pool execute the requests.
Does Guzzle provide a built-in approach for updating the list of Pool requests from inside a fulfilled Promise callback?
use ArrayIterator;
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\TransferStats;
use Psr\Http\Message\ResponseInterface;

// Re-usable callback which prints the URL being requested
function onStats(TransferStats $stats) {
    echo sprintf(
        '%s (%s)' . PHP_EOL,
        $stats->getEffectiveUri(),
        $stats->getTransferTime()
    );
}

$client = new Client();

// The queue of work to be performed; entries must be promises (getAsync)
// so the pool can run them concurrently
$requests = new ArrayIterator([
    $client->getAsync('http://httpbin.org/anything', [
        'on_stats' => 'onStats',
    ])
]);

// Process the queue, which results in more work to be performed
$p = (new EachPromise($requests, [
    'concurrency' => 50,
    'fulfilled' => function (ResponseInterface $response) use ($client, &$requests) {
        $hash = bin2hex(random_bytes(10));
        $requests[] = $client->getAsync(sprintf('http://httpbin.org/anything/%s', $hash), [
            'on_stats' => 'onStats',
        ]);
    },
    'rejected' => function ($reason) {
        echo $reason . PHP_EOL;
    },
]))->promise();

// Wait for everything to finish
$p->wait(true);
My question appears to be similar to Incrementally add requests to a Guzzle 5.0 Pool (Rolling Requests), but is different in that these refer to different major versions of Guzzle.
After posting this, I was able to do more searching and found some more SO threads and GitHub Issues for Guzzle. I found this library, which appears to address the problem.
https://github.com/alexeyshockov/guzzle-dynamic-pool
I need to delay execution of one method in Laravel 5.0, or to be more specific, I need it to be executed at a specific given time. The method sends a notification through GCM to a mobile app, and I need to do this repeatedly, each time at a different time. As I found out, there is no way to intentionally delay a notification in GCM itself. I know the basics of working with cron and scheduling in Laravel, but I can't find an answer to my problem.
The method I need to execute with delay is this:
public function pushAndroid($receiver, $message, $data)
{
    $pushManager = new PushManager(PushManager::ENVIRONMENT_DEV);
    $gcmAdapter = new GcmAdapter(array(
        'apiKey' => self::GCM_API_KEY
    ));

    $androidDevicesArray = array(new Device($receiver));
    $devices = new DeviceCollection($androidDevicesArray);

    $msg = new GcmMessage($message, $data);

    $push = new Push($gcmAdapter, $devices, $msg);
    $pushManager->add($push);
    $pushManager->push();
}
The information about when (date + time) it should be executed is stored in a database table. And each notification needs to be sent only once, not repeatedly.
If you take a look at https://laravel.com/docs/5.6/scheduling you can set up something that fits your needs.
Make something along the lines of:
$schedule->call(function () {
    // Here you get the collection for the current date and time
    $notifications = YourModel::whereDate('datecolumn', \Carbon\Carbon::now());
    ...
})->everyMinute();
You can also use Queues with Delayed Dispatching if this makes more sense, since you hinted you only need to send each notification once.
ProcessJobClassName::dispatch($podcast)->delay(now()->addMinutes(10));
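For your one-off case, a minimal sketch tying the two ideas together might look like the following. The PushNotification model, its send_at/sent columns, and the SendPushNotification job are assumptions for illustration, not existing code:

// Scheduled in App\Console\Kernel::schedule(). All names below
// (PushNotification model, send_at/sent columns, SendPushNotification job)
// are hypothetical placeholders.
$schedule->call(function () {
    $due = PushNotification::where('sent', false)
        ->where('send_at', '<=', \Carbon\Carbon::now())
        ->get();

    foreach ($due as $notification) {
        // The job's handle() would call your pushAndroid() method
        SendPushNotification::dispatch($notification);
        $notification->update(['sent' => true]);
    }
})->everyMinute();

That way each row is picked up once, when its stored date and time has passed, and never sent again.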
I have a PHP function which gets called when someone sends a POST request to www.example.com/webhook. However, the external service, which I cannot control, sometimes calls this URL twice in rapid succession, which messes with my logic, since the webhook persists data in the database and that takes a few ms to complete.
In other words, when the second request comes in (which cannot be ignored), the first request is likely not completed yet; however, I need these requests to be processed in the order they came in.
So I've created a little hack in Laravel which should "throttle" the execution with 5 seconds in between. It seems to work most of the time; however, due to an error in my code or some other oversight, this solution does not work every time.
function myWebhook() {
    // Check the cached timestamp (defaults to 0) and compare it with the current time.
    while (Cache::get('g2a_webhook_timestamp', 0) + 5 > Carbon::now()->timestamp) {
        // Postpone execution.
        sleep(1);
    }

    // Create a cache value (file storage) that stores the current timestamp for 1 minute.
    Cache::put('g2a_webhook_timestamp', Carbon::now()->timestamp, 1);

    // Execute rest of code ...
}
Does anyone perhaps have a watertight solution for this issue?
You have essentially designed your own simplified queue system, which is the right approach, but you can make use of the native Laravel queue for a more robust solution to your problem.
Define a job, e.g: ProcessWebhook
When a POST request is received at /webhook, queue the job
The Laravel queue worker will process one job at a time[1] in the order they're received, ensuring that no matter how many requests come in, they'll be processed one by one and in order.
The implementation of this would look something like this:
Create a new Job, e.g: php artisan make:job ProcessWebhook
Move your webhook processing code into the handle method of the job, e.g:
// Inside the generated ProcessWebhook job class
public $data;

public function __construct($data)
{
    $this->data = $data;
}

public function handle()
{
    Model::where('x', 'y')->update([
        'field' => $this->data->newValue
    ]);
}
Modify your Webhook controller to dispatch a new job when a POST request is received, e.g:
public function webhook(Request $request)
{
    // Decode the raw request body so the job can access it as an object
    $data = json_decode($request->getContent());
    ProcessWebhook::dispatch($data);
}
Start your queue worker, php artisan queue:work, which will run in the background processing jobs in the order they arrive, one at a time.
That's it, a maintainable solution to processing webhooks in order, one-by-one. You can read the Queue documentation to find out more about the functionality available, including retrying failed jobs which can be very useful.
[1] Laravel will process one job at a time per worker. You can add more workers to improve queue throughput for other use cases but in this situation you'd just want to use one worker.
In my app I'm using server-sent events and have the following situation (pseudo code):
$response = new StreamedResponse();
$response->setCallback(function () {
    while (true) {
        // 1. $data = fetchData();
        // 2. echo "data: $data";
        // 3. sleep(x);
    }
});
$response->send();
My SSE Response class accepts a callback to gather the data (step 1), which actually performs a database query. Now to my problem: to avoid polling the database every X seconds, I want to make use of Doctrine's onFlush event to set a flag that the corresponding entity has actually changed, and then check that flag within the fetchData callback. Normally, I would do this by setting a flag on the current user session, but as the streaming loop constantly writes data, the session cannot be accessed within the callback. Does anybody have an idea how to resolve this problem?
BTW: I'm using Symfony 3.3 and Doctrine 2.5 - thanks for any help!
I know that this question is from a long time ago, but here's a suggestion:
Use shared memory (the PHP shm_*() functions). That way your flag isn't tied to a specific session.
Be sure to lock and unlock around access to the shared memory (I usually use a semaphore).
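As a rough illustration, here is a minimal sketch assuming a fixed integer key for the segment and semaphore, and a single "changed" flag stored at variable slot 1 (both key values are arbitrary assumptions):

define('SSE_SHM_KEY', 0xC0DE); // assumption: any agreed-upon integer key
define('SSE_FLAG_KEY', 1);     // variable slot inside the segment

function setChangedFlag($changed)
{
    $sem = sem_get(SSE_SHM_KEY);
    sem_acquire($sem);                         // lock
    $shm = shm_attach(SSE_SHM_KEY, 1024);
    shm_put_var($shm, SSE_FLAG_KEY, $changed); // write the flag
    shm_detach($shm);
    sem_release($sem);                         // unlock
}

function hasChangedFlag()
{
    $sem = sem_get(SSE_SHM_KEY);
    sem_acquire($sem);
    $shm = shm_attach(SSE_SHM_KEY, 1024);
    $changed = shm_has_var($shm, SSE_FLAG_KEY) ? shm_get_var($shm, SSE_FLAG_KEY) : false;
    shm_detach($shm);
    sem_release($sem);
    return $changed;
}

Your onFlush listener would call setChangedFlag(true), and the fetchData() callback would check hasChangedFlag() (resetting it after reading) instead of querying the database on every loop iteration.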
I am trying to transfer an ongoing call to another worker, but before I can transfer the caller I first need to put the caller on hold and call the desired worker if he/she is available. The problem is that when I call the desired worker, the caller automatically hangs up.
public function transferToAgent(){
    $client = $this->init_client();
    $call = $client->account->calls->get($_POST["CallSid"]);
    $call->update(array(
        "Url" => HTTP_BASE_URL."agent/call_controls/forward_agent?data=".$_POST['agentname'],
        "Method" => "POST"
    ));
}

public function forward_agent(){
    $agentname = $_GET['data'];
    $this->gabbyvilletwilio->AgentTransfer($agentname);
}
And this is the code where I call the other agent:
function AgentTransfer($agentname){
    $response = new Services_Twilio_Twiml;
    $response->say(
        'Your call is now being transferred to your desired agent.',
        ['voice' => 'alice', 'language' => 'en-GB']
    );

    $dial = $response->dial();
    $dial->client($agentname);

    print $response;
}
Twilio developer evangelist here.
The problem you have is that when you redirect one leg of the call, the other leg becomes disconnected: it has no further actions to perform in its TwiML, so it hangs up because, from its point of view, the call is complete.
There are a number of ways to get around this. As you are trying for what sounds like a warm transfer from one agent to another, then I recommend you take a read through this tutorial on warm transfers in PHP. It is written for Laravel, but I'm sure you could translate to Codeigniter.
In brief though, the tutorial uses <Conference> rather than connecting callers with <Number> so that the connection isn't broken. Then, instead of redirecting a call, you can put the caller on hold and invite another agent into the conference, unholding and transferring the caller when both agents are ready.
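To give a rough idea (not the tutorial's exact code), the TwiML returned to each leg might place it into a named conference instead of dialling the agent's <Client> directly. The room-naming convention below is an assumed placeholder:

// Sketch: both the caller's leg and the agent's leg are sent TwiML like this,
// joining them in the same named conference room, so neither leg is ever left
// without instructions. The room name ('transfer-' . CallSid) is an assumption.
$response = new Services_Twilio_Twiml;
$dial = $response->dial();
$dial->conference('transfer-' . $_POST['CallSid'], array(
    'startConferenceOnEnter' => 'true',
    'endConferenceOnExit'    => 'false',
));
print $response;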
Let me know if that helps.