How do I use StreamedResponse to render template view in Symfony2 - php

I am trying to use StreamedResponse to output progress to my index page in Symfony2.
The code below does show my progress on the API calls as they occur, but I am having trouble rendering the streamed information in an actual view. Right now it just outputs plain text at the top of the page, then renders the view when it's all complete.
I don't want to return the final array and close the function until everything is loaded, but I can't seem to get a regular Twig template to show while I output the progress.
I have tried using render, but nothing seems to truly output that view file to the screen unless I return.
public function indexAction($countryCode)
{
    // anywhere from five to fifteen api calls are going to take place
    foreach ($Widgets as $Widget) {
        $response = new StreamedResponse();
        $curlerUrl = $Widget->getApiUrl()
            . '?action=returnWidgets'
            . '&data=' . urlencode(serialize(array(
                'countryCode' => $countryCode
            )));
        $requestStartTime = microtime(true);
        $curler = $this->get('curler')->curlAUrl($curlerUrl);
        $curlResult = json_decode($curler['body'], true);
        if (isset($curlResult['data'])) {
            // do some processing on the data
        }
        $executionTime = microtime(true) - $requestStartTime;
        $response->setCallback(function() use ($Widget, $executionTime) {
            flush();
            sleep(1);
            var_dump($Widget->getName());
            var_dump($executionTime);
            flush();
        });
        $response->send();
    }
    // rest of indexAction with a return statement
    return array(
        // all the vars my template will need
    );
}
Also, another important detail is that I am trying to render everything to Twig, and there seem to be some interesting issues with that.

As I understand it, you only get one chance to output something to the browser from the server (PHP/Twig); after that, it's up to JavaScript to make any further changes (like updating a progress bar).
I'd recommend using multi-cURL to perform all 15 requests asynchronously. This effectively makes the total request time equal to the slowest request, so you can serve your page much faster and maybe eliminate the need for the progress bar.
// Create the multiple cURL handle
$mh = curl_multi_init();
$handles = array();
$responses = array();

// Create and add the cURL handles to the $mh
foreach ($widgets as $widget) {
    $ch = $curler->getHandle($widget->getURL()); // Code that returns a cURL handle
    $handles[] = $ch;
    curl_multi_add_handle($mh, $ch);
}

// Execute the requests
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Get the request content
foreach ($handles as $handle) {
    $responses[] = curl_multi_getcontent($handle);
    // Remove and close the handles
    curl_multi_remove_handle($mh, $handle);
    curl_close($handle);
}
curl_multi_close($mh);

// Do something with the responses
// ...
Ideally, this would be a method of your Curler service.
public function processHandles(array $widgets)
{
    // most of the above
    return $responses;
}
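Calling it from the controller might then look like this (a sketch: the curler service id comes from the question, but processHandles and the template path are assumptions):

// Fetch all widget responses in parallel, then render the page once
// with the complete data set.
$responses = $this->get('curler')->processHandles($Widgets);

return $this->render('AppBundle:Widget:index.html.twig', array(
    'responses' => $responses,
));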

You may implement all of the logic in the setCallback method, so consider this code:
public function indexAction($countryCode)
{
    $Widgets = [];
    $response = new StreamedResponse();
    $curlerService = $this->get('curler');

    $response->setCallback(function() use ($Widgets, $curlerService, $countryCode) {
        foreach ($Widgets as $Widget) {
            $curlerUrl = $Widget->getApiUrl()
                . '?action=returnWidgets'
                . '&data=' . urlencode(serialize(array(
                    'countryCode' => $countryCode
                )));
            $requestStartTime = microtime(true);
            $curler = $curlerService->curlAUrl($curlerUrl);
            $curlResult = json_decode($curler['body'], true);
            if (isset($curlResult['data'])) {
                // do some processing on the data
            }
            flush();
            sleep(1);
            var_dump($Widget->getName());
            var_dump(microtime(true) - $requestStartTime);
            flush();
        }
    });

    // Directly return the streamed response object
    return $response;
}
For further reading, see this and this article.
Hope this helps.

Related

PHP CURL multi-threaded and single-threaded function help. How do I do this?

I found a function here: http://archevery.blogspot.com/2013/07/php-curl-multi-threading.html
I am using it to send an array of URLs to run and process as quickly as possible via multi-threaded cURL requests. This works great.
Some of the URLs I send require that they be processed in order, not at the same time, but in sequence.
How can I achieve this?
Example:
URL-A URL-B URL-C --> All fire off at the same time
URL-D URL-E --> Must wait for URL-D to finish before URL-E is triggered.
My purpose is a task management system that allows me to add PHP applications as "tasks" in the database. I have a header/detail relationship between the tasks, so a task with one header and one detail can be sent off multi-threaded, but a task with one header and multiple details must be sent off in the order of the detail tasks.
I can do this by calling cURL requests in a loop, but I want the base request (the first task of a sequence) to also fire off as part of the multi-threaded function. I don't want to have to wait for all sequential tasks to pile up and process in order. In other words, the first task of each sequence should be multi-threaded, but tasks within a sequence need to wait for the previous task to complete before moving to the next.
I tried this function that I send multiple tasks to, but it waits for each task to finish before moving on to the next. I need to somehow combine the multi-threaded function from the URL above with this one.
Here is my multithreaded curl function:
function runRequests($url_array, $thread_width = 10) {
    $threads = 0;
    $master = curl_multi_init();
    $curl_opts = array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS      => 5,
        CURLOPT_CONNECTTIMEOUT => 15,
        CURLOPT_TIMEOUT        => 15,
    );
    $results = array();
    $count = 0;
    foreach($url_array as $url) {
        $ch = curl_init();
        $curl_opts[CURLOPT_URL] = $url; // add the URL to the shared defaults rather than overwriting them
        curl_setopt_array($ch, $curl_opts);
        curl_multi_add_handle($master, $ch); // push URL for single rec send into curl stack
        $results[$count] = array("url" => $url, "handle" => $ch);
        $threads++;
        $count++;
        if($threads >= $thread_width) { // start running when stack is full to width
            while($threads >= $thread_width) {
                //usleep(100);
                while(($execrun = curl_multi_exec($master, $running)) === -1){}
                curl_multi_select($master);
                // a request was just completed - find out which one and remove it from stack
                while($done = curl_multi_info_read($master)) {
                    foreach($results as &$res) {
                        if($res['handle'] == $done['handle']) {
                            $res['result'] = curl_multi_getcontent($done['handle']);
                        }
                    }
                    curl_multi_remove_handle($master, $done['handle']);
                    curl_close($done['handle']);
                    $threads--;
                }
            }
        }
    }
    do { // finish sending remaining queue items when all have been added to curl
        //usleep(100);
        while(($execrun = curl_multi_exec($master, $running)) === -1){}
        curl_multi_select($master);
        while($done = curl_multi_info_read($master)) {
            foreach($results as &$res) {
                if($res['handle'] == $done['handle']) {
                    $res['result'] = curl_multi_getcontent($done['handle']);
                }
            }
            curl_multi_remove_handle($master, $done['handle']);
            curl_close($done['handle']);
            $threads--;
        }
    } while($running > 0);
    curl_multi_close($master);
    return $results;
}
And here is the single-threaded cURL function:
function runSingleRequests($url_array) {
    foreach($url_array as $url) {
        // Initialize a cURL session.
        $ch = curl_init();
        // Page contents not needed.
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
        // Grab the URL and pass it to the variable.
        curl_setopt($ch, CURLOPT_URL, $url);
        // Process the request.
        $result = curl_exec($ch);
        // Close the session.
        curl_close($ch);
    }
}
Both take an array of URLs as their input.
I currently have an array of all single tasks and another array of all multiple tasks with a "header id" that lets me know what header task each detail task is part of.
Any help on theory or code would be most appreciated.
Thanks!
Why don't you use a rudimentary task scheduler to schedule your requests and follow-ups, instead of running everything at once?
See it in action: https://ideone.com/suTUBS
<?php
class Task
{
    protected $follow_up = [];
    protected $task_callback;

    public function __construct($task_callback)
    {
        $this->task_callback = $task_callback;
    }

    public function addFollowUp(Task $follow_up)
    {
        $this->follow_up[] = $follow_up;
    }

    public function complete()
    {
        foreach ($this->follow_up as $runnable) {
            $runnable->run();
        }
    }

    public function run()
    {
        $callback = $this->task_callback;
        $callback($this);
    }
}

$provided_task_scheduler_from_somewhere = function()
{
    $tasks = [];
    $global_message_thing = 'failed';
    $second_global_message_thing = 'failed';

    $task1 = new Task(function (Task $runner)
    {
        $something_in_closure = function() use ($runner) {
            echo "running task one\n";
            $runner->complete();
        };
        $something_in_closure();
    });

    /**
     * Use $global_message_thing as a reference so we can manipulate it.
     * This makes sure that the follow up on this one knows the status
     * of what happened here.
     */
    $second_follow_up = new Task(function(Task $runner) use (&$global_message_thing)
    {
        echo "second follow up on task one.\n";
        $global_message_thing = "success";
        $runner->complete();
    });

    /**
     * Just doing things in random order to show that order doesn't really
     * matter with a task scheduler, just the follow ups.
     */
    $tasks[] = $task1;
    $tasks[] = new Task(function(Task $runner)
    {
        echo "running task 2\n";
        $runner->complete();
    });

    $task1->addFollowUp(new Task(function(Task $runner)
    {
        echo "follow up on task one.\n";
        $runner->complete();
    }));
    $task1->addFollowUp($second_follow_up);

    /**
     * Adding the references to our "status" trackers here to know what to print.
     * One will still be on failed because we did nothing with it; this way we
     * know it works properly, as a control.
     */
    $second_follow_up->addFollowUp(new Task(function(Task $runner) use (&$global_message_thing, &$second_global_message_thing) {
        if ($global_message_thing === "success") {
            echo "follow up on the second follow up, three layers now, w00007!\n";
        }
        if ($second_global_message_thing === "success") {
            echo "you don't see this\n";
        }
        $runner->complete();
    }));

    return $tasks;
};

/**
 * Normally you'd use some aggregating function to build up your task list
 * or a collection of classes. I simulated that here with this callback function.
 */
$tasks = $provided_task_scheduler_from_somewhere();
foreach ($tasks as $task) {
    $task->run();
}
This way you can have nested tasks that need to follow each other, and with some clever use of closures you can pass parameters to the executing functions and to the objects outside them.
In my example the Task object itself is passed to the executing function, so the executing function can call complete() when it's done with its job.
When complete() is called, the Task determines whether it has follow-up tasks scheduled; if so, those are automatically called, and execution works its way down the chain like that.
It's a rudimentary task scheduler, but it should help you on your way to getting steps executed in the order you want.
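To tie this back to the cURL use case, here is a minimal sketch (fetchUrl is a hypothetical blocking helper and the URLs are placeholders): URL-E becomes a follow-up of URL-D, while unordered URLs stay top-level tasks.

// Hypothetical helper: a blocking single-URL fetch using cURL.
function fetchUrl($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}

// URL-E must wait for URL-D, so it is registered as a follow-up of URL-D.
$taskD = new Task(function (Task $runner) {
    fetchUrl('http://example.com/url-d'); // placeholder URL
    $runner->complete(); // fires the follow-ups only after D is done
});
$taskD->addFollowUp(new Task(function (Task $runner) {
    fetchUrl('http://example.com/url-e'); // placeholder URL
    $runner->complete();
}));

// URL-A, URL-B and URL-C have no ordering constraints, so each is a top-level task.
$tasks = [$taskD /* , $taskA, $taskB, $taskC ... */];
foreach ($tasks as $task) {
    $task->run();
}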
Here's an easier-to-follow example, from http://arguments.callee.info/2010/02/21/multiple-curl-requests-with-php/, using curl_multi_init. This family of functions allows you to combine cURL handles and execute them simultaneously.
EXAMPLE
// Build the individual requests, but do not execute them.
$ch_1 = curl_init('http://webservice.one.com/');
$ch_2 = curl_init('http://webservice.two.com/');
curl_setopt($ch_1, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch_2, CURLOPT_RETURNTRANSFER, true);

// Build the multi-curl handle, adding both $ch.
$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch_1);
curl_multi_add_handle($mh, $ch_2);

// Execute all queries simultaneously, and continue when all are complete.
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running);

// Close the handles.
curl_multi_remove_handle($mh, $ch_1);
curl_multi_remove_handle($mh, $ch_2);
curl_multi_close($mh);

// All of our requests are done; we can now access the results.
$response_1 = curl_multi_getcontent($ch_1);
$response_2 = curl_multi_getcontent($ch_2);
echo "$response_1 $response_2"; // output results
If both web services take one second to respond, we literally cut our page load time in half by running the two requests simultaneously instead of sequentially!
References: https://www.php.net/manual/en/function.curl-multi-init.php

Slim Framework: how to early close client connection

I have a long task in a Slim controller; I would like to end the output to the client early and then continue the backend processing.
$app->get("/test",function() use($app){
$app->render("page.html"); //this is the client output
$app->easlyStop(); //a slim hypothetical command to call
$task=new MyTask();
$task->longAsyncTask(); //this take a few, client don't have to wait.
});
Is there a solution with Slim?
The easiest option here is to call a method with a system call and return before it finishes:
exec('/bin/php /path/to/a/script.php > /dev/null &');
Note that this is a simplification, as PHP is request oriented, which means that a new process is started for every request, and the webserver sends the response to the user once the request is finished. You could use flush and other techniques, but these are prone to errors and depend on the webserver configuration too.
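For completeness, a minimal sketch of that pattern (the script path and the $taskId argument are placeholders; escapeshellarg guards the value):

// Kick off the long-running job in a separate PHP process and return immediately.
// The trailing & backgrounds the process; output is discarded via /dev/null.
$taskId = 42; // hypothetical identifier the background script will pick up
exec('/bin/php /path/to/a/script.php ' . escapeshellarg((string) $taskId) . ' > /dev/null 2>&1 &');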
This is a method for a Slim controller with a Json view:
$app->get('/test/', function () use ($app) {
    $app->view = new Json();
    try {
        // here the output of the Json view
        $model = ["myjsondata" => []];
        $app->render(200, $model);
    } catch (\Slim\Exception\Stop $e) {}
    // The following code is copied from Slim->run() to produce early output
    $app->response()->headers->replace(["Content-Length" => $app->response()->length()]);
    $app->response()->headers->replace(["Connection" => "close"]);
    list($status, $headers, $body) = $app->response->finalize();
    \Slim\Http\Util::serializeCookies($headers, $app->response->cookies, $app->settings);
    if (headers_sent() === false) {
        if (strpos(PHP_SAPI, 'cgi') === 0) {
            header(sprintf('Status: %s', \Slim\Http\Response::getMessageForCode($status)));
        } else {
            header(sprintf('HTTP/%s %s', $app->config('http.version'), \Slim\Http\Response::getMessageForCode($status)));
        }
        foreach ($headers as $name => $value) {
            $hValues = explode("\n", $value);
            foreach ($hValues as $hVal) {
                header("$name: $hVal", false);
            }
        }
    }
    if (!$app->request->isHead()) {
        echo $body;
    }
    // early output to client
    ob_end_flush();
    ob_flush();
    flush();
    if (session_id()) session_write_close();
    // my async job
    sleep(5);
});
I think this can easily be turned into a Slim plugin. It works only with the Json view because that is my need, but it can be used with Twig or other Slim views by capturing the output with the ob_* PHP functions instead of catching the Stop() exception.
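For a Twig-style view, a sketch of that output-buffering variant (untested; it assumes the same early-output block as in the JSON example above):

// Capture the rendered template instead of letting Slim echo it directly.
ob_start();
$app->render('page.html', $model);
$body = ob_get_clean();
// ...then send $body with the Connection: close and Content-Length headers
// exactly as in the JSON example above, flush, and continue the async job.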

Proper use of curl_multi to make parallel asynchronous calls

I found a simple class to do parallel requests:
class Requests {
    public $handle;

    public function __construct() {
        $this->handle = curl_multi_init();
    }

    public function process($urls, $callback) {
        foreach ($urls as $url) {
            $ch = curl_init($url);
            curl_setopt_array($ch, array(CURLOPT_RETURNTRANSFER => TRUE));
            curl_multi_add_handle($this->handle, $ch);
        }
        do {
            $mrc = curl_multi_exec($this->handle, $active);
            if ($state = curl_multi_info_read($this->handle)) {
                $info = curl_getinfo($state['handle']);
                $callback(curl_multi_getcontent($state['handle']), $info);
                curl_multi_remove_handle($this->handle, $state['handle']);
            }
            usleep(10000); // stop wasting CPU cycles and rest for a couple ms
        } while ($mrc == CURLM_CALL_MULTI_PERFORM || $active);
    }

    public function __destruct() {
        curl_multi_close($this->handle);
    }
}
This should be used in the following way:
$dataprocess = function($data, $info) {
    echo $data;
};
$urls = array('url1', 'url2', 'url3');
$rqs = new Requests();
$rqs->process($urls, $dataprocess);
However, it looks like not all URLs are fetched (I'd estimate that only about half of them are).
I found this note in PHP's curl_multi_exec function description:
If it returns CURLM_CALL_MULTI_PERFORM you better call it again soon, as that is a signal that it still has local data to send or remote data to receive.
So I suspect that this class returns too early, or should repeat the request in some cases. But the class checks both the curl_multi_exec output and the $active parameter, so it should work fine.
Any thoughts?
UPDATE
What I've done for the moment is put a loop around all the code in process() to execute it until all the URLs are retrieved (during debugging I saw that after each iteration the number of unloaded URLs shrinks, like 50-22-8-0).
But I changed the class dramatically: instead of a callback function I'm passing an array with two key names (one for the URL and one for storing the content). It's working for me now, but I still can't figure out how to do this in the callback-function style.
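For reference, a sketch of how process() might be restructured so that every finished transfer is drained before the loop exits (this follows the curl_multi_info_read manual; not a tested drop-in):

public function process($urls, $callback) {
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt_array($ch, array(CURLOPT_RETURNTRANSFER => TRUE));
        curl_multi_add_handle($this->handle, $ch);
    }
    do {
        $mrc = curl_multi_exec($this->handle, $active);
        // Drain ALL completed transfers, not just one per iteration.
        while ($state = curl_multi_info_read($this->handle)) {
            $info = curl_getinfo($state['handle']);
            $callback(curl_multi_getcontent($state['handle']), $info);
            curl_multi_remove_handle($this->handle, $state['handle']);
            curl_close($state['handle']);
        }
        // Block until there is activity instead of burning CPU.
        curl_multi_select($this->handle, 1.0);
    } while ($active || $mrc == CURLM_CALL_MULTI_PERFORM);
}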

Send multiple numbers SMS requests in one second PHP

I'm trying to send SMS using an API. It sends roughly one SMS per second, but I want to send multiple SMS per second using multithreading/pthreads in PHP. How is that possible, or how can I send multiple SMS requests asynchronously to the API server in the least time?
// Threads Class
class MThread extends Thread {
    public $data;
    public $result;

    public function __construct($data) {
        $this->data = $data;
    }

    public function run() {
        foreach ($this->data as $dt_res) {
            // Send the POST request with cURL
            $ch = curl_init("http://www.example.com");
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $dt_res['to']);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            $res = curl_exec($ch);
            $http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
            curl_close($ch);
            $this->result = $http_code;
        }
    }
}

// $_POST['data'] contains multiple arrays
$request = new MThread($_POST['data']);
if ($request->start()) {
    $request->join();
    print_r($request->result);
}
Any ideas would be appreciated.
You don't necessarily need to use threads to send multiple HTTP requests asynchronously. You can use non-blocking I/O; multi-cURL is suitable in this case. There are some HTTP clients with multi-cURL support.
Example (using Guzzle 6):
$client = new \GuzzleHttp\Client();

$requestGenerator = function() use ($client) {
    $uriList = ['https://www.google.com', 'http://amazon.com', 'http://github.com', 'http://stackoverflow.com'];
    foreach ($uriList as $uri) {
        $request = new \GuzzleHttp\Psr7\Request('GET', $uri);
        $promise = $client->sendAsync($request);
        yield $promise;
    }
};

$concurrency = 4;
\GuzzleHttp\Promise\each_limit($requestGenerator(), $concurrency, function(\GuzzleHttp\Psr7\Response $response) {
    var_dump($response->getBody()->getContents());
}, function(\Exception $e) {
    var_dump($e->getMessage());
})->wait();
Why do you run a foreach inside run()? When you do that, it behaves exactly like a simple function call: there is no multithreading.
So, how do you use multithreading with pthreads?
Here is the solution to your problem:
$thread = array();
foreach ($_POST['data'] as $index => $data) {
    $thread[$index] = new MThread($data);
    $thread[$index]->start();
}
You should be able to see your error with this code.
Just delete the foreach inside your run() function, use my code, and it will work.
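To collect the results, you would still join each thread after starting them all, e.g. (a sketch building on the MThread class above, where run() now handles a single $data entry):

// Start all threads first (as above), then join them so the requests
// overlap in time instead of running one after another.
foreach ($thread as $index => $t) {
    $t->join();
    print_r($t->result);
}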
It's better to use something like Beanstalk with multiple workers.

Symfony event addListener and variable Scope/State

I'm currently using stil/curl-easy to make parallel multi-cURL requests, but I'm having a hard time with scope/state:
// Init queue of requests
$queue = new \cURL\RequestsQueue;
// ... init queue options

// Set the function to be executed when a request completes
$queue->addListener('complete', function (\cURL\Event $event) {
    $response = $event->response->getContent();
    // ugly :D, get the key from the request object
    $key = explode("&key=", array_values($event->request->getOptions()->toArray())[0])[1];
    // if the response is not null, truncate it to use less space
    if ($response != null) {
        $response = str_replace(array("\r", "\n"), "", $response);
    }
    // >>>>>>>> DOESN'T WORK: $banner_holder is declared at the top of the PHP page
    array_push($banner_holder, array("key" => $key, "content" => $response));
});
How can I push the array of $key and $response outside the listener?
Thanks in advance.
Since you are using a closure, you need to specify that you want to "use" the $banner_holder array, and to import it by reference (&) so that pushes inside the listener are visible outside, like so:
$queue->addListener('complete', function (\cURL\Event $event) use (&$banner_holder) {
