Yii framework async request - php

I have an AJAX request that does 3 things:
Save the model (DB)
Send an email
Return a success or failure message.
Because this takes too long, the user can wait up to 20 seconds for the response (the success or failure message). And if the user closes the browser, whichever operation is currently running for him is cut off.
This is a bad user experience.
I want the user to submit his data to my controller and get the "success or failed" message right away, while the processing runs entirely on the server side; it should also support multiple concurrent sessions.
How can I do that?

@hakre What you suggested does not reduce the time the user waits for a response.
I found the best solution for this:
runactions extension for yii
This extension lets you run background actions from a controller.
There are several ways to use it. The best one for my case is this:
public function actionTimeConsumingProcess()
{
    if (ERunActions::runBackground())
    {
        // do all the stuff that should run in the background
        $mail->send(); // e.g. send the email here
    }
    else
    {
        // this code will be executed immediately
        // echo 'Time-consuming process has been started';
        // user->setFlash ... render ... redirect
    }
    // User information
    echo "Submit Success!";
}
And it works, but not via an AJAX request; when I make the same call through AJAX it fails for some reason.
So I used:
ERunActions::httpPOST($this->createAbsoluteUrl('Form/Submit'), array('to'=>'mail@domain.com', 'subject'=>'This is the subject'));
And it works great, but it is not the ideal solution for this.

The problem with the runactions extension is that it only works with unauthenticated users. You may use the Yii backjob extension instead, but that requires some kind of non-blocking session storage, such as CDbHttpSession.
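For reference, switching Yii 1.1 to DB-backed session storage is just a configuration change (a sketch; the connection ID must match your application's DB component):
// protected/config/main.php -- swap the session component for CDbHttpSession
'components' => array(
    'session' => array(
        'class' => 'CDbHttpSession',
        'connectionID' => 'db',            // your existing DB connection component
        'autoCreateSessionTable' => true,  // let Yii create the session table on first use
    ),
),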
I am still looking for the right way to do this...
I found that one of the best options is to run a server background job:
/usr/bin/php -q longProcess.php > /dev/null 2>&1 &
However, I still need to know how to pass the controller and action on the command line and get Yii to actually use them, something like:
/usr/bin/php -q yii-index.php controller action > /dev/null 2>&1 &
Update: I found out the best way is to use a Yii console application and run it as a background job. The link below helped a lot:
Yii cron jobs
I use this code now and it is working perfectly:
exec("nohup /protected/yiic mycommand /dev/null 2>&1 &")

Related

How to check if process is completed?

I have a PHP script that is currently invoked directly by a webhook. The webhook method was fine up until this past week, when the volume of requests became problematic for API rate limits.
What I have been trying to do is make a second PHP file ($path/webhook-receiver.php) to be invoked by webhooks only when there isn't already a process running. I'll be using the approach user webb recommended: the invoked script ($path/event-finance-reporting.php) creates a file as its first action and deletes that file as its last.
Before invoking the script, the automation will check the directory to make sure it is empty; otherwise it will kick back an error telling the user to wait until the current job is completed before submitting another one.
The problem I'm running into now is that both $command1 and $command2 end up invoking $path/webhook-receiver.php instead of $path/event-finance-reporting.php.
$command1 = "php -f $path/event-finance-reporting.php 123456789";
$command2 = "/usr/bin/php -q -f $path/event-finance-reporting.php 123456789";
Does anyone know why that would be?
The goal is to have only one instance of event-finance-reporting.php running at a time. One strategy is to create a unique lockfile, skip the run if it exists, and delete it when the run finishes, e.g.:
$lockfilepath = '.../event-finance-reporting.lock';
if (file_exists($lockfilepath)) {
    print("try again later");
    exit();
}
touch($lockfilepath);
...
// event-finance-reporting.php code
...
unlink($lockfilepath);
You could also do something more complicated in the if, such as checking the age of the lockfile, then deleting and ignoring it if it was left behind a while ago by a crashed instance of event-finance-reporting.php.
With this strategy, you also don't need two separate PHP scripts.
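A sketch of that age check (the lockfile path and the 15-minute threshold are assumptions):
$lockfilepath = '/tmp/event-finance-reporting.lock';
$maxage = 15 * 60; // assume no healthy run ever takes longer than 15 minutes

if (file_exists($lockfilepath)) {
    if (time() - filemtime($lockfilepath) < $maxage) {
        print("try again later");
        exit();
    }
    // stale lockfile, probably left behind by a crashed run: clear it and continue
    unlink($lockfilepath);
}
touch($lockfilepath);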

Yii PHP Executing asynchronous background request

Hi, I'm trying to execute a long-running request (action) in the background.
public function actionRequest($id)
{
    // execute very long process here in background but continue redirect
    Yii::app()->user->setFlash('success', "Currently processing your request you may check it from time to time.");
    $this->redirect(array('index', 'id' => $id));
}
What I'm trying to achieve is to NOT have the user wait for the request to be processed, since it generally takes 5-10 minutes and usually hits a timeout; even if I set the timeout longer, waiting 5-10 minutes isn't a good user experience.
So I want to return to the page immediately, notifying the user that his/her request is being processed, while he/she can still browse and do other stuff in the application, and can then come back to the page and see that the request was processed.
I've looked into the Yii extension backjob. It works: the redirect is executed immediately (somehow a background request), but when doing other things, like navigating the site, pages don't load; it seems the request is still pending, and I cannot continue using the application until it is finished.
A similar extension, runactions, promises the same thing, but I could not even get it to work; it says it 'touches a url', like a fire-and-forget job, but nothing happens.
I've also looked into message queuing services like Gearman and RabbitMQ, but they are really highly technical; I couldn't even install Gearman on my Windows machine, so "farming" services won't work for me. Some answers to background processing mention CRON and AJAX, but those don't sound too good, plus they come with a lot of issues.
Is there any other workaround for asynchronous background processing? I've really sought hard for this, and I'm really not looking for advanced/sophisticated solutions like "farming out work to several machines" and the like. Thank you very much!
If you want to be able to run asynchronous jobs via Yii, you may not have a choice but to dabble with some AJAX in order to retrieve the status of the job asynchronously. Here are high-level guidelines that worked for me. Hopefully this will assist you in some way!
Setting up a console action
To run background jobs, you will need to use Yii's console component. Under /protected/commands, create a copy of your web controller that has your actionRequest() (e.g. /protected/commands/BulkCommand.php).
This should allow you to go into your /protected folder and run yiic bulk request.
Keep in mind that if you have not created a console application before, you will need to set up its configuration similar to how you've done it for the web application. A straight copy of /protected/config/main.php into /protected/config/console.php should do 90% of the job.
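A skeleton of that command file might look like this (a sketch; the parameter names and the action body are assumptions):
// protected/commands/BulkCommand.php
class BulkCommand extends CConsoleCommand
{
    // invoked as: yiic bulk request --arg1=... --arg2=... --arg3=...
    public function actionRequest($arg1 = null, $arg2 = null, $arg3 = null)
    {
        // same long-running logic as the web controller's actionRequest(),
        // minus the setFlash()/redirect() calls, which mean nothing on the console
    }
}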
Customizing an extension for running asynchronous console jobs
What has worked for me is using a combination of two extensions: CConsole and TConsoleRunner. TConsoleRunner uses popen to run shell scripts, which worked for me on Windows and Ubuntu. I simply merged its run() code into CConsole as follows:
public function popen($shell, $redirectOutput = '')
{
    $shell = $this->resolveCommandLine($shell, false, $redirectOutput);
    $ret = self::RETURN_CODE_SUCCESS;
    if (!$this->displayCommands) {
        ob_start();
    }
    if ($this->isWindows()) {
        pclose(popen('start /b ' . $shell, 'r'));
    }
    else {
        pclose(popen($shell . ' > /dev/null &', 'r'));
    }
    if (!$this->displayCommands) {
        ob_end_clean();
    }
    return $ret;
}

protected function isWindows()
{
    return PHP_OS == 'WINNT' || PHP_OS == 'WIN32';
}
Afterwards, I changed CConsole's runCommand() to the following:
public function runCommand($command, $args, $async = false, &$outputLines = null, $executor = 'popen')
{
    ...
    switch ($executor) {
        ...
        case 'popen':
            return $this->popen($shell);
        ...
    }
}
Running the asynchronous job
With the above set up, you can now use the following snippet of code to call the yiic bulk request command we created earlier.
$console = new CConsole();
$console->runCommand('bulk request', array(
    '--arg1="argument"',
    '--arg2="argument"',
    '--arg3="argument"',
));
You would insert this in your original actionRequest().
Checking up on the status
Unfortunately, I'm not sure what kind of work your bulk request is doing. For myself, I was gathering a whole bunch of files and putting them in a folder. I knew going in how many files to expect, so I could easily create a controller action that checks how many files have been created so far and reports the status as a simple percentage (done divided by expected).
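To illustrate, a sketch of such a status action (the folder layout and the expected file count are assumptions):
// sketch: report progress as files created so far / files expected
public function actionStatus($id)
{
    $dir = Yii::app()->getRuntimePath() . '/bulk/' . (int) $id; // assumed output folder
    $expected = 100;                                            // known ahead of time
    $files = glob($dir . '/*');
    $done = $files ? count($files) : 0;
    echo CJSON::encode(array('progress' => round(100 * $done / $expected)));
}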

php background script

Here is my problem: I have an application that allows users to synchronize their data with social networks. The problem is that each synchronization can take up to 10 seconds. So I would like to trigger the synchronization as a "background" task, and if possible parallelize all the synchronizations.
Currently, my script looks like:
$user = user::login($username, $password);
/* Do some treatments, display the home page */
$user->synchronize();
/* END OF LOGIN SCRIPT */
and in my user.class.php, I have something like
public function synchronize()
{
    $Networks = socialNetworks::getByUserId($this->userid);
    foreach ($Networks as $n) $n->synchronize();
}
and finally in socialNetworks.class.php, I have implemented all the synchronization scripts for each social network (fb, linkedin, google+, ...).
Note that I don't need to wait for the result of the synchronization when logging in.
So I have 2 problems here :
when calling $user->synchronize(); the login script is blocked until the end of the loop
the loop itself is very long, and I would like to launch parallel "threads", to make it faster when calling the synchronize() method (I have some Ajax triggering this action).
What would you suggest to optimize this ?
Thanks.
One option is to use Gearman and offload this work to its workers.
You basically create a PHP script that does some work. Then you tell it to do that work from your login script, and the work will be done in the background outside of the request.
Here is a blog post about asynchronous processing with PHP:
http://robert.accettura.com/blog/2006/09/14/asynchronous-processing-with-php/
<?php
include 'AsynchronousProcessing.php';
launchBackgroundProcess('php /path/to/task1.php');
launchBackgroundProcess('php /path/to/task2.php');
$class->SomeWork();
$class->SomeOtherWork();
?>
There are 2 ways:
Use exec() to make multiple syscalls, like exec("php /path/to/script.php 1 > /dev/null 2>/dev/null &"). Note that a CLI script receives its arguments through $argv rather than a ?networkId=1 query string; see the sketch after this list.
Use the pcntl_fork() function, which works only on Unix-like systems.
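A sketch of the first approach applied to the synchronize() loop above (sync_one.php is a hypothetical CLI script that reads the network id from $argv[1]):
public function synchronize()
{
    $Networks = socialNetworks::getByUserId($this->userid);
    foreach ($Networks as $n) {
        // one detached background process per network; login returns immediately
        exec('php /path/to/sync_one.php ' . escapeshellarg($n->id) . ' > /dev/null 2>&1 &');
    }
}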

cron alternative php tip

I've got a file, fetch.php.
For now I am manually calling it from a bookmark once a day to execute the script.
I want to set up something like a cron job that just goes to fetch.php once a day.
Is it possible?
The fetch.php file has HTML and a bunch of JavaScript, which is why cron doesn't work.
Thanks... if you need clarification let me know.
In the crontab you can run it with a lynx command.
Example:
lynx -dump http://your.website.com > /dev/null
EDIT:
The problem is that lynx does not run JavaScript,
so you need to find a command-line web browser that supports JavaScript, which is not quite simple.
Check 'links' or 'w3m'.
Read more about it in this post:
http://www.linuxquestions.org/questions/linux-software-2/is-there-a-browser-command-line-tool-for-testing-javascript-websites-359260/
Since JavaScript has to be executed, it sounds like it has to be run in a browser. Unless you're willing to add a lot more complexity, I'd go with one of these routes:
Use cron or Task Scheduler to launch the page in a web browser. Use javascript to have the page close itself (its tab) after it's loaded.
cron: 0 9 * * * /usr/bin/firefox "http://localhost/cron.php" (runs once daily at 9:00; may or may not work)
JavaScript: function all_done() { window.close(); }
Leave a tab open 24/7 and use ReloadEvery or something similar.

PHP: How to return information to a waiting script and continue processing

Suppose there are two scripts, Requester.php and Provider.php. Requester requires processing from Provider and makes an HTTP request to it (Provider.php?data="data"). In this situation, Provider quickly finds the answer, but to maintain the system it must perform various updates throughout the database. Is there a way to immediately return the value to Requester, and then continue processing in Provider?
Pseudocode:
Provider.php
{
$answer = getAnswer($_GET['data']);
echo $answer;
//SIGNAL TO REQUESTER THAT WE ARE FINISHED
processDBUpdates();
return;
}
You can flush the output buffer with the flush() command.
Read the comments in the PHP manual for more info.
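A minimal sketch of that pattern, reusing getAnswer() and processDBUpdates() from the pseudocode above (fastcgi_finish_request() is only available under PHP-FPM):
// Provider.php sketch: answer first, keep working after the client is gone
ignore_user_abort(true); // keep running even if the Requester disconnects
$answer = getAnswer(isset($_GET['data']) ? $_GET['data'] : '');

header('Content-Length: ' . strlen($answer));
header('Connection: close');
echo $answer;

// push everything out so the Requester can stop waiting
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush();
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // PHP-FPM: fully closes the connection
}

processDBUpdates(); // the slow database work continues server-side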
I use this code for running a process in the background (works on Linux).
The process runs with its output redirected to a file.
That way, if I need to display status on the process, it's just a matter of writing a small amount of code to read and display the contents of the output file.
I like this approach because it means you can completely close the browser and easily come back later to check on the status.
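A sketch of that approach (the worker script and log path are assumptions):
// launch worker.php detached, with its output captured in a log file
$log = '/tmp/worker-' . uniqid() . '.log';
exec('/usr/bin/php /path/to/worker.php >> ' . escapeshellarg($log) . ' 2>&1 &');

// later (even after closing the browser), a status page can just show the log:
// echo nl2br(htmlspecialchars(file_get_contents($log)));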
You basically want to signal the end of one process (return to the original Requester.php) and spawn a new process (finish Provider.php). There is probably a more elegant way to pull this off, but I've managed this a couple of different ways. All of them basically result in exec-ing a command in order to shell off the second process.
adding the following > /dev/null 2>&1 & to the end of your command will allow it to run in the background without inhibiting the actual execution of your current script
Something like the following may work for you:
exec("wget -O - \"$url\" > /dev/null 2>&1 &");
-- though you could do it as a command line PHP process as well.
You could also save the information that needs to be processed and handle the remaining processing on a cron job that re-creates the same sort of functionality without the need to exec.
I think you'll need the Provider to send the data (be sure to flush), and then on the Requester, use fopen/fread to read an expected amount of data, so you can drop the connection to the Provider and continue. If you don't specify an amount of data to expect, the Requester would likely sit there waiting for the Provider to close the connection, which probably doesn't happen until the end of its run (i.e. after all the secondary work-intensive tasks are complete). You'll need to try out a few POCs.
Good luck.
Split the Provider in two: ProviderCore and ProviderInterface. In ProviderInterface, just do the "quick and easy" part and also save a flag in the database indicating that the recent request hasn't been processed yet. Run ProviderCore as a cron job that searches for that flag and completes the processing. If there's nothing to do, ProviderCore will terminate and retry in (say) 2 minutes.
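A rough sketch of that split, assuming an existing PDO connection $pdo (the table and column names are assumptions):
// ProviderInterface.php: quick part only, then flag the slow part for later
$answer = getAnswer($_GET['data']);
echo $answer;
$stmt = $pdo->prepare('INSERT INTO pending_updates (data, processed) VALUES (?, 0)');
$stmt->execute(array($_GET['data']));

// ProviderCore.php, run from cron every 2 minutes: finish the flagged work
foreach ($pdo->query('SELECT id, data FROM pending_updates WHERE processed = 0') as $row) {
    processDBUpdates($row['data']); // pass the saved data to the heavy update step
    $done = $pdo->prepare('UPDATE pending_updates SET processed = 1 WHERE id = ?');
    $done->execute(array($row['id']));
}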
I'm going out on a limb here, but perhaps you should try cURL or use a socket to update the requester?
You could start another PHP process in Provider.php using pcntl_fork():
Provider.php
{
    // Fork the process
    $pid = pcntl_fork();
    // You are now running both a daemon process and the parent process
    // through the rest of the code below
    if ($pid > 0) {
        // PARENT process
        $answer = getAnswer($_GET['data']);
        echo $answer;
        // SIGNAL TO REQUESTER THAT WE ARE FINISHED
        return;
    }
    if ($pid == 0) {
        // DAEMON (child) process
        processDBUpdates();
        return;
    }
    // If you get here ($pid == -1), the daemon process failed to start
    handleDaemonErrorCondition();
    return;
}
