forking php process and tying to specific web user - php

I have a web app with a few processes that can take up to 10 minutes to run. Sometimes these processes are triggered by a user, and the user needs the output as it is produced.
For instance, the user is looking for a few records that they need. They click the button to retrieve the records (this is the part that can take 10 minutes). They can continue to work on other things, but when they click back to view the results, the page is updated as the records are downloaded into the system.
Right now, the user is locked out while the process runs. I know about pcntl_fork() to fork a child process so that the user doesn't have to wait until the long process completes.
I was wondering if it's possible to tie that forked process to the specific user that triggered the request via a $_SESSION variable, so that I can update the user when the process is complete. Also, is this the best way to keep a user updated on a long-running process?

I think Gearman fits your needs. Look at this sample code, taken from the documentation:
<?php
/* create our object */
$gmclient = new GearmanClient();

/* add the default server */
$gmclient->addServer();

/* run the reverse client job in the background */
$job_handle = $gmclient->doBackground("reverse", "this is a test");

if ($gmclient->returnCode() != GEARMAN_SUCCESS) {
    echo "bad return code\n";
    exit;
}

$done = false;
do {
    sleep(3);
    $stat = $gmclient->jobStatus($job_handle);
    if (!$stat[0]) { // the job is no longer known to the server, so it has finished
        $done = true;
    }
    echo "Running: " . ($stat[1] ? "true" : "false") .
         ", numerator: " . $stat[2] . ", denominator: " . $stat[3] . "\n";
} while (!$done);
echo "done!\n";
?>
If you store the $job_handle in the session, you can adapt the sample to make a control script.
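For example, a status endpoint the browser can poll could look roughly like this (a minimal sketch; storing the handle under $_SESSION['job_handle'] and the status.php name are my own assumptions, not part of the Gearman docs):
<?php
// status.php -- a sketch of a polling endpoint; assumes the handle returned
// by doBackground() was stored in $_SESSION['job_handle'] at submit time.
session_start();

$gmclient = new GearmanClient();
$gmclient->addServer();

if (empty($_SESSION['job_handle'])) {
    echo json_encode(array('status' => 'none'));
    exit;
}

$stat = $gmclient->jobStatus($_SESSION['job_handle']);
if (!$stat[0]) {
    // The server no longer knows the handle: the job has finished
    unset($_SESSION['job_handle']);
    echo json_encode(array('status' => 'done'));
} else {
    // $stat[2] / $stat[3] are the numerator/denominator reported by the worker
    echo json_encode(array(
        'status'   => 'running',
        'progress' => $stat[3] ? $stat[2] / $stat[3] : 0,
    ));
}
?>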

Related

wp_schedule_single_event() not creating wp-cron job but executed successfully

I am using the wp_schedule_single_event() function to create a wp-cron job that sends an email to the specified user at the specified time.
Mostly this wp-cron job is successfully created and the users get informed in time. But sometimes it just doesn't work.
What's especially strange is that wp_schedule_single_event() always returns true (which means it executed successfully) even when the wp-cron job isn't created (I check that with the WP Crontrol plugin).
My code (write_log: custom function to log the given strings, $time: the corresponding timestamp):
write_log('User ' . get_current_user_id() . ' now tries to create the addProductsExpired cron job with timestamp: ' . $time);
$success = wp_schedule_single_event($time, 'hook_addProductsExpired', array(get_current_user_id()));
if (!$success) {
    write_log('The creation failed!');
}
write_log('User ' . get_current_user_id() . ' now tries to create the sendReminderMail cron job with timestamp: ' . $time);
$success = wp_schedule_single_event($time - 60 * 60 * 24, 'hook_sendReminderMail', array(get_current_user_id()));
if (!$success) {
    write_log('The creation failed!');
}
I should also note that I have never managed to reproduce the error myself.
So far I have tried:
updating WordPress
studying the logs
executing the function with accounts of users where it previously failed (it worked on my PC and also on the user's PC in later executions)
modifying parameters in the user entry of affected users
manually executing the function with the parameters it previously failed with
rewriting and optimising the whole function
None of these worked or threw an error I could debug.
I am now using actionscheduler.org, as suggested by Terminator-Barbapapa in a comment to my question. So far I haven't experienced any issues.
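For reference, the equivalent scheduling through Action Scheduler is one call per job. A rough sketch of the two calls above (assuming the Action Scheduler library is loaded and $time holds the same timestamp as before):
// Sketch: the same two jobs scheduled through Action Scheduler instead of wp-cron
as_schedule_single_action($time, 'hook_addProductsExpired', array(get_current_user_id()));
as_schedule_single_action($time - DAY_IN_SECONDS, 'hook_sendReminderMail', array(get_current_user_id()));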

Request to launch background php script

I'm currently working on an internal website displaying a lot of statistics, and some pages or ajax scripts are extremely slow due to the large amounts of data.
What I'm looking for is a way to launch these scripts in the background with a request, and then poll with ajax requests to track the progress of the background script.
Is there any way to achieve this? I work with PHP 7.0 and an Apache 2 server (I don't have direct access to the Apache server configuration, so if possible I'm looking for a client-side option).
If anyone else is searching for a way to achieve this, here is the solution I found:
I call a script in Ajax; it forks itself and saves the PID of the child process in the database.
Then I call session_write_close() in the child process to allow the user to make new requests, and the parent process exits (without waiting for the child to end).
As the parent exits, the user receives a response to the request, and the child process continues its job.
Then in Ajax I call another script to get the progress of the worker, and finally I fetch the result and kill the child process when everything is done.
Here is the code of my worker class:
class AsyncWorker
{
    private $pid;
    private $worker;
    private $wMgr;

    public function __construct($action, $content, $params = NULL)
    {
        $this->wMgr = new WorkersManager();
        $pid = pcntl_fork(); // Process fork
        if ($pid < 0) {
            Ajax::Response(AJX_ERR, "Unable to fork the process");
        } else if ($pid == 0) { // In the child, we start the job and save the worker properties
            sleep(1);
            $this->pid = getmypid();
            $this->worker = $this->wMgr->fetchBy(array("pid" => $this->pid));
            if (!$this->worker) {
                $this->worker = $this->wMgr->getEmptyObject();
                $this->wMgr->create($this->worker);
            }
            $this->worker->setPid($this->pid);
            $this->worker->setAction($action);
            $this->worker->setContent($content);
            $this->worker->setPercent(0.00);
            $this->worker->setResult("");
            $this->wMgr->update($this->worker);
            $this->launch($params);
        } else { // In the parent, we save the child's pid to the DB and answer the request
            $this->worker = $this->wMgr->fetchBy(array("pid" => $pid));
            if (!$this->worker) {
                $this->worker = $this->wMgr->getEmptyObject();
                $this->worker->setPid($pid);
                $this->wMgr->create($this->worker);
            }
            Ajax::Response(AJX_OK, "Worker started", $this->worker->getId());
        }
    }

    // Worker job
    private function launch($params = NULL)
    {
        global $form, $_PHPPATH, $url, $session;
        session_write_close(); // Let the user make new requests while we work
        ob_start(); // Avoid writing anything to the response
        /*
        ** Some stuff specific to my app (include the worker files, etc.)
        */
        $result = ob_get_contents(); // Get what was written and save it to the DB as the result
        $this->worker->setResult($result);
        $this->worker->setPercent(100);
        $this->wMgr->update($this->worker); // Persist the result and completion percentage
        ob_end_clean();
    }
}
It's a bit tricky, but I had no choice, as I have no access to server plugins and libraries.
You can have the PHP script execute a shell/bash script, or use the exec() function for that.
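For example, a common pattern is to launch the worker detached so that exec() returns immediately (a sketch; "worker.php" and its path are placeholders):
// Launch worker.php in the background: redirecting output and appending '&'
// keeps exec() from blocking until the worker finishes
exec('php /path/to/worker.php > /dev/null 2>&1 &');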

execute external php via cron

I have a script on my shared hosting. When I execute the script, it checks if there are new members on the site. If so, the script sends a header() redirect to my Windows server with two GET parameters, and a script there executes and creates a user account for the new user. This works fine manually and for one user. However, I want to add a cron job so it runs every 15 minutes. That isn't a problem when there is one user, but if the script finds more than one user, it won't reach the later ones because of the header redirect (you can only redirect once per request). How can I fix this?
my code:
$array = $arr->invoices->invoice;
foreach ($array as $key => $value) {
    if (!order_is_active($value->id)) {
        $username_win = strtolower($value->firstname) . rand(9, 9999);
        $password_win = ucfirst(maakpass(10, TRUE, TRUE));
        if (add_user_to_db($value->id, $value->userid, $value->status, $username_win, $password_win)) {
            header('location: http://ip/adduper/?username=' . htmlspecialchars($username_win) . '&password=' . htmlspecialchars($password_win));
        } else {
            echo 'order already exists';
        }
    }
}
You can store all of the users in an array and then send the JSON-encoded string to your other server, which will json_decode() it to get back an array. It can then loop over the array and add each user.
Rather than doing a header() redirect, I'd move toward using cURL.
This will allow you to more cleanly return a success or failure status from the Windows machine. As with all remote connections, you have to account for the case when one machine can't connect to the other... and it isn't a matter of if that will happen, but when. Such is the nature of the Internet.
With a PHP header redirect, a failed connection would create all sorts of chaos.
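A rough sketch of that approach, assuming the Windows endpoint accepts the same username/password query parameters as in the question:
// Inside the foreach loop, replace the header() redirect with a cURL call,
// so every user gets provisioned and a status comes back for each request
$ch = curl_init('http://ip/adduper/?' . http_build_query(array(
    'username' => $username_win,
    'password' => $password_win,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
$response = curl_exec($ch);
if ($response === false) {
    // The machines couldn't connect; log it and retry on the next cron run
    error_log('adduper failed for ' . $username_win . ': ' . curl_error($ch));
}
curl_close($ch);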

JQuery/PHP - Breaking up a bunch of ajax requests into batches?

UPDATE: It seems I have been wasting my time to some extent, as according to http://www.browserscope.org/?category=network&v=top-d most modern browsers already limit the number of connections to a single host; six is the common limit, which suits my purposes rather well. But I guess it is still an interesting problem.
The final piece of the jigsaw for my work task is to break a list of potentially 250+ ajax requests into batches.
They are the result of the following PHP code:
<?
// print("alert(\" booya \");");
$hitlist = array();
$hitlist = urlBuilder($markets, $template);
foreach ($hitlist as $mktlist) {
    foreach ($mktlist as $id => $hit) {
        $cc = substr($id, 0, 2);
        $lc = substr($id, -4);
        echo ("$(\"#" . $cc . $lc . "\").load(\"psurl.php?server=" . $server . "&url=" . $hit . "&port=" . $port . "\");\n");
    }
}
?>
This generates a long list of jQuery .load() calls, which right now are all executed on a single click, e.g.:
$("#sesv-1").load("psurl.php?server=101.abc.com&url=/se/sv&port=80");
$("#sesv-2").load("psurl.php?server=101.abc.com&url=/se/sv/catalog/&port=80");
$("#sesv-3").load("psurl.php?server=101.abc.com&url=/se/sv/catalog/products/12345678&port=80");
$("#atde-1").load("psurl.php?server=101.abc.com&url=/at/de&port=80");
$("#atde-2").load("psurl.php?server=101.abc.com&url=/at/de/catalog/&port=80");
$("#atde-3").load("psurl.php?server=101.abc.com&url=/at/de/catalog/products/12345678&port=80");
$("#benl-1").load("psurl.php?server=101.abc.com&url=/be/nl&port=80");
$("#benl-2").load("psurl.php?server=101.abc.com&url=/be/nl/catalog/&port=80");
$("#benl-3").load("psurl.php?server=101.abc.com&url=/be/nl/catalog/products/12345678&port=80");
$("#befr-1").load("psurl.php?server=101.abc.com&url=/be/fr&port=80");
$("#befr-2").load("psurl.php?server=101.abc.com&url=/be/fr/catalog/&port=80");
$("#befr-3").load("psurl.php?server=101.abc.com&url=/be/fr/catalog/products/12345678&port=80");
Depending on circumstances it can be around 250 requests, or perhaps only 30-40. The whole purpose of the app is to warm up newly restarted app servers... so 250 requests against a fresh JVM = death!
So ideally I would like to break them up. Perhaps batching by market would be best, meaning at most 5-6 requests at a time.
Any ideas on how this can be accomplished? Is it possible in standard jQuery? I'm trying to keep the dependencies as limited as possible, so preferably without plugins!
You can use jQuery's .queue():
// Define a queue for execution
var $elem = $("#sesv-1"),
    enqueue = function (fn) { $elem.queue("status", fn); };

// Queue your requests; each handler receives a next() callback and
// passes it to .load(), so the queue only advances when the load completes
enqueue(function (next) {
    $aElem.load("url", next);
});
enqueue(function (next) {
    $otherElem.load("url", next);
});

// Execute the queue
$elem.dequeue("status");
You can create as many queues as you need (most probably one per market), then enqueue your requests.
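On the PHP side, the generator from the question could emit one queue per market instead of bare .load() calls. A sketch, assuming $hitlist is keyed by market name (adapt the key derivation to the real structure):
<?
// Emit one jQuery queue per market: markets still load in parallel,
// but each market issues only one request at a time
foreach ($hitlist as $mkt => $mktlist) {
    foreach ($mktlist as $id => $hit) {
        $cc = substr($id, 0, 2);
        $lc = substr($id, -4);
        echo "$(document).queue(\"" . $mkt . "\", function(next){\n"
           . "    $(\"#" . $cc . $lc . "\").load(\"psurl.php?server=" . $server
           . "&url=" . $hit . "&port=" . $port . "\", next);\n"
           . "});\n";
    }
    // Kick off this market's queue
    echo "$(document).dequeue(\"" . $mkt . "\");\n";
}
?>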

PHP MySQL get_lock

In a script I'm trying to check whether the same script is already running, using MySQL GET_LOCK. The problem is that when the script tries to get a lock which isn't free, it blocks forever, regardless of the timeout parameter I provide.
<?php
class Controller_Refresher extends Controller {
    public function action_run($key) {
        echo date('H:i:s' . "\n", time());
        $this->die_if_running();
        $this->run();
        echo date('H:i:s' . "\n", time());
    }

    private function die_if_running() {
        $result = DB::query(Database::SELECT, "SELECT IS_FREE_LOCK('refresher_running') AS free")->execute();
        if (!intval($result[0]['free'])) die('Running already');
        $result = DB::query(Database::SELECT, "SELECT GET_LOCK('refresher_running', 1)")->execute();
    }

    private function run() {
        echo "Starting\n";
        ob_flush();
        sleep(10);
        DB::query(Database::SELECT, "SELECT RELEASE_LOCK('refresher_running')")->execute();
    }
}
When I run this in two browser tabs, I get e.g.:
-tab 1-
20:48:16
Starting
20:48:26
-tab 2-
20:48:27
Starting
20:48:37
While what I want is for the second tab to die('Running already');.
Watch out - this problem might actually be caused by php locking the session file:
https://stackoverflow.com/a/5487811/539149
So you should call session_write_close() before any code that needs to run concurrently. I discovered this after trying this Mutex class:
http://code.google.com/p/mutex-for-php/
The class worked great but my php scripts were still running one by one!
Also, you don't need IS_FREE_LOCK(). Just call GET_LOCK('refresher_running', 0) and it will either return 1 if it gives you the lock or 0 if the lock is taken. It's more atomic that way. Of course, lock timeouts can still be useful in certain situations, like when you want to queue up tasks, but watch out for the script timing out if you get too many simultaneous requests.
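Applied to the code in the question, die_if_running() then shrinks to a single atomic call; a sketch in the same DB API:
private function die_if_running() {
    // A 0-second timeout makes GET_LOCK() return immediately: 1 if we got
    // the lock, 0 if another request already holds it -- no separate
    // IS_FREE_LOCK() check (and no race between the two calls) is needed
    $result = DB::query(Database::SELECT,
        "SELECT GET_LOCK('refresher_running', 0) AS got")->execute();
    if (!intval($result[0]['got'])) die('Running already');
}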
One option would be to rely on a filesystem lock instead of a database. Since it's the script execution that needs handling, it should not matter. A sample from the manual with a non-blocking exclusive lock:
$fp = fopen('/tmp/lock.txt', 'c+'); // 'c+' creates the file if it doesn't exist yet
/* Activate the LOCK_NB option on a LOCK_EX operation */
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    die('Running already');
}
/* ... */
fclose($fp); // closing the handle releases the lock
Edit
Another option would be to use a status file that gets created at the beginning of each execution and is automatically deleted via register_shutdown_function() upon script completion.
The script would simply check the existence of the status file and if it's already there, execution would stop:
define('statusFile', sys_get_temp_dir() . DIRECTORY_SEPARATOR . 'myjob.running');

//
// If a process is already running then exit
//
if (file_exists(statusFile)) {
    die('Running already');
} else {
    file_put_contents(statusFile, date('Y-m-d H:i:s')); // filename first, then data
}

//
// Other code here
//

function shutdown() {
    unlink(statusFile);
}

//
// Remove the status file on completion
//
register_shutdown_function('shutdown');
