My page executes a script that takes a relatively long time to complete. I would like to make it so that the user can submit information, immediately echo "Complete", and allow the user to exit the page while the script continues executing. How can I do this?
Use cron. On your page, create an email task and save it in the database or filesystem (it does not matter which). Then create a script, run every n minutes, that fetches the pending email tasks and executes them.
Unfortunately, your hosting may not have cron support...
Email is naturally slow. I would advise you to use a job queue for your emails. You should look at:
Gearman
Beanstalkd
ZeroMQ
With these solutions, you can queue slow tasks and continue to show valid information and progress to your user.
Example Client
$client = new GearmanClient();
$client->addServer();
// Queue the slow task in the background so the page can respond immediately.
$client->doBackground("email", json_encode(array("A@yahoo.com", "Hello World", "This is your first Mail")));
echo "Welcome To XYZ";
Server
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction("email", "sendMail");
while ($worker->work());

function sendMail(GearmanJob $job)
{
    // The worker callback receives a GearmanJob; the JSON payload is in workload().
    list($to, $subject, $message) = json_decode($job->workload(), true);
    return mail($to, $subject, $message);
}
My suggestion would be to submit all of the information into a table row (or similar data structure), then run a cron job every few minutes that goes through each row and runs the script based on the information that was submitted.
This is slightly more complicated, I'm afraid, but it frees the user immediately (once the raw information is stored in the DB).
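To make that concrete, here is a minimal sketch of the table-plus-cron pattern. SQLite stands in for whatever database you actually use, and the table and column names are my own invention:

```php
<?php
// Hypothetical job table: the web request INSERTs a row and returns,
// and a cron script run every few minutes claims pending rows.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE email_tasks (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    recipient TEXT NOT NULL,
    body TEXT NOT NULL,
    status TEXT NOT NULL DEFAULT 'pending'
)");

// Web request side: store the task and respond to the user immediately.
$stmt = $db->prepare("INSERT INTO email_tasks (recipient, body) VALUES (?, ?)");
$stmt->execute(['user@example.com', 'Hello']);

// Cron side: claim and process pending tasks, marking each one done.
function process_pending(PDO $db, callable $send): int {
    $rows = $db->query("SELECT id, recipient, body FROM email_tasks WHERE status = 'pending'")
               ->fetchAll(PDO::FETCH_ASSOC);
    $done = 0;
    foreach ($rows as $row) {
        if ($send($row['recipient'], $row['body'])) {
            $db->prepare("UPDATE email_tasks SET status = 'done' WHERE id = ?")
               ->execute([$row['id']]);
            $done++;
        }
    }
    return $done;
}

// A stub stands in for mail() here; the cron entry would be something like
//   */5 * * * * php /path/to/worker.php
$sent = process_pending($db, function ($to, $body) { return true; });
echo $sent;
```

The status column is what frees the user: the request only ever touches the INSERT, and failures in sending never surface in the page load.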
Related
I have a script that runs continuously on the server, in this case a PHP script, like:
php path/to/my/index.php
It's executed, and when it's done, it's executed again, and again, forever.
I'm looking for the best way to be notified if that script stops running.
There are many reasons why it might stop being executed: server memory, a new deployment, human error, etc.
I just want to be notified (email, SMS, Slack...) if the script has not been executed for a certain amount of time (like 1 hour, 1 day, etc.)
My server is an Ubuntu instance on AWS.
An idea:
I was thinking of having a key in Redis/Memcached/etc. with a TTL. Every time the script runs, it refreshes the TTL on that key.
If the script stops working for longer than the TTL, the key expires. I just need a way to trigger a notification when that expiration happens, but it looks like Redis/Memcached aren't built for that.
register_shutdown_function might help, but might not... https://www.php.net/manual/en/function.register-shutdown-function.php
I can't say I've ever seen a script that needs to run indefinitely in PHP. Perhaps there is another way to solve the problem you are after?
Update - Following your Redis idea, I'd look at keyspace notifications: https://redis.io/topics/notifications
I've not tested the idea since I'm not actually a Redis user, but it may be possible to subscribe and capture the expiration event (perhaps from another server?) and generate your notification.
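For illustration, a sketch of what that could look like with the phpredis extension. This is untested, it requires `notify-keyspace-events` to include at least `Ex` in redis.conf, and the key and channel names here are just examples:

```php
<?php
// The expired-key event for DB 0 arrives on channel "__keyevent@0__:expired"
// with the expired key name as the message. Helper to recognize our key:
function is_heartbeat_expiry(string $channel, string $key): bool {
    return $channel === '__keyevent@0__:expired' && $key === 'heartbeat:index';
}

// Sketch of both sides (requires phpredis and "notify-keyspace-events Ex";
// guarded behind an env var since it needs a live Redis server):
if (getenv('RUN_REDIS_DEMO')) {
    // Monitored script: refresh the TTL on every run.
    $r = new Redis();
    $r->connect('127.0.0.1');
    $r->setex('heartbeat:index', 3600, (string) time()); // 1-hour TTL

    // Watcher (could be on another server): block waiting for the expiry event.
    $sub = new Redis();
    $sub->connect('127.0.0.1');
    $sub->subscribe(['__keyevent@0__:expired'], function ($redis, $channel, $key) {
        if (is_heartbeat_expiry($channel, $key)) {
            // Send your email/SMS/Slack alert here.
        }
    });
}
```

One caveat worth knowing: Redis pub/sub is fire-and-forget, so if the watcher itself is down when the key expires, the event is lost. That argues for pairing this with a cron-based check as a backstop.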
There's no 'best' way to do this. Ultimately, what works best will boil down to the specific workflow you're supporting.
tl;dr version: Find what constitutes success and record the most recent time it happened. Use that for your notification trigger in another script.
Long version:
That said, persistent storage with a separate watcher is probably the most straightforward way to do this. Record the last successful run, and then check it with a cron job every so often.
For what it's worth, for scripts like this I generally monitor exit codes or logs produced by the script in question. This isolates the error notification process from the script itself so a flaw in the script (hopefully) doesn't hamper the notification.
For a barebones example, say we have a script to invoke the actual script... (This is very much untested pseudo-code)
<?php
// Run and record.
exec("php path/to/my/index.php", $output, $return_code);
// $return_code will be 255 on fatal errors. You can use other return codes
// with exit in your called script to report other fail states.
if ($return_code == 0) {
    file_put_contents('/path/to/folder/last_success.txt', time());
} else {
    file_put_contents('/path/to/folder/error_report.json', json_encode([
        'return_code' => $return_code,
        'time' => time(),
        'output' => implode("\n", $output),
        // Assuming here that error output isn't silently logged somewhere already.
    ], JSON_PRETTY_PRINT));
}
And then a watcher.php that monitors these files on a cron job.
<?php
// Notify us immediately on failure, maybe?
// If you have a lot of transient failures it may make more sense to
// aggregate them into a single report sent at a specific time instead.
if (is_file('/path/to/folder/error_report.json')) {
    // Mail details stored in JSON here.
    // Rename the file so it's recorded, but we don't receive it again.
    rename('/path/to/folder/error_report.json',
           '/path/to/folder/error_report.json'.'-sent-'.date('Y-m-d-H-i-s'));
} else {
    if (is_file('/path/to/folder/last_success.txt')) {
        $last_success = intval(file_get_contents('/path/to/folder/last_success.txt'));
        if (strtotime('-24 hours') > $last_success) {
            // Our script hasn't run in 24 hours; let someone know.
        }
    } else {
        // No successful run recorded. Might want to put code here if that's unexpected.
    }
}
Notes: There are some caveats to the specific approach shown above. A script can fail in a non-fatal way, and if you're not checking for it, this example could record that as a successful run. For example, permissions errors may cause warnings while the script still runs its full course and exits normally without hitting an exit call with a specific return code. Our example invoker would log that as a successful run, even though it isn't.
Another option is to log success from your script and only check for error exits from the invoker.
I'm going out on a limb here; I'm trying to hand a long-running script off to Artisan. Is it possible for App::call() to return a string value, or maybe even send an email once the long-running script finishes?
I'm still trying to find more info on this, but is it right to assume that if Artisan is running I can redirect the user to something like a waiting page, maybe a looping GIF?
Use Queue::push() with an appropriate driver (database, perhaps) to push the long-running job to a queue.
The last thing the long-running job should do is send some indication that it's finished.
Here's some sample code:
Queue::push(function($job) use ($id)
{
    Artisan::call('my-command', ['arg1', 'arg2']);
    $job->delete();
});

// Then at the end of your my-command script:
$jobModel = LongRunningJob::find($id);
$jobModel->finishedDate = Carbon::now();
$jobModel->save();
Of course you can then poll the database to determine whether the long-running command has finished.
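The polling check could look something like this. I'm using plain PDO and an invented `long_running_jobs` table here instead of the Eloquent model above, so treat all the names as assumptions:

```php
<?php
// The waiting page calls an endpoint every few seconds; the endpoint
// reports whether the job row has picked up a finished date yet.
function job_finished(PDO $db, int $id): bool {
    $stmt = $db->prepare("SELECT finished_date FROM long_running_jobs WHERE id = ?");
    $stmt->execute([$id]);
    $finished = $stmt->fetchColumn();
    // fetchColumn() is false for a missing row, null for an unfinished one.
    return $finished !== false && $finished !== null;
}

// Demo against an in-memory SQLite table standing in for the real one.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE long_running_jobs (id INTEGER PRIMARY KEY, finished_date TEXT)");
$db->exec("INSERT INTO long_running_jobs (id, finished_date) VALUES (1, NULL)");
var_dump(job_finished($db, 1)); // still running
$db->exec("UPDATE long_running_jobs SET finished_date = datetime('now') WHERE id = 1");
var_dump(job_finished($db, 1)); // finished
```

The endpoint would just `json_encode(['done' => job_finished($db, $id)])` and let the browser-side loop decide when to leave the waiting page.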
I searched around but I couldn't find a solution other than set_time_limit(0), which won't work on most shared hosting.
Basically, I have a script that sends messages to my users' friends when they want. Some of my users have 4000+ friends, and the script gets into trouble.
Currently I'm calling this script in the background with AJAX. As I don't need or want the user to wait until it finishes, I would love to have some kind of background processing.
My current code:
global $client, $emails, $subject, $message;

_info("got on_auth_success cb, jid ".$client->full_jid->to_string());
$client->set_status("available!", "dnd", 10);
set_time_limit(60*10);

if (count($emails) < 40) {
    foreach ($emails as $email) {
        $msg = new XMPPMsg(array('to' => '-'.$email.'@chat.facebook.com'), $message);
        $client->send($msg);
        sleep(1);
    }
} else {
    $counter = 0;
    // Let's create batches.
    foreach ($emails as $email) {
        $counter++;
        $msg = new XMPPMsg(array('to' => '-'.$email.'@chat.facebook.com'), $message);
        $client->send($msg);
        sleep(1);
        if ($counter == 50) {
            sleep(10);
            $counter = 0;
        }
    }
}

$client->send_end_stream();
Would it be a good solution to use exec? Like, for example:
exec("doTask.php $arg1 $arg2 $arg3 >/dev/null 2>&1 &");
I need a solution that works on most hosting, as this is a WordPress plugin that can be installed on any host. Thanks!
It would be ideal if you put this into some sort of cron. Build a list of emails to send, store them in a queue, and have a cron script process that queue. WordPress does have its own cron mechanism (see wp_cron), but on low-traffic sites it might be difficult to run it frequently enough to send that number of emails. Granted, you could use cron proper to make sure wp_cron is run. You should see How to execute a large PHP Script? as it's very related.
If you really want it to be synchronous, since you are using background AJAX you could also just make a number of smaller AJAX calls. For example, you could make your AJAX script send 20 emails at a time and then return the number of emails remaining. Your client code in the browser then knows there are still some left and calls the background PHP via AJAX again, perhaps with a counter indicating the current position. [Edit: But this relies on your users being patient enough to keep their browsers open to complete the send, and they may well need a relatively stable internet connection.]
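A sketch of the core of that batching endpoint. The function and variable names are invented, and the actual send step is left as a callback so it could be the XMPP send, mail(), or anything else:

```php
<?php
// Process one batch of at most $batchSize recipients starting at $offset,
// returning the new offset and how many recipients remain.
function process_batch(array $emails, int $offset, int $batchSize, callable $send): array {
    $batch = array_slice($emails, $offset, $batchSize);
    foreach ($batch as $email) {
        $send($email); // e.g. build and send the XMPPMsg here
    }
    $newOffset = $offset + count($batch);
    return [
        'offset'    => $newOffset,
        'remaining' => max(0, count($emails) - $newOffset),
    ];
}

// The AJAX endpoint would json_encode() this result; the browser keeps
// calling back with the returned offset until 'remaining' reaches 0.
$emails = ['a@example.com', 'b@example.com', 'c@example.com'];
$sent = [];
$state = process_batch($emails, 0, 2, function ($e) use (&$sent) { $sent[] = $e; });
// $state is ['offset' => 2, 'remaining' => 1]
```

Keeping the offset on the client (rather than in the session) also means a page reload simply restarts the loop from wherever the browser last got to.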
I have a PHP script that displays a web form to the user. When the user submits the form, I save the data in the database and then I fork a process; the child sends some SMS to the database users affected by the new changes.
When I fork, I check for the child, and it sends the SMS correctly and exits at the end.
But for some reason, the parent waits for the child to finish its tasks. I don't know why this is happening.
Here is a sample of my code:
// Before this point, I've inserted some data in the database, and now I commit the transaction.
$conn->commit();

$pid = pcntl_fork();
if ($pid == 0) { // It's the child, so it will send the messages.
    $conn = new PDO('oci:dbname='.SERVER.';charset='.CHARSET, DATABASE, PASSWORD);
    $suppliersPhoneNumber = getSuppliersPhoneNumber($conn, ...);
    $conn = null;
    $sms = new MessageSender($suppliersPhoneNumber, $_POST['subCategory']);
    $sms->handleMessages(); // Send the SMS.
    //sleep(60);
    exit(0); // The child won't execute more code.
}
The line with the code "sleep(60)" is how I know that the parent is waiting for the child. But how is this possible if the child exits? I know the parent waits for the child because my script freezes for one minute, the sleep time.
My idea is to have the parent insert the required data in the database and then spawn a child to send the messages without waiting for it, so we can send a response page to the user saying everything went fine while the messages are actually being sent.
What is going wrong here?
Thanks in advance
EDIT
The problem was not solved by forking; instead I followed the solution from Paulo H. below. Indeed, it was a better way.
I believe that you are using Apache to run this code; I suggest you run a completely separate process in the background instead.
I think the problem is happening because Apache waits for this child process before sending the response to the browser. In any case, I do not recommend fork() outside a php-cli script.
Usually the child process stays in zombie mode until the parent process calls "wait", but that does not seem to be the case here.
When you fork, the child process ends up in the same process group. Perhaps Apache, or whatever web server you are using, is waiting for all processes in the group to end before detaching and ending the request.
Try calling posix_setsid() in the child after forking, which should make it the leader of a new session.
http://www.php.net/manual/en/function.posix-setsid.php
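A rough sketch of that suggestion (requires the pcntl and posix extensions, and as noted above this really only applies to CLI-style PHP, not under Apache):

```php
<?php
// Fork, then detach the child into its own session with posix_setsid()
// so the parent's process group no longer contains it.
function run_detached(callable $work): int
{
    $pid = pcntl_fork();
    if ($pid === 0) {
        posix_setsid();   // Child becomes leader of a brand-new session.
        $work();          // e.g. send the SMS messages here.
        exit(0);          // Child never returns to the caller's code.
    }
    return $pid;          // Parent continues immediately.
}

if (extension_loaded('pcntl') && extension_loaded('posix')) {
    $pid = run_detached(function () {
        // Long-running work goes here.
    });
    pcntl_waitpid($pid, $status); // Only to keep this demo from leaving a zombie.
}
```

In the real fire-and-forget case you would skip the waitpid() and let init reap the detached child; it is only here so the demo cleans up after itself.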
Instead of
exit();
try using this:
posix_kill(getmypid(), 9);
and please tell us if it works.
Here is the basic setup:
A PHP script writes to a table in a database and then issues NOTIFY job_added. It then begins listening for a response by issuing LISTEN job_complete
A daemon (written in C) has already issued a LISTEN job_added and hence wakes up and processes the table.
The daemon processes the table and writes results into a results table before calling NOTIFY job_complete
The PHP script then wakes up and retrieves the result from the results table.
All but the last step is working. The daemon uses libpq and I have checked the success of the NOTIFY issued by the daemon once it has added the result to the results table.
So I think the problem lies with the PHP script. Here is the pertinent code:
$id = query("INSERT INTO jobs_table (/* details not important */) VALUES (/* */) RETURNING id");
query("NOTIFY job_added");
// Daemon wakes up and does its thing.
query("LISTEN job_complete".$id);

$time = time();
while ((time() - $time) < 30) {
    $notify = pg_get_notify($conn);
    if ($notify) {
        // Never gets here.
        if ($notify['message'] == "job_complete".$id) {
            // Our job has completed.
            break;
        }
    }
    usleep(25000);
}
So we add to the jobs table, issue a LISTEN, and loop for 30 seconds until we get the notification that our job is done.
The problem is that pg_get_notify() never picks up the NOTIFY issued by the daemon. Note that the NOTIFY issued by the daemon happens after the LISTEN by the PHP script; I checked.
Is there anything I am doing that is completely wrong? By the way, I am well aware query() isn't a built-in function; it was added for brevity.
Thanks
I would be willing to bet that the problem is that you are not committing the transaction. Notifies are raised on commit.
Try:
query('COMMIT');
See if that raises the notification for you.