Run PHP function/script in background? [duplicate] - php

Problem
I have a form that, when submitted, will run basic code to process the information submitted and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS message. This list is trivial at the moment (only pushing about 150), however it's enough that it takes upwards of a minute to cycle through the entire table of subscribers and send out 150+ emails. (The emails are being sent individually, as requested by the system administrators of our email server because of mass email policies.)
During this time, the individual who posted the alert will sit on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.
First, the poster might think the server is lagging and click the 'Submit' button again, causing the script to start over or run twice. I could solve this by using JavaScript to disable the button and replace the text to say something like 'Processing...', however this is less than ideal because the user will still be stuck on the page for the length of the script execution. (Also, if JavaScript is disabled, this problem still exists.)
Second, the poster might close the tab or the browser prematurely after submitting the form. The script will keep running on the server until it tries to write back to the browser; however, if the user then browses to any page within our domain (while the script is still running), the browser hangs loading the page until the script has ended. (This only happens when a tab or window of the browser is closed and not the entire browser application.) Still, this is less than ideal.
(Possible) Solution
I've decided I want to break out the "email" part of the script into a separate file I can call after the notification has been posted. I originally thought of putting this on the confirmation page after the notification has been successfully posted. However, the user will not know this script is running and any anomalies will not be apparent to them; this script cannot fail.
But, what if I can run this script as a background process? So, my question is this: How can I execute a PHP script to trigger as a background service and run completely independent of what the user has done at the form level?
EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running any more frequently than 5 minutes.

Doing some experimentation with exec and shell_exec I have uncovered a solution that worked perfectly! I chose to use shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns the output as a string, and this was easier than using exec, assigning the output to a variable and then opening a file to write to.)
I'm using the following line to invoke the email script:
shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");
It is important to notice the & at the end of the command (as pointed out by @netcoder). It tells the shell to run the process in the background.
The extra values surrounded in single quotes after the path to the script are passed as $_SERVER['argv'] variables that I can access within my script.
The email script then outputs to my log file using the >> and will output something like this:
[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)
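For completeness, here is a rough sketch of what the send_notifications.php side could look like; this is not the poster's actual script, and the subscriber loop is only indicated as a comment. It reads the values passed on the command line via $argv and prints a log line that the >> redirection appends to paging.log:
<?php
// send_notifications.php -- minimal sketch, not the poster's actual code
$start   = microtime(true);
$post_id = isset($argv[1]) ? $argv[1] : null;    // first quoted value after the script path
$type    = isset($argv[2]) ? $argv[2] : 'alert'; // second quoted value

// ... look up the subscribers for $post_id and send each email/SMS here ...
$ok = true; // set to false if any send fails

$elapsed = number_format(microtime(true) - $start, 2);
$status  = $ok ? 'Alert Notifications Sent' : 'CRITICAL ERROR: Alert Notifications NOT sent';
// stdout is what the ">> paging.log" redirection captures
echo '[' . date('Y-m-d H:i:s') . "] $status (post $post_id, type $type, SCRIPT: $elapsed seconds)\n";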

On Linux/Unix servers, you can execute a job in the background by using proc_open:
$descriptorspec = array(
array('pipe', 'r'), // stdin
array('file', 'myfile.txt', 'a'), // stdout
array('pipe', 'w'), // stderr
);
$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);
The & being the important bit here. The script will continue even if the original script has ended.

Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script run in the background.
Example:
<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script
// Do stuff with received data
Note: Due to a wontfix quirk, calling flush() after fastcgi_finish_request will cause it to exit without warning/error.
You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection :)
$connected = true;
// Stuff...
fastcgi_finish_request();
$connected = false;
// ...
if ($connected) {
flush();
}
Or
ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if you really shouldn't be calling flush() if you know you've disconnected from the user)
flush();

PHP exec("php script.php") can do it.
From the Manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
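A minimal sketch of that pattern (the paths and log location are placeholders, not from the answer):
<?php
// Redirect stdout/stderr to a log and background the process so exec() returns immediately.
exec('/usr/bin/php /path/to/email_script.php >> /path/to/email.log 2>&1 &');
// Execution continues here right away; the email script keeps running on its own.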

And why not make an HTTP request to the script and ignore the response?
http://php.net/manual/en/function.httprequest-send.php
If you make a request to the script you need to call, your web server will run it in the background, and you can (in your main script) show a message telling the user that the script is running.
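For example, a sketch of a fire-and-forget cURL call with a very short timeout, so the calling page does not wait for the response. The URL is a placeholder, and the called script should start with ignore_user_abort(true) so it survives the dropped connection:
<?php
$ch = curl_init('http://example.com/send_notifications.php?post_id=123');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);      // needed for sub-second timeouts on some systems
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);  // give up on the response almost immediately
curl_exec($ch);                             // "fails" with a timeout -- that's expected here
curl_close($ch);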

The simplest way to run a PHP script in the background is
php script.php >/dev/null &
The script will run in the background and the user will reach the action page faster.

How about this?
Your PHP script that holds the form saves a flag or some value into a database or file.
A second PHP script polls for this value periodically and if it's been set, it triggers the Email script in a synchronous manner.
This second PHP script should be set to run as a cron.

As far as I know, you cannot do this in an easy way (see fork, exec, etc., which don't work under Windows). Maybe you can reverse the approach and post the form in the background from the browser via AJAX, so the user has no wait time while the post is still being processed.
This can help even if you have to do some long processing.
About sending mail, it's always suggested to use a spooler: maybe a local, fast SMTP server that accepts your requests and then spools them to the real MTA, or put everything in a DB and use a cron job that spools the queue.
The cron may be on another machine calling the spooler as an external URL:
* * * * * wget -O /dev/null http://www.example.com/spooler.php

Background cron job sounds like a good idea for this.
You'll need ssh access to the machine to run the script as a cron.
Run it with: $ php scriptname.php
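For example, a crontab entry like the following (paths and log location are placeholders) runs the script every five minutes, the tightest interval the poster's administrators allow:
*/5 * * * * /usr/bin/php /path/to/scriptname.php >> /var/log/notifications.log 2>&1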

If you can access the server over ssh and can run your own scripts you can make a simple fifo server using php (although you will have to recompile php with posix support for fork).
The server can be written in anything really, you probably can easily do it in python.
Or the simplest solution would be sending an HttpRequest and not reading the return data, but the server might destroy the script before it finishes processing.
Example server :
<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);
if (file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' exists, please delete it and try again.' . "\n");
}
if (!file_exists(FIFO_PATH) && !posix_mkfifo(FIFO_PATH, 0666)) {
    die('Couldn\'t create the listening fifo.' . "\n");
}
$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for ($i = 0; $i < FORK_COUNT; ++$i) {
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        echo "process(" . posix_getpid() . ", id=$i)\n";
        while (true) {
            $line = chop(fgets($fp));
            if ($line == 'quit' || $line === false) break;
            echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
            // $data = json_decode($line);
            // processData($data);
        }
        exit();
    }
}
fclose($fp);
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>
Example client :
<?php
define('FIFO_PATH', '/home/user/input.queue');
if (!file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' doesn\'t exist, please make sure the fifo server is running.' . "\n");
}
function postToQueue($data) {
    $fp = fopen(FIFO_PATH, 'w+');
    stream_set_blocking($fp, false); // don't block
    $data = json_encode($data) . "\n";
    if (fwrite($fp, $data) != strlen($data)) {
        echo "Couldn't write the whole message; the server might be dead or there's a bug somewhere\n";
    }
    fclose($fp);
}
$i = 1000;
while (--$i) {
    postToQueue(array('xx' => 21, 'yy' => array(1, 2, 3)));
}
?>

If you're on Windows, research proc_open or popen...
But if you're on a Linux server running cPanel, then this is the right approach:
#!/usr/bin/php
<?php
$pid = shell_exec("nohup nice php -f
'path/to/your/script.php' /dev/null 2>&1 & echo $!");
While(exec("ps $pid"))
{ //you can also have a streamer here like fprintf,
// or fgets
}
?>
Don't use fork() or curl if you doubt you can handle them; it's just like abusing your server.
Lastly, in the script.php file which is called above, take note of this and make sure you wrote:
<?php
ignore_user_abort(TRUE);
set_time_limit(0);
ob_start();
// <-- really optional but this is pure php
//Code to be tested on background
ob_flush(); flush();
//this two do the output process if you need some.
//then to make all the logic possible
str_repeat(" ",1500);
//.for progress bars or loading images
sleep(2); //standard limit
?>

For a background worker, I think you should try this technique. It lets you call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (asynchronously).
form_action_page.php
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// Call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (asynchronously).
// Your form DB insertion or other code goes here; do whatever you want. The code above works as a background job, and this line will be hit before the lines above have finished responding.
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*/
function post_async($url,$params)
{
$post_string = $params;
$parts = parse_url($url);
$fp = fsockopen($parts['host'],
isset($parts['port'])?$parts['port']:80,
$errno, $errstr, 30);
$out = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n";//you can use POST instead of GET if you like
$out .= "Host: ".$parts['host']."\r\n";
$out .= "Content-Type: application/x-www-form-urlencoded\r\n";
$out .= "Content-Length: ".strlen($post_string)."\r\n";
$out .= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"];//case1 Output > testValue
// here do your background operations it will not halt main page
?>
P.S.: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712

Assuming you are running on a *nix platform, use cron and the php executable.
EDIT:
There are quite a number of questions asking for "running php without cron" on SO already. Here's one:
Schedule scripts without using CRON
That said, the exec() answer above sounds very promising :)

In my case I have 3 params, one of them is string (mensaje):
exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");
In my Process.php I have this code:
if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3]))
{
die("Error.");
}
$idCurso = $argv[1];
$idDestino = $argv[2];
$mensaje = $argv[3];

Use Amphp to execute jobs in parallel & asynchronously.
Install the library
composer require amphp/parallel-functions
Code sample
<?php
require "vendor/autoload.php";
use Amp\Promise;
use Amp\ParallelFunctions;
echo 'started</br>';
$promises[1] = ParallelFunctions\parallel(function (){
// Send Email
})();
$promises[2] = ParallelFunctions\parallel(function (){
// Send SMS
})();
Promise\wait(Promise\all($promises));
echo 'finished';
For your use case, you can do something like below
<?php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;
$responses = wait(parallelMap([
'a@example.com',
'b@example.com',
'c@example.com',
], function ($to) {
return send_mail($to);
}));

This works for me. Try this:
exec("php asyn.php > /dev/null 2>/dev/null &");

Related

PHP cURL; Wait for API status change before continuing [duplicate]


Executing a daemon in php [duplicate]


PHP exec - echo output line by line during progress

I'm trying to find a way in which I can echo out the output of an exec call, and then flush that to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file if it is not the appropriate file type using FFMPEG. I am doing this on a windows machine. Currently my command looks like so:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly; it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the time out
set_time_limit( 10 * 60 ); // 10 minute time out
and would be very grateful if someone could put me in the right direction. I've looked at a number of solutions which come close on Stack Overflow, but none seem to have worked.
Since the exec call is a blocking call you have no way of using buffers to get status.
Instead you could redirect the output in the system call to a log file. Let the client query the server for progress update in which case the server could parse the last lines of the log file to get information about current progress and send it back to the client.
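A minimal sketch of that idea, assuming the conversion was started with its output redirected to a per-job log file (the file layout and parameter name are assumptions): the browser polls this endpoint and shows the last few lines.
<?php
// progress.php -- return the tail of the ffmpeg log for a given job (sketch)
$job = preg_replace('/[^a-zA-Z0-9_-]/', '', $_GET['job']); // sanitise the job id
$log = __DIR__ . "/logs/{$job}.log";
header('Content-Type: text/plain');
if (!is_file($log)) {
    echo "no such job\n";
    exit;
}
$lines = file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
echo implode("\n", array_slice($lines, -10)); // last 10 lines of ffmpeg output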
exec() is a blocking call, and will NOT return control to PHP until the external program has terminated. That means you cannot do anything to dump the output on a line-by-line basis, because PHP is suspended while the external app is running.
For what you want, you need to use proc_open, which gives you pipe handles you can read from in a loop. e.g.
$descriptorspec = array(
    1 => array('pipe', 'w'), // child's stdout
    2 => array('pipe', 'w'), // child's stderr
);
$proc = proc_open('.....', $descriptorspec, $pipes);
while ($line = fgets($pipes[1])) {
    print($line);
    flush();
}
proc_close($proc);
There are two problems with this approach:
The first is that, as @Marc B notes, exec will block until it's finished. You'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server & client open and dribbling the data out a little at a time. This is not something that the HTTP protocol was designed for and while it might work sometimes, it's not going to work consistently - different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using Javascript's setTimeout() function (or setInterval()), make a call to the server periodically and have the server send back a progress report.

Running an external php code asynchronously

I am building a WebService, using PHP:
Basically,
User sends a request to the server, via HTTP Request. 'request.php', ie.
Server starts php code asynchronously. 'update.php', ie.
The connection with the user is finished.
The code 'update.php' is still running, and will finish after some time.
The code 'update.php' is finished.
The problem is with PHP running some external code asynchronously.
Is that possible? Is there another way to do it? With shell_exec?
Please, I need insights! An elegant way is preferable.
Thank you!
The best approach is using a message queue like RabbitMQ, or even a simple MySQL table.
Each time you add a new task in the front controller, it goes into the queue. Then update.php, run by a cron job, fetches it from the queue, processes it, saves the results and marks the task as finished.
This will also help you distribute load over time, preventing a DoS caused by your own script.
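A minimal sketch of the MySQL-table variant, assuming a task_queue table with id, payload, status and created_at columns; request.php inserts a row and returns, and update.php (run from cron) claims and processes pending rows:
<?php
// request.php -- enqueue the work and return to the user immediately (sketch)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO task_queue (payload, status, created_at) VALUES (?, 'pending', NOW())");
$stmt->execute(array(json_encode(array('user_id' => 42, 'action' => 'update'))));
echo 'Request accepted';

// update.php -- run by cron; claim pending tasks and process them (sketch)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id, payload FROM task_queue WHERE status = 'pending'") as $task) {
    $pdo->prepare("UPDATE task_queue SET status = 'running' WHERE id = ?")->execute(array($task['id']));
    $data = json_decode($task['payload'], true);
    // ... do the long-running work here ...
    $pdo->prepare("UPDATE task_queue SET status = 'done' WHERE id = ?")->execute(array($task['id']));
}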
You could have the user connect to update.php, generate some sort of unique ID to keep track of the process, and then call fsockopen() on itself with a special GET variable to signify that it's doing the heavy lifting rather than user interaction. Close that connection immediately, and then print out the appropriate response to the user.
Meanwhile, look for the special GET variable you specified, and when present call ignore_user_abort() and proceed with whatever operations you need in that branch of the if clause. So here's a rough skeleton of what your update.php file would look like:
<?php
if ( isset($_GET['asynch']) ) {
    ignore_user_abort();
    // check for $_GET['id'] and validate,
    // then execute long-running code here
} else {
    // generate $id here
    $host = $_SERVER['SERVER_NAME'];
    $url = "/update.php?asynch&id={$id}";
    if ( $handle = fsockopen($host, 80, $n, $s, 5) ) {
        $data = "GET {$url} HTTP/1.0\r\nHost: {$host}\r\n\r\n";
        fwrite($handle, $data);
        fclose($handle);
    }
    // return a response to the user
    echo 'Response goes here';
}
?>
You could build a service with PHP.
Or launch a PHP script using bash: system("php myScript.php param param2 &")
Look into worker processes with Redis (resque) or Gearman.

Run PHP Task Asynchronously

I work on a somewhat large web application, and the backend is mostly in PHP. There are several places in the code where I need to complete some task, but I don't want to make the user wait for the result. For example, when creating a new account, I need to send them a welcome email. But when they hit the 'Finish Registration' button, I don't want to make them wait until the email is actually sent, I just want to start the process, and return a message to the user right away.
Up until now, in some places I've been using what feels like a hack with exec(). Basically doing things like:
exec("doTask.php $arg1 $arg2 $arg3 >/dev/null 2>&1 &");
Which appears to work, but I'm wondering if there's a better way. I'm considering writing a system which queues up tasks in a MySQL table, and a separate long-running PHP script that queries that table once a second, and executes any new tasks it finds. This would also have the advantage of letting me split the tasks among several worker machines in the future if I needed to.
Am I re-inventing the wheel? Is there a better solution than the exec() hack or the MySQL queue?
I've used the queuing approach, and it works well as you can defer that processing until your server load is idle, letting you manage your load quite effectively if you can partition off "tasks which aren't urgent" easily.
Rolling your own isn't too tricky, here's a few other options to check out:
GearMan - this answer was written in 2009, and since then GearMan looks like a popular option, see comments below.
ActiveMQ if you want a full blown open source message queue.
ZeroMQ - this is a pretty cool socket library which makes it easy to write distributed code without having to worry too much about the socket programming itself. You could use it for message queuing on a single host - you would simply have your webapp push something to a queue that a continuously running console app would consume at the next suitable opportunity
beanstalkd - only found this one while writing this answer, but looks interesting
dropr is a PHP based message queue project, but hasn't been actively maintained since Sep 2010
php-enqueue is a recently (2017) maintained wrapper around a variety of queue systems
Finally, a blog post about using memcached for message queuing
Another, perhaps simpler, approach is to use ignore_user_abort - once you've sent the page to the user, you can do your final processing without fear of premature termination, though this does have the effect of appearing to prolong the page load from the user perspective.
When you just want to execute one or several HTTP requests without having to wait for the response, there is a simple PHP solution, as well.
In the calling script:
$socketcon = fsockopen($host, 80, $errno, $errstr, 10);
if($socketcon) {
$socketdata = "GET $remote_house/script.php?parameters=... HTTP 1.1\r\nHost: $host\r\nConnection: Close\r\n\r\n";
fwrite($socketcon, $socketdata);
fclose($socketcon);
}
// repeat this with different parameters as often as you like
On the called script.php, you can invoke these PHP functions in the first lines:
ignore_user_abort(true);
set_time_limit(0);
This causes the script to continue running without time limit when the HTTP connection is closed.
Another way to fork processes is via curl. You can set up your internal tasks as a webservice. For example:
http://domain/tasks/t1
http://domain/tasks/t2
Then in your user accessed scripts make calls to the service:
$service->addTask('t1', $data); // post data to URL via curl
Your service can keep track of the queue of tasks with MySQL or whatever you like; the point is that it's all wrapped up within the service and your script is just consuming URLs. This frees you up to move the service to another machine/server if necessary (i.e. easily scalable).
Adding http authorization or a custom authorization scheme (like Amazon's web services) lets you open up your tasks to be consumed by other people/services (if you want) and you could take it further and add a monitoring service on top to keep track of queue and task status.
http://domain/queue?task=t1
http://domain/queue?task=t2
http://domain/queue/t1/100931
It does take a bit of set-up work but there are a lot of benefits.
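A rough sketch of what such an addTask() call could look like with curl; the task URLs and field names are assumptions, not part of the answer:
<?php
// Post a task to the internal task webservice; the service only queues it, so this returns quickly (sketch).
function addTask($task, array $data) {
    $ch = curl_init("http://domain/tasks/{$task}");
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response; // e.g. a queue/task id such as 100931
}
addTask('t1', array('user_id' => 42));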
If it is just a question of running expensive tasks, and php-fpm is supported, why not use the fastcgi_finish_request() function?
This function flushes all response data to the client and finishes the request. This allows for time consuming tasks to be performed without leaving the connection to the client open.
You don't really use asynchronicity in this way:
Make all your main code first.
Execute fastcgi_finish_request().
Do all the heavy stuff.
Once again, php-fpm is needed.
I've used Beanstalkd for one project, and planned to again. I've found it to be an excellent way to run asynchronous processes.
A couple of things I've done with it are:
Image resizing - and with a lightly loaded queue passing off to a CLI-based PHP script, resizing large (2mb+) images worked just fine, but trying to resize the same images within a mod_php instance was regularly running into memory-space issues (I limited the PHP process to 32MB, and the resizing took more than that)
near-future checks - beanstalkd has delays available to it (make this job available to run only after X seconds) - so I can fire off 5 or 10 checks for an event, a little later in time
I wrote a Zend-Framework based system to decode a 'nice' url, so for example, to resize an image it would call QueueTask('/image/resize/filename/example.jpg'). The URL was first decoded to an array(module,controller,action,parameters), and then converted to JSON for injection to the queue itself.
A long running cli script then picked up the job from the queue, ran it (via Zend_Router_Simple), and if required, put information into memcached for the website PHP to pick up as required when it was done.
One wrinkle I did also put in was that the cli-script only ran for 50 loops before restarting, but if it did want to restart as planned, it would do so immediately (being run via a bash-script). If there was a problem and I did exit(0) (the default value for exit; or die();) it would first pause for a couple of seconds.
Here is a simple class I coded for my web application. It allows for forking PHP scripts and other scripts. Works on UNIX and Windows.
class BackgroundProcess {
    static function open($exec, $cwd = null) {
        if (!is_string($cwd)) {
            $cwd = @getcwd();
        }
        @chdir($cwd);
        if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
            $WshShell = new COM("WScript.Shell");
            $WshShell->CurrentDirectory = str_replace('/', '\\', $cwd);
            $WshShell->Run($exec, 0, false);
        } else {
            exec($exec . " > /dev/null 2>&1 &");
        }
    }

    static function fork($phpScript, $phpExec = null) {
        $cwd = dirname($phpScript);
        @putenv("PHP_FORCECLI=true");
        if (!is_string($phpExec) || !file_exists($phpExec)) {
            if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                $phpExec = str_replace('/', '\\', dirname(ini_get('extension_dir'))) . '\php.exe';
                if (@file_exists($phpExec)) {
                    BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                }
            } else {
                $phpExec = exec("which php-cli");
                if ($phpExec[0] != '/') {
                    $phpExec = exec("which php");
                }
                if ($phpExec[0] == '/') {
                    BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                }
            }
        } else {
            if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                $phpExec = str_replace('/', '\\', $phpExec);
            }
            BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
        }
    }
}
PHP HAS multithreading; it's just not enabled by default. There is an extension called pthreads which does exactly that.
You'll need PHP compiled with ZTS (Zend Thread Safety) though.
Links:
Examples
Another tutorial
pthreads PECL Extension
UPDATE: since PHP 7.2, the parallel extension comes into play
Tutorial/Example
reference manual
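For illustration, a bare-bones pthreads sketch (requires the PECL pthreads extension on a ZTS build of PHP; the MailJob class and its body are made up):
<?php
class MailJob extends Thread {
    private $recipient;
    public function __construct($recipient) {
        $this->recipient = $recipient;
    }
    public function run() {
        // runs in its own thread
        mail($this->recipient, 'Welcome', 'Thanks for registering!');
    }
}
$job = new MailJob('user@example.com');
$job->start(); // kick off the thread
// ... the main script keeps going ...
$job->join();  // optionally wait for it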
This is the same method I have been using for a couple of years now and I haven't seen or found anything better. As people have said, PHP is single threaded, so there isn't much else you can do.
I have actually added one extra level to this and that's getting and storing the process id. This allows me to redirect to another page and have the user sit on that page, using AJAX to check if the process is complete (process id no longer exists). This is useful for cases where the length of the script would cause the browser to timeout, but the user needs to wait for that script to complete before the next step. (In my case it was processing large ZIP files with CSV like files that add up to 30 000 records to the database after which the user needs to confirm some information.)
I have also used a similar process for report generation. I'm not sure I'd use "background processing" for something such as an email, unless there is a real problem with a slow SMTP server. Instead I might use a table as a queue and then have a process that runs every minute to send the emails within the queue. You would need to be wary of sending emails twice or other similar problems. I would consider a similar queueing process for other tasks as well.
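A rough sketch of that PID-tracking pattern, assuming a Linux host and placeholder paths: the PID is captured with echo $! when the job is launched, and the AJAX-polled endpoint just checks whether /proc/<pid> still exists.
<?php
// launch.php -- start the job and remember its PID (sketch)
$pid = (int) shell_exec("nohup php /path/to/longtask.php > /dev/null 2>&1 & echo $!");
header("Location: /progress.php?pid={$pid}");

// progress.php -- polled via AJAX; reports whether the process is still alive (sketch)
$pid = (int) $_GET['pid'];
$running = $pid > 0 && file_exists("/proc/{$pid}");
header('Content-Type: application/json');
echo json_encode(array('running' => $running));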
It's a great idea to use cURL as suggested by rojoca.
Here is an example. You can monitor text.txt while the script is running in background:
<?php
function doCurl($begin)
{
echo "Do curl<br />\n";
$url = 'http://'.$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'];
$url = preg_replace('/\?.*/', '', $url);
$url .= '?begin='.$begin;
echo 'URL: '.$url.'<br>';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
echo 'Result: '.$result.'<br>';
curl_close($ch);
}
if (empty($_GET['begin'])) {
doCurl(1);
}
else {
while (ob_get_level())
ob_end_clean();
header('Connection: close');
ignore_user_abort();
ob_start();
echo 'Connection Closed';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
$begin = $_GET['begin'];
$fp = fopen("text.txt", "w");
fprintf($fp, "begin: %d\n", $begin);
for ($i = 0; $i < 15; $i++) {
sleep(1);
fprintf($fp, "i: %d\n", $i);
}
fclose($fp);
if ($begin < 10)
doCurl($begin + 1);
}
?>
There is a PHP extension, called Swoole.
Although it might not be enabled by default, it can be enabled on my hosting at the click of a button.
Worth checking it out. I haven't had time to use it yet, as I was searching here for info, when I stumbled across it and thought it worth sharing.
Unfortunately PHP does not have any kind of native threading capabilities. So I think in this case you have no choice but to use some kind of custom code to do what you want to do.
If you search around the net for PHP threading stuff, some people have come up with ways to simulate threads on PHP.
If you set the Content-Length HTTP header in your "Thank You For Registering" response, then the browser should close the connection after the specified number of bytes are received. This leaves the server side process running (assuming that ignore_user_abort is set) so it can finish working without making the end user wait.
Of course you will need to calculate the size of your response content before rendering the headers, but that's pretty easy for short responses (write output to a string, call strlen(), call header(), render string).
This approach has the advantage of not forcing you to manage a "front end" queue, and although you may need to do some work on the back end to prevent racing HTTP child processes from stepping on each other, that's something you needed to do already, anyway.
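A minimal sketch of that approach (this works under plain Apache/mod_php; under FastCGI you would reach for fastcgi_finish_request() instead, as mentioned in another answer):
<?php
ignore_user_abort(true);            // keep running after the client goes away
ob_start();
echo 'Thank You For Registering';
$body = ob_get_clean();

header('Connection: close');
header('Content-Length: ' . strlen($body));
echo $body;
flush();                            // the browser has all the bytes it was promised and can close

// ... now send the welcome email, write logs, etc., without the user waiting ...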
If you don't want the full blown ActiveMQ, I recommend to consider RabbitMQ. RabbitMQ is lightweight messaging that uses the AMQP standard.
I recommend to also look into php-amqplib - a popular AMQP client library to access AMQP based message brokers.
Spawning new processes on the server using exec(), or on another server directly using curl, doesn't scale all that well. If you go for exec, you are basically filling your web server with long-running processes which could be handled by other, non-web-facing servers, and using curl ties up another server unless you build in some sort of load balancing.
I have used Gearman in a few situations and I find it better for this sort of use case. I can use a single job queue server to handle queuing of all the jobs needing to be done and spin up worker servers, each of which can run as many instances of the worker process as needed, and scale up the number of worker servers as needed and spin them down when not needed. It also lets me shut down the worker processes entirely when needed, and queues the jobs up until the workers come back online.
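For reference, a bare-bones Gearman sketch using the PECL gearman extension; the function name send_welcome_email and the payload are made up:
<?php
// client.php -- queue a background job and return immediately (sketch)
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('send_welcome_email', json_encode(array('to' => 'user@example.com')));

// worker.php -- long-running process on any worker machine (sketch)
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_welcome_email', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    mail($data['to'], 'Welcome!', 'Thanks for signing up.');
});
while ($worker->work());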
I think you should try this technique. It lets you call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (asynchronously).
cornjobpage.php //mainpage
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
//post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
//post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
//call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (asynchronously).
?>
<?php
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*
*/
function post_async($url,$params)
{
$post_string = $params;
$parts=parse_url($url);
$fp = fsockopen($parts['host'],
isset($parts['port'])?$parts['port']:80,
$errno, $errstr, 30);
$out = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n";//you can use POST instead of GET if you like
$out.= "Host: ".$parts['host']."\r\n";
$out.= "Content-Type: application/x-www-form-urlencoded\r\n";
$out.= "Content-Length: ".strlen($post_string)."\r\n";
$out.= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"];//case1 Output > testValue
?>
P.S.: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712
PHP is a single-threaded language, so there is no official way to start an asynchronous process with it other than using exec or popen. There is a blog post about that here. Your idea for a queue in MySQL is a good idea as well.
Your specific requirement here is for sending an email to the user. I'm curious as to why you are trying to do that asynchronously since sending an email is a pretty trivial and quick task to perform. I suppose if you are sending tons of email and your ISP is blocking you on suspicion of spamming, that might be one reason to queue, but other than that I can't think of any reason to do it this way.
