Passing a variable through to another PHP page - php

I'm making a server manager. I want to add a "kill" command, which would call a php script that would essentially run a shell_exec('kill $arrtext'); and kill the process, thus closing the server down. Here is the part of my script that returns the results and checks to see which servers are running:
<?php
$COMMAND = shell_exec('ps -o command ax | grep skulltag | grep -v grep');
$old = array("skulltag-server", "-port", "-iwad", "-wad", "+exec");
$new = array("SKULLTAG", "Running on Port", "Using IWAD", "+ PWAD", "with Config");
$COMMAND = str_replace($old, $new, $COMMAND);
$arr = explode("./",$COMMAND);
$text = shell_exec('pgrep -u doom');
$arrtext = preg_split('/\s+/', $text);
for ($i = 1; $i < count($arr); $i++) {
    echo '<div class="serverborder">';
    echo '<div class="servertextalign">';
    echo $i, '. PROCESS ID <span style="color: #f00;">', $arrtext[$i], '</span> with server parameters: <span style="color: #777;">', $arr[$i], '</span>';
    echo '</div>';
    echo '</div>';
    echo '<br>';
}
?>
However, I have no idea how I would add a link or something that would set the proper $arrtext[] variable (depending on which one they picked) and pass it to the PHP script that would kill the server.
The live demo can be found at http://server.oblivionro.net/servers/
Thanks!

Could you try using a shell_exec in another, tiny script to run that kill command from the command line? Don't use a GET variable. I would rather create a small form for each server in the list and pass the PID through POST, i.e. have a tiny script that takes hidden POST variables, set the form action to the same page, and pass the value through as a parameter:
// In the form
echo '<input type="hidden" name="pid" value="' . $arrtext[$i] . '"/>';
// In the script that handles the POST
$pid = $_POST['pid'];
shell_exec('php -f /location/of/kill_script.php -- ' . $pid);
Where pid is your process ID. Obviously, you should set up your kill script to check that the PID is valid. The benefit of this is that the script's true location can stay hidden and doesn't even need to be in the www root. You shouldn't need to link to the real script directly.
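A minimal sketch of how those pieces could fit together on one page (the kill_script.php path is a placeholder, and the validation shown is only the bare minimum):

// In the server listing loop: one small form per server.
for ($i = 1; $i < count($arr); $i++) {
    echo '<form method="post" action="">';
    echo '<input type="hidden" name="pid" value="' . htmlspecialchars($arrtext[$i]) . '"/>';
    echo '<input type="submit" value="Kill server ' . $i . '"/>';
    echo '</form>';
}

// At the top of the same page, before any output: handle the POST.
if (isset($_POST['pid']) && ctype_digit($_POST['pid'])) {
    shell_exec('php -f /location/of/kill_script.php -- ' . escapeshellarg($_POST['pid']));
}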

The dirty, dirty (not recommended) way to do it is to have the link go straight to the script with the PID in the query string, like:
<a href="killScript.php?p=<?php echo $arrtext[$i]; ?>">KILL</a>
Then in the killScript, you RIGOROUSLY verify that the process they are killing is something they're supposed to be killing.
A better way would be to avoid using as powerful a command as "kill", so that someone doesn't go to killScript.php?p=1230 where 1230 is the process number of your Minecraft game or something...
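For completeness, a sketch of the kind of check killScript.php would need; the 'doom' user and the skulltag pattern are taken from the question, not from this answer:

<?php
// killScript.php - only kill PIDs that pgrep reports as skulltag processes
// owned by the 'doom' user.
$pid = isset($_GET['p']) ? $_GET['p'] : '';
if (!ctype_digit($pid)) {
    die('Invalid PID.');
}
$allowed = preg_split('/\s+/', trim(shell_exec('pgrep -u doom skulltag')));
if (!in_array($pid, $allowed, true)) {
    die('Refusing to kill a process that is not a skulltag server.');
}
shell_exec('kill ' . escapeshellarg($pid));
echo 'Killed process ' . htmlspecialchars($pid) . '.';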

I am confused why you feel the need to ask, since you seem to have a grasp of PHP scripting and already create spans etc. with the proper data. It is trivial to construct an anchor tag. The anchor tag might reference a GET variable, which could be the PID. After proper validation, such as ensuring the PID references a doom server process (and proper login credentials), the PID can then be used to shell out a kill command.
Note that you are potentially opening your server up to allow the world to shut down processes on your server.

Related

Run PHP function/script in background? [duplicate]

Problem
I have a form that, when submitted, will run basic code to process the information submitted and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS message. This list is trivial at the moment (only pushing to about 150 people), but it's enough that it takes upwards of a minute to cycle through the entire table of subscribers and send out 150+ emails. (The emails are being sent individually, as requested by the system administrators of our email server because of mass email policies.)
During this time, the individual who posted the alert will sit on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.
First, the poster might think the server is lagging and click the 'Submit' button again, causing the script to start over or run twice. I could solve this by using JavaScript to disable the button and replace the text to say something like 'Processing...', however this is less than ideal because the user will still be stuck on the page for the length of the script execution. (Also, if JavaScript is disabled, this problem still exists.)
Second, the poster might close the tab or the browser prematurely after submitting the form. The script will keep running on the server until it tries to write back to the browser; however, if the user then browses to any page within our domain (while the script is still running), the browser hangs loading the page until the script has ended. (This only happens when a tab or window of the browser is closed and not the entire browser application.) Still, this is less than ideal.
(Possible) Solution
I've decided I want to break out the "email" part of the script into a separate file I can call after the notification has been posted. I originally thought of putting this on the confirmation page after the notification has been successfully posted. However, the user will not know this script is running and any anomalies will not be apparent to them; this script cannot fail.
But, what if I can run this script as a background process? So, my question is this: How can I execute a PHP script to trigger as a background service and run completely independent of what the user has done at the form level?
EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running any more frequently than 5 minutes.
Doing some experimentation with exec and shell_exec, I have uncovered a solution that works perfectly! I chose to use shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns the output as a string, and this was easier than using exec, assigning the output to a variable and then opening a file to write to.)
I'm using the following line to invoke the email script:
shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");
It is important to notice the & at the end of the command (as pointed out by @netcoder). This UNIX operator runs a process in the background.
The extra variables surrounded in single quotes after the path to the script are set as $_SERVER['argv'] variables that I can call within my script.
The email script then outputs to my log file using the >> and will output something like this:
[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)
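For reference, a sketch of what the receiving side might look like; the argument order matches the shell_exec() call above, but the rest is an assumption about how send_notifications.php could be structured:

<?php
// send_notifications.php - invoked from the command line, so $argv is available.
$post_id = $argv[1]; // first quoted value after the script path
$type    = $argv[2]; // 'alert'
$start   = microtime(true);

// ... loop over the subscriber table and send the individual emails here ...

$elapsed = number_format(microtime(true) - $start, 2);
// Anything echoed here ends up in paging.log via the >> redirection.
echo '[' . date('Y-m-d H:i:s') . "] Alert Notifications Sent for http://alerts.illinoisstate.edu/$post_id (SCRIPT: $elapsed seconds)\n";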
On Linux/Unix servers, you can execute a job in the background by using proc_open:
$descriptorspec = array(
    array('pipe', 'r'),               // stdin
    array('file', 'myfile.txt', 'a'), // stdout
    array('pipe', 'w'),               // stderr
);
$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);
The & being the important bit here. The script will continue even if the original script has ended.
Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script run in the background.
Example:
<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script
// Do stuff with received data
Note: Due to a wontfix quirk, calling flush() after fastcgi_finish_request() will cause it to exit without warning/error.
You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection :)
$connected = true;
// Stuff...
fastcgi_finish_request();
$connected = false;
// ...
if ($connected) {
flush();
}
Or
ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if you really shouldn't be calling flush() if you know you've disconnected from the user)
flush();
PHP exec("php script.php") can do it.
From the Manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
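For example, a minimal sketch of that approach (the script and log paths are placeholders):

// Redirect stdout and stderr to a log file and background the process with &,
// so exec() returns immediately instead of hanging until the script finishes.
exec('php /path/to/email_script.php >> /path/to/email.log 2>&1 &');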
And why not make an HTTP request to the script and ignore the response?
http://php.net/manual/en/function.httprequest-send.php
If you make a request to the script you need to call, your webserver will run it in the background and you can (in your main script) show a message telling the user that the script is running.
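The linked HttpRequest class is part of the pecl_http extension. If that extension is not available, the same fire-and-forget idea can be sketched with cURL and a very short timeout (the URL is a placeholder, and the target script should be written so it keeps running after the client disconnects):

// Fire the request and stop waiting after ~100 ms; the web server continues
// to execute the target script even though we no longer read the response.
$ch = curl_init('http://example.com/send_notifications.php?post_id=123');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, true);   // required for sub-second timeouts
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);
curl_exec($ch);
curl_close($ch);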
The simplest way to run a PHP script in the background is
php script.php >/dev/null &
The script will run in the background and the page will also reach the action page faster.
How about this?
Your PHP script that holds the form saves a flag or some value into a database or file.
A second PHP script polls for this value periodically and if it's been set, it triggers the Email script in a synchronous manner.
This second PHP script should be set to run as a cron.
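A rough sketch of that flow, assuming a PDO connection in $pdo, a hypothetical pending_notifications table and a hypothetical send_notifications() helper:

// In the form handler: just record that a notification needs to be sent.
$stmt = $pdo->prepare('INSERT INTO pending_notifications (post_id, status) VALUES (?, ?)');
$stmt->execute(array($post_id, 'pending'));

// In the second, cron'ed script: pick up anything pending and send it.
$rows = $pdo->query("SELECT id, post_id FROM pending_notifications WHERE status = 'pending'");
foreach ($rows as $row) {
    send_notifications($row['post_id']); // hypothetical helper that does the actual emailing
    $pdo->prepare('UPDATE pending_notifications SET status = ? WHERE id = ?')
        ->execute(array('sent', $row['id']));
}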
As far as I know, you cannot do this in an easy way (see fork, exec, etc., which don't work under Windows). Maybe you can reverse the approach and use the background of the browser, posting the form via AJAX, so that while the post is still working you have no wait time.
This can help even if you have to do some long processing.
About sending mail, it's always suggested to use a spooler: maybe a local and quick SMTP server that accepts your requests and then spools them to the real MTA, or put everything in a DB and then use a cron that spools the queue.
The cron may be on another machine calling the spooler as an external URL:
* * * * * wget -O /dev/null http://www.example.com/spooler.php
Background cron job sounds like a good idea for this.
You'll need ssh access to the machine to run the script as a cron.
$ php scriptname.php to run it.
If you can access the server over SSH and can run your own scripts, you can make a simple FIFO server using PHP (although you will have to recompile PHP with POSIX support for fork).
The server can be written in anything really; you could probably do it easily in Python.
Or the simplest solution would be sending an HttpRequest and not reading the return data, but then the server might destroy the script before it finishes processing.
Example server:
<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);

if (file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' exists, please delete it and try again.' . "\n");
}
if (!file_exists(FIFO_PATH) && !posix_mkfifo(FIFO_PATH, 0666)) {
    die('Couldn\'t create the listening fifo.' . "\n");
}

$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for ($i = 0; $i < FORK_COUNT; ++$i) {
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        echo "process(" . posix_getpid() . ", id=$i)\n";
        while (true) {
            $line = chop(fgets($fp));
            if ($line == 'quit' || $line === false) break;
            echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
            // $data = json_decode($line);
            // processData($data);
        }
        exit();
    }
}
fclose($fp);
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>
Example client:
<?php
define('FIFO_PATH', '/home/user/input.queue');

if (!file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' doesn\'t exist, please make sure the fifo server is running.' . "\n");
}

function postToQueue($data) {
    $fp = fopen(FIFO_PATH, 'w+');
    stream_set_blocking($fp, false); // don't block
    $data = json_encode($data) . "\n";
    if (fwrite($fp, $data) != strlen($data)) {
        echo "Couldn't write; the server might be dead or there's a bug somewhere\n";
    }
    fclose($fp);
}

$i = 1000;
while (--$i) {
    postToQueue(array('xx' => 21, 'yy' => array(1, 2, 3)));
}
?>
If you're on Windows, research proc_open or popen...
But if we're on the same server "Linux" running cPanel, then this is the right approach:
#!/usr/bin/php
<?php
$pid = shell_exec("nohup nice php -f 'path/to/your/script.php' > /dev/null 2>&1 & echo $!");
$pid = (int) trim($pid);

// Poll until the background process exits.
while (true) {
    exec("ps -p $pid", $out, $status);
    if ($status !== 0) break; // the process has finished
    $out = array();
    // you can also have a streamer here like fprintf, or fgets
}
?>
Don't use fork() or curl if you doubt you can handle them; it's just like abusing your server.
Lastly, in the script.php file which is called above, take note of this and make sure you write:
<?php
ignore_user_abort(TRUE);
set_time_limit(0);
ob_start();             // really optional, but this is pure php

// Code to be run in the background goes here

ob_flush();
flush();                // these two do the output, if you need some
str_repeat(" ", 1500);  // padding, e.g. for progress bars or loading images
sleep(2);               // standard limit
?>
For a background worker, I think you should try this technique. It lets you call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (asynchronously).
form_action_page.php
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// Call as many pages as you like; all of them will run at once, independently, without waiting for each page's response (asynchronously).
// Your form DB insertion or other code goes here; do whatever you want. The call above works as a background job, so this line is reached before the lines above have received a response.
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*/
function post_async($url, $params)
{
    $post_string = $params;
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    $out = "GET " . $parts['path'] . "?$post_string" . " HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"]; // case 1 output: testValue
// do your background operations here; they will not halt the main page
?>
P.S: if you want to send url parameters as loop then follow this answer: https://stackoverflow.com/a/41225209/6295712
Assuming you are running on a *nix platform, use cron and the php executable.
EDIT:
There are quite a number of questions asking for "running php without cron" on SO already. Here's one:
Schedule scripts without using CRON
That said, the exec() answer above sounds very promising :)
In my case I have 3 params, one of them a string (mensaje):
exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");
In my Process.php I have this code:
if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3]))
{
die("Error.");
}
$idCurso = $argv[1];
$idDestino = $argv[2];
$mensaje = $argv[3];
Use Amphp to execute jobs in parallel & asynchronously.
Install the library
composer require amphp/parallel-functions
Code sample
<?php
require "vendor/autoload.php";
use Amp\Promise;
use Amp\ParallelFunctions;
echo 'started<br>';
$promises[1] = ParallelFunctions\parallel(function (){
// Send Email
})();
$promises[2] = ParallelFunctions\parallel(function (){
// Send SMS
})();
Promise\wait(Promise\all($promises));
echo 'finished';
For your use case, you can do something like below:
<?php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;
$responses = wait(parallelMap([
'a@example.com',
'b@example.com',
'c@example.com',
], function ($to) {
return send_mail($to);
}));
This works for me. Try this:
exec("php asyn.php > /dev/null 2>/dev/null &");

Executing a daemon in php [duplicate]


Open Linux terminal command in PHP

I have a server running Linux that executes commands on 12 nodes (12 computers with Linux running on them). I recently installed PHP on the server to create web pages that can execute commands by opening a specific PHP file.
I used exec(), passthru(), shell_exec(), and system(). system() is the only one that returns part of my output. I would like PHP to act like the open terminal command in Linux and I cannot figure out how to do it!
Here is an example of what is happening now (Linux directly vs PHP):
When using the Linux open terminal command directly:
user#wizard:/home/hyperwall/Desktop> /usr/local/bin/chbg -mt
I get an output:
The following settings will be used:
option = mtsu COLOR = IMAGE = imagehereyouknow!
NODES = LOCAL
and additional code to send it to 12 nodes.
Now with PHP:
switch ($_REQUEST['do']) {
    case 'test':
        echo system('/usr/local/bin/chbg -mt');
        break;
}
Output:
The following settings will be used:
option = mtsu COLOR = IMAGE = imagehereyouknow!
NODES = LOCAL
And it stops! Does anyone have an explanation of what is happening, and how to fix it? Only system() displays part of the output; the other functions display nothing!
My first thought is that it could be something to do with stdout and stderr. Some programs write some information to stdout and some to stderr. When you are not redirecting stderr to stdout, most of the system calls only return the stdout part. That is probably why you see the whole output in the terminal but can't in the system calls.
So try with
/usr/local/bin/chbg -mt 2>&1
Edit:
Also, for a temporary workaround, you can try some other things. For example, redirect the output to a file next to the script and read its contents after executing the command. This way you can use exec:
exec("/usr/local/bin/chbg -mt > chbg_out 2>&1");
// Then start reading chbg_out and see if it worked
Edit2
Also, it does not make sense why the others are not working for you.
For example, this piece of code written in C dumps a string to stderr and another to stdout:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    fputs("\nerr\nrro\nrrr\n", stderr);
    fputs("\nou\nuu\nuttt\n", stdout);
    return 0;
}
and this PHP script tries to run it via exec:
<?php
exec("/tmp/ctest", $result);
foreach ($result as $v) {
    echo $v;
}
// output: ouuuuttt
?>
See, it still dumps the stdout, but it did not receive the stderr.
Now consider this:
<?php
exec("/tmp/ctest 2>&1", $result);
foreach ($result as $v) {
    echo $v;
}
// output: errrrorrrouuuuttt
?>
See, this time we got the whole output.
This time with system():
<?php
echo system("/tmp/ctest 2>&1");
//output: err rro rrr ou uu uttt uttt
?>
and so on ...
Maybe your chbg -mt writes additional output to stderr instead of stdout? Try to execute your script inside PHP like this:
/usr/local/bin/chbg -mt 2>&1
The other responses are good for generic advice. But in this specific case, it appears you are trying to change your background on your desktop. This requires many special considerations because of 'user context':
First, your web server is probably running as a different user, and therefore would not have permissions to change your desktop.
Second, the program probably requires some environmental variables from your user context. For example, X programs need a DISPLAY variable, ssh-agent needs SSH_AGENT_PID and SSH_AUTH_SOCK, etc. I don't know much about changing backgrounds, but I'm guessing it involves D-Bus, which probably requires things like DBUS_SESSION_BUS_ADDRESS, KONSOLE_DBUS_SERVICE, KONSOLE_DBUS_SESSION, and KONSOLE_DBUS_WINDOW. There may be many others. Note that some of these vars change every time you log in, so you can't hard-code them on the PHP side.
For testing, it might be simpler to start your own webserver right from your user session (i.e. don't use the system one; it has to run as you, and you will need to run it on an alternate port, like 8080). The web server you start manually will have all the 'context' it needs. I'll mention websocketd because it just came out and looks neat.
For "production", you may need to run a daemon in your user context all the time, and have the web server talk to that daemon to 'get stuff done' inside your user context.
PHP's system() only returns the last line of output:
Return Value: Returns the last line of the command output on success, and FALSE on failure.
You will most likely want to use either exec or passthru. exec has an optional parameter to put the output into an array. You could implode the output and use that to echo it.
switch ($_REQUEST['do']) {
    case 'test':
        exec('/usr/local/bin/chbg -mt', $output);
        echo implode("\n", $output); // Could use '<br />' if HTML output is desired
        break;
}
I think that the result of execution can change between users.
First, try to run your PHP script directly in your terminal: php yourScript.php
If it runs as expected, go to your Apache service and update it to run with your own credentials.
You are trying to change the backgrounds for currently logged in users, while they are using the desktop. Like while I'm typing this message: I minimize my browser and 'ooh, my desktop background is different'. Hopefully this is for something important, like turning red when the reactor is overheating.
Anyway, to my answer:
Instead of trying to remotely connect and run items as the individual users, set up each user to run a bash script (in their own account, in their own shell) on a repeating timer, say every 10 minutes, that selects the SAME file from a network location:
/somenetworkshare/backgrounds/images/current.png
Then you can update ALL nodes (1 to a million) just by changing the image itself in /somenetworkshare/backgrounds/images/current.png
I wrote something a while ago that does just this -- you can run a command interpreter (/bin/sh), send it commands, read back responses, send more commands, etc. It uses proc_open() to open a child process and talk to it.
It's at http://github.com/andrasq/quicklib, Quick/Proc/Process.php
Using it would look something like (easier if you have a flexible autoloader; I wrote one of those too in Quicklib):
include 'lib/Quick/Proc/Exception.php';
include 'lib/Quick/Proc/Exists.php';
include 'lib/Quick/Proc/Process.php';
$proc = new Quick_Proc_Process("/bin/sh");
$proc->putInput("pwd\n");
$lines = $proc->getOutputLines($nlines = 10, $timeoutSec = 0.2);
echo $lines[0];
$proc->putInput("date\n");
$lines = $proc->getOutputLines(1, 0.2);
echo $lines[0];
Outputs
/home/andras/quicklib
Sat Feb 21 01:50:39 EST 2015
The unit of communication between php and the process is newline terminated lines. All commands must be newline terminated, and all responses are retrieved in units of lines. Don't forget the newlines, they're hard to identify afterward.
I am working on a project that uses Terminal A on machine A to output to Terminal B on machine B, both using Linux for now. I didn't see it mentioned, but perhaps you can use redirection, something like this in your webserver:
switch ($_REQUEST['do']) {
    case 'test':
        // process ID on the target (12345, 12346, etc.)
        echo system('/usr/local/bin/chbg -mt > /proc/<processID>/fd/1');
        // OR
        // device file on the target (pts/0, tty0, etc.)
        echo system('/usr/local/bin/chbg -mt > /dev/<TTY-TYPE>/<TTYNUM>');
        break;
}
Definitely the permissions need to be set correctly for this to work. The command "mesg y" in a terminal may also assist...Hope that helps.

advance process control in PHP

I need to build a system where a user sends a file to the server, and then PHP runs a command-line tool on it using system() (for example, tool.exe userfile).
I need a way to see the PID of the process, so I know which user started the tool, and a way to know when the tool has stopped.
Is this possible on a Windows Vista machine? I can't move to a Linux server.
Besides that, the code must continue to run when the user closes the browser window.
Rather than trying to obtain the ID of a process and monitor how long it runs, I think that what you want to do is have a "wrapper" process that handles pre/post-processing, such as logging or database manipulation.
The first step is to create an asynchronous process that will run independently of the parent and can be started by a call to a web page.
To do this on Windows, we use WshShell:
$cmdToExecute = "tool.exe \"$userfile\"";
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);
...and (for completeness) if we want to do it on *nix, we append > /dev/null 2>&1 & to the command:
$cmdToExecute = "/usr/bin/tool \"$userfile\"";
exec("$cmdToExecute > /dev/null 2>&1 &");
So, now you know how to start an external process that will not block your script, and will continue execution after your script has finished. But this doesn't complete the picture - because you want to track the start and end times of the external process. This is quite simple - we just wrap it in a little PHP script, which we shall call...
wrapper.php
<?php
// Fetch the arguments we need to pass on to the external tool
$userfile = $argv[1];
// Do any necessary pre-processing of the file here
$startTime = microtime(TRUE);
// Execute the external program
exec("C:/path/to/tool.exe \"$userfile\"");
// By the time we get here, the external tool has finished - because
// we know that a standard call to exec() will block until the called
// process finishes
$endTime = microtime(TRUE);
// Log the times etc and do any post processing here
So instead of executing the tool directly, we make our command in the main script:
$cmdToExecute = "php wrapper.php \"$userfile\"";
...and we should have a finely controllable solution for what you want to do.
N.B. Don't forget to escapeshellarg() where necessary!

Php server response to android when executing a batch file

I have an application which is in Java, and I have created a batch file (.bat) for that application and am running it in PHP like this:
<?php
system ("cmd.exe /c final.bat");
?>
The output is something like this
C:\wamp\www\php>java -jar Most closely reseambling is this...C:\wamp\www\php>pause Press any key to continue . . .
I am invoking this through my Android application and want this output as the response... Can you tell me how to do this? It would be great if I could eliminate the prompts from the response.
Check out the exec function. Instead of automatically printing the output, it will return the output as a string, or in an optional array of lines in the second argument. You can then parse this string or array and print only what you need.
Example which would print all but the first line of output:
exec('cmd.exe /c final.bat',$output);
for($i = 1; $i < count($output); ++$i) echo $output[$i],'<br />';
system() prints any output from the command and returns its last line, so a simple
echo system(...);
would do the trick. To be able to send text as input to the invoked command, you should look into using popen() instead, which lets you "use" the invoked command as if you were using it from a shell/command prompt.
Note that system will block until the external program finishes. In your case, with it sitting there with "press any key to continue", this will never happen and your PHP process will just sit there indefinitely (or until max_execution_time is exceeded, whichever comes first).
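To illustrate that, here is a sketch using proc_open (rather than popen, which is one-directional), so the script can both feed a keypress to the batch file's pause prompt and capture the output. The final.bat name is taken from the question; whether the pause prompt accepts piped input depends on the batch file itself.

$descriptors = array(
    0 => array('pipe', 'r'), // stdin  - we write the "any key" here
    1 => array('pipe', 'w'), // stdout - we read the batch output here
    2 => array('pipe', 'w'), // stderr
);
$proc = proc_open('cmd.exe /c final.bat', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "\r\n");   // gets past "Press any key to continue . . ."
    fclose($pipes[0]);
    $output = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
    echo $output;                // send back to the Android client
}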
