Get data stream from SSH connection using ssh2 in PHP

I am trying to retrieve a stream of data that comes from an SSH connection. When I use PuTTY to SSH in, I just start getting data on the screen; no commands are needed.
Now I am trying to do the same thing in PHP so I can manipulate the data and save it to a database. For this task I have already installed the necessary package for ssh2 and was able to get a connection, but I don't know how to get that data. My overall goal is to have this script running as a daemon that continually retrieves information and saves it.
I have tried using ssh2_shell and reading the returned stream resource with stream_get_contents, but it returns false.
$stdio_stream = ssh2_shell($connection);
echo get_resource_type($stdio_stream); // sanity check: should print "stream"
$contents = stream_get_contents($stdio_stream);
if ($contents) {
    print_r($contents);
} else {
    echo 'it failed';
}
And I have tried this, as per the User Contributed Notes:
$stdout_stream = ssh2_exec($connection, "/bin/ls -la /tmp");
$dio_stream = ssh2_fetch_stream($stdout_stream, SSH2_STREAM_STDIO);
$result_dio = stream_get_contents($dio_stream);

Figured it out! The trick is timing. The shell has to have data on it in order to read. Most of the time when using a shell you are sending commands and then reading the output of those commands, but in my case the server is just spitting data out, so I have to sleep(1) to let the buffer fill.
Here is what I got to work for a shell connection that is always sending data.
$sshConn = ssh2_connect($ipAddress, 22);
usleep(500);
ssh2_auth_password($sshConn, $userName, $password);
$shell = ssh2_shell($sshConn);
# Here we are waiting for the shell to initialize.
# Increase this a bit if you get unexpected results.
usleep(9000);
$count = 0;
while ($count < 3) { // run three times
    sleep(1); // let the buffer fill with incoming data
    while (($line = fgets($shell))) {
        echo "$line<br />";
    }
    $count++;
}
See PHP Programming/SSH Class for more information.
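For a long-running daemon, a non-blocking read loop may be preferable to fixed sleeps, so the script reacts as soon as data arrives. Here is a minimal sketch along those lines; the PDO DSN, credentials and the readings table are placeholders, not part of the original setup:
<?php
$conn = ssh2_connect($ipAddress, 22);
ssh2_auth_password($conn, $userName, $password);
$shell = ssh2_shell($conn);
stream_set_blocking($shell, false); // don't hang when nothing has arrived yet

// Placeholder connection and table -- adjust to your schema.
$db = new PDO('mysql:host=localhost;dbname=telemetry', 'user', 'pass');
$insert = $db->prepare('INSERT INTO readings (line, received_at) VALUES (?, NOW())');

while (true) {
    $line = fgets($shell);
    if ($line === false) { // nothing buffered yet
        usleep(100000);    // back off 0.1s instead of busy-spinning
        continue;
    }
    $insert->execute([trim($line)]);
}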

Related

Run PHP function/script in background? [duplicate]

Problem
I have a form that, when submitted, runs basic code to process the submitted information and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS message. This list is trivial at the moment (only about 150 people), but it's enough that it takes upwards of a minute to cycle through the entire table of subscribers and send out 150+ emails. (The emails are being sent individually, as requested by the system administrators of our email server because of mass email policies.)
During this time, the individual who posted the alert will sit on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.
First, the poster might think the server is lagging and click the 'Submit' button again, causing the script to start over or run twice. I could solve this by using JavaScript to disable the button and replace the text with something like 'Processing...', but this is less than ideal because the user will still be stuck on the page for the length of the script execution. (Also, if JavaScript is disabled, this problem still exists.)
Second, the poster might close the tab or the browser prematurely after submitting the form. The script will keep running on the server until it tries to write back to the browser, but if the user then browses to any page within our domain (while the script is still running), the browser hangs loading the page until the script has ended. (This only happens when a tab or window of the browser is closed and not the entire browser application.) Still, this is less than ideal.
(Possible) Solution
I've decided I want to break out the "email" part of the script into a separate file I can call after the notification has been posted. I originally thought of putting this on the confirmation page after the notification has been successfully posted. However, the user will not know this script is running, and any anomalies will not be apparent to them; this script cannot fail.
But, what if I can run this script as a background process? So, my question is this: How can I execute a PHP script to trigger as a background service and run completely independent of what the user has done at the form level?
EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running more frequently than every 5 minutes.
Doing some experimentation with exec and shell_exec, I have uncovered a solution that worked perfectly! I chose to use shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns its output as a string, and this was easier than using exec, assigning the output to a variable and then opening a file to write to.)
I'm using the following line to invoke the email script:
shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");
It is important to notice the & at the end of the command (as pointed out by @netcoder). This tells UNIX to run the process in the background.
The extra variables surrounded in single quotes after the path to the script are passed to the script as $_SERVER['argv'] values that I can read within my script.
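Inside the called script, those arguments can be picked up like this (a sketch; the variable names are illustrative):
<?php
// send_notifications.php -- argument handling sketch
$post_id = isset($_SERVER['argv'][1]) ? $_SERVER['argv'][1] : null; // e.g. '2049'
$type    = isset($_SERVER['argv'][2]) ? $_SERVER['argv'][2] : null; // e.g. 'alert'
if ($post_id === null || $type === null) {
    fwrite(STDERR, "Usage: php send_notifications.php <post_id> <type>\n");
    exit(1);
}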
The email script then appends to my log file via the >> redirect and will output something like this:
[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)
On Linux/Unix servers, you can execute a job in the background by using proc_open:
$descriptorspec = array(
    array('pipe', 'r'),               // stdin
    array('file', 'myfile.txt', 'a'), // stdout
    array('pipe', 'w'),               // stderr
);
$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);
The & is the important bit here. The script will continue to run even if the original script has ended.
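To keep the calling request from holding on to the handles, you can close your ends right away; a sketch, assuming the descriptorspec above (only indexes 0 and 2 are pipes):
fclose($pipes[0]); // our end of the child's stdin
fclose($pipes[2]); // our end of the child's stderr
proc_close($proc); // returns quickly: the shell exits once the job is backgrounded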
Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script run in the background.
Example:
<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script
// Do stuff with received data
Note: due to a wontfix quirk, calling flush() after fastcgi_finish_request() will cause it to exit without warning/error.
You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection :)
$connected = true;
// Stuff...
fastcgi_finish_request();
$connected = false;
// ...
if ($connected) {
    flush();
}
Or
ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if you really shouldn't be calling flush() if you know you've disconnected from the user)
flush();
PHP exec("php script.php") can do it.
From the Manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
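A hedged example of such a call (the paths are placeholders):
exec('php /path/to/email_script.php >> /var/log/email_script.log 2>&1 &');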
And why not make an HTTP request to the script and ignore the response?
http://php.net/manual/en/function.httprequest-send.php
If you make a request to the script you need to call, your web server will run it in the background, and you can (in your main script) show a message telling the user that the script is running.
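One way to fire such a request without waiting for it is a very short cURL timeout (a sketch; the URL is a placeholder, and the target script should call ignore_user_abort(true) so it survives the dropped connection):
$ch = curl_init('http://example.com/email_script.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give up almost immediately
curl_exec($ch); // the timeout "failure" here is expected
curl_close($ch);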
The simplest way to run a PHP script in the background is
php script.php >/dev/null &
The script will run in the background and the user will reach the action page faster.
How about this?
Your PHP script that handles the form saves a flag or some value into a database or file.
A second PHP script polls for this value periodically and, if it's been set, triggers the email script in a synchronous manner.
This second PHP script should be set to run as a cron.
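A minimal sketch of the two halves, assuming a PDO connection $db, a hypothetical pending_alerts table and a hypothetical send_all_notifications() helper:
// In the form handler: set the flag.
$db->prepare('INSERT INTO pending_alerts (post_id) VALUES (?)')->execute([$post_id]);

// In poller.php, run from cron: drain the queue.
foreach ($db->query('SELECT post_id FROM pending_alerts') as $row) {
    send_all_notifications($row['post_id']); // hypothetical helper
    $db->prepare('DELETE FROM pending_alerts WHERE post_id = ?')->execute([$row['post_id']]);
}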
As far as I know you cannot do this in an easy way (see fork, exec, etc., which don't work under Windows). Maybe you can reverse the approach and use the browser's background instead: post the form via Ajax, so the user has no wait time while the post does its work.
This can help even if you have to do some long elaboration.
About sending mail: it's always suggested to use a spooler, maybe a local and quick SMTP server that accepts your requests and spools them to the real MTA, or put everything in a DB and then use a cron that processes the queue.
The cron may be on another machine calling the spooler as an external URL:
* * * * * wget -O /dev/null http://www.example.com/spooler.php
Background cron job sounds like a good idea for this.
You'll need SSH access to the machine to run the script as a cron job.
Run it with: $ php scriptname.php
If you can access the server over SSH and can run your own scripts, you can make a simple FIFO server using PHP (although you will have to compile PHP with posix and pcntl support for the fork).
The server can be written in anything, really; you could probably do it easily in Python.
Or the simplest solution would be sending an HttpRequest and not reading the return data, but then the server might destroy the script before it finishes processing.
Example server:
<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);
if (file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' exists, please delete it and try again.' . "\n");
}
if (!file_exists(FIFO_PATH) && !posix_mkfifo(FIFO_PATH, 0666)) {
    die('Couldn\'t create the listening fifo.' . "\n");
}
$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for ($i = 0; $i < FORK_COUNT; ++$i) {
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        // Child: read lines from the fifo until told to quit.
        echo "process(" . posix_getpid() . ", id=$i)\n";
        while (true) {
            $line = chop(fgets($fp));
            if ($line == 'quit' || $line === false) break;
            echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
            // $data = json_decode($line);
            // processData($data);
        }
        exit();
    }
}
// Parent: wait for all workers to finish, then clean up the fifo.
fclose($fp);
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>
Example client:
<?php
define('FIFO_PATH', '/home/user/input.queue');
if (!file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' doesn\'t exist, please make sure the fifo server is running.' . "\n");
}
function postToQueue($data) {
    $fp = fopen(FIFO_PATH, 'w+');
    stream_set_blocking($fp, false); // don't block
    $data = json_encode($data) . "\n";
    if (fwrite($fp, $data) != strlen($data)) {
        echo "Couldn't write; the server might be dead or there's a bug somewhere\n";
    }
    fclose($fp);
}
$i = 1000;
while (--$i) {
    postToQueue(array('xx' => 21, 'yy' => array(1, 2, 3)));
}
?>
If you're on Windows, research proc_open or popen...
But if we're on the same server "Linux" running cPanel, then this is the right approach:
#!/usr/bin/php
<?php
$pid = shell_exec("nohup nice php -f 'path/to/your/script.php' > /dev/null 2>&1 & echo $!");
while (exec("ps $pid")) {
    // you can also have a streamer here, like fprintf
    // or fgets
}
?>
Don't use fork() or curl if you doubt you can handle them; it's just like abusing your server.
Lastly, in the script.php file which is called above, take note of this and make sure you write:
<?php
ignore_user_abort(TRUE);
set_time_limit(0);
ob_start();
// <-- really optional, but this is pure PHP
// code to be tested in the background goes here
ob_flush(); flush();
// these two do the output process, if you need any output
str_repeat(" ", 1500); // padding, e.g. for progress bars or loading images
sleep(2); // standard limit
?>
For a background worker, I think you should try this technique. It lets you call as many pages as you like; all pages will run at once, independently, without waiting for each page's response, i.e. asynchronously.
form_action_page.php
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// Call as many pages as you like; all will run at once, independently,
// without waiting for each page's response.
// Your form DB insertion or other code goes here; the calls above run as
// background jobs, so this line is reached before they respond.
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*/
function post_async($url, $params)
{
    $post_string = $params;
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    $out = "GET " . $parts['path'] . "?$post_string" . " HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"];//case1 Output > testValue
// here do your background operations it will not halt main page
?>
P.S.: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712
Assuming you are running on a *nix platform, use cron and the php executable.
EDIT:
There are quite a number of questions asking for "running php without cron" on SO already. Here's one:
Schedule scripts without using CRON
That said, the exec() answer above sounds very promising :)
In my case I have 3 params, and one of them is a string (mensaje):
exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");
In my Process.php I have this code:
if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3])) {
    die("Error.");
}
$idCurso   = $argv[1];
$idDestino = $argv[2];
$mensaje   = $argv[3];
Use Amphp to execute jobs in parallel & asynchronously.
Install the library
composer require amphp/parallel-functions
Code sample
<?php
require "vendor/autoload.php";
use Amp\Promise;
use Amp\ParallelFunctions;
echo 'started<br />';
$promises[1] = ParallelFunctions\parallel(function () {
    // Send email
})();
$promises[2] = ParallelFunctions\parallel(function () {
    // Send SMS
})();
Promise\wait(Promise\all($promises));
echo 'finished';
For your use case, you can do something like this:
<?php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;
$responses = wait(parallelMap([
    'a@example.com',
    'b@example.com',
    'c@example.com',
], function ($to) {
    return send_mail($to);
}));
This works for me; try this:
exec("php asyn.php > /dev/null 2>/dev/null &");

php live output for bash script if output is filtered [duplicate]

I'm just experimenting with PHP and shell_exec on my Linux server. It's a really cool function to use and I am really enjoying it so far. Is there a way to view the live output that is going on while the command is running?
For example, if ping stackoverflow.com were run, could PHP show the results every time it pings, while it is still pinging the target address? Is that possible?
I would love to see the live update of the buffer as it's running. Maybe it's not possible, but it sure would be nice.
This is the code I am trying, and every way I have tried it, it always displays the results after the command is finished.
<?php
$cmd = 'ping -c 10 127.0.0.1';
$output = shell_exec($cmd);
echo "<pre>$output</pre>";
?>
I've tried putting the echo part in a loop, but still no luck. Does anyone have any suggestions on making it show the live output on the screen instead of waiting until the command is complete?
I've tried exec, shell_exec, system, and passthru. Every one of them displays the content after it's finished, unless I'm using the wrong syntax or not setting up the loop correctly.
To read the output of a process, popen() is the way to go. Your script will run in parallel with the program, and you can interact with it by reading and writing its output/input as if it were a file.
But if you just want to dump its result straight to the user, you can cut to the chase and use passthru():
echo '<pre>';
passthru($cmd);
echo '</pre>';
If you want to display the output at run time as the program goes, you can do this:
while (@ob_end_flush()); // end all output buffers if any
$proc = popen($cmd, 'r');
echo '<pre>';
while (!feof($proc)) {
    echo fread($proc, 4096);
    @flush();
}
echo '</pre>';
This code should run the command and push the output straight to the end user at run time.
More useful information
Note that if you are using sessions, then having one of these running will prevent the user from loading other pages, as sessions enforce that concurrent requests cannot happen. To prevent this from being a problem, call session_write_close() before the loop.
If your server is behind an nginx gateway, nginx buffering may disrupt the desired behavior. Send the header header('X-Accel-Buffering: no'); to hint to nginx that it shouldn't buffer. As headers are sent first, this has to be done at the beginning of the script, before any data is sent.
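Putting those two tips together with the loop above (a sketch; the command is just an example):
header('X-Accel-Buffering: no'); // must go out before any body output
session_write_close();           // release the session lock
while (@ob_end_flush());         // end all output buffers if any
$proc = popen('ping -c 10 127.0.0.1', 'r');
echo '<pre>';
while (!feof($proc)) {
    echo fread($proc, 4096);
    @flush();
}
echo '</pre>';
pclose($proc);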
First of all, thanks Havenard for your snippet - it helped a lot!
Here is a slightly modified version of Havenard's code which I found useful.
<?php
/**
 * Execute the given command, displaying console output live to the user.
 * @param  string $cmd command to be executed
 * @return array  exit_status : exit status of the executed command
 *                output      : console output of the executed command
 */
function liveExecuteCommand($cmd)
{
    while (@ob_end_flush()); // end all output buffers if any
    $proc = popen("$cmd 2>&1 ; echo Exit status : $?", 'r');
    $live_output = "";
    $complete_output = "";
    while (!feof($proc)) {
        $live_output = fread($proc, 4096);
        $complete_output = $complete_output . $live_output;
        echo "$live_output";
        @flush();
    }
    pclose($proc);
    // get exit status
    preg_match('/[0-9]+$/', $complete_output, $matches);
    // return exit status and intended output
    return array(
        'exit_status' => intval($matches[0]),
        'output' => str_replace("Exit status : " . $matches[0], '', $complete_output)
    );
}
?>
Sample usage:
$result = liveExecuteCommand('ls -la');
if ($result['exit_status'] === 0) {
    // do something if command execution succeeds
} else {
    // do something on failure
}
If you're willing to download a dependency, Symfony's Process component does this. I found its interface cleaner to work with than reinventing anything myself with popen() or passthru().
This was provided by the Symfony documentation:
You can also use the Process class with the foreach construct to get the output while it is generated. By default, the loop waits for new output before going to the next iteration:
$process = new Process('ls -lsa');
$process->start();
foreach ($process as $type => $data) {
    if ($process::OUT === $type) {
        echo "\nRead from stdout: " . $data;
    } else { // $process::ERR === $type
        echo "\nRead from stderr: " . $data;
    }
}
As a warning, I've run into some problems with PHP and Nginx trying to buffer the output before sending it to the browser. You can disable output buffering in PHP by turning it off in php.ini: output_buffering = off. There's apparently a way to disable it in Nginx too, but I ended up using the PHP built-in server for my testing to avoid the hassle.
I put up a full example of this on Gitlab: https://gitlab.com/hpierce1102/web-shell-output-streaming

AJAX - Progress bar for a shell command that is executed

I am making use of AJAX on my site and I would like to show users the progress of a file that is being downloaded by my server.
The download is done by a script that outputs a percentage to the shell. I would like to pass this info back to the user using AJAX. How can I do this?
Thank you for any help and direction.
I hope your solutions do not involve writing to a text file and retrieving that percentage from the text file! Too much overhead, I think.
EDIT - More Info
It is a Linux Shell command - Fedora Core 10.
Currently, this is what the shell output looks like:
[download] 9.9% of 10.09M at 10.62M/s ETA 00:00
The percentage changes and I wish to capture that and send it back to the user as it changes.
To execute this, I make use of PHP's exec() function.
Instead of exec, you could use popen. This will give you a handle you can use with fread to grab the output your command generates as it happens.
You'll need to parse out the updates it makes to the percentage indicator. Once you have that data, there are a few ways you could get it to a client, e.g. with a "comet"-style push, or by having an Ajax request poll for updates.
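For output like the [download] line above, the parsing might look like this (a sketch; it assumes the APCu extension is available for sharing the value, and the command and key name are placeholders):
$proc = popen('/usr/bin/yourcommand 2>&1', 'r');
while (!feof($proc)) {
    $line = fgets($proc);
    // lines look like: [download]  9.9% of 10.09M at 10.62M/s ETA 00:00
    if (preg_match('/([\d.]+)%/', $line, $m)) {
        apcu_store('download_progress', (float) $m[1]);
    }
}
pclose($proc);
Note that progress output often updates a single line with carriage returns rather than newlines, so fgets may not split it usefully; the last answer below shows a delimiter-aware reader for exactly that case.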
I haven't tried this, but I think this approach would work.
You need three pieces:
Have your shell script pipe its output to netcat connected to a port.
Have a PHP script listen to the stream coming from that port, updating a record in memcache or some database with the percentage finished.
Have your web script periodically make Ajax calls to the server, which checks this value in your backend store.
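A minimal sketch of the last piece, the endpoint the Ajax calls poll (assuming the percentage was stored under a hypothetical 'download_progress' key, as in the sketch above):
<?php
// progress.php -- polled by the browser every second or so
header('Content-Type: application/json');
$pct = apcu_fetch('download_progress');
echo json_encode(array('pct' => ($pct === false ? 0 : $pct)));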
I'm working on a similar problem. I have to parse the output of my video conversion shell script. I use popen and parse the output of the returned resource. At first I used fgets, but that didn't recognize the updated values as new lines. So I created a simple function that takes an optional $arg_delimiter, so you can check for other delimiters like the chr(13) carriage return. The example code is a bit modified, and therefore untested, because in my case these functions were methods on my parser object.
function get_line($arg_handle, $arg_delimiter = NULL)
{
    $delimiter = (NULL !== $arg_delimiter) ? $arg_delimiter : chr(10);
    $result = array();
    while (!feof($arg_handle)) {
        $currentCharacter = fgetc($arg_handle);
        if ($delimiter === $currentCharacter) {
            return implode('', $result);
        }
        $result[] = $currentCharacter;
    }
    return implode('', $result);
}
I simply loop over the results from the popen() resource like this:
$command = '/usr/bin/yourcommand';
$handle = popen($command . ' 2>&1', 'r');
while (!feof($handle)) {
    $line = get_line($handle, chr(13));
    preg_match($yourParserRegex, $line, $data);
    if (count($data) > 0) {
        printf("<script type='text/javascript'>\n //<![CDATA[\n window.alert('Result: %s');\n // ]]>\n</script>",
            $data[1]
        );
        flush();
    }
}
Now all you need to do is figure out the comet stuff.
