How to run and forget a command in PHP on Windows

What I'm trying to do is run a command on Windows that launches putty.exe with specific parameters, so that it opens a connection and runs some commands from a txt file on the other server specified in the PuTTY session. I just need to run some scripts from that other Linux server, while this main script runs on a Windows server.
It looks like this:
$putty_exe_path = 'C:\\putty\\putty.exe';
$linux_command_txt_file_path = 'C:\\putty\\commands_to_run_on_linux.txt';
$putty_session_name = 'puttysessionname';
$linux_user = 'root';
$linux_password = 'password';
$cmd = "\"\"".$putty_exe_path."\" -load ".$putty_session_name." -l ".$linux_user." -pw ".$linux_password." -m \"".$linux_command_txt_file_path."\"\"";
shell_exec($cmd);
And it works, but there is a problem when the script I run on Linux this way takes a long time, because my script waits for its output. I would like to run that command with exec()/shell_exec(), or in some other way on the Windows server, and exit. I don't need the output. I need to fire and forget. I was looking for a solution but all I found is this:
> /dev/null 2>/dev/null &
but it doesn't work on windows I guess.

If you want to execute a console command in Windows and not wait for its output, you redirect the output to NUL.
For example, execute this in PHP:
var_dump(system("cmd.exe > NUL")); // will output immediately: string(0) ""
var_dump(system("cmd.exe")); // will wait for the prompt result and write something like "Microsoft Windows ..."
Additionally, if you want to know whether this is really doing something, try:
system("echo SOME TEXT > C:\SOME_FILE.txt"); // this will create a text file on the C: drive of Windows (we don't add > NUL here because there is already an output redirection!)
Give it a try.
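Applied to the command from the question, a minimal fire-and-forget sketch might look like the one below. Note that the start /B plus popen() combination is a common Windows pattern rather than something stated in this answer, so treat it as an assumption to verify:
<?php
// Hedged sketch: build the same PuTTY command as in the question (values are the
// question's placeholders), then detach from it so PHP returns immediately.
$putty_exe_path = 'C:\\putty\\putty.exe';
$linux_command_txt_file_path = 'C:\\putty\\commands_to_run_on_linux.txt';
$putty_session_name = 'puttysessionname';
$linux_user = 'root';
$linux_password = 'password';

$cmd = '"' . $putty_exe_path . '" -load ' . $putty_session_name
     . ' -l ' . $linux_user . ' -pw ' . $linux_password
     . ' -m "' . $linux_command_txt_file_path . '"';

// "start /B" makes cmd.exe launch PuTTY and return at once, "> NUL 2>&1" discards
// any output, and popen()/pclose() avoids waiting for the child process to finish.
pclose(popen('start /B "" ' . $cmd . ' > NUL 2>&1', 'r'));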

Not an answer but too long for a comment
It's important to understand what > /dev/null 2>/dev/null & is doing.
In Linux/Unix operating systems, /dev/null is an emulated device for "nothing": it discards anything written to it. /dev/null doesn't exist on Windows. Read more here: https://en.wikipedia.org/wiki/Null_device
In Linux there are two primary output streams when running commands, stdout and stderr. > /dev/null redirects stdout to /dev/null, 2>/dev/null redirects stderr to /dev/null, and & sends the command to the background. With that in mind, you can look up the equivalents on Windows.
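As a point of comparison, here is the same idea written out from PHP on Linux (the sleep command is only a stand-in for a long-running job); the Windows counterparts are the NUL device and start /B, as the other answer shows:
<?php
// Hedged sketch: each piece of the Linux fire-and-forget pattern, spelled out.
//   > /dev/null   discard stdout
//   2>/dev/null   discard stderr
//   &             run in the background, so exec() returns immediately
exec('sleep 60 > /dev/null 2>/dev/null &');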

Related

How to run a PHP script file in the background forever

My issue seems to have been asked before, but hold on, this one is a bit different.
I have 2 PHP files, and I run the following commands:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
This basically creates 2 PHP processes.
My problem is that sometimes one of the files, or even both of them, gets killed by the server for an unknown reason and I have to re-enter the commands all over again. I tried 'forever' but it doesn't help.
If the server is rebooted I will have to enter those 2 commands again. I thought about a cron job, but I'm not sure whether it would launch them twice, which would create more confusion.
My question is: how do I automatically restart the files if one or both of them get killed? What is the best way to check that file_1.php or file_2.php is indeed running?
There are a couple of ways you can do this. As @Chris Haas mentioned in the comments, supervisord can do this for you, or you can run a watchdog shell script from cron at regular intervals that checks whether your script is running and, if not, starts it. Here's one I use.
#!/bin/bash
FIND_PROC=`ps -ef | grep "php queue_agent.php --env prod" | awk '{if ($8 !~ /grep/) print $2}'`
# if FIND_PROC is empty, the process has died; restart it
if [ -z "${FIND_PROC}" ]; then
    DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
    echo queue_agent.php failed at `date`
    cd ${DIR}
    nohup nice -n 10 php queue_agent.php --env prod -v > ../../sandbox/logs/queue_agent.log &
fi
exit 0
I think you can try to figure out why these two PHP scripts shut down as a first step. To help with that, you can use this PHP function:
void register_shutdown_function ( callback $callback [, mixed $parameter [, mixed $... ]] )
which registers a callback to be executed after script execution finishes or exit() is called. So you can log some info when the PHP files shut down, like this:
function myLogFunction() {
    //log some info here
}
register_shutdown_function('myLogFunction');
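A slightly fuller sketch of that logging callback (the log path below is just a placeholder) could also record the last error, which often reveals why the script died:
<?php
// Hedged sketch: on shutdown, append a timestamp and the last error (if any) to a log file.
register_shutdown_function(function () {
    $entry = date('c') . ' script is shutting down';
    if ($error = error_get_last()) {
        $entry .= ' - last error: ' . $error['message']
                . ' in ' . $error['file'] . ':' . $error['line'];
    }
    file_put_contents('/tmp/shutdown.log', $entry . PHP_EOL, FILE_APPEND);
});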
Instead of sending the standard output and error output to /dev/null, you can send them to a log file (since we might get some helpful info from the output). So instead of using:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
try:
nohup php file_1.php > yourLog.log 2>&1 &
nohup php file_2.php > yourLog.log 2>&1 &
If you want to autorun these two PHP files when the server boots, try editing /etc/rc.local (which is run automatically when the OS starts up) and add your PHP CLI command lines to that file.
If you can't figure out why the PHP processes get shut down, try supervisor as @Chris Haas mentioned.

Start as background process PHP built in server on Windows

I am using the PHP built-in server for testing and I was wondering whether there is a way to hide the cmd window when launching the built-in server using the command php -S 127.0.0.1:809 -t Folder
I am currently working on Windows 10 so I need a Win solution.
Not a hundred percent sure on this, but you might try this one, from:
What is cmd's equivalent to Bash's & (ampersand) for running a command without waiting for it to terminate?
so yours could be something like:
start /B php -S 127.0.0.1:809 -t Folder
You can create a VBS script (run.vbs) and put this code in it:
Dim oShell
Set oShell = WScript.CreateObject ("WSCript.shell")
oShell.run "cmd /C CD resource\php && php -S 127.0.0.1:809 -t HTML", 0
Set oShell = Nothing
The 0 at the end of that line signals that the command-line window should not be displayed.
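If you want to trigger that VBS wrapper from PHP itself (for example from a setup script), a minimal sketch, assuming run.vbs sits next to the PHP file, could be:
<?php
// Hedged sketch: wscript.exe runs the script with the GUI script host, so no console
// window appears; popen()/pclose() lets PHP continue without waiting for the server.
pclose(popen('wscript.exe "' . __DIR__ . '\\run.vbs"', 'r'));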

Redirection in PHP exec call creates empty file

It's quite simple, and I'm out of ideas. I'm sure there is a quick workaround.
exec('echo 123 &> /var/log/123.log');
I'm sure it's not about the permissions, because the file 123.log is created, but it's just empty. I've also tried shell_exec, but it doesn't create the file at all.
I've also tried all variants of redirection, i.e. 1>, 2>, >.
Using PHP to capture the output is not an option, as the output in production is huge, and I don't want to run into memory issues.
Any ideas appreciated.
Btw, I'm using Ubuntu 12.04 LAMP.
Debian and Debian-based Linux distributions like Ubuntu now use dash, not bash, as /bin/sh.
&> is a bash extension that dash does not know about.
The correct POSIX-compatible way to write cmd &> file is cmd > file 2>&1.
cmd > file 2>&1 works in all POSIX-compatible shells: dash, bash, ksh, zsh, ash ...
So you need to change your code to:
exec('echo 123 > /var/log/123.log 2>&1');
Try shell_exec without &:
echo shell_exec("echo 123 > /var/log/123.log");
The only thing that did help was to create a shell script with exec permissions, e.g. test.sh:
#!/bin/bash
echo 123 &>> /var/log/123.log
and execute it like this:
shell_exec('[full path to]/test.sh');
So the redirection operator is not important, but everything else is (the #! directive, shell_exec).

PHP exec in background using & is not working

I am using this code on Ubuntu 13.04,
$cmd = "sleep 20 &> /dev/null &";
exec($cmd, $output);
It actually sits there for 20 seconds and waits. Usually it works fine when using & to send a process to the background, but on this machine PHP just won't do it.
What could be causing this?
Try
<?php
$cmd = '/bin/sleep';
$args = array('20');
$pid = pcntl_fork();
if ($pid == 0) {
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
    // child becomes the standalone detached process
}
echo "DONE\n";
I tested it and it works.
Here you first fork the PHP process and then execute your task.
Or, if the pcntl module is not available, use:
<?PHP
$cmd = "sleep 20 &> /dev/null &";
exec('/bin/bash -c "' . addslashes($cmd) . '"');
The REASON this doesn't work is that exec() executes the string you're passing into it. Since & is interpreted by the shell as "execute in the background", but you don't execute a shell in your exec call, the & is just passed along with 20 to the /bin/sleep executable - which probably just ignores that.
The same applies to the redirection of output, since that is also parsed by the shell, not in exec.
So, you either need to find a way to fork your process (as described above), or a way to run the subprocess as a shell.
My workaround for doing this on Ubuntu 13.04 with Apache2 and any version of PHP:
libssh2-php. I just ran nohup $cmd & inside a local SSH session opened from PHP and it ran fine in the background. Of course this requires putting certain security measures in place, such as enabling SSH access for the webserver user (so it has exec-like permissions) and only allowing localhost to log in to that webserver SSH account.

How to check if a php script is still running

I have a PHP script that listens on a queue. Theoretically, it's never supposed to die. Is there something to check if it's still running? Something like Ruby's God ( http://god.rubyforge.org/ ) for PHP?
God is language agnostic, but it would be nice to have a solution that works on Windows as well.
I had the same issue, wanting to check whether a script is running. So I came up with this and I run it as a cron job. It grabs the running processes as an array and cycles through each line, checking for the file name. Seems to work fine. Replace #user# with your script user.
exec("ps -U #user# -u #user# u", $output, $result);
foreach ($output AS $line) if(strpos($line, "test.php")) echo "found";
In linux run ps as follows:
ps -C php -f
You could then do in a php script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php my_script.php") === false) {
    shell_exec('php my_script.php > /dev/null 2>&1 &');
}
The above code lists all running PHP processes in full, then checks whether "my_script.php" is in the list of running processes; if not, it starts the process without waiting for it to terminate, so it can carry on doing what it was doing.
Just append a second command after the script. When/if it stops, the second command is invoked. E.g.:
php daemon.php 2>&1 | mail -s "Daemon stopped" you@example.org
Edit:
Technically, this invokes the mailer right away, but it only completes the command when the PHP script ends. Doing this captures the output of the PHP script and includes it in the mail body, which can be useful for debugging what caused the script to halt.
Simple bash script:
#!/bin/bash
# restart script.php whenever it is not running
while true; do
    if ! pidof -x script.php > /dev/null; then
        php script.php &
    fi
    sleep 5
done
Not for Windows, but...
I've got a couple of long-running PHP scripts that have a shell script wrapping them. You can optionally return a value from the PHP script that will be checked in the shell script, to either exit, restart immediately, or sleep for a few seconds and then restart.
Here's a simple one that just keeps running the PHP script till it's manually stopped.
#!/bin/bash
clear
date
php -f cli-SCRIPT.php
echo "wait a little while ..."; sleep 10
exec $0
The "exec $0" restarts the script, without creating a sub-process that will have to unravel later (and take up resources in the meantime). This bash script wraps a mail-sender, so it's not a problem if it exits and pauses for a moment.
Here is what I did to combat a similar issue. This helps in the event anyone else has a parameterized PHP script that you want cron to execute frequently, but where only one execution should run at any time. Add this to the top of your PHP script, or create a common method.
$runningScripts = shell_exec('ps -ef | grep '.strtolower($parameter).' | grep '.dirname(__FILE__).' | grep '.basename(__FILE__).' | grep -v grep | wc -l');
if ($runningScripts > 1) {
    die();
}
You can write in your crontab something like this:
0 3 * * * /usr/bin/php -f /home/test/test.php my_special_cron
Your test.php file should look like this:
<?php
php_sapi_name() == 'cli' || exit;
if (isset($argv[1])) {
    substr_count(shell_exec('ps -ax'), $argv[1]) < 3 || exit;
}
// your code here
That way you will have only one active instance of the cron job, with my_special_cron as the process key. So you can add more jobs within the same PHP file, for example:
test.php system_send_emails sendEmails
test.php system_create_orders orderExport
Inspired by Justin Levene's answer, I improved it, since ps -C doesn't work on Mac, which I need in my case. So you can use this in a PHP script (maybe just before you need the daemon alive). Tested on both Mac OS X 10.11.4 and Ubuntu 14.04:
$daemonPath = "FULL_PATH_TO_DAEMON";
$runningPhpProcessesOfDaemon = (int) shell_exec("ps aux | grep -c '[p]hp ".$daemonPath."'");
if ($runningPhpProcessesOfDaemon === 0) {
shell_exec('php ' . $daemonPath . ' > /dev/null 2>&1 &');
}
Small but useful detail: why grep -c '[p]hp ...' instead of grep -c 'php ...'?
Because when counting processes, the grep -c 'php ...' command itself would be counted as a process that fits our pattern. Using a bracket expression for the first letter of php makes the grep command line different from the pattern it searches for.
One possible solution is to have it listen on a port using the socket functions. You can check that the socket is still listening with a simple script. Even a monitoring service like pingdom could monitor its status. If it dies, the socket is no longer listening.
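A rough sketch of such a check from a monitoring script (the host, port, and timeout below are placeholders, not values from this answer):
<?php
// Hedged sketch: if the worker listens on a known port, a monitor can simply try to connect.
$socket = @fsockopen('127.0.0.1', 8899, $errno, $errstr, 2);
if ($socket === false) {
    // nothing is listening, so the worker is presumed dead
    echo "worker appears to be down: $errstr ($errno)\n";
} else {
    fclose($socket);
}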
Plenty of solutions. Good luck.
If you control the script, you can just have it write a timestamp to the database every so often, and then let a cron job check whether that value is still recent.
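A minimal sketch of that heartbeat idea (the DSN, credentials, table, and worker name below are made-up placeholders):
<?php
// Hedged sketch: the long-running worker refreshes a timestamp on every loop iteration,
// and a separate cron-driven checker restarts it when the timestamp goes stale.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');

// Inside the worker loop:
$pdo->prepare('UPDATE heartbeat SET last_seen = NOW() WHERE worker = ?')
    ->execute(['queue_worker']);

// In the cron-driven checker script:
$stmt = $pdo->prepare('SELECT last_seen < NOW() - INTERVAL 2 MINUTE FROM heartbeat WHERE worker = ?');
$stmt->execute(['queue_worker']);
if ($stmt->fetchColumn()) {
    // the worker looks dead; restart it without waiting for it to finish
    shell_exec('php /path/to/worker.php > /dev/null 2>&1 &');
}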
troelskn wrote:
Just append a second command after the script. When/if it stops, the second command is invoked. E.g.:
php daemon.php | mail -s "Daemon stopped" you@example.org
This will call mail each time a line is printed in daemon.php (which should be never, but still.)
Instead, use the double ampersand operator to separate the commands, i.e.
php daemon.php && mail -s "Daemon stopped" you@example.org
If you're having trouble checking for the PHP script directly, you can make a trivial wrapper and check for that. I'm not sufficiently familiar with Windows scripting to show how it's done here, but in Bash it'd look like...
wrapper_for_test_php.sh
#!/bin/bash
php test.php
Then you'd just check for the wrapper like you'd check for any other bash script: pidof -x wrapper_for_test_php.sh
I have used cmder for Windows, and based on this script I came up with the one below, which I managed to deploy on Linux later.
#!/bin/bash
clear
date
while true
do
php -f processEmails.php
echo "wait a little while for 5 secobds...";
sleep 5
done
