php script to kill shell 'sleep' process

I need to write some scripts for some automation work. I put a PHP file on my local Apache server:
test.php
<?php
system("bash inform.sh");
?>
the content of inform.sh is:
#!/bin/bash
proc_id=`ps -ef|grep "sleep"|grep -v "grep"|awk '{print $2}'`
kill -9 $proc_id
I run a sleep process in a shell and open the PHP page in Firefox at localhost/test.php, but it doesn't kill the sleep process.
If I run the PHP script directly from the shell, it works.
What's wrong with this and how do I deal with it? Thanks.

I start the sleep process with the following shell command and then it is OK. The PHP page runs as the apache user, which can only kill processes it owns, so a sleep started by my own user can't be signalled from the web server:
sudo -u apache sleep 2000
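A quick way to see the permission mismatch from the page itself is to print the user PHP runs as next to the owners of the running sleep processes. A minimal sketch, assuming a Linux host where Apache runs as the apache user:
<?php
// whoami shows the account the web server's PHP process runs under.
echo "PHP runs as: " . trim(shell_exec('whoami')) . "\n";
// kill only succeeds when the signalling user owns the target process
// (or is root), so list the owner of every running sleep process.
echo shell_exec('ps -o user=,pid=,comm= -C sleep');
?>
If the owners differ, the kill -9 inside inform.sh fails silently when called through the web server.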

Related

End Process Created by PHP exec in Ubuntu using Apache

I have an Ubuntu VM running in VirtualBox that hosts a server using Apache. The concept of the server is to accept HTTP POST requests, store them in a MySQL database and then execute a Python script with the relevant POST data to be displayed in a Discord channel.
The process itself is working, but each time the PHP script calls the Python script, a new process is created that never actually ends. After a few hours of receiving live data, the server runs out of available memory due to the number of lingering processes. The PHP script has the following exec call as the last line of code:
exec("python3 main.py $DATA");
I would like to come up with a way to actually kill the processes created from this exec command (using user www-data), either in the Python file after the script is executed or automatically with an Apache setting that I probably just do not know about.
When running the following command in a terminal I can see the different processes:
ps -o pid,user,%mem,command ax | sort -b -k3 -r
There are 3 separate processes that show up, one referencing the actual python3 exec command as written in PHP:
9903 www-data 0.4 python3 main.py DATADATADATADATADATADATA
Then another process showing the more common -k start command:
9907 www-data 0.1 /usr/sbin/apache2 -k start
And lastly, another process very similar to the PHP exec command:
9902 www-data 0.0 sh -c python3 main.py DATADATADATADATADATADATA
How can I ensure Apache cleans these processes up - OR what do I need to add into my Python or PHP code to appropriately exec a Python script without leaving behind processes?
Didn't realize the exec command in PHP would wait for output indefinitely. I added this to the end of the string I was using in my exec call: > /dev/null &
i.e.: exec("python3 main.py $DATA > /dev/null &");
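A slightly fuller sketch of the same fix; the escapeshellarg() call and the 2>&1 are my additions, not from the original post. escapeshellarg() guards against shell metacharacters in the POST data, the 2>&1 sends Python's error output to the same place as stdout instead of leaving it attached to Apache's stderr, and the trailing & lets exec() return immediately:
<?php
// $data stands in for the POST payload from the question (hypothetical variable name).
$data = $_POST['payload'] ?? '';
// Quote the argument, silence stdout and stderr, and background the job so the
// Apache worker is not held open waiting for the Python script to finish.
exec('python3 main.py ' . escapeshellarg($data) . ' > /dev/null 2>&1 &');
?>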

How to allow backgrounding a process to survive a session termination?

So I'm using a PuTTY session to run a script in the background.
My script can be reached with:
cd /var/www/listingapp
Then I run the command:
php /var/www/listingapp/public/index.php batch/GetPrefecture init > /dev/null &
so I'm expecting that the script will run in the background, but after I close the PuTTY CLI and re-open it, the process can no longer be found when entering the command:
ps aux | grep /var/www/listingapp
So this means it stopped. How do I make it keep running in the background after the session is terminated?
nohup php /var/www/listingapp/public/index.php batch/GetPrefecture init &
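nohup detaches the command from the controlling terminal, so it keeps running after the PuTTY session closes. If you ever need to start the same batch job from a PHP script rather than by hand, the equivalent would look roughly like this (a sketch that just reuses the paths from the question):
<?php
// Launch the batch command detached from any terminal; the trailing &
// backgrounds it so shell_exec() returns immediately.
shell_exec('nohup php /var/www/listingapp/public/index.php batch/GetPrefecture init'
    . ' > /dev/null 2>&1 &');
?>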

background script spawned by PHP command script print to screen

I need to make a background script that is spawned by a PHP command line script and that echoes to the SSH session. Essentially, I need to do this Linux command:
script/path 2>&1 &
If I just run this command in linux, it works great. Output is still displayed to the screen, but I can still use the same session for other commands. However, when I do this in PHP, it doesn't work the same way.
I've tried:
`script/path 2>&1 &`;
exec("script/path 2>&1 &");
system("script/path 2>&1 &")
...And none of these work. I need it to spawn the process, and then kill itself so that I can free up the session, but I still want the output from the child process to print to the screen.
(please comment if this is unclear... I had a hard time putting this into words :P)
I came up with a solution that works in this case.
I created a wrapper bash script that starts up the PHP script, which in turn spawns a child script that has its output redirected to a file, which the bash script wrapper tails.
Here is the bash script I came up with:
#!/bin/bash
php php_script.php "$@"
ps -ef | grep php_script.log | grep -v grep | awk '{print $2}' | xargs kill > /dev/null 2>&1
if [ "$1" != "-stop" ]
then
tail -f php_script.log -n 0 &
fi
(it also cleans up "tail" processes that are still running so that you don't get a gazillion processes when you run this bash script multiple times)
And then in your php child script, you call the external script like this:
exec("php php_script.php >> php_script.log &");
This way the parent PHP script exits without killing the child script, you still get the output from the child script, and your command prompt is still available for other commands.

launching php script running on server (and opening sockets) from a php page

I need your precious help on a matter I have been spending hours on.
Scope: Apache2 and PHP running on a Raspberry Pi;
Premise: my limited knowledge of the Linux environment!
The objective: launching a long-running PHP script, which opens sockets, from another PHP script running as a web page. In other terms, the application is a chat and I need to start the PHP server script from a web page, for my convenience.
The issue: if I run it from the console, logged in as "pi", with the following command
php -q /var/www/chatSocket.php > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!
it works like a charm, but if I try to do so from a script, with the following (don't mind the concatenated strings and the assignment of output to variables; removing them made no difference):
$result .= "Result of pkill (killed process): " .shell_exec('sudo pkill -f SongWebSocket.php') ."\n";
$result .= "Launching new process: id returned:". shell_exec('php -q /var/www/chatSocket.php > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!') ."\n";
$result .= "Checking running SongWebSocket.php process:" ."\n";
$result .= shell_exec('ps -A aux| grep -e SongWebSocket.php -e USER') ."\n";
... it does not work (it seems to launch the script, but the sockets are not open).
Any clue why this happens?
Also, and this may be due to my limited knowledge of Linux, why do I get a different output from the command
ps aux| grep -e SongWebSocket.php -e USER
if I launch it from the shell as user pi versus from the script as the www-data user?
I look forward to your help. Thanks in advance!
Marco.
The www-data user doesn't have the permission, I guess. Why not add "sudo" to every shell_exec line? (sudo starts the program with root permissions.) It's not pretty and not secure, but it might work on your local home network: sudo php ..., sudo ps -A aux, etc. In addition, you should make sure that PHP's safe_mode is off. You can check that by adding phpinfo(); to your PHP code.
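Before sprinkling sudo everywhere, it can help to confirm which account the web server actually executes commands as, and to keep the long-running script detached from the request that starts it. A rough sketch that reuses the paths from the question; whether the sockets then open still depends on what the www-data user is allowed to do:
<?php
// Show the account shell_exec() runs under (typically www-data for Apache on Raspbian).
echo "shell_exec runs as: " . trim(shell_exec('whoami')) . "\n";
// Start the socket server detached from the web request and capture its PID.
$pid = shell_exec('nohup php -q /var/www/chatSocket.php'
    . ' > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!');
echo "Started chatSocket.php with PID " . trim($pid) . "\n";
?>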

How to check if a php script is still running

I have a PHP script that listens on a queue. Theoretically, it's never supposed to die. Is there something to check if it's still running? Something like Ruby's God ( http://god.rubyforge.org/ ) for PHP?
God is language agnostic but it would be nice to have a solution that works on windows as well.
I had the same issue - wanting to check if a script is running. So I came up with this and I run it as a cron job. It grabs the running processes as an array, cycles through each line and checks for the file name. It seems to work fine. Replace #user# with your script user.
exec("ps -U #user# -u #user# u", $output, $result);
foreach ($output AS $line) if(strpos($line, "test.php")) echo "found";
In linux run ps as follows:
ps -C php -f
You could then do in a php script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php my_script.php")===false) {
shell_exec('php my_script.php > /dev/null 2>&1 &');
}
The above code lists all running PHP processes in full, then checks whether "my_script.php" is in the list; if not, it starts the process without waiting for it to terminate, and carries on with what it was doing.
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php 2>&1 | mail -s "Daemon stopped" you@example.org
Edit:
Technically, this invokes the mailer right away, but only completes the command when the php script ends. Doing this captures the output of the php-script and includes in the mail body, which can be useful for debugging what caused the script to halt.
Simple bash script
#!/bin/bash
while true; do
if ! pidof -x script.php > /dev/null;
then
php script.php &
fi
sleep 5
done
Not for windows, but...
I've got a couple of long-running PHP scripts that have a shell script wrapping them. You can optionally return a value from the script that will be checked in the shell script to exit, restart immediately, or sleep for a few seconds and then restart.
Here's a simple one that just keeps running the PHP script till it's manually stopped.
#!/bin/bash
clear
date
php -f cli-SCRIPT.php
echo "wait a little while ..."; sleep 10
exec $0
The "exec $0" restarts the script, without creating a sub-process that will have to unravel later (and take up resources in the meantime). This bash script wraps a mail-sender, so it's not a problem if it exits and pauses for a moment.
Here is what I did to combat a similar issue. This helps in the event anyone else has a parameterized php script that you want cron to execute frequently, but only want one execution to run at any time. Add this to the top of your php script, or create a common method.
$runningScripts = shell_exec('ps -ef |grep '.strtolower($parameter).' |grep '.dirname(__FILE__).' |grep '.basename(__FILE__).' |grep -v grep |wc -l');
if($runningScripts > 1){
die();
}
You can write in your crontab something like this:
0 3 * * * /usr/bin/php -f /home/test/test.php my_special_cron
Your test.php file should look like this:
<?php
php_sapi_name() == 'cli' || exit;
if (isset($argv[1])) {
substr_count(shell_exec('ps -ax'), $argv[1]) < 3 || exit;
}
// your code here
That way you will have only one active instance of the cron job with my_special_cron as the process key. So you can add more jobs within the same php file.
test.php system_send_emails sendEmails
test.php system_create_orders orderExport
Inspired by Justin Levene's answer and improved it, as ps -C doesn't work on Mac, which I need in my case. So you can use this in a php script (maybe just before you need the daemon alive), tested on both Mac OS X 10.11.4 and Ubuntu 14.04:
$daemonPath = "FULL_PATH_TO_DAEMON";
$runningPhpProcessesOfDaemon = (int) shell_exec("ps aux | grep -c '[p]hp ".$daemonPath."'");
if ($runningPhpProcessesOfDaemon === 0) {
shell_exec('php ' . $daemonPath . ' > /dev/null 2>&1 &');
}
Small but useful detail: why grep -c '[p]hp ...' instead of grep -c 'php ...'?
Because the grep process itself has 'php ...' in its own command line and would be counted as a match. Writing the first letter as the character class [p] means grep's own command line no longer matches the pattern it searches for.
One possible solution is to have it listen on a port using the socket functions. You can check that the socket is still listening with a simple script. Even a monitoring service like pingdom could monitor its status. If it dies, the socket is no longer listening.
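A minimal sketch of that port check, assuming the daemon listens on 127.0.0.1:12345 (a hypothetical port); if the connection fails, the daemon is presumed dead:
<?php
// Try to connect with a 2-second timeout; fsockopen() returns false on failure.
$conn = @fsockopen('127.0.0.1', 12345, $errno, $errstr, 2);
if ($conn === false) {
    echo "daemon not listening: $errstr ($errno)\n";
    // restart it, send an alert, etc.
} else {
    fclose($conn);
    echo "daemon is up\n";
}
?>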
Plenty of solutions.. Good luck.
If you have your hands on the script, you can just have it write a timestamp to the database every so often, and then let a cron job check whether that value is still recent.
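A heartbeat sketch along those lines, assuming a MySQL table heartbeat(name, last_seen) and placeholder PDO credentials:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
// In the long-running worker, refresh the timestamp on every loop iteration:
$pdo->prepare("UPDATE heartbeat SET last_seen = NOW() WHERE name = 'queue_worker'")
    ->execute();
// In a cron job, treat the worker as dead if the timestamp is stale:
$age = $pdo->query(
    "SELECT TIMESTAMPDIFF(SECOND, last_seen, NOW()) FROM heartbeat WHERE name = 'queue_worker'"
)->fetchColumn();
if ($age === false || $age > 300) {
    // no update for 5 minutes: alert, restart the worker, etc.
}
?>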
troelskn wrote:
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php | mail -s "Daemon stopped" you@example.org
This will call mail each time a line is printed in daemon.php (which should be never, but still.)
Instead, use the double ampersand operator to separate the commands, i.e.
php daemon.php && mail -s "Daemon stopped" you@example.org
If you're having trouble checking for the PHP script directly, you can make a trivial wrapper and check for that. I'm not sufficiently familiar with Windows scripting to put how it's done here, but in Bash, it'd look like...
wrapper_for_test_php.sh
#!/bin/bash
php test.php
Then you'd just check for the wrapper like you'd check for any other bash script: pidof -x wrapper_for_test_php.sh
I have used Cmder for Windows and, based on this script, I came up with the one below, which I later managed to deploy on Linux.
#!/bin/bash
clear
date
while true
do
php -f processEmails.php
echo "wait a little while for 5 secobds...";
sleep 5
done
