PHP file not being executed from post-commit - php

I have this in post-commit:
#!/bin/sh
REPOS="$1"
REV="$2"
/usr/bin/php /home/name/svn/scripts/post-commit.php $REPOS $REV
But whatever I do, post-commit.php isn't being executed, not even with a chmod a+rw on it. Also there is no output from exec.
What am I missing?
Update: removed exec > ./logs/log.txt from this example since it seems to confuse people.

try:
#!/bin/sh
REPOS="$1"
REV="$2"
#debug:
echo "------------------------------"
date >> /tmp/debug.txt
echo "$#" >> /tmp/debug.txt
id >> /tmp/debug.txt
env >> /tmp/debug.txt
/usr/bin/php /home/name/svn/scripts/post-commit.php "$REPOS" "$REV" > /full/path/to/log.txt 2>&1
Also, verify that your post-commit script works fine when executed by hand.

exec with a command replaces the current shell process rather than starting a new one, so nothing after that line would run. exec with only a redirection, as in your exec > ./logs/log.txt, doesn't replace the shell; it just redirects all of the script's subsequent output to a relative path that depends on whatever working directory the hook runs in.
The purpose of that particular exec command eludes me anyway, so just remove it and you should be fine.

You'd better run a cd first, to a directory where you really want the shell to execute. I'm not sure what working directory SVN uses when running this hook, and of course your script may also have permission problems.

Related

How to run a PHP script file in the background forever

My issue seems to have been asked before, but hold on, this one is a bit different.
I have 2 php files, I run the following commands:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
This will create basically 2 php processes.
My problem is that sometimes one of the files, or even both of them, gets killed by the server for an unknown reason and I have to re-enter the commands all over again. I tried 'forever' but it doesn't help.
If the server is rebooted I will have to enter those 2 commands again. I thought about a cron job, but I'm not sure whether it would launch them twice, which would create more confusion.
My question is: how do I start the files automatically if one or both of them get killed? What is the best way to check that file_1.php or file_2.php is indeed running?
There are a couple of ways you can do this. As @Chris Haas mentioned in the comments, supervisord can do this for you, or you can run a watchdog shell script from cron at regular intervals that checks whether your script is running and, if not, starts it. Here's one I use.
#!/bin/bash
FIND_PROC=`ps -ef | grep "php queue_agent.php --env prod" | awk '{if ($8 !~ /grep/) print $2}'`
# if FIND_PROC is empty, the process has died; restart it
if [ -z "${FIND_PROC}" ]; then
    DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
    echo queue_agent.php failed at `date`
    cd ${DIR}
    nohup nice -n 10 php queue_agent.php --env prod -v > ../../sandbox/logs/queue_agent.log &
fi
exit 0
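If you go the cron route, a minimal crontab entry for the watchdog could look like this (the path and the 5-minute interval are placeholders, not part of the script above):
# hypothetical schedule: run the watchdog every 5 minutes
*/5 * * * * /home/name/scripts/check_queue_agent.sh >> /tmp/queue_agent_watchdog.log 2>&1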
I think you can try to figure out why these two php scripts shut down as a first step. To help with that, you can use this PHP function:
void register_shutdown_function(callable $callback [, mixed $parameter [, mixed $... ]])
which registers a callback to be executed after script execution finishes or exit() is called. So you can log some info when the php files shut down, like this:
function myLogFunction() {
    // log some info here
}
register_shutdown_function('myLogFunction');
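A slightly fuller sketch, assuming you are free to pick a log path (here /tmp/shutdown.log), that also records the last fatal error if there was one:
<?php
// Register a shutdown hook near the top of file_1.php / file_2.php.
register_shutdown_function(function () {
    $entry = date('c') . ' shutting down';
    $error = error_get_last();   // non-null if the script died on an error
    if ($error !== null) {
        $entry .= sprintf(' after error: %s in %s:%d',
            $error['message'], $error['file'], $error['line']);
    }
    file_put_contents('/tmp/shutdown.log', $entry . PHP_EOL, FILE_APPEND);
});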
Instead of sending the standard output and error output to /dev/null, you can send them to a log file (the output may contain some helpful info). So instead of using:
nohup php file_1.php >/dev/null 2>&1 &
nohup php file_2.php >/dev/null 2>&1 &
try:
nohup php file_1.php > file_1.log 2>&1 &
nohup php file_2.php > file_2.log 2>&1 &
If you want these two php files to run automatically when the server boots, try editing /etc/rc.local (which is run automatically when the OS starts up) and add your php CLI command lines to it.
If you can't figure out why the php processes get shut down, try supervisor as @Chris Haas mentioned.
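For reference, a minimal supervisord program section for one of the scripts might look roughly like this (the program name and paths are assumptions):
[program:file_1]
command=php /path/to/file_1.php
autostart=true
autorestart=true
stdout_logfile=/var/log/file_1.out.log
stderr_logfile=/var/log/file_1.err.log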

Run compiled python script from PHP

I need to run a python script, compiled with pyinstaller, via a PHP-generated webpage.
I tried shell_exec(), exec() and system() without success.
I regularly run the script from the terminal in the background using:
temperature_sensor_code > /dev/null 2>&1
I've added the www-data user to sudoers. I know it's not a good way, but I need it in order to send the killall temperature_sensor_code command (this works).
This is my situation:
<?php
$run = escapeshellcmd('temperature_sensor_code > /dev/null 2>&1');
shell_exec($run);
header("Refresh: 0; URL=index.php");
?>
I've made a symlink in /usr/bin, also tried with the full path of the script with no luck.
UPDATE: to make it simpler, I've created a simple sh script, run.sh, put it in /var/www and ran it with
shell_exec("/var/www/run.sh");
This works for me. So I put my script temperature_sensor_code in /var/www, but that doesn't work. If I add var_dump(exec("/var/www/temperature_sensor_code/temperature_sensor_code"));
it gives me: string(0) ""
I think there are problems with the compiled python script, because the PHP side seems to be OK.
escapeshellcmd() does this:
Escape shell metacharacters
$run = escapeshellcmd('temperature_sensor_code > /dev/null 2>&1');
var_dump($run);
string(43) "temperature_sensor_code \> /dev/null 2\>\&1"
But you have shell metacharacters that you do want to behave as shell metacharacters:
temperature_sensor_code > /dev/null 2>&1
                        ^           ^^^^
You're also doing no troubleshooting at all:
You discard all command output (that's what sending it to /dev/null does)
You don't get the return code
I suggest that you:
Switch to exec() and make sure you use all its arguments and not just the mandatory ones
Get rid of > /dev/null until you diagnose the issue
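A hedged sketch of those two suggestions together, using the path from the question's update (run it in the foreground just for diagnosis):
<?php
// Run the command without discarding its output, and capture the exit code too.
$output = [];
$returnCode = 0;
exec('/var/www/temperature_sensor_code/temperature_sensor_code 2>&1', $output, $returnCode);

var_dump($returnCode);              // non-zero usually means it failed to start or run
var_dump(implode("\n", $output));   // whatever the program printed, including errors
Once this tells you what is going wrong, you can go back to backgrounding the command.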

Allow PHP/Apache to shell_execute commands on Ubuntu

I'm trying to execute a command through PHP with shell_exec. The PHP file is hosted by Apache on my Ubuntu server.
When I run this:
echo shell_exec("ps ax | grep nginx");
Then I get to see data. But when I run another command, for example:
echo shell_exec("cat /usr/local/nginx/config/nginx.config");
Then it's not showing anything at all. But when I copy that command and paste it into my terminal, it executes fine.
My Apache server is running as user www-data. So I edited sudoers and added this line:
www-data ALL=(ALL:ALL) ALL
I know this is a security risk, but I wanted to make sure (for now) that www-data is able to execute all commands. But, for some reason I'm still not able to execute all commands with my PHP script.
Anyone any idea what to do?
Have you read http://php.net/manual/en/function.shell-exec.php?
There is quite a discussion in the comments section. The top comment is:
If you're trying to run a command such as "gunzip -t" in shell_exec and getting an empty result, you might need to add 2>&1 to the end of the command, eg:
Won't always work:
echo shell_exec("gunzip -c -t $path_to_backup_file");
Should work:
echo shell_exec("gunzip -c -t $path_to_backup_file 2>&1");
In the above example, a line break at the beginning of the gunzip output seemed to prevent shell_exec printing anything else. Hope this saves someone else an hour or two.
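Applied to the command from the question, the same trick makes whatever cat prints to stderr (permission denied, no such file, and so on) show up instead of an empty result:
echo shell_exec("cat /usr/local/nginx/config/nginx.config 2>&1");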
echo shell_exec("sudo cat /usr/local/nginx/config/nginx.config");
Try that.

Redirection in PHP exec call creates empty file

It's quite simple, and I'm out of ideas. I'm sure there is a quick workaround.
exec('echo 123 &> /var/log/123.log');
I'm sure it's not about the permissions, because the file 123.log is created, but it's just- empty. I've also tried shell_exec, but it doesn't create the file at all.
Also tried all variants of redirection, i.e. 1> 2> >.
Using PHP to capture the output is not the option, as the output in production is huge, and I don't want to run into memory issues.
Any ideas appreciated.
Btw, I'm using Ubuntu 12.04 LAMP.
Debian and Debian-based Linux distributions like Ubuntu use dash, not bash, as /bin/sh these days.
&> is a bash extension that dash does not know about.
The correct POSIX-compatible way to write cmd &> file is cmd > file 2>&1.
cmd > file 2>&1 works in all POSIX-compatible shells: dash, bash, ksh, zsh, ash ...
So you need to change your code to:
exec('echo 123 > /var/log/123.log 2>&1');
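If you want to double-check which shell PHP hands commands to, a quick sketch (this assumes /bin/sh is a symlink, as it is on Debian/Ubuntu):
<?php
// PHP's exec()/shell_exec() run commands via /bin/sh; see where that points.
echo shell_exec('readlink -f /bin/sh');   // prints e.g. /bin/dash on Ubuntu 12.04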
Try shell_exec without the &:
echo shell_exec("echo 123 > /var/log/123.log");
The only thing that did help was to create a shell script with execute permissions, e.g. test.sh:
#!/bin/bash
echo 123 &>> /var/log/123.log
and execute it like this:
shell_exec('[full path to]/test.sh');
So the redirection operator is not the important part; everything else is (the #! directive, shell_exec).

PHP command line exec(): multiple commands and directories?

I am trying to execute multiple commands in PHP using exec() and shell_exec(), but I am getting a null value back, which I shouldn't, and nothing happens (if I copy and paste the strings below into the command line, they work fine and accomplish the job). These are the commands I am using:
$command = "cd /../Desktop/FolderName;";
$command .= 'export PATH=$PATH:`pwd`;';
$command .= 'Here i execute a compiler;';
and then I use escapeshellcmd():
$escaped_command = escapeshellcmd($command);
then
shell_exec($escaped_command);
Any ideas what I am doing wrong? I also tried escapeshellarg() instead of escapeshellcmd().
Solution: the problem was that the compiler had no execute permission for other users.
When you use the exec() function in PHP, the command runs as the www-data user, so you need to give www-data permission, either through ACLs on Ubuntu (or whatever Linux-based operating system you use) or on the files you need to execute. (You can find out which user it runs as with exec('whoami').)
(Sorry for my bad English)
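A small hedged sketch of those two checks (the compiler path is a placeholder):
<?php
// Which user is PHP running commands as? Usually www-data under Apache.
echo shell_exec('whoami');

// Does that user have execute permission on the compiler binary?
var_dump(is_executable('/path/to/compiler'));   // placeholder path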
On Linux you can put your commands in a shell script.
You can put this in any file:
#!/bin/bash
cd /../Desktop/FolderName
export PATH=$PATH:`pwd`
EXECUTE COMPILER
Save this as file.sh.
Then add execute permissions:
chmod +x path/to/file.sh
From PHP, you can call this script by executing:
shell_exec('sh path/to/file.sh');
Hope this helps!
