I am trying to build a PHP wrapper for Power BI. I installed PowerBI-Cli (https://github.com/Microsoft/PowerBI-Cli) on my local machine, and when I run any PowerBI-Cli command from my terminal, it works well. It even works when I run the commands as the _www user (sudo -u _www powerbi config).
However, when I run them through PHP using either shell_exec or Symfony's Process component (https://symfony.com/doc/current/components/process.html), I get the following exception:
env: node: No such file or directory.
I am facing this issue on macOS Sierra. The commands work fine on Linux using PHP's exec().
Try creating a symlink:
ln -s /path/where/command/is/stored /path/where/you/want/to/exec
Sometimes the program is installed in /usr/local/bin/program, while by default you are executing /usr/bin/program.
Then use the new path in your shell call.
For example, if your code expects the command at /usr/bin/powerbi but the binary actually lives elsewhere, the command above lets you link it into place, and you can then use that path in exec() or shell_exec().
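In this case the error message names node, so the likely culprit is that the powerbi script starts with #!/usr/bin/env node and the PATH of the PHP process does not include the directory where node lives. A minimal sketch of the idea, assuming node was installed under /usr/local/bin (note that on recent macOS, System Integrity Protection may prevent linking into /usr/bin, in which case prepending to PATH for the one call is the safer route):
$output = shell_exec('PATH=/usr/local/bin:$PATH powerbi config 2>&1');
var_dump($output);
Here the single-quoted PHP string leaves $PATH for the shell to expand, so env can resolve node despite the stripped-down PATH that shell_exec() normally sees.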
Try using the full path rather than the bare command. Without knowing your exact path I can't tell you exactly what to do, but it would be something like this:
$output = shell_exec("sudo -u _www /path/path/powerbi config");
var_dump($output);
Edit:
Or, change directory first. Using my example above, it would be (note the ./ so the shell doesn't need the directory on its PATH):
$output = shell_exec("cd /path/path; sudo -u _www ./powerbi config");
My PHP code runs a shell file, which opens a tmux session and runs a Node.js bot. When I run it as php phpfile.php from the terminal, it works, but when I request phpfile.php from the browser, it does not. As far as I understand, the problem is with the permissions of the apache user, but it does not work even though I have given that user all kinds of permissions. When I try the command sudo -u apache tmux new -s node, I get the result [exited].
PHP code:
<?php shell_exec('bash ./tmux.sh'); ?>
Shell code:
tmux new -s node
tmux send-keys -t node.0 "node ./js/bot.js" ENTER
Note: I added the apache user to visudo like apache ALL=(ALL:ALL) NOPASSWD: ALL
Note 2: I run this on an AWS EC2 server using Amazon Linux.
If you are using an operating system that uses the SELinux security layer, try turning it off or writing a rule for it.
I always turn it off.
nano /etc/selinux/config
SELINUX=disabled
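To check whether SELinux is the culprit without editing the config file, you can switch it to permissive mode for the current session (this resets on reboot):
sudo setenforce 0
getenforce
If the PHP call starts working after that, SELinux was the blocker, and you can either keep it disabled via the config above or write a targeted policy instead.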
I have a PHP application that runs on Docker. I would like to send a command from the PHP code to the container that should create files (some_dir/certs/cert.crt etc.). I run this command like so (via system/exec/shell_exec or symfony/process):
system("traefik-certs-dumper file --source acme.json --dest some_dir --version v2");
When PHP runs this code, the directory is created but not the files, and I don't get any error.
The command works when I run it from the terminal via docker exec, but not from PHP. This is probably some permission problem between PHP and the Docker container, but I don't know how to resolve it.
I tried setting this in the Dockerfile, but it doesn't work:
RUN chmod 777 /go/bin/traefik-certs-dumper
RUN usermod -u 1000 www-data
Standard commands also work, like these:
system("mkdir -p some_dir_1234");
system("touch some_dir_1234/some_file_1234");
How can I allow an installed library to create files?
I finally was able to find a solution. In a separate container I had a process supervisor running, which saw this file because it was also mounted into the main application directory. What had to be done was to mount the file into both the main container and the supervisor container.
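For reference, a minimal sketch of that setup with the docker CLI (the container and image names are hypothetical; the point is that both containers mount the same acme.json):
docker run -d --name app -v "$PWD/acme.json:/app/acme.json" my-app-image
docker run -d --name supervisor -v "$PWD/acme.json:/app/acme.json" my-supervisor-image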
Try
exec("traefik-certs-dumper file --source acme.json --dest some_dir --version v2");
instead of system().
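Either way, it helps to capture stderr and the exit code so that a silent failure becomes visible from PHP. A sketch:
<?php
// 2>&1 merges stderr into stdout so error messages are captured too
exec("traefik-certs-dumper file --source acme.json --dest some_dir --version v2 2>&1", $output, $exitCode);
var_dump($exitCode, $output);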
I have a PHP script running as a cronjob on the server.
However, I am unexpectedly getting a different $PATH for the same user, depending on how I execute the command.
I log in as user ubuntu:
ubuntu#:$ echo $PATH
/home/ubuntu/bin:/home/ubuntu/.local/bin:/home/ubuntu/.nvm/versions/node/v12.3.1/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
I then sudo su bitbucket:
bitbucket#:$ echo $PATH
/home/bitbucket/.nvm/versions/node/v12.3.1/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
I execute a script from a cronjob running as bitbucket and output the following debug to a log file:
$ whoami
bitbucket
The above proves the user is bitbucket, then:
$ echo $PATH
/usr/bin:/bin
Please note I am not running as sudo. I am utilising sudo to switch user, but not using sudo to echo $PATH.
How is it that the same user has 2 different $PATH variables?
You didn't say which shell you're using so I'm going to assume it's bash. The first, and perhaps most important, thing to note is that when you run sudo su bitbucket you're getting an interactive shell. Which means that ~/.bashrc will be sourced. Lots of people modify PATH in that script. Something that tends to cause problems. Why? Because non-interactive shells, such as the one launched by cron to run your command, won't read ~/.bashrc.
Your cron job gets a PATH equivalent to running this command: sudo su bitbucket -c 'echo $PATH'. Play around with that to get a better understanding of how this works. For example, instead of echo $PATH try env.
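A common fix is to declare PATH explicitly at the top of the crontab (crontab -e as bitbucket). A sketch reusing the node path from the interactive shell above; the script path and schedule are placeholders:
PATH=/home/bitbucket/.nvm/versions/node/v12.3.1/bin:/usr/local/bin:/usr/bin:/bin
* * * * * /usr/bin/php /home/bitbucket/script.php >> /tmp/script.log 2>&1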
I have meshlab installed on my machine running Ubuntu 14.04. I can access it from the command line using the meshlabserver command. But a problem arises whenever I try to call it from a PHP script using the command
<?php
system('meshlabserver 2>&1');
?>
It shows the error meshlabserver: cannot connect to X server. After going through a few websites, I did the following things:
I moved the meshlabserver executable from /usr/bin to /usr/local/bin and gave it executable permissions using
sudo chmod a+x meshlabserver
But when I ran the whoami command from my PHP script (the one calling meshlabserver), it showed www-data. So I gave executable permissions for all users to meshlabserver using
sudo chmod 777 /usr/local/bin/meshlabserver
But it still shows the same meshlabserver: cannot connect to X server error. The meshlabserver command works fine when run from the command line.
I really need to call meshlab from the PHP script for my website, so any help would be highly appreciated. Thanks in advance.
It seems the PHP script can't access your display variable. If you logged in via SSH, remember to tunnel your X server via ssh -X. Your second option is to create a virtual framebuffer using Xvfb and point the DISPLAY variable at it:
export DISPLAY=:100.0
Xvfb :100 &
Note the ampersand for the second command as Xvfb needs to be running in the background.
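Putting it together from PHP, you can set DISPLAY for the single call. A sketch, assuming Xvfb is already running on display :100 as above (the input and output file names are placeholders):
<?php
// DISPLAY tells meshlabserver which X server to render against
system('DISPLAY=:100.0 meshlabserver -i input.ply -o output.ply 2>&1');
?>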
A combo of prior answers works for me:
ssh -X, as well as export DISPLAY=:0.0 (on the remote machine)
I need to make a dump via pg_dump in PHP, so I've got a function like this:
function fnDump()
{
    exec("/usr/local/bin/sudo -u pg_user /usr/local/bin/pg_dump mon_alarm > /usr/home/user/monitor_test/renew_db/mon_alarm.sql", $out);
    var_dump($out);
}
The problem is that the mon_alarm.sql file ends up empty.
But when I execute this command from the command line, everything works fine.
What should I change to create the dump from PHP?
If you're running PHP under a standard webserver setup, that won't work, because the script runs in the webserver user's context, and sudo won't let you switch user like that by default.
If this needs to run unattended, you'll have to configure sudo so that the webserver user may run only the pg_dump command, with passwordless sudo; otherwise your sudo command will prompt for a password and break your automated process.
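For example, a rule like this in sudoers (edit it with visudo; it assumes the webserver runs as www-data and uses the pg_dump path from the question) lets the web user run only pg_dump, as pg_user, without a password:
www-data ALL=(pg_user) NOPASSWD: /usr/local/bin/pg_dump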