Run bash command via PHP, change dir then execute binary - php

So here's what I'm trying to do: I'm trying to start a binary under another user via a PHP script. Apache has sudo access. The command works fine when run via PuTTY while logged in as "test".
passthru('bash -c "sudo -u test cd /home/test/cs/ ; ./hlds_run"');
I might also add that
passthru('bash -c "sudo -u test ./home/test/cs/hlds_run"');
won't work because of how the binary is written (it won't find its resources unless you cd into the folder first; tested in a terminal).

If everyone has access to /home/test/cs:
passthru('cd /home/test/cs && sudo -u test ./hlds_run');
If only the user test has access to the directory:
passthru('sudo -u test sh -c "cd /home/test/cs && ./hlds_run"');
To arrive at the second invocation, you should already be familiar with system vs execve semantics (used by passthru and sudo respectively).
This is the tested shell string we need to run as a specific user:
cd /home/test/cs && ./hlds_run
We can ensure that it always runs as a specific user with sudo, but sudo uses execve semantics. We therefore have to convert our shell string to an execve array, and since this command (a) relies on shell functionality like cd and (b) does not include dynamic values, the best way to do this is simply to invoke a shell to interpret it verbatim:
{ sh, -c, cd /home/test/cs && ./hlds_run }
We can now apply sudo to run as our specific user:
{ sudo, -u, test, sh, -c, cd /home/test/cs && ./hlds_run }
passthru passes its command string to a shell, so now we have to convert the execve array above back into a shell string, taking extreme care with quoting so that the shell parses it into exactly the argument list above. Fortunately, this is a relatively simple case:
sudo -u test sh -c "cd /home/test/cs && ./hlds_run"
We can now give it to passthru:
passthru('sudo -u test sh -c "cd /home/test/cs && ./hlds_run"');
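As a hedged sketch, the same command string can be assembled in PHP with escapeshellarg() handling the inner quoting, and the exit status captured through passthru()'s second argument (the path and binary are the ones from the question):
<?php
// Build the inner shell command and let escapeshellarg() quote it for the outer shell.
$inner = 'cd /home/test/cs && ./hlds_run';
$cmd   = 'sudo -u test sh -c ' . escapeshellarg($inner);

// passthru() streams the command's output and reports its exit status.
passthru($cmd, $exitCode);

if ($exitCode !== 0) {
    error_log("hlds_run exited with status $exitCode");
}
On PHP 7.4 and later, proc_open() also accepts an execve-style argument array directly (no outer shell involved), which removes the quoting step at the cost of managing the process handle yourself.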

Have you tried this? (Make sure you have execute rights on hlds_run.)
passthru('bash -c "sudo -u test && /home/test/cs/hlds_run"');

Related

Running tar command from php causes Cannot unlink: Read-only file system

I am trying to implement a simple backup feature for some directories (mainly directories in /etc), handled by Laravel. Basically, I store .tar archives containing the files of a specific directory.
This is a command used to create a backup archive of a single directory:
shell_exec("cd {$backupPath} && tar -cf {$dirName}.tar -P {$fullPathToDir}")
This is a command to restore directory from a backup archive:
shell_exec("cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first")
For testing purposes I let the http user run sudo tar; my initial idea, however, was to create a bash script to handle that and add it to sudoers. Running the command directly or via the shell script gives the same errors.
The problem is that if I run it through PHP, I get errors like this:
Cannot unlink: Read-only file system
But if I run it from the command line, it works:
su http -s /bin/bash -c "cd / && sudo tar -xf {$backupPath . $dirName} --recursive-unlink --unlink-first"
Running this on both a full Arch Linux system and an Arch Linux Docker container gives me the same results. I would appreciate any kind of help.
The issue was with the systemd unit for php-fpm 7.4, where ProtectSystem was enabled; after commenting it out, everything worked as expected:
sed -i 's:ProtectSystem=full:#ProtectSystem=full:' /usr/lib/systemd/system/php-fpm7.service
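As an aside, the shell_exec() calls above interpolate PHP variables straight into the command line; here is a minimal sketch of the backup call with each value escaped, assuming $backupPath, $dirName and $fullPathToDir hold plain filesystem paths:
// Hedged sketch: the same backup command, with the interpolated values
// quoted via escapeshellarg() so unusual path characters cannot break it.
$cmd = 'cd ' . escapeshellarg($backupPath)
     . ' && tar -cf ' . escapeshellarg($dirName . '.tar')
     . ' -P ' . escapeshellarg($fullPathToDir);
shell_exec($cmd);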

Need to trigger a Linux Backend script with WordPress Button

I have made a button in WordPress and linked it to a Linux backend script.
This gives me no error, and the user it reports is "www-data":
#!/bin/bash
whoami
touch file
But I want to trigger a git commit using my local user through this script.
#!/bin/bash
sudo -u my_username -p my_password -H sh -c "cd /var/www/html/forcetalks_new/; /usr/bin/git add . ; touch test_file"
This is neither touching the file nor adding the files. I wonder what the possible reason could be; any other solution is also welcome.
PS: I tried using Git commands in the first script after giving "www-data" sudo permission for Git, and that didn't work either.
Try the first script approach with:
#!/bin/bash
sudo -u my_username -p my_password -H sh -c "/path/to/script > /path/to/log 2>&1"
So a script calling a script:
#!/bin/bash -x
whoami
git status
git add .
touch file
The goal is to see what is going on via /path/to/log, thanks to the bash -x option (and the git status command).
Note that if you want to add "file" itself, you should touch the file first, then add it.
Note also that a git commit might be needed at some point to actually persist those changes in the Git repository.
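If it helps, here is a hedged sketch of the PHP side that could sit behind the WordPress button, assuming www-data is allowed in sudoers to run the wrapper without a password (the script and log paths are the placeholders from the answer above):
<?php
// Hypothetical button handler: run the wrapper script as my_username and
// show the log so the output of bash -x, whoami and git status is visible.
$log = '/path/to/log';
shell_exec('sudo -u my_username -H sh -c ' . escapeshellarg("/path/to/script > $log 2>&1"));
echo '<pre>' . htmlspecialchars((string) file_get_contents($log)) . '</pre>';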

How to run Docker from php?

I am a beginner with Docker, and I am trying to run Docker from PHP; specifically, I need to run OpenFace code. I used the commands provided at https://cmusatyalab.github.io/openface/setup/ to make sure Docker runs correctly on my PC, and it works. Now I need to call it from PHP, so I wrote the same commands in a batch file as follows:
docker run -p 9000:9000 -p 8000:8000 -t -i bamos/openface /bin/bash
cd /root/openface
./demos/compare.py images/examples/{lennon*,clapton*}
pause
and tried to execute it in PHP by calling:
echo shell_exec ("test.bat");
But even when I run the batch file directly, only the first line is executed; it seems the subsequent commands are not being executed inside the Docker container.
How can I make all the commands execute?
Any help will be very much appreciated, thank you.
The problem is that the first bash won't exit until you exit it, so the rest of the commands are interpreted by the host shell.
What you need is for your work to be done first, and only then to be dropped into a bash shell inside the container:
docker run -p 9000:9000 -p 8000:8000 -t -i bamos/openface /bin/bash -c "cd /root/openface && ./demos/compare.py images/examples/{lennon*,clapton*} && exec bash"
The first bash -c executes the commands, and the final exec bash replaces that shell and leaves you at a prompt inside the container.
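Translated into the PHP call from the question, a hedged sketch might look like this (dropping -t, -i and the trailing exec bash, since shell_exec() has no interactive TTY to hand over; --rm just cleans up the container afterwards):
<?php
// Hedged sketch: run the OpenFace demo non-interactively from PHP.
// The image name and demo paths are the ones from the question.
$inner = 'cd /root/openface && ./demos/compare.py images/examples/{lennon*,clapton*}';
$cmd   = 'docker run --rm -p 9000:9000 -p 8000:8000 bamos/openface /bin/bash -c '
       . escapeshellarg($inner);
echo shell_exec($cmd . ' 2>&1');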

Bin Bash file not found

I'm a real beginner with shell scripts.
I found a small sh script that converts a PHP file to HTML with wget, and I would like to run it as a small cron job. But every time I run the script I get the message (translated) "bad interpreter: No such file or directory".
My script is just:
#!/bin/bash
rm -rf header-wrapper.html && wget http://master.gnetwork.eu/header-wrapper.php -O header-wrapper.html -q
The error message tells you that your shebang is failing: the bash executable is not found at /bin/bash when the cron job starts.
From your cron job, use bash explicitly when calling the script:
bash myscript.sh
instead of:
./myscript.sh
Also, do not make any assumptions about the working directory of the cron job; change the directory in your bash script before doing anything else:
#!/bin/bash
cd /my/desired/path && \
rm -rf header-wrapper.html && \
wget http://master.gnetwork.eu/header-wrapper.php -O header-wrapper.html -q
Try typing bash before the filename:
bash [File Name Here]
instead of:
./[File Name Here]
Sorry if someone else already answered this.

How to run sudo terminal commands with php5?

In a controlled environment, I need to execute some shell calls, some of which require sudo privileges.
I tried this PHP line:
$out = shell_exec('sudo -u root -S ls < /home/user/.y/.qqz');
where .qqz is a file containing the actual password.
However, the Apache log shows this output:
[sudo] password for www-data:
as if the password file is not being passed to the command's stdin.
I have already made www-data part of the sudo group. How can I get this done?
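For what it's worth, here is a hedged sketch of the mechanism the question is aiming for, feeding the password to sudo -S over stdin from PHP itself rather than through shell redirection (the paths are the ones from the question; whether sudo accepts it still depends on the sudoers setup):
<?php
// Hedged sketch: run `sudo -u root -S ls` and write the password from the
// file into sudo's stdin (-S makes sudo read the password from stdin).
$descriptors = [
    0 => ['pipe', 'r'],  // child stdin
    1 => ['pipe', 'w'],  // child stdout
    2 => ['pipe', 'w'],  // child stderr
];
$proc = proc_open('sudo -u root -S ls', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], trim((string) file_get_contents('/home/user/.y/.qqz')) . "\n");
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]);
    error_log(stream_get_contents($pipes[2]));
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
}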
