Is it possible, within a Docker container, to redirect a plain text log file that PHP writes to so that it ends up on stderr?
I have a PHP application that writes to a file, and we're trying to move it into a Docker container without changing any code. I've tried symlinking, but this results in permission errors.
Not sure what kind of symlinking you tried, but you should be able to do this with something like
RUN ln -sf /dev/stderr /path/to/your/text.log
In your Dockerfile (where the text.log path is the one inside the container).
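For example, a minimal Dockerfile sketch (the base image and log path are placeholders):
FROM php:7.0-fpm
# Create the log directory, then point the log file at the container's stderr
RUN mkdir -p /path/to/your \
 && ln -sf /dev/stderr /path/to/your/text.log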
I ended up finding this solution, which uses a FIFO (named pipe) to pipe the data to stderr.
# Allow the `nobody` user to write to /dev/stderr
mkfifo -m 600 /tmp/logpipe
chown nobody:nobody /tmp/logpipe
# Open the pipe read-write so the writer never blocks, and forward it to stderr
cat <> /tmp/logpipe 1>&2 &
https://github.com/moby/moby/issues/6880#issuecomment-344114520
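For illustration, here is how those lines would sit at the top of an entrypoint script (a sketch; the final exec "$@" is an assumption about how the container's main command is started):
#!/usr/bin/env bash
set -e

# Create the pipe and let the unprivileged user write to it
mkfifo -m 600 /tmp/logpipe
chown nobody:nobody /tmp/logpipe

# Open the pipe read-write (<>) so writers never block or hit EOF,
# and forward everything to the container's stderr
cat <> /tmp/logpipe 1>&2 &

# Hand over to the container's main command
exec "$@"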
My Dockerfile:
FROM php:7.0-fpm
# Install dependencies, etc
RUN mkfifo /tmp/stdout \
 && chmod 777 /tmp/stdout
ADD docker-entrypoint.sh /usr/local/bin/docker-entrypoint.sh
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]
As you can see, I'm creating a named pipe at /tmp/stdout. And here is my docker-entrypoint.sh:
#!/usr/bin/env bash
# Some run-time configuration stuff...
exec "./my-app" "$@" | tail -f /tmp/stdout
My PHP application (an executable named my-app) writes its application logs to /tmp/stdout. I want those logs to then be captured by Docker so that I can do docker logs <container_id> and see the logs that the application wrote to /tmp/stdout. I am attempting to do this by running the my-app command and then tailing /tmp/stdout, which will then output the logs to stdout.
What I'm seeing happen is that when I run my application, it hangs when it writes the first log message. I believe this happens because there is nothing "reading" from the named pipe, and writing to a named pipe blocks until something reads from it. This is confirmed if I do docker exec -it <container_id> bash and then run tail -f /tmp/stdout myself inside the container. Once I do that, the container immediately exits, because the application has written its logs to the named pipe and run to completion.
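The blocking behaviour itself is easy to reproduce in any plain shell (a sketch using a throwaway /tmp/demo pipe):
mkfifo /tmp/demo
echo "hello" > /tmp/demo &   # the writer blocks: no reader has opened the pipe yet
cat /tmp/demo                # opening a reader unblocks the writer
rm /tmp/demo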
For reasons that I won't bloat this post with, it's not possible for my-app itself to write logs to stdout. It has to use /tmp/stdout.
Can anybody tell me why this isn't working, and what I need to change? I expect I have to change the exec call in docker-entrypoint.sh, but I'm not sure how. Thank you!
What I'm seeing happen is that when I run my application, it hangs when it writes the first log message. I believe this happens because there is nothing "reading" from the named pipe, and writing to a named pipe blocks until something reads from it.
This is correct; see fifo(7). But with your example code
exec "./my-app" "$@" | tail -f /tmp/stdout
this should actually work, since the pipeline starts ./my-app and tail simultaneously, so there is something reading from /tmp/stdout.
But one problem here is that tail -f will never terminate by itself, and so neither will your docker-entrypoint.sh/container. You could fix this with:
tail --pid=$$ -f /tmp/stdout &
exec "./my-app" "$@"
tail --pid will terminate as soon as the process with the given pid terminates; here $$ is the pid of the bash process (and, through exec, later the pid of ./my-app).
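Put together, docker-entrypoint.sh would then look something like this (a sketch assembled from the snippets above):
#!/usr/bin/env bash
# Some run-time configuration stuff...

# Read the pipe in the background; tail exits once the watched pid dies
tail --pid=$$ -f /tmp/stdout &

# exec keeps the same pid, so $$ above ends up tracking ./my-app
exec "./my-app" "$@"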
For reasons that I won't bloat this post with, it's not possible for my-app itself to write logs to stdout. It has to use /tmp/stdout.
Does this mean it has to write to a filesystem path or is the path /tmp/stdout hardcoded?
If you can use any path, you can use /dev/stdout, /proc/self/fd/1 or /proc/$$/fd/1 as the logging path to let your application write to stdout.
If /tmp/stdout is hardcoded try symlinking it to /dev/stdout:
ln -s /dev/stdout /tmp/stdout
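If /tmp/stdout has to be created at container run time, the same link can be made in the entrypoint instead; a minimal sketch, assuming the app is started as in the question:
#!/usr/bin/env bash
# /dev/stdout here resolves to this process's stdout, i.e. the container log
ln -sf /dev/stdout /tmp/stdout
exec "./my-app" "$@"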
I need to make a static page from a dynamic one, with all assets downloaded and all links converted to local ones, and save it in some tmp folder. Like when you press Ctrl+S in a browser. I tried using wget with shell_exec:
shell_exec("wget -E -H -k -p http://youmightnotneedjquery.com/ 2>&1");
The problem is that it works perfectly when I run it from the console, but when I use shell_exec, I get an error:
Permission denied
youmightnotneedjquery.com/index.html: No such file or directory
Cannot write to 'youmightnotneedjquery.com/index.html' (No such file or directory).
As I understand it, there is some problem with permissions. I tried to create a separate directory with high permissions, owned by www-data, and to specify it in the command using the -O flag, but I get an error that the -k and -O flags can't be used at the same time. So I hope to solve the permission issue, but I still have to specify the destination folder somehow. Or maybe there's a PHP solution without wget; it seems not conceptually hard, but a lot of work.
You may try something like
shell_exec("cd some_nice_dir && wget ...")
You may also want to read up on man wget, as it has a lot to say about interactions between -O and several of the other options you specified.
What helped was using the -P flag and creating a folder owned by www-data:
shell_exec("wget -E -H -k -p http://mysite.local/ -P some-temp-folder 2>&1")
I want to execute a Bash script present on the system from a PHP script. I have two scripts present on the system. One of them is a PHP script called client.php present at /var/www/html and the other is a Bash script called testscript present at /home/testuser.
My client.php script looks like
<?php
$message=shell_exec("/home/testuser/testscript 2>&1");
print_r($message);
?>
My testscript looks like
#!/bin/bash
echo "Testscript run successful"
When I run the following in a terminal
php client.php
I get the following output
Testscript run successful
But when I open the page at
http://serverdomain/client.php
I get the following output
sh: /home/testuser/testscript: Permission denied
I get this error even after I did chmod +x testscript.
How do I get it to work from the browser? Please help.
I would have a directory called scripts under /var/www (outside the document root) so that it's not reachable from the web but is reachable by PHP.
e.g. /var/www/scripts/testscript
Make sure the user/group for your testscript is the same as for your web files. For instance, if your client.php is owned by apache:apache, change the bash script to the same user/group using chown. You can find out what your client.php and web files are owned by running ls -al.
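For example (assuming the web files are owned by apache:apache):
chown apache:apache /var/www/scripts/testscript
chmod 750 /var/www/scripts/testscript   # owner and group may read/execute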
Then run
<?php
$message=shell_exec("/var/www/scripts/testscript 2>&1");
print_r($message);
?>
EDIT:
If you really want to run a file as root from a webserver you can try this binary wrapper below. Check out this solution for the same thing you want to do.
Execute root commands via PHP
Without really knowing the complexity of the setup, I like the sudo route.
First, you must configure sudo to permit your web server to run the given command as root. Then, the script that the web server shell_execs (testscript) needs to run the command with sudo.
For a Debian box with Apache and sudo:
Configure sudo:
As root, run the following to edit a new/dedicated configuration file for sudo:
visudo -f /etc/sudoers.d/Webserver
(or whatever you want to call your file in /etc/sudoers.d/)
Add the following to the file:
www-data ALL = (root) NOPASSWD: <executable_file_path>
where <executable_file_path> is the command that you need to be able to run as root, with the full path in its name (say /bin/chown for the chown executable). If the executable will be run with the same arguments every time, you can add its arguments right after the executable file's name to further restrict its use.
For example, say we always want to copy the same file in the /root/ directory. We would write the following:
www-data ALL = (root) NOPASSWD: /bin/cp /root/test1 /root/test2
Modify the script (testscript):
Edit your script such that sudo appears before the command that requires root privileges (say sudo /bin/chown ... or sudo /bin/cp /root/test1 /root/test2). Make sure that the arguments specified in the sudo configuration file exactly match the arguments used with the executable in this file.
So, for our example above, we would have the following in the script:
sudo /bin/cp /root/test1 /root/test2
If you are still getting permission denied, the script file's and its parent directories' permissions may not allow the webserver to execute the script itself.
Thus, you need to move the script to a more appropriate directory and/or change the script's and parent directory's permissions to allow execution by www-data (user or group), which is beyond the scope of this tutorial.
Keep in mind:
When configuring sudo, the objective is to permit the command in its most restricted form. For example, instead of permitting general use of the cp command, you only allow the cp command if the arguments are, say, /root/test1 /root/test2. This means that cp's arguments (and thus cp's functionality) cannot be altered.
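To verify the rule before wiring it into PHP, you can simulate the web server user from a root shell (a sketch; sudo's -n flag makes it fail instead of prompting if the rule doesn't match):
sudo -u www-data sudo -n /bin/cp /root/test1 /root/test2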
I was struggling with this exact issue for three days. I had set permissions on the script to 755. I had been calling my script as follows.
<?php
$outcome = shell_exec('/tmp/clearUp.sh');
echo $outcome;
?>
My script was as follows.
#!/bin/bash
find . -maxdepth 1 -name "search*.csv" -mmin +0 -exec rm {} \;
I was getting no output or feedback. The change I made to get the script to run was to add a cd to /tmp inside the script (presumably shell_exec does not run the script from /tmp, so find . was searching the wrong directory):
#!/bin/bash
cd /tmp;
find . -maxdepth 1 -name "search*.csv" -mmin +0 -exec rm {} \;
This was more by luck than judgement but it is now working perfectly. I hope this helps.
It's a simple problem. When you run the PHP file from the terminal, you are running it as your own user, which has permission to execute files in your home directory. When you open the page from your web browser, the PHP script runs as the web server user, which does not have that permission. On Ubuntu, www-data is the Apache web server user. If you're on Ubuntu, you would have to do the following:
chown yourusername:www-data /home/testuser/testscript
chmod g+x /home/testuser/testscript
What the above does is transfer user ownership of the file to you and give the web server group ownership of it. The second command gives the group execute permission on the file. The next time you run it from the browser, it should work.
I've made a simple bash script for server administration and I cannot figure out how I can run it safely inside a PHP page: I'd like to create a PHP administration page, but I obviously don't want to hard-code the root password anywhere. Let's make an example (this is a foo script, of course):
#!/bin/bash
touch /$1
This simple/stupid script will not work if the user who runs it has no write permission on /.
Actually the script adds Apache virtual hosts, FTP users and so on.
Any ideas?
Thanks
Use
sudo /path/to/executable/file
and set up sudo so that it can execute the given command as root for the current user.
http://www.sudo.ws/sudo/sudoers.man.html - here is the manual for sudoers, the configuration file that you have to modify.
zerkms ALL = (ALL) NOPASSWD: /sbin/iptables -L FORWARD -n -v -x
This is an example from my /etc/sudoers. Here the user zerkms is allowed to run the command /sbin/iptables -L FORWARD -n -v -x as root without being asked for a password.
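A quick way to check that such an entry works without a password prompt is to run the command as the user in question with sudo's non-interactive flag:
sudo -n /sbin/iptables -L FORWARD -n -v -x && echo "sudoers rule OK"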