Is there a way to run a PHP script automatically when the Docker containers are started with docker compose up -d?
E.g. there are two containers: one of them is the PHP container and the other is a MariaDB container. When the containers are started, the PHP script should collect data from an API and save it into the MariaDB container.
I tried to use
CMD ["php", "script.php"]
in the Dockerfile and tried to modify the ENTRYPOINT of the Dockerfile:
COPY start-container /usr/local/bin/start-container
RUN chmod +x /usr/local/bin/start-container
ENTRYPOINT ["start-container"]
I also tried it with wait-for-it, following this guide: I put wait-for-it.sh in the same folder as the Dockerfile and added the following lines to the Dockerfile:
COPY wait-for-it.sh wait-for-it.sh
RUN chmod +x wait-for-it.sh
CMD ["./wait-for-it.sh", "mariadb:3306" , "--strict" , "--timeout=60", "start_data_collection.sh"]
I also wrote a bash script that starts the PHP script, because I was not sure whether the PHP script could be executed from a Dockerfile. But nothing seems to work and I don't know what to do next.
You can simply wait for MySQL in your bash script:
# Keep pinging until the MySQL server answers
while ! mysqladmin ping > /dev/null 2>&1
do
    sleep 3
done
and then call your php script.
You can use the --host/--port options on mysqladmin to point to your other container. Be aware that you might need to handle authentication to the MySQL server.
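For example, a complete wait-and-run script could look like this (a minimal sketch; the service name mariadb, the DB_USER/DB_PASSWORD environment variables, and the script path are assumptions to adapt to your setup):
#!/bin/bash
# Sketch: block until MariaDB answers a ping, then hand off to the PHP script.
until mysqladmin ping --host=mariadb --port=3306 \
      --user="$DB_USER" --password="$DB_PASSWORD" --silent
do
    echo "Waiting for MariaDB..."
    sleep 3
done
exec php /var/www/html/script.php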
If you would rather keep the simple TCP check with wait-for-it, you should add a -- separator, following the examples from the GitHub repository you linked:
CMD ["./wait-for-it.sh", "mariadb:3306" , "--strict" , "--timeout=60", "--", "start_data_collection.sh"]
Since start_data_collection.sh should run after wait-for-it.sh finishes, it has to appear after the -- separator rather than being passed as an argument to wait-for-it.sh itself.
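Putting it together, the relevant Dockerfile lines might look like this (a sketch reusing the names from the question):
COPY wait-for-it.sh start_data_collection.sh ./
RUN chmod +x wait-for-it.sh start_data_collection.sh
CMD ["./wait-for-it.sh", "mariadb:3306", "--strict", "--timeout=60", "--", "./start_data_collection.sh"]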
I have a PHP server that I need to launch in a Docker image alongside a Python service. Both of them need to be in the same image. At first, I wrote the Dockerfile to start the PHP server, following a simple guide I found online, and I came up with this:
FROM php:7-apache
COPY ./www/ /var/www/html
WORKDIR /var/www/html
EXPOSE 70
Then, because I need a third service running on a second container, I created the following docker-compose file:
version: '3.3'
services:
  web:
    build: .
    image: my-web
    ports:
      - "70:80"
  secondary-service:
    image: my-service
    ports:
      - "8888:8888"
Using only that, the website works just fine (except for the missing service on the web container). However, if I want to start a service inside the web container alongside the web server, I need to start the website manually from a bash script, since Docker can only have one CMD entry. This is what I tried:
FROM php:7-apache
COPY ./www/ /var/www/html
RUN mkdir "/other_service"
COPY ./other_service /other_service
RUN apt-get update && bash /other_service/install_dependencies.sh
WORKDIR /var/www/html
EXPOSE 70
CMD ["bash", "/var/www/html/launch.sh"]
And this is launch.sh:
#!/bin/bash
(cd /other_service && python3 /other_service/start.py &) # CWD needs to be /other_service/
php -S 0.0.0.0:70 -t /var/www/html
And that also starts the server without problems, along with other_service.
However, when I go to my browser (in the host) and browse to http://localhost:70, I get the error "Connection reset". The same happens when I try to do a request using curl localhost:70, which results in curl: (56) Recv failure: Connection reset by peer.
I can see in the web container's log that the PHP test server is running:
PHP 7.4.30 Development Server (http://0.0.0.0:70) started
And if I open a shell inside the container and I run the curl command inside of it, it gets the webpage without any problems.
I have been searching through similar questions, but most of them had no answer, and the ones that did have one didn't work for me.
What is going on? Shouldn't manually starting the server from a bash script work just fine?
Edit: I've just tried starting only the PHP server, like below, and it still doesn't let me connect to the webpage:
#!/bin/bash
#(cd /other_service && python3 /other_service/start.py &) # CWD needs to be /other_service/
php -S 0.0.0.0:70 -t /var/www/html
I found the issue. It was as easy as starting the Apache server too: the compose file maps host port 70 to container port 80, so requests from the host arrive at port 80 inside the container, where only Apache listens; the PHP dev server on container port 70 is never published.
#!/bin/bash
(cd /other_service && python3 /other_service/start.py &) # CWD needs to be /other_service/
/etc/init.d/apache2 start
php -S 0.0.0.0:70 -t /var/www/html
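Given that, an alternative sketch (untested) that avoids running two servers would be to point the dev server at the published port instead of starting Apache:
#!/bin/bash
# Alternative sketch: listen on container port 80, which compose publishes as host port 70
(cd /other_service && python3 /other_service/start.py &) # CWD needs to be /other_service/
exec php -S 0.0.0.0:80 -t /var/www/html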
We have a Docker server running 'Docker version 17.03.0-ce, build 60ccb22'. We have a number of workers, around 10; each of them performs a really simple task that takes a few seconds to complete and exit. We decided that each of them would start a Docker container, and when the script finishes, the container gets stopped and removed. Crontabs deal with running the workers on schedule. So we created a bash script for every worker that instantiates the container with the --rm and -d flags and also starts the script file in the bin/ folder:
#!/bin/sh
f=`basename $0`
workerName=${f%.*}  # name of the bash script, without the part after the .
# We mount the worker's folder and a log file from the Docker host; the log
# file is used for monitoring from outside the container.
docker run --rm -d --name $workerName -v `cat /mnt/volume-fra1-05/apps/pd-executioner/master/active_version`:/var/www/html -v /mnt/volume-fra1-06/apps/$workerName.log:/var/www/html/logs/$workerName.log iqucom/php-daemon-container php bin/$workerName
echo `date` $0 >> /var/log/crontab.log
All the workers are very similar in structure and code, and really simple; there are no big code differences. However, we have experienced the following behaviour: some containers (random ones every time) refuse to stop and be removed even after many hours. Inside the container, the php bin/$workerName process is still running with PID 1. There is nothing like an infinite loop in the code that could stop the script from finishing. It happens randomly and we still cannot find a pattern. Do you have any idea why this might be happening?
This could be an issue with your PHP script getting stuck somehow. But since you are sure it is supposed to finish within, let's assume, 240 seconds, you can change your container command to:
docker run --rm -d --name $workerName -v `cat /mnt/volume-fra1-05/apps/pd-executioner/master/active_version`:/var/www/html -v /mnt/volume-fra1-06/apps/$workerName.log:/var/www/html/logs/$workerName.log iqucom/php-daemon-container timeout 240 php bin/$workerName
This will make sure that any stuck container exits after the timeout if it doesn't exit on its own.
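If the PHP process ignores the SIGTERM that timeout sends, you can escalate to SIGKILL after a grace period (this assumes GNU coreutils timeout is present in the image):
# Send SIGKILL 10 seconds after the initial SIGTERM if the process is still alive
timeout -k 10 240 php bin/$workerName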
Well, I need to run a Docker container using a PHP function. I have a web page where, by clicking a link, I execute a shell command using shell_exec or exec. This works for me if the command is something like ls or anything else that returns a result immediately. The problem is that if the command runs the Docker container (or, for example, a ping), it doesn't work.
What I want is that when the user clicks the link, the shell executes a command to run the Docker container, and the page is then redirected there.
For example, if I use shell_exec('firefox');, this should open a new Firefox browser, but it doesn't work. It seems that the browser is opened but closed again a few seconds later.
This is the Docker execution that doesn't work:
public function executeDocker() {
    $result = shell_exec('docker run --rm -p 3838:3838 shiny/gsva_interactive /usr/bin/shiny-server.sh');
    echo "<br><br>Execution: " . $result;
}
shell_exec only returns the output of a command (in this case, a Docker command) once the command has exited completely. In the case of ping (it will just keep pinging), and probably in the case of your Docker image, the process will never exit by itself, so it will never give a response.
Using passthru instead of shell_exec should give you the commandline output of your Docker script right back as a response.
If the Docker container is not meant to exit, you should probably start it in detached mode with $containerId = shell_exec('docker run -d --rm -p 3838:3838 shiny/gsva_interactive /usr/bin/shiny-server.sh'), so the docker run command will exit. This returns the container id, which you can use with $result = shell_exec("docker ps -f \"id=$containerId\"") to check whether the container is running correctly, and redirect the user if it is.
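A minimal sketch of that approach (the image and port come from the question; the redirect target and the running check are assumptions):
<?php
// Start the container detached; trim the trailing newline from the returned id.
$containerId = trim(shell_exec('docker run -d --rm -p 3838:3838 shiny/gsva_interactive /usr/bin/shiny-server.sh'));
// `docker ps -f id=...` prints a header line plus one row per running match.
$status = trim(shell_exec('docker ps -f ' . escapeshellarg("id=$containerId")));
if (substr_count($status, "\n") >= 1) {
    // Header plus at least one row: the container is up, so redirect the user.
    header('Location: http://localhost:3838/'); // assumed target
    exit;
}
echo 'Container failed to start.';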
I was having the same issue running docker exec via shell_exec:
shell_exec('docker exec -it containerid /usr/bin/cmd');
Getting rid of the -i option worked for me.
Finally I solved it. The problem was with the user group and permissions. On the system I was using, CentOS, the Apache server runs as a user called apache. This user needs to be added to the docker group, and then the services need to be restarted.
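For reference, on CentOS that would look something like this (a sketch; service names may differ between setups):
# Add the apache user to the docker group, then restart the services
sudo usermod -aG docker apache
sudo systemctl restart httpd docker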
Now it works. Thanks to everyone who helped me.
I'm trying to display my Pi's temperatures on a website that I can access anywhere at any time.
So far I've been able to get the CPU and GPU temps working. However, my HDD temp won't show in the browser. It works fine in the terminal.
[screenshot of the temperature page]
As you'll notice, I didn't have the GPU temp showing at first either; however, this was fixed by using the following command:
sudo usermod -G video www-data
I haven't been successful in getting this to work for smartmontools, though.
Does anyone know how to make it work?
Also, is it safe to have these on an external website? Can hackers inject PHP code to run shell commands using it?
In order to run a root-privileged command from the website, you need to add www-data to your /etc/sudoers to allow www-data to run that command as root. Here is the line you need in /etc/sudoers:
www-data ALL=(root) NOPASSWD: /usr/sbin/smartctl
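With that rule in place, the PHP side could be a simple wrapper like this (a sketch; the smartctl invocation mirrors the one in the answer below, and the device path is an assumption):
<?php
// Read SMART attribute 194 (temperature) via the sudo rule above.
$temp = trim((string) shell_exec("sudo /usr/sbin/smartctl -A -d sat /dev/sda | awk '/^194/ {print \$10}'"));
echo "HDD temperature: {$temp} °C";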
When executing under your web server, your script will probably have a different PATH configured, so it will run differently from how it runs in the Terminal.
Try putting the full path to smartctl in your script, e.g.
sudo /usr/local/bin/smartctl -A -d sat /dev/sda | awk '/^194/ {print $10}'
I'm trying to run a command from my index.php:
$output = shell_exec('docker images');
and then output the results, or run a new container the same way:
$output = shell_exec('docker run hello-world');
It seems that I cannot run ANY docker command via PHP.
How do I do it properly?
I did the following to get this working:
Created a PHP file called index.php in /var/www/html/ with this content:
<?php
echo '<pre>';
$content = system('sudo docker images', $ret);
echo '</pre>';
?>
Edited the sudoers file with visudo, adding the following line at the end:
www-data ALL=NOPASSWD: /usr/bin/docker
Checked http://localhost/index.php and it worked!
You can even build and run containers with this, hope it works for you.
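For instance, running a container through the same rule (a quick sketch):
<?php
echo '<pre>';
// --rm removes the container again once hello-world exits
system('sudo docker run --rm hello-world', $ret);
echo '</pre>';
?>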
You can do this:
vi rd.php
Put this content in the rd.php file:
<?php
$output = shell_exec('RET=`docker run hello-world`;echo $RET');
echo $output;
Now you can run
php rd.php
You can view the result:
Hello from Docker.
This message shows that your installation appears to be working correctly.

To generate this message, Docker took the following steps:
 1. The Docker client contacted the Docker daemon.
 2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
    (Assuming it was not already locally available.)
 3. The Docker daemon created a new container from that image which runs the
    executable that produces the output you are currently reading.
 4. The Docker daemon streamed that output to the Docker client, which sent it
    to your terminal.

To try something more ambitious, you can run an Ubuntu container with:
 $ docker run -it ubuntu bash

For more examples and ideas, visit:
 http://docs.docker.com/userguide/
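Note that the backtick indirection is not strictly necessary; shell_exec already returns the command's standard output, so a shorter rd.php should behave the same:
<?php
// shell_exec returns the command's stdout directly
echo shell_exec('docker run hello-world');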
That's all! I hope this helps you.