How to set an environment variable for an apache2 -DFOREGROUND [duplicate] - php

Here is my Dockerfile:
FROM ros:kinetic-ros-core-xenial
CMD ["bash"]
If I run docker build -t ros . && docker run -it ros, and then from within the container echo $PATH, I'll get:
/opt/ros/kinetic/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
If I exec into the container (docker exec -it festive_austin bash) and run echo $PATH, I'll get:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Why are the environment variables different? How can I get a new bash process on the container with the same initial environment?

The ENTRYPOINT command is only invoked on docker run, not on docker exec.
I assume that this /ros_entrypoint.sh script is responsible for adding stuff to PATH. If so, then you could do something like this for docker exec:
docker exec -it <CONTAINER_ID> /ros_entrypoint.sh bash
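For context, ROS images typically ship an entrypoint script of roughly this shape (a sketch only; the actual /ros_entrypoint.sh in the image may differ):
#!/bin/bash
set -e
# source the ROS environment; this is what prepends /opt/ros/kinetic/bin to PATH
source /opt/ros/kinetic/setup.bash
# then run whatever command was passed to the container
exec "$@"
Because it ends with exec "$@", passing bash to the script gives you a shell with the ROS environment already sourced.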

docker exec only sees the environment variables defined in the Dockerfile with the ENV instruction. With docker exec [...] bash you additionally get whatever bash loads from its own startup files.
Add this line to your Dockerfile:
ENV PATH=/opt/ros/kinetic/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
or shorter:
ENV PATH=/opt/ros/kinetic/bin:$PATH
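A minimal sketch of the asker's Dockerfile with that line added (the path itself is taken from the question; adjust it if your ROS distribution differs):
FROM ros:kinetic-ros-core-xenial
# make the ROS bin directory visible to docker exec as well as docker run
ENV PATH=/opt/ros/kinetic/bin:$PATH
CMD ["bash"]
ENV is resolved at build time, so $PATH expands to the base image's PATH here; anything the entrypoint script adds later is still not visible to docker exec.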

This is an old question, but since it's where Google directed me, I thought I'd share the solution I ended up using.
In your entrypoint script add a section similar to this:
cat >> ~/.bashrc << EOF
export PATH="$PATH"
export OTHER="$OTHER"
EOF
Once you rebuild your image, you can exec into your container (note that bash is invoked in interactive mode):
docker run -d --rm --name container-name your_image
docker exec -it container-name /bin/bash -i
If you echo $PATH now, it should be the same as what was written into .bashrc.
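For reference, a sketch of how such an entrypoint script might look as a whole, with the .bashrc block in place (the sourced setup file is just an example; use whatever your image's entrypoint already does):
#!/bin/bash
set -e
# environment setup performed by the image, e.g. sourcing ROS
source /opt/ros/kinetic/setup.bash
# persist the resulting environment for any later interactive bash
cat >> ~/.bashrc << EOF
export PATH="$PATH"
export OTHER="$OTHER"
EOF
# hand over to the container's main command
exec "$@"
Since the heredoc delimiter is unquoted, $PATH and $OTHER are expanded when the entrypoint runs, so .bashrc ends up with the concrete values.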

Related

How to run a PHP script on docker?

I have index.php:
<?php
echo "Hello World";
?>
Dockerfile from the website: https://docs.docker.com/samples/library/php/
FROM php:7.2-cli
COPY . /usr/src/myapp
WORKDIR /usr/src/myapp
CMD [ "php", "./index.php" ]
I build image and run container:
docker build -t my-php-app .
docker run -p 7000:80 --rm --name hello-world-test my-php-app
I see only the text "Hello World", but my application doesn't work at http://localhost:7000/. Why?
If you want to run a script "on the fly" with php-cli, you can create the container and remove it immediately after the script finishes.
Just go to the directory with your code and run:
Unix
docker container run --rm -v $(pwd):/app/ php:7.4-cli php /app/script.php
Windows - cmd
docker container run --rm -v %cd%:/app/ php:7.4-cli php /app/script.php
Windows - PowerShell
docker container run --rm -v ${PWD}:/app/ php:7.4-cli php /app/script.php
--rm removes the container after execution
-v $(pwd):/app/ mounts the current directory into /app/ inside the container
php:7.4-cli is the image
and php /app/script.php is the command that will be executed once the container is created
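For a quick test, a minimal script.php could look like this (any PHP file works; script.php is simply the name used in the commands above):
<?php
// prints one line and exits; the container is removed right afterwards because of --rm
echo "Hello from the container" . PHP_EOL;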
You can keep the same base image as you have php:7.2-cli:
FROM php:7.2-cli
COPY . /usr/src/myapp
WORKDIR /usr/src/myapp
CMD [ "php", "./index.php" ]
...build the image:
docker build -t my-php-app .
...run it:
docker run --rm --name hello-world-test my-php-app
You will obtain:
Hello World
Everything you did was correct except the port mapping (-p 7000:80), which is not necessary because you aren't running a web server.
EDIT:
If you want to run it as a web server, use the following Dockerfile:
FROM php:7.2-apache
COPY . /var/www/html/
...build it:
docker build -t my-php-app .
...and run it:
docker run -p 8080:80 -d my-php-app
You will then have your PHP script running on 8080.
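To verify it, something like the following should return the output of index.php (curl is just one way to check; a browser pointed at the same URL works too):
curl http://localhost:8080/index.php
# should print: Hello World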
 1. Create a simple PHP script:
echo '<?php echo "Working";' > my.php
 2. Run docker:
docker run -p 8080:8080 --rm -v $(pwd):$(pwd) php:7.4-cli php -S 0.0.0.0:8080 $(pwd)/my.php
 3. Open in browser:
http://localhost:8080/
Many answers suggest using Apache for this, but that is not required. You need to have your application in the container run continuously on a specific port. You can keep the php:7.2-cli image, but your CMD should be different:
CMD [ "php", "-S 0.0.0.0:80", "./index.php" ]
This runs the built-in PHP web server, and you can then expose it with the docker run command you already had (-p 7000:80).
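Put together, a sketch of the complete Dockerfile and run commands for this approach (same file layout and names as in the question):
FROM php:7.2-cli
COPY . /usr/src/myapp
WORKDIR /usr/src/myapp
# built-in development server; index.php acts as the router script
CMD [ "php", "-S", "0.0.0.0:80", "./index.php" ]
docker build -t my-php-app .
docker run -p 7000:80 --rm --name hello-world-test my-php-app
http://localhost:7000/ will then serve the script. Keep in mind that the built-in server is intended for development, not production.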
Here is a quick and simple example with Docker on Windows 11, assuming you have a similar directory structure as the example below:
C:\Users\YourName\Workspace\MyProject\program.php
And program.php has the following content:
<?php echo "It works!"; ?>
Then, in the Command Prompt, navigate to the project directory:
cd C:\Users\YourName\Workspace\MyProject
Run with CLI
docker run --rm -p 8080:8080 -v %CD%:/cli php:7.4-cli php -S 0.0.0.0:8080 /cli/program.php
View: http://localhost:8080
Run with SERVER
docker run --rm -d -p 8081:80 -v %CD%:/server --mount type=bind,source="%CD%",target=/var/www/html php:apache
View: http://localhost:8081/program.php
Then feel free to modify program.php and refresh the page.
Environment
Docker version 20.10.16, build aa7e414
Windows 11 Home, Version 22H2, OS build 22622.436

Docker run php : Interactive shell

I am running the command
docker run php
and the terminal shows 'Interactive shell', then the container exits immediately. Here is the docker status:
docker ps -a
"docker-php-entrypoi…" Less than a second ago Exited (0) 3 seconds ago
Please try the following:
docker run -it --rm php bash
You need to tell docker run that it's an interactive process and allocate a tty for keyboard input, i.e.
$ docker run -it php
Interactive shell
php >
php needs -a to run in interactive mode, and -it keeps a persistent session attached to your terminal. To get an interactive PHP shell directly, just run:
docker run -it --rm php php -a
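If you only need to evaluate a snippet rather than an interactive shell, php -r works too (a small illustrative example, not part of the original answer):
docker run --rm php php -r 'echo "PHP " . PHP_VERSION . PHP_EOL;'
No -it is needed there because the command prints its output and exits on its own.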

How to run Docker from php?

I am a beginner with Docker, and I am trying to run Docker from PHP; specifically, I need to run OpenFace code. I used the commands provided here https://cmusatyalab.github.io/openface/setup/ to make sure that Docker is running correctly on my PC, and it works. But now I need to call it from PHP, so I wrote the same commands in a batch file as follows:
docker run -p 9000:9000 -p 8000:8000 -t -i bamos/openface /bin/bash
cd /root/openface
./demos/compare.py images/examples/{lennon*,clapton*}
pause
and tried to execute it in php by calling
echo shell_exec ("test.bat");
but when I run the batch file directly, only the first line is executed; it seems the subsequent commands are not executed inside the docker container.
How can I make all the commands execute?
Any help will be very much appreciated, thank you.
The problem is that the first line starts an interactive bash inside the container, which won't exit until you exit it, so the remaining commands in the batch file are interpreted by the host shell instead.
What you want is for the work to be done inside the container first, and only then to be dropped into a bash shell inside it:
docker run -p 9000:9000 -p 8000:8000 -t -i bamos/openface /bin/bash -c "cd /root/openface && ./demos/compare.py images/examples/{lennon*,clapton*} && exec bash"
First "bash -c" is to execute the commands and last command exec bash override the main bash and gives you a shell

Docker "Operation not permitted" issue on Windows

I'm trying to use Docker on Windows through Docker Toolbox, but I'm struggling to make it work. I've pulled the Docker PHP image. For example, this simple ls command fails:
$ docker run -it --rm -v /$(pwd):/home/projects php:7.0-cli ls -l /home/projects
ls: cannot open directory /home/projects: Operation not permitted
Also, any other operation within the mounted volume fails with Operation not permitted message.
Looks like a path issue with the volume mapping. Docker Toolbox uses Git Bash for the terminal, which uses /c as the root of the C: drive:
$ echo $(pwd)
/c/Users/elton
So your /$(pwd) is prepending an extra forward slash. I'd try with a fully-qualified path first, just to verify:
$ docker run -it --rm -v /c/projects:/home/projects php:7.0-cli ls -l /home/projects
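A quick way to see the difference is to echo both forms of the flag before running anything (the output shown assumes the example path above):
echo "-v $(pwd):/home/projects"     # -v /c/Users/elton:/home/projects
echo "-v /$(pwd):/home/projects"    # -v //c/Users/elton:/home/projects
The doubled slash in the second form is the extra character the volume mapping trips over.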

Running package manager inside the docker

I've built an image for PHP development, and it became clear to me that I didn't really think about how to access the tools that I need for everyday development. For example Composer, the package manager for PHP: I need to run it whenever composer.json changes. I thought it was worth installing those tools inside the same image, but then I don't have a way to access them. So, I can:
Create separate image for composer and run it in different container
Install composer on my host machine.
I'd like to avoid option 2), but then, does it make sense to have a setup like 1)? How did you solve this issue?
Unless you have some quite specific requirements, there is a third option:
Connect to the container using docker exec command:
docker exec -it CONTAINER-NAME/ID COMMAND [ARG...]
Here is the example:
1: Create your application:
echo "<?php phpinfo();" > index.php
2: Start container:
docker run -it --rm --name my-apache-php-app -p 80:80 -v "$PWD":/var/www/html php:5.6-apache
3: Open another terminal window and exec the required commands inside the running container:
docker exec -it my-apache-php-app bash -c "curl -sS https://getcomposer.org/installer | php"
docker exec -it my-apache-php-app ls
(The pipe needs to run inside the container, hence the bash -c wrapper; written without it, the php on the right-hand side of the pipe would be run by the host shell.)
If you need shell inside running container - run:
docker exec -it my-apache-php-app bash
That's it!
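If you later decide you would rather bake Composer into the image itself (a variant of option 1 from the question, but inside the same image), one common pattern is to copy the binary from the official composer image; a sketch, assuming the composer image keeps its binary at /usr/bin/composer:
FROM php:5.6-apache
# copy the Composer binary instead of running the installer at container runtime
COPY --from=composer:1 /usr/bin/composer /usr/bin/composer
After rebuilding, docker exec -it my-apache-php-app composer install runs entirely inside the container, without installing anything on the host.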
