Docker PHP-Redis and healthcheck

I have two services, Redis and PHP. How do I make PHP wait for Redis to start, and report when it has? I tried using healthchecks, but apparently I am not setting them up correctly.
docker-compose.yml
version: '2'
services:
  php:
    build:
      context: docker/web
      dockerfile: Dockerfile-php-7.0
    container_name: php
    ports:
      - "8280:80"
    links:
      - redis:redis
  redis:
    build: docker/cache
    container_name: redis
Dockerfile-php-7.0
FROM php:7.0
RUN pecl install redis \
    && docker-php-ext-enable redis
COPY . /usr/src/myapp
WORKDIR /usr/src/myapp
CMD ["php", "./index.php"]
EXPOSE 80
index.php
<?php
echo 'Starting';
$redis = new Redis();
$redis->connect(getenv('host'), getenv('6379'));
var_dump($redis->incr('foo'));
?>
Dockerfile
FROM redis:3.2-alpine
COPY conf/redis.conf /usr/local/etc/redis/redis.conf
CMD [ "redis-server", "/usr/local/etc/redis/redis.conf" ]
EXPOSE 6379
Don't be afraid to scold me; I am just starting to learn Docker.
I would be very grateful for any help!

Here is the docker-compose.yml file that you can use:
version: '2.1'
services:
  php:
    build:
      context: docker/web
      dockerfile: Dockerfile-php-7.0
    container_name: php
    ports:
      - "8280:80"
    depends_on:
      redis:
        condition: service_healthy
  redis:
    build: docker/cache
    container_name: redis
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
I also changed the PHP script (index.php) as follows:
<?php
echo 'Starting';
$redis = new Redis();
$redis->connect('redis', 6379);
var_dump($redis->incr('foo'));
Now the whole stack works on my machine; the PHP container can connect to the Redis container.
The php container log shows the following:
docker logs -f php
Startingint(1)
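The healthcheck handles startup ordering at the compose level; a common complement is to retry the connection from the application side as well, since Redis can also go away and come back after startup. A minimal shell sketch of such a retry wrapper (a hypothetical helper, not part of the images above):

```shell
# retry.sh (sketch): run a command until it succeeds or the attempt
# budget runs out, sleeping one second between tries.
retry() {
    attempts="$1"; shift
    n=1
    until "$@"; do
        if [ "$n" -ge "$attempts" ]; then
            echo "giving up after $n attempts" >&2
            return 1
        fi
        n=$((n + 1))
        sleep 1
    done
}

# 'false' never succeeds, so this exhausts its 3 attempts and
# falls through to the fallback message
retry 3 false || echo "command never came up"

# Entrypoint-style usage (assumes redis-cli is installed in the image):
# retry 30 redis-cli -h redis ping && php ./index.php
```

Used as an entrypoint wrapper, this lets the php service tolerate a Redis container that is up but not yet answering.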

Related

How can I run two PHP scripts using Docker?

I have an application that works with RabbitMQ. There are two PHP scripts (one sends messages, one receives them), but I can run only one script from the Dockerfile:
CMD ["php", "./send.php"]
But I have to run two scripts. My tutor asked me to make two containers, one for each script:
version: "3"
services:
  rabbit_mq:
    image: rabbitmq:3-management-alpine
    container_name: 'rabbitmq'
    ports:
      - 5672:5672
      - 15672:15672
    volumes:
      - ./docker/rabbitmq/data/:/var/lib/rabbitmq/
      - ./docker/rabbitmq/log/:/var/log/rabbitmq
      - ./docker/rabbitmq/conf/:/var/conf/rabbitmq
    environment:
      - API_URL=Api:8000
  send:
    build:
      context: './docker'
      dockerfile: Dockerfile
    image: php:7.4-cli
    container_name: send
    ports:
      - 8000:8000
    volumes:
      - ./:/app
    depends_on:
      - rabbit_mq
  receive:
    image: php:7.4-cli
    # build:
    #   context: './docker'
    #   dockerfile: Dockerfile
    container_name: receive
    ports:
      - 8001:8001
    volumes:
      - ./:/app
    depends_on:
      - rabbit_mq
What can I do to run the two scripts using the docker-compose up command? I searched a lot of web pages but couldn't find anything. I really need your help!
You did not specify whether those scripts terminate or run forever, but to run them you can write the docker-compose file like this:
version: "3"
services:
  rabbit_mq:
    # existing configuration
  send:
    # existing configuration
    command: ["php", "./send.php"]
  receive:
    # existing configuration
    command: ["php", "./receive.php"]
In other words, to run them as part of docker-compose, add these lines to the corresponding service blocks in the compose file:
command: php send.php
command: php receive.php
https://docs.docker.com/compose/compose-file/#command
If you need more complicated behaviour, such as restarting on failure, take a look at using supervisor.
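For instance, supervisord could own both workers inside a single container; a minimal sketch of such a config (hypothetical program names and paths, matching the two scripts above):

```ini
; supervisord.conf sketch -- adjust paths to your image layout
[program:send]
command=php /app/send.php
autorestart=true        ; restart the worker whenever it exits

[program:receive]
command=php /app/receive.php
autorestart=true
```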

Dockerfile build: Laravel Telescope tries to connect to a Redis service that is not yet up

My project is defined in a docker-compose file, but I'm not too familiar with docker-compose definitions.
When I try to docker-compose up -d in a fresh setup, the following error occurs during the build of a docker image.
This happens after composer install, under the post-autoload-dump hook, when Laravel tries to auto-discover packages (php artisan package:discover).
Generating optimized autoload files
> Illuminate\Foundation\ComposerScripts::postAutoloadDump
> @php artisan package:discover --ansi
RedisException : php_network_getaddresses: getaddrinfo failed: Name or service not known
at [internal]:0
1|
Exception trace:
1 ErrorException::("Redis::connect(): php_network_getaddresses: getaddrinfo failed: Name or service not known")
/var/www/vendor/laravel/framework/src/Illuminate/Redis/Connectors/PhpRedisConnector.php:126
2 Redis::connect("my_redis", "6379")
/var/www/vendor/laravel/framework/src/Illuminate/Redis/Connectors/PhpRedisConnector.php:126
Please use the argument -v to see more details.
Script @php artisan package:discover --ansi handling the post-autoload-dump event returned with error code 1
ERROR: Service 'my_app' failed to build: The command '/bin/sh -c composer global require hirak/prestissimo && composer install' returned a non-zero code: 1
The reason it cannot connect to my_redis:6379 is that my_redis is another service in the same docker-compose.yml file. So I assume the hostname is not resolvable yet, since docker-compose first builds all my images before starting any containers.
EDIT: I just found this GitHub issue describing my problem: https://github.com/laravel/telescope/issues/620. It seems the problem is related to Telescope trying to use the cache driver. The difference is that I'm not using Docker just for CI/CD but for my local development.
How can I resolve this problem? Is there a way to force the Redis container to be up before my_app is built? Or is there a Laravel way to prevent any hostname lookup? Or is there a way to specify that building an image depends on another service being available?
If you want to see my docker-compose.yml:
version: '3.6'
services:

  # Redis Service
  my_redis:
    image: redis:5.0-alpine
    container_name: my_redis
    restart: unless-stopped
    tty: true
    ports:
      - "6379:6379"
    volumes:
      - ./redis/redis.conf:/usr/local/etc/redis/redis.conf
      - redisdata:/data
    networks:
      - app-network

  # Postgres Service
  my_db:
    image: postgres:12-alpine
    container_name: my_db
    restart: unless-stopped
    tty: true
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: my
      POSTGRES_PASSWORD: admin
      SERVICE_TAGS: dev
      SERVICE_NAME: postgres
    volumes:
      - dbdata:/var/lib/postgresql
      - ./postgres/init:/docker-entrypoint-initdb.d
    networks:
      - app-network

  # PHP Service
  my_app:
    build:
      context: .
      dockerfile: Dockerfile
    image: my/php
    container_name: my_app
    restart: unless-stopped
    tty: true
    environment:
      SERVICE_NAME: my_app
      SERVICE_TAGS: dev
    working_dir: /var/www
    volumes:
      - ./:/var/www
      - /tmp:/tmp # For CS Fixer
      - ./php/local.ini:/usr/local/etc/php/conf.d/local.ini
      - fsdata:/my
    networks:
      - app-network

  # Nginx Service
  my_webserver:
    image: nginx:alpine
    container_name: my_webserver
    restart: unless-stopped
    tty: true
    ports:
      - "8080:80"
    volumes:
      - ./:/var/www
      - ./nginx/conf.d/:/etc/nginx/conf.d/
    networks:
      - app-network

# Docker Networks
networks:
  app-network:
    driver: bridge

# Volumes
volumes:
  dbdata:
    driver: local
  redisdata:
    driver: local
  fsdata:
    driver: local
There is a way to make a service wait for another service in docker-compose, depends_on, but it only waits until the container is up, not the service inside it. To fix that you have to customize the redis image, using command to run a script that checks both the container and the redis daemon availability; check startup-order for how to set it up.
I currently mitigated this by adding --no-scripts to the Dockerfile and adding a start.sh, since it is Laravel's package discovery script, bound to post-autoload-dump, that wants to access Redis.
Dockerfile excerpt
#...
# Change current user to www
USER www
# Install packages
RUN composer global require hirak/prestissimo && composer install --no-scripts
RUN chmod +x /var/www/scripts/start.sh
# Expose port 9000 and start php-fpm server
EXPOSE 9000
CMD ["/var/www/scripts/start.sh"]
start.sh
#!/usr/bin/env sh
composer dumpautoload
php-fpm
I'm sure you've resolved this yourself by now, but for anyone else coming across this question later, there are two solutions I have found:
1. Ensure Redis is up and running before your App
In your my_redis service in docker-compose.yml add this...
healthcheck:
  test: ["CMD", "redis-cli", "ping"]
...then in your my_app service in docker-compose.yml add...
depends_on:
  my_redis:
    condition: service_healthy
2. Use separate docker compose setups for local development and CI/CD pipelines
Even better, in my opinion, is to create a new docker-compose.test.yml. In it you can omit the redis service entirely and just use CACHE_DRIVER=array. You could set this either directly in the environment property of your my_app service or create a .env.testing (make sure to set APP_ENV=testing too).
I like this approach because as your application grows there may be more and more packages which you want to enable, disable, or configure differently in your testing environment, and using .env.testing in conjunction with a docker-compose.test.yml is a great way to manage that.
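For example, the test-only compose file might look like this (a sketch reusing the service names from the question's compose file; the my_redis service is simply omitted):

```yaml
# docker-compose.test.yml (sketch)
version: '3.6'
services:
  my_app:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      APP_ENV: testing
      CACHE_DRIVER: array   # in-memory cache, so no Redis host to resolve
```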

PhpStorm + Docker + Xdebug + DB SSH tunnel

Locally I have following docker-compose configuration:
nginx:
  build:
    context: ./nginx
  ports:
    - "80:80"
  volumes:
    - ./../logs:/home/web/logs/
    - ./../:/home/web/my-website.com/
  depends_on:
    - php
php:
  build:
    context: ./php
  volumes:
    - ./../:/home/web/my-website.com/
  working_dir: /home/web/my-website.com/
  expose:
    - "8123"
The php container has Xdebug installed in it, and I can easily connect to it from PhpStorm.
I have a remote ClickHouse database reachable via an SSH tunnel. When I start my container I just go into it and execute:
ssh -4 login@host.com -p 2211 -L 8123:localhost:8123 -oStrictHostKeyChecking=no -Nf
After this, my site is able to use the connection, but when I execute the console command
./yii analysis/start-charts 003b56fe-db47-11e8-bcc0-52540010e5bc 205
from PhpStorm, I get an exception:
Failed to connect to 127.0.0.1 port 8123: Connection refused
If I jump into the container and run the same command, everything works fine.
What's wrong? Why doesn't PhpStorm see my SSH tunnel?
I got an answer on the Superuser site: https://superuser.com/questions/1374463/phpstorm-docker-xdebug-db-ssh-tunnel/1375961#1375961
Besides that, I've added a ports node to my php container definition; it is now the following:
php:
  build:
    context: ./php
  volumes:
    - ./../:/home/web/my-website.com/
  working_dir: /home/web/my-website.com/
  expose:
    - "8123"
  ports:
    - "8123:8123"
  depends_on:
    - redis
    - mysql
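As a side note, ssh -L accepts an optional bind address, and by default the forwarded port listens only on the loopback interface of whatever host runs the ssh command. A hypothetical variant of the tunnel command above that binds on all interfaces inside the container, so the forwarded port is also reachable through the published 8123 mapping:

```
# Bind the local end of the tunnel to 0.0.0.0 instead of 127.0.0.1
ssh -4 login@host.com -p 2211 -L 0.0.0.0:8123:localhost:8123 \
    -o StrictHostKeyChecking=no -Nf
```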

run batch in wordpress docker container

I am trying to run a batch script in the official WordPress container.
In the container I want to run a very simple script like the one below:
./wordpress_batch.php
<?php
define('BASEPATH', '/path/to/wordpress');
require_once(BASEPATH . '/wp-load.php');
# batch program start
echo "batch test";
var_dump($wpdb);
But no output is shown.
This code works in a WordPress install on the host machine.
What is wrong in the docker container?
Any ideas how to run this code?
Thanks.
./docker-compose.yml:
version: "2"
services:
  wordpress:
    build: containers/wordpress
    ports:
      - "9000:80"
    depends_on:
      - db
    environment:
      WORDPRESS_DB_HOST: "db:3306"
    env_file: .env
    volumes:
      - ./wordpress_batch:/var/batch/
  db:
    build: containers/db
    env_file: .env
    volumes:
      - db-data:/var/lib/mysql
volumes:
  db-data:
    driver: local
./containers/db/Dockerfile:
FROM mysql:latest
./containers/wordpress/Dockerfile:
FROM wordpress:latest
The way I ran the code:
$ docker-compose run wordpress bash
root@97658bd14387:/var/batch# php wordpress_batch.php

How to use php from another docker container

In my app I have separate docker containers for nginx, mysql, php and supervisor. But now I need to define in supervisor a program which runs a php script. Is it possible to call php from another container?
EDIT
Example:
When I run the supervisor program test, I see the error INFO spawnerr: can't find command 'php'. I know that php is not in the supervisor container, but how do I call php from the php container? And I need the same php that the application uses.
./app/test.php
<?php
echo "hello world";
docker-compose.yml
version: "2"
services:
  nginx:
    build: ./docker/nginx
    ports:
      - 8080:80
    volumes:
      - ./app:/var/www/html
    links:
      - php
      - mysql
  php:
    build: ./docker/php
    volumes:
      - ./app:/var/www/html
    ports:
      - 9001:9001
  mysql:
    build: ./docker/mysql
    ports:
      - 3306:3306
    volumes:
      - ./data/mysql:/var/lib/mysql
  supervisor:
    build: ./docker/supervisor
    volumes:
      - ./app:/var/www/html
    ports:
      - 9000:9000
supervisor.conf
[program:test]
command = php /var/www/html/test.php
process_name = %(process_num)02d
numprocs = 1
autostart = false
autorestart = true
Please check this repo on GitHub.
I used Angular, Laravel and Mongo, with 3 containers: mongo, php-fpm, and nginx acting as a proxy for the API and for Angular.
Angular does not use a nodejs container, because I build it with ng build, which outputs the build into the angular-dist folder.
The angular-src folder is the Angular source code.
In the laravel folder run the command composer install; if you use Linux, run sudo chmod 777 -R laravel.
You can then browse the routes http://localhost:8000/api/ and http://localhost:8000/api/v1.0.
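On the spawnerr: can't find command 'php' error in the question itself: containers do not share binaries, so supervisord can only launch php if it exists inside the supervisor image. One common approach (a sketch, not from this thread; the php:7.0-cli base tag is a placeholder for whatever your docker/php image is built from) is to base the supervisor image on the same php image:

```dockerfile
# Hypothetical docker/supervisor/Dockerfile: reuse the app's php image so
# the 'php' binary supervisord launches matches the application's
FROM php:7.0-cli
RUN apt-get update && apt-get install -y supervisor
COPY supervisor.conf /etc/supervisor/conf.d/supervisor.conf
CMD ["supervisord", "-n"]
```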
