I want to make a project in PHP (Symfony) and MongoDB.
I created the file docker-compose.yml:
web_server:
build: .
ports:
- 5000:5000
links:
- mongo
mongo:
image: mongo:3.0
container_name: mongo
command: mongod --smallfiles
expose:
- 27017
When I run Docker Compose from PhpStorm, I receive:
Removing old containers...
(Re)building services...
mongo uses an image, skipping
Building web_server
Cannot locate specified Dockerfile: Dockerfile
Starting...
Building web_server
Cannot locate specified Dockerfile: Dockerfile
No containers created for service: web_server
No containers created for service: mongo
Failed to deploy 'Compose: docker-compose.yml': Some services/containers not started
I don't know what I should do, what the Dockerfile should contain, or how the containers get created.
Thanks!
Done!
I used the Dockerfile from https://github.com/lepiaf/docker-symfony2 (with all its files) together with the docker-compose.yml above.
Thanks!
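For reference, the error above simply means Compose could not find a file named Dockerfile next to docker-compose.yml for the web_server build. As an illustration only (a minimal sketch, not the file from that repository; the PHP version and the web/ document root are assumptions), a Dockerfile serving a Symfony project on port 5000 with PHP's built-in web server could look like this:

FROM php:5.6-cli
WORKDIR /app
# copy the project into the image
COPY . /app
EXPOSE 5000
# serve the Symfony front controller from web/ with PHP's built-in server (development only)
CMD ["php", "-S", "0.0.0.0:5000", "-t", "web"]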
I am trying to dockerize a Symfony project.
docker-compose.yml at the root of the project:
version: "3"
services:
php:
build:
context: ./docker/php
nginx:
build:
context: ./docker/nginx
ports:
- 80:80
At the end of /docker/php/Dockerfile:
WORKDIR /var/www/symfony
COPY . /var/www/symfony
I expect the whole project to be copied into /var/www/symfony.
But when I run:
docker exec -it my-container /bin/bash
all I see is an empty folder.
Why doesn't Docker copy the whole directory structure into the symfony folder?
I will assume that Symfony is installed at the root of your project's directory. If that is the case, then your build context is wrong.
Docker only makes the files inside your build context available during the build; you can think of the context as the root directory of the build.
If your context: ./docker/php is there in order to use a different Dockerfile, then you should specify the Dockerfile's path and keep . as your build context.
version: "3"
services:
php:
build:
context: .
dockerfile: ./docker/php/Dockerfile
nginx:
build:
context: .
dockerfile: ./docker/nginx/Dockerfile
ports:
- "80:80"
Let me know if that works.
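With the context set to the project root, paths in COPY instructions are resolved against the project root instead of docker/php. A hedged sketch of what the end of docker/php/Dockerfile then does (the php.ini line is a hypothetical example, not a file from the question):

WORKDIR /var/www/symfony
# the build context is now the project root, so this copies the whole project
COPY . /var/www/symfony
# files that live next to the Dockerfile must now be referenced from the project root, e.g.:
# COPY docker/php/php.ini /usr/local/etc/php/php.ini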
I am dockerizing a Laravel (Lumen) app locally on a Mac.
docker-compose.yml:
version: "3.9"
services:
# LibreOffice Service
libreoffice:
image: lscr.io/linuxserver/libreoffice:latest
container_name: libreoffice
environment:
- PUID=1000
- PGID=1000
- TZ=Europe/London
volumes:
- ./:/home
ports:
- "3000:3000"
restart: unless-stopped
#PHP Service
app:
build:
context: .
dockerfile: Dockerfile
image: digitalocean.com/php
container_name: app
restart: unless-stopped
tty: true
environment:
SERVICE_NAME: app
SERVICE_TAGS: dev
working_dir: /var/www
volumes:
- ./:/var/www
networks:
- app-network
#Nginx Service
webserver:
image: nginx:alpine
container_name: webserver
restart: unless-stopped
tty: true
ports:
- "8080:80"
- "443:443"
volumes:
- ./:/var/www
- ./nginx/conf.d/:/etc/nginx/conf.d/
networks:
- app-network
#Docker Networks
networks:
app-network:
driver: bridge
As you can see in the yml file, I am running my app in the nginx container and everything works fine.
But when I try to run this command:
docker exec libreoffice soffice --headless --invisible --convert-to pdf --outdir "home/public/tmp" "home/public/tmp/hi.docx"
in my application, it throws the following error:
sh: 1: docker: not found
After wasting days on this, I think it is looking for docker inside the nginx container, not on my local computer. That means none of the other services defined in docker-compose.yml can be reached from my application, because my application is the nginx container. But why? What should I do then? How should I set up the environment so my application can access the other services?
MY BIG QUESTION
Why is the whole app even running inside a container? When I run the app with nginx, it loses the connection to my local environment and looks for all the other containers inside the nginx container. For example, if I need to convert some files, that conversion needs the libreoffice service running in the background, and when I try to call it with the soffice --headless --invisible --convert-to pdf --outdir command, it throws an error like:
sh: 1: soffice: not found
because it is looking for soffice inside the nginx container, not in my local Docker setup at all. If that is the case, how can I even run my application with nginx? Do I need to run all the other containers inside the nginx container? How is that even possible?
After doing some more research, I found that you cannot reach into one container from another container and run a command there.
Some solutions to this problem:
Use the service's REST API to connect to it.
If your service doesn't have a REST API you can call, remove the libreoffice service as a separate container and install LibreOffice into the php container with a Linux command in the Dockerfile (see the sketch after this list):
RUN apt-get update && apt-get install -y libreoffice
A command can also be run remotely using ssh; check this topic (I don't recommend it).
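A minimal sketch of that Dockerfile approach, assuming the app image is based on a Debian/Ubuntu PHP image (the base image tag here is illustrative, not taken from the question):

FROM php:8.1-fpm
# install LibreOffice into the same container as the PHP app,
# so the app can call soffice directly instead of reaching into another container
RUN apt-get update \
    && apt-get install -y --no-install-recommends libreoffice \
    && rm -rf /var/lib/apt/lists/*

The PHP code can then shell out to soffice inside its own container, e.g. soffice --headless --convert-to pdf --outdir /var/www/public/tmp /var/www/public/tmp/hi.docx (the paths assume the ./:/var/www mount from the compose file above).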
I have a docker-compose.yml with a php container already defined inside it:
version: "2"
services:
php:
image: wodby/drupal-php:5.6-3.3.1
environment:
PHP_SENDMAIL_PATH: /usr/sbin/sendmail -t -i -S mailhog:1025
PHP_FPM_CLEAR_ENV: "no"
DB_HOST: mariadb
DB_USER: drupal
DB_PASSWORD: drupal
DB_NAME: drupal
DB_DRIVER: mysql
volumes:
- docker-sync:/var/www/html
I need to install xdebug onto the docker container so that I can then use it with PHPStorm.
A lot of tutorials say a Dockerfile is needed to create a PHP image with Xdebug on it. I have not used Docker for more than hosting my project locally, so I am confused about how the Dockerfile comes into play and whether you can use a Dockerfile together with a docker-compose file.
Does anyone know the steps I need to take in order to add Xdebug to this existing container?
I am using the following stack: https://github.com/wodby/docker4drupal
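For context on how a Dockerfile combines with docker-compose: you can swap the image: line for a build: section that points at a Dockerfile extending the existing image. The following is only a hedged sketch; whether pecl and docker-php-ext-enable are available inside wodby/drupal-php is an assumption, so check that image's own documentation first:

# Dockerfile next to docker-compose.yml
FROM wodby/drupal-php:5.6-3.3.1
# Xdebug 2.5.5 is the last release that supports PHP 5.6
RUN pecl install xdebug-2.5.5 \
    && docker-php-ext-enable xdebug

and in docker-compose.yml:

php:
  build:
    context: .
    dockerfile: Dockerfile
  # environment, volumes, etc. from above stay unchanged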
I'm new to Docker. I have a WordPress stack for local dev that implements wp-cli via a different container. The WP container has PHP 7.2.4, but the wp-cli container appears to have PHP 5.6.27.
What's the best approach to updating php for wp-cli?
remove wp-cli container, install wp-cli, save as a new container
use a different container for wp-cli
update php inside the existing container
?
Snippets from my docker-compose file:
wordpress:
container_name: wordpress
depends_on:
- db
image: jburger/wordpress-xdebug
volumes:
- "./public:/var/www/html"
wpcli:
command: "--info"
container_name: wpcli
entrypoint: wp
image: tatemz/wp-cli
links:
- db:mysql
volumes:
You're pulling in an image which hasn't been freshly built/pushed in a year.
The Dockerfile of these images is exactly what you need. If you clone the original repo into a folder, point the build parameter in your docker-compose file at that folder, and then run docker-compose build, you'll have a fresh image.
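A hedged sketch of that change (the ./wp-cli path is an assumption about where the cloned repo lives relative to docker-compose.yml):

wpcli:
  build: ./wp-cli
  container_name: wpcli
  entrypoint: wp
  command: "--info"
  links:
    - db:mysql

After that, docker-compose build wpcli produces the fresh image.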
The ideal setup is to actually have a 'workspace' container, which contains all of the tools needed to interact with your project, for a reference of what that looks like, see laradock (it can be a bit overwhelming).
I want to play around with Docker, so I created my own two containers, nginx and php. Both containers build successfully and are published on Docker Hub. After that I created a fig.yml in my project's folder. If I run fig up -d in my terminal, I get the following error:
Recreating playground_php_1...
Cannot start container e087111c...: [8] System error: no such file or directory
Any ideas how I can fix this problem?
Here is my fig.yml:
web:
image: mc388/docker-nginx:latest
ports:
- "80:80"
- "443:443"
links:
- php
volumes:
- ./logs/nginx:/var/log/nginx
volumes_from:
- php
php:
image: mc388/docker-php:latest
volumes:
- .:/var/www/
And here are the links to the config of both docker containers:
https://github.com/mc388/docker-nginx
https://github.com/mc388/docker-php
The underlying php:fpm image has the following line in the Dockerfile:
WORKDIR /var/www/html
You then delete this directory, which breaks the default CMD command as it uses WORKDIR as its base.
I don't know much about PHP and this Docker image, but it's worth reading the documentation and looking at any examples you can find on how to use it; deleting a directory feels like you're working against the image.
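As a hedged sketch of one way to work with the image instead of against it, mount the project into the /var/www/html WORKDIR the base image already expects, rather than removing it (the path assumes the default php:fpm layout):

php:
  image: mc388/docker-php:latest
  volumes:
    # mount the project into the directory the base image already uses as WORKDIR
    - .:/var/www/html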