Please bear with me. Pretty new to Docker.
I'm deploying Docker containers (detached) to an AWS EC2 instance using CodeDeploy. On deploy, the following command is run after setting some environment variables etc.:
exec docker run -d ${PORTS} -v cache-${CACHE_VOLUME} --env-file $(dirname $0)/docker.env --tty "${IMAGE}:${TAG}"
The container runs an image located and tagged in EC2 Container Service. No problems so far.
Since this is a PHP application (specifically a Symfony2 application) I would normally need to issue the following command to execute database migrations on deployment:
php app/console doctrine:migrations:migrate --no-interaction
Now, is there any way to run this command during "docker run..." while keeping the container running, or do I need to run another container specifically for this command?
Many thanks!
You need to create an entrypoint. This script runs at container startup.
entrypoint.sh file:
#!/bin/sh
# create or update the db: wait until the database accepts connections, then migrate
./waitforit.sh <DB_HOST>:<DB_PORT> -t 30
php app/console doctrine:migrations:migrate --no-interaction
# start apache in the foreground; exec makes it PID 1 so it receives signals
exec apache2-foreground
waitforit.sh is a wait-for-it script that waits until the database has started up.
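To wire this up, the image's Dockerfile declares the script as the entrypoint. A minimal sketch, assuming both scripts sit in the build context and WORKDIR already points at the application directory (so the relative ./waitforit.sh path resolves):
# Dockerfile (excerpt)
COPY waitforit.sh entrypoint.sh ./
RUN chmod +x waitforit.sh entrypoint.sh
ENTRYPOINT ["./entrypoint.sh"]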
Just leaving this here for the next one that searches for this... ;-)
When using a recent version of Doctrine, there is a pretty handy parameter for this:
php bin/console doctrine:migrations:migrate --no-interaction --allow-no-migration
The "allow-no-migration" parameter instructs doctrine not to throw an exception, when there is nothing to do...
I do as follows:
docker-compose exec [service_name] ./app/console doctrine:migrations:migrate --no-interaction
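Here [service_name] is the service key from docker-compose.yml, not a container ID — a minimal sketch, with app as a hypothetical service name:
# docker-compose.yml (excerpt)
services:
  app:
    build: .
docker-compose exec app ./app/console doctrine:migrations:migrate --no-interaction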
I am quite new to Laravel even though I have already created several mini projects.
Today I tried to create a new Laravel project with Sail in the way that the documentation indicates
curl -s https://laravel.build/blade-components | bash
The application is created correctly: I bring the container up with ./vendor/bin/sail up, compile the styles with sail npm run dev, run the migrations, and everything looks correct in the browser (localhost).
The problem comes when trying to install JetStream with the composer command
sail composer require laravel/jetstream
The error is:
OCI runtime exec failed: exec failed: unable to start container process: exec: "composer": executable file not found in $PATH: unknown
For some reason, it's like composer doesn't get installed to the project.
When I run sail, or sail npm (for example), it recognizes the commands and shows me the different actions available. I also tried to run the composer command from a shell attached to the Sail container and it isn't recognized there either:
Executing task: docker exec -it 7b2cd6402559708130d9fdf7b8f8e8cbcd9ed47d524a77dd10cf2ee0068b5150 bash
root@7b2cd6402559:/var/www/html# composer
bash: composer: command not found
Then I opened previous Laravel projects to test if the sail composer command worked and the same thing happens (it didn't before), so it's not a project-specific thing.
I would greatly appreciate your help! P.S.: sorry for my English, greetings from Argentina!
I encountered the same problem.
My operating system is Windows 11 and I created the Laravel project in WSL.
I found that if I run service docker start in WSL, sail up -d will create or start the containers in WSL, and sail composer will show the same error.
But if I do not run service docker start in WSL, sail up -d will create or start the containers in Windows 11 and everything is fine.
p.s. Sorry for my English, too.
Either I'm missing something, or the whole chain lacks something.
Here's my assumption:
The whole point of containerization in development is to reduce the cost of environment setup and to provide a prepared image with all the required pieces.
So, when I read that Laravel Sail installs Laravel via containerization, I got excited. I installed it per their instructions, and everything worked.
Then the problem begins. Because:
After a successful installation, I create a git repo, with GitHub's default laravel .gitignore
Then I push the newly installed laravel app into my git repo.
Then I ask a developer to start developing it. Please note that:
He does not have PHP installed
He does not have Composer installed
He clones the repo and, as per the installation guide, runs ./vendor/bin/sail up
But the ./vendor folder is correctly excluded in .gitignore
Thus his command results in:
bash: ./vendor/bin/sail: No such file or directory
He Googles it, of course, and finds out that people suggest running composer update
He goes to install Composer, then before that PHP, then all the PHP extensions, then ...
Am I missing something here? The whole point of containerization was to not have to install the required environment locally.
What is the proper way of running a laravel app, that is not installed from https://laravel.build, but is cloned from a git repo, WITHOUT having PHP or Composer installed locally?
Update
I found the Bitnami Laravel Docker image and it's exactly what containers should be.
You are right: the other developer doesn't need to have PHP or Composer installed.
All he/she needs is Docker installed on the local machine.
If you scaffolded the project with what is mentioned in the official Laravel docs under the Getting started section, then you will have a docker-compose.yml file in your project root directory.
For Windows
For Linux
For Mac OS
All the developer has to do after git cloning the repository is to run
docker-compose up --build -d
That's it.
For those struggling with this issue... I've found a command that works perfectly fine.
First of all, you don't need to have any PHP or Composer installed locally; maybe there is a misunderstanding about this, since all you need is Docker.
Docker will install everything you need in something I understand to be like a sandbox, not locally, for each project.
And for those projects downloaded from Git, for example, that don't have a vendor folder and obviously cannot execute sail up, you can simply execute:
docker run --rm --interactive --tty -v $(pwd):/app composer install
That command will download a Composer image for Docker, if you do not have one yet. Then it will run composer install, and you are free to execute ./vendor/bin/sail up (or just sail up if you have already configured an alias).
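For reference, the Sail alias mentioned above is typically added to your shell profile roughly like this (the form suggested in the Laravel docs):
alias sail='[ -f sail ] && sh sail || sh vendor/bin/sail'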
That's all.
The official documentation lists the following command.
docker run --rm \
-u "$(id -u):$(id -g)" \
-v $(pwd):/var/www/html \
-w /var/www/html \
laravelsail/php81-composer:latest \
composer install --ignore-platform-reqs
If you were to clone a Laravel project and run this command in the project root, it would create a very small container with PHP and Composer installed and run Composer in the project root to install all PHP dependencies. In effect, this installs the Laravel core code into the cloned project. Once the project is set up this way, the user should create a local .env file to match their development environment.
cp .env.example .env # creates a .env file to be populated for the local environment
With the environment set up, they can now create the application containers in Docker and run the application. Laravel provides the Sail helper for this.
./vendor/bin/sail up -d # runs the docker containers in detached mode
Now it's a matter of setting up and running the Laravel app. (I'm assuming the app uses one of the Laravel starter kits that rely on Node.js. If you are using a Blade-only application, you can skip the "npm" commands.)
sail artisan key:generate # (Best Practice) Generate a new application key on each machine
sail artisan migrate # Scaffold the database structure
sail artisan db:seed # (Optional) Seed the database with data
sail npm install # (Optional) Install front-end dependencies (Inertia, Vue, React, others...)
sail npm run dev # (Optional) Run the front-end framework in development mode
With this, the new developer should be running an exact copy of both the project and the development environment used by the original developer.
Your project README may include additional steps to set up some other dependencies, but this is the basic workflow for contributing to a Laravel project.
The only prerequisite for this workflow is to have Docker installed and an Internet connection. This is most easily accomplished on Windows, Mac, and Linux by installing Docker Desktop.
Alternate for Older Projects
If you are working on an older project that doesn't use Laravel Sail, but does have a docker-compose.yml file, you should be able to build and run the necessary containers with the following command.
docker-compose up --build -d
Once you have the containers running, you would need to install the project dependencies directly into the container.
docker ps # find the container ID of your project's container
docker exec -it CONTAINER_ID php artisan key:generate
docker exec -it CONTAINER_ID php artisan migrate
docker exec -it CONTAINER_ID php artisan db:seed
docker exec -it CONTAINER_ID npm install
docker exec -it CONTAINER_ID npm run dev
Of course, Docker Desktop simplifies this process. With a button click you can have a terminal shell open directly in your container, eliminating the need for the docker exec command.
I'm looking for a way to execute Symfony's console after a Docker container has started, for example to run a database migration. I'm using an Alpine php-fpm image, which has the following command at the end of its Dockerfile:
CMD ["php-fpm"]
When I try to override this in my docker-compose file, either php-fpm won't start or the Symfony console command won't run. Ideally the execution should wait a few seconds, to ensure that the database has started.
What would be a good solution for this scenario?
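A minimal sketch of the wrapper-entrypoint approach from the earlier answers, adapted for php-fpm; the db hostname, port 3306, and the bin/console path are assumptions:
#!/bin/sh
# crude wait loop; a script like wait-for-it.sh is a more robust alternative
until nc -z db 3306; do sleep 1; done
# run the console command once the database is reachable
php bin/console doctrine:migrations:migrate --no-interaction
# hand off to php-fpm so it becomes PID 1 and receives signals
exec php-fpm
Pointing the service at this script in docker-compose (entrypoint: /entrypoint.sh) avoids having to override CMD directly.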
My current setup involves the PhpStorm IDE, into which I have imported a Symfony 3 project that is basically a CLI tool. On the host machine I don't have PHP installed, so I'm running the application from a Docker container which has PHP and Xdebug installed.
I don't have issues to debug web applications from Docker containers but with Symfony and this CLI tool it seems a little bit more tricky.
My question is how to properly set this up and debug it from PhpStorm? I tried to create a new debug configuration (PHP Remote Debug) but breakpoints are not triggered.
Supposing you have followed the instructions mentioned in the following links:
Can't connect PhpStorm with xdebug with Docker
How to setup Docker + PhpStorm + xdebug on Ubuntu 16.04
Or similar questions
Then you need to follow these steps:
Step 1:
Get shell access to your container via running:
docker exec -ti ^container_id^ /bin/sh
Or, if running a Debian/Ubuntu-based one (or you installed bash manually):
docker exec -ti ^container_id^ /bin/bash
The ^container_id^ can be found via the docker ps command; it is the first column of the table. If running in a small window, just pipe it into less -S, resulting in the command:
docker ps | less -S
Step 2:
Export the following environment variables:
export PHP_IDE_CONFIG="serverName=0.0.0.0:5092"
export XDEBUG_CONFIG="idekey=PHPSTORM"
Please keep in mind to set the correct serverName value, matching the one specified in the Servers section of PhpStorm's settings.
It is important in order not to run into the problem specified in this question.
Then just enable debugger listening in PhpStorm and run the CLI command as you normally would when running a Symfony application.
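Alternatively, the variables can be passed for a single run without exporting them in an interactive shell first. A sketch, where symfony-app is a placeholder for the server name configured in PhpStorm and app:some-command is a hypothetical console command:
docker exec -ti \
  -e PHP_IDE_CONFIG="serverName=symfony-app" \
  -e XDEBUG_CONFIG="idekey=PHPSTORM" \
  ^container_id^ php bin/console app:some-command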
I have yet to find an elegant and efficient way to run Laravel Artisan commands in my Docker-based local dev environment.
Could anybody suggest the recommended or "proper" way to do things like migrations?
Or, has anybody found a neat way of doing this? Ideally with examples or suggestions.
Things that I've considered:
A new container (sharing the same volume and db link) with ssh, just for running commands (seems nasty).
Hacks in supervisor that could then end up running on live (not ideal).
Editing db configs, or trying to hack in a "host" environment, so that at least things like migrate can be run from the host.
Creating web front ends to run things (really nasty).
Trying to build a "signal" for these things.
I'm still getting my head around Docker and its new-container-for-everything approach.
I suppose I want to balance cool-dev-ops stuff with why-do-I-need-another-fake-server-just-get-it-working-already.
I'd love to commit to it for my dev workflow, but it seems to become awkward to use under certain circumstances, like this one...
Any suggestions and ideas are welcome. Thanks all.
Docker 1.3 brings the new exec command.
So now you can "enter" a running container like this:
docker exec -it my-container-name /bin/bash
After that you can run any command you want
php artisan --version
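You can also skip the interactive shell and run a one-off command directly (using the same container name as above):
docker exec my-container-name php artisan migrate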
The best practice regarding Docker is to run each process inside its own container. Therefore, the ideal way to run Artisan commands is to have an image for creating containers specifically for this purpose.
I've created an image which can be pulled from the Docker Hub dylanlindgren/docker-laravel-artisan and it works really well. It's on GitHub as well if you want to take a look at the Dockerfile behind it.
I've also just written a blog post describing the way all these separate containers fit together.
There are a couple of possibilities...
Mounting a host directory in your container as the folder in which your Laravel app lives. That way you can just run php artisan migrate or composer update from the host. You might have problems with deployment, though, since you would have to replicate that part of your environment on the server.
Adding an SSH server to your container (which is not recommended; here's a good discussion of that).
Building and using nsenter, a tool for "entering" a running container and getting shell access. Note: I haven't used it; I just found it a while ago via a reference in the link above.
If you're primarily interested in deployment and you're doing it via a Dockerfile, then the answer would be to add composer install and php artisan migrate to your Dockerfile so they run when the image is built.
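A minimal sketch of that build-time approach; the base image is a placeholder and must already ship PHP and Composer:
# Dockerfile (excerpt)
COPY . /var/www/html
WORKDIR /var/www/html
RUN composer install --no-dev
# caveat: the database must be reachable at build time for this to succeed
RUN php artisan migrate --force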
I'm interested in hearing more answers to this. It's something that I'm just getting into as well and would like to learn more about.
I use SSH and run migrations from a terminal inside the container.
I personally like Phusion's approach of using Docker as a 'lightweight virtual machine'. So I used their baseimage-docker which I've extended to create my own Docker image for Laravel applications.
I'm aware Phusion's image can be contentious in the Docker community, and that SSH is frowned upon by some who advocate Docker containers as microservices. But I'm happy with Phusion's approach until there are more established tools and practices for the multi-container approach.
I'm in the process of figuring out creating Docker images for Laravel projects, this is what I have so far.
FROM base_image_with_LAMP_stack_and_dependencies
WORKDIR /var/www/html/app
# copy composer.json first so the dependency layer is cached independently of app code
COPY composer.json composer.json
RUN composer install --no-scripts --no-dev --no-autoloader
COPY . .
# write a startup script that finishes setup and runs migrations at container start
RUN echo 'chown -R www-data:www-data /var/www/ \
&& composer dump-autoload \
&& cp .env.example .env \
&& php artisan key:generate \
&& php artisan migrate \
&& apachectl -D FOREGROUND' >> /root/container_init.sh && \
chmod 755 /root/container_init.sh
EXPOSE 80
CMD /root/container_init.sh
This way, there is no dependency on the database at build time, and the migration process runs every time a new container is started.