I am trying to set up a set of Docker containers to serve a couple of applications.
One of my goals is to isolate the PHP applications from each other.
I am new to Docker and do not fully understand its concepts.
So the only idea I came up with is to create a dedicated php-fpm container per application.
I started with the official image php:7.0-fpm, but now I think I may need to create my own general-purpose php-fpm image (based on the one mentioned above), add some programs to it (such as ImageMagick), and instantiate a couple of such php-fpm+stuff containers, one per PHP application, each with a volume pointing strictly to that application's source code.
Am I thinking in the right direction?
now I think I may need to create my own general-purpose php-fpm image (based on the one mentioned above), add some programs to it
That is the idea: you can make a Dockerfile starting with FROM php:7.0-fpm, with your common programs installed in it.
Then you can make multiple other Dockerfiles (each in its own folder), starting with FROM <yourFirstImage>, and declaring the specifics of each PHP application.
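For example, a minimal sketch (the image names are mine, and ImageMagick stands in for whatever common tooling you need):

# Shared base image used by every application
mkdir -p base && cat > base/Dockerfile <<'EOF'
FROM php:7.0-fpm
RUN apt-get update \
 && apt-get install -y --no-install-recommends imagemagick \
 && rm -rf /var/lib/apt/lists/*
EOF
docker build -t my-php-fpm-base base/

# An app-specific image starts FROM the shared base and adds only its own needs
mkdir -p app1 && cat > app1/Dockerfile <<'EOF'
FROM my-php-fpm-base
RUN docker-php-ext-install pdo_mysql
EOF
docker build -t app1-fpm app1/

# One container per application, each mounting only its own source tree
docker run -d --name app1 -v /srv/app1:/var/www/html app1-fpm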
I have a few dozen PHP apps that I want to dockerize. I am wondering what the best design would be in terms of management and performance.
one big container with all services included (php-fpm, mysql, nginx, etc.)
separate containers for all services:
container-php-fpm-app1
container-nginx-app1
container-mysql-app1
container-php-fpm-app2
container-nginx-app2
container-mysql-app2
one container per service, with each service hosting all apps:
container-php-fpm - for all php-fpm pools
container-nginx - for all nginx virtual hosts
container-mysql - for all databases
I understand running separate containers lets you make changes to one service without impacting another. You can run different PHP configurations, extensions, and versions without worrying about the other services being affected. All of my apps are WordPress-based, so configuration will (or should) be consistent across the board.
For now I am leaning toward separation, however I am not sure if this is the best approach.
What do you guys think?
You should run one service per container; that's how Docker is designed. So option 1 is out the door.
If you look at option 3, you have tight coupling between your apps. If you want to migrate app1 to a new PHP version, or it needs a different dependency, you're in trouble; so that's not a good one either.
The standard is to do option 2: a container per service.
Per the Docker documentation on multi-service containers:
It is generally recommended that you separate areas of concern by using one service per container. That service may fork into multiple processes (for example, Apache web server starts multiple worker processes). It's ok to have multiple processes, but to get the most benefit out of Docker, avoid one container being responsible for multiple aspects of your overall application. You can connect multiple containers using user-defined networks and shared volumes.
Also based on their best practices:
Each container should have only one concern
Decoupling applications into multiple containers makes it much easier to scale horizontally and reuse containers.
I would suggest using option 2 (separate containers for all services).
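For illustration, option 2 for a single app might be wired up roughly like this (all names and the published port are made up; the nginx config's fastcgi_pass would point at app1-fpm:9000 over the shared network):

docker network create app1-net
docker run -d --name app1-mysql --network app1-net \
    -e MYSQL_ROOT_PASSWORD=secret -v app1-db:/var/lib/mysql mysql:5.7
docker run -d --name app1-fpm --network app1-net \
    -v /srv/app1:/var/www/html php:7.0-fpm
docker run -d --name app1-nginx --network app1-net -p 8080:80 \
    -v /srv/app1:/var/www/html:ro nginx
# Repeat with an app2-net network and app2-* containers for the next app.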
The most common pattern that I have seen is a separate container per application. That being said, there is also value in having related containers near one another but still distinct, hence the concept of Pods in Kubernetes.
I would recommend one container per application.
I am trying to wrap my head around an optimal structure for dockerizing a web app. One of the best-practice recommendations for Docker is one process per container. So where do I put the source code of my app?
Assume I am making a simple nginx and PHP app. The one-process-per-container rule suggests having an nginx container that serves static assets and proxies PHP requests to a php-fpm container.
Now where do I put the source code? Do I keep it in a separate container and use volumes_from in Docker Compose to let the two containers access the code? Or do I build each container with the source code inside (I suppose that makes versioning easier)?
What are the best practices around this?
Do I keep it in a separate container and use volumes_from in Docker Compose to let the two containers access the code?
That is the usual best practice, which avoids duplicating/synchronizing code between components.
See "Creating and mounting a data volume container".
This is not just for pure data, but also for other shared resources like libraries, as shown in the article "How to Create a Persistent Ruby Gems Container with Docker".
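As a rough sketch of the pattern on the command line (container names are made up; Compose's volumes_from is the same idea in YAML, and on newer Docker named volumes largely replace it):

# A container whose only job is to own the /var/www/html volume;
# the code can be baked into its image or copied in with `docker cp`.
docker create -v /var/www/html --name app-code php:7.0-fpm /bin/true
# Both runtime containers mount the same volume from it
docker run -d --name fpm --volumes-from app-code php:7.0-fpm
docker run -d --name web --volumes-from app-code -p 80:80 nginx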
I have seen Jenkins used as CI for Docker containers. Is Dokku also a CI platform like Jenkins?
If so, what is the difference when I want to do CI with Docker containers for a PHP application?
Are you maybe confusing Drone with Dokku? Dokku is a platform for executing Heroku-style apps; Drone is a Docker-based CI. I don't know much about Drone, but since Docker can't be run inside a Docker container without some hacking, you are better off sticking to a traditional CI like Jenkins, Bamboo, TeamCity, or such.
Continuing from Usman Ismail's answer...
If you look at dokku-alt, the distinction is less clear. In particular, dokku-alt allows you to use a Dockerfile for the build rather than buildstep, so it's not specific to Heroku-like apps.
Dokku-alt is not in itself a CI system, but out of the box it does verify that the build completes without error before deploying, and using git hooks you could hook your test suite in to run on every git push and block deployment when it fails.
CI typically is a bit more than this. You'd normally have multiple deployments for test, staging and live, and to some extent it also encompasses a set of practices. Dokku-alt gives you some very useful parts of CI, and a fairly clear path to building more of it fairly easily, but it's not a complete CI system in itself.
You might well prefer to keep your main git repository elsewhere and keep Jenkins in the picture for automating transfer to dokku-alt.
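To make the git-hook idea concrete, a hypothetical pre-receive hook (run-tests.sh is a stand-in for your real test runner) could look like:

#!/bin/bash
# Reject the push, and therefore the deployment, if the test suite fails.
while read oldrev newrev refname; do
    tmp=$(mktemp -d)
    git archive "$newrev" | tar -x -C "$tmp"   # export the pushed tree
    if ! (cd "$tmp" && ./run-tests.sh); then
        rm -rf "$tmp"
        echo "Tests failed; rejecting push." >&2
        exit 1                                  # non-zero blocks the push
    fi
    rm -rf "$tmp"
done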
I am trying to work out what the point of the environments folder is.
Originally I had the idea that you could point the webserver at the different dev and prod folders in the environments folder, but after reading up a bit I realise this is not the case.
In Yii 1 you would solve this by just having multiple index.php files, i.e.:
index.php
index-local.php
So the question is what benefit does this new environment structure actually give me over the old way?
I've found environments very useful in allowing me to keep a common code base for multiple client projects (based on Yii App Advanced) and to set up a different environment for each specific client, keeping their custom code private and separate.
To do this I store the environments folder in a separate git repo from the rest of the code and pull down the relevant folder on a client/project basis.
This lets me use a base common code for all projects and add/override any file for a specific client or project whilst still allowing separate dev/prod config settings. If the client uses other developers too, they are also catered for. In this way, only common code I choose will be shared amongst clients and custom code will be kept private.
I've also moved the composer.json file into the environments folder so I can pull in different extensions per client/project keeping those private too.
That init command can be a very powerful tool and you don't have to limit yourself to the template provided by the core developers.
If you don't need environments, then don't use them, but I assure you some people will find it very useful.
The Yii2 documentation is a work in progress, but you should read this:
https://github.com/yiisoft/yii2/blob/master/docs/guide/apps-advanced.md#configuration-and-environments
You need to use the yii init command to switch between these environments.
EDIT:
This new environment feature is more than just using a different config file. You can use a different folder structure, a different entry script, etc.
Personally I won't use this feature, since I don't need it (I will use a different entry script, as with Yii 1), but I think it is not useless.
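For reference, switching looks roughly like this from the application root of the advanced template (the flags are from memory, so treat this as a sketch):

php init                                      # interactive: pick Development or Production
php init --env=Production --overwrite=All     # non-interactive variant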
I think you didn't get the real purpose of environments introduced in Yii2.
I'll try to explain the main purpose of adding environments to Yii from the developers' point of view, using an example, and I hope you will really appreciate its usefulness.
Let's suppose for a moment that you are a team of developers (e.g. 5-7 people) working on a mid-to-large project implemented in Yii. To work on that project effectively, your team decides to use some VCS (e.g. Git) and keep all the files of the project in a repository in the cloud for the whole team. That's the de facto standard when working on mid-to-large projects in teams, and nobody will dispute that it's the only comfortable and easy way.
Ok, now let's suppose you use Yii 1.x, or Yii2 with the approach of different entry scripts, to differentiate between local (development) and production environments for connecting to the db or setting other environment-specific configs. Everything is ok and working. But suppose your team members implement something new on the project, you check out the repository to work on the updated version, and you suddenly find that your local config file (in this case the entry script with the config) has been overwritten by another team member's file when they pushed their changes to the repository (because each of you uses your local machine's db with a different database name, OS or config, or simply because your team uses one local development server db but you are on vacation and can't use anything except your local machine).
So, generally, Yii2 environments add more flexibility: you can have different environments, each with its own specific configuration, while also sharing general (common) configs when working in teams on mid-to-large projects, which is why the example in the guide is given on the advanced app project.
Surely you can overcome everything stated above with other solutions, or with .gitignore (which Yii2's environments use by default to overcome the problem stated). But:
Why bother if everything is already done?
and
It was just one little example of usefulness of Yii2 environments. More depends on the project and your imagination.
Overall, Yii2 is a great product. Not only does it add many new features to an already great framework, it is also more robust and flexible than Yii 1.x (despite the fact that Yii 1.x was already very robust).
As for Laravel or any other PHP framework, it really depends... Everyone will find his/her own favorite.
For those who are tired of copying files around, I created a useful script that you can run in the background to keep the files in sync in your dev environment:
File: sync-env-files.sh
#!/bin/bash
# Copy changes made in the deployed web root back into the environments
# folder, so the two stay in sync during development.
ENVIRONMENT_DIR="/var/www/example.com/environments/dev/"
DIR="/var/www/example.com/"

while true; do
    # (assumes filenames contain no spaces)
    for envFile in $(find "$ENVIRONMENT_DIR" -type f); do
        # Map the environments file to its deployed counterpart
        file=${envFile/$ENVIRONMENT_DIR/$DIR}
        # If the deployed copy is newer, copy it back over the environments copy
        if [ "$(stat -c "%Y" "$file")" -gt "$(stat -c "%Y" "$envFile")" ]; then
            #echo "copying modified file $file to $envFile"
            /bin/cp -f "$file" "$envFile"
        fi
    done
    sleep 2
done
Then run the script in the background, or add it to cron with flock:
nohup server/sync-env-files.sh >/dev/null 2>&1 &
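Or, as a crontab entry where flock ensures only one instance stays running (paths are illustrative):

# m h dom mon dow   command
* * * * * /usr/bin/flock -n /tmp/sync-env-files.lock /var/www/example.com/server/sync-env-files.sh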
I would like to mention, in addition to @AngelCoding, since this question still gets views, that I use the environments folder a lot now and definitely see the point of it.
The very first thing I do in any open source project is create one repository for the code base on GitHub and then another, private, one on Bitbucket for the configuration, in other words the environments folder.
Having this folder has made it a lot easier for me to separate my configuration into a private repository.
So the environments folder has a lot of uses and really helps to separate configuration for easier usage even if it does not seem like it initially.
I have what I believe is a tricky situation but this usually indicates I'm ignorant of something quite simple.
I currently have one PHP project that is in an SVN repo; call it 'firstproject'. The working copy of this is an Apache virtual host dir, so I can happily run this project at any point of development through a browser.
I now have another project, 'newproject', in which I want to use some of the core code of 'firstproject'; when newproject requires me to refactor parts of the firstproject classes, I would like that refactoring to be integrated back into firstproject at some point.
Is there any way to set SVN up so I can have a working copy of newproject happily in its own Apache virtual host, comprising some code from firstproject plus its own code, with SVN keeping tabs on which is which? Or is it a case of creating a 'newproject' branch of firstproject, editing away adding the newproject code, and then doing some sort of merge back into 'firstproject' when it seems appropriate?
Many thanks to anyone who can help me think about this; it feels like there should be a neat way, but maybe there isn't.
It seems to me that firstproject should release some deliverable (a library). newproject can then consume this.
If that library needs to change, then the changes should be made in firstproject, and released.
That's how I'd normally manage this sort of dependency in a non-PHP world. I'm not sure that in PHP it should/would be any different. It's a bit of a headache, but you're building a reusable library that a downstream project can consume (and choose which version it consumes, note).
If there is a large part of the code that is common, then you can consider using a common repository instead of two separate repositories for these two projects. You can then create a branch for the code that might change within newproject. The workspace on the newproject Apache server will have to be from this branch, so that any changes can be committed back to the branch and merged into the trunk when you are ready. This way you can incorporate the changes back into the firstproject mainline/trunk. A sketch of that workflow is below.
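(The repository URL here is made up; --reintegrate and the ^/ shorthand need a reasonably recent SVN client.)

# Branch firstproject's trunk for the newproject work
svn copy https://svn.example.com/repo/trunk \
         https://svn.example.com/repo/branches/newproject \
         -m "Create newproject branch"
# Check the branch out as the newproject Apache vhost working copy
svn checkout https://svn.example.com/repo/branches/newproject /var/www/newproject
# Later, from a trunk working copy, fold the refactoring back in
svn merge --reintegrate ^/branches/newproject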
You can also explore svn externals for another approach to mix and match, if you need a workspace with checkouts from multiple repositories. I have not implemented this myself, but you may find the details here:
http://svnbook.red-bean.com/en/1.1/ch07s04.html
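In short, externals boil down to a property on a directory of newproject's working copy (the URL and local path are illustrative):

# Pull firstproject's core into newproject as an external
svn propset svn:externals 'firstproject-core https://svn.example.com/firstproject/trunk/core' .
svn commit -m "Add firstproject core as an external"
svn update   # fetches the external into firstproject-core/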