Replicating Heroku environment in Docker (or other container) - php

I would like to have a local development environment that is a clone of my Heroku environment, with same dependencies, etc.
I can develop locally on my OS X machine, but I use Homebrew for package management, and it is a chore to keep all the versions the same as my Heroku setup.
My issue is I don't like deploying to my Heroku development environment every time I make a change. It really slows down my development time, waiting for the slug to compile and deploy.
Is there a way to use Docker to mimic exactly what's in my Heroku stack (PHP, Nginx, MongoDB, ...)? Has anybody done this, or have a link to a tutorial?
Thanks

You might consider dokku. It uses Heroku buildpacks by default and you can add things like MongoDB instances with its plugin system (here's a dokku mongo plugin).
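A minimal sketch of that setup, assuming dokku is already installed on a server and using a hypothetical app name `myapp` (the service commands follow the dokku-mongo plugin's documented interface):

```shell
# On the dokku server: create the app and a Mongo service (names are hypothetical)
dokku apps:create myapp
sudo dokku plugin:install https://github.com/dokku/dokku-mongo.git mongo
dokku mongo:create myapp-db
dokku mongo:link myapp-db myapp

# On your local machine: deploy with a plain git push, just like Heroku
git remote add dokku dokku@your-server.example.com:myapp
git push dokku master
```

The push triggers the same buildpack compile you'd see on Heroku, so the resulting container should closely match your Heroku slug.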

Related

How do I maintain work inside Drupal containers when working in teams?

I am new to Drupal and just looking for some help getting my dev environment going and using best practices.
So I have a working container with Drupal, MariaDB, Drush, etc. After installing Drupal with the installer, I install themes and such; however, it seems that if I drop the container I lose all my work. How could I ever work in a team then? How do I keep that work? Do I use Git inside the container and pull and push from within?
As far as I'm aware, work inside the container is not necessarily reflected in my local working directory.
Any help would be much appreciated.
I don't know about Drupal, but generally in Docker you would mount a folder from the local filesystem of the machine running Docker when you start the container. The data in "/your/local/folder" will be accessible both in the container and on your local filesystem, and it will also survive a restart of the container.
docker run -d \
  -v </your/local/folder>:</folder/in/container>:z \
  <your image>
The trick will be to identify the data in the container you want on your local filesystem.
Look here for different alternative ways to handle persistent data in docker:
https://docs.docker.com/storage/volumes/
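For the Drupal case specifically, a hedged sketch using the official drupal image (the host-side folder names here are hypothetical; the container-side paths are the ones the official image documents for custom code):

```shell
# Keep your custom modules and themes on the host so they survive
# removing and recreating the container
docker run -d --name my-drupal -p 8080:80 \
  -v "$PWD/modules":/var/www/html/modules/custom \
  -v "$PWD/themes":/var/www/html/themes/custom \
  drupal:10
```

With this layout, the custom code lives in your working directory, so you can commit it to Git from the host as usual; the database still needs its own volume or container to persist.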
I can highly recommend Lando for Drupal 8.
See: https://docs.devwithlando.io/tutorials/drupal8.html
It's a free, open-source, cross-platform, local development environment and DevOps tool built on Docker container technology.

What's a good Git toolchain and deployment process for a Windows user working on a LAMP stack?

I run a custom PHP site on a managed VPS (LAMP stack) and am the solo developer, but I want to start using Git so freelancers can contribute. Currently I don't use Git.
Git
For Git hosting, I wanted to use Visual Studio Online (VSO) since I've used it before, but I am open to suggestions if something else fits the suggested deployment process better.
Deployment
I kept a "Dev" folder and a "Live" folder on the web server and simply did all of my dev in the Dev folder, tested there, then ran rsync to push it to the Live folder. I couldn't easily run it locally since it has things like Linux symlinks, and I work on a Windows computer.
The Goal
I want to add Git to this process, integrate it into a decent build process, and still use a Windows IDE for development. Or maybe I should install a Linux VM on my Windows machine so I can run the site directly from the latest version pulled from Git?
I need a setup that would be easy for other developers to join on as I find freelancers to help out.
Suggestions?
Here's what I do.
Git deployment
For deployment, I follow a very simple git-flow technique (http://nvie.com/posts/a-successful-git-branching-model/#the-main-branches) with two branches:
master for prod
dev for new features
You always develop new features on the dev branch, so dev is always ahead of master in terms of functionality.
When you want to deploy, you SSH into your server, git pull the dev branch, and if everything works fine, you merge dev into master.
You can always switch the prod server between the prod and dev versions with git checkout master and git checkout dev. No need for two folders; Git handles this!
If you have a bug in the prod environment, you can switch to the master branch on your local computer and fix the bug, then upload the fix via SSH.
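The two-branch flow can be sketched end-to-end in a throwaway repository (file names and commit messages are illustrative):

```shell
# Set up a scratch repo standing in for the project
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email dev@example.com
git config user.name Dev
echo "v1" > app.txt && git add . && git commit -qm "initial release"
git branch -M master            # prod branch
git checkout -q -b dev          # feature work always happens here
echo "v2" > app.txt && git commit -qam "new feature"
# "deploy": once dev works, merge it into master
git checkout -q master
git merge -q dev
```

On the server, the equivalent of the last two lines is what you run after pulling and verifying the dev branch.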
Multi collaborator development (developers with windows/linux/mac but 1 production server)
I personally use Docker: you build a single image customized to match your prod environment (Linux version, Apache version, PHP version, and MySQL version). You create one Docker image, and then every collaborator can download that image and run it on their own computer.
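A sketch of what that shared image could look like, assuming a PHP/Apache prod stack (the image tag and PHP version are hypothetical; pin them to whatever production actually runs):

```shell
# Build once, share via a registry; every collaborator runs the same stack
cat > Dockerfile <<'EOF'
FROM php:7.4-apache
RUN docker-php-ext-install mysqli pdo_mysql
EOF
docker build -t myteam/app-dev .
# Each developer mounts their working copy into the container's web root
docker run -d -p 8080:80 -v "$PWD":/var/www/html myteam/app-dev
```

Mounting the working copy means Windows, Linux, and Mac collaborators all edit files with their own IDEs while the code runs inside the identical Linux environment.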

Vagrant based development

I just discovered Vagrant and I want to use it in my development. I just wanted to see how some more experienced developers are handling this.
I assume that on my local machine I will have a folder, say ~/server/, where I will keep all my projects (one in each folder), and each will contain a Vagrantfile.
Questions:
Git: do I install Git on my machine and do the pushing/pulling locally, or put it on the VM for each project and run it from there?
DB: the database will obviously go into the VM for each project, but how will I be able to easily modify it? Should I install phpMyAdmin or a similar tool on each VM?
What is the best way to access the VMs in the browser? Do I assign each of them a different IP and then add a record to my /etc/hosts?
I'm just starting out with Vagrant, so there are probably questions that haven't even popped into my head yet; any other suggestions you think are important will be very useful to me.
Thanks in advance for the answers.
Git: in my opinion, you should install Git and set up repositories via Vagrant provisioning on each VM, and after that you can create Git hooks on your local machine that update the code on each VM on local commit.
DB: you don't need to install phpMyAdmin on the VMs. You can easily modify the DB (I assume you want to modify records) via a DB client installed on your local machine (preferred), or you can use your local machine's phpMyAdmin with a remote connection.
Browser access: yes, you can assign each VM its own IP and add records to your /etc/hosts.
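The per-project setup the third question asks about can be sketched as a minimal Vagrantfile (box name, IP, and paths are illustrative; with this IP you would add a line like `192.168.56.10 myproject.test` to /etc/hosts):

```ruby
# Hypothetical Vagrantfile for one project under ~/server/myproject
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/focal64"
  # Fixed private IP so the VM is reachable from the host browser
  config.vm.network "private_network", ip: "192.168.56.10"
  # Share the project folder into the VM so edits on the host appear in the VM
  config.vm.synced_folder ".", "/var/www/myproject"
end
```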

Setting up a Laravel 4 app on a VPS

I am trying to deploy my first Laravel App. So I hope I am providing all the necessary info. I have walked down several paths trying to deploy this app. I tried a shared hosting account, but found too many errors to continue deploying my Laravel app. In the meantime, someone has said to me I need a VPS, so I may go with that.
So with a new VPS, I now am trying to install the following: phpMyAdmin, Node.js, Composer, and Laravel 4. These are the technologies I am using on my local server with MAMP. Now, after being overwhelmed with the information on installing each on a VPS, I have found myself extremely confused. Some places say I need to install Ubuntu. Some say I need to install Apache first. Some talk about using CentOS. I honestly have no idea what I need to install, and in what order. All I really need is to figure out how to set up a PHP environment on my VPS with phpMyAdmin, Node.js, and Composer. After that I am pretty sure it's all straightforward, as far as installing my app.
I also saw someone talking about committing my app to Git and then cloning it to the VPS. If I did this, I would still need to set up the environment, correct? Once again, I hope I have provided the necessary information. If my question is not clear, could you please refer me to a resource that I can study?
You don't need to install Laravel separately from the app it is part of; these days a PHP app contains everything it needs in its vendor folder. How to deploy depends on how you have arranged your dependencies locally, but the simplest way is to copy everything in your local project to your remote server (FTP or rsync). I don't think Laravel demands a VPS, but if you are using Node as well, then yes, you will need one.
So, the short answer is: if it works locally, copy it up to the remote host, and it should work there. Make sure you've set up your config system in your app so that it can cope with the different settings you need in local/remote environments, such as database connection settings.
My feeling is that a shared host would be easier for you as a beginner; is the Node.js component of your app critical? Running your own VPS is not difficult, but there is quite a bit to learn. Your distro (such as Ubuntu) would come pre-installed, and on top of that you would use its package system (something like apt-get) to install Apache, PHP, PHP modules, phpMyAdmin, Git, and whatever else you need.
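On an Ubuntu VPS, that install order could look something like this (package names are Ubuntu's; run as a user with sudo rights):

```shell
sudo apt-get update
sudo apt-get install -y apache2 php libapache2-mod-php php-mysql mysql-server phpmyadmin git
# Composer, via its official installer script
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer
# Node.js from Ubuntu's repositories (fine for getting started)
sudo apt-get install -y nodejs npm
```

The order matters less than it seems: the distro is already there, Apache and PHP come from packages, and Composer/Node sit on top.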
Yes, you can certainly deploy using Git. One way is to create a bare repository on your server in a private place, set it up as a remote on your local dev machine, and push to it as your off-site copy. Then, from your dev or production web folders, pull and update submodules. This is not trivial and requires at least a working knowledge of Git, so for now I wouldn't recommend this route.
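The bare-repo flow can be sketched with everything on one machine for illustration (all paths and file names are hypothetical; on a real server the bare repo and web folder would live on the VPS):

```shell
cd "$(mktemp -d)"
git init -q --bare site.git            # the private off-site copy on the "server"
git init -q work && cd work            # stand-in for the local dev machine
git config user.email dev@example.com
git config user.name Dev
echo "<?php echo 'hi';" > index.php
git add . && git commit -qm "first commit"
git branch -M master
git remote add origin ../site.git
git push -q origin master              # push to the bare repo
cd .. && git clone -q site.git webroot # the web folder pulls from the bare repo
```

After the initial clone, deploying a new version is just `git pull` (plus `git submodule update` if you use submodules) inside the web folder.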

Setting up Jenkins Continuous Integration and Selenium web application testing

I am trying to set up Jenkins to run our Cucumber features. I am a little lost when it comes to setting this up. Here are some details of our setup:
Cucumber features with steps written in Ruby and PHP (using Cuke4PHP)
PHP application (which often links to other PHP applications)
Using Capybara and Selenium to exercise Javascript
In development environments, since our apps need to link to each other, we set up apache vhosts with domains like http://developername.dev.exampleapp.com
How should I set up this Jenkins environment to run our cucumber tests?
It seems like you would want to set up a virtual machine using VirtualBox or something in order to set up an environment similar to your production environment and serve the project from that virtual host. But then do you run the tests outside the virtual machine? Or do you run the tests on the virtual machine and report back to Jenkins? Do you set up virtual hosts when you provision the virtual machine? How do you set up your project to use an isolated database? How do you run your features in parallel so they don't take forever? If someone could shed some light I would greatly appreciate it.
We have recently started using Vagrant to set up a development environment on a virtual machine, in which we use folder sharing between the host machine and the guest VM for the application source code, while the application database itself lives on the VM. We haven't integrated it with CI yet, though.
For setting up your application environment for a build, you can use Vagrant, and assign an IP to your VM; the VM can handle the virtual hosts for that IP by itself.
For running the tests, your Selenium/acceptance tests should run from your host/build machine, given the client/server architecture of your applications, as pointed out by Amber. But unit tests should run on the VM itself. I don't have much idea about running features in parallel, but I will share my experience once we have implemented the whole process in CI.
Do you have your clients run their browsers on your production servers? (Hopefully not - hopefully they run them on their own computers!) Thus:
The server VM is the equivalent of your production server, so it's not where the tests should be running - Selenium tests run in the browser. Depending on how many browsers/what kinds of browser setups you want to test, you can either set up separate VMs with OS/browser combos to run the tests, or you can run them outside of a VM on a standard browser install.
