I have a small team (4 developers) and we created a site that currently has 500 users.
Our development cycle looks like this:
We make changes (install plugins or customize code) on our localhost.
We push it to the development environment. We can do stress tests, we have a couple of test users, and so on.
We evaluate it.
We push it to production.
Deploying new code is as simple as git pull, but the real problem is the database.
Almost all plugins require me to click something in the admin panel. What's more, some of them create new tables or update existing ones when I install them, for example adding some new user metadata for every user.
I thought about some solutions:
Installing plugins manually on all environments:
I can install plugins manually, but this is error-prone.
We had a couple of situations where the code was the same, but the sites behaved differently because of simple misconfigurations.
Exporting the whole database:
Export production database and import to dev.
Install plugin on dev.
Migrate dev to production.
It has a real problem: it is not atomic. If a user created a post while I was installing the plugin on dev, it is gone.
Creating scripts from database changes:
I could record what WordPress did while I was installing a plugin, using the MySQL binary log.
The problem is that a plugin can fetch all users in one database query and then add some metadata with another query using their IDs. This means the binary log records a query with a fixed list of IDs, so a user who registered during the process is missed.
Recording configuration changes with Selenium IDE:
I can then export the test suite to Python and replay it with a changed host address. It could work, but it seems more like a workaround than a solution.
Modifying all the plugins so that they are configured by PHP files:
This seems legitimate, but might create quite a lot of work (a rough sketch of the idea is below).
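Roughly what I have in mind is a small must-use plugin that pins settings in code. This is only a sketch; the option names and values are made up:

    <?php
    // mu-plugins/force-config.php - a sketch of configuring WordPress from code
    // instead of the admin panel. Option names and values below are made up.

    // Force a core option: get_option() short-circuits when a
    // pre_option_{name} filter returns a non-false value.
    add_filter( 'pre_option_blog_public', function () {
        return '1';
    } );

    // Push a plugin setting into the database whenever it drifts.
    add_action( 'init', function () {
        $wanted = array( 'api_key' => 'XXXX', 'mode' => 'live' );
        if ( get_option( 'example_plugin_settings' ) !== $wanted ) {
            update_option( 'example_plugin_settings', $wanted );
        }
    } );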
Can you tell me how you automate WordPress deployments with database changes?
There are a couple of similar questions on SO:
Managing Wordpress blog on development and live environment
Wordpress Dev Environment Migration
but these are rather basic questions and they do not solve my problem.
Have you tried WP Migrate DB Pro?
I use this plugin for DB transfers.
Related
At work we took back our existing store running on Magento 2 from an external development agency. I need to get the project running in local development (with Docker).
I familiarized myself with a vanilla project from the official docs and managed to get it running by downloading the vanilla template with Composer, granting the proper permissions on files and folders, and running the magento setup:install command.
My question is: how does one go about kick-starting from an existing (production-running) project?
Do I need to run setup:install again? If I do, why?
What do I need to import from production to ensure any content or configuration created via the admin is also present on my local setup? Should I import the complete database from production?
I know our setup uses more than just PHP and MySQL, but env.php seems to list only the DB configuration and admin URL. Where can I get the complete service configuration information about what our setup uses?
Anything else I am missing to get started with an existing project for local development?
As someone who is running Magento 2 on a local environment myself, hopefully I can shed some light on this.
If you have a direct copy of the live site, you do not need to run setup:install again.
Ensure you have a copy of the entire Magento 2 site (you can technically ignore the vendor folder, as running composer install will re-download those files, but that's up to you). Also get a copy of the entire database. Magento 2 is notorious for copying the same data to multiple tables, so something could break if you don't have everything.
What do you mean by "service configurations"? If you are referring to Magento 2 extensions, that data is saved in the database, not the env.php file. env.php is only for server-side configuration, such as the DB information, caching, and things of that nature. On mine, I use Redis for the site cache, so that would be included in that file as well, as an example.
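For reference, a stripped-down env.php looks roughly like this - all values are placeholders, and the exact keys vary between setups and Magento versions:

    <?php
    // app/etc/env.php - server-side configuration only; extension settings
    // live in the database. All values here are placeholders.
    return [
        'backend' => [
            'frontName' => 'admin',              // admin URL path
        ],
        'db' => [
            'connection' => [
                'default' => [
                    'host'     => 'db',
                    'dbname'   => 'magento',
                    'username' => 'magento',
                    'password' => 'secret',
                ],
            ],
        ],
        'crypt' => [
            'key' => '0000000000000000',         // encryption key
        ],
        'cache' => [
            'frontend' => [
                'default' => [
                    'backend' => 'Cm_Cache_Backend_Redis',   // only present if Redis is used
                    'backend_options' => [ 'server' => 'redis', 'port' => '6379' ],
                ],
            ],
        ],
        'MAGE_MODE' => 'developer',
    ];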
When you first unpack the site to your local environment, run composer update in the directory. This will ensure you have all the proper files installed. If you are going to run a local dev environment, set the mode to developer with the following command: bin/magento deploy:mode:set developer. This will allow you to make changes and view them by just refreshing the page, rather than flushing the cache all the time.
Eric has answered all the questions correctly. I am also not sure about the "service configurations" you mention here. If this is about third-party extensions/services, you can check the config.php file.
I have two environments for WordPress (dev and production). I work on the dev environment, and when I add new plugins, some of them create tables or update configuration in the DB. How can I handle all these changes and then migrate them to the production environment's DB?
I'm using Git for file changes, but I can't handle the DB changes created by plugins. How can I integrate these changes into Git? Or are there other workarounds?
I'm using a WordPress Docker image, and I mounted an existing folder to /var/www/html.
I upload the mounted folder to git for version control.
I expect to manage all changes in a version control tool.
Update:
I'm using WordPress 5.2.2.
How can I put a database under git (version control)? is a similar question, but it looks slightly different from mine.
As this answer says, keep both a data dump and a schema dump. Does the data dump carry correct diff information against the previous one, so that I can manually add the change to something like a Liquibase changeset?
My concern is only the DB changes made by third-party plugins, which are hard for me to trace.
Here's what we do. Any proper plugin will initialize new DB tables/fields on activation and remove them when the plugin is deactivated. In this way the plugin itself handles all DB migration functions. We write our plugins this way, and nearly all plugins work in a similar fashion. We just commit the plugin code to Git, test in dev, then release to production and activate. Boom, the database is migrated.
Nearly all database changes are driven by new plugin installations. Let each plugin manage the database via its own activate/deactivate hooks.
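For illustration, the activation/deactivation hooks of such a plugin look roughly like this (table, option, and function names are made up):

    <?php
    // Sketch of how a well-behaved plugin manages its own schema.
    // Table and option names are illustrative only.

    register_activation_hook( __FILE__, 'myplugin_activate' );
    register_deactivation_hook( __FILE__, 'myplugin_deactivate' );

    function myplugin_activate() {
        global $wpdb;
        require_once ABSPATH . 'wp-admin/includes/upgrade.php'; // provides dbDelta()

        $table   = $wpdb->prefix . 'myplugin_events';
        $charset = $wpdb->get_charset_collate();

        // dbDelta() creates the table, or alters it to match on upgrades.
        dbDelta( "CREATE TABLE {$table} (
            id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
            user_id BIGINT UNSIGNED NOT NULL,
            created DATETIME NOT NULL,
            PRIMARY KEY  (id)
        ) {$charset};" );

        add_option( 'myplugin_db_version', '1.0' );
    }

    function myplugin_deactivate() {
        // Clean-up on deactivation (some plugins defer table removal to uninstall).
        delete_option( 'myplugin_db_version' );
    }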
So I have a staging and a live Magento site. I want to do things like theming, installing extensions, Magento upgrades etc. on the staging site and then push them to the live server.
The problem is that while I am working on the changes, live orders keep coming into the live site. So I cannot simply copy the files and database to the live server; I would be missing orders and customers, and all my order numbers would be screwed up.
So what is the correct way to keep both databases in sync? Something like a Magento upgrade or extension install will make changes to the database, so I can't just use the live database for both.
I saw someone mention using Git/GitHub. I'm not sure how it works, but will that allow you to push only new database changes to the live server, along with changed files?
Thanks
You shouldn't need to keep your staging and production databases in sync when it comes to variable data such as customers, orders etc.
Instead, you should have a dev-stripped database (no logs, customers, orders etc) from production which you use in staging. You can place all the test orders you want to re-populate that data. This also means you don't have the risk of emailing live customers from your staging database.
In terms of updating your database, design, etc., you need to track two areas of change:
File changes (templates, CSS, Javascript, modules, core Magento updated code...)
Configuration changes (database)
For file changes, you should be using a version control system (Git, Subversion, etc.) and tracking your changes as you make them. When they're tested and ready to deploy to production, you can merge them into master.
For database changes, the only way you can really ensure that everything just switches over when you deploy your code to production is to use Magento setup/upgrade scripts to add your configuration to the database. If you haven't used them before, they're basically a PHP file plus a version number in your module's config.xml: you bump the version, add a new install script, and tell it what you want to change, whether that's creating new tables or data, or saving or changing configuration values.
The major benefits of tracking all your configuration changes in installer scripts (I mean all of it, literally) are:
When you deploy your source code and reindex, Magento will update your production database with all the new configuration it needs
You don't have to worry about managing database changes separately
It's all versioned - each module version increment can have a new install script, so you can see what's changed in the module over time.
For more information, Alan Storm has a great article on them.
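As a rough sketch of what such an upgrade script can look like (the module name, paths, config path, and SQL are made up for illustration; the linked article covers the real pattern in depth):

    <?php
    // File: app/code/local/My/Module/sql/my_module_setup/upgrade-1.0.0-1.0.1.php
    // Runs automatically on the next request after the <version> in the module's
    // config.xml is bumped from 1.0.0 to 1.0.1.

    /** @var Mage_Core_Model_Resource_Setup $installer */
    $installer = $this;
    $installer->startSetup();

    // Persist a store configuration value instead of clicking through
    // System > Configuration by hand.
    $installer->setConfigData('my_module/general/enabled', '1');

    // Ship a schema change with the same release.
    $installer->run("
        ALTER TABLE {$installer->getTable('sales/order')}
        ADD COLUMN my_module_flag TINYINT(1) NOT NULL DEFAULT 0;
    ");

    $installer->endSetup();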
I am currently using Git as my version control for my PHP apps. This is fine for source code.
I am also learning Phinx for database version control. Initially I was excited about this, as I thought it was a full-blown database migration tool that worked similarly to Git, in that I could "pull" or "push" to and from a database repository, but I can see now it is more for migrating database schema changes from local to production.
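For context, a Phinx migration is just a PHP class describing a schema change, roughly like this (table and column names are only examples):

    <?php
    // A typical Phinx migration: a schema change that can be applied
    // ("migrated") or reverted ("rolled back"). Names are examples only.

    use Phinx\Migration\AbstractMigration;

    class AddCustomersTable extends AbstractMigration
    {
        public function change()
        {
            // Phinx infers the reverse operation for reversible changes.
            $this->table('customers')
                 ->addColumn('email', 'string', ['limit' => 255])
                 ->addColumn('created_at', 'datetime')
                 ->addIndex(['email'], ['unique' => true])
                 ->create();
        }
    }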
I would like a system that allows me to ensure that my local version of the database is up to date, so that, for example, when a customer registers an account on the live site, this change is reflected fairly quickly on the local version. It would need to be totally framework-independent, i.e. easy to integrate into OpenCart/WordPress and my own frameworks.
It would work in a similar way to a version control system, where I could simply push or migrate the latest production database to some repository, and pull this down to the local version.
Are there any tools out there that do this? And if not, can someone advise, theoretically, how I could go about writing something like this myself?
Does anyone have a good Drupal upgrade strategy for an install that is in production? No one talks about this in books, and it's hard to find a definitive answer in forums and mailing lists.
Ex:
Lock down prod, don't allow updates to data
copy prod
copy prod database to dev
turn off all modules in dev
upgrade core Drupal in dev (update db if necessary)
upgrade modules in dev (update db if necessary)
turn on modules
test
migrate code and db to prod
turn site back on
Your strategy sounds good, but it would require the site to be in "read only" mode for quite a while, which is not always feasible. Also, I am not quite sure why you would turn all of the modules off and on.
May I propose a slightly different approach:
copy prod database to dev
replicate prod code in dev
upgrade core Drupal in dev
run update.php
test
For each module:
- upgrade the module in dev
- run update.php
- test
Put into maintenance mode
Backup database
Migrate code to production
Run update.php
Put back online and test
This way there is a lot more testing but less downtime; you will also be able to work out which module breaks things if there is an error. It also doesn't rely on you uploading the DB from dev to live.
I don't think there's any need to turn off modules before running update.php anymore (except maybe between major versions). And I definitely wouldn't run update.php once per module - that doesn't make sense with the way update hooks work.
If you're at all comfortable with the command-line (and running on a Linux server) then definitely have a look at Drush. It can simplify the process and allow parts of it to be scripted.
In addition, if you're looking for a formal update process to move stuff from your dev server to production for a large site, you should also be getting up to speed on the hooks that run during install and update.
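For example, an update hook in a custom module (shown here in Drupal 7 style; the module, table, and field names are illustrative) looks roughly like this, and update.php simply runs each such hook that hasn't been applied yet:

    <?php
    // In mymodule.install (Drupal 7 style). update.php keeps track of which
    // hook_update_N() implementations have already run and executes new ones once.

    /**
     * Add a 'weight' column to the mymodule_items table.
     */
    function mymodule_update_7101() {
      db_add_field('mymodule_items', 'weight', array(
        'type' => 'int',
        'not null' => TRUE,
        'default' => 0,
        'description' => 'Sort weight for items.',
      ));
    }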