Version controlling a WordPress website with themes installed - php

Being new to the WordPress development ecosystem, I have a few questions I'm currently stuck on.
I installed WordPress and it was working as expected. I recently installed a theme on the local working copy. I also added two more members to the team, which previously was just me, and so put the code under version control in a Bitbucket Server repository.
The issue is that when the new developers cloned the remote repository to their local machines, the entire WordPress install with the theme was available, but it doesn't work properly: the new pages/links/menus/images I added are not showing up in the others' local copies.
I'd like to know the general rules of thumb to follow when version controlling WordPress. Should only the added themes and changed files be version controlled, rather than the whole WordPress codebase? In my scenario, what changes do I need to make (are there any files I need to configure to get the latest repository code running locally)?
Would we need to version control WordPress including the database?
Please advise.

Yes, the main issue you have there is that all the things you want live in the database. There are version control tools for databases, but I'm not sure how well they work... so you either export your DB and share it with the others, or all work off a remote DB on a server. You can change the DB credentials, including the IP address, in wp-config.php.
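A minimal sketch of that wp-config.php change, using a throwaway copy of the file so nothing real is touched (the filename and host IP are placeholders):

```shell
# Write a throwaway wp-config fragment to illustrate (values are placeholders)
cat > wp-config-demo.php <<'EOF'
<?php
define( 'DB_NAME', 'wordpress' );
define( 'DB_USER', 'wp_user' );
define( 'DB_PASSWORD', 'secret' );
define( 'DB_HOST', 'localhost' );
EOF

# Point the clone at a shared remote database server instead of localhost
sed -i "s/define( 'DB_HOST', 'localhost' );/define( 'DB_HOST', '192.168.1.50' );/" wp-config-demo.php

cat wp-config-demo.php
```

In the export-and-share variant, each developer would instead load a dump (e.g. from `mysqldump wordpress > dump.sql`) into their own local database and leave DB_HOST as localhost.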
As for version control, I only keep my theme in git, plus wp-config.php and so on: the bare minimum I can. Then I use Composer (https://getcomposer.org/) to pull in WordPress, and https://wpackagist.org/ to get plugins.
Roots wrote a nice introduction to Composer and WordPress, which is where I started learning from and is worth a look: https://roots.io/using-composer-with-wordpress/
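A sketch of what such a composer.json might look like (the WordPress core package, plugin choice, and install directory are illustrative; wpackagist mirrors the wordpress.org plugin directory as a Composer repository):

```shell
# Minimal composer.json pulling WordPress core and one plugin via wpackagist
# (package versions and the plugin name are illustrative)
cat > composer.json <<'EOF'
{
    "repositories": [
        {
            "type": "composer",
            "url": "https://wpackagist.org"
        }
    ],
    "require": {
        "johnpbloch/wordpress": "^6.0",
        "wpackagist-plugin/akismet": "^5.0"
    },
    "extra": {
        "wordpress-install-dir": "wp"
    }
}
EOF
cat composer.json
```

With this in place, WordPress core and plugins stay out of your repository entirely; `composer install` recreates them on each developer's machine, and only your theme, composer.json, and composer.lock are committed.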

Related

How to include WordPress plugin changes and related db migrations in git?

I have two environments for WordPress (dev and production). I work on the dev environment; when I add new plugins, some create tables or update configuration in the DB. How can I handle all these changes and then migrate them to production's DB?
I'm using git for file changes, but I can't handle the DB changes created by plugins. How can I integrate these changes into git? Or are there other workarounds?
I'm using a WordPress Docker image, with an existing folder mounted to /var/www/html, and I push the mounted folder to git for version control.
I expect to manage all changes in the version control tool.
Update:
I'm using WordPress 5.2.2.
"How can I put a database under git (version control)?" is essentially the same question, but with a slight difference.
As that answer says, keep both a data dump and a schema dump. Does the data dump give a correct diff against the previous one, so that I could manually add the change to something like a Liquibase changeset?
My concern is specifically the DB changes made by third-party plugins, which I can hardly trace.
Here's what we do. Any properly written plugin will initialize new DB tables/fields on activation and remove them when the plugin is deactivated. That way the plugin itself handles all DB migration functions. We write our plugins this way, and nearly all plugins work in a similar fashion. We just commit plugin code to git, test in dev, then release to production and activate. Boom, the database is migrated.
Nearly all database changes are driven by new plugin installations. Let each plugin manage the database via its own activate/deactivate hooks.
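If you still want visibility into what a third-party plugin changed, one workaround is to keep a schema dump under git and diff it after activation. A sketch, with a toy schema file standing in for real `mysqldump --no-data` output and a made-up plugin table name:

```shell
# Simulate tracking a schema dump in git; in practice the dump would come
# from: mysqldump --no-data --skip-comments wordpress > schema.sql
mkdir -p schema-demo
git -C schema-demo init -q
git -C schema-demo config user.email demo@example.com
git -C schema-demo config user.name demo

cat > schema-demo/schema.sql <<'EOF'
CREATE TABLE wp_posts (ID bigint NOT NULL);
EOF
git -C schema-demo add schema.sql
git -C schema-demo commit -qm "schema before plugin activation"

# After activating the plugin, re-dump the schema and diff to see what it did
cat >> schema-demo/schema.sql <<'EOF'
CREATE TABLE wp_myplugin_data (id bigint NOT NULL);
EOF
git -C schema-demo diff schema.sql
```

The diff then tells you exactly which tables a plugin created, which is the raw material you would need for a Liquibase-style changeset if you go that route.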

No option for adding a new plugin in Heroku WordPress

I just installed WordPress on Heroku and wanted to add a plugin, but it doesn't allow me to: no option is available for adding a new plugin. Why? Please help, I'm in great trouble.
Heroku is a bit unusual as hosting providers go. The filesystem your code is deployed on is ephemeral - any changes you make to it (like installing plugins) will be blown away a) the next time you deploy and b) at least once daily due to dyno restarts. In addition, if you're running multiple dynos, making changes on one won't reflect on the others.
What you'll need to do is install the plugins locally in your development environment, and check all the added files into the Git repository and deploy that to Heroku.
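Concretely, the install-locally-then-commit flow might look like this (simulated here with a fresh local repo; the plugin name is illustrative, and the final Heroku push is shown but not run):

```shell
# Simulate committing locally-installed plugin files before a Heroku deploy
mkdir -p site-demo/wp-content/plugins/my-plugin
git -C site-demo init -q
git -C site-demo config user.email demo@example.com
git -C site-demo config user.name demo

# In reality these files come from installing the plugin in your local WP admin
echo "<?php // plugin code" > site-demo/wp-content/plugins/my-plugin/my-plugin.php

git -C site-demo add wp-content
git -C site-demo commit -qm "Add my-plugin (installed locally, not on Heroku)"
git -C site-demo ls-files

# Then the actual deploy (not run here): git push heroku master
```

Because the committed files are part of the deployed slug, they survive dyno restarts and are identical on every dyno, which is exactly what the ephemeral filesystem prevents for admin-panel installs.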

Magento sync orders on staging and live databases

So I have staging and live Magento sites. I want to do things like theming, installing extensions and Magento upgrades on the staging site and then push them to the live server.
The problem is that while I am working on the changes, live orders are coming into the live site, so I cannot simply copy the files and database to the live server. I would be missing orders and customers, and all my order numbers would be screwed up.
So what is the correct way to keep both databases in sync? Something like a Magento upgrade or extension install makes changes to the database, so I can't just use the live database for both.
I saw someone mention using Git/GitHub; I'm not sure how it works, but will that allow me to push only new database changes to the live server, along with changed files?
Thanks
You shouldn't need to keep your staging and production databases in sync when it comes to variable data such as customers, orders etc.
Instead, you should have a dev-stripped database (no logs, customers, orders etc) from production which you use in staging. You can place all the test orders you want to re-populate that data. This also means you don't have the risk of emailing live customers from your staging database.
In terms of updating your database, design etc., you need to track two areas of change:
File changes (templates, CSS, Javascript, modules, core Magento updated code...)
Configuration changes (database)
For file changes, you should be using a version control system (Git, Subversion, etc) and tracking your changes as you make them. When they're tested and ready to deploy to production, then you can merge them into master.
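The branch-then-merge flow for file changes can be sketched like this (repo, file, and branch names are illustrative):

```shell
# Sketch of the branch -> test on staging -> merge to mainline flow
mkdir -p shop
git -C shop init -q
git -C shop config user.email dev@example.com
git -C shop config user.name dev
echo "body { color: black; }" > shop/styles.css
git -C shop add styles.css
git -C shop commit -qm "Initial theme"

# Work on a change in a feature branch; deploy staging from this branch
git -C shop checkout -qb feature/new-header
echo "header { color: blue; }" >> shop/styles.css
git -C shop commit -qam "New header styling"

# Once tested on staging, merge back into the mainline branch for production
git -C shop checkout -q -
git -C shop merge -q feature/new-header
git -C shop log --oneline
```

Production then only ever deploys the mainline branch, so untested work-in-progress never reaches the live site.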
For database changes, the only way you can really ensure that everything just switches over when you deploy your code to production is to use Magento setup/upgrade scripts to add your configuration to the database. If you haven't used them before: a setup script is basically a PHP file, plus a version number in your module's config.xml that you bump up. You add a new install script and tell it what you want to change, whether that's creating new tables or data, saving or changing configuration values, etc.
The major benefits of tracking all your configuration changes in installer scripts (I mean all of it, literally), is:
When you deploy your source code and reindex, Magento will update your production database with all the new configuration it needs
You don't have to worry about managing database changes separately
It's all versioned - each module version increment can have a new install script, so you can see what's changed in the module over time.
For more information, Alan Storm has a great article on them.
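A sketch of what such a setup script looks like on disk, assuming Magento 1.x conventions (the module name Acme_Example and the config path are hypothetical; the version in the filename must match the version declared in the module's config.xml):

```shell
# Lay out a Magento 1.x-style install script for a hypothetical module
mkdir -p app/code/local/Acme/Example/sql/acme_example_setup
cat > app/code/local/Acme/Example/sql/acme_example_setup/install-0.1.0.php <<'EOF'
<?php
// Runs once, when Magento first sees module version 0.1.0
$installer = $this;
$installer->startSetup();
// Example: persist a configuration value instead of clicking it in admin
$installer->setConfigData('acme/example/enabled', '1');
$installer->endSetup();
EOF
ls app/code/local/Acme/Example/sql/acme_example_setup
```

On deploy, Magento compares the installed module version in its database against config.xml and runs any pending scripts, which is what makes the "deploy and everything switches over" behaviour possible.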

Adding A Moodle Project to Source Control

I'm not a php nor moodle developer. I worked as a python developer for many years and now work as a devops eng.
One of my clients uses the Moodle framework for their site with no source control. I've spoken with their lead developer and he insists there's no way to have a Moodle repo without the entire directory structure in the repository (that is, all the auth, admin, backup, badges, etc. directories), since many files in those directories have been touched by their development team.
I did a file count and it's over 50K files, which is insane for a code repo.
Has anyone managed to solve this problem for a moodle site before? Specifically a clean CI process using source control?
I have been through a similar process on several occasions. Your best bet is to clone a clean copy of Moodle from the GitHub repo. Then look at the version.php file on the client site to identify the exact version they are using. Next, check out that same version in the clean copy (use gitk to search for that version number). Finally, copy across the code from the client site; the standard git commands should then let you audit what has changed and commit it in sensible steps.
Once cleaned up, keep all the changes in a branch.
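The steps above can be simulated end to end with a toy upstream repository standing in for github.com/moodle/moodle (the version number, tag, and file contents are placeholders):

```shell
# Build a toy upstream repo standing in for the official Moodle repository
mkdir -p upstream
git -C upstream init -q
git -C upstream config user.email demo@example.com
git -C upstream config user.name demo
echo '$version = 2022112800;' > upstream/version.php
git -C upstream add version.php
git -C upstream commit -qm "Moodle release"
git -C upstream tag v4.1.0

# Clone a clean copy and check out the tag matching the client's version.php
git clone -q upstream clean-moodle
git -C clean-moodle checkout -q v4.1.0

# Copy the client's tree over the clean checkout, then audit what changed
echo '$version = 2022112800; // client customisation' > clean-moodle/version.php
git -C clean-moodle status --porcelain
git -C clean-moodle diff --stat
```

Only the files the client actually modified show up in the status/diff output, so the audit is of their customisations, not of 50K stock Moodle files.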

Dealing with SVN and FTP and dynamic files created/changing on the server?

Basically I've got various projects all version controlled using subversion. This is for many reasons: backup of files in case of bugs/issues in the future; backup of files in case of local system failure etc; collaboration from others in the company; etc..
One of the systems we work with is WordPress, which does updates and installs plugins through its administration panel; on install, the system also creates various files (including a wp-config.php file and a .htaccess file). This means that after install there are files on the server that are integral to the running of the system but aren't on the local systems and aren't in SVN. Plus, any installed plugins and updates aren't mirrored in version control or the local copy.
Plus it feels wrong (specifically when you compare with data normalisation in databases and such) to be working with two copies of the same code - one in version control and one on the server.
So my question is am I using the tools in the right way? Is there any way that the public_html folder from the server can "point" to the latest version in the repo? Or can SVN be configured to read from the public_html folder and automatically add+commit any files created/edited on the server?
Or do people just literally download anything that gets changed/created and add them to SVN manually? Or do people not care? Maybe I've misinterpreted what SVN is for? I'm using it for backup effectively.
Thanks
Tom
I have only versioned my own WordPress theme. All the other stuff, including the data, lives on the server and is solely backed up from there.
The code of WordPress and the plugins used is developed elsewhere; they have their own repositories, and I do not clutter mine with code I will never touch.
The question is how to deal with configuration. I currently run a wiki where I document all the plugins installed live and which configuration properties I have set.
A sync of live to local then goes like this:
Update wordpress version and plugins to the versions written in the wiki
Setting all configuration options as written in the wiki.
Importing the database (except wp_options), converting the static URLs of wp_content files to the local scheme
Synchronisation of the wp_content directory
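The URL conversion step above can be sketched with sed (URLs and filenames are placeholders). One caveat: WordPress stores serialized PHP in some tables, where string lengths are encoded alongside the strings, so a blind search-and-replace can corrupt that data; WP-CLI's `wp search-replace` command handles serialized values safely.

```shell
# Toy stand-in for a production database dump
cat > dump.sql <<'EOF'
INSERT INTO wp_posts VALUES ('https://www.example.com/wp-content/uploads/img.png');
EOF

# Rewrite the production URL to the local scheme (fine for plain-text columns;
# use "wp search-replace" instead when serialized data is involved)
sed "s#https://www.example.com#http://localhost#g" dump.sql > dump-local.sql
cat dump-local.sql
```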
In many cases your hosting provides regular backups. But if you use a VPS you have more freedom to do whatever you want. I have put my public_html folder under version control and created a small script to commit every night, so I have a complete version history of my site with all changes traced. You can also create a script just to copy this folder elsewhere. There may be better solutions for enterprises, but this may be enough for a small project.
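A minimal version of such a nightly commit script (paths are illustrative; it only commits when something actually changed, and would be triggered from cron, e.g. `0 2 * * * /usr/local/bin/site-snapshot.sh`):

```shell
# One-time setup: put public_html under git (simulated with a fresh directory)
mkdir -p public_html
git -C public_html init -q
git -C public_html config user.email backup@example.com
git -C public_html config user.name backup
echo "<?php // site file" > public_html/index.php

# The part the nightly cron job runs: stage everything, commit only if changed
git -C public_html add -A
git -C public_html diff --cached --quiet || \
    git -C public_html commit -qm "Nightly snapshot $(date +%F)"

git -C public_html log --oneline
```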
