Best way to update database structure on production server? - php

I'm building an application and would like to know the best way to update the production database.
Right now I deploy my code (a CakePHP application) via a Git repository,
just like this tutorial: https://www.digitalocean.com/community/tutorials/how-to-set-up-automatic-deployment-with-git-with-a-vps
But what is the proper way to update the database structure (not the data)?
Let's say I create a new table or alter some fields in another table during development. Right now I export the query, SSH into the server, connect to MySQL, and run the query against the database,
but I'm sure there must be a less complicated way.

What I like to do is write a set of functions for converting my data into the new fields/tables and so on.
For example, I recently had to change how my database stores transaction dates, because the format I was storing the dates in didn't evaluate correctly in queries.
1. I wrote a PHP function that does the conversion for each field in the database (a sketch of such a function is shown below).
2. I grabbed a backup copy of the live production database.
3. I tested my function with a dry run on the backup copy of the live database.
4. Once everything checked out, I put my application/site into a temporary maintenance mode and grabbed another copy of the live database, to make sure every interaction between my customers and the database was intact and there would be no gap.
5. I ran the functions on that copy of the database and re-uploaded it to my database server.
6. I took my site/application out of maintenance mode.
Having everything prepared and tested in advance meant only 3-5 minutes of downtime for the deployment.
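As an illustration, here is a minimal sketch of what such a conversion function could look like. It assumes a hypothetical `transactions` table whose `transaction_date` column holds `d/m/Y` strings that need converting to MySQL's native `Y-m-d` format; the table name, column name, and date formats are placeholders, not the original poster's actual schema.

```php
<?php
// Hypothetical one-off conversion script. Run it against a COPY of the
// production database first. Table/column names and date formats are
// assumptions for the example.
$pdo = new PDO('mysql:host=localhost;dbname=app_copy;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$select = $pdo->query('SELECT id, transaction_date FROM transactions');
$update = $pdo->prepare('UPDATE transactions SET transaction_date = :date WHERE id = :id');

$pdo->beginTransaction();
foreach ($select as $row) {
    // Convert e.g. "31/12/2014" into "2014-12-31" so date comparisons work in queries.
    $dt = DateTime::createFromFormat('d/m/Y', $row['transaction_date']);
    if ($dt === false) {
        continue; // leave anything unparseable untouched and review it by hand
    }
    $update->execute([
        ':date' => $dt->format('Y-m-d'),
        ':id'   => $row['id'],
    ]);
}
$pdo->commit();
```

Running it against the backup copy first is the dry run described above; only once it checks out do you repeat it on the copy taken during the maintenance window.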

Database migrations are what you need. If you are using CakePHP 2.x, read this:
http://book.cakephp.org/2.0/en/console-and-shells/schema-management-and-migrations.html
If CakePHP 3.x this:
http://book.cakephp.org/3.0/en/migrations.html
In your post-receive hook you can add a trigger that runs the migrations on your production server.
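As a rough illustration, a CakePHP 3.x migration (the Migrations plugin wraps Phinx) might look something like the following; the table and column names are placeholders, not anything from the question:

```php
<?php
// config/Migrations/20150101120000_AddStatusToInvoices.php
use Migrations\AbstractMigration;

class AddStatusToInvoices extends AbstractMigration
{
    /**
     * A reversible change: `bin/cake migrations migrate` applies it,
     * `bin/cake migrations rollback` undoes it.
     */
    public function change()
    {
        $this->table('invoices')
            ->addColumn('status', 'string', [
                'limit'   => 50,
                'default' => 'pending',
                'null'    => false,
            ])
            ->update();
    }
}
```

The deployment hook then only needs to run `bin/cake migrations migrate` after pulling the new code, so the schema change ships together with the code that depends on it.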

Why wouldn't you use migrations, since they're built into Cake? That would allow you to create them locally and then, when the code is pushed via Git, run the migrations in production.

Related

MySQL temporarily modify database for development

For development I have a set of tables in a database. I want to modify the tables during development and testing, but be able to go back to the original state I started with every time I run the application. Is there a way to achieve this without actually backing up and restoring every time?
OS: Windows 10. Software: XAMPP with MySQL (MariaDB) and PHP.
You can make a backup and then restore the backup with a different name. Then point the dev/test environment of your application (you do have a different "test" copy of your application as well, I hope?) at the new copy of the database.
Each time you want to "go back to the start", just restore the backup (with the alternative name) again.
Backup/restore is scriptable, so you can automate it if you need to.
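Since this is a PHP-on-XAMPP setup, a rough sketch of that automation as a small PHP CLI script might look like the following; the paths, database names, and credentials are placeholders you would need to adapt:

```php
<?php
// reset_dev_db.php -- drop and re-create the dev database from a saved dump,
// so every run starts from the same known state.
// Paths, names, and credentials below are placeholders for the example.
$mysql    = 'C:/xampp/mysql/bin/mysql.exe';
$user     = 'root';
$pass     = '';                            // default XAMPP root has no password
$devCopy  = 'app_dev';                     // the copy your application points at
$dumpFile = __DIR__ . '/app_original.sql'; // dump of the pristine database

// Recreate the dev database...
$pdo = new PDO('mysql:host=localhost', $user, $pass);
$pdo->exec("DROP DATABASE IF EXISTS `$devCopy`");
$pdo->exec("CREATE DATABASE `$devCopy`");

// ...and load the saved dump into it.
$cmd = sprintf('"%s" -u%s %s %s < "%s"',
    $mysql,
    $user,
    $pass !== '' ? '-p' . $pass : '',
    $devCopy,
    $dumpFile
);
passthru($cmd, $exitCode);
echo $exitCode === 0 ? "Reset '$devCopy' from dump.\n" : "Restore failed.\n";
```

Point the test copy of the application at `app_dev`, and running this script is the whole "go back to the start" step.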
If your data really gets altered the way you describe, there is no alternative to restoring the original state, but you can look for ways to make that faster and easier. If the database is big and you want to shorten the restore time, look at the suggestions here:
https://dba.stackexchange.com/questions/83125/mysql-any-way-to-import-a-huge-32-gb-sql-dump-faster
You can also write a shell script that wraps the restore operations suggested in one of the solutions from the link above into a single command.

Keeping external MySQL database updated with production data

I have a site with both a production version (something.com) and a staging version that is not accessible to the public, used for testing and verification. The issue is that a lot of the testing is specific to the data, which changes rapidly on production, making the staging database outdated fairly quickly. At first I would manually use mysqldump once or twice a week to back up the production DB and reimport it on staging. Now that the data on the live site changes a lot quicker, doing this only once a week isn't enough, and the whole process is a bit tedious.
I'm thinking there has to be a way to do this automatically. I could make the staging DB accessible from the production server, write a script that dumps the production database and overwrites staging, put that in a nightly cron job, and be done with it. The database is quite big though (the last backup was over 400 MB), so I was wondering if there might be a way to do incremental updates. There are multiple tables and the data isn't necessarily dated (for instance, it isn't user accounts with a created_on field), so I'm not really sure whether finding all the operations done during a specific time span is doable. Maybe there's a trigger that could log all the operations, which could then be replayed on the staging DB nightly?
If it's of any help, the database is for a Symfony 2 application.
Thanks
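For what it's worth, the straightforward nightly dump-and-reimport described in the question could be sketched as a small PHP script driven by cron. Every host name, credential, and database name below is a placeholder, and this does nothing about incremental updates; it only automates the full refresh:

```php
<?php
// sync_staging.php -- run nightly from cron on the staging server.
// Dumps the production database over the network and reimports it locally.
// Host names, credentials, and database names are placeholders.
$prodHost = 'something.com';
$prodUser = 'backup_user';
$prodPass = 'secret';
$prodDb   = 'app_prod';
$stagUser = 'root';
$stagPass = 'secret';
$stagDb   = 'app_staging';

// --single-transaction avoids locking InnoDB tables during the dump;
// piping straight into mysql avoids writing the ~400 MB dump to disk.
$cmd = sprintf(
    'mysqldump --single-transaction -h%s -u%s -p%s %s | mysql -u%s -p%s %s',
    escapeshellarg($prodHost),
    escapeshellarg($prodUser),
    escapeshellarg($prodPass),
    escapeshellarg($prodDb),
    escapeshellarg($stagUser),
    escapeshellarg($stagPass),
    escapeshellarg($stagDb)
);
passthru($cmd, $exitCode);
exit($exitCode);
```

A crontab entry such as `0 3 * * * php /path/to/sync_staging.php` on the staging server would then run it nightly.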

Erasing db entries on Wamp

I used WAMP when building a sign-in app for my company. In the table, the first 40 entries are test entries. My boss asked if I could erase them. My concern is whether erasing them will have an effect on the other entries (customers signing in) that have been made since. I only took one DB class in undergrad, and the professor told us that you don't change entries unless you have to; it is easy to screw up a DB and hard to fix it. I am wondering if anyone has advice on whether it is worth erasing the first 40 entries, or whether doing so could mess up the other entries. Basically, if it's not broken, is it worth fixing?
The front end of the app is Android, written in Java and XML. The back end is PHP that talks to the DB.
Generally, deleting entries is fine; just understand the DB structure first. Do the IDs for those users get referenced elsewhere? How does the DB connect its data? The basic principle is as follows. In your development environment, migrate the database downstream (from production to development). On dev (and dev only!) delete the 40 entries, then test. I assume you don't have unit testing in place (write unit tests!), but you can still test all the functionality manually. Honestly, this should not hurt anything; a minimal sketch of the delete itself is shown after this answer.
If you have to do this on production, then make a backup of the DB first and test out deleting the entries. If it doesn't work, drop the DB and re-import the backup. This is not advised, because if you fail to re-import in a decent amount of time, people are going to take notice. If this is your first database dump and re-import, test it on a throwaway database, or duplicate the production database under a different name.
Also, you should look into PHP frameworks if you don't currently use one. Laravel has 'migrations', which let you run code to update your database. That way you can write code on dev, test it, deploy to production, and just run the migration (all automatable). Here is some info on it: http://laravelbook.com/laravel-migrations-managing-databases/
Good luck and remember, ALWAYS TEST ON DEV FIRST.
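That sketch, assuming the test rows are simply IDs 1-40 in a hypothetical `signins` table (with a hypothetical `signin_log` table referencing it) and InnoDB tables so the whole thing can run in a transaction:

```php
<?php
// Run against the DEV copy first. Table/column names and the ID range
// are placeholders; adapt them to your schema.
$pdo = new PDO('mysql:host=localhost;dbname=signin_dev;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // If other tables reference these IDs, clean those rows up first
    // (or rely on ON DELETE CASCADE if the foreign keys define it).
    $pdo->exec('DELETE FROM signin_log WHERE signin_id <= 40');
    $pdo->exec('DELETE FROM signins WHERE id <= 40');
    $pdo->commit();
    echo "Test entries removed.\n";
} catch (Exception $e) {
    $pdo->rollBack(); // nothing is changed if any step fails
    echo "Aborted: " . $e->getMessage() . "\n";
}
```

If anything throws, the rollback leaves the data untouched, which is exactly the safety net you want before repeating the operation on production.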

Bring data periodically from Linked Database in SQLServer 2008

We are developing a system in PHP with SQL Server 2008. It has to work with invoices stored in another SQL Server instance, which I have linked to my database using sp_addlinkedserver.
The problem is that I think I need to have the data loaded locally (for performance). So I'm thinking of creating my own "invoices" table and, twice a day, somehow bringing the data from the linked table into the locally stored one.
How can I get SQL Server to do this every X amount of time?
What approach should I use to program the import?
At first I thought of writing my own script to do this, but I would prefer to have SQL Server handle it; that said, it depends on your opinion :)
Thank you!
Guillermo
NOTE: Replication sounds like overkill to me. I don't need real-time synchronization, and I don't need to update the data, just read it.
One option is to use replication to copy the data. However, it may take more administration than you're planning. Replication is great for managing a consistent and timely copy of the data.
Another option is to setup a SQL Server job that will run a SQL script to insert into your target table using a select from your linked server.
You could also use SQL Server Integration Services (SSIS). You would create a SSIS package where you would build a data flow that transfers your data from the source table to the target table. You wouldn't need a linked server for this approach, because your data sources are defined within the SSIS package. And, you can use a SQL Server job to schedule the package run times.
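To make the second option concrete: the job boils down to a single INSERT ... SELECT across the linked server. Here is a rough sketch of that same statement driven from PHP with the PDO sqlsrv driver, in case you do end up scripting it yourself rather than using a SQL Server Agent job; all server, database, and table names are placeholders:

```php
<?php
// refresh_invoices.php -- sketch of the "insert from linked server" idea,
// driven from PHP (e.g. via Windows Task Scheduler twice a day).
// All names below are placeholders.
$pdo = new PDO('sqlsrv:Server=localhost;Database=LocalDb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Replace the local copy, then pull fresh rows across the linked server.
$pdo->exec('TRUNCATE TABLE dbo.invoices_local');
$pdo->exec('
    INSERT INTO dbo.invoices_local (invoice_id, customer_id, total, issued_at)
    SELECT invoice_id, customer_id, total, issued_at
    FROM [LinkedServerName].[RemoteDb].[dbo].[invoices]
');
```

Scheduled twice a day, it does the same work the Agent job would; the Agent job remains the tidier option if you have access to it, since the scheduling then lives inside SQL Server.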

Sync large local DB with server DB (MySQL)

I need to weekly sync a large (3GB+ / 40+ tables) local MySQL database to a server database.
The two databases are exactly the same. The local DB is constantly updated, and every week or so the server DB needs to be updated with the local data. You could call it a 'mirrored DB' or 'master/master', but I'm not sure if that's the correct term.
Right now the DB only exists locally. So:
1) First I need to copy the DB from local to the server. Export/import with phpMyAdmin is impossible because of the DB size and phpMyAdmin's limits. Exporting the DB to a gzipped file and uploading it through FTP will probably break in the middle of the transfer because of connection problems or the server's file size limit. Exporting each table separately would be a pain, and each table is also very big. So what is the best solution for this?
2) After the local DB is fully uploaded to the server, I need to update the server DB weekly. What is the best way of doing that?
I have never worked with this kind of scenario, I don't know the different ways of achieving it, and I'm not particularly strong with SQL, so please explain as clearly as possible.
Thank you very much.
This article should get you started.
Basically, get Maatkit and use the sync tools in there to perform a master-master-synchronization:
mk-table-sync --synctomaster h=serverName,D=databaseName,t=tableName
You can also use a data comparison tool for MySQL.
Customize the synchronization template, which specifies which tables and data to synchronize.
Schedule a weekly update using the template.
I have two servers synchronized daily with dbForge Data Compare via the command line.
