We are two developers working on a web application based on PHP 5, Zend Framework, Doctrine and MySQL 5. We each work with a local web server and a local database. The database schema is defined in a YAML file.
What's the best way to keep our database schemas in sync?
Here's how we do it: whenever developer "A" makes a change, he generates a migration class and commits the migration file; developer "B" then checks it out and executes the migration class.
But creating a migration class on every db change is a rather tedious process.
Do you have a better solution?
I don't know how you would do it in Zend Framework with Doctrine, so here's how I would do it in Symfony with Propel. The exact procedure may vary, but the underlying concept is the same.
I have unit tests on my DAL.
Whenever the schema changes, we check in the YAML file and the generated ORM code (you do have source control, don't you?). I keep my working copy updating automatically, so I get every check-in instantly.
If the schema changes don't affect my code, I just ignore them. But if they break something on my side, I rebuild my forms, ORM classes and so on with the symfony Propel build command. Rebuilding that infrastructure is a single command-line call, so it's no problem for me.
Finally, after rebuilding, I run my unit tests to make sure everything is OK. If not, I'd better get them fixed!
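As a rough sketch of that cycle (assuming symfony 1.x with Propel; the exact task names depend on your symfony version), the rebuild-and-test step is just:

    symfony propel:build --all-classes
    symfony test:unit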
I see that this question has already been answered, but Doctrine can migrate your database for you without having to blow away the whole thing. We use this for every schema change: one developer changes his/her local YAML file, generates new models locally, creates a migration using Doctrine, runs that migration locally to change the db, then checks in both the new YAML file and the migration. Other developers then check out the changed YAML file and migration, generate new models and run the migration to sync their db. Deploying the code to our QA and production environments is pretty much the same process.
More information on Doctrine migrations can be found on the Doctrine site.
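For Doctrine 1.x (which is the usual pairing with YAML schema files), the round trip described above looks roughly like this; the task names come from the Doctrine 1.x CLI and may differ slightly depending on how your doctrine script is wired up:

    ./doctrine generate-models-yaml        # rebuild model classes from the edited YAML
    ./doctrine generate-migrations-diff    # create a migration class from the schema diff
    ./doctrine migrate                     # apply pending migrations to the local database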
Related
I created a separate integration project (let's call it project-Interface) in Laravel, with a new database (let's call it DB-A) and three new tables (table-a, table-b, table-c) created through the artisan migrate command, along with the related models.
Now there is a different project (let's call it project-Web) in Laravel, and it has its own database (let's call it DB-B) and its own tables and models (table-x, table-y, table-z).
I need to do the following:
Can I use table-x of DB-B from project-Interface to save some details in it?
Can I create the tables table-a, table-b and table-c in DB-B through the migrate command from project-Interface?
Can I add some more columns to table-x from project-Interface and use them?
Can I take the existing DB-B connection from the env file of project-Web and use it in the env file of project-Interface?
If yes, and I create a new table from project-Interface, will it be created in DB-B of project-Web? Will the changes be reflected there as well? Also, if I create new columns in table-x from project-Interface, will those columns be reflected in project-Web?
So the purpose here is that the database should be one and the same: it should contain all the tables, and whatever changes I make from project-Interface or project-Web should use DB-B.
Some answers:
It seems that you are asking if you can write to database B from project A. The short answer is yes: if you can connect to it, then you can write to it. The database does not know that you are connecting from a different Laravel project or a project sitting in a different repo - it only understands database users.
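As a minimal sketch of that (the connection name 'db_b', the env keys and the column are illustrative, not anything Laravel defines for you), you would register a second connection in project-Interface and write through it:

    // config/database.php in project-Interface, under 'connections':
    'db_b' => [
        'driver'   => 'mysql',
        'host'     => env('DB_B_HOST', '127.0.0.1'),
        'database' => env('DB_B_DATABASE', 'db_b'),
        'username' => env('DB_B_USERNAME', 'root'),
        'password' => env('DB_B_PASSWORD', ''),
    ],

    // anywhere in project-Interface:
    use Illuminate\Support\Facades\DB;

    DB::connection('db_b')->table('table_x')->insert([
        'some_column' => 'some value',   // hypothetical column
    ]);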
You can also run migrations belonging to another project, but based on the information so far, I would advise against this. I will explain more below.
Adding columns to a different database sounds like a restatement of question 2, and my answer would be the same.
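If you do decide to go that way anyway, the schema builder can also target the named connection from a migration; again, 'db_b' and the column name are made-up examples:

    use Illuminate\Support\Facades\Schema;
    use Illuminate\Database\Schema\Blueprint;

    // inside a migration's up() method in project-Interface
    Schema::connection('db_b')->table('table_x', function (Blueprint $table) {
        $table->string('extra_detail')->nullable();   // hypothetical new column
    });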
The problem with mixing migrations is that the second project (project-Web) would no longer be able to create all of its tables on its own. This will add complexity to:
creating a new development environment (e.g. when adding a new team member to the project)
creating automated database tests that need to tear down and recreate the database from scratch for every run
creating a new staging/production environment
Given that you want some communication between the two projects, I wonder if an HTTP API would be in order? This will be much easier to manage:
All your tables are stored in one project, and all your logic is stored here too. This will simplify migrations enormously (and thus the creation of all environments I mentioned earlier).
You can offer a web service in one project to offer read/write services to the other project. This is much easier to test from an automation standpoint: in the API project you can run API/HTTP tests, and in the client project, you can use mocks.
This approach will allow you to test the projects in isolation in a clean manner, and then do automated integration testing if you wish (e.g. by deploying both projects to a test server using CI/CD, running the migrations and any seed files, and then running tests on one that causes both projects to work together in the designed fashion).
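A very small sketch of what such a write service could look like on the project-Web side (the route, field names and lack of validation are all hypothetical):

    // project-Web, routes/api.php
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\DB;
    use Illuminate\Support\Facades\Route;

    Route::post('/table-x', function (Request $request) {
        $id = DB::table('table_x')->insertGetId(
            $request->only(['some_column'])
        );

        return response()->json(['id' => $id], 201);
    });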
I'm using database migrations with the PHP Symfony framework, and I noticed that there are a lot of migration files; every time I build my project, many of them are executed.
What is the best practice for managing migrations? Could I delete them and create a single database dump file for database initialisation?
I'm assuming you're using Doctrine Migrations to create these files. On dev you can always regenerate the database using the doctrine:schema:update command.
When setting up a new instance you can also use the doctrine:schema:create command which will also create the latest table definitions for you.
That brings us to your question: do you need to keep all the migrations? As long as you know that the migrations have been executed on all outdated instances, you can safely archive (my preferred option) or delete them, as they will never be called again.
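With the old migrations archived, bootstrapping a fresh instance then boils down to something like this (the console path depends on your Symfony version):

    php app/console doctrine:database:create
    php app/console doctrine:schema:create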
I would prefer to use migrations on production instead of doctrine:schema:update, because the update process can run into problems. IMO it is always better to keep an eye on exactly what happens to the DB.
On a development environment you can easily use the doctrine:schema:update command, but you should use migrations on production. In some cases the command may break the migrations (it has happened to me a few times).
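The production-friendly cycle with DoctrineMigrationsBundle is roughly the following (again, app/console vs bin/console depends on your Symfony version):

    php app/console doctrine:migrations:diff      # generate a migration from mapping changes
    php app/console doctrine:migrations:migrate   # review the generated SQL, then apply it on production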
I'm about to start building an API for an existing app with the DB already in production. Functionality will slowly be ported to the API in the future and the app will become more "API-centric".
One of the main starting points is to adopt migrations and a build process. I have reservations on the best way to create migrations for an existing schema without breaking production when they are executed.
As we would like to move quickly with porting functionality over to the API, we ideally want to recreate our current schema as part of our build process and get some core unit tests in place - as opposed to just creating migrations for future changes.
This is where I become unsure as to the best place to start.
What is the best approach for a task like this?
Could the current schema be imported as our first migration?
Could this initial migration be wrapped in something like:
if ( App::environment() !== 'production' ) to ensure it isn't executed in a production environment?
Is it ok to exclude a migration for a particular environment or could this cause problems?
Is there maybe another approach or something stupidly simple I'm missing? :)
I created a tool not too long ago that will generate all the migrations for your current database schema. It also adds the newly created migrations to the migrations table, since the tables already exist. You can get it here: https://github.com/Xethron/migrations-generator
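Check the package's README for the exact setup, but if I remember correctly the usage boils down to a single artisan command:

    php artisan migrate:generate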
Secondly, I use the following line of code in my DatabaseSeeder, but you can add it to any function you wish to disable in production:
if (App::environment() === 'production') {
    exit('Don\'t be stupid, this is a production server!!!');
}
It shouldn't be a problem if you stop execution by throwing an error or by exiting as I do above. If you don't, Laravel will believe the changes happened successfully, mark them as run in the migrations table, and cause errors when running your migrations later. The only exception is if you exclude both the up and down code (but I can't see why you would want to do that).
Hope this helps.
I wouldn't suggest running migrations in a production environment at all, but if you must, make a backup copy of the database first. Also create a copy of the database locally, run the migration there, and test to make sure it all works properly.
You could create your first migration and manually add all the schema layout to it. But I would actually do this in a few migrations to have each table as its own migration.
Since migrations are run from the CLI using artisan you can pass the environment and the database you want the migration to be run on:
artisan migrate --database="connectionName" --env="local"
There is no issue running migrations on a particular environment, besides what I stated above for production.
Remember to add all the existing schema migrations from the first step (the migration file name excluding the extension, e.g. 2014_03_25_143340_AddCountriesTable) to the migrations table in the database; otherwise running the command above will throw errors about tables already existing.
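A quick sketch of that last step (assuming Laravel's default migrations table with its migration and batch columns; run it once, e.g. from artisan tinker):

    DB::table('migrations')->insert([
        ['migration' => '2014_03_25_143340_AddCountriesTable', 'batch' => 1],
        // ... one row per migration that describes an already existing table
    ]);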
Hope this helps.
I am building a complex application and I want to know whether it is safe to use Doctrine migrations in production.
For example, the site has been in use for a year and the company wants to add an extra attribute to the user table.
Do I change it straight away by going into the database, or through Doctrine migrations?
This is one of the intended uses (and benefits) of migrations - to automate the changes to your database quickly and accurately. Yes, they can and in most cases should be used to update your database in production.
Edit: The Symfony2 documentation also explains clearly this is one of the purposes of migrations.
Of course, the end goal of writing migrations is to be able to use them to reliably update your database structure when you deploy your application. By running the migrations locally (or on a beta server), you can ensure that the migrations work as you expect.
...
Yes it would be safe.
I would just add an extra attribute to the User entity, then run the doctrine:generate:entities command; that should generate the get/set methods. Then update your database using doctrine:schema:update --force, which should add the column to your database table.
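As a sketch, the new attribute plus the two commands would look roughly like this (the entity path, bundle name and column are illustrative):

    // src/AppBundle/Entity/User.php
    /**
     * @ORM\Column(type="string", length=255, nullable=true)
     */
    private $phoneNumber;

    // then, from the project root:
    // php app/console doctrine:generate:entities AppBundle:User
    // php app/console doctrine:schema:update --force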
What are the current best practices for testing database interaction with Symfony2? I have a simple CRUD setup and I want to make sure my testing is OK. Right now I have 4 tests, each one making sure that the create, update, delete and list actions work correctly.
I have two magic methods, __construct and __destruct, on my test case. Inside them, I call exec() with 'php app/console ...' to create the database, create the schema and later drop the database. However, this is SLOW as hell, and it happens repeatedly once I have more than one test case.
How should I proceed when it comes to database testing and isolating such tests?
I think it's best to always start clean to be sure that tests are fully isolated. To do that, I simply build the database structure before each test and then fill it with the fixtures needed for the given test.
Note that I only build the needed database tables and only insert the needed fixtures. This is a bit faster than loading a big database dump. It's also cleaner, as tests don't share fixtures (which makes them easier to maintain).
I have a base test case class called KernelAwareTest which helps me in building the schema. You can find it on gist: https://gist.github.com/1319290
setUp() boots the Symfony kernel and stores a reference to it in a class property (together with references to the DIC and the entity manager). It also calls generateSchema() to generate the database schema (using Doctrine's Schema Tool).
By default it generates the database structure for all entities known to the entity manager. You can change this behaviour in your test class by overriding the getMetadatas() method.
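The core of what generateSchema() does is just Doctrine's SchemaTool; a stripped-down sketch (not the exact code from the gist) looks like this:

    use Doctrine\ORM\Tools\SchemaTool;

    // $entityManager comes from the booted kernel's container
    $metadatas  = $entityManager->getMetadataFactory()->getAllMetadata();
    $schemaTool = new SchemaTool($entityManager);
    $schemaTool->dropSchema($metadatas);   // start from a clean slate
    $schemaTool->createSchema($metadatas); // build all tables known to the entity manager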
P.S.: I tried using an in-memory database (SQLite), but it wasn't perfect. Anyway, it's probably better to run tests against the database you use in production.
Database testing is always slow, as you need to create/drop your schema before/after each test. To avoid unnecessary operations, you could:
create the schema in the 'setUpBeforeClass' method;
ensure your 4 tests are launched in one thread with the '@depends' annotation;
drop the schema in the 'tearDownAfterClass' method.
The schema will then be created/dropped only once per test case, as sketched below.
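A minimal sketch of that idea (assuming a Symfony setup where KernelTestCase is available; the class name and the way the entity manager is fetched are illustrative and may differ in your project):

    use Doctrine\ORM\Tools\SchemaTool;
    use Symfony\Bundle\FrameworkBundle\Test\KernelTestCase;

    class ProductRepositoryTest extends KernelTestCase
    {
        private static $em;

        public static function setUpBeforeClass()
        {
            static::bootKernel();
            self::$em = static::$kernel->getContainer()->get('doctrine')->getManager();
            $tool = new SchemaTool(self::$em);
            $tool->createSchema(self::$em->getMetadataFactory()->getAllMetadata());
        }

        public static function tearDownAfterClass()
        {
            $tool = new SchemaTool(self::$em);
            $tool->dropSchema(self::$em->getMetadataFactory()->getAllMetadata());
        }

        // the 4 CRUD tests, chained with @depends, go here
    }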
You can also set up Doctrine to use an in-memory SQLite database (which is very fast):
doctrine:
    dbal:
        driver: pdo_sqlite
        path: :memory:
        memory: true
Anyway, '__construct' and '__destruct' should never be used in PHPUnit test cases; you should use 'setUp' and 'tearDown' instead.
The question is pretty old but still valid today so here is my experience and how I handle it today on my Symfony projects.
I started off using an SQLite in-memory database for my tests and I rebuilt the db schema and inserted fixtures before each single test case. This had two major drawbacks:
It was still way too slow :(
On dev & prod I used Mysql which soon became a problem because SQLite simply does not have all the features needed and sometimes behaves differently
Using MSQL for the tests and rebuilding the schema + inserting fixtures before each test was simply too slow. So I was looking for alternatives...
I stumbled across this blog post: http://alexandre-salome.fr/blog/Symfony2-Isolation-Of-Tests
The author there suggests running tests inside active database transactions and simply rolling back any changes after every single test.
I took this idea and created a bundle for it: https://github.com/dmaicher/doctrine-test-bundle
The setup of the bundle is really easy and does not require changing any existing PHP test classes. Internally it changes the Doctrine config to use custom database connections and a custom driver.
With this bundle you can simply create your database schema and insert fixtures ONCE BEFORE running the whole test suite (I prefer doing this in a custom phpunit bootstrap file). Using the phpunit listener, all tests will then run inside database transactions.
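For reference, the "once before the whole suite" part can be a few lines in the phpunit bootstrap file; the paths, env name and the fixtures command (which assumes DoctrineFixturesBundle) are only an example, and the bundle itself then takes care of wrapping each test in a transaction:

    // tests/bootstrap.php
    require dirname(__DIR__).'/vendor/autoload.php';

    passthru('php bin/console doctrine:database:create --if-not-exists --env=test');
    passthru('php bin/console doctrine:schema:create --env=test');
    passthru('php bin/console doctrine:fixtures:load --no-interaction --env=test');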
I've been using this bundle for a while already, and for me it works out perfectly with SQLite, MySQL or PostgreSQL.
For a while now it has also been used on the symfony-demo project.
Testing on the local machine is a pain, so I started using the CI system buddy.works (there is a free stand-alone version), and for this I needed to resolve the issue on my own.
The result is:
all tests work
tests run on production SQL data
tests run in separation (not in dev or production), so I can do anything I want with the database
all pushes to git are tested and I get a notification if something is broken
all pushes/pull requests to the deploy branch are automatically uploaded to production
This is my solution:
a second parameters.yml in config with the configuration for tests
in production I make a daily SQL dump
when tests start on CI, this SQL backup is copied via scp to the test machine
to prepare all this I'm using robo.li (my config is below)
/**
 * This is the project's console command configuration for the Robo task runner.
 *
 * @see http://robo.li/
 */
class RoboFile extends \Robo\Tasks
{
    // Create the test database and load the production SQL dump into it.
    function loadDb()
    {
        $this->taskExecStack()
            ->stopOnFail()
            ->exec(" mysql -h mariadb -u root -pqwerty -e 'create database test' ")
            ->exec(" mysql -h mariadb -u root -pqwerty test < test.sql ")
            ->run();
    }

    // Swap in the test parameters and point them at the CI database container.
    function prepareDb()
    {
        $this->taskExecStack()
            ->stopOnFail()
            ->exec("cp app/config/parameters-test.yml app/config/parameters.yml")
            ->run();

        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_host: 127.0.0.1')
            ->to("database_host: 'mariadb'")
            ->run();

        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_user: dbuser')
            ->to("database_user: 'root'")
            ->run();

        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_password: 123')
            ->to("database_password: 'qwerty'")
            ->run();
    }
}
I hope this helps you get an idea of how to organize all this. Using a stand-alone CI is difficult to set up, but it's really a good idea.