Recently I developed a fully automated DB cache solution (MySQL + Redis). Based on the SQL queries, it builds, resets, and updates the cache itself, all without manual cache management. It also handles multiple joins, inserts, deletes, and updates. It works fantastically in production in my previous project, written in PHP from scratch without a framework.
I would like to make this plugin compatible with Laravel.
The problem is finding the best injection point, so that it is possible to intercept SQL queries and return cached results instead of executing the SQL.
The solution must be universal: it must intercept, handle, and return cached results for all SQL, no matter the ORM or DB.
I will be glad to share the final solution as open source once it's done.
Any ideas?
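For illustration, here is the kind of interception point I have in mind: a minimal sketch, under the assumption that Laravel's Connection::resolverFor() hook can swap in a custom connection class. The class name CachingMySqlConnection and the cache-key scheme are mine, not any library's API, and write-path invalidation is omitted:

```php
<?php
// Minimal sketch, not production code: override select() on the MySQL
// connection so every read goes through Redis first.

use Illuminate\Database\Connection;
use Illuminate\Database\MySqlConnection;
use Illuminate\Support\Facades\Redis;

class CachingMySqlConnection extends MySqlConnection
{
    public function select($query, $bindings = [], $useReadPdo = true)
    {
        // Illustrative key scheme: hash the query plus its bindings.
        $key = 'sqlcache:' . md5($query . serialize($bindings));

        // Serve from Redis when we already have this query's result.
        if ($cached = Redis::get($key)) {
            return unserialize($cached);
        }

        // Otherwise execute the SQL and cache the rows for next time.
        $result = parent::select($query, $bindings, $useReadPdo);
        Redis::set($key, serialize($result));

        return $result;
    }
}

// Registration, e.g. in a service provider's register() method:
Connection::resolverFor('mysql', function ($pdo, $database, $prefix, $config) {
    return new CachingMySqlConnection($pdo, $database, $prefix, $config);
});
```

Inserts, updates, and deletes would still need to invalidate the affected keys; that is where the plugin's automatic dependency tracking would plug in.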
So I am working on a project which basically clones the functionality of a piece of software programmed in C# (I have no information about its source code).
Problem
I am developing a web application that will do a simple update on the database (a DB shared with the software mentioned above). What do I want to do? I want to 'spy' on the SQL queries executed by the software and, by doing that, acquire the information I need to write my own queries and clone a specific feature of that software. How can I accomplish that?
My Solution:
My idea is to use the software, specifically the feature I am interested in, then check in the database which queries the software executed. By doing that, I gain information about the updated tables and can code my own features.
My question is: how can you 'spy' on the queries the software executes using SQL Server Management Studio?
You probably want to log all the queries run by the app with SQL Server Profiler.
I'm currently building automated tests using Selenium and Behat for a PHP application that uses MySQL (InnoDB engine) for its database. The database is around 50GB, with a lot of data in it.
The tests are running fine, but currently I'm struggling with the cleanup before every new test run.
Since the tests insert data (creating users, for example), I would like to put the DB in a known state before each test runs. A cleanup script is pretty complicated to write without side effects because of the many relations between the data.
My question is: is there any good practice for restoring the DB to a known state quickly (50GB is a lot of data) just before the Behat features are executed?
Don't test with 50GB of data. Create a database with less data specifically for testing.
You could use transactions with a "dry run" mode which would roll back all queries. That wouldn't restore a certain state of your DB, but it would allow you to interact with your data without making any changes.
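A minimal sketch of that idea with Behat hooks, assuming the tests talk to MySQL through their own PDO connection (the DSN and credentials are placeholders). One caveat: it only shields changes made on this connection; anything the application writes through its own connection still persists:

```php
<?php

use Behat\Behat\Context\Context;

class FeatureContext implements Context
{
    private \PDO $pdo;

    public function __construct()
    {
        // Placeholder DSN/credentials for the test database.
        $this->pdo = new \PDO('mysql:host=localhost;dbname=app_test', 'user', 'secret');
    }

    /** @BeforeScenario */
    public function beginDryRun(): void
    {
        // Open a transaction so the scenario's writes stay invisible.
        $this->pdo->beginTransaction();
    }

    /** @AfterScenario */
    public function rollBackDryRun(): void
    {
        // Throw away everything the scenario changed.
        if ($this->pdo->inTransaction()) {
            $this->pdo->rollBack();
        }
    }
}
```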
I've just started using Yii and managed to finish my first app. Unfortunately, launch day is close and I want this app to be super fast. So far, the only way of speeding it up I've come across is standard caching. What other ways are there to speed up my app?
First of all, read Performance Tuning in the official guide. Additionally:
Check HTTP caching.
Update your PHP. Each major version gives you a good boost.
Use Redis (or at least the database) for sessions; default PHP sessions use files and are blocking (see the sketch after this list).
Consider using nginx instead of (or alongside) Apache. It serves content much better.
Consider using a CDN.
Tweak your database.
These are all general things that are relatively easy to do. If performance is still not acceptable afterwards, do not assume anything. Profile.
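For the session point above, a minimal sketch assuming the phpredis extension is installed (it registers the 'redis' session save handler); host and port are placeholders:

```php
<?php
// Switch PHP sessions from the default file handler to Redis.
// Requires the phpredis extension, which provides the 'redis' handler.
ini_set('session.save_handler', 'redis');
ini_set('session.save_path', 'tcp://127.0.0.1:6379');

session_start();
$_SESSION['user_id'] = 42; // now stored in Redis, not on disk
```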
1. Following best practices
In this recipe, we will see how to configure Yii for best performance and will cover some additional principles of building responsive applications. These principles are both general and Yii-related, so we will be able to apply some of them even without using Yii.
Getting ready
Install APC (http://www.php.net/manual/en/apc.installation.php)
Generate a fresh Yii application using yiic webapp
2. Speeding up session handling
Native session handling in PHP is fine in most cases. There are at least two possible reasons why you will want to change the way sessions are handled:
When using multiple servers, you need to have a common session storage for both servers
Default PHP sessions use files, so the maximum performance possible is limited by disk I/O
3. Using cache dependencies and chains
Yii supports many cache backends, but what really makes Yii cache flexible is the dependency and dependency chaining support. There are situations when you cannot just simply cache data for an hour because the information cached can be changed at any time.
In this recipe, we will see how to cache a whole page and still always get fresh data when it is updated. The page will be dashboard-style and will show the five latest articles added and a total calculated for an account. Note that an account operation cannot be edited after it is added, but an article can.
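To make the dashboard recipe concrete, here is a minimal Yii 1.x sketch using CDbCacheDependency; the Article model and the table/column names are assumptions for illustration:

```php
// Cache the five latest articles; the cached copy is invalidated
// automatically as soon as the newest 'updated' timestamp changes.
$articles = Yii::app()->cache->get('dashboard.articles');
if ($articles === false) {
    $articles = Article::model()->findAll(array(
        'order' => 'updated DESC',
        'limit' => 5,
    ));
    Yii::app()->cache->set(
        'dashboard.articles',
        $articles,
        3600, // one-hour fallback expiry
        new CDbCacheDependency('SELECT MAX(updated) FROM article')
    );
}
```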
4. Profiling an application with Yii
If all of the best practices for deploying a Yii application are applied and you still do not have the performance you want, then most probably, there are some bottlenecks with the application itself. The main principle while dealing with these bottlenecks is that you should never assume anything and always test and profile the code before trying to optimize it.
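In Yii 1.x that boils down to wrapping suspect code in profile markers, as in this minimal sketch; the label 'listRender' is arbitrary, and results are reported through a log route such as CProfileLogRoute:

```php
// Measure just the block you suspect, instead of guessing.
Yii::beginProfile('listRender');
// ... the code you suspect is slow ...
Yii::endProfile('listRender');
```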
If most of your app is cacheable, you should try a caching proxy like Varnish.
Go for general PHP and MySQL performance tuning:
1) Memcached
Memcached is an open source distributed in-memory object caching system; it helps you speed up dynamic web applications by reducing database server load (see the sketch after this list).
2) MySQL performance tuning
3) Web server performance tuning for PHP
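A minimal sketch of the Memcached point above: cache a query result so repeated page loads skip the database. The key, TTL, query, and credentials are placeholders:

```php
<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// Memcached::get() returns false on a cache miss.
$rows = $mc->get('latest_posts');
if ($rows === false) {
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
    $rows = $pdo->query('SELECT id, title FROM posts ORDER BY id DESC LIMIT 10')
                ->fetchAll(PDO::FETCH_ASSOC);
    $mc->set('latest_posts', $rows, 300); // cache for five minutes
}
```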
Django models are really cool because you define all your models/tables right in the code and then sync them with the database. That way, when you go to update your production server, you just run the migration/sync script and can't forget to update any tables.
The project I'm working on now, though, isn't Django- or Python-based; it's written in PHP, and all the queries are written in straight SQL (no ORM). We've got many databases that need to be updated every time we make a change. Right now we're basically copying and pasting our SQL scripts and running them wherever they need to be run, or, if it's a big change, we might use a script. The problem, though, is that sometimes we forget to include some SQL.
If, however, we had a code-based solution, it would automatically get checked in with our pushes, and we couldn't forget to run it. So... I'm looking for a solution that will let us define all our models in PHP but continue to write straight SQL without the use of an ORM (the project is 10 years old; it would be too much work to implement an ORM right now). It would be nice if it could convert our existing DB into PHP models too.
Are there any existing solutions for this?
I haven't used a PHP-based system with the fantastic model support offered by Django, but this project looks promising: Django-like PHP querying interface
You can use Doctrine2, I guess. There is support for native SQL: http://www.doctrine-project.org/docs/orm/2.0/en/reference/native-sql.html
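A minimal sketch of that native SQL support, following the pattern from the linked docs; the User entity and the column names are placeholders:

```php
use Doctrine\ORM\Query\ResultSetMapping;

// Map the raw SQL columns onto the User entity by hand.
$rsm = new ResultSetMapping();
$rsm->addEntityResult('User', 'u');
$rsm->addFieldResult('u', 'id', 'id');
$rsm->addFieldResult('u', 'name', 'name');

// $entityManager is your Doctrine EntityManager.
$query = $entityManager->createNativeQuery(
    'SELECT id, name FROM users WHERE name = ?',
    $rsm
);
$query->setParameter(1, 'romanb');
$users = $query->getResult(); // hydrated User entities
```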
This might cost you, but it is what we use for old projects:
SQLyog
http://www.databasejournal.com/features/mysql/article.php/1584401/Synchronizing-Your-MySQL-Databases-Using-a-Free-MySQL-Admin-Tool---SQLyog.htm
DBDeploy - open source
http://dbdeploy.com/
Phing & DBDeploy - how-to
http://www.davedevelopment.co.uk/2008/04/14/how-to-simple-database-migrations-with-phing-and-dbdeploy/
The setup is the following:
A Drupal project, one SVN repo with trunk/qa/production-ready branches, vhosts for every branch, and a post-commit hook that copies files from the repository to the docroots.
The problem is the following: a Drupal website often relies not only on source code but on DB data too (node types, their settings, etc.).
I'm looking for a solution to make these changes versionable, but not by 'diffing' all the data in the database; rather, something like fixtures in unit tests.
Fixture-like scripts with SQL data and files for content that should be versionable and applied after the main post-commit hook.
Is there anything written for that purpose, or maybe it would be easy to adapt some kind of build tool (like Apache Ant) or unit testing framework? It would also be great if this tool knew about Drupal, so in scripts I could do things like variable_set() and drupal_execute().
Any ideas? Or should I start coding right now instead of asking this? :)
It sounds like you've already got some infrastructure there that you've written.
So I'd start coding! There isn't anything that I'm aware of that's especially good for this at the moment. And if there were, I imagine it would take some effort to get it working with your existing infrastructure. So starting to code seems the way to go.
My approach to this is to use SQL patch files (files containing the SQL statements to upgrade the DB schema/data) with a version number at the start of the filename. The database then contains a table with config info in it (you may already have this) that includes the version the database is at.
You can then take a number of approaches to applying the patches automatically. One would be a script, called from the post-commit hook, that checks which version the database is at, checks whether the latest version you have a patch for is newer than that, and applies the patches (in order) if so.
The DB patch should always finish by updating the aforementioned version number in the config table.
This approach can be extended to set up a new database from a full dump file and then apply any necessary patches to bring it up to date as well.
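A minimal sketch of such a patch runner in PHP, under assumptions of my own: a config table with a db_version column, and patch files named with a leading version number (e.g. 0042-add-index.sql) in a patches/ directory:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'secret');
$current = (int) $pdo->query('SELECT db_version FROM config')->fetchColumn();

// Filenames start with the version number, so a plain sort orders them.
$patches = glob(__DIR__ . '/patches/*.sql');
sort($patches);

foreach ($patches as $file) {
    $version = (int) basename($file); // leading digits of the filename
    if ($version > $current) {
        // Assumes each patch file can be executed as-is.
        $pdo->exec(file_get_contents($file));
        $pdo->exec('UPDATE config SET db_version = ' . $version);
        $current = $version;
    }
}
```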
I did a presentation on this at a recent conference (slideshare link) -- I would STRONGLY suggest that you use a site-specific custom module whose .install file contains versioned 'update' functions that do the heavy lifting for database schema changes and settings/configuration changes.
It's definitely superior to keeping .sql files around, because Drupal will keep track of which ones have run and gives you a batch-processing mechanism for anything that requires long-running bulk operations on lots of data.
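For illustration, a minimal sketch of such a versioned update function in a Drupal 6-style .install file; the module name mysite, the added field, and the variable are assumptions:

```php
<?php
// mysite.install -- Drupal runs each mysite_update_N() exactly once,
// in order, via update.php, and records which ones have already run.
function mysite_update_6001() {
  $ret = array();

  // Schema change: add a flag column to an existing table.
  db_add_field($ret, 'node_revisions', 'approved', array(
    'type' => 'int',
    'not null' => TRUE,
    'default' => 0,
  ));

  // Settings/configuration change, versioned alongside the code.
  variable_set('site_frontpage', 'node/1');

  return $ret;
}
```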
My approach to this is to use SQL patch files (files containing the SQL statements to upgrade the DB schema/data) with a version number at the start of the filename.
I was thinking of a file (XML or something) with the needed DB structure, and a tool that applies the necessary changes.
And yes, after more research I agree: it will be easier to code it than to adapt some other solution. Though some routines from the SimpleTest Drupal module will be helpful, I think.
You might want to check out the book Refactoring Databases.
The advice I heard from one of the authors is to have a script that will upgrade the database from version to version rather than building up from scratch each time.
Previously: Drupal Source Control Strategy?