I need to modify all queries that are executed through Zend\Db before they are sent to the DB.
Basically it needs to add an additional WHERE condition to all selects, updates and deletes, and an additional column and value to inserts.
I was thinking about writing my own TableGateway feature for that; the problem is that I would like to avoid being restricted to TableGateway alone, and to have this functionality while using Zend\Db\Adapter and TableGateway at the same time.
You can have a look at some of the events dispatched from the table gateway, if that makes sense in your context: http://framework.zend.com/apidoc/2.4/namespaces/Zend.Db.TableGateway.Feature.EventFeature.html
There is a preSelect event that is triggered, which you can probably listen to.
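For illustration, a minimal sketch of wiring up such a listener with the ZF2 EventFeature (the tenant_id condition is a made-up example of the extra WHERE clause):

    use Zend\Db\TableGateway\Feature\EventFeature;
    use Zend\Db\TableGateway\TableGateway;
    use Zend\EventManager\EventManager;

    // Attach a listener that rewrites every SELECT issued through this gateway.
    $events = new EventManager();
    $events->attach('preSelect', function (EventFeature\TableGatewayEvent $event) {
        $select = $event->getParam('select'); // the Zend\Db\Sql\Select being built
        $select->where(array('tenant_id' => 42)); // made-up extra condition
    });

    // $adapter is your existing Zend\Db\Adapter\Adapter instance.
    $users = new TableGateway('users', $adapter, new EventFeature($events));
    $rows = $users->select(); // SELECT ... WHERE tenant_id = 42

Matching preInsert, preUpdate and preDelete events exist for the other statement types.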
I have ended up writing a custom DB adapter that handles all the logic. I'll probably share it as open source if I find the time to clean up the code.
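The general shape of that idea looks something like this (a sketch, not the final code; the actual rewriting logic is omitted):

    use Zend\Db\Adapter\Adapter;
    use Zend\Db\ResultSet\ResultSetInterface;

    // Sketch: intercept every query that goes through Adapter::query().
    class FilteringAdapter extends Adapter
    {
        public function query($sql, $parametersOrQueryMode = self::QUERY_MODE_PREPARE, ResultSetInterface $resultPrototype = null)
        {
            $sql = $this->rewriteSql($sql); // hypothetical helper
            return parent::query($sql, $parametersOrQueryMode, $resultPrototype);
        }

        private function rewriteSql($sql)
        {
            // Real logic would append the extra WHERE clause / column here.
            return $sql;
        }
    }

Note that only statements issued through Adapter::query() are caught this way; SQL prepared directly against the driver bypasses it.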
I'm new to Symfony and Doctrine.
I have a project where I need a method inside a Symfony service to be called with data from the DB whenever a DateTime value saved in a DB table "expires" (reaches a certain, dynamic age).
As I'm just starting out, I do not have any code yet. What I need is a starting point to get me looking in the right direction, as neither the lifecycle callbacks nor the Doctrine event listener/dispatcher structure seems able to solve this task.
Am I missing something important here, or is this simply the wrong approach to my problem, which perhaps can't be solved by Doctrine itself?
What came to my mind is a cron-job-ish structure, but that kind of implementation is not as dynamic as required: it is bound to fixed time frames, which may not be reactive enough and may even decrease performance immensely in some situations.
If I'm getting your problem right: you want something that executes when a record's datetime expires.
The main problem is that you would have to call PHP based on a DB event, which is not straightforward...
One possible solution is a Symfony command that is executed periodically (using cron), in which you select the expired entities and perform the required actions.
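A rough sketch of that command (the entity, field and command names are all made up, and the entity manager is assumed to be injected as a service):

    use Doctrine\ORM\EntityManagerInterface;
    use Symfony\Component\Console\Command\Command;
    use Symfony\Component\Console\Input\InputInterface;
    use Symfony\Component\Console\Output\OutputInterface;

    class ExpireRecordsCommand extends Command
    {
        private $em;

        public function __construct(EntityManagerInterface $em)
        {
            parent::__construct('app:expire-records');
            $this->em = $em;
        }

        protected function execute(InputInterface $input, OutputInterface $output)
        {
            // Select every record whose stored datetime has passed.
            $expired = $this->em->createQuery(
                'SELECT r FROM AppBundle\Entity\Record r WHERE r.expiresAt <= :now'
            )->setParameter('now', new \DateTime())->getResult();

            foreach ($expired as $record) {
                // Call your service with the expired record's data here.
            }

            return 0;
        }
    }

A crontab entry such as */5 * * * * php app/console app:expire-records then keeps the check running every five minutes.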
So as far as I found out, Doctrine is really not able to do this task in the described way. Of course the DB can't react to part of a record it has saved without an external action triggering the lookup.
So what I will probably go with is the shell program "at".
It is actually something like I (and katon.abel) mentioned: it can register one-time jobs that are executed at the provided time (which I then do not need to save in the DB, but can simply pass to "at").
This way I can easily create the jobs via Symfony, save the needed data via Doctrine, and call the callback method via a script triggered by "at".
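Queueing such a one-time job from PHP could look roughly like this (a sketch; the console command name is made up, and the timespec format assumes GNU "at", which accepts "HH:MM YYYY-MM-DD"):

    // Schedule a one-time "at" job that runs a console command at expiry.
    // $expiresAt is a \DateTime, $recordId an integer.
    $job = 'php /path/to/app/console app:handle-expiry ' . (int) $recordId;
    $timespec = $expiresAt->format('H:i Y-m-d');

    shell_exec(sprintf('echo %s | at %s', escapeshellarg($job), $timespec));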
I've got a script that fetches data from a database using Doctrine. Sometimes it needs to fetch the data for the same entity a second time; it then uses the identity map, however, and can therefore go out of sync with the database (another process can modify the entities in the DB). One solution we tried was to set the query hint Query::HINT_REFRESH before we run the DQL query. We would like to use it with simple findBy(..) calls as well, but that doesn't seem to work. We would also like to be able to set it globally per process, so that all the Doctrine SELECT queries run in that context actually fetch the entities from the DB. We tried $em->getConfiguration()->setDefaultQueryHint(Query::HINT_REFRESH, true); but again that doesn't seem to work.
Doctrine explicitly warns you that it is not meant to be used without a cache.
However, if you want to ignore this, then Cerad's comment (also mentioned in this answer) sounds right. If you want to do it on every query, though, you might look into hooking into a Doctrine event. Unfortunately there is no preLoad event, only postLoad, but if you really don't care about performance you could create a postLoad listener which first gets the class and id of the loaded entity, calls clear on the entity manager, and finally reloads it. Sounds very wrong to me though; I wash my hands of it :-)
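For completeness, the less drastic per-call workaround is EntityManager::refresh(), which forces a fresh SELECT for an entity you already hold (the entity class and criteria here are placeholders):

    // findBy()/findOneBy() may hand back a stale identity-map instance;
    // refresh() re-reads the row and overwrites the in-memory copy.
    $user = $em->getRepository('AppBundle\Entity\User')->findOneBy(array('email' => $email));
    if ($user !== null) {
        $em->refresh($user);
    }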
I want to create a static Twig file whenever an operation like persist or update is done in the admin. What I am thinking is to create a listener called GenerateStaticListener. The listener will listen to the postPersist and postUpdate events and call a custom function. The custom function will fetch the item from the database and write it into an HTML file, and then I will include that Twig file (the HTML file) in my layout. The purpose is to generate the HTML (i.e. Twig) file once, so that the query fetching the item does not hit the database again and again, giving better performance.
Scenario:
The menu section of my site contains lots of items, and fetching is done from more than one table: around 10 queries are fired to fetch the items. The limitation is poor page performance and user experience. The menu section is just like mashable.com's.
I just want to know from you guys which is the right approach.
Any other approaches?
In this context, using listeners is a good solution for that work.
But creating a static file isn't the best way to do it. You will have problems like write access, file location, etc...
Have you looked at a cache system? Like Varnish for HTTP/page caching, or Memcached to keep your data in RAM.
You can check this bundle for Symfony2: LeaseWeb/Memcached
With this, your controller will get data from RAM instead of from disk (the database), which is 10x faster or more.
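As a sketch with the plain Memcached extension (the key name, TTL and repository call are placeholders):

    $memcached = new \Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    $menu = $memcached->get('site_menu');
    if ($menu === false) {
        // Cache miss: run the ~10 menu queries once, then keep the result in RAM.
        $menu = $menuRepository->fetchMenuItems(); // hypothetical repository call
        $memcached->set('site_menu', $menu, 3600); // expire after one hour
    }

A postPersist/postUpdate listener like the one you describe then only needs to delete the site_menu key so that the next request rebuilds it.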
I have created a CRUD system. Here I have a main model and then a dependent details model, so there are multiple rows (sometimes > 100) entered into the details model related to the parent model. All operations are handled through a grid (inline edit), so create, update and delete operations are performed in a single request (with logical checking via the DB).
Now I am considering using a DB transaction for the whole operation, but I am confused about how to implement this with the same structure. I have already received suggestions to move all my code into one model so the transaction can be applied there, but I am wondering whether another approach could be used that retains the separation between the main and details model code.
Are you using AJAX for the changes or does it rely on manual form submissions?
You can use what's called a UnitOfWork pattern.
Save any changes that the user makes to each line of the grid, but don't actually commit them to the DB. Then place a generic save button on the page that will cause your server to actually commit all the changes through a transaction.
You can keep a server-side list of each row that the user changes. You don't need to track all the rows, because if nothing was changed, nothing needs to be saved.
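A bare-bones sketch of that commit step with PDO (table and column names are placeholders):

    // $changedRows is the server-side list of edited grid rows.
    $pdo->beginTransaction();
    try {
        $stmt = $pdo->prepare('UPDATE details SET value = :value WHERE id = :id');
        foreach ($changedRows as $row) {
            $stmt->execute(array('value' => $row['value'], 'id' => $row['id']));
        }
        $pdo->commit(); // all edits become visible at once
    } catch (\Exception $e) {
        $pdo->rollBack(); // any failure undoes the whole batch
        throw $e;
    }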
What sort of approach are you using in your Models? Do your models know about their own persistence (do you do things like $user->save()), or are you doing a more data-mapper approach ($userManager->save($userEntity))? The latter makes transaction handling a lot easier.
If you're doing the active-record type of pattern ($user->save()), your best bet is probably just to manually grab your DB connection and manage the transaction in your controller.
If you're doing data-mapper stuff, you have more options, up to and including doing a whole unit-of-work implementation.
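For the active-record case, the controller-level transaction is roughly this (a sketch; getDbConnection() and the transaction methods stand in for whatever your framework actually exposes):

    // Wrap the main model and all detail models in one transaction.
    $db = $mainModel->getDbConnection();
    $db->beginTransaction();
    try {
        $mainModel->save();
        foreach ($detailModels as $detail) {
            $detail->save();
        }
        $db->commit();
    } catch (\Exception $e) {
        $db->rollBack();
        throw $e;
    }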
As for my last comment, moving the code to the parent model is my solution for now. So I think I should mark this thread answered.
We are now in a situation on a project where we need to manipulate multiple user entities at once. For example, we will disable 50 users at once. Normally we would do that in a gateway.
Gateway:
- Query the data at once via the gateway with a single query

OR

- Loop through the multiple users; for each one:
  - Load the entity
  - Manipulate the data
  - Validate
  - save()
But neither is a best-practice solution.
The first option rules out the possibility of validating the data.
The second performs poorly, because we have to load and iterate over an entity for every user.
What do you suggest? We want a fast solution, but also a safe one.
Hope someone knows the right solution. Thanks!
You need an Object-Relational Mapper (ORM) that can load all the users touched by the loop in a single query, so that the load is reduced.
The same goes for the save operation at the end: all changed entities should be stored at once, with a unit of work.
Check the product documentation of the ORM you're using, or contact its vendor, to find out which features it offers to support your development.
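In Doctrine terms, for example (assuming Doctrine is your ORM; entity and method names are placeholders), the shape would be:

    // One SELECT loads all the affected users at once.
    $users = $em->getRepository('App\Entity\User')->findBy(array('id' => $userIds));
    foreach ($users as $user) {
        $user->setActive(false); // per-entity validation can run here
    }
    // The unit of work writes all changes back in a single commit.
    $em->flush();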
I would suggest the first option: updating them all at once in a single query. But you don't state what kind of data validation you would like to do. Maybe you could start a database transaction and issue a second query to validate the result of your "disable users" query before committing?
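A sketch of that suggestion with plain PDO (the table, column and validation rule are placeholders):

    $in = implode(',', array_fill(0, count($userIds), '?'));

    $pdo->beginTransaction();
    try {
        $pdo->prepare("UPDATE users SET disabled = 1 WHERE id IN ($in)")
            ->execute($userIds);

        // Second query: verify the result before making it permanent.
        $check = $pdo->prepare("SELECT COUNT(*) FROM users WHERE disabled = 1 AND id IN ($in)");
        $check->execute($userIds);

        if ((int) $check->fetchColumn() !== count($userIds)) {
            throw new \RuntimeException('Disable-users validation failed');
        }
        $pdo->commit();
    } catch (\Exception $e) {
        $pdo->rollBack();
        throw $e;
    }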