I want to create a static Twig file whenever an operation like persist or update is performed in the admin. My idea is to create a listener called GenerateStaticListener. The listener will listen to the postPersist and postUpdate events and call a custom function. That custom function will fetch the item from the database and write it into an HTML file, and I will then include that Twig file (the HTML file) in my layout. The purpose is to generate the HTML file (i.e. the Twig template) once, so that the database queries to fetch the item are not fired again and again, giving better performance.
Scenario:
The Menu section of my site contains lots of items, and fetching is done from more than one table; around 10 queries are fired to fetch the items. The limitation is poor page performance and user experience. The Menu section is much like the one on mashable.com.
I just want to know from you guys which is the right approach. Is there any other approach?
In this context, using listeners is a good solution for this kind of work.
But creating static files isn't the best way to do it. You will run into problems like write access, file locations, etc.
Have you looked at a cache system? For example Varnish for HTTP/file caching, or memcached to store your data in RAM.
You can check out this bundle for Sf2: LeaseWeb/Memcached.
With this, your controller will get the data from RAM instead of from disk (the database), which is 10x faster or more.
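A lighter variant of the original idea is to keep the Doctrine listener but have it invalidate a cache entry instead of writing a static file. A minimal sketch, assuming Doctrine ORM and the PHP Memcached extension; the MenuItem class and the menu_html cache key are hypothetical names, not from the original code:

```php
<?php
use Doctrine\ORM\Event\LifecycleEventArgs;

class GenerateStaticListener
{
    private $memcached;

    public function __construct(\Memcached $memcached)
    {
        $this->memcached = $memcached;
    }

    public function postPersist(LifecycleEventArgs $args) { $this->invalidate($args); }
    public function postUpdate(LifecycleEventArgs $args)  { $this->invalidate($args); }

    private function invalidate(LifecycleEventArgs $args)
    {
        $entity = $args->getEntity();
        if ($entity instanceof MenuItem) {
            // Drop the cached menu; the next request rebuilds and re-caches it.
            $this->memcached->delete('menu_html');
        }
    }
}
```

The menu-rendering code then tries the cache first and only falls back to the ~10 queries on a miss, so writable directories and file locations never come into play.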
I need to modify all queries that are executed through Zend\Db before they are sent to the DB.
Basically it needs to add an additional WHERE clause to all selects, updates, and deletes, and an additional column and value to inserts.
I was thinking about writing my own TableGateway feature for that; the problem is that I would like to avoid being restricted to TableGateway alone and have this functionality while using Zend\Db\Adapter and TableGateway at the same time.
You can have a look at some of the events dispatched from the table gateway if that makes sense in your context: http://framework.zend.com/apidoc/2.4/namespaces/Zend.Db.TableGateway.Feature.EventFeature.html
There is a preSelect event that is triggered, which you can probably listen to.
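A rough sketch of what listening to that event could look like, assuming ZF2's Zend\Db and EventManager components; the table name and the tenant_id column are illustrative:

```php
<?php
use Zend\Db\TableGateway\TableGateway;
use Zend\Db\TableGateway\Feature\EventFeature;
use Zend\EventManager\EventManager;

$eventManager = new EventManager();
$eventManager->attach('preSelect', function ($event) {
    // The Select object being built is exposed as an event parameter.
    $select = $event->getParam('select');
    $select->where(['tenant_id' => 42]); // extra WHERE added to every select
});

$table = new TableGateway('users', $adapter, new EventFeature($eventManager));
$rows  = $table->select(); // executed with the additional WHERE clause
```

The same feature also triggers preInsert, preUpdate, and preDelete, so the corresponding statements can be amended in the same place. Note that this only covers queries going through the TableGateway, not raw Zend\Db\Adapter usage, which matches the limitation the question describes.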
I ended up writing a custom DB adapter that handles all the logic. I'll probably share it as open source if I find the time to clean up the code.
I'll try to explain this as clearly as possible. I'm working on a rather large project and have created a CrudController. This controller has some default actions (indexAction, createAction, archiveAction, ...). I've also created some corresponding views.
The edit.html.twig view draws the form with the form_widget() function.
The list.html.twig view gets a data array and a (configurable) columns array. It draws a simple table with the required columns and some action buttons (by default an Edit and an Archive button).
The idea is that I am now able to rapidly develop my application: I create a new Entity, Repository, FormType and lastly a Controller that extends CrudController instead of BaseController. In my EntityController, I make sure that the configuration for my CrudController is correct (entity names, pointers to the FormType...). This all works like a charm.
However, I've come to the point where I have an Entity (Project) that needs some additional actions (beside Edit and Archive), namely "Render" or "Download" (when rendering is done).
As I didn't want to completely override my list view for this one entity just to add the extra actions, and as I'll come across this scenario again, I decided to move the rendering of the actions out of the list view: I created an actions.html.twig view that renders just the actions, and an actionsAction that adds the additional actions based on the given project's status (should it show a Render button or a Download button?).
In my list.html.twig I used:
{% for row in data %}
{{ render(controller(entityControllerActionsAction, {'id': row.id})) }}
...
{% endfor %}
However, it turned out render() is quite the memory hog: as soon as I display some entities in the list, it causes a "memory exhausted" error. A page that used only 12MB before now suddenly used more than 128MB (the memory limit), which is simply unacceptable. Loading times also increased massively.
I'm looking for a decent, object-oriented, DRY solution to this problem.
Is there a reason why the render() function would be such a memory and performance hog here? If I could reduce that, the problem would be solved.
Would there be a way to mimic the render() function or to use something different?
I'm not looking for prebuilt code or anything, but rather advice on which direction to proceed in. Thanks for your time.
As mentioned in this Stack Overflow post, render() spawns a new sub-request and therefore increases memory usage. Try using a block structure instead: define a default actions block and override it for the Project entity.
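A minimal sketch of that block approach; the template, block, and route names here are illustrative, not taken from the original code:

```twig
{# crud/list.html.twig: default actions, rendered inline per row (no sub-request) #}
{% block row_actions %}
    <a href="{{ path('entity_edit', {'id': row.id}) }}">Edit</a>
    <a href="{{ path('entity_archive', {'id': row.id}) }}">Archive</a>
{% endblock %}

{# project/list.html.twig: override just the actions for the Project entity #}
{% extends 'crud/list.html.twig' %}
{% block row_actions %}
    {{ parent() }}
    {% if row.rendered %}
        <a href="{{ path('project_download', {'id': row.id}) }}">Download</a>
    {% else %}
        <a href="{{ path('project_render', {'id': row.id}) }}">Render</a>
    {% endif %}
{% endblock %}
```

Because everything stays inside the same request, the per-row controller/sub-request overhead disappears while the generic list view remains reusable.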
I have created a CRUD system. There is a main model and a dependent details model, so multiple rows (sometimes > 100) are entered into the details model related to the parent model. All operations are handled through a grid (inline edit), so create, update, and delete operations are performed in a single request (with logical checking via the DB).
Now I am considering using a DB transaction for the whole operation, but I am confused about how to implement this with the same structure. It has already been suggested that I move all my code into one model so the transaction can be applied there, but I am wondering whether another approach could preserve the separation between the main and details model code.
Are you using AJAX for the changes, or does it rely on manual form submissions?
You can use what's called a UnitOfWork pattern.
Save any changes that the user makes to each line of the grid, but don't actually commit them to the DB. Then place a generic Save button on the page that causes your server to commit all the changes in one transaction.
You can keep a list server-side of each row that the user changes. You don't need to track all the rows, because if nothing changed there is nothing to save.
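A stripped-down sketch of that idea: track only the rows the user touched, then flush them all inside one transaction. This is plain PDO for illustration; the `detail` table and column names are hypothetical, and a real implementation would live behind your details model:

```php
<?php
// Minimal unit-of-work: collect changed rows, commit them all-or-nothing.
class RowUnitOfWork
{
    private $pdo;
    private $dirty = []; // row id => changed columns

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;
    }

    public function registerChange($id, array $columns)
    {
        // Later edits to the same row merge over earlier ones.
        $existing = isset($this->dirty[$id]) ? $this->dirty[$id] : [];
        $this->dirty[$id] = $columns + $existing;
    }

    public function commit()
    {
        $this->pdo->beginTransaction();
        try {
            foreach ($this->dirty as $id => $cols) {
                $sets = implode(', ', array_map(
                    function ($c) { return "$c = ?"; },
                    array_keys($cols)
                ));
                $stmt = $this->pdo->prepare("UPDATE detail SET $sets WHERE id = ?");
                $stmt->execute(array_merge(array_values($cols), [$id]));
            }
            $this->pdo->commit();
            $this->dirty = [];
        } catch (Exception $e) {
            $this->pdo->rollBack(); // nothing is half-applied
            throw $e;
        }
    }
}
```

The grid's inline-edit AJAX calls feed registerChange(), and the Save button triggers commit(); either every edited row is written or none is.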
What sort of approach are you using in your Models? Do your models know about their own persistence (do you do things like $user->save()), or are you doing a more data-mapper approach ($userManager->save($userEntity))? The latter makes transaction handling a lot easier.
If you're doing the active-record type of pattern ($user->save()), your best bet is probably just to manually grab your DB connection and manage the transaction in your controller.
If you're doing data-mapper stuff, you have more options, up to and including doing a whole unit-of-work implementation.
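For the active-record case, the controller-managed transaction is short; a sketch, where $db is assumed to be the same underlying connection the models save through, and $main/$details stand in for your parent and child records:

```php
<?php
// Wrap the whole grid submission in one transaction, keeping the
// main model and details model code separate.
$db->beginTransaction();
try {
    $main->save();            // parent row
    foreach ($details as $detail) {
        $detail->save();      // child rows on the same connection
    }
    $db->commit();
} catch (Exception $e) {
    $db->rollBack();          // undo everything on any failure
    throw $e;
}
```

The only hard requirement is that every save() goes through the same connection; otherwise the rollback cannot cover all of them.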
As for my last comment, moving the code to the parent model is my solution for now, so I think I should mark this thread as answered.
I'm developing a project management tool in Symfony. Right now I'm creating a module for recording logs, i.e., capturing every event such as new project creation, task creation, task status changes, deletion of projects and tasks, etc.
I have a log table into which I plan to insert a new row whenever any of the above events occurs. But to do this, I need to go into each controller and call the log model to execute the insert query. It's almost as if I have to rework every action in every controller just to append this code. Is there another way to call the model only once, using something like an event dispatcher class in Symfony?
Glad you are using Propel; there are a bunch of plugins and/or behaviors for tracking what happened to your objects. Here is a list of what I've found:
pmPropelObjectLogBehaviorPlugin: Maintains a class changelog (the changes of each instance).
AuditableBehavior: adds the ability to log activity for Propel objects.
propel-listener-behavior: lets you attach listeners to Propel-generated objects that inform you about updates to them.
ncPropelChangeLogBehaviorPlugin: a Behavior for Propel objects that allows you to track any changes made to them.
JMSAOPBundle does exactly that.
If I may suggest, I think it's better to add custom events for each action; this way you can extend your app with more listeners without losing control. If you use Doctrine, you can also work with the Doctrine event system.
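A minimal sketch of the custom-event approach, assuming Symfony2's EventDispatcher component; the event name, event class, and $logModel are hypothetical:

```php
<?php
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\EventDispatcher\EventDispatcher;

class ProjectCreatedEvent extends Event
{
    public $project;

    public function __construct($project)
    {
        $this->project = $project;
    }
}

$dispatcher = new EventDispatcher();

// The log insert now lives in exactly one place.
$dispatcher->addListener('project.created', function (ProjectCreatedEvent $e) use ($logModel) {
    $logModel->insert('project.created', $e->project->getId());
});

// In a controller, instead of calling the log model directly:
$dispatcher->dispatch('project.created', new ProjectCreatedEvent($project));
```

Each controller action only dispatches its event; adding, say, an email notification later means registering another listener, with no controller changes.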
I'm currently writing an EPOS integration for Magento. When an order is made, its ID is placed in a queue file. Once a minute, a cron job looks at the queue, sends the top order to the EPOS web API, and then moves the ID to either a success-list file or a fail-list file, depending on the outcome.
In order to display the contents of these lists to the user, I created an admin page that reads the file (containing a serialized array), creates a Varien_Object containing the order ID, customer name, and timestamp for each order, and then stores all of these in an instance of Varien_Data_Collection. This collection is then passed to the _prepareCollection function in Grid.php for rendering the grid view.
In 1.4.1.1, the grid renders fine, but the pagination is broken and the filtering doesn't work.
In 1.3.2.4 the grid renders but says there are 'No Records Found'.
Does anybody know what could be causing these issues, and if there is a better way of going about displaying information from a file in Magento?
The reason you can see the entries (1.4+) but can't filter is that Magento uses the collection API to modify the object. If you are just pulling values out of a model, it's no big deal, but if you are searching and filtering, Magento needs the collection to be backed by a database table. It uses Varien_Db_Select objects to build queries that resolve to raw SQL, and that's not going to work on an array.
I would recommend trying to deal with the data in a different way.
It sounds like you are working with a flat file, so the obvious solution of constructing a SQL query to fetch everything for you won't cut it. I would try creating an instance of Zend_Db_Table and populating the values on the fly.
Something like this:
class Foo_Bar_Model_Resource_Success_Collection extends Mage_Core_Model_Resource_Db_Abstract
{
    public function _construct()
    {
        // Declare a write adapter in config, then push the flat-file
        // entries into a real table so the grid can filter against it.
        $table = new Zend_Db_Table('my_db.my_table');
        foreach ($this->getEposArray() as $entry) {
            $table->insert($entry);
        }
        $this->_init('my_table', 'id');
    }
}
Admittedly, I've never done anything quite like this, but I have had the custom grid filter problem crop up on me before, and I know that if you want to search, you need to have your data in a table of some sort. Check out Zend's docs on the matter. I'm pretty sure there's a way to do this inside Magento, but I couldn't begin to think of a solution.
My advice: store your cron job data in a database in the first place; it will make pulling the data back out much easier.