We have a web application built in PHP (Laravel) that exposes a number of schema objects via JSON API calls. We want to tie changes in our schema to AngularJS in such a way that when the database updates, the AngularJS model (and subsequently the view) also updates, in real time.
In terms of the database, it could be anything: MySQL, SQL Server, etc. There are a couple of ways we're thinking about this:
MySQL commits fire some sort of event at Laravel, which then fires a call to all relevant/listening models/views in AngularJS.
Whenever data is changed (edited/added), Laravel fires an event to AngularJS. In other words, after any successful DB commit, Laravel does another "thing" to notify listeners.
The second seems the obvious, clean way of doing this, since the database lower down the stack isn't involved in the notification. Is there a better way of doing it?
This question is related:
How to implement automatic view update as soon as there is change in database in AngularJs?
but I don't quite understand the concept of a "room" in the answer.
What (if any) is the best way to efficiently tie database commits (pushing) to the AngularJS view (to render changes)? We want to avoid polling a JSON API for changes every second, of course.
I've also had a similar requirement on one of my projects. We solved it using Node.js and SockJS. The flow is like this:
There is a Node.js + SockJS server to which all clients connect.
When the DB is updated, Laravel issues a command to Node.js via HTTP (Redis is also a possibility).
Node.js broadcasts the event to all interested clients (which clients are interested depends on your business logic).
Either the client reloads the required data, or, if the message is small enough, it can be included in the Node.js broadcast (see the sketch below).
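For illustration, a rough sketch of what that Node.js + SockJS piece might look like (the port, the /notify endpoint, and the /realtime prefix are made up; it assumes the sockjs npm package and that Laravel simply POSTs a small JSON payload after each successful commit):

    // sockjs-server.js -- a sketch, not production code
    var http = require('http');
    var sockjs = require('sockjs');   // npm install sockjs

    var clients = new Set();
    var sock = sockjs.createServer();

    sock.on('connection', function (conn) {
      clients.add(conn);
      conn.on('close', function () { clients.delete(conn); });
    });

    // Plain HTTP endpoint that Laravel calls after a successful commit,
    // e.g. POST http://localhost:8081/notify with a small JSON body.
    var server = http.createServer(function (req, res) {
      if (req.method === 'POST' && req.url === '/notify') {
        var body = '';
        req.on('data', function (chunk) { body += chunk; });
        req.on('end', function () {
          clients.forEach(function (conn) { conn.write(body); });  // broadcast to everyone
          res.end('ok');
        });
        return;
      }
      res.statusCode = 404;
      res.end();
    });

    sock.installHandlers(server, { prefix: '/realtime' });  // SockJS endpoint clients connect to
    server.listen(8081, '0.0.0.0');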
Hope this helps. There is no clean way to do this without bringing in other technologies (Node.js, WebSockets, SSE, etc.). Much of it depends on the environments your clients will be using as well.
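On the AngularJS end, the receiving side can be a small factory that wraps the SockJS client and broadcasts changes into the digest cycle. Again, only a sketch: the module, event, and endpoint names are invented, and the /realtime prefix matches the server sketch above.

    // Assumes the sockjs-client script is loaded on the page.
    angular.module('app').factory('realtime', function ($rootScope) {
      var sock = new SockJS('http://localhost:8081/realtime');

      sock.onmessage = function (e) {
        var change = JSON.parse(e.data);
        // Re-enter Angular's digest so listening controllers can react.
        $rootScope.$applyAsync(function () {
          $rootScope.$broadcast('db:changed', change);
        });
      };

      return sock;
    });

    // A controller simply reloads (or patches) its model when notified.
    angular.module('app').controller('ItemsCtrl', function ($scope, $http, realtime) {
      $scope.items = [];
      function load() {
        $http.get('/api/items').then(function (res) { $scope.items = res.data; });
      }
      $scope.$on('db:changed', load);
      load();
    });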
Related
What approach and mechanisms (and, ideally, code) should one apply to fully implement Model-to-Views data updates (transfer) on a Model-State-Change event with pure PHP?
If I'm not mistaken, the MVC pattern implicitly requires data to be sent from the Model layer to all active Views, in the sense that "the View is updated on Model change". (Otherwise it doesn't make much sense: users working with the same source would see stale data, completely disconnected from reality.)
But PHP is a scripting language, so it's limited to "connection threads" via processes, and its lifetime is limited to the request-response cycle (as tereško kindly noted).
Thus, one has to solve a couple of issues:
The client must have a live tunnel connection to the server (Server-Sent Events; see the client-side sketch after this list),
The server must be able to push data to the client (flush(), ob_flush()),
A Model-State-Change event must be raised and the related data packed for transfer,
(?) Data must be sent to all active clients (those connected to the same resource/URL) together, not just to the one currently working with its own process and instance of ModelClass.php...
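For the client side of the first point, the browser part is the easy bit. A sketch of an EventSource subscriber (the /events URL and the event name are placeholders; the PHP side would emit text/event-stream data and flush() after each event):

    var source = new EventSource('/events');   // one long-lived HTTP connection

    source.addEventListener('model-changed', function (event) {
      var payload = JSON.parse(event.data);
      // Update the local model / re-render the affected view here.
      console.log('Model changed:', payload);
    });

    source.onerror = function () {
      // EventSource reconnects automatically; just log for visibility.
      console.warn('SSE connection lost, retrying...');
    };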
UPDATE 1: So, it seems that "simultaneous" interaction with multiple users in PHP involves implementing some sort of web server over sockets, independent of NGINX and the like, with a non-blocking I/O core that stores connections and "simply" loops over them, serving data...
Thus, if I'm not mistaken, the easiest way is still to grab a ready-made solution like Ratchet, be it a "concurrency framework" or a web server over sockets...
Too much overhead for a couple of messages a day, though...
AJAX short polling seems a reasonable way out of this dilemma...
Is updating multiple clients simultaneously easier with a backend other than PHP, I wonder? Look at C#: it's event-based, not limited to "connection threads" or to the request-reply life cycle, if I remember correctly... But it's still the web (over the same HTTP?)...
I would like to update my map view in real time when someone adds a post, without refreshing the page, in Ushahidi (an open-source web project). Its backend uses PHP (the Kohana framework) and a MySQL database, and the front end uses AngularJS. There may be no built-in mechanism for this, so I want to find out which tools are suitable.
First, I think I need an event-driven library, such as this, to react to database changes. Second, I need to work out how to detect those database changes. Third, I need a WebSocket-based library, such as Socket.io, to push data to the front end in real time for display.
I have read about the Echo feature in the Laravel PHP framework, which seems to fit, but there is nothing like it in Kohana. Is there a better approach for this system, or some good references? Thanks.
Let's start with a simpler matter: sending notifications.
PHP is not well suited to acting as a WebSocket server (though it can be done). It is better to write that part in Node and notify it from PHP; the WS server then takes care of sending the notifications to clients. You can also use a commercial service that does this for you.
Detecting changes in the database: if the application has been written correctly (i.e. it uses the ORM, not raw INSERT/UPDATE/DELETE/REPLACE queries), then it is enough to override the save method and use the ORM's changed data (https://docs.koseven.ga/guide-api/ORM#changed).
If there are a few raw queries, you can handle those manually.
Otherwise, you can use triggers: mark the record as updated (e.g. insert a row into a log/notification table). In the WebSocket server, you can periodically check this table and notify the recipients.
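A sketch of that last idea, assuming a notifications table with id and payload columns and the mysql and socket.io npm packages (table, event, and connection details are invented):

    var mysql = require('mysql');            // npm install mysql socket.io
    var io = require('socket.io')(3000);     // WebSocket server the clients connect to

    var db = mysql.createConnection({
      host: 'localhost', user: 'app', password: 'secret', database: 'app'
    });
    db.connect();

    var lastId = 0;

    // Periodically check the log/notification table that the triggers (or ORM hooks) fill.
    setInterval(function () {
      db.query(
        'SELECT id, payload FROM notifications WHERE id > ? ORDER BY id',
        [lastId],
        function (err, rows) {
          if (err || !rows.length) return;
          rows.forEach(function (row) {
            io.emit('db-change', JSON.parse(row.payload));  // notify all recipients
            lastId = row.id;
          });
        }
      );
    }, 2000);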
Right now I'm using long polling for my web chat app. The reason I don't use sockets is legacy browser compatibility issues. I'm also avoiding Node.js because my server doesn't allow long-running processes (it's a shared server). However, I want to improve my chat response time without continuously sending AJAX requests to the database.
I thought maybe we could make the MySQL database trigger a response to the server; that would be great. Is it possible, or nah? Why/how? Thanks!
Unfortunately not. The database is passive/reactive. It can perform actions, such as sending emails, but those are exceptional cases, not the norm.
If you want something to happen when a specific thing is added to the database, then the best place to put that logic is the data layer that inserts the data into the database.
Databases are not directly involved in long polls, since they don't intercept the web request. Try moving your logic to the middle tier and having a caching mechanism there that interacts with your clients.
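A toy sketch of that idea (shown in Node only to keep it short; endpoint names are invented). The point is that the same middle-tier process that writes the data also answers the waiting long-poll requests, so the database never has to push anything:

    var http = require('http');

    var waiting = [];    // long-poll responses we are holding open
    var messages = [];   // stands in for the real store/cache

    http.createServer(function (req, res) {
      if (req.method === 'GET' && req.url === '/poll') {
        waiting.push(res);                       // hold the request open, answer later
        return;
      }
      if (req.method === 'POST' && req.url === '/messages') {
        var body = '';
        req.on('data', function (c) { body += c; });
        req.on('end', function () {
          var msg = JSON.parse(body);
          messages.push(msg);                    // the "data layer" write...
          waiting.forEach(function (w) {         // ...doubles as the notification
            w.setHeader('Content-Type', 'application/json');
            w.end(JSON.stringify(msg));
          });
          waiting = [];
          res.end('ok');
        });
        return;
      }
      res.statusCode = 404;
      res.end();
    }).listen(8080);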
Are you really seeing performance problems that warrant changing the polling mechanism? The database isn't able to contact a client; its job is to be a data store.
IMHO, for applications that aren't suffering from performance problems, polling is a great approach. Anything else you try will most likely add an extra layer to the application that you probably don't want to have to manage right now.
Sorry for the "answer" but I gotta build the points to be able to comment :)
In the context of a web server:
In order to avoid re-querying (using find), one could try to keep the cursor reference returned by find between requests. The Cursor object is a complex object that stores, for example, socket connections. How can such an object be stored so that subsequent web requests don't have to re-query? I am working in Node.js, but any advice is helpful (regardless of the language: Rails, C#, Java, PHP).
(I am using persistent sessions)
Facebook's and Twitter's stream features are more complex than a simple query against a DB. Systems like this tend to have two major backend components serving you data: a slow one and a fast one.
1) The first backend system is your database, accessed via a query that returns a page of results from the stream (someone's Twitter feed or their FB feed). When you scroll to the bottom or click "more results", the page variable is simply incremented and the API is queried for that page of your current stream.
2) The second is a completely separate system that sends real-time updates to your page, via WebSockets or by polling an API. This is the "fast" part of your architecture. The data probably isn't coming from a database but from a queue somewhere. From this queue, handlers send your data to your page, which acts as a subscriber.
Systems are designed like this because, to scale enormously, you can't depend on your DB being updated in real time; that happens in big batches. So you run a very small subset of that data through the fast part of your architecture, understanding that what the user gets from the "fast" backend may not look exactly like it eventually will in the "slow" backend, but it's close enough.
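A toy sketch just to make the two paths concrete (assumes the ws npm package; all names, ports, and paths are invented):

    var http = require('http');
    var url = require('url');
    var WebSocketServer = require('ws').Server;   // npm install ws
    var EventEmitter = require('events').EventEmitter;

    var queue = new EventEmitter();               // stands in for a real message queue
    var feed = [{ id: 1, text: 'older item' }];   // stands in for the database

    // Slow path: a paged API over whatever is already persisted.
    var api = http.createServer(function (req, res) {
      var page = Number(url.parse(req.url, true).query.page || 0);
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify(feed.slice(page * 10, page * 10 + 10)));
    }).listen(8080);

    // Fast path: whatever lands on the queue goes straight to subscribers.
    var wss = new WebSocketServer({ server: api, path: '/stream' });
    queue.on('item', function (item) {
      wss.clients.forEach(function (client) {
        if (client.readyState === 1) client.send(JSON.stringify(item));
      });
    });

    // Something upstream publishes a new item: stream subscribers see it immediately,
    // while the paged API only shows it once the (batched) write catches up.
    setTimeout(function () {
      queue.emit('item', { id: 2, text: 'realtime item' });
    }, 5000);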
So... moral of the story:
You don't want to persist your DB cursor. You want to ask: 1) do I need updates to be real-time, and 2) if so, how can I architect my system so that a first call gets me most of my data and a second call/mechanism keeps it up to date?
I'm using Flex 4. I have a PHP backend and a MySQL database with one table consisting of multiple rows.
I take the raw data from the result event and decode it using JSON. I then dump the data into an ArrayCollection that I use as my datagrid's data provider.
My question is: how can I tell when someone inserts a new row into the MySQL table, so that I can automatically refresh my ArrayCollection and seamlessly update my datagrid one element at a time? Right now it's a one-time call, and then the connection is closed. If someone inserts a new row into the database, my program doesn't recognize it unless I restart. I'd like to auto-update the ArrayCollection whenever a new row is inserted into the MySQL database. Is there a way I can "listen" for this change?
Ah, you've stumbled upon the age-old question of the web realm: polling or pushing?
Polling means that you ping the server every few seconds or minutes to check whether any data has changed. If it has, the server sends you the changed data, which you apply appropriately on your front end. The "protocol" for deciding which piece of data needs updating is entirely up to you, since there's no real standard (data can differ greatly from system to system). Polling is still used today in many systems that don't need crucial "live" information, and since it doesn't require a persistent connection, it's particularly good for flaky connectivity such as mobile. Plus, everything is an HTTP request, so no enterprise firewall will block it.
Pushing means that you have a constant connection between your front end and back end, which in the Flex world normally goes over RTMPT (RTMP tunnelled over HTTP to get through enterprise firewalls, though not 100% reliably). It's great if you need real-time data (say, financial data) delivered quickly. However, the user needs a stable internet connection, and you need a server capable of handling the number of connections and the session management. Most people end up using Java, since there are many libraries that handle pushing (BlazeDS, GraniteDS, LiveCycle, Wowza, etc.).
Since you're using PHP, you'll probably need to use polling as your solution and implement it yourself. I'm sure there are libraries out there to help you out, though.
No, there is no automatic way to do that. But you can regularly 'ping' your server and ask for new rows. Use
setInterval(myFunctionName, timeToWaitBetweenEachCallInMilliseconds);
to do that.
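In plain JavaScript terms (the Flex/ActionScript version is analogous), a sketch of that polling loop, assuming a made-up /rows?after=<id> endpoint that returns only rows newer than the given id:

    var lastId = 0;

    setInterval(function () {
      // Ask the backend only for rows we haven't seen yet.
      fetch('/rows?after=' + lastId)
        .then(function (res) { return res.json(); })
        .then(function (rows) {
          rows.forEach(function (row) {
            addRowToGrid(row);                    // application-specific UI update
            if (row.id > lastId) lastId = row.id;
          });
        });
    }, 5000);                                     // timeToWaitBetweenEachCallInMilliseconds

    function addRowToGrid(row) {
      console.log('new row:', row);               // placeholder for the real datagrid/ArrayCollection update
    }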