Best way to implement a push notification system - PHP

We would like to build a realtime price-change system so we can see how competitor prices differ from our own product prices.
We have no experience with push notification / Comet systems in JavaScript and PHP. If you don't mind, I would like to hear your experiences and suggestions about this technique. So here are my questions:
What is the best way to build a system like this?
We are experienced PHP developers, so is PHP appropriate for this task?
If you know of any project or solution (open source or commercial) that can do this, could you please share it?

Here is the approach that we use. JavaScript sends a regular AJAX request to a PHP file. The PHP file makes a database query and, if nothing is found, just sleeps for 0.5 second (or 1 second), then makes the database query again. If 30 seconds have passed and still nothing new has been found in the database (this is needed to give output before the HTTP timeout occurs), it outputs something like "nothing found". JavaScript starts another request immediately after it receives the output from the last one. JavaScript always keeps track of the last ID of the database table the Comet script is monitoring; this is used to query only rows with an ID greater than the last ID we have seen.
Yes, PHP is appropriate. Just remember one important thing! You need to close any open session before entering the Comet loop. PHP uses session locking to prevent two requests from writing to the same session simultaneously. If you forget to close the session, all of the user's other requests will be blocked (browsing the website will become impossible).
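A minimal sketch of such a long-polling endpoint in plain PHP is below; the price_changes table, its columns, and the connection details are assumptions for illustration, not part of the question:

```php
<?php
// Rough sketch of the long-polling endpoint described above; table and
// column names (price_changes, id) are assumptions, not from the question.
session_start();
// ... read anything you need from the session here ...
session_write_close();              // release the session lock before looping

$lastId  = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;
$pdo     = new PDO('mysql:host=localhost;dbname=prices', 'user', 'pass');
$stmt    = $pdo->prepare('SELECT * FROM price_changes WHERE id > ? ORDER BY id');
$timeout = time() + 25;             // answer before the HTTP timeout hits

do {
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($rows) {
        break;                      // something new: return it immediately
    }
    usleep(500000);                 // nothing yet: sleep 0.5 s and retry
} while (time() < $timeout);

header('Content-Type: application/json');
echo json_encode($rows);            // an empty array means "nothing found"
```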
I can recommend the open-source PHP framework that we use. It is called Stingle. It has a solid, production-ready Comet plugin.

Try socket.io: there is no need for the client to send requests to the server to get the data; the server just sends data over the socket and the client receives it.
Avoid making HTTP requests for notifications, since socket-based notifications are almost realtime.


Send info from one browser to another in Laravel 5.2.37

I have a page where users can add/update records. The code is written in Laravel 5.2.
Let's say I open that add/update page in Chrome and the same URL in Firefox. If a user creates a new record in the Chrome browser, the info should be received immediately in Firefox, so that I don't need to send an AJAX request to the server to show the complete list.
My question is, where should I start with this? Is there any blog that I can go through step by step?
You definitely need to use WebSockets to achieve this. There are a couple of good links in tiagoRL's answer. But also, since you said you are using Laravel 5.2, I strongly recommend broadcasting events. If you are a Laracasts user, take a look at the related videos.
Basically this is the main link:
https://laravel.com/docs/5.2/events#broadcasting-events
Also, to simplify the server-side stuff, I'd go for Pusher.
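As a rough sketch of a broadcastable event in Laravel 5.2 (the event name, payload, and channel are made up for illustration), you would define something like the class below and fire it with event(new RecordSaved(...)) whenever a record is saved; the other browser subscribes to the channel, e.g. with Pusher's JS client, and refreshes its list:

```php
<?php

namespace App\Events;

use Illuminate\Queue\SerializesModels;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;

// Hypothetical event; fire it with event(new RecordSaved($record->toArray()))
// whenever a record is created or updated.
class RecordSaved extends Event implements ShouldBroadcast
{
    use SerializesModels;

    public $record;

    public function __construct(array $record)
    {
        $this->record = $record;
    }

    // The channel name is made up; the other browser subscribes to it
    // (e.g. via Pusher's JS client) and refreshes its record list.
    public function broadcastOn()
    {
        return ['records'];
    }
}
```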
To have this kind of realtime messaging between two or more clients you'll need to use sockets. One option is AJAX polling, but if you want it to be truly real time, use sockets.
With sockets you can create connection tunnels between many clients, but you will still need a server implementation. Because of the persistent nature of these connections, you'll need a server architecture that can keep many connections open at the same time; that's why Node.js's non-blocking I/O comes in handy, using fewer resources than PHP would, for example.
More about this can be found here: http://www.html5rocks.com/en/tutorials/websockets/basics/
On the client side there is the WebSocket API, a feature implemented in HTML5-compliant browsers.
References:
Here is a tutorial: https://blog.kaazing.com/2012/08/08/a-step-by-step-tutorial-of-building-a-simple-peer-to-peer-websocket-app-part-1/
One server implementation, available for Node.js, is Socket.IO: http://socket.io/
The video here shows exactly what can be done with it: http://tutorial.kaazing.com/
Another good reference: https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API
This is a very normal thing to do and can be achieved with background AJAX polling. You can do it like this:
Assumption
If users must be authenticated to see the page, the same user is logged in to both browsers.
Demand is low, so server load will be minimal.
If this is not true, look into web sockets.
Structure
A route for the page (which you already have)
A route to return rows:
all rows
rows starting from a certain point
So the page loads and retrieves all rows, either server side or client side. Set the last row ID as a JavaScript variable. Then put an AJAX call on a timer. The AJAX call sends the ID of the last row already on the page; if there are new rows, they are returned and the last-row variable is updated. Alternatively, you can use timestamps to track which rows are new.
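As a rough Laravel sketch of the second route (the records table name and the URI are assumptions, not from the question):

```php
// Returns only the rows added after the given ID, so the page can append them.
Route::get('records/since/{id}', function ($id) {
    $rows = DB::table('records')
        ->where('id', '>', (int) $id)
        ->orderBy('id')
        ->get();

    return response()->json($rows);
});
```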
There is no way of sharing sessions or cookies across different browsers.
Your problem is also unrelated to the programming language / framework in which your project is written.
What I recommend is making periodic AJAX calls that fetch only the newly added rows, in order to prepend or append them to the current list.
This way, you save a lot of resources and time by not refreshing the whole list.
Although I have never used it, if you prefer a persistent connection then Socket.IO is the way to go.
You can check the following page for a comparison of AJAX and Socket.IO:
http://www.cubrid.org/blog/cubrid-appstools/nodejs-speed-dilemma-ajax-or-socket-io/
I hope it helps you.
You can start with this:
Step by Step Guide to Installing Socket.io and Broadcasting Events with Laravel 5.1 using Laravel Homestead
This example shows you how to use realtime events.
The idea in your case is to fire an event when a new record is saved or updated; when the other clients receive this event, they refresh the list of records, as in the sketch below.
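For example, a controller action might look roughly like this, assuming a hypothetical RecordSaved event class that implements ShouldBroadcast and a hypothetical Record model; names are illustrative only:

```php
<?php

namespace App\Http\Controllers;

use App\Record;                    // hypothetical Eloquent model
use App\Events\RecordSaved;        // hypothetical ShouldBroadcast event
use Illuminate\Http\Request;

class RecordController extends Controller
{
    public function store(Request $request)
    {
        $record = Record::create($request->all());

        // Broadcast the event; every client listening on the channel
        // refreshes its list when the event arrives.
        event(new RecordSaved($record->toArray()));

        return response()->json($record);
    }
}
```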

Dynamic data updating

I'm making an app which accesses a database and updates the data every few seconds via a PHP script. The problem is that it currently always updates all the data. I would like to know how to program something that dynamically updates the data and decides what to update and what not, so it basically keeps track of changes somehow. How would I best go about doing something like this?
I think this question must already have been asked somewhere, but I couldn't find it, so maybe someone can show me a website where to look.
In general, you will need to use either XHR requests, web sockets, or HTTP/2 to solve this problem. Since HTTP/2 is not universally supported on the browser side, it may not work for you. Here is the outline of the solution:
1. Every few seconds, JavaScript you provide in the browser polls the server for updates using an XHR request. You can use the returned data to update the screen with JavaScript. If you only want to do some simple updates, like updating some numbers, you might use raw JavaScript or jQuery. If your polling results in complex screen updates, or you want to move a lot of functionality into the client, you probably want to redo your client using one of the JavaScript frameworks like React or Angular.
2. Use web sockets (or HTTP/2) to create a persistent connection to the server, and have the server send updates to the clients as the data changes. This will probably require some code in the application to broadcast or multicast the updates. The client code would be similar to case 1, except that the client would not poll for updates.
The polling solution is easier to implement, and would be a good choice as long as you don't have too many clients sending polls at too high a rate - you can overwhelm the servers this way.
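A minimal sketch of the polling option's server side in plain PHP is below; the items table and its updated_at column are assumptions. The client remembers the newest updated_at value it has seen and sends it back on the next poll, so only changed rows travel over the wire:

```php
<?php
// Plain-PHP polling endpoint sketch; the items table and its updated_at
// column are assumptions, not from the question.
$since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01 00:00:00';

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT id, payload, updated_at
       FROM items
      WHERE updated_at > ?
      ORDER BY updated_at'
);
$stmt->execute([$since]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```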

How to process massive data-sets and provide a live user experience

I am a programmer at an internet marketing company that primarily makes tools. These tools have certain requirements:
They run in a browser and must work in all of them.
The user either uploads something (.csv) to process or they provide a URL and API calls are made to retrieve information about it.
They move around THOUSANDS of lines of data (think large databases). These tools literally run for hours, usually overnight.
The user must be able to watch live as their information is processed and is presented to them.
Currently we are writing in PHP, MySQL and Ajax.
My question is: how do I process LARGE quantities of data and provide a live user experience while the tool is running? Currently I use a custom queue system that sends AJAX calls and inserts rows into tables or data into divs.
This method is a huge pain in the ass and can't possibly be the correct approach. Should I be using a templating system, or is there a better way to refresh chunks of the page with A LOT of data? And I really mean a lot of data, because we come close to maxing out PHP's memory limit, which is something we always have to watch out for.
I would also love to make it so these tools could run on the server by themselves. I mean: upload a .csv, close the browser window, and then have an email sent to the user when the tool is done.
Does anyone have any methods (programming standards) for me that are better than using .ajax calls? Thank you.
I wanted to update with some notes in case anyone has the same question. I am looking into the following to see which is the best solution:
SlickGrid / DataTables
GearMan
Web Socket
Ratchet
Node.js
These are in no particular order and the one I choose will be based on what works for my issue and what can be used by the rest of my department. I will update when I pick the golden framework.
First of all, you cannot handle big data via AJAX alone. To let users watch the process live, you can use WebSockets. As you are experienced in PHP, I can suggest Ratchet, which is quite new.
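As a rough sketch of what a Ratchet push server looks like (assuming the cboden/ratchet Composer package; the class name and port are made up), it simply rebroadcasts progress messages to every connected browser:

```php
<?php
// Minimal Ratchet sketch; assumes "cboden/ratchet" installed via Composer.
require __DIR__ . '/vendor/autoload.php';

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
use Ratchet\WebSocket\WsServer;

class ProgressPusher implements MessageComponentInterface
{
    private $clients;

    public function __construct()
    {
        $this->clients = new \SplObjectStorage();
    }

    public function onOpen(ConnectionInterface $conn)
    {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        // Forward progress updates from the worker to every watching browser.
        foreach ($this->clients as $client) {
            $client->send($msg);
        }
    }

    public function onClose(ConnectionInterface $conn)
    {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}

IoServer::factory(new HttpServer(new WsServer(new ProgressPusher())), 8080)->run();
```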
On the other hand, to run the calculations and store big data I would use a NoSQL database instead of MySQL.
Since you're already kind of pinched for time, migrating to Node.js may not be the best use of it right now. It would, however, help with notifying users when the results are ready, as it can push notifications to the browser without polling. And since it uses JavaScript, you might find some of your client-side code is reusable.
I think you can run what you need in the background with some kind of Queue manager. I use something similar with CakePHP and it lets me run time intensive processes in the background asynchronously, so the browser does not need to be open.
Another plus side for this is that it's scalable, as it's easy to increase the number of queue workers running.
Basically, with PHP you just need a cron job that runs every so often and starts a worker that checks a queue table for pending tasks. If none are found, it keeps looping until one shows up.
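A minimal sketch of such a worker is below; the jobs table, its columns, and the run_task() helper are assumptions for illustration (in practice you would also add a lock so cron doesn't start a second copy):

```php
<?php
// Rough sketch of a cron-started worker; the jobs table, its columns and
// run_task() are assumptions, not an existing library.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $job = $pdo->query(
        "SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if (!$job) {
        sleep(5);                       // nothing pending yet; check again shortly
        continue;
    }

    // Mark the job as running so a second worker does not grab it too.
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
        ->execute([$job['id']]);

    run_task($job['payload']);          // hypothetical long-running import

    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}
```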

Push data to page without checking periodically for it?

Is there any way you can push data to a page rather than checking for it periodically?
Obviously you can check for it periodically with AJAX, but is there any way you can force the page to reload when a PHP script is executed?
Theoretically you can speed up the AJAX request by keeping a table just to indicate when the AJAX function should actually run (update a value in that table whenever the AJAX function should retrieve new data from the database), but this still requires a fair amount of memory and a MySQL connection, as well as some waiting time while the query executes, even when there is no update and you don't want to run the AJAX function that retrieves the database data.
Is there any way to either make this more efficient than querying the database and checking the table that stores the 'if updated' data, OR to tell the AJAX function to execute from another page?
I guess node.js or HTML5 webSocket could be a viable solution as well?
Or you could store 'if updated' data in a text file? Any suggestions are welcome.
You're basically talking about notifying the client (i.e. browser) of server-side events. It really comes down to two things:
What web server are you using? (are you limited to a particular language?)
What browsers do you need to support?
Your best option is using WebSockets to do the job; anything other than WebSockets is a hack. Still, many "hacks" work just fine, so I suggest you try Comet or AJAX long-polling.
There's a project called Atmosphere (and many more) that provides you with a solution suited to the web server you are using and will automatically pick the best option depending on the user's browser.
If you aren't limited by browsers and can pick your web stack, then I suggest using Socket.IO + Node.js. It's just my preference right now; WebSockets is still in its infancy and things are going to get interesting once it develops further. Sometimes my entire application isn't suited to Node.js, so I'll offload just the data push to it.
Good luck.
Another possibility: if you can store the data in a simple format in a file, update that file whenever the data changes and let the web server expose its timestamp.
Then the browser can poll with HEAD requests, which check the update time on the file to see if it needs an updated copy.
This avoids making a DB call for anything that doesn't change the data, at the expense of keeping file-system copies of important resources. It might be a good trade-off, though, if you can do this for active data and roll the files off after some time. You will need to ensure the file is rewritten on any call that updates the data.
It shares the synchronization risks of any systems with multiple copies of the same data, but it might be worth investigating if the enhanced responsiveness is worth the risks.
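As a small sketch of the idea (the file path and data shape are assumptions), every code path that changes the data would also rewrite a flat-file snapshot, so the web server's Last-Modified header for that file always reflects the latest change:

```php
<?php
// Sketch of keeping a flat-file copy of the data; the path and data shape
// are assumptions. Call this from every code path that changes the data so
// the browser's HEAD requests can detect updates without touching the database.
function publish_snapshot(array $data)
{
    $target = __DIR__ . '/public/data/latest.json';
    $tmp    = $target . '.tmp';

    file_put_contents($tmp, json_encode($data));
    rename($tmp, $target);   // atomic swap so readers never see a half-written file
}
```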
There was once a technology called "server push" that kept a Web server process sitting there waiting for more output from your script and forwarding it on to the client when it appeared. This was the hot new technology of 1995 and, while you can probably still do it, nobody does because it's a freakishly terrible idea.
So yeah, you can, but when you get there you'll most likely wish you hadn't.
Well, you can (or will be able to) with HTML5 WebSockets.
This page has some great info about this technology:
http://www.html5rocks.com/en/tutorials/websockets/basics/

Notification and messaging system in JavaScript and PHP (without having to install additional server-side software)

I'm running a social network with a messaging and notification feature. Each time a user sends a message or places a notification for another user, a row is inserted into a news_updates table with the details about the message or notification, and a row for each of their friends is inserted into the news_seen table. (Once the message is read, or the item related to the notification is opened, seen is set to 1. I'm doing this at the end of the callback function for my AJAX request: I gather the newsitem_ids of all the news items that are currently open and then do one big insert with all of them.)
news_seen:
newsitem_id bigint,
user_id bigint,
seen int DEFAULT '0'
At the moment I'm running an AJAX request every 3 seconds to check news_updates JOIN news_seen for news.
This turns out to be a huge server load now that I'm getting more and more users. I've been reading a lot about XMPP and the like, and I think a push notification service would be great for my site.
The only thing is, I can't really decide which way to go, since there are so many options.
I also thought about creating my own system. I'm planning to do it like this:
Create an XML file for each user on initial registration (and run a batch job for the already registered users).
Once a user sends out a news update (I have my own PHP function for writing them into the DB), include a small command to manipulate the XML files of the respective friends.
Instead of doing my 3-second AJAX request, establish a long connection to the XML file using jQuery Stream and, in case changes were made since the last request, do my usual AJAX request that polls the data from the DB.
Instead of running my check_seen inside the AJAX request, insert all the new items into a global array, to be used by an interval function that tests whether any item in the list is currently being viewed.
Do you think this is a good idea?
To be honest I do not think I would implement your specification.
For example, I would use a lighter data model than XML; I would use JSON instead.
I would avoid touching the disk (database) as much as possible, because it is slow.
You are doing two requests instead of one (long-polling plus a regular poll); I would try to avoid this.
I would also try to avoid wasting CPU time on interval functions, and only call functions when needed. I would probably use something like Redis's pub/sub.
Polling / long-polling is (most of the time) a bad idea in PHP because of blocking I/O. I found an interesting project named React (ReactPHP) which I believe does non-blocking I/O; blocking I/O is expensive. I have not tested its performance myself, but it could be an option.
For XMPP you would have to install additional software. I, for instance, liked Prosody for its easy installation and usage. Then you would need to have a look at BOSH. For your BOSH client I would use strophe.js or JSJaC.
I would probably use something like socket.io, Faye or maybe vertx.io instead, because it would scale a lot better.
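As a small sketch of the Redis pub/sub idea (assuming the phpredis extension and an illustrative channel name), the PHP code that writes a news update would also publish it; whatever long-lived process serves the browsers, be it socket.io, Faye, or a ReactPHP script, subscribes to the channel and forwards the messages:

```php
<?php
// Publish side only; assumes the phpredis extension and a made-up channel name.
// The subscriber would be the long-lived push server, not a normal PHP request.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$update = ['user_id' => 42, 'type' => 'message', 'text' => 'Hello'];
$redis->publish('news_updates', json_encode($update));
```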
