I am building a website in PHP that stores its sessions in Redis in JSON format.
This way the session can be accessed by both the PHP interpreter and a node.js server.
What I am now trying to do is add notifications to this website; the procedure I was thinking of is the following (just imagine a simple friend request to keep things simple):
user A sends friend request.
PHP uses cURL to tell the node.js service to send a notification
user B gets a notification because he is connected to node.js via socket.io
What are the general guidelines to achieve this? It seems clear to me that if users A and B are connected to different servers, this approach will be impossible to scale horizontally.
Thanks in advance.
Sounds like you could make use of WebSockets here with a publish/subscribe architecture.
WebSockets give you bidirectional server-client communication.
Node is a perfect choice for a WebSocket server; it is built for handling lots of small I/O operations.
See http://en.wikipedia.org/wiki/Web_sockets
I wouldn't think the shared session is required for PHP-node communication; just have your clients push requests through the socket and handle the responses as needed.
I think the approach you propose sounds quite reasonable. However, instead of doing a direct request to the service, you could consider using a message queue (zeromq, rabbitmq, whatever) which would allow you to scale it more easily as you can easily add logic to the queue processing to pass the message to the correct node instance.
I managed to get 1400 concurrent connections with socket.io on a cheap VPS with no special configuration so even with no tricks it should scale quite well. In my case most of these connections were also sending and receiving data almost constantly. It could probably have handled more than that, 1400'ish was simply the max number of users I happened to get.
Though I'd worry more about getting all those users first ;)
Use Redis's built in pub-sub capability. When the PHP server gets a request, it publishes it to a channel set up for that purpose. Each of your node servers subscribes to that channel and checks if the user involved is connected to it. If so, it sends the notification via socket.io. As a bonus, you avoid an extra network connection and your overall logic is simplified.
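The per-node logic this answer describes could look something like the sketch below (the channel name `notifications` is an assumption, and the Redis wiring itself is only shown in the comment; with the `redis` npm client it would be roughly `sub.subscribe('notifications', raw => onChannelMessage(JSON.parse(raw)))`):

```javascript
// Users connected to *this* node instance. In a real app this is filled by
// socket.io's 'connection'/'disconnect' handlers; values are sockets.
const localUsers = new Map(); // userId -> socket-like object exposing emit()

// Called for every message published on the shared Redis channel. Every node
// instance receives every message; only the instance that actually holds the
// target user's socket forwards it, the others simply ignore it.
function onChannelMessage(msg) {
  const socket = localUsers.get(msg.to);
  if (!socket) return false; // user is connected to some other node instance
  socket.emit('notification', msg);
  return true;
}
```

This is what makes the approach scale horizontally: PHP publishes once, and it does not need to know which node instance user B is attached to.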
Simply set up your database as required. Then, whenever an activity occurs, have node.js transfer the related information through Redis to PHP for processing, and send the response back from PHP to node.js via a channel. Keep checking the table for new notifications and display them.
Related
I'm creating a mobile app with a server backend that will authenticate a user and continuously send them updates whilst listening for post data from the mobile app. These updates will be specific to the person, pulled from a database.
From my research it seems that I should use a websocket. I'm familiar with PHP so have tried Ratchet. I've created a simple chat script with Ratchet which queries a database onMessage and sends the data to the client.
My question is, are websockets right for this? When a server receives a connection it must query the db every 5 seconds and send updated info to the app. It must listen for messages that will change the db query. Everything in Ratchet's docs seems to be focussed on subscriptions to topics rather than treating each client individually, although I've gotten around this by using:
$client = $this->clients[$from->resourceId];
$client->send("whatever_message");
Am I complicating things by using Ratchet? Or should I use a child process to handle each client?
I am sorry for the vague question. I've researched as best I can but cannot establish whether I'm heading in the wrong direction! Thank you for any help.
That is a good formula. Sending POST data from the apps while maintaining a socket connection is a good distribution of processes. However, PHP might not be your best option for running the socket server.
The reason for this is that PHP is a single-threaded language which doesn't sport an elegant event system.
Take NodeJs as an alternative. It too is single-threaded; however, you can register event handlers on socket servers, allowing the software to run additional control processes while it waits for network activity.
This does not limit you to JavaScript, however. Work can still be delegated to PHP processes from the NodeJs application (I use NodeJs as an example only; there are other options such as Java, Python, or good ol' native code).
For moving work to PHP, you can either execute commands, or use a job server to enable synchronous and asynchronous tasks.
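The per-client loop from the question (query every 5 seconds, per-connection state, messages that change the query) maps naturally onto Node's event model. A minimal sketch, where the socket API is a stand-in for a socket.io socket and `fetchUpdates` is whatever database query you delegate to:

```javascript
// Build the periodic-update function for one connected client.
// `socket` must expose on()/emit(); `fetchUpdates(query)` returns a promise
// of new rows for that client's current query.
function makeClientUpdater(socket, fetchUpdates) {
  let query = {}; // per-client state; messages from the client change it
  socket.on('setQuery', (q) => { query = q; });

  // In the real server you would run this with setInterval(tick, 5000)
  // and clearInterval() on the socket's 'disconnect' event.
  async function tick() {
    const updates = await fetchUpdates(query);
    if (updates.length > 0) socket.emit('updates', updates);
    return updates.length;
  }
  return tick;
}
```

Because each client gets its own closure, every connection is treated individually, which is the part the question found awkward in Ratchet's topic-oriented API.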
Here are a few resources you can combine to accomplish this:
http://nodejs.org/
http://socket.io/
http://gearman.org/
http://php.net/manual/en/book.gearman.php
And if you are using Symfony:
https://github.com/mmoreram/GearmanBundle
I am developing a picture-sharing application with messages. Now I want to know how to make it instant, just like Viber or WhatsApp. Polling on a timer of more than a minute is not a good idea, because the user would have to wait a long time for a message to arrive, and an HTTP request every second would put a lot of load on the server.
I have heard about socket programming but am not sure if it is the best way. Also, how would I implement it?
So the question is: what is the best way of implementing this kind of application?
I am using iOS, with PHP as the server language.
There are two approaches:
You can use your own protocol over any kind of socket you want (most probably a TCP or SSL stream), and then you need a server at the other end that keeps these connections open and sends notifications on the right connections when something happens (new message...). There are probably existing frameworks for this, though all I can think of right now are more geared towards integration with web applications rather than native ones. Note that this will only work as long as your app is active in the foreground. This also means you will get as many simultaneous connections to your server as there are apps running, so you may have a scalability issue.
or you can use Apple Push Notifications to send notifications to the app that there is something new happening. You may include all the relevant data in the payload, or you may trigger a connection by your app to fetch the rest of the data. This will work both in the background and the foreground, though slightly differently.
You may also mix both: use APNs when your app is in the background, and your own connection when it is in the foreground.
How can I get information on the client side without querying the database?
The example would be a simple chat application: two clients logged into a stream, like a chat room. One fills in a form and sends the information by AJAX to the database. The other one gets it without requesting it, as if pushed by an event listener on the database.
Is that possible?
Many thanks.
It is possible to implement non-blocking I/O with PHP in a similar vein as nodejs. see: http://reactphp.org/
I would still say PHP is probably not the right tool for the job if you're just looking to make a realtime chat app. This is what nodejs excels at.
The HTTP layer doesn't support what you wish to accomplish. You can find more information about this in this Stackoverflow page.
You might want to use node.js and socket.io. You can also try this tutorial about creating a chat system with node.js and socket.io.
You can pass information in-memory in a server like Node.js.
Chat messages would be uploaded using an AJAX POST and distributed to other clients via active SSE connections (you'd have to keep track of them, e.g. in an array).
However, without a database you don't have persistency of messages. That's fine if it's OK to lose messages when clients disconnect, but in case of a chat users may expect to receive backlog of messages sent while they were offline.
I am currently trying to make a chat application aimed at 1000-1500 users. What currently happens is that once my webpage loads, I make an AJAX request every second to check if there is anything new in the database. I want to know if this is standard practice, or if there is a more efficient way to be notified by the server when an insertion occurs.
Use WebSockets, or at the very least AJAX long polling. Firing a request every second from 1500 clients will most likely kill your server.
Look at http://socket.io/ if you are open to introduce something new to your stack. But there are PHP websocket solutions out there if you are limited to PHP.
Your approach is a standard method called polling. Based on the number of clients, this should be perfectly fine for a server with up-to-date hardware (do HEAD requests via AJAX that indicate the status via the HTTP status code).
The other alternative - as pointed out by Jan - is called Pushing.
Pros: Involves a lot less requests to the server.
Cons: Requires technology that may or may not be provided by your client's browser.
In case you'll opt for the second approach, take a look into Server-Sent Events (W3C draft).
This specification defines an API for opening an HTTP connection for receiving push notifications from a server in the form of DOM events. The API is designed such that it can be extended to work with other push notification schemes such as Push SMS.
I'm finishing a JQuery plugin, and I need to collect usage data of some activity while the plugin is active. This data needs to be stored remotely on my servers.
However, I'm trying to find the best approach to do this. It would be similar, I guess, to how web analytics data is collected. I see two options right now and I have outlined the basic steps below.
A. AJAX -
With this approach:
I use jQuery to set up an AJAX request in my jQuery plugin to POST data to a server
I set up this server to capture the POSTed data, and then use PHP to insert it into a database table
B. SOCKETS -
With this approach:
I sign up for a service like PubNub
I use the PubNub Javascript SDK in my JQuery plugin to "publish" a message to a given "channel"
I set up a dedicated server or cloud server, use SSH to log in, install a web server + database + PHP, and make sure it all works
I create my PHP script (using the PubNub PHP SDK) to "subscribe" to the pubnub "channel" my plugin will publish messages on, and then setup the mechanism to capture the data from the message and insert into a database table
I set up supervisord to daemonize my PHP script, and then start the daemon, since I need a long-lived PHP process to run the PubNub subscribe loop on the server.
I prefer A because it's the simplest option for me, I can use any hosted MySQL/PHP server to get it done with minimum hassle. However, I'm concerned how performant this approach would be when the plugin is being used in thousands of different web sites, or a few very busy websites, with potentially 10 - 100 database submissions per second on my remote server.
Would it make more sense to use approach B with sockets instead, at a potentially much higher cost because of the PubNub subscription I would require to pull this off? Also, given that I don't need asynchronous connectivity (I make only one request per user), I'm thinking sockets might be overkill compared to the trusty ol' HTTP request directly to my server (as opposed to a message being sent through PubNub, and then to me).
What really is the best way to pull something like this off?!
Thanks in advance.
You are correct, sockets are overkill. Even Google Analytics uses HTTP requests to send data. These requests aren't required to be timely (e.g. milliseconds don't matter) so there's no reason to open a socket.
Option A for sure. Additionally check out something like AppFog if you are really worried about tons of hits to your PHP script, they offer quite a bit for free and that could take the load off of your server if it gets to be an issue.
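On the plugin side, option A could be sketched like this (a sketch only: the endpoint `/collect.php` and field names are made up, and `send` is injected so in the browser it can be jQuery's `$.post` or `fetch`). Batching events into a single POST also helps with the load concern for very busy sites:

```javascript
// Collect usage events and flush them to the server in batches, so a busy
// page makes one request per `maxBatch` events instead of one per event.
function createCollector(send, maxBatch) {
  const queue = [];
  return {
    track(event) {
      queue.push(event);
      if (queue.length >= maxBatch) this.flush();
    },
    // Send everything queued so far; returns how many events were sent.
    flush() {
      if (queue.length === 0) return 0;
      const batch = queue.splice(0, queue.length);
      send('/collect.php', { events: batch }); // e.g. $.post(...) in the plugin
      return batch.length;
    },
  };
}
```

In the plugin you would also call `flush()` on page unload so the tail of the queue is not lost.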