I'm finishing a jQuery plugin, and I need to collect usage data on certain activity while the plugin is active. This data needs to be stored remotely on my servers.
However, I'm trying to find the best approach to do this. It would be similar, I guess, to how web analytics data is collected. I see two options right now and I have outlined the basic steps below.
A. AJAX -
With this approach:
I use jQuery to set up an AJAX request in my jQuery plugin to POST data to a server
I set up this server to capture the POSTed data, then use PHP to insert it into a database table
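A minimal client-side sketch of the AJAX approach, assuming a hypothetical collection endpoint and payload fields (none of these names come from the question):

```javascript
// Build the usage payload separately so it can be tested without a network.
// Field names and the endpoint below are illustrative assumptions.
function buildUsagePayload(pluginVersion, eventName, page) {
  return {
    v: pluginVersion,   // plugin version
    event: eventName,   // e.g. "init"
    page: page,         // URL of the page embedding the plugin
    ts: Date.now()      // client-side timestamp
  };
}

// In the plugin this would be posted with jQuery, e.g.:
// $.post('https://stats.example.com/collect',
//        buildUsagePayload('1.0.0', 'init', window.location.href));
```

On the server, a small PHP script would read `$_POST` and run a single parameterized INSERT.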
B. SOCKETS -
With this approach:
I sign up for a service like PubNub
I use the PubNub JavaScript SDK in my jQuery plugin to "publish" a message to a given "channel"
I set up a dedicated or cloud server, log in via SSH, install a web server + database + PHP, and make sure it all works
I create my PHP script (using the PubNub PHP SDK) to "subscribe" to the PubNub "channel" my plugin publishes messages on, then set up the mechanism to capture the data from each message and insert it into a database table
I set up supervisord to daemonize my PHP script and start the daemon, since the PubNub subscribe feature requires a long-lived PHP process running on the server
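The supervisord setup in that last step might look like the following program definition (the program name and all file paths are illustrative assumptions):

```ini
; /etc/supervisor/conf.d/pubnub-subscriber.conf
[program:pubnub-subscriber]
command=/usr/bin/php /var/www/workers/subscribe.php
autostart=true
autorestart=true                 ; restart the long-lived process if it dies
stderr_logfile=/var/log/pubnub-subscriber.err.log
stdout_logfile=/var/log/pubnub-subscriber.out.log
```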
I prefer A because it's the simplest option for me: I can use any hosted MySQL/PHP server to get it done with minimal hassle. However, I'm concerned about how well this approach would perform once the plugin is in use on thousands of different websites, or on a few very busy ones, with potentially 10-100 database submissions per second hitting my remote server.
Would it make more sense to use approach B with sockets, at a potentially much higher cost because of the PubNub subscription required to pull this off? Also, since I don't need asynchronous connectivity (I make only one request per user), I'm thinking sockets might be overkill compared to the trusty ol' HTTP request sent directly to my server (as opposed to a message being routed through PubNub and then to me).
What really is the best way to pull something like this off?
Thanks in advance.
You are correct, sockets are overkill. Even Google Analytics uses HTTP requests to send data. These requests aren't required to be timely (e.g. milliseconds don't matter) so there's no reason to open a socket.
Option A for sure. Additionally, check out something like AppFog if you're really worried about tons of hits to your PHP script; they offer quite a bit for free, and that could take the load off your server if it becomes an issue.
Related
I recently needed to work on a project that involves a chat. This chat must update in real time, and it is estimated that more than 9000 users will use it at the same time. I have done some research on how to do this and came to a conclusion: use AJAX.
While I researched on ajax, I found a problem:
Problem 1:
If there are a lot of users whose browsers are constantly making AJAX calls to fetch the chat content from the database, wouldn't that put a lot of strain on the server, and eventually won't it collapse?
There are a lot of libraries out there that could probably fulfill my needs, but I wanted to start from scratch; is that possible?
Take WhatsApp as an example: if you open dev tools you don't see it making AJAX calls, and when I receive messages it still doesn't make any. Facebook, on the other hand, makes an AJAX call when a user receives a message.
PS: I am not looking for the code, I just want a way to do it. I can code it myself. (I am using PHP with mysqli.)
You'll need to utilize WebSockets: https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API
This allows the browser to keep a connection open with the server for constant communication, both to and from the server.
The alternative is polling, which is sending periodic ajax requests, as you described.
From the Mozilla page:
With this API, you can send messages to a server and receive
event-driven responses without having to poll the server for a reply.
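For contrast, the polling alternative can be sketched as follows. The request function is injected so the loop is transport-agnostic (it could wrap `$.ajax` or `fetch`); the since-id protocol is an assumption:

```javascript
// Poll the server every intervalMs for chat messages newer than the last seen id.
// fetchMessages(sinceId) must return a Promise of an array of {id, ...} objects.
function startPolling(fetchMessages, onNewMessages, intervalMs) {
  let lastId = 0;
  const timer = setInterval(async () => {
    const messages = await fetchMessages(lastId);
    if (messages.length > 0) {
      lastId = messages[messages.length - 1].id; // remember the newest id seen
      onNewMessages(messages);
    }
  }, intervalMs);
  return () => clearInterval(timer); // returns a function that stops the loop
}
```

Every connected client runs a loop like this, which is exactly why the question's worry about server strain is justified; a WebSocket replaces all of these repeated requests with one held-open connection.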
I'm creating a mobile app with a server backend that will authenticate a user and continuously send them updates whilst listening for post data from the mobile app. These updates will be specific to the person, pulled from a database.
From my research it seems that I should use a websocket. I'm familiar with PHP so have tried Ratchet. I've created a simple chat script with Ratchet which queries a database onMessage and sends the data to the client.
My question is, are websockets right for this? When a server receives a connection it must query the db every 5 seconds and send updated info to the app. It must listen for messages that will change the db query. Everything in Ratchet's docs seems to be focussed on subscriptions to topics rather than treating each client individually, although I've gotten around this by using:
$client = $this->clients[$from->resourceId];
$client->send("whatever_message");
Am I complicating things by using Ratchet? Or should I use a child process to handle each client?
I am sorry for a vague questions. I've researched as best I can but cannot establish whether I'm heading in the wrong direction! Thank you for any help.
That is a good formula. Sending post data from the apps while maintaining a socket connection is a good distribution of processes. However PHP might not be your best option for running the socket server.
The reason is that PHP is a single-threaded language that doesn't sport an elegant event system.
Take Node.js as an alternative. It too is single-threaded, but you can register events on socket servers, allowing the software to run additional control processes while it waits for network activity.
This does not limit you to JavaScript, however. Work can still be delegated to PHP processes from the Node.js application (I use Node.js as an example only; there are other options such as Java, Python, or good ol' native code).
For moving work to PHP, you can either execute commands, or use a job server to enable synchronous and asynchronous tasks.
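A sketch of the execute-commands route, assuming a hypothetical worker script. The exec function is injected so the logic can be exercised without PHP installed; in production you'd pass `require('child_process').execFile`:

```javascript
// Run a PHP worker script from Node.js, handing it the job payload as a JSON
// argument, and resolve with whatever the script prints on stdout.
function delegateToPhp(execFile, scriptPath, payload) {
  return new Promise((resolve, reject) => {
    execFile('php', [scriptPath, JSON.stringify(payload)], (err, stdout) => {
      if (err) return reject(err);
      resolve(stdout.trim());
    });
  });
}

// Usage (hypothetical path):
// delegateToPhp(require('child_process').execFile, '/var/www/worker.php', { job: 'resize' })
//   .then(result => console.log(result));
```

A job server like Gearman does the same hand-off more robustly, with queuing and asynchronous dispatch instead of one process per call.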
Here are a few resources you can combine to accomplish this:
http://nodejs.org/
http://socket.io/
http://gearman.org/
http://php.net/manual/en/book.gearman.php
And if you are using Symfony:
https://github.com/mmoreram/GearmanBundle
I'm attempting to build a notification system for a PHP application. Every time a booking is placed, we need a notification to appear within a specified user account type inside the application.
I'm using CodeIgniter 2 on a virtual dedicated host, so I'd have the option of requesting the installation of whatever is required to get the job done.
So far, I know that PHP has limited power to trigger jQuery, in that jQuery is confined to the web browser. I know that Node.js and Socket.IO can do what I want, but how would that tie in with PHP, if at all?
I also know that a polling mechanism would be bad. I've considered a method that would send the row ID via PHP to a jQuery script within the confirmation page, which could in theory accomplish what I have in mind, but this would rely on the customer's web browser, which is a bit weak.
I've spent a couple of days fumbling around this question, since I'm only just getting to grips with jQuery, while I know hardly anything about Node.js or Socket.io, or what they can and cannot do, or — as mentioned earlier — how they connect with PHP.
Any advice would be welcome.
With real-time push methods, the server pushes data to the clients (channel subscribers) whenever an event occurs on the server. This is more advanced than pull methods such as polling, and the communication is live: the client gets updates from the server with virtually no delay, whereas with pull methods there is a time interval between queries.
Examples of real-time push methods: Faye, Pusher, Socket.IO, Slanger.
Most real-time push servers are built on Ruby or Node.js. So if you wish to set up your own real-time server, you must install one of those runtimes on your server, and you can communicate with that server from PHP using cURL calls.
Also there are php libraries available for these operations.
If you'd like to set up Slanger, you can use the Pusher PHP library itself (you may need to modify it slightly to work with Slanger). And if you'd like to use Faye, here is a PHP library I wrote myself: faye php wrapper
You could store notifications in database, with corresponding timestamp.
Then, use long polling in jQuery to receive messages, calling PHP for notifications.
A nice example was given in this answer:
How do I implement basic "Long Polling"?
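The long-polling pattern that answer describes can be sketched like this; the request function is injected, and would wrap a jQuery AJAX call to a PHP script that holds the request open until a notification row appears:

```javascript
// Repeatedly issue a request that the server holds open until a notification
// exists; as soon as one arrives, deliver it and immediately re-request.
function longPoll(fetchNotification, onNotification) {
  let stopped = false;
  (async function loop() {
    while (!stopped) {
      const note = await fetchNotification(); // server blocks until data (or times out)
      if (note !== null && !stopped) onNotification(note);
    }
  })();
  return () => { stopped = true; }; // call to stop the loop
}
```

Unlike interval polling, there is at most one outstanding request per client, and a notification is delivered the moment the server sees it.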
I am building a website in PHP that handles the sessions in Redis in JSON format.
This way the session can be accessed by both the PHP interpreter and a node.js server.
What I am trying to do is now to add notifications in said website; the procedure I was thinking of is the following: (just figure it as a simple friend request to simplify it all)
user A sends friend request.
PHP uses cURL to say to node.js service to send notification
user B gets a notification because he is connected to node.js via socket.io
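Step 3 hinges on the Node.js process keeping a map from user id to that user's socket. A minimal sketch (the map, event name, and payload shape are assumptions; the socket.io and HTTP wiring are omitted):

```javascript
// socketsByUser: Map of userId -> connected socket, populated when a client
// authenticates over socket.io. Called when PHP's cURL request reaches Node.
function notifyUser(socketsByUser, userId, payload) {
  const socket = socketsByUser.get(userId);
  if (!socket) return false;            // user B is not connected to this server
  socket.emit('notification', payload); // socket.io-style push to the browser
  return true;
}
```

The `false` branch is exactly the horizontal-scaling concern in the question: user B may be connected to a different Node instance than the one PHP called.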
What are the general guidelines to achieve this? Because it seems clear to me that if users A and B are connected to different servers, it will be impossible to scale horizontally.
Thanks in advance.
Sounds like you could make use of WebSockets here, with a publish/subscribe protocol and architecture.
WebSockets give you two-way server-client communication.
Node is a perfect choice for a WebSocket server: lots of small IO.
See http://en.wikipedia.org/wiki/Web_sockets
I wouldn't think the shared session is required for PHP-to-Node communication; just have your clients push requests through the socket and handle the responses as needed.
I think the approach you propose sounds quite reasonable. However, instead of doing a direct request to the service, you could consider using a message queue (zeromq, rabbitmq, whatever) which would allow you to scale it more easily as you can easily add logic to the queue processing to pass the message to the correct node instance.
I managed to get 1400 concurrent connections with Socket.IO on a cheap VPS with no special configuration, so even with no tricks it should scale quite well. In my case most of those connections were sending and receiving data almost constantly. It could probably have handled more than that; 1400 was simply the maximum number of users I happened to get.
Though I'd worry more about getting all those users first ;)
Use Redis's built in pub-sub capability. When the PHP server gets a request, it publishes it to a channel set up for that purpose. Each of your node servers subscribes to that channel and checks if the user involved is connected to it. If so, it sends the notification via socket.io. As a bonus, you avoid an extra network connection and your overall logic is simplified.
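A sketch of the subscriber side of that design. The message handler is a plain function so it can be tested in isolation; the subscription itself (commented out) follows the node-redis v4 API, and the channel name and message shape are assumptions:

```javascript
// Each Node server runs this for every message PHP publishes on the channel:
// deliver the notification only if the target user is connected to this server.
function handleNotification(socketsByUser, rawMessage) {
  const msg = JSON.parse(rawMessage); // e.g. '{"userId":"B","text":"friend request"}'
  const socket = socketsByUser.get(msg.userId);
  if (!socket) return false;          // user is on some other Node instance
  socket.emit('notification', msg);
  return true;
}

// const { createClient } = require('redis');
// const subscriber = createClient();
// await subscriber.connect();
// await subscriber.subscribe('notifications',
//   raw => handleNotification(socketsByUser, raw));
```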
Simply set up your database as required. Then, whenever an activity occurs, have Node.js pass the related information through Redis to PHP for processing, and send the response back from PHP to Node via a channel. Keep checking the notifications table and display what you find.
I'm working on feeding the client realtime events using event-streams and HTML5 SSEs client-side.
But some of my events will actually come from form submissions by other clients.
What's the best method for detecting these form submissions (so as to append them to the event-stream script) ASAP (after they occur)?
So essentially, I need real-time cross-script messaging between multiple instances of different scripts instantiated by different clients, analogous to X-doc messaging in JS, but for PHP.
The best I can come up with is to repeatedly poll a subdir of /tmp for notification files, which is a terrible solution.
Often you can use MySQL to play the role of the /tmp dir you mentioned. This is more portable because the scripts don't have to be on the same server, and the data is kept separate. However, the scripts will have to check the MySQL table manually to see whether the other one has handled the event. The other option is to open sockets and write back and forth in real time, or to use a prebuilt tool for exactly this purpose, which I'm pretty sure exists.
If you want the events to be triggered in near real time, you need to handle them synchronously, which means running a daemon. The simplest way to implement a daemon that can synchronize data across client connections is to use an event-based server. There's a nice implementation of the latter in PHP here; there are plenty of examples of how to daemonize a PHP process on the internet. Then just open a blocking connection to the server from your PHP code / access this via comet.
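Whichever mechanism wakes the event-stream script up, writing the event to the client uses the SSE wire format, which is simple enough to generate by hand (shown in JavaScript for brevity; the same string-building works in PHP):

```javascript
// Serialize one server-sent event: optional "id:" and "event:" fields, one or
// more "data:" lines, and a blank line to terminate the event.
function formatSseEvent(data, eventName, id) {
  let out = '';
  if (id !== undefined) out += 'id: ' + id + '\n';
  if (eventName !== undefined) out += 'event: ' + eventName + '\n';
  for (const line of String(data).split('\n')) {
    out += 'data: ' + line + '\n'; // multi-line payloads need multiple data: lines
  }
  return out + '\n';
}
```

Setting an `id:` field lets a reconnecting browser send `Last-Event-ID`, so events missed during a disconnect can be replayed.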