Can someone please tell me if socket.io is only useful when the page your clients will use is an HTML page?
I want to create a Node server that can push events to my existing PHP pages.
The pages are different and not suffixed with .html.
All the examples I have read are chatroom examples with an index.html, etc.
I simply want to know if what I want to do is even feasible.
Many thanks in advance.
When you write a PHP page, the output is HTML. I hope that answers your question.
PHP and socket.io work together. The only difference between doing it with HTML and with PHP is the way you link the two together (the common tutorial shows a way that only works with HTML, but there is another way that works with both HTML and PHP).
Try something like this (from a similar answer of mine, https://stackoverflow.com/a/25189436/3842050):
var socket = require('socket.io');
var express = require('express');
var http = require('http');
var app = express();
var server = http.createServer(app);
var io = socket.listen(server);
Then remove the app.use and app.get calls, as they are no longer needed for this approach, and add server.listen(8000); at the end of server.js. For the client-side dependency, use: <script src="//cdn.socket.io/socket.io-1.0.0.js"></script>. To run your server, go to its directory in a terminal and type node server.js, then just connect to it with your client. For events, on the server, use:
io.on('connection', function (client) {
    client.on('someEvent', function(someVariables){
        //Do something with someVariables when the client emits 'someEvent'
        io.emit('anEventToClients', someData);
    });
    client.on('anotherEvent', function(someMoreVariables){
        //Do more things with someMoreVariables when the client emits 'anotherEvent'
        io.emit('anotherEventToClients', someMoreData);
    });
});
And in your client code:
socket.emit('someEvent', variables);
socket.on('anEventToClients', function(something){
    //Code when anEventToClients is emitted from the server
});
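For completeness, here is a minimal sketch of the client page that ties it together; it could just as well be output by one of your PHP scripts. The port 8000, the CDN script, and the event names come from the snippets above; everything else is a placeholder:
<!-- Served by Apache/PHP; only the script tags matter to socket.io -->
<script src="//cdn.socket.io/socket.io-1.0.0.js"></script>
<script>
    // Connect to the Node server started with `node server.js`
    var socket = io('http://localhost:8000');

    // Send an event to the server
    socket.emit('someEvent', { example: 'data' });

    // React to events pushed back from the server
    socket.on('anEventToClients', function(something){
        console.log(something);
    });
</script>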
Related
I want to serve (mostly) static content with Apache, as that's what I'm comfortable with, but I want Node.js to handle server-sent events. Both are on the same machine.
The problem is, if I set up my sseListener.html in my Apache server like so:
sseListener.html (Apache)
<body>
    <div id="test"></div>
    <script type="text/javascript">
        var source = new EventSource("http://localhost:8888/test2js.js");
        var test = document.getElementById("test");

        source.addEventListener("message", function(e){
            test.innerHTML = "";
            test.innerHTML = JSON.parse(e.data).test;
        }, false);

        source.onopen = function(){
            console.log("open: ");
        };
        source.onclose = function(){
            console.log("close: ");
        };
        source.onerror = function(){
            console.log("error: ");
        };
    </script>
</body>
I get this error in the console:
EventSource cannot load http://localhost:8888/test2js.js. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost' is therefore not allowed access.
The following is my Node server which the above script is trying to communicate with:
test2js.js (Node)
var http = require("http");
var date = new Date();

function onRequest(request, response) {
    console.log("Request received.");
    response.writeHead(200, {"Content-Type": "text/event-stream"});
    response.write("{ \"id\": \"" + date + "\", \"data\": \"test\"}");
    response.end();
}

http.createServer(onRequest).listen(8888);
console.log("Server has started.");
I realize this is because I'm trying to use an EventSource client served via Apache to communicate with a server hosted by Node, just like how an Ajax call would fail due to cross-domain issues.
Reading around the internet, I know I can just set up a proxy from Apache to Node, but I've also read that this defeats my purpose of having Node handle the concurrent connections - Apache would still set up threads for its communication with Node, which I was hoping to avoid entirely.
As I understand it, that proxied setup means each request hits Apache first and is then forwarded on to Node.
But I don't like that the requests have to route through Apache before reaching Node. I want the requests to go straight to Node.
I realize there are two "obvious" ways to do this:
Set up my whole app in Node -> This is not a good option for me, as I am much more comfortable with PHP and my JavaScript ability is not as good as my PHP.
Just handle the SSE with Apache -> I'd rather not do this either. The server I'm running on isn't actually mine alone; I'm only mounting my app on it and "borrowing" the space, so I'd like to implement something as lightweight as I can muster.
So thinking about the problem, I came up with a solution I'm not sure will work, but I don't know how to implement it either: make Apache fetch the client js code and link it to sseListener.html, like so:
<body>
    <div id="test"></div>
    <script type="text/javascript" src="path/to/node/file/system/sseClient.js"></script>
</body>
The way I'm guessing this would work: save the client.js file (with the EventSource) in the Node directory, have Apache grab that file and attach it somehow, and serve that to the client. Then, when the client makes a request, it will go straight to Node rather than Apache.
So my questions:
Is this possible?
If so, how can I implement this?
Otherwise, are there any other methods to serve files to clients with Apache but offload the SSE side to Node, without making Apache handle those connections as well?
Literally, all I want my Node server to do is push data (which Apache will generate) to clients. It won't do anything complex at all. I just want to use its ability to handle concurrent connections to make a more efficient app.
I think a proxy is a bad idea for your aims. Try this:
var source = new EventSource("http://localhost:8888/test2js.js");
var test = document.getElementById("test");

source.onmessage = function(e){
    test.innerHTML = "";
    var data = JSON.parse(e.data);
    test.innerHTML = data.test + data.date;
};
In your test2js.js:
function onRequest(request, response) {
    console.log("Request received.");
    response.writeHead(200, {
        "Content-Type": "text/event-stream",
        "Cache-Control": "no-cache",
        "Access-Control-Allow-Origin": "*"
    });
    // Quote the date so the client's JSON.parse(e.data) succeeds
    response.write("data: {\"date\":\"" + date + "\",\"test\":\"some value\"}\n\n");
    response.end();
}
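Note that this handler sends a single event and then closes the stream. For a long-lived SSE feed you would keep the response open and keep writing data: frames; here is a minimal sketch under the same assumptions (port 8888, CORS open to any origin), with the two-second interval and payload as placeholders:
var http = require("http");

http.createServer(function (request, response) {
    response.writeHead(200, {
        "Content-Type": "text/event-stream",
        "Cache-Control": "no-cache",
        "Access-Control-Allow-Origin": "*"
    });

    // Keep the connection open and push an event every two seconds
    var timer = setInterval(function () {
        var payload = { date: new Date().toISOString(), test: "some value" };
        response.write("data: " + JSON.stringify(payload) + "\n\n");
    }, 2000);

    // Stop writing when the client disconnects
    request.on("close", function () {
        clearInterval(timer);
    });
}).listen(8888);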
I have been working with jQuery/AJAX requests. I have successfully made an AJAX request that retrieves data from a database; the problem is that I'm constantly using window.setInterval() to re-run this function every x seconds.
How would I change this to keep the AJAX request alive, so it updates the HTML content without having to send multiple requests to my AJAX script?
My code follows:
window.setInterval(function()
{
    $(function ()
    {
        $.ajax({
            url: 'Ajax.php' + SearchTerm, dataType: 'json', success: function(rows)
            {
                $('#NumberOfVotes').empty();
                for (var i in rows)
                {
                    var row = rows[i];
                    var QuestionID = row[0];
                    var Votes = row[1];
                    $('#NumberOfVotes')
                        .append(Votes);
                }
            }
        });
    });
}, 500);
A lot of this depends on how your server would be able to update its content dynamically. That said, what you are looking for is WebSockets. WebSockets are designed to replace the long-polling paradigm.
EDIT: Since you mainly use PHP for your server technology, look at Ratchet. I've heard good things about it: http://socketo.me/
Here is an excellent article on using websockets with HTML
http://net.tutsplus.com/tutorials/javascript-ajax/start-using-html5-websockets-today/
.NET has a great socket library in SignalR
http://signalr.net/
There is a myriad of PHP documentation on sockets out there:
http://php.net/manual/en/book.sockets.php
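For reference, the browser side of a WebSocket connection is only a few lines, whichever server library you pick. A minimal sketch (the ws:// URL and the message format are assumptions, loosely mirroring the rows/votes structure from the question):
var ws = new WebSocket("ws://localhost:8080/votes"); // placeholder URL

ws.onopen = function () {
    console.log("connected");
};

ws.onmessage = function (event) {
    // Assume the server pushes the same rows structure the AJAX call returned
    var rows = JSON.parse(event.data);
    $('#NumberOfVotes').empty();
    for (var i = 0; i < rows.length; i++) {
        $('#NumberOfVotes').append(rows[i][1]);
    }
};

ws.onclose = function () {
    console.log("disconnected");
};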
Look into using WebSockets - you could send the client a message any time it needs to go and look for new data; that way you're not making any unnecessary requests. Try checking out PubNub - the service is cheap and could handle everything you need.
You could set xhr.multipart = true and modify the server code; see the Multipart Responses example code. An alternative is to use WebSockets, as mentioned.
You need something server-side that keeps the request alive until it has something to return. This is usually called "Comet", "long polling" or "push".
The principle is:
You send a request client-side via AJAX
Your server receives the request, and doesn't return a response yet. It sleeps/waits until it has something to return
A new entry arrives in your database! Your server now has something to return: it returns some JSON data for the waiting request
You receive the response client-side, display what you have to display, and go back to step 1, sending another request
Now, the implementation server side will depend on the language/framework you are using.
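On the client side, the pattern looks roughly like this; a minimal sketch in the same jQuery style as the code above (the Ajax.php endpoint, the SearchTerm variable, and the response shape are assumptions carried over from the question):
function poll() {
    $.ajax({
        url: 'Ajax.php' + SearchTerm,   // assumed endpoint from the question
        dataType: 'json',
        timeout: 30000,                 // give the server time to wait for new data
        success: function (rows) {
            // Update the page with whatever the server finally returned
            $('#NumberOfVotes').empty();
            for (var i = 0; i < rows.length; i++) {
                $('#NumberOfVotes').append(rows[i][1]);
            }
        },
        complete: function () {
            // Whether it succeeded or timed out, immediately start the next long poll
            poll();
        }
    });
}

poll();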
Edit:
Some examples using PHP:
Comet and PHP
Simple Comet Implementation Using PHP and jQuery
I've been working on a project for a couple of Minecraft servers that use Bukkit. I'm trying to create a web page that contains a dynamic map of the servers' worlds, as well as a real-time event update system, where a <div> is updated as events happen on the server. To give a brief outline of how my system works, the Minecraft servers communicate events with a Node.js webserver over the same network via UDP packets, and the Node.js webserver uses these packets to build JavaScript objects containing the event info. The objects are then stored, and passed to Jade whenever the page is requested. Jade takes care of the templating.
What I want to do is update this page dynamically, so that the user doesn't have to refresh the entire page to update the list of events. What I'm trying to implement is something like the Facebook ticker, which updates every time a Facebook friend does something like posting a status, commenting on a post, or 'liking' a post.
In reading this question on SO, I've concluded that I need to use long polling in a PHP script, but I'm not sure of how to integrate PHP with a webserver written almost entirely in Node.js. How could I go about doing this?
EDIT:
I've run into a problem in the clientside code.
This is the script block:
script(src='/scripts/jadeTemplate.js')
script(src='/socket.io/socket.io.js')
script(type='text/javascript')
    var socket = io.connect();
    socket.on('obj', function(obj) {
        var newsItem = document.createElement("item");
        jade.render(newsItem, 'objTemplate', { object: obj });
        $('#newsfeed').prepend(newsItem);
        console.log(obj);
        alert(obj);
    });
And this is objTemplate.jade:
p #{object}
// That's it.
When the alert() and console.log() are placed at the top of the script, it alerts and logs, but at the bottom they don't execute (hence, I think it's a problem with either the creation of newsItem, the jade.render(), or the prepend).
If I need to provide any more snippets or files let me know. I'm still tinkering, so I might solve it on my own, but unless I update, I still need help. :)
I'd skip PHP and take a look at socket.io. It uses WebSockets when possible, but it will fall back to long polling when necessary, and the client-side library is very easy to use.
Whenever your Node.js server has a new object ready to go, it will push it to all connected browsers. Use ClientJade to render the object using your template (you may have to break out the relevant part of the main template into its own file), then prepend the generated DOM element to your feed.
First, if it isn't this way already, you'll need to break out the relevant part of your jade template into its own file. Call it objTemplate.jade. Then use ClientJade to create a compiled template that can be run in the browser: clientjade objTemplate.jade > jadeTemplate.js. Put jadeTemplate.js in your public js directory.
In your node.js app, you'll have something like this (pseudo-codey):
var io = require('socket.io').listen(httpServer);

listenForUDPPackets(function(obj) {
    saveObjSomewhere(obj);
    io.sockets.emit('obj', obj);
});
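listenForUDPPackets is the pseudo-code part; one way to fill it in is with Node's built-in dgram module. A rough sketch, assuming the Bukkit servers send one JSON-encoded event per UDP datagram and that port 41234 is the agreed port (both assumptions):
var dgram = require('dgram');

function listenForUDPPackets(callback) {
    var udpSocket = dgram.createSocket('udp4');

    udpSocket.on('message', function (msg, rinfo) {
        // Each datagram is assumed to be a single JSON-encoded event object
        try {
            callback(JSON.parse(msg.toString()));
        } catch (err) {
            console.error('Bad packet from ' + rinfo.address + ': ' + err.message);
        }
    });

    udpSocket.bind(41234);
}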
Then on the client, something like this:
<script src="/js/jadeTemplate.js"></script>
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect();
    socket.on('obj', function(obj) {
        // createElement needs a tag name; a plain div works as a container
        var newsItem = document.createElement("div");
        jade.render(newsItem, 'objTemplate', obj);
        $('#newsFeed').prepend(newsItem);
    });
</script>
So I'm running a crawler on my server and I'm needing to execute javascript to gain access to some of the data on my target site (target being the one I want to crawl). I had a question regarding a different approach to the problem here, but it's not needed for answering this one: [Dead]How to successfully POST to an old ASP.NET site utilizing Asynchronous Postback
My JavaScript is executed in the browser I call my PHP crawler from. The problem is that all JavaScript requests are targeted back at my own server rather than the target site (I get led to links like /index.php on my own site rather than the target site).
My experience with JavaScript is pretty minimal and I'm not sure how I should redirect my requests to my target. Here is an example of a JavaScript function from the page that I'm calling:
<script type="text/javascript">
//<![CDATA[
var theForm = document.forms['aspnetForm'];
if (!theForm) {
    theForm = document.aspnetForm;
}
function __doPostBack(eventTarget, eventArgument) {
    if (!theForm.onsubmit || (theForm.onsubmit() != false)) {
        theForm.__EVENTTARGET.value = eventTarget;
        theForm.__EVENTARGUMENT.value = eventArgument;
        theForm.submit();
    }
}
//]]>
</script>
... and the way that I call it:
echo "<SCRIPT language='javascript'>__doPostBack('-254870369', '')</SCRIPT>";
Is there some way of aliasing the server address from my own server to the target server or doing some other kind of handy workaround that would fix this problem?
There is no need to inject JavaScript into the target.
You can use Wireshark to study all the requests made by the target. Wireshark is quite hard to master but powerful. Alternatively, you can try the Net tab of the Firebug addon.
Once you know how the target sends requests to and receives data from its server, you can use curl to imitate that request/response exchange. You don't need anything more to build crawlers.
If this doesn't answer your question, explain the scenario a little more.
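To make that concrete: a __doPostBack call just submits the page's form with __EVENTTARGET and __EVENTARGUMENT filled in, so the crawler can replay that POST itself instead of executing the JavaScript. A rough sketch in Node.js, since the rest of this thread is Node (the host, path, and hidden __VIEWSTATE/__EVENTVALIDATION values you would scrape from the target page are all assumptions):
var http = require('http');
var querystring = require('querystring');

// Form fields captured from the target page (values here are placeholders)
var postData = querystring.stringify({
    '__EVENTTARGET': '-254870369',
    '__EVENTARGUMENT': '',
    '__VIEWSTATE': 'scraped-viewstate-value',
    '__EVENTVALIDATION': 'scraped-validation-value'
});

var req = http.request({
    host: 'target-site.example.com',
    path: '/TargetPage.aspx',
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(postData)
    }
}, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        console.log(body);   // the HTML the postback would have produced
    });
});

req.write(postData);
req.end();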
I am working on using Node.js's connection abilities to continuously long poll a PHP script, and I was wondering if anyone knows the theory behind (maybe even a code sample of) linking to PHP in Node.js. I was thinking I need a new client; I add a request for the PHP page, then add a response event listener which fires an event function that grabs the returned data and throws it back into the server function.
I am, however, a noob and need some initial guidance, since their API documentation is not the easiest to read; the English is too wordy and it's white font on a dark background... not nice.
Thanks,
var sys = require('sys'),
    http = require('http'),
    url = require("url"),
    path = require("path"),
    events = require("events");

var twitter_client = http.createClient(80, "192.168.23.128");
var tweet_emitter = new events.EventEmitter();

// Poll the PHP page and emit a "tweets" event whenever it returns data
function get_tweets() {
    var request = twitter_client.request("GET", "/?url=ajax/session", {"host": "192.168.23.128"});

    request.addListener("response", function(response) {
        var body = "";
        response.addListener("data", function(data) {
            body += data;
        });
        response.addListener("end", function() {
            sys.puts(body);
            var tweets = JSON.parse(body);
            if (tweets.length > 0) {
                tweet_emitter.emit("tweets", tweets);
            }
        });
    });

    request.end();
}

setInterval(get_tweets, 5000);

// Long-poll endpoint: hold each client request open until new tweets arrive
http.createServer(function (req, res) {
    sys.puts("accessed Server");
    res.writeHead(200, {'Content-Type': 'text/plain', "Access-Control-Allow-Origin": "*"});

    tweet_emitter.once("tweets", function(tweets) {
        res.end(JSON.stringify(tweets));
    });
}).listen(8124);
sys.puts('Server running at http://127.0.0.1:8124/');
This seemed to work; it's taken from a mixture of other tutorials.
Was just doing some research on this topic, and wanted to drop in an answer for anyone that might be looking to do the same thing.
The comments on the OP made good points as to whether or not this sort of thing would be an efficient use of resources, or a waste of Node's event-based processing abilities. I would say that passing requests on to an Apache/PHP server would be inefficient, because you're essentially doing the same thing as having recurring AJAX requests sent to the Apache server. The only difference is you now have a man-in-the-middle sending the requests.
Apache is still serving requests just as it always does; it is just serving them to the Node.js server rather than the client. This does not build in any efficiencies, other than taking a bit of load off the client and placing it on the server.
The correct way to do this, as @Paul mentioned, is to use some sort of PHP processor that will allow you to bypass Apache. There are some fancy methods for getting this done using FastCGI and PHP-FPM; they're pretty high level, so you might have some trouble integrating them into Node.js on your own.
On the bright side, there's a Node module already being built to do just this: node-php. It's pretty young ("omega-alpha-super-beta-proof-of-concept"), but it may be able to handle what you're trying to do. If it can't, at least it's a good starting point, and you can fork off to make your own additions.