Sending messages from PHP to Node.js

How do I send messages from PHP to Node.js? I have a Linux server running PHP and Node.js.
When a user completes a transaction (via PHP), I'd like to send a message from PHP to Node.js. Node will then update the client via a socket connection.
What's a good way to send a small amount of data from PHP to Node.js without hurting the performance of Node.js?

The suggestion seems to be to talk to Node through its HTTP interface, just as any other client does. You can talk to Node over HTTP using cURL in PHP.
See: http://groups.google.com/group/socket_io/browse_thread/thread/74a76896d2b72ccc/216933a076ac2595?pli=1
In particular, see this post from Matt Pardee
I faced a similar problem with wanting to keep users informed of a new
note added on to a bug, and similar notifications that could really
only be effectively sent from PHP to my Node server. What I did
follows (apologies if this gets all garbled and unformatted in
sending, if it does, I'd be happy to paste the code somewhere else):
First, you'll need to use cURL from PHP. I wrote a function for my
class like this:
function notifyNode($type, $project_id, $from_user, $data) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, 'http://127.0.0.1');
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));
    curl_setopt($ch, CURLOPT_PORT, 8001);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
    curl_setopt($ch, CURLOPT_POST, true);
    $pf = array('f' => $type, 'pid' => $project_id, 'user_from' => $from_user,
        'data' => array());
    foreach ($data as $k => $v) {
        $pf['data'][$k] = $v;
    }
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($pf));
    curl_exec($ch);
    curl_close($ch);
}
You'll notice that I send the cURL request to the same server, since both PHP and Node.js are running there; your mileage may vary. The port
I set this code to connect to is 8001 (this is the port my Node server
is running on, and the port the socket.io server connects to). This
sends an HTTP POST request with the post fields encoded. This is all
pretty standard cURL stuff.
In your Node app you probably have something like:
var server = http.createServer(function(req, res) {});
server.listen(8001);
var io = require('socket.io').listen(server, { transports: ['websocket', 'flashsocket', 'xhr-polling'] });
...
Well, what we'll do here is expand on the http.createServer part to
listen for connections coming from our local host ("127.0.0.1"). The
createServer code then becomes:
// Requires the formidable module for parsing the POST body: npm install formidable
var formidable = require('formidable');

var server = http.createServer(function(req, res) {
    // Check for notices from PHP
    if (res.socket.remoteAddress == '127.0.0.1') {
        if (req.method == 'POST') {
            // The server is trying to send us an activity message
            var form = new formidable.IncomingForm();
            form.parse(req, function(err, fields, files) {
                res.writeHead(200, { "Content-Type": "text/plain", "Content-Length": 0 });
                res.write('');
                res.end();
                //sys.puts(sys.inspect({fields: fields}, true, 4));
                handleServerNotice(fields);
            });
        }
    }
});
From there you can implement your handleServerNotice function:
function handleServerNotice(data) {
...
}
etc etc. I haven't tested this in a while, and in fact that code block
was commented out on my Node server, so I hope what I've pasted here
works - in general this concept is proven and I think it'll work for
you. Anyway, I just wanted to be sure you knew it's been a few months,
so I'm not sure exactly why I commented it out. The code I wrote took a
little research -- like setting the 'Expect:' header in cURL -- and I
was pretty excited when it finally worked. Let me know if you need any
additional help.
Best,
Matt Pardee

A bit late, but you could communicate with your node client using the Redis Pub/Sub mechanism in a very simple and effective way. All you need to do is install redis on your server.
On the PHP side, initialize a Redis client and then publish a message (using the phpredis extension, for example):
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$purchase_info = json_encode(array('user_id' => $user_id,
    'purchase_information' => array('item' => 'book', 'price' => '2$')));
$redis->publish('transaction_completed', $purchase_info);
On the node.js side
var redis = require('redis');
var purchase_listener = redis.createClient();
purchase_listener.subscribe('transaction_completed');
purchase_listener.on('message', function(channel, message) {
    var purchase_data = JSON.parse(message);
    user_id = purchase_data.user_id;
    purchase_info = purchase_data.purchase_information;
    // Process the data
    // And send confirmation to your client via a socket connection
});
Is this scalable? (In response to #mohan-singh)
When talking about scalability, you need to think about your infrastructure's architecture and your particular needs, but here's a quick answer:
I've been using a variant of this mechanism on a high-traffic real-time application without problems, but here's what you should be careful about:
Redis Pub/Sub is not a queuing system, which means that if your Node process goes down, all the messages sent while it is down will be lost.
If you have more than one subscriber to the publisher, they will all receive the same message and handle it; be careful about that if you have more than one Node process listening to the same Redis db handling your real-time logic (there are easy ways to work around this, though; see the sketch after these caveats).
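For instance, if losing messages while the Node process is down is unacceptable, one common workaround (a sketch, not part of the answer above; the 'transactions_queue' key name is made up) is to also push each payload onto a Redis list so a restarted worker can catch up:
// In addition to PUBLISH, keep a durable copy of the payload in a list.
// A Node worker can drain it with BRPOP/LRANGE after a restart.
$redis->lPush('transactions_queue', $purchase_info);
$redis->publish('transaction_completed', $purchase_info);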
The nice thing about this system is that you don't need to add anything to your existing infrastructure and can get started immediately, it's very fast and it behaves exactly like an HTTP server.
Here are your alternatives for more scalable options:
Using a self-hosted fast message queue server (ActiveMQ, RabbitMQ, beanstalkd ...) to handle your messaging logic between PHP and Node. These tend to be fast, but as the load increases you lose a bit of performance, and you have to maintain/scale your messaging servers and take care of duplication across regions, which is not an easy or enjoyable thing (depending on what you enjoy doing).
Using a hosted messaging queue server (IronMQ, SQS ...). Some of these (IronMQ) are pretty fast and would be great for your use case, but they introduce some (minor) complexity to your codebase.
Building a messaging queue with Redis with clustered Node servers: https://davidmarquis.wordpress.com/2013/01/03/reliable-delivery-message-queues-with-redis/
Using HTTP inside a VPN to communicate with your Node servers. Once you see your traffic spiking, you will only need to load-balance your Node servers, add as many stateless servers as you need, and send POST messages to that load balancer.
The point of this lengthy edit is that there is no such thing as a magic scalable solution; you need to weigh your options and see which one works best for your use case.
In my opinion, if you're starting to build your first iteration now, choose any option that you're comfortable with and write very clean code; when you start scaling, it will be very easy to change. This is what I've done :)

I found that this kind of problem can be solved simply by using the Express framework.
Let's suppose PHP sends a JSON message to the Node server and the server replies with "ok".
In app.js
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var bodyParser = require('body-parser');
app.use(bodyParser.json());
app.post('/phpcallback', function(req, res) {
    var content = req.body;
    console.log('message received from php: ' + content.msg);
    //to-do: forward the message to the connected nodes.
    res.end('ok');
});
http.listen(8080, function() {
    var addr = http.address();
    console.log('app listening on ' + addr.address + ':' + addr.port);
});
In test.php
<?php
$data = array("name" => "Robot", "msg" => "Hi guys, I'm a PHP bot !");
$data_string = json_encode($data);
$ch = curl_init('http://localhost:8080/phpcallback');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
curl_setopt($ch, CURLOPT_POSTFIELDS, $data_string);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Content-Type: application/json',
    'Content-Length: ' . strlen($data_string)
));
echo curl_exec($ch)."\n";
curl_close($ch);
?>
Here is also a more detailed example where a PHP script drops a message to the users of a specific chat room:
https://github.com/lteu/chat
My personal impression of the Redis approach: cumbersome. You need to run Apache, Node.js, and Redis, three servers, on the same machine. Also, the Pub/Sub mechanism is quite different from socket.io's emit, so you need to check whether it is compatible with your existing code.

I was looking for a really simple way to get PHP to send a socket.io message to clients.
This doesn't require any additional PHP libraries - it just uses sockets.
Instead of trying to connect to the WebSocket interface like so many other solutions do, just connect to the Node.js server and use .on('data') to receive the message.
Then, socket.io can forward it along to clients.
Detect a connection from your PHP server in Node.js like this:
//You might have something like this - just included to show object setup
var app = express();
var server = http.createServer(app);
var io = require('socket.io').listen(server);
server.on("connection", function(s) {
//If connection is from our server (localhost)
if(s.remoteAddress == "::ffff:127.0.0.1") {
s.on('data', function(buf) {
var js = JSON.parse(buf);
io.emit(js.msg,js.data); //Send the msg to socket.io clients
});
}
});
Here's the incredibly simple PHP code - I wrapped it in a function; you may come up with something better.
Note that 8080 is the port of my Node.js server - you may want to change it.
function sio_message($message, $data) {
    $socket = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
    $result = socket_connect($socket, '127.0.0.1', 8080);
    if (!$result) {
        die('cannot connect ' . socket_strerror(socket_last_error()) . PHP_EOL);
    }
    $bytes = socket_write($socket, json_encode(array("msg" => $message, "data" => $data)));
    socket_close($socket);
}
You can use it like this:
sio_message("chat message","Hello from PHP!");
You can also send arrays which are converted to json and passed along to clients.
sio_message("DataUpdate",Array("Data1" => "something", "Data2" => "something else"));
This is a useful way to "trust" that your clients are getting legitimate messages from the server.
You can also have PHP pass along database updates without having hundreds of clients query the database.
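For example, a hypothetical sketch of calling sio_message() right after a database write (the PDO connection, table, and column names here are made up for illustration):
// Hypothetical: after updating a row, push the change to connected clients
// through Node instead of having them poll the database.
$stmt = $pdo->prepare('UPDATE scores SET points = points + ? WHERE user_id = ?');
$stmt->execute(array(10, $user_id));

sio_message('DataUpdate', array('user_id' => $user_id, 'points_added' => 10));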
I wish I'd found this sooner - hope this helps! 😉

We do it by using a message queue. There are a lot of solutions, like Redis (https://github.com/mranney/node_redis) or ZeroMQ (http://zeromq.org/). They allow you to send a message to subscribers (for example, from PHP to Node.js).

Step 1. Get the PHP Emitter:
https://github.com/rase-/socket.io-php-emitter
$redis = new \Redis(); // Using the Redis extension provided client
$redis->connect('127.0.0.1', '6379');
$emitter = new SocketIO\Emitter($redis);
$emitter->emit('new question', '<b>h<br/>tml</b>');
add this to your index.js:
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
io.on('connection', function(socket) {
    socket.on('new question', function(msg) {
        io.emit('new question', msg);
    });
});
add something like this to your index.html
socket.on('new question', function(msg) {
    $('body').append(msg);
});

Related

Trying to TCP-connect a Laravel app to a Workerman server

I have a server running and waiting/listening for connections. The server is based on the Mark Framework, which uses Workerman. So far I'm able to start the server, and when I load the URL/host in the browser it shows content (in this case I'm expecting a simple "Hello world").
This is the index.php I use to start the server:
use Mark\App;
require 'vendor/autoload.php';
$api = new App('http://127.0.0.1:8080');
$api->count = 2; // process count
$api->any('/', function ($request) {
    return 'Hello world';
});
$api->start();
Now, I have a simple Laravel app, and I want it, when I open a certain page, to connect to that server and show the content from it.
I'm not sure how to do this. What I have so far in the controller is this:
public function index() {
    try {
        $host = "127.0.0.1";
        $port = 8080;
        $resource = stream_socket_client("tcp://$host:$port", $error_no, $error_str, 20, STREAM_CLIENT_CONNECT);
        $lines = stream_get_line($resource, 8192);
        var_dump($lines);
    } catch (Exception $e) {
        return response($e->getMessage(), 500);
    }
    var_dump($resource);
    return View::make('index');
}
var_dump($resource), which holds the connection, shows
resource(10) of type (stream)
var_dump($lines); shows false, which I guess means the connection isn't actually returning anything:
bool(false)
Any ideas here on how to approach this?
A TCP connection does not rely on any application protocol; it's just a pure data-transfer protocol that lets you transfer bytes with ACK confirmation. You say TCP, yet you include the Mark Framework, which is an HTTP-based process.
The docs say it helps you quickly write APIs with PHP. That means it's not a raw TCP server with a custom protocol but an HTTP-oriented server, essentially a REST API ready for HTTP GET/POST requests to make your life easier.
You can do the same things with Laravel on both sides unless you need a realtime connection. Realtime socket connections are made via JavaScript because HTTP is not built for this.
From the Laravel side, just read: https://laravel.com/docs/9.x/http-client#making-requests
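For illustration, here is a minimal sketch of what the controller could look like using Laravel's HTTP client instead of a raw socket (it assumes the Mark/Workerman server from index.php above is reachable at 127.0.0.1:8080; the view data key is made up):
use Illuminate\Support\Facades\Http;

public function index()
{
    // Plain HTTP GET against the Workerman/Mark server started by index.php.
    $response = Http::timeout(20)->get('http://127.0.0.1:8080/');

    if ($response->failed()) {
        return response('Could not reach the Workerman server', 500);
    }

    // $response->body() should contain "Hello world" from the Mark app.
    return View::make('index', ['serverMessage' => $response->body()]);
}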
I might have overlooked something, but raw sockets are rarely used on the web. By saying TCP you asked for 'sockets', yet your post seems to negate your need for them.

How to get a response from a server using cURL in PHP

I am using cURL to get data from a server. The way it works is the following:
A device sends data to a routing application which runs on the server.
To get the data from the routing application, clients must ask with a GET request specifying the server address, port, and parameters.
Once a client is connected, the application starts sending data to the connected clients on every new packet that arrives from the device.
Now let's see the code I run to get the response:
<?php
$curl = curl_init('http://192.168.1.4/online?user=dneb');
curl_setopt($curl, CURLOPT_PORT, 1818);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
$result = curl_exec($curl);
curl_close($curl);
echo $result;
With this cURL request I can get the response data from the routing application. But the routing application never stops sending data to connected clients, so I only get the result if I close the routing application, and then it echoes all the data at once. Now my question is: how can I echo each piece of data without closing the connection (or having the connection closed by the routing application)? I.e., when data is received, display it immediately, without any conditions. You can also suggest other options to forward this data to another server using TCP. Thanks!
An HTTP connection that never closes? I don't think PHP's cURL bindings are suitable for that. But you could use the socket API:
$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_set_block($sock);
socket_connect($sock, "192.168.1.4", 1818);
$data = implode("\r\n", array(
    'GET /online?user=dneb HTTP/1.0',
    'Host: 192.168.1.4',
    'User-Agent: PHP/'.PHP_VERSION,
    'Accept: */*'
))."\r\n\r\n";
socket_write($sock, $data);
// socket_read() returns false on error and "" once the peer closes the connection.
while (false !== ($read_last = socket_read($sock, 1)) && '' !== $read_last) {
    // do whatever
    echo $read_last;
}
var_dump("socket_read returned false or '', probably means the connection was closed.",
    "socket_last_error: ",
    socket_last_error($sock),
    "socket_strerror: ",
    socket_strerror(socket_last_error($sock))
);
socket_close($sock);
or maybe even http fopen,
$fp = fopen("http://192.168.1.4:1818/online?user=dneb", "rb");
stream_set_blocking($fp, true);
// fread() returns "" at EOF and false on error, so stop on either.
while (!feof($fp) && false !== ($read_last = fread($fp, 1))) {
    // do whatever
    echo $read_last;
}
var_dump("fread stopped, probably means the connection was closed, last error: ", error_get_last());
fclose($fp);
(I don't know whether fopen can use ports other than 80; also, this won't work if you have allow_url_fopen disabled in php.ini.)

PHP Socket and TCP/IP Command

I have a device with relays connected to my network. I am able to connect to the device via its built-in URL host to turn the relays on and off. What I would like is to be able to send commands to the device to turn the relays on and off, either via PHP or VB code, or by making my own ASP URLs for each relay that do the same thing. I am using Visual Studio 2013.
I have the IP of the device and the port number.
I need to send it a 6-byte command, for example:
0xFD,0x2,020,1,0,0x5d
This command tells Relay 1 to turn on.
0xFD,0x2,020,1,1,0x5d
This command tells Relay 1 to turn off.
Any help with this would be greatly appreciated. Thank you!
I think you may want to check out fsockopen
You can do something like:
$socket = fsockopen($ip, $port);
if ($socket) {
    fwrite($socket, $string);
}
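For example, a rough sketch of sending the question's "relay 1 on" command over TCP might look like this (the device IP and port are placeholders, and the third byte "020" from the question is written as decimal 20 here as an assumption; check the relay's documentation for the exact byte values):
// Hypothetical sketch: send the 6-byte "relay 1 on" command from the question.
$ip   = '192.168.1.100'; // placeholder device IP
$port = 17494;           // placeholder device port
$command = chr(0xFD) . chr(0x02) . chr(20) . chr(1) . chr(0) . chr(0x5D);

$socket = fsockopen($ip, $port, $errno, $errstr, 5);
if ($socket) {
    fwrite($socket, $command);
    fclose($socket);
} else {
    echo "Connection failed: $errstr ($errno)";
}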
Instead of building an ASP web site just to turn the device on and off, you can simply create a very simple HTML page with two buttons (ON and OFF). If you put this page on the network, it can be opened by any browser.
Assuming that your command string is processed as a URL, the JavaScript functions executed when clicking a button would call the following procedure:
function httpGet(theUrl)
{
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.open("GET", theUrl, false);
    xmlHttp.send(null);
    return xmlHttp.responseText;
}
If the device is controlled directly over TCP/IP, you may need to write some C# functions, unless you use a utility that can send raw data over TCP/IP (for example, "Packet Sender").

PHP Curl API Response Time Differs from different Server

I have a setup where I have two servers running a thin client (Apache, PHP). Server A is considered the client machine and connects to Server B to obtain data via a RESTful API. Both servers are on the same network. On Server B, the response of the request is shown below:
{
    "code": 200,
    "response_time": {
        "time": 0.43,
        "measure": "seconds"
    }
}
Server B calculates the time taken for each task by using microseconds to flag the start and end of a request block. But when I use cURL on Server A to make the call to Server B, I get very strange results in terms of execution time:
$url = "https://example.com/api";
/*server B address. I've tried IP address as well without any change in results.
This must go over a SSL connection. */
$start_time = microtime(true);
$curl2 = curl_init();
curl_setopt($curl2, CURLOPT_URL, $url);
curl_setopt($curl2, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl2, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($curl2, CURLOPT_USERAGENT, "Server A User Agent");
$result = curl_exec($curl2);
$HttpCode = curl_getinfo($curl2, CURLINFO_HTTP_CODE);
$total_time = curl_getinfo($curl2, CURLINFO_TOTAL_TIME);
$connect_time = curl_getinfo($curl2, CURLINFO_CONNECT_TIME);
$namelookup_time = curl_getinfo($curl2, CURLINFO_NAMELOOKUP_TIME);
$end_time = microtime(true);
$timeDiff = round(((float)$end_time - (float)$start_time), 3);
I get the following for each Time Check:
$timeDiff = 18.7381 (Using Microseconds)
$total_time = 18.7381 (Transfer Time)
$connect_time = 0.020679
$namelookup_time = 0.004144
So I'm not sure why this is happening. Is there a better way to source data from another server in your network that hosts your API? It would be like Twitter's site consuming its API from another server that isn't the API server. I would think that the time for the cURL call to the API would be pretty similar to the time reported by the API. I understand the API doesn't take into account network traffic and the time to open the connection, but 18 seconds versus 0.43 seems strange to me.
Any ideas here?
This is not an issue with cURL. Rather, it's a problem with your network setup. You can check this out by doing a few things.
1) Use ping command to check the response time.
From Server-A: ping Server-B-IP
From Server-B: ping Server-A-IP
2) Similarly, you can use the traceroute (on Windows, tracert) command to check the response time as well. You should get the response instantly.
From Server-A: traceroute Server-B-IP
From Server-B: traceroute Server-A-IP
3) Use wget or the curl command line to download a large file (say 100 MB) from one server to the other, and then check how long it takes. For example, using wget:
From Server-B: wget http://server-A-IP/test/test-file.flv
From Server-A: wget http://server-B-IP/test/test-file.flv
4) Apart from these basic routine checks, you can also use some advanced tools to sort this network problem out, for example the commands/examples from the following two links:
Test network connection performance between two Linux servers
Command line tool to test bandwidth between 2 servers
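On the PHP side, you could also narrow down which phase of the request is slow by reading a few more of cURL's timing counters after curl_exec() (a sketch that extends the code from the question; these CURLINFO_* options are standard cURL options):
// After $result = curl_exec($curl2); dump the per-phase timings.
$phases = array(
    'namelookup'    => curl_getinfo($curl2, CURLINFO_NAMELOOKUP_TIME),
    'connect'       => curl_getinfo($curl2, CURLINFO_CONNECT_TIME),
    'pretransfer'   => curl_getinfo($curl2, CURLINFO_PRETRANSFER_TIME),   // includes the SSL handshake
    'starttransfer' => curl_getinfo($curl2, CURLINFO_STARTTRANSFER_TIME), // time to first byte
    'total'         => curl_getinfo($curl2, CURLINFO_TOTAL_TIME),
);
foreach ($phases as $phase => $seconds) {
    printf("%-14s %.4f s\n", $phase, $seconds);
}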
I had the same problem about 3 days ago. I wasted an entire afternoon finding the problem. In the end I contacted my server provider and told him about the problem. He said that this was not a problem with my script, but with the carrier (network).
Maybe it is the same problem I had, so contact your server provider and ask.
Did you try it with file_get_contents? It would be interesting to see whether the response time is the same with it.

PHP: Remote Function Call and returning the result?

I'm not very experienced with PHP. I want to know how to communicate between two web servers. To clarify: from the 1st server, run a function (a query) on the remote server, and return the result to the 1st server.
The scheme would be:
Web Server (1) ----------------> Web Server (2) ---------------> Database Server
Web Server (1) <---------------- Web Server (2) <--------------- Database Server
The query function() will only be located on Web Server (2). Then I need to run that query function() remotely from Web Server (1).
What is this called? And is it possible?
Yes.
A nice way I can think of doing this would be to send a request to the 2nd server via a URL. In the GET (or POST) parameters, specify which method you'd like to call, and (for security) some sort of hash that changes with time. The hash is there to ensure no third party can run the function arbitrarily on the 2nd server.
To send the request, you could use cURL:
function get_url($request_url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $request_url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
This sends a GET request. You can then use:
$request_url = 'http://second-server-address/listening_page.php?function=somefunction&securityhash=HASH';
$response = get_url($request_url);
On your second server, set up listening_page.php (with whatever filename you like, of course) so that it checks for GET requests and verifies the integrity of the request (i.e. the hash, correct and valid params).
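As an illustration only, a minimal sketch of such a listening page might look like the following (the shared secret, the time-based hash scheme, and the function whitelist are all assumptions made for this example, not part of the answer above):
<?php
// listening_page.php -- hypothetical sketch of the receiving side.
$shared_secret = 'change-me';       // assumed shared secret known to both servers
$allowed = array('somefunction');   // whitelist of callable functions

$function = isset($_GET['function']) ? $_GET['function'] : '';
$hash     = isset($_GET['securityhash']) ? $_GET['securityhash'] : '';

// Recompute the hash the caller should have sent, e.g. secret + current UTC hour.
$expected = hash('sha256', $shared_secret . gmdate('YmdH'));

if (!hash_equals($expected, $hash) || !in_array($function, $allowed, true)) {
    http_response_code(403);
    exit('Forbidden');
}

// Run the whitelisted function and return its result as JSON.
echo json_encode(call_user_func($function));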
You can do this by using an API. Create a page on the second server that takes variables and communicates with the server using those vars (depending on what you need), and the standard reply from that page should be either JSON or XML. Then read that from server 1 by requesting that file and getting the reply from the 2nd server.
Note: if it's a private file, make sure you use an authentication method to prevent users from accessing the file.
What you are aiming to do is definitely possible. You will need to set up some sort of API in order for server one to make a request to server 2.
I suggest you read up on SOAP and REST APIs:
http://www.netmagazine.com/tutorials/make-your-own-soap-api
Generally you will use something like cURL to contact server 2 from server 1.
Google "curl" and you should quickly get the idea.
It's not going to be easy to give you a complete solution, so I hope this nudge in the right direction is helpful.
