Running nodejs program from PHP exec vs http - php

I want to run a nodejs program from PHP. Which would be the faster way to do it, comparing these options:
Run the nodejs code as a webservice and call via a http URL
Run via exec and grab the output
Edit:
The nodejs code will run a headless browser (casperjs/phantomjs), render a page with canvas data, and then submit the canvas image to a service.

That depends a lot on the nodejs application itself. If its initialization is relatively heavy and it can be pre-initialized as a webservice, then that option is probably a bit quicker.
Otherwise, if it is very simple, an exec is probably quicker, since you skip the whole HTTP layer.
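For illustration, a rough sketch of the two options on the PHP side (the script name, port, and endpoint are all hypothetical):

    <?php
    // Option 1: run the node script directly and capture its output.
    // render.js is a hypothetical script; escapeshellarg() guards the argument.
    $url = 'http://example.com/page-to-render';
    $output = shell_exec('node render.js ' . escapeshellarg($url));

    // Option 2: call an already-running node webservice over HTTP.
    // Assumes the node app listens locally on port 3000.
    $ch = curl_init('http://127.0.0.1:3000/render?url=' . urlencode($url));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);

Option 1 pays the node (and phantomjs) startup cost on every call; option 2 pays it once, which is the trade-off described above.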

It sounds like the node code will already be making an external network call, which would be orders of magnitude slower than a local call, so in terms of performance there should not be much difference.
Of course, if performance is an important requirement, you should measure both approaches and select the better one for your case.
If you need to invoke this process from a different application, then the HTTP endpoint is the better choice, provided security measures are properly implemented, since an endpoint potentially increases your attack surface.

Related

Call nodejs script from php script

Imagine I have a feature coded in nodejs because it uses a library that solves my problem well, and I want to add this feature to another PHP script that does more stuff.
Is it good practice to just call the nodejs script with shell_exec? Or should I set up a standard communication protocol between them, like HTTP requests? Am I over-engineering things instead of just running shell_exec?
The choice of calling the script directly from the PHP code or via an HTTP request is completely at your own discretion. However, you should measure the performance cost of each.
If calling the script directly does not compromise the performance of the PHP code, then it is perfectly fine. You should also consider the long-run effect: if the data you're processing grows over time, then depending on your algorithm and the Big-O effect of the data size, this may no longer be an effective method.
The all-round best approach is to go down the micro-services path. You could create an external API that executes the script externally (with your data) and imports the result back into the PHP process. The only drawback is the cost, especially if this requires you to acquire additional CPU resources.
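If you do go with shell_exec, one workable pattern is to pass the data in as an escaped JSON argument and have the node script print JSON back; a minimal sketch (feature.js and its input/output contract are hypothetical):

    <?php
    // Hypothetical contract: feature.js reads a JSON argument and prints JSON.
    $payload = json_encode(['items' => [1, 2, 3]]);
    $raw = (string) shell_exec('node feature.js ' . escapeshellarg($payload));
    $result = json_decode($raw, true);
    if ($result === null) {
        // The node script failed or printed something that isn't JSON.
        throw new RuntimeException('feature.js returned invalid output');
    }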

Web application using Node.js/socket.io and php/mysql

I am relatively new to node.js and socket.io. Currently I have a half-finished private web project, which runs only with PHP and a MySQL database on the server side. I decided to bring it to a more advanced level using socket.io for several features within the project.
So I read a lot about it and watched a whole bunch of tutorials. Also I found this and this during my research.
My question is whether that is still the common way to develop a web application.
More exactly: should one event (like a form submit) use both an AJAX request and a socket.emit, for those events where it is necessary/wanted?
The background of this thought is the following. I have a whole bunch of calculations running in PHP now, and the node.js server naturally runs JavaScript. So I can easily implement a node.js server without changing anything in my AJAX requests, or rewrite everything I have so far in JS and use only a node.js server.
But this leads to 3 more questions:
Which likely runs faster on the server side: a calculation scripted in PHP or in JavaScript?
How to use transactions on a node.js server while using MySQL?
And how big is the overhead of converting a PHP array to a JSON object, which you could avoid by using only the node.js server, where nothing needs to be converted?
JavaScript in the browser is executed on the client side, so you are limited by the user's hardware, whereas PHP is executed on your server. See this post for more info about the performance comparison.
I highly suggest you take a look at this pure node.js client that will perfectly do the job in your case.
PHP has many functions for working with JSON data (json_decode(), json_encode(), ...), but Node.js doesn't require JSON data to be converted at all. In the end, it really depends on your usage and how you plan to store and use that data.
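For scale: the conversion the question worries about is just these two calls on the PHP side, and whether the cost matters depends on payload size and request rate, so it is worth measuring rather than guessing. A minimal sketch:

    <?php
    // PHP array -> JSON string (e.g. for an AJAX response or a socket payload)
    $data = ['user' => 'alice', 'score' => 42];
    $json = json_encode($data);

    // JSON string -> PHP array (e.g. data posted back from the browser)
    $decoded = json_decode($json, true); // true = return an associative array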

Node.js run as program vs php

I have been looking at node.js vs PHP and found many comparisons of these two server-side technologies. Most people suggest that the JavaScript V8 engine is much faster than PHP in terms of the running speed of a single-file calculation.
I have worked on some JavaScript code for Node.js, and now I have an idea that I don't know is correct.
As I understand it, Node.js runs a JavaScript application and listens on a port, so this application runs continuously on the server. The application code is all loaded into the machine's memory; things like global variables are declared and saved once, at the start, when node.js executes the program. So when any new request comes in, the server can use these variables very efficiently.
In PHP, however, the interpreter executes a *.php file per request. So if a request is for www.xx.com/index.php, the program will execute index.php, in which there may be something like
require("globalVariables.php");
and PHP would load that file and declare those variables all over again. The same idea applies to functions and other objects...
So am I correct in thinking that PHP may not be a good idea when there are many libraries that need to be included?
I have searched for comparisons, but nobody has talked about this.
Thanks
You are comparing different things. PHP depends on Apache or nginx (for example) to serve scripts, while Node.js is a complete server itself.
That's a big difference, because when you load a PHP page, Apache spawns a thread and runs the script there, while in Node all requests are served by the single Node.js thread.
So PHP and Node.js are different things, but regarding your concern: yes, you can maintain a global context in Node that is loaded in memory all the time. PHP, on the other hand, loads, runs and exits on every request. But that's not the typical bottleneck: Node.js web applications also have templates that must be loaded and parsed, database calls, files... the real difference is the way Node.js handles heavy tasks: a single thread for JavaScript, an event queue, and external threads for the filesystem, network and all that slow stuff. Traditional servers spawn a thread per connection, which is a very different approach.
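To make the "loads, runs and exits" point concrete: a plain variable in a PHP script is rebuilt on every request, and anything meant to persist has to live in shared storage. A minimal sketch using APCu, assuming that extension is installed:

    <?php
    // Rebuilt from scratch on every request; this never gets past 1.
    $perRequestCounter = 0;
    $perRequestCounter++;

    // Persisting across requests needs shared storage such as APCu.
    $total = apcu_fetch('request_count') ?: 0;
    apcu_store('request_count', $total + 1);
    // In Node, a plain variable in module scope would survive instead,
    // because the process keeps running between requests.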

What technology shall I use for a webpage that constantly requests data from server

We need to create a web-based frontend for displaying some data. The problem is that the data needs to be updated about once a second.
For me as a web-developer the obvious solution is AJAX.
Unfortunately, one of the purposes of this web frontend is to be displayed inside an embedded browser window, which is expected to run constantly for months or even years. That is, months of operation with no restart/refresh.
During testing we ran a proof-of-concept interface (which requested a simple set of data every 1.5s) in Safari for over a month. During this period, Safari's memory usage grew from ~30 MB to over 100 MB.
Thus we're afraid of the stability of such a solution.
I'm wondering if you could recommend any other technique for this task, possibly with less overhead (when requesting simple sets of data, as in our case, I'm afraid the HTTP headers make up a very significant part of the traffic).
I would suggest looking into node.js and the now.js plugin, which allows for realtime updates via websockets. It even has support for older browsers: if the browser does not support websockets, it will fall back to either a comet server implementation, AJAX, or an iframe.
It's extremely easy to setup on a linux environment, and there's ample documentation to get you started.
It works with JavaScript and runs on the Google V8 JavaScript engine, so if you've ever worked with OOP JavaScript, you should be able to pick it up relatively easily.
LINKS:
http://nodejs.org/
http://nowjs.com/
How about Adobe AIR as a front-end? You can use Flash/Flex inside, which have decent garbage collectors, so long running shouldn't be a problem. AIR also allows you to write in XHTML and JavaScript, so it could be a good option if you're only familiar with those technologies.
PHP is not a good choice for this kind of request. Comet seems to be a good way to receive data from the server. You can use, for example, the excellent Tornado (Python) as a backend.
ActionScript allows you to use TCP sockets, so you can write your own protocol for even better performance, and use Boost.Asio (C++) or Netty (Java) as a scalable backend.
Maybe websockets? Instead of making an AJAX request every X seconds, the server pushes new data as it comes.
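Comet-style long polling, mentioned above, approximates that push model without websockets: the client sends a request that the server holds open until data is available. A minimal PHP sketch (fetch_latest_data() is a hypothetical data-source lookup):

    <?php
    // Long-poll endpoint: hold the connection until data arrives or we time out.
    set_time_limit(40);              // hard cap for this request
    $deadline = time() + 30;         // hold the request open for up to 30 seconds
    while (time() < $deadline) {
        $data = fetch_latest_data(); // hypothetical: returns null when nothing is new
        if ($data !== null) {
            header('Content-Type: application/json');
            echo json_encode($data);
            exit;
        }
        usleep(250000);              // wait 250 ms before checking again
    }
    http_response_code(204);         // no new data; the client simply re-polls

The client re-issues the request as soon as one completes, so updates still arrive roughly as they are produced, though each one carries full HTTP header overhead.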
My personal favorite is PHP 4+, MySQL, and the Apache or lighttpd webserver.
Though I would also suggest Python.
I specialize in what you are describing. That said, will you actually be looking at the screen? If not, you could request the page using an HTTP socket or via a wget cronjob on a Linux box.
Yes, the HTTP headers are a required part of the exchange; if you try to strip them out, the webserver will issue a "Bad Request" error.
Let me know what you decide, I have a lot to share :)
I suspect that the problem is not AJAX per se, but using a browser as such: I don't think any were made with constant running in mind, and I'm assuming all the (re)loading will turn into some form of extra memory use in the end.
I think you would be best off consuming your data through something simple you design yourself. You can obviously produce it in the same place (a server, requestable via HTTP or whatever you like most), but you do not need a complete web browser if your primary goal is "a couple of years of uptime".

PHP CLI + Ajax for web terminal

I'm trying to determine the best approach to providing an Ajax-based terminal using PHP. I haven't attempted to write it yet, but having rolled the idea around, the only way I can see it working would be with 2 scripts:
Script 1 handles Ajax communication between the server and the client browser. When a request is made to use the terminal, it connects to (or starts as a service and then connects to) Script 2 via a socket.
Script 2 performs the system calls, passing output back to the Ajax script via the socket.
There are multiple holes I can see in this though, and I'm wondering if anyone has created/seen a set of scripts that can perform these tasks? Any insight would be greatly appreciated!
Thanks :)
Edit: I think I was unclear about a few things. I've found a few scripts that imitate terminals, providing nearly the functionality that I'm looking for, such as AjaxPHPTerm (http://sourceforge.net/projects/ajaxphpterm/)
The problem is that I'm trying to find a method that permits interaction with shell scripts. If a script prompts "Press any key to continue" or "Select option [x]", AjaxPHPTerm just hangs or drops out of the shell script.
That's why I started thinking sockets, or streams; some way of forming a direct I/O stream to the system calls.
HTTP is stateless, and AJAX, sockets, or any other technology based on pages generated by the server will not change that magically. Whatever tricks you use, it will not be efficient and is simply not worth the effort (in my opinion, at least).
The problem seems to be that AjaxPHPTerm is actually closer to a shell than a terminal (glancing at the code, it seems to do its own CWD handling, and has a simple read-eval-print loop).
Assuming a POSIX-compatible OS on the server, the proper way to implement this would probably be to use the pseudo-terminal facility, so that your web terminal appears as a virtual terminal on the system that running programs can interactively access.
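Short of a full pseudo-terminal, PHP's proc_open() does provide the direct I/O streams the question asks about. A rough sketch (the script path is hypothetical, and programs that check for a real tty may still behave differently over pipes, which is why the pty suggestion above matters):

    <?php
    // Spawn the shell script with pipes attached to its stdin/stdout/stderr.
    $descriptors = [
        0 => ['pipe', 'r'],  // child's stdin  (we write to it)
        1 => ['pipe', 'w'],  // child's stdout (we read from it)
        2 => ['pipe', 'w'],  // child's stderr
    ];
    $proc = proc_open('/path/to/script.sh', $descriptors, $pipes); // hypothetical path

    if (is_resource($proc)) {
        usleep(200000);                         // crude: give the script time to prompt
        stream_set_blocking($pipes[1], false);  // reads return whatever is available
        echo fread($pipes[1], 8192);            // e.g. "Press any key to continue"
        fwrite($pipes[0], "\n");                // answer the prompt
        usleep(200000);
        echo stream_get_contents($pipes[1]);    // remaining output
        fclose($pipes[0]);
        fclose($pipes[1]);
        fclose($pipes[2]);
        proc_close($proc);
    }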
