Imagine I have a feature coded in Node.js, because it uses a library that solves my problem quite well, and I want to add this feature to a PHP script that does other things as well.
Is it good practice to just call the Node.js script with shell_exec? Or should I implement a standard communication protocol between them, such as HTTP requests? Would that be over-engineering compared to simply running shell_exec?
Whether to call the script directly from the PHP code or via an HTTP request is at your discretion. However, you should measure the performance cost of each option.
If calling the script directly will not compromise the performance of the PHP code, then it is perfectly fine. You should also consider the long-run effect: if the data you're processing grows over time, then depending on your algorithm and how its Big-O complexity scales with the data size, this may stop being an effective method.
The best all-round approach is to go down the micro-services path. You could create an external API that executes the script externally (with your data) and imports the result back into the PHP process. The only drawback is cost, especially if this requires you to acquire additional CPU resources.
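For illustration, here is a minimal sketch of both options on the PHP side; the script path, endpoint URL and payload are hypothetical:

<?php
// Option 1: call the Node.js script directly.
// escapeshellarg() guards the argument against shell injection.
$payload = json_encode(['input' => 'some data']);
$result  = shell_exec('node /path/to/feature.js ' . escapeshellarg($payload));

// Option 2: call the same Node.js code exposed as a small local HTTP
// service, assumed here to accept and return JSON.
$ch = curl_init('http://127.0.0.1:3000/feature');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);

Note that option 2 only pays off if the Node.js process stays running between calls; otherwise you pay both the startup cost and the HTTP overhead.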
Related
I want to run a Node.js program from PHP. Which of these options would be the fastest:
Run the Node.js code as a web service and call it via an HTTP URL
Run it via exec and grab the output
Edit:
The Node.js code will run a headless browser (CasperJS/PhantomJS), render a page with canvas data, and then submit the canvas image to a service.
That depends a lot on the Node.js application itself. If its initialization is relatively heavy and it can be pre-initialized as a web service, then that is probably a bit quicker.
Otherwise, if it is very simple, an exec is probably quicker, since you skip the whole HTTP layer.
It sounds like the Node code will already be making an external network call, which is orders of magnitude slower than a local call, so in terms of performance there should not be much difference between the two options.
Of course, if performance is an important requirement, you should measure both approaches and select the better one for your case.
If you need to be able to invoke this process from a different application, then the HTTP endpoint would be better, provided security measures are properly implemented, as an endpoint potentially increases your attack surface.
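If you do measure, a minimal sketch of timing both approaches from PHP (the script path and URL are hypothetical, and the web-service timing assumes the Node service is already running, so its startup cost is excluded):

<?php
// Time the exec approach.
$start = microtime(true);
$out = shell_exec('node /path/to/render.js');
printf("exec:    %.3f s\n", microtime(true) - $start);

// Time the web-service approach against the pre-started local service.
$start = microtime(true);
$out = file_get_contents('http://127.0.0.1:3000/render');
printf("service: %.3f s\n", microtime(true) - $start);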
I am relatively new to node.js and socket.io. Currently I have a half-finished private web project that runs with plain PHP and a MySQL database on the server side. I decided to bring it to a more advanced level by using socket.io for several features within the project.
So I read a lot about it and watched a whole bunch of tutorials. I also found this and this during my research.
My question is whether that is still the common way to develop a web application.
More precisely: to use both an AJAX request and a socket.emit on one event (like a form submit), for those events where it is necessary/wanted.
The background to this thought is the following: I have a whole bunch of calculations currently running in PHP, and the node.js server naturally runs JavaScript. So I could either implement a node.js server without changing anything in my AJAX requests, or rewrite everything I have so far in JavaScript and use only a node.js server.
But this leads to 3 more questions:
Which potentially runs faster on the server side: a calculation scripted in PHP or in JavaScript?
How do I use transactions on a node.js server while using MySQL?
And how large is the overhead of converting a PHP array to a JSON object, which could be avoided by using only the node.js server, where nothing needs to be converted?
JavaScript in the browser is executed on the client side, so you are limited by the user's hardware, whereas PHP is executed on your server. See this post for more info about the performance comparison.
I highly suggest you take a look at this pure node.js MySQL client, which will do the job perfectly in your case.
PHP has several functions for working with JSON data (json_decode(), json_encode(), ...), but Node.js doesn't require JSON data to be converted. In the end, it really depends on your usage and how you plan to store and use that data.
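For reference, the PHP side of that conversion is only a couple of calls; a minimal sketch:

<?php
// PHP array -> JSON string, e.g. to hand data to a Node.js process.
$data = ['user' => 'alice', 'scores' => [1, 2, 3]];
$json = json_encode($data);

// JSON string -> PHP associative array (true decodes objects as arrays).
$back = json_decode($json, true);

For typical payload sizes this step is cheap compared to a network round trip, but only measuring with your real data will settle the question.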
I have been looking at node.js applications vs PHP, and I have found many comparisons of these two server-side techniques. Most people suggest that the JavaScript V8 engine is much faster than PHP in terms of raw execution speed for a single calculation.
I have worked on some JavaScript code for Node.js, and now I have an idea that I don't know is correct.
In my opinion, Node.js runs a JavaScript application and listens on a port, so this JavaScript application is a program running on the server computer. Therefore the application code is all kept in the computer's memory: things like global variables are declared and stored once, when Node.js first executes the program. So for any new incoming request, the server can reuse those variables very efficiently.
In PHP, however, the interpreter executes a *.php file per request. So if a request comes in for www.xx.com/index.php, the program will execute index.php, which may contain something like
require("globalVariables.php");
then PHP would load that file and declare those variables all over again; the same goes for functions and other objects...
So am I correct in thinking that PHP may not be a good idea when many other libraries need to be included?
I have searched for comparisons, but nobody seems to have discussed this.
Thanks
You are comparing different things. PHP depends on Apache or nginx (for example) to serve scripts, while Node.js is a complete server itself.
That's a big difference, because when you load a PHP page, Apache will spawn a thread (or process) and run the script there, whereas in Node all requests are served by a single Node.js thread.
So PHP and Node.js are different things, but regarding your concern: yes, you can maintain a global context in Node that stays loaded in memory the whole time, while PHP loads, runs and exits on every request. That is not the typical bottleneck, though: Node.js web applications also have templates that must be loaded and parsed, database calls, file access... The real difference is the way Node.js handles heavy tasks: a single thread for JavaScript, an event queue, and external threads for the filesystem, network and all the other slow stuff. Traditional servers spawn a thread per connection, which is a very different approach.
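If the per-request reload ever becomes a measurable cost in PHP, a shared-memory cache can approximate Node's persistent context. A minimal sketch, assuming the APCu extension is installed and buildGlobalConfig() is a hypothetical stand-in for the expensive setup work:

<?php
function buildGlobalConfig(): array {
    // Hypothetical stand-in for expensive per-request initialization.
    return ['threshold' => 10, 'labels' => ['a', 'b', 'c']];
}

// Fetch the data from shared memory; rebuild it only on a cache miss,
// so later requests skip the setup entirely.
$config = apcu_fetch('global_config', $found);
if (!$found) {
    $config = buildGlobalConfig();
    apcu_store('global_config', $config, 3600); // keep for one hour
}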
I'm trying to determine the best approach to providing an Ajax-based terminal using PHP. I haven't attempted to write it yet, but having rolled the idea around, the only way I can see it working would be with two scripts:
Script 1 handles Ajax communication between the server and the client browser. When a request is made to use the terminal, it connects to (or starts as a service and then connects to) Script 2 via a socket.
Script 2 performs the system calls, passing output back to the Ajax script via the socket (a rough sketch follows below).
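For what it's worth, the socket link between the two scripts could be sketched in PHP roughly as follows; the port is hypothetical, and executing client-supplied commands like this is exactly one of the holes mentioned below, so treat it purely as an outline:

<?php
// Script 2 (daemon): listen on a local socket, run what arrives, reply.
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
while ($conn = stream_socket_accept($server, -1)) { // -1: wait for a client
    $cmd = trim(fgets($conn));
    fwrite($conn, shell_exec($cmd)); // DANGEROUS without strict validation
    fclose($conn);
}

<?php
// Script 1 (Ajax handler): forward the command, relay the output.
$conn = stream_socket_client('tcp://127.0.0.1:9000', $errno, $errstr);
fwrite($conn, $_POST['cmd'] . "\n");
echo stream_get_contents($conn);
fclose($conn);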
I can see multiple holes in this, though, and I'm wondering whether anyone has created or seen a set of scripts that can perform these tasks. Any insight would be greatly appreciated!
Thanks :)
Edit: I think I was unclear about a few things. I've found a few scripts that imitate terminals, providing nearly the functionality that I'm looking for, such as AjaxPHPTerm (http://sourceforge.net/projects/ajaxphpterm/)
The problem is that I'm trying to find a method that permits interaction with shell scripts. If a script prompts "Press any key to continue" or "Select option [x]", AjaxPHPTerm just hangs or drops out of the shell script.
That's why I started thinking about sockets or streams: some way of forming a direct I/O stream to the running process.
HTTP is stateless, and AJAX, sockets or any other technology based on server-generated pages will not magically change that. Whatever tricks you use, it will not be efficient and is simply not worth the effort (in my opinion, at least).
The problem seems to be that AjaxPHPTerm is actually closer to a shell than a terminal (glancing at the code, it seems to do its own CWD handling, and has a simple read-eval-print loop).
Assuming a POSIX-compatible OS on the server, the proper way to implement this would probably be to use the pseudo-terminal facility, so that your web terminal appears as a virtual terminal on the system that running programs can access interactively.
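On the PHP side, proc_open() at least provides the direct I/O stream you're after. A minimal sketch; note it uses plain pipes rather than a true pseudo-terminal, so fully interactive (curses-style) programs may still misbehave, and the script name is hypothetical:

<?php
// Attach pipes to the child's stdin/stdout/stderr.
$spec = [
    0 => ['pipe', 'r'],  // child's stdin:  we write to it
    1 => ['pipe', 'w'],  // child's stdout: we read from it
    2 => ['pipe', 'w'],  // child's stderr
];
$proc = proc_open('sh ./interactive.sh', $spec, $pipes);

if (is_resource($proc)) {
    echo fread($pipes[1], 8192);  // read the "Select option [x]" prompt
    fwrite($pipes[0], "x\n");     // answer it over stdin
    echo stream_get_contents($pipes[1]);
    foreach ($pipes as $p) { fclose($p); }
    proc_close($proc);
}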
I have a Python script that parses a large set of data into an in-memory structure and implements various fetch functions on that structure.
I want to build a simple web frontend for this script, with the condition that the data is only initialized/loaded once (since re-loading on each fetch would consume too much time/resources). Essentially, the Python handler needs to maintain its state between calls, so the data structure is preserved in memory.
Note: PHP's exec() or similar will not work, as it instantiates a new Python interpreter per request. I have heard vague references to using mod_python for this purpose?
I have implemented a solution to a very similar problem. My solution was to use an xmlrpc server, specifically
twisted.web.xmlrpc
I have a method that allows injection of new data, and methods for retrieving it.
Use a persistent server like CherryPy or Twisted Web. All requests will be served by the same process.
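Either way, the web frontend then just forwards each fetch to that long-running process. A minimal sketch of what the PHP side could look like, assuming the persistent service listens on a hypothetical local HTTP endpoint:

<?php
// Forward the fetch to the persistent process, which keeps the parsed
// data structure in memory between requests.
$key    = urlencode($_GET['key'] ?? '');
$result = file_get_contents('http://127.0.0.1:8080/fetch?key=' . $key);

if ($result === false) {
    http_response_code(502); // persistent service unreachable
    exit;
}
echo $result;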