Can PHP and Node.js run on the same server?

I have a web application with Apache and PHP on the back end. I am in the process of enhancing it with many new features and am considering using Node.js for the new work.
First of all, can PHP and Node.js co-exist on the same machine? I do not see why not.
Second, can I call Node.js code directly from client-side JavaScript and get JSON back?

Yes, and yes. Node and Apache / PHP can co-exist on a single server.
The only issue you are likely to run into is that they cannot both listen on the same port. HTTP runs on port 80 by default, and only one process can listen on a given port at a time. You may therefore have to run the Node app on a different port (for example, 8080), which could cause problems if any of your target users are restricted to port 80 only.

You can run Node and PHP on the same server, and even on the same port. The key is to put a server like Nginx in front, listening on port 80: set up PHP in Nginx as you normally would (using php-fpm), and set up your Node instance to listen locally on some high port, like 8081.
Then just configure Nginx to proxy all the Node requests through to localhost:8081, using the directory name as a filter. You're essentially setting up Nginx to treat Node requests much like it treats PHP requests: it forwards them to another daemon and handles the response as it comes back. Nginx is great at this. It adds a layer of security, and it boosts performance because it's so good at managing many connections at once, even when the backend can't.
Another benefit is that you can have multiple separate Node instances on different domains too, and use regular Nginx rules to handle it all. You can also run Node alongside other app servers, such as apps written in Go.
You'd benefit from Nginx's configurability, its SSL and HTTP/2 support, and its huge speed at serving static files, and you wouldn't have to serve static files from your Node app (unless you want to).
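As a sketch of what that front-end config might look like (the `/node/` prefix, port 8081, the document root, and the php-fpm socket path are assumptions, not from this answer; adjust them to your setup):

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/html;
    index index.php;

    # PHP handled as usual through php-fpm (socket path varies by distro)
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }

    # Requests under /node/ are proxied to the local Node instance
    location /node/ {
        proxy_pass http://127.0.0.1:8081/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```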

Yes, you can. If your server runs Ubuntu or Debian, follow these steps.
Open your terminal and run:
curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -
sudo apt-get install nodejs
If curl is not installed on your server:
sudo apt-get install curl
So that your Node.js application does not stop when you exit the terminal, use a package called Forever:
npm install -g forever
If your site is uploaded and NPM and Forever are configured correctly, it is time to start the Node.js instance. If you're using Express.js, run the following command to start a Forever instance:
forever start ./bin/www
In the above command you'll notice I am feeding it the ./bin/www script because that is what npm start launches for Express.js. Be sure to change the script to whatever your launch script is.
By default, the Node.js site is running at http://localhost:3000, which isn't ideal for remote visitors. We want to be able to access the site from a domain name processed by Apache. In your Apache VirtualHost file, you might have something that looks like the following:
<VirtualHost *:80>
    ServerName www.example.com
    ProxyPreserveHost On
    ProxyPass / http://localhost:3000/
    ProxyPassReverse / http://localhost:3000/
</VirtualHost>
We are telling Apache to create a proxy that fetches our Node.js site at http://localhost:3000 every time the www.example.com domain name is hit. All assets and pages will use the www.example.com path instead of http://localhost:3000, so to everyone it looks like the website is being served no differently than any other.
However, by default, the Apache proxy modules are not enabled. You must run the following two commands if the modules are not already enabled:
a2enmod proxy
a2enmod proxy_http
You may be required to restart Apache after enabling these modules.
I got this information from The Polyglot Developer.

Yes. If you use PHP to serve pages to clients, the JavaScript on those pages can use AJAX requests to access routes exposed by your Node server and get JSON back.

Related

Nginx Proxy Manager and PHP Configuration

I had installed nginx earlier and was using it to run all my PHP applications, including phpMyAdmin. When I needed to run some Node.js applications, setting up a reverse proxy became an issue, so I found a solution: Nginx Proxy Manager. However, I notice that NPM cannot run at the same time as nginx, which means I can't run my normal PHP applications.
What could be a way of integrating or using Nginx Proxy Manager with PHP applications?
I expected that I would simply find a way of having sites-enabled in Nginx Proxy Manager, but this does not exist. I do find it easy to use Nginx Proxy Manager for applications that run on specific ports.

Load balancing PHP built-in server?

My development environment consists of the single-threaded built-in PHP server. It works great:
APP_ENV=dev php -S localhost:8080 -c php.ini web/index.php
One issue is that the built-in server is single-threaded, which makes lots of parallel XHRs resolve sequentially. Worst of all, it doesn't mimic our production environment very well: some front-end issues with concurrency simply don't exist in this setup.
My question:
What's an existing solution that I could leverage that would proxy requests asynchronously to multiple instances of the same PHP built-in server?
For example, I'd have a few terminal sessions running the built-in server on different ports, then each request is routed to a different one of those instances. In other words, I want multiple instances of my application running in parallel using the simplest possible set up (no Apache or Nginx if possible).
A super-server, like inetd or tcpserver†, works well. I'm a fan of the latter:
tcpserver waits for incoming connections and, for each connection,
runs a program of your choice.
With that, now you want to use a reverse proxy to pull the HTTP protocol off the wire and then hand it over to a connection-specific PHP server. Pretty simple:
$ cat proxy-to-php-server.sh
#!/bin/bash -x
# get a random port -- this could be improved
port=$(shuf -i 2048-65000 -n 1)
# start the PHP server in the background
php -S localhost:"${port}" -t "$(realpath "${1:?Missing path to serve}")" &
pid=$!
sleep 1
# proxy standard in to nc on that port
nc localhost "${port}"
# kill the server we started
kill "${pid}"
Ok, now you're all set. Start listening on your main port:
tcpserver -v -1 0 8080 ./proxy-to-php-server.sh ./path/to/your/code/
In English, this is what happens:
tcpserver starts listening on all interfaces at port 8080 (0 8080) and prints debug information on startup and each connection (-v -1)
For each connection on that port, tcpserver spawns the proxy helper, serving the given code path (path/to/your/code/). Pro tip: make this an absolute path.
The proxy script starts a purpose-built PHP web server on a random port. (This could be improved: script doesn't check if port is in use.)
Then the proxy script passes its standard input (coming from the connection tcpserver serves) to the purpose-built server
The conversation happens, then the proxy script kills the purpose-built server
This should get you in the ballpark. I've not tested it extensively (only on GNU/Linux, CentOS 6 specifically). You'll need to tweak the proxy's invocation of the built-in PHP server to match your use case.
Note that this isn't a "load balancing" server, strictly: it's just a parallel ephemeral server. Don't expect too much production quality out of it!
† To install tcpserver:
$ curl -sS http://cr.yp.to/ucspi-tcp/ucspi-tcp-0.88.tar.gz | tar xzf -
$ cd ucspi-tcp-0.88/
$ curl -sS http://www.qmail.org/moni.csi.hu/pub/glibc-2.3.1/ucspi-tcp-0.88.errno.patch | patch -Np1
$ sed -i 's|/usr/local|/usr|' conf-home
$ make
$ sudo make setup check
I'm going to agree that replicating a virtual copy of your production environment is your best bet. You don't just want to surface issues, you want to surface the same issues you'd see in production. Also, there's little guarantee that you will hit all of the same issues under an alternate setup.
If you do want to do this, however, you don't have many options. Either you direct incoming requests to an intermediate piece of software which then dispatches them to the PHP backends -- this would be the Apache or Nginx solution -- or you don't, and the request is handled directly by a single PHP thread.
If you're not willing to use that interposed software, there's only one layer left between you and the client: networking. You could, in theory, set up round-robin DNS for yourself. You give yourself multiple IPs, load up a PHP server listening on each, and let your client connections get spread across them. Note that this would assign each client to a specific process -- which may not be the level of parallelism you're looking for.

Nodejs and wamp server confusion

The situation
I have been developing in PHP and using WAMP for the past two years. Then I came across a module to implement a chat system with instant notifications. So I looked it up and found this awesome "Node.js", which allows you to push to connected users in real time.
This guy uploaded a way to integrate Node.js socket.io and PHP without running your own Node server.
So I downloaded his project (from GitHub) and ran it on my computer, but it gave a connection-refused error on port 8080. So
I went to the Node.js site and installed Node.js on my system (Windows). It automatically updated my environment variables, and I could just go to my command line to run the example project with
path(...)node nodeServer.js
and then run the index file of the project from the shared link, and it starts working. Everything runs smooth and nice.
MY QUESTION
If I cannot run the Node app in the small example project without installing Node.js on my system, then how am I supposed to install Node.js on a live (Apache) server and use the command line to start it?
I know this might be a silly question, but I am really new to Node.js, so I don't know if I can run Node on a live PHP server. If it is possible, can anyone tell me how to do it? Or is it just an idealized situation that can't be done?
Node.js does not need to be installed alongside Apache. Node.js itself provides a server that listens on a port. You can put Apache or Nginx in front as a proxy, but you can also run your application without either of them.
Create a file index.js using the code below and run node index.js:
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, '127.0.0.1');
console.log('Server running at http://127.0.0.1:1337/');
Open your browser and enter this URL: http://127.0.0.1:1337/. You will see "Hello World". In this case, Node.js is listening on port 1337.
If you are using a cloud server or VPS, or any kind of solution that gives you full control over what gets installed, you can just install Node.js there and run what you need:
https://github.com/joyent/node/wiki/installing-node.js-via-package-manager
Some services will allow you to pick what gets installed, so you just pick Node.js and run it alongside your Apache.
However, if you are using a shared hosting solution, only a limited number of hosts actually support Node (if any), and solving this would be almost impossible for you.
Second edit: Sorry for editing twice, but there is a thing with "no Node.js server" in the mentioned Stack Overflow post -- there actually is a server, and the post mentions needing to npm install certain modules. This is not the right way to do it, but if you still want to try: you need Node installed (and npm along with it), then you need to npm install the mentioned packages, add the simple server file quoted in the post, run it, and then you have all you need for your chat.
If you need some help, ping me, but if this is time critical project, rather find some third party solution... and then learn about this one.
TL;DR: find a hosting service that'll give you admin access and support firewall requests, or self-host with a free DNS subdomain and have a script update your IP so you don't have to pay for a static one.
My Experiences:
You can actually utilize Node for input/output stream manipulation as well. Look at Gulp and Node for more info. Using Bower and Bluebird on top of a Git project makes setting up apps very easy and quick via Node.
As for using socket.io with a Node/WAMP setup, I've actually used this in the past. I had WAMP installed on the server initially, but I used the Apache directives to reverse-proxy requests on 8080 to the Node.js app from the client scripts.
I did have to install Node separately, though, so you'll need something like RamNode maybe (I think they allow hosted apps like IIS/MVC etc. too).
The easiest hosting setup for development, in my opinion, was self-hosting WAMP/Node with a free subdomain from FreeDNS (afraid.org).
Otherwise, RamNode gives you full access to admin features on your VM, I believe. So you may be able to install Node there, as long as you request firewall permissions when you need extra ports (socket.io used different ports for requests on the page, so I didn't have to worry about CORS issues or anything).

Running a custom webserver in cloud9 environment and reaching it from the outside world

Is it possible to serve my web application from another server than the one provided in Cloud9?
For example : I would like to run different applications (PHP, Node.js - not sure what's possible yet) with nginx as the backend server (i) and/or a reverse proxy (ii) (to try different scenarios and configuration options).
Is it possible to run nginx and serve content to the outside world in cloud9?
Is it possible to have nginx as a reverse proxy in cloud9?
EDIT:
Here they write:
$PORT is exposed to the outside: When you run an application which listens on the port specified in the environment variable $PORT, you can access this application using the http://projectname.username.c9.io URL scheme. The proxy expects the server on that port to be a HTTP server. Other protocols are not supported.
This leads me to believe that if I started nginx on port=$PORT, it would be accessible via the specified URL scheme -- can anyone confirm? Maybe someone has tried this and can share some time-saving tips. Thanks.
I know this might be a late reply but might be helpful for those who are wondering how to do the same.
Short answer
I've created a repository that holds all the configuration needed for the process. Just run a command and NGINX and PHP-FPM will be serving and accessible from the internet.
GitHub repo: https://github.com/GabrielGil/c9-lemp
Explanation
Basically, to run NGINX on a c9 environment as you noted, you just have to make it listen on port 8080. You can either edit the default site in /etc/nginx/sites-available or create and enable your own (that's what the script above does).
Then, in order to run PHP-FPM scripts through NGINX, you need to configure some permissions and the socket on the web server. By default, c9 runs as ubuntu:ubuntu and the web server as www-data:www-data.
The script above also makes these changes for you.
Hope this helps you, or other users in similar situations.
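For reference, the kind of site config involved might look like this minimal sketch (the document root and the php-fpm socket path are assumptions on my part; the repo linked above has the real files):

```nginx
# Listen on Cloud9's exposed port (8080, the value of $PORT)
server {
    listen 8080;
    root /home/ubuntu/workspace;    # assumed c9 workspace path
    index index.php index.html;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock;   # assumed socket path
    }
}
```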
You can run nginx on a normal Cloud9 workspace, as long as it listens on port 8080 (the value of $PORT). The URL scheme to reach your server would be http://projectname-username.c9.io, however. Please refer to docs.c9.io for more up-to-date help on running applications.
One other thing you can do if you have another server where you would like to host your software, is to create an ssh workspace (https://docs.c9.io/ssh_workspaces.html). That way, you can connect Cloud9 to an external server directly.

PHP built in web server with multiple sites?

Does the PHP built in web server allow multiple instances/sites?
Background on the project skeleton
I am working through Zend for the first time via Christopher Valles's tutorial, with a slight difference: he uses Vagrant to instantiate a VirtualBox instance, and I'm working locally in Ubuntu 12.x LTS.
See https://github.com/christophervalles for more details on the Vagrant box.
I want to utilize the internal PHP 5.5 server if possible, but I get an error when (of course) starting the second instance.
Is the best/usual solution to have the core service on some variable port during development, and run the web client on port 80?
I'd, of course, need to rewrite some of my client's code to point to the new port, but would the Zend service need a rewrite anywhere? I'd say no.
Starting up my site's core/api services Zend project :
>php -S 0.0.0.0:8080 -t public/ public/index.php
PHP 5.5.10-1+deb.sury.org~precise+1 Development Server started at Day Date Time
Listening on http://0.0.0.0:8080
Document root is /home/core_site/public
Press Ctrl-C to quit.
Starting up my site's web client, that talks to the first server:
>php -S 0.0.0.0:8080 -t public/ public/index.php
[Day Date] Failed to listen on 0.0.0.0:8080 (reason: Address already in use)
You could just use a different port:
php -S 0.0.0.0:8081 -t public/ public/index.php
Thanks for reading my book!
About your question, Andrew is right about using a different port. As far as I know, you cannot do domain-name-based virtual hosts on the built-in PHP server. If you don't want to use a different port each time, you should look into using Apache or nginx.
If you go with nginx and PHP, you can re-use the config files I use for the Vagrant machine so you don't have to do everything from scratch. The OS on the Vagrant machine is Ubuntu 12.04 (the same as yours), so it shouldn't be hard to re-use the configs :D
You can check out the php.ini and the nginx vhosts used on Vagrant here.
Cheers!
