The situation
I have been developing in PHP and using WAMP for the past two years. Then I came across a module to implement a chat system with instant notifications. So I looked it up and found this awesome "Node.js", which lets you push data to connected users in real time.
This guy (nodejs socket.io and php) uploaded a way to integrate Node.js, Socket.IO and PHP without a Node server.
So I downloaded his project (GitHub) and ran it on my computer, but it gave a "connection refused" error on port 8080.
So I went to the Node.js site and installed Node.js on my system (Windows). It automatically updated my environment variables, and I could just go to my command line and run an example project with:
path(...)node nodeServer.js
Then I ran the index file of the project from the shared link and it started working. Everything runs smooth and nice.
MY QUESTION
If I cannot run the Node app in the small example project without installing Node.js on my system, then how am I supposed to install Node.js on a live (Apache) server and use the command line to start it?
I know this might be a silly question, but I am really new to Node.js, so I don't know whether I can run Node on a live PHP server. If it is possible, can anyone tell me how to do that? Or is it just an idealized scenario that can't be done?
Node.js does not need to be installed alongside Apache. Node.js itself provides a server that listens on a port. You can put Apache or Nginx in front of it as a proxy, but you can also run your application without either of those servers.
Create a file index.js using the code below and run node index.js
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, '127.0.0.1');
console.log('Server running at http://127.0.0.1:1337/');
Open your browser and go to this URL: http://127.0.0.1:1337/. You will see Hello World there. In this case Node.js is listening on port 1337.
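If you then want Apache in front of that Node process, a minimal reverse-proxy sketch for httpd.conf could look like this (assuming mod_proxy and mod_proxy_http are enabled; example.com is a placeholder domain):
<VirtualHost *:80>
    ServerName example.com
    # forward all requests to the Node.js process listening on 1337 (placeholder setup)
    ProxyPass / http://127.0.0.1:1337/
    ProxyPassReverse / http://127.0.0.1:1337/
</VirtualHost>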
If you are using a cloud or VPS solution, or any kind of hosting that gives you full control over what gets installed, you can just install Node.js there and run what you need...
https://github.com/joyent/node/wiki/installing-node.js-via-package-manager
Some services will let you pick what gets installed... so you just pick Node.js and run it alongside your Apache.
However, if you are using a shared hosting solution, only a limited number of those actually host Node (if any), and solving this would be almost impossible for you.
Second edit: Sorry for editing twice, but there is a catch with the "no nodejs server" claim in the mentioned Stack Overflow post: there actually is a server, and the post mentions the need to npm install certain modules... This is not the right way to do it, but if you still want to try it, you need Node installed (and npm along with it), then you need to npm install the mentioned packages, add the simple server file quoted in the post, run it, and then you have all you need for your chat...
If you need some help, ping me, but if this is a time-critical project, rather find some third-party solution first... and then learn about this one.
TL;DR: Find a hosting service that will give you admin access and support firewall requests, or self-host with a free DNS subdomain and have a script update your IP so you don't have to pay for a static one.
My Experiences:
You can actually use Node for input/output stream manipulation as well. Look at gulp and Node for more info. Using Bower and Bluebird on top of a Git project makes setting up apps via Node very quick and easy.
As for using Socket.IO with a Node/WAMP setup, I've actually used this in the past. I had WAMP installed on the server initially, but I used Apache directives to reverse-proxy requests on port 8080 to the Node.js app from the client scripts.
I did have to install Node separately, though, so you'll maybe need something like RamNode (I think they allow hosted apps like IIS/MVC etc. too).
The easiest hosting setup for development, in my opinion, was to self-host WAMP/Node with a free subdomain from afraid.dns.
Otherwise, RamNode gives you full access to admin features on your VM, I believe. So you may be able to install Node there as long as you request firewall permissions when needed for extra ports (Socket.IO used different ports for requests on the page, so I didn't have to worry about CORS issues or anything).
Related
I have developed a web app which shows information in real time about certain actions carried out by different users. For this I use WebSockets, built in PHP, in a local environment (WAMP), and it works fine, but I need this to also work on an external server (a web hosting service), which I only have access to through cPanel and FTP.
Locally I make the WebSocket server work by executing the following line in the Windows command prompt:
C:\wamp64\bin\php\php7.2.10\php.exe -q C:\wamp64\www\myapp\websocket_daemon.php
My question is: how can I achieve the same result in cPanel, or is there maybe another way?
It is not likely that a shared hosting environment (i.e. Apache with a VirtualHost config, PHP, MySQL, and a cPanel interface) will support your WebSocket application.
For WebSockets to work, you need to either:
have a port dedicated to inbound WebSocket connections; or
have an HTTP/HTTPS server that knows when to upgrade a connection and proxy-pass it to your WebSocket application.
To run your own WebSocket service, you should think about using a Virtual Private Server (VPS) service such as Amazon EC2 or DigitalOcean.
For that purpose you will need to have CLI (Command-Line Interface) access to the (Linux) server involved. Assuming that you have such access, running the WS service would look something like
./websocket_daemon.php
The command above assumes that you are in the appropriate folder. However, you need to resolve a few things before you get there:
Step 1: SSH support on your machine
You will need to ensure that your OS supports SSH. Your OS appears to be Windows, so you will need to install either PuTTY or Git Bash. Read about these technologies.
Step 2: Generate an SSH key
In your CPanel, you will need to generate SSH keys:
Click on Manage SSH Keys
Click on Generate a New Key
Use the settings you prefer in order to generate a key. Don't worry: you can remove the SSH keys at any time and recreate them if you realize that you prefer a different way to generate them.
Read more here: https://docs.cpanel.net/cpanel/security/ssh-access/
SSH keys come in pairs, that is, each consists of a private and a public key. You can share your public key with anyone, but never ever send your private key to anyone. It should stay on your computer and possibly be saved to backups. Read more about SSH keys here: https://sectigo.com/resource-library/what-is-an-ssh-key
Step 3: Ensure that your computer uses the SSH keys you have generated for CPanel
You will need to tell your OS where the SSH key pair is located. Luckily this is not a new problem; see an exhaustive discussion of this topic here: https://serverfault.com/questions/194567/how-do-i-tell-git-for-windows-where-to-find-my-private-rsa-key
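For example, with Git Bash you can also point SSH at the key explicitly on each call (a sketch; the key path, username and host are placeholders):
ssh -i ~/.ssh/id_rsa myusername@mydomain.com  # key path, user and host are placeholders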
Step 4: Test your SSH
Run the following command in your SSH-capable CLI:
ssh <username>@<host>
If there is no error, then you have successfully tested SSH and you are almost ready to proceed further.
Step 5: Upload your Websocket script
You can do this via FTP, as you already know, but you can also do it via SCP. scp not only uses your newly created SSH connection, it is also secure. Syntax:
scp file.txt remote_username@10.10.0.2:/remote/directory
Step 6: SSH to the server
See Step #4.
Step 7: Navigate to your file's location
Step 8: Ensure that you have the rights to run it
See more here: https://alligator.io/workflow/command-line-basics-file-permissions/
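As a sketch, if you want to execute the script directly as ./websocket_daemon.php (rather than via php websocket_daemon.php), the very first line of the file needs a PHP shebang (the interpreter path is an assumption):
#!/usr/bin/env php
and the file needs the execute bit set:
chmod +x websocket_daemon.php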
Step 9: Execute the file
Run
./websocket_daemon.php
If this succeeded, then the job is basically done. You will need some script to run it on startup and to manage it, but that is not strictly related to the question.
https://oracle-base.com/articles/linux/linux-scripts-running-in-the-background
https://smallbusiness.chron.com/run-command-startup-linux-27796.html
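For example, a quick way to keep the daemon alive after you log out (a rough sketch; a proper init script or process manager is the better long-term option) would be:
nohup ./websocket_daemon.php > websocket.log 2>&1 &  # log file name is just an example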
However, if the issue is not yet resolved, read further.
Step 10: Ensuring WS support on server
You will need to set up your own WS support. Since you have managed to do so locally on your Windows, hopefully your know-how will work on the remote Linux as well. If not, read more here:
PHP Websocket server in Linux hosting
I'm a LAMP guy, and I have now started learning WebSockets via Ratchet. So far so good following the startup docs here, and hence I'm able to run the Ratchet server like this:
$ php server.php
And then my JavaScript clients can connect to it, etc.
But..
As a LAMP guy, I'm very used to having Apache (or) Nginx as the "server" that serves any PHP files to the public. Now... should I just run that command above in my terminal, and that's going to be the Ratchet server?
Is there a way NOT to run the server like that? (Or) Is there a way to let Apache (as an example) manage the Ratchet server? Which means, let Apache start/stop Ratchet whenever I type:
$ service httpd start
$ service httpd stop
I'm more confident that way. Plus, the SSL handling etc. would then also be handled by Apache more easily. Am I right?
Please kindly advise, as I'm very new to this area. Thanks all :)
You are indeed right that running it on the command line is not a production-ready solution.
On the last page of the tutorial (deployment) there are some ways to do it. For example, how to set it up under Supervisor is explained in full there.
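For reference, a minimal Supervisor program entry could look roughly like this (a sketch; the program name, PHP path, script path and user are placeholders):
[program:ratchet]
; placeholder paths - adjust to where your PHP binary and server.php actually live
command=/usr/bin/php /var/www/myapp/server.php
autostart=true
autorestart=true
user=www-data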
If you don't like using Supervisor, then you could just write a shell script that is executed on startup and starts server.php (a worse solution, yet easier).
The SSL part you want is possible using a proxy with Apache.
If you are using the Apache web server (2.4 or above), enable these modules in your httpd.conf file:
mod_proxy.so
mod_proxy_wstunnel.so
Add this setting to your httpd.conf file:
ProxyPass /wss2/ ws://ratchet.mydomain.org:8888/
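With that proxy in place, a browser client connects through the /wss2/ path on the Apache-served domain instead of hitting port 8888 directly (a sketch; the domain comes from the example above and assumes Apache terminates SSL):
// hypothetical client-side connection through the Apache proxy path
var conn = new WebSocket('wss://ratchet.mydomain.org/wss2/');
conn.onopen = function () { conn.send('hello'); };
conn.onmessage = function (e) { console.log(e.data); };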
If you have any more questions please let me know.
Is it possible to serve my web application from another server than the one provided in Cloud9?
For example: I would like to run different applications (PHP, Node.js; not sure what's possible yet) with nginx as the backend server (i) and/or as a reverse proxy (ii), to try different scenarios and configuration options.
Is it possible to run nginx and serve content to the outside world in Cloud9?
Is it possible to have nginx as a reverse proxy in Cloud9?
EDIT:
Here they write:
$PORT is exposed to the outside: When you run an application which listens on the port specified in the environment variable $PORT, you can access this application using the http://projectname.username.c9.io URL scheme. The proxy expects the server on that port to be a HTTP server. Other protocols are not supported.
This leads me to believe that if I started nginx on the port given by $PORT, it would be accessible via the specified URL scheme. Can anyone confirm? Maybe someone has tried this and can share some time-saving tips. Thanks.
I know this might be a late reply, but it might be helpful for those who are wondering how to do the same.
Short answer
I've created a repository to hold all the configuration needed for the process. Just run a command and nginx and PHP-FPM will be serving and accessible from the internet.
GitHub repo: https://github.com/GabrielGil/c9-lemp
Explanation
Basically, to run nginx in a Cloud9 environment, as you noted, you just have to make it listen on port 8080. You can either edit the default site in /etc/nginx/sites-available or create and enable your own (that's what the script above does).
Then, in order to run PHP scripts through PHP-FPM with nginx, you need to configure some permissions and the socket on the web server. By default, Cloud9 runs as ubuntu:ubuntu and the web server as www-data:www-data.
The script above also makes these changes for you.
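For reference, a minimal site configuration listening on port 8080 could look roughly like this (a sketch; the workspace root and the PHP-FPM socket path are assumptions that vary per setup):
server {
    listen 8080;
    # placeholder paths - adjust to your workspace root and PHP-FPM socket
    root /home/ubuntu/workspace;
    index index.php index.html;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }
}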
Hope this helps you, or other users in similar situations.
You can run nginx on a normal Cloud9 workspace, as long as it listens on port 8080 (the value of $PORT). The URL scheme to reach your server would be http://projectname-username.c9.io, however. Please refer to docs.c9.io for more up-to-date help on running applications.
One other thing you can do, if you have another server where you would like to host your software, is to create an SSH workspace (https://docs.c9.io/ssh_workspaces.html). That way, you can connect Cloud9 to an external server directly.
How does one do team development with node.js when all the developers develop on the same dev machine?
Right now the dev server setup has nginx and Apache. Developers SSH into the dev server and have their own subdomained sandboxes to work in (the database is shared). They hack on their code and check it into the SVN repo. Great, works fine... until we started using Node.js.
It seems Node is not like Apache or nginx, where there's an independent server that serves up code. In Node, the server AND the app code are tied together, so each developer needs to start and stop the server when changes are made. This creates a problem: once one instance is started, it blocks the port for other developers.
I'm also having trouble figuring out how to put the node code into the same SVN repository as the PHP app code.
A friend told me the developers can do "timesharing" where the node code can only be modified by someone in a specific timeframe. Not sure if this process is scalable.
Another option is to have everyone work locally on their own computer with a VM copy of the dev server, so they can develop independently of the dev server. This requires a lot of infrastructure change and I'm not ready to do that yet.
Any suggestions on how to do this with the current shared dev environment setup?
Also, the reason why we are using Node.js is to have good Comet support. But if this is becoming a blocker for our current infrastructure, I'm willing to try other technologies and servers that work the way nginx or Apache do, so that the server is independent of the app code and can be compatible with our current development environment.
PS. I tried the nginx HTTP push module. It's not well maintained and doesn't get many updates. I'm scared to use it in production.
You could have each developer's instance of Node.js running on a different port.
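For example, each developer's instance could read its port from an environment variable so the instances don't collide (a sketch; the variable name PORT and the fallback 3001 are placeholders):
// server.js - each developer exports their own PORT before starting the app
var http = require('http');
var port = process.env.PORT || 3001; // fallback port is just an example

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Node instance on port ' + port + '\n');
}).listen(port);

console.log('Listening on port ' + port);
One developer would then run PORT=3001 node server.js, another PORT=3002 node server.js, and the front-end proxy (nginx or Apache) can map each sandbox subdomain to its own port.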
I can't figure out how I should access the repository from a CakePHP project called fredistrano (it lets you do CakePHP deploys with a Web 2.0 interface). I have fredistrano in my web broadcasting directory on a shared Unix web server. When I use TortoiseSVN from my laptop, I have to use svn+ssh://username@domain.com/svnpath/trunk/. I tried using the same thing in fredistrano, but I keep getting the svn command error "svn: Network connection closed unexpectedly".
I copied and pasted the command svn export --non-interactive --username myusername --password mypwd svn+ssh://myusername@mydomain.com/home/myusername/svn/mydomain.com/trunk tmpDir 2>&1 into my SSH terminal connected to the shared server, and I get a prompt for a password, which I believe is actually a prompt for the SSH password and not the SVN password (see this post). fredistrano is failing because it can't deal with the SSH password prompt.
I noticed in the fredistrano documentation that the example uses http://ipaddress/svn/test for the SVN URL. I copied my SVN repository to my web broadcasting directory and tried this, but I get a connection refused error. My shared hosting provider is pretty strict and I doubt that I can use that.
Is there a way I can get svn+ssh to work with a PHP script like this (fredistrano just uses shell_exec() to execute svn commands)? Or is there a way I can get plain svn, http, or https working (or any other method that I don't know about)?
I am interested in this problem, too, and I hope that I'm close to the solution.
I haven't tried to put it to work in my application due to lack of time and other higher-priority tasks, but I guess it should look something like this:
// open the svn process with pipes so the script can talk to it
$descriptors = array(0 => array('pipe', 'r'), 1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
$process = proc_open('svn something svn+ssh://...', $descriptors, $pipes);
$response = trim(fgets($pipes[1]));
// then check if the response contains the password prompt text
if (strpos($response, 'password') !== false) {
    fwrite($pipes[0], "yourpassword\n");
}
// analyze the next response and see if SVN has returned the requested information - log, info, whatever
"svn: Network connection closed unexpectedly" most probably means that your host has restricted/forbidden access to other hosts. This might imply using sockets at all (SVN, HTTP, etc.) or maybe only non-HTTP. In this case you should try setting up your SVN server to allow HTTP requests (e.g. using mod_dav_svn for Apache).
This is only a guess - see my comment to your question.
How do you authenticate from your dev machine to the SVN server? You might be using a key to authenticate (do you have PuTTY's Pageant running?).
Maybe check out the Subversion PHP module (1.0.3) instead of wrapping shell_exec; it requires building from source, with phpize, ./configure and make (I just built it against PHP 5.6 and Subversion 1.9.5)... while the Apache module mod_dav (Subversion via HTTP/HTTPS) is not required for version control, it is rather an optional method of accessing the repository.