I have downloaded an SSL certificate that I received from app.zerossl.com and placed it in the same directory as my main Node script, and have used this code to install it.
var fs = require('fs');
let options = {
    cert: fs.readFileSync(__dirname + '/certificate.crt'),
    ca: fs.readFileSync(__dirname + '/ca_bundle.crt'),
    key: fs.readFileSync(__dirname + '/private.key')
};
My configuration for running the server is the following:
var express = require('express');
var app = express();
var server = require('https').createServer(options, app);
var io = require('socket.io')(server);
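(Not shown above: the server is also started with a listen call, roughly like the following; the port number here is just an example.)
server.listen(443, function () {
    console.log('HTTPS server listening');
});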
Now I'm running MySQL and PHP on XAMPP with the port set to 1337. In my modem I've set the DMZ to my computer's (the server's) internal IP address. When I try to access my domain over the internet it comes up with an error ("didn't send any data", ERR_EMPTY_RESPONSE), presumably from my Node.js server.
When I go to the HTTPS version of my website from the address bar, it comes up with a warning and then redirects to my XAMPP server. The port isn't specified in the address bar, so I'm not sure why it's redirecting to the XAMPP server.
Why isn't my SSL working, and why does it redirect to my XAMPP server instead of using the Node.js server when I type https?
Since you are running two servers behind one public entry point, you will need something like nginx to be able to reach both from your external IP.
XAMPP is probably taking priority over the Express server, which is why traffic ends up there.
NOTE: If you are dealing with HTTPS, make sure to add a router rule that forwards port 443.
Here are some docs on how to host two servers behind one entry point. The example covers two websites, but you can adapt it to one website plus one backend server, since it is specifically about routing to different ports.
NOTE: You can skip server_name and just give each configuration its own port to forward. That way one block can forward to 1337 for your XAMPP, and another port can go to your Express server.
https://webdock.io/en/docs/how-guides/how-configure-nginx-to-serve-multiple-websites-single-vps#:~:text=If%20you%20are%20using%20a,to%20host%20all%20your%20domains.
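As a rough sketch, the nginx side would be two server blocks, each listening on its own port and proxying to one of your backends. The Node port (3000) and the certificate paths below are assumptions; adjust them to your setup:
server {
    listen 80;
    # forward plain HTTP traffic to XAMPP
    location / {
        proxy_pass http://127.0.0.1:1337;
    }
}
server {
    listen 443 ssl;
    ssl_certificate /path/to/certificate.crt;
    ssl_certificate_key /path/to/private.key;
    # forward HTTPS traffic to the Express/socket.io server
    location / {
        proxy_pass http://127.0.0.1:3000;
        # needed so socket.io websocket upgrades pass through
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}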
I built a small chat using PHP and Node.js (using the socket.io library).
Essentially I use Node.js for the chat server and PHP to handle the actual web pages.
Vagrant has the option to share your HTTP server with users: https://www.vagrantup.com/docs/share/http.html
The HTTP server is running on port 80 and Node.js is running on port 3000.
On the chat.php page, I have this line of code:
socket = io.connect("http://localhost:3000");
When I execute the vagrant share command, it provides a URL which you can provide to other people and they will be able to access the site.
So given that URL, I edit the line of code mentioned above to include that URL:
socket = io.connect("http://ugly-elk-1232.vagrantshare.com:3000");
and then I SSH into the Vagrant box and start Node from there.
However, it doesn't work. On the chat page I can see timeout errors when socket.io tries to reach port 3000.
Here's the error I get in the console (in Chrome):
GET http://ugly-elk-1232.vagrantshare.com:3000/socket.io/?EIO=3&transport=polling&t=Lcp9sZh net::ERR_CONNECTION_TIMED_OUT
(the URL is random and will change every time I run vagrant share, but I always update it on the chat page)
Here's what's in my vagrantfile:
# -*- mode: ruby -*-
# vi: set ft=ruby :
Vagrant.configure("2") do |config|
config.vm.box = "scotch/box"
config.vm.network "private_network", ip: "192.168.33.10"
config.vm.hostname = "scotchbox"
config.vm.synced_folder ".", "/var/www", :mount_options => ["dmode=777", "fmode=666"]
config.vm.network "forwarded_port", guest: 3306, host: 3306
# Optional NFS. Make sure to remove other synced_folder line too
#config.vm.synced_folder ".", "/var/www", :nfs => { :mount_options => ["dmode=777","fmode=666"] }
end
Is there any way to make this work and allow port 3000 to be shared and not just port 80?
To be clear, the actual web pages are served perfectly fine. It's just that Node is not accessible when I use vagrant share.
Edit:
I managed to somewhat solve the problem.
I use vagrant share --http 80 in one window and vagrant share --http 3000 in another.
Then I change the URL so that it connects to the node server that was shared on port 3000.
So the code then looks like this:
socket = io.connect("http://abc123.vagrantshare.com");
Including :3000 (the port) in the URL stops it from working (I'm not sure why, but I don't think this is the issue).
The problem now is that socket.io resorts to polling and no longer uses websockets. I tried to force it to use websockets, but it gives a 400 Bad Request every time it tries. Polling isn't necessarily bad and it works, but I wanted websockets, since I need to test how the site will behave when it's actually deployed and websockets are what will be used in that case.
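(For reference, the way I try to force websockets on the client looks something like this, with the share URL from above:)
socket = io.connect("http://abc123.vagrantshare.com", { transports: ['websocket'] });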
Because of
socket = io.connect("http://localhost:3000");
you're connecting only locally, so the app works only from inside your VM (it does not even work if you try to access it from your host machine).
If you want to share the app directly on port 3000, you should be able to run vagrant share as:
vagrant share --http 3000
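On the client you would then connect to the share URL without appending the port (the hostname below is just a placeholder for whatever vagrant share prints):
socket = io.connect("http://abc123.vagrantshare.com");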
I have made a chat with Socket.IO that works well locally, but I'm trying to deploy it on my Apache server.
I'm using PHP + Node.js, not only Node.
I had an ERR_CONNECTION_REFUSED error but solved it by opening the right port with Listen 8000 in my ports.conf.
My server.js looks like this:
var io = require('socket.io').listen(8000);
// Socket IO usage
My client.js is
var socket = io.connect('http://[MY SERVER IP]:8000');
// Other client code
I use localhost when running locally, but here I changed it to my server's IP.
But I still get this error:
XMLHttpRequest cannot load http://[MY_SERVER_IP]:8000/socket.io/?EIO=3&transport=polling&t=1455101301883-60. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'MY DOMAIN' is therefore not allowed access. The response had HTTP status code 404.
I really don't know what to do; this is the first time I've tried to deploy a Node.js + PHP app.
I have read some Stack Overflow questions to try to fix it, but I really don't know how.
I don't know if this is due to my Apache config or if I must change some Node.js or Socket.IO config.
Thanks for your help.
I found the problem.
Do not open the Node.js server's port in the Apache config. It creates conflicts and prevents your Node server from working.
Just change localhost to your server's IP and it will work fine.
I am working on an application with Node.js and PHP along with MySQL.
PHP is the frontend here, Node.js is the backend.
I created a form, and from that form I send an AJAX request to Node.js, like:
url: http://example.com:8124/sign_in
which was working fine.
The problem is that when I enabled SSL on Apache, I became unable to send requests to Node. It gives me an error like:
Cross-origin policy, load unsafe content
How do I resolve this issue?
Thanks
That's not a problem with PHP but rather with JavaScript (AJAX). It's because you're trying to load content from a server that doesn't use SSL from a web page served via SSL (mixed content).
Simply enable SSL also on the Node app and it will work.
Edit
I do not recommend writing the proxy server in Node.js itself.
It is, however, a good idea to put a proxy server in front of every Node.js app. Indeed, for safety reasons most websites built with Node.js have an Nginx reverse proxy in front. That is: users connect to Nginx (chosen over Apache for its much better performance) and Nginx makes the request to the Node.js app.
With this setup, you would not actually need to enable SSL in Node.js, as long as Nginx has SSL enabled.
To use SSL directly in Node.js, you need to add just a couple of lines to your app.js file. See this SO question: How do I setup a SSL certs for an express.js server?
If the servers are on the same hostname (just a different port), then you won't need another SSL certificate; if the servers are on a different hostname (e.g. a subdomain) and your SSL certificate isn't a wildcard one, then you will need another certificate.
Speaking of the port: it's true that HTTPS runs on 443 by default, but you're free to change it as you want. Just remember to specify it in the URL, for example: https://example.com:8443/
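For reference, a minimal sketch of those couple of lines (the file names are placeholders for your own certificate, chain and key, and the port is just an example):
var fs = require('fs');
var https = require('https');
var express = require('express');

var app = express();

var options = {
    key: fs.readFileSync('private.key'),       // your private key
    cert: fs.readFileSync('certificate.crt'),  // your certificate
    ca: fs.readFileSync('ca_bundle.crt')       // your CA chain
};

https.createServer(options, app).listen(8443);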
A simple way to enable SSL for Node is to use a proxy in front of your application:
var fs = require('fs');
var httpProxy = require('http-proxy');
var privateKey = fs.readFileSync('key.pem').toString();
var certificate = fs.readFileSync('cert.pem').toString();
var chainCertificate = fs.readFileSync('ca.pem').toString();
httpProxy.createServer({
    target: {
        host: 'localhost',
        port: ...your application port...
    },
    ssl: {
        key: privateKey,
        cert: certificate,
        ca: chainCertificate
    }
}).listen(...the port for ssl...);
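With this setup the Node application itself keeps listening on plain HTTP on its internal port, and the proxy terminates SSL on the public port, so clients connect via https:// to the SSL port.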
I'm trying to create a live web server on my Windows 8.1 computer.
I am connected directly to my modem using Ethernet (I do have a wireless router, but I am not connected to it on this computer, a desktop).
I have XAMPP working and my website appears at http://localhost/home
However, if I put in my IP from www.whatismyip.com it does not load my web server.
What am I missing?
You need to create a port forwarding rule for port 80 to your computer's local IP address. There should be an admin panel for your router (normally at the gateway address - check it with Start - Run - cmd, then enter "ipconfig" and look at the Gateway).
I also guess that in XAMPP the internet access is blocked. XAMPP is just a plain Apache server, so you need to open its config file (should be: c:\xampp\apache\conf\extra\httpd-xampp.conf).
Search in that file for a "Deny from all" line and add a # in front to deactivate that rule.
Restart your XAMPP and it should work.
Here you can find more information about port forwarding:
https://managewp.com/how-to-access-a-local-website-from-internet-with-port-forwarding
And maybe you will also need to open port 80 on your firewall (depends on your configuration).
You need a way of telling requests sent to your public IP to be forwarded to the private IP of the web server. Try logging into your device (router etc.) and setting this up.
I am currently trying to connect to my Elasticsearch cluster using the PHP Elasticsearch client.
I am having trouble using an HTTPS endpoint for this. I have my cluster behind a load balancer with a VIP in front; it uses Apache authentication and is on port 443. The trouble I am running into is that the client's config seems to parse the hosts and remove https:// from the host name, so the client always tries to connect over port 80. I have tried adding :443 to the host name, but then I get a curl error, "empty reply from server". I know that this server has access (no firewall blocking) because I can manually make the curl call using https://myelasticsearch.com.
My question is: is there a way to specify the protocol to make the request over using this client? If not, where in the source does the parsing of the host array happen?
I have found a temporary solution: in src/Elasticsearch/Connections/AbstractConnection.php there is a transportSchema variable that is set to http. I changed this to https, also added :443 to my host in the config, and it works!
Just as an update to this question (in case anyone stumbles into it), this bug was fixed in Elasticsearch-PHP v1.1.0. You can specify https in the host now to use SSL:
$params = array();
$params['hosts'] = array(
    'https://localhost',        // SSL to localhost
    'https://192.168.1.3:9200'  // SSL to IP + port
);
$client = new Elasticsearch\Client($params);