I'm trying to send a POST request, but it is never sent and I get no output in the browser console.
My Node server.js is running at x.x.x.x:8000, and I then open my client at x.x.x.x:8000/client.html.
Here is my Node.js server:
var fs = require('fs');
var path = require('path');

function handler(req, res) {
    var filepath = '.' + req.url;
    if (filepath == './')
        filepath = './client.html';
    var extname = path.extname(filepath);
    var contentType = 'text/html';
    switch (extname) {
        case '.js':
            contentType = 'text/javascript';
            break;
        case '.css':
            contentType = 'text/css';
            break;
    }
    path.exists(filepath, function (exists) {
        if (exists) {
            fs.readFile(filepath, function (error, content) {
                if (error) {
                    res.writeHead(500);
                    res.end();
                } else {
                    res.writeHead(200, { 'Content-Type': contentType });
                    res.end(content, 'utf-8');
                }
            });
        } else {
            res.writeHead(404);
            res.end();
        }
    });
}
JavaScript code - I'm using an AJAX call to send the request to COMMAND.php:
$.post(
    '/var/www/COMMAND.php',
    { cmd: "OUT" },
    function (data) {
        console.log(data);
    });
PHP COMMAND.php - this writes to a named pipe on Linux. When it is done writing, it echoes SUCCESS.
<?php
if ($_POST['cmd'] === 'OUT') {
    $con = fopen("/tmp/myFIFO", "w");
    fwrite($con, "OUT\n");
    fclose($con);
    echo "SUCCESS";
    exit;
}
?>
Why is no POST request reaching COMMAND.php? Is there any way for me to call COMMAND.php and have the commands in it executed?
Because Node.js runs JavaScript, not PHP. Also, unlike Apache, which has file handling built in, with Node.js you need to write code (or use an existing library) to route your URLs to files.
As for your question, you can either:
Set up another server to execute that PHP. Your AJAX call goes to your Node.js server, so you could route that request from Node.js to your PHP server (Apache or whatever), basically making Node.js act as a proxy.
Or write code in JavaScript for Node.js that performs the same routine as your PHP script, so you won't need PHP or another server any more; a sketch of this second option follows.
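For the second option, here is a minimal sketch (my own code, not tested against your setup) of handling that POST inside the Node.js server itself and writing to the FIFO directly; the '/command' route name and the use of fs.appendFile are my own choices, and it assumes /tmp/myFIFO already exists and has a reader attached (opening a FIFO for writing waits until something reads from it, just like PHP's fopen):
var fs = require('fs');
var http = require('http');
var querystring = require('querystring');

http.createServer(function (req, res) {
    if (req.method === 'POST' && req.url === '/command') {
        var body = '';
        req.on('data', function (chunk) { body += chunk; });
        req.on('end', function () {
            var fields = querystring.parse(body); // { cmd: "OUT" } from the jQuery $.post
            if (fields.cmd === 'OUT') {
                // Write "OUT\n" to the FIFO; like the fopen()/fwrite() in COMMAND.php,
                // this only completes once a reader is attached to the pipe.
                fs.appendFile('/tmp/myFIFO', 'OUT\n', function (err) {
                    res.writeHead(err ? 500 : 200, { 'Content-Type': 'text/plain' });
                    res.end(err ? 'ERROR' : 'SUCCESS');
                });
            } else {
                res.writeHead(400);
                res.end();
            }
        });
        return;
    }
    // anything else falls through to the static file handler shown above
    handler(req, res);
}).listen(8000);
The client would then call $.post('/command', { cmd: "OUT" }, ...) with a URL on the Node.js server rather than the filesystem path '/var/www/COMMAND.php'.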
Related
I have a PHP file on the server which has a function, and I also have a Node.js API. I want to pass a value from Node.js to the PHP script and then get the function's output back in Node.js.
I tried this using cookie-parser, as suggested by Christian here, but it does not work.
PHP script
<?php
$max = $_COOKIE["usrMob"]; // Taken from cookie
$min = 1111;
$number = mt_rand($min, $max); // Find random number
echo $number; // Send back to Node Js
?>
Node.js
const express = require("express");
const cookieParser = require('cookie-parser');
const app = express();
app.use(cookieParser('Your Secret'));

router.get('/cookie', function (req, res) {
    // Set cookie
    res.cookie('userMax', '46556'); // options is optional
    res.end();
    console.log("Cookie is : " + res.cookie);
});
Short answer
Sharing cookies won't work here because of CORS: your Node.js server would have to be in the allowed-origin list of the PHP server.
Long answer
Cookies are mostly used to store user settings, tokens, passwords or other sensitive data in the browser, so that the browsing experience can follow the user's decisions.
Therefore they are not sent along with requests when different servers communicate with each other, unless they are explicitly allowed to leave to an 'authorized origin'; otherwise that would be a major leak of data through cookies. Say hello to CORS (unless you don't own the target server).
Example:
You have a script on a TargetServer (TS) that sets a cookie there when the user does some stuff. After the user finishes with your script, you want to send data back to YourServer (YS). When the AJAX call fires, cookies won't be sent with the request the way you normally see when developing on localhost.
Given your stack of tools, another problem arises: each request you make to YS will generate a new id/session (I'm looking at you, PHPSESSID), which means you won't know, for example, whether the user is logged in or not, even though you know for sure they logged in earlier (yes, they are logged in, but in another session file...).
HOW TO TACKLE THIS PROBLEM:
Find an appropriate mechanism for encrypting/decrypting strings that both your script and the PHP side know.
When you're sending a request from TS to YS, add a custom header that YS expects, e.g. REQUEST-CUSTOM-HEADER: encodedVersionOf('hey-give-me-the-session-id'). PHP will see the incoming header, decode it, hit a special if branch, and send you a response with a different header, RESPONSE-CUSTOM-HEADER: encodedVersionOf('here-is-the-session-id'). Your script then saves it in a cookie so you won't have to request it again, and simply appends it to the headers of future requests.
If PHP recognizes the incoming string as a valid session, it can load that session, the one you know already holds your data, with session_id($incoming_id); make sure to call session_id() before session_start().
I highly advise using JWT for this kind of thing, or some encrypted JSON string, so you can pass around an object like {session_id: '12idn3oind', userInfo: {name: 'test'}}.
Exchanging data through headers is the next best thing when CORS is involved.
I tackled this once; it wasn't pretty to do, but it was worth it in the end.
You can send data to and receive data from PHP; the only catch is that you should use headers so you don't interfere with PHP's output.
Since you own both servers you can do something like:
MOST IMPORTANT:
npm install -S express
Make sure you have enabled headers_module/mod_headers on your webserver.
We will use custom headers so you should allow & expose them:
.htaccess
Header add Access-Control-Allow-Headers "node-request, node-response"
Header add Access-Control-Allow-Methods "PUT, GET, POST, DELETE, OPTIONS"
Header add Access-Control-Expose-Headers "node-request, node-response"
Header add Access-Control-Allow-Origin "*"
PHP
<?php
$max = #$_COOKIE["usrMob"]; // Taken from cookie
$min = 1111;
$number = rand($min, $max); // Find random number
echo $number; // Send back to Node Js

if (isset($_SERVER['HTTP_NODE_REQUEST'])) {
    $req = json_decode($_SERVER['HTTP_NODE_REQUEST'], true);
    $data = array();
    // 'givemeanumber' is sent from Node.js server
    if (isset($req['givemeanumber'])) {
        $data = array(
            'number' => $number
        );
    }
    header('NODE-RESPONSE: ' . json_encode(array("req" => $req, "res" => $data)));
}
?>
Node.js
Don't forget to change this line to point to your PHP server:
getFromPHP('localhost', '9999', '/path-to-php-script', {givemeanumber: 1})
index.js
const express = require("express");
const app = express();
const port = 9999;
const { getFromPHP } = require('./middleware.js');

const apachePHPconfig = {
    host: 'localhost',
    port: 80,
    urlpath: 'path-to-php-script'
};

app.get(
    '/',
    getFromPHP(apachePHPconfig.host, apachePHPconfig.port, apachePHPconfig.urlpath, {givemeanumber: 1}),
    function (req, res) {
        // here is your php object
        console.log('php', req.php);
        res.end();
    });

app.listen(port, () => {
    console.clear();
    console.log(`Example app listening on port ${port}!`);
});
middleware.js
/**
 * Middleware to get data from PHP
 */
const getFromPHP = (phpHost, phpPort, phpPath, phpObject) => {
    if (typeof phpHost === 'undefined') {
        throw new Error('phpHost was not defined');
    }
    if (typeof phpPort === 'undefined') {
        throw new Error('phpPort was not defined');
    }
    if (typeof phpPath === 'undefined') {
        phpPath = '/';
    }
    if (typeof phpObject !== 'object') {
        phpObject = {};
    }
    return (req, res, next) => {
        if (typeof req.php === 'undefined') {
            req.php = {};
        }
        const options = {
            hostname: phpHost, // change this to your php server host
            port: phpPort, // change this with your php server port
            path: phpPath, // change this with your php server path to script
            method: 'POST',
            headers: {
                // here we send 'NODE-REQUEST'; it will be available in PHP under the $_SERVER global,
                // prefixed with HTTP_, because it is a custom client request header.
                'NODE-REQUEST': JSON.stringify(phpObject)
            }
        };
        const isJSON = (str) => {
            try {
                let j = JSON.parse(str);
                return typeof j === 'object' && j !== null;
            } catch (e) {
                return false;
            }
        };
        const httpModule = require('http');
        let reqHttp = httpModule.request(options, (response) => {
            if (typeof response.headers['node-response'] === 'undefined' || !isJSON(response.headers['node-response'])) {
                req.php = {};
            } else {
                req.php = JSON.parse(response.headers['node-response']);
            }
            // START - Remove this code when everything runs as expected
            let dataStack = [];
            response.on('data', (data) => {
                dataStack.push(data.toString());
            });
            response.on('end', () => {
                console.log("PHP HEADERS", response.headers);
                console.log('PHP OUTPUT', dataStack.join(''));
            });
            // END
            next();
        });
        reqHttp.on('error', (e) => {
            console.error(`problem with request to php server: ${e.message}`);
            next();
        });
        reqHttp.on('end', () => {
            next();
        });
        reqHttp.end();
    };
};
exports.getFromPHP = getFromPHP;
I'm using Node.js (on AWS Lambda, for an Alexa skill) to request a JSON file from my web server, but the server responds with an HTML error page saying JavaScript is not supported.
This is my code:
var http = require('http');

function httpsGet(myData, callback) {
    // Update these options with the details of the web service you would like to call
    var options = {
        host: 'alexa0788.byethost24.com',
        port: 80,
        path: '/sample.php',
        method: 'GET',
    };
    var req = http.request(options, res => {
        res.setEncoding('utf8');
        var returnData = "";
        res.on('data', chunk => {
            returnData = returnData + chunk;
        });
        res.on('end', () => {
            console.log(returnData);
            var parsed = JSON.parse(returnData);
            var pop = parsed.firstName + parsed.lastName;
            callback(pop); // this will execute whatever function the caller defined, with one argument
        });
    });
    req.end();
}
How can I make my server respond with the intended JSON and not force the client to support JavaScript? At the moment I'm having a PHP file output the JSON. I also tried calling a .json file directly, instead of having a PHP file output the JSON, but I see the same error.
For the backend of my site, which is visible only to a few people, I have a system whereby I communicate with a PHP script via AJAX, like so:
function ajax(url, opts) {
    var progress = false, all_responses = [], previousResponseLength = "";
    var ajaxOptions = {
        dataType: "json",
        type: "POST",
        url: url,
        xhrFields: {
            onprogress: function(e) {
                if (!e.target.responseText.endsWith("\n")) return;
                var response = e.target.responseText.substring(previousResponseLength).trim();
                previousResponseLength = e.target.responseText.length;
                var responses = response.split(/[\r\n]+/g);
                var last_response;
                for (var k in responses) {
                    if (responses[k] === "---START PROGRESS---") {
                        progress = true;
                        if (opts.onProgressInit) opts.onProgressInit();
                    } else if (responses[k] === "---END PROGRESS---") progress = false;
                    else all_responses.push(last_response = responses[k]);
                }
                if (progress && last_response !== undefined) opts.onProgress(JSON.parse(all_responses[all_responses.length-1]));
            }
        },
        dataFilter: function(data){
            return all_responses[all_responses.length-1];
        }
    };
    $.extend(ajaxOptions, {
        onProgress: function(data){
            console.log(data);
        }
    });
    return $.ajax(ajaxOptions);
}
And an example of a never-ending php script (until the user closes the connection):
const AJAX_START_PROGRESS = "---START PROGRESS---";
const AJAX_END_PROGRESS = "---END PROGRESS---";
session_write_close(); //fixes problem of stalling entire php environment while script runs
set_time_limit(0); //allows to the script to run indefinitely
output(AJAX_START_PROGRESS);
while(true) {
output(json_encode(["asdasd" => "asasdas"]));
sleep(1);
}
function output($msg) {
echo preg_replace("`[\r\n]+`", "", $msg).PHP_EOL;
ob_flush(); flush();
}
This allows me, through one AJAX request, to 'poll' the script (am I using that term correctly?).
So if I want to execute a very long PHP script, I can now check its progress, and the last response is delivered via jqXHR.done(callback).
Or, as in the example PHP script, I can open a connection and leave it open; using sleep(1), it pushes an update to the $.ajax object every second.
Every response has to be JSON encoded, and if a response is one very long JSON string that arrives over multiple 'onprogress' calls, the code waits until the end of the message (if responseText.endsWith("\n"), we're ready).
My remote shared server doesn't allow WebSockets, so I made this. If the user closes the connection, so does the PHP script.
It only has to work for a few admins with special privileges, and I don't need to worry about old browsers.
Can anyone see anything wrong with this script? Through googling I haven't found anybody else using this kind of method, so I expect something is wrong with it.
Extensive testing tells me it works just fine.
You invented the long-polling request; it's actually widely used as a fallback to WebSockets, so there is nothing wrong with the approach.
About your code, it's hard to say without testing, but when using methods such as long polling you need to double-check for memory leaks on both the browser side and the server side; see the sketch below for the kind of thing to watch on the browser side.
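As a minimal, untested sketch of what that can mean on the browser side (the cap value and helper name are my own, hypothetical choices), you could bound the all_responses buffer in the ajax() wrapper instead of letting it grow for the lifetime of the request:
var MAX_KEPT_RESPONSES = 50; // hypothetical cap on buffered progress lines

function pushResponse(all_responses, response) {
    all_responses.push(response);
    // drop the oldest entries so the array cannot grow without bound
    if (all_responses.length > MAX_KEPT_RESPONSES) {
        all_responses.splice(0, all_responses.length - MAX_KEPT_RESPONSES);
    }
}
Inside the onprogress handler, all_responses.push(last_response = responses[k]) would then become pushResponse(all_responses, last_response = responses[k]). Note that XMLHttpRequest itself keeps the full responseText in memory for the lifetime of the request, so a truly never-ending connection eventually has to be closed and reopened regardless.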
This is probably a simple mistake. I have a Node.js server running Socket.IO, and I have got everything to work within the server.
However, I want to be able to make a cURL POST via PHP to the Node server and have it emit the data. I can make the server receive the request, but when I try to emit the data I get an error saying that the socket is not defined.
This is obvious in my code. My question is, how do I require socket.io before I set up the server? Hopefully a segment of my code will help explain my problem:
var http = require('http')
  , url = require('url')
  , fs = require('fs')
  , server;

server = http.createServer(function(req, res) {
    // your normal server code
    var path = url.parse(req.url).pathname;
    switch (path) {
        case '/':
            res.writeHead(200, {'Content-Type': 'text/html'});
            res.write('<h1>Hello! Enter</h1>');
            res.end();
            break;
        case '/index.html':
            fs.readFile(__dirname + path, function(err, data){
                if (err) return send404(res);
                res.writeHead(200, {'Content-Type': path == 'json.js' ? 'text/javascript' : 'text/html'})
                res.write(data, 'utf8');
                res.end();
            });
            break;
        case '/write.html':
            fs.readFile(__dirname + path, function(err, data){
                if (err) return send404(res);
                res.write(data, 'utf8');
                res.end();
            });
            break;
        case '/post':
            console.log(req.toString());
            socket.broadcast.emit('user', {state: 'PHP Post', userid: 'PHP'});
            res.writeHead(200);
            res.end();
            break;
        default: send404(res);
    }
}),
send404 = function(res){
    res.writeHead(404);
    res.write('404');
    res.end();
};

server.listen(843);

// socket.io
var io = require('socket.io').listen(server);

// Regular socket.io events
how do I require socket.io before I set up the server?
That part is pretty easy; at the top of your file, do something like:
var http = require('http')
  , url = require('url')
  , fs = require('fs')
  , socketIO = require('socket.io') // <-- added line
  , server;
and at the bottom:
var io = socketIO.listen(server);
The problem is, in your POST handler, you're using socket.broadcast.emit, but socket isn't defined. Are you trying to send a message to all Socket.IO users? If so, you can use io.sockets.emit; I'd probably do something like this:
var http = require('http')
  , url = require('url')
  , fs = require('fs')
  , server
  , io;

...

case '/post':
    console.log(req.toString());
    io.sockets.emit('user', {state: 'PHP Post', userid: 'PHP'});
    res.writeHead(200);
    res.end();
    break;

...

io = require('socket.io').listen(server);
If you're trying to send data to a single socket, or every socket except for a particular one (which is how you'd normally use socket.broadcast), you'll somehow need to map HTTP requests to Socket.IO sockets; using sessions for this is common.
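As a rough, hypothetical sketch of that idea (the userid handshake parameter and the sockets map are my own invention, not part of your code), you could keep a lookup of connected sockets and then target a single one from the '/post' case:
var sockets = {}; // userid -> socket, filled in as clients connect

io.sockets.on('connection', function (socket) {
    // hypothetical: the client connects with io.connect(host, { query: 'userid=123' })
    var userid = socket.handshake.query.userid;
    sockets[userid] = socket;
    socket.on('disconnect', function () {
        delete sockets[userid];
    });
});

// ...then inside the '/post' case, once you have worked out which user the PHP POST targets:
// var target = sockets[targetUserId];
// if (target) target.emit('user', { state: 'PHP Post', userid: 'PHP' });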
Here is the PHP documentation
Here is how I would use it in an Ajax call, if I don't find a pure client way to do this.
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
Is there a way to do this client-side instead, so I don't have to AJAX the string over?
You could do the following.
JS code:
$.post('phppage.php', { url: url }, function(data) {
    document.getElementById('somediv').innerHTML = data;
});
PHP code:
$url = $_POST['url'];
echo file_get_contents($url);
That would get you the contents of the url.
It's 2020, so here is a more modern approach:
async function file_get_contents(uri, callback) {
    let res = await fetch(uri),
        ret = await res.text();
    return callback ? callback(ret) : ret; // a Promise() actually.
}
file_get_contents("https://httpbin.org/get", console.log);
// or
file_get_contents("https://httpbin.org/get").then(ret => console.log(ret));
JavaScript cannot go out and scrape data off other pages. It can make a call to a local PHP script that then goes and grabs the data on its behalf, but JavaScript in the browser cannot do this itself.
$.post("/localScript.php", { srcToGet: 'http://example.com' }, function(data){
/* From within here, data is whatever your local script sent back to us */
});
You have options like JSONP and Cross-Origin Resource Sharing at your disposal, but both of those require setting up the other end, so you cannot just choose a domain and start firing off requests for data.
Further Reading: Same origin policy
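For the JSONP option mentioned above, here is a minimal, hypothetical sketch (the endpoint URL and callback name are placeholders); it only works when the remote endpoint is set up to wrap its JSON response in the callback you name:
function jsonp(url, callbackName, onData) {
    // the remote server is expected to respond with: callbackName({...});
    window[callbackName] = function (data) {
        delete window[callbackName];
        document.head.removeChild(script);
        onData(data);
    };
    var script = document.createElement('script');
    script.src = url + (url.indexOf('?') === -1 ? '?' : '&') + 'callback=' + callbackName;
    document.head.appendChild(script);
}

// Hypothetical usage, assuming the endpoint supports a "callback" query parameter:
jsonp('https://api.example.com/data', 'handleData', function (data) {
    console.log(data);
});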
This function will return the file as a string just like the PHP file_get_contents().
function file_get_contents(uri, callback) {
    fetch(uri).then(res => res.text()).then(text => callback(text));
}
However, unlike PHP, JavaScript will go on to the next statement rather than waiting for the data to return.
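To illustrate that point (the URL below is just a placeholder), the statement after the call runs before the fetched text arrives:
file_get_contents("https://example.com/data.txt", function (text) {
    console.log("got the contents:", text.length, "characters"); // runs second, when the response arrives
});
console.log("this line runs first, before the contents arrive");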
Not in a general sense. Cross-domain restrictions disallow Javascript code from doing this.
If the target site has CORS (cross-origin resource sharing) set up, you can use XMLHttpRequest to load files. Most sites do not, as it's off by default for security reasons, and is rarely necessary.
If you just need to include an HTML page, you can stick it in an <iframe> element. This is subject to some layout gotchas, though (the page ends up in a fixed-size element).
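For the CORS case mentioned above, here is a minimal XMLHttpRequest sketch (the URL is a placeholder); it only succeeds if the target server sends an Access-Control-Allow-Origin header that matches your page's origin:
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://example.com/some-file.txt');
xhr.onload = function () {
    if (xhr.status === 200) {
        console.log(xhr.responseText); // the file contents, as a string
    }
};
xhr.onerror = function () {
    console.log('Request failed (most likely blocked by the same-origin policy).');
};
xhr.send();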
Or you can use the php.js library, which provides some PHP functions for JavaScript; file_get_contents() is one of them.
<script>
    var data = file_get_contents('Your URL');
</script>
You can find more info about php.js at http://phpjs.org/
I think this may be useful for you:
an npm package with a "file-get-contents" method for Node.js:
https://www.npmjs.com/package/file-get-contents
It is asynchronous, so if you are using Express it should be used like this (paste the example below inside the handler):
app.get('/', async (req, res) => {
    // paste here the code below
});
Example
const fileGetContents = require('file-get-contents');

// A file request
try {
    let data = await fileGetContents('/tmp/foo/bar');
    console.log(data);
} catch (err) {
    console.log('Unable to load data from /tmp/foo/bar');
}

// Or a HTTP(S) request
fileGetContents('https://pokeapi.co/api/v2/pokemon/1/').then(json => {
    const pokemon = JSON.parse(json);
    console.log(`Name of first pokemon is ${pokemon.name}`);
}).catch(err => {
    console.error(`Unable to get content from PokeAPI. Reason: ${err.message}`);
});
<div id="svg">
</div>
<script>
function file_get_contents(uri, callback) {
fetch(uri).then(res => res.text()).then(text =>
{
var xmlSvg =text;
console.log(xmlSvg );
document.getElementById('svg').innerHTML = xmlSvg;
})
}
var uri ='You-urlllllllll-svg';
file_get_contents(uri);
</script>
function file_get_contents(filename) {
    fetch(filename).then((resp) => resp.text()).then(function (data) {
        document.getElementById("id").innerHTML = data;
    });
}
file_get_contents("url");
<span id="id"></span>