Soon I'm going to have 3 identical scripts on 3 different VPSes, and I want a fourth VPS which will control them all.
So for the sake of this question, what I need to do is insert SQL rows and create files on the sub-servers, and the sub-servers also need to send statistical data back to the mother server. What is the most efficient way of doing this?
I was thinking of putting scripts on the servers to do the jobs I need, and using cURL to send requests to those scripts with URL parameters carrying the data that needs to be transferred, but perhaps there is a better way? Ideally I want it to be as fast as possible, because they will most likely be sending requests to each other every second.
You could use XML-RPC, which exists in many manifestations:
http://us3.php.net/manual/en/book.xmlrpc.php
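For instance, a minimal sketch of a client call using PHP's xmlrpc extension (the endpoint URL, method name, and parameters are made up for illustration):

    <?php
    // Sketch: POST an XML-RPC request to a (hypothetical) endpoint.
    // Requires PHP's xmlrpc extension.
    $request = xmlrpc_encode_request('stats.report', array('cpu' => 0.42));
    $context = stream_context_create(array('http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: text/xml\r\n",
        'content' => $request,
    )));
    $response = file_get_contents('http://mother.example.com/rpc', false, $context);
    $result   = xmlrpc_decode($response);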
If you want dead-simple, just use plain HTTP(S) requests, provided you're careful about how you implement them.
To perform a simple request, use cURL, file_get_contents, or fopen. This website is packed full of usage examples.
For simple communication (i.e. a script on server A triggers a script on server B), plain and simple HTTP requests work great. You can add basic authentication (htaccess) to keep unauthorized people from triggering your scripts, and get stronger security by using HTTPS.
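As a concrete sketch, the calling side with cURL might look like this (the URL, credentials, and fields are all illustrative):

    <?php
    // Sketch: mother server pushing data to a sub-server over HTTPS
    // with HTTP basic auth.
    $ch = curl_init('https://sub1.example.com/jobs/insert-row.php');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query(array(
            'table' => 'stats',
            'value' => 42,
        )),
        CURLOPT_USERPWD        => 'mother:secret',  // basic auth (htaccess)
        CURLOPT_RETURNTRANSFER => true,             // return the body instead of echoing it
        CURLOPT_TIMEOUT        => 5,                // don't hang forever on a dead VPS
    ));
    $response = curl_exec($ch);
    if ($response === false) {
        error_log('Request failed: ' . curl_error($ch));
    }
    curl_close($ch);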
So I'm using http://socket.io with node.js, and I'm developing a chat to learn more about node.js.
I use node.js to communicate with MySQL when users open the page.
But for bandwidth reasons I can't keep using MySQL for everything, so I store a few variables in node.js instead. For example, I don't want a user signed in twice, so I store the socket ID for all users in a JSON object. If a socket ID already exists for that user (in node.js), then I disconnect that socket.
This works fine, but let's say I want to disconnect a user from PHP. How would I go about doing that?
One option I've considered is to update a table in the database with the required changes, have node.js check that table every 60 seconds and do what it needs to do, then update the table once the changes are complete.
Is that the best option or should I try to accomplish this with PHP? Obviously PHP would be more immediate -- but that's not too much of a concern for me.
It's quite simple:
Make sure your node.js code supports HTTP requests and not just sockets.
Add a route to perform any actions. For instance you could handle /api/disconnect/:userId which would do whatever you need.
In your PHP code, call the relevant URL using either file_get_contents or the cURL family of calls.
Of course, you want to make sure you have some decent security so that only your PHP script can call your web services; otherwise you'd open the door to all sorts of attacks. In the short term, this could be done simply by listening only on 127.0.0.1 and using that address to communicate.
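Putting that together, a minimal sketch of the PHP side, assuming node.js listens on 127.0.0.1 port 8080 and exposes the route above (both the port and the route are assumptions):

    <?php
    // Sketch: tell the node.js process to disconnect a user. Assumes
    // node.js is listening on 127.0.0.1:8080 (hypothetical port) and
    // handles the /api/disconnect/:userId route.
    $userId = 42;
    $result = file_get_contents('http://127.0.0.1:8080/api/disconnect/' . $userId);
    if ($result === false) {
        error_log('Failed to reach node.js');  // node is down or refused the request
    }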
Note that the title of your question does not actually match its text. You're not actually modifying the node.js code, you're just modifying the data it manipulates.
Not sure that I follow your description; however, one thing that stands out is that you are considering modifying code using code. Don't. Self-modifying code (regardless of whether it is split across separate components of a system) is a very bad idea.
If the socket ID already exists for that user, then I would disconnect the socket.
Exists where? Presumably in node.js.
While you could use the database to persist the data, I would try to inject the data directly into node.js from the PHP script. That eliminates the polling step. There is a question over how you re-populate the information into node.js after a restart, but that depends on a lot more information than you've provided.
I am creating an app for my clients to add to their webpages; however, I am hosting the database that stores the info for this app. All I want to do is run all the queries on my server and somehow pass the $var to their server.
So what I was thinking was to have my PHP page with all the MySQL credentials stored on my server, and give them code that calls that page and outputs the data, something like
require_once('http://192.163.163.163/config.php');
But I bet this is the least secure way to do this. I don't want to give anyone access to the central database, and I am handling all the requests. Do you guys have any suggestions for how I can pull the data off my DB and pass it to their server in a $var without opening any doors?
If you can't afford to give away your DB credentials or other internal details of your system, but you need the clients to be able to read data from you, then the only really secure way is to set your system up as an API that the clients can call.
Don't try to combine the two systems into a single app; it will open up holes that cannot be closed.
To create an API is fairly simple in principle. Just create a suite of normal PHP programs that accept a set of pre-defined arguments and return the data in a pre-defined format that can be easily processed by the calling program, e.g. a JSON structure.
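As a rough sketch, one such endpoint could look like this (the parameters, the API-key check, and the helper functions are all placeholders, not a real implementation):

    <?php
    // Sketch of a single API endpoint, e.g. api/product.php.
    header('Content-Type: application/json');

    // Authenticate the caller first, e.g. with a per-client API key.
    if (!isset($_GET['api_key']) || !is_valid_key($_GET['api_key'])) {
        http_response_code(403);
        echo json_encode(array('error' => 'forbidden'));
        exit;
    }

    $id  = isset($_GET['id']) ? (int) $_GET['id'] : 0;  // cast: never trust input
    $row = fetch_product($id);                          // your internal DB lookup
    echo json_encode($row ? $row : array('error' => 'not found'));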
The clients would then simply call your system via an HTTP call. They'd never need to see your code; they wouldn't need to be hosted on the same server, and they wouldn't even need to be writing their system in the same language as yours.
There's a lot more to it than that -- it is, of course, perfectly easy to write an insecure API as well, and you'll want to read up on how to write a good API to avoid that sort of thing -- but that's your starting point. I hope it helps.
I currently have a bookmarklet that is used by many people in my department at work. The script takes a list of product IDs, or gathers one from the current page, then visits each product detail page and collects various pieces of information. Using database queries is not possible, as I don't have database access. Anyway, I was thinking of building a fully-fledged application for this (plus other features) using either Rails or PHP/some other framework. The biggest requirement is the ability to send asynchronous HTTP requests the way JavaScript does.
I found this project: http://code.google.com/p/multirequest/, but haven't looked into it yet. What is the best way to go about this?
See this library https://code.google.com/p/multirequest/
It allows you to send multiple concurrent requests with cURL from PHP 5+.
An example script is available which basically boils down to...
Define callback methods for when each request completes
Define what each request will look like (headers, etc)
Push each URL to the MultiRequest_Handler class
Start the MultiRequest_Handler
At the most basic level, you can use PHP's curl_multi functions. I've tested up to 150 simultaneous requests. You'd want to set a timeout to make sure a single slow request doesn't stall the whole process. It's not asynchronous, it's simultaneous, but it's quick.
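For illustration, a minimal curl_multi sketch (the URLs and the 10-second timeout are made up):

    <?php
    // Sketch: fetch several product pages concurrently with curl_multi.
    $urls = array(
        'http://example.com/product/1',
        'http://example.com/product/2',
    );

    $mh      = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);  // stop one slow request stalling the batch
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all handles until every transfer has finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);                 // wait for activity instead of busy-looping
    } while ($running > 0);

    foreach ($handles as $url => $ch) {
        $html = curl_multi_getcontent($ch);     // the fetched page body
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
        // ... scrape $html for the product details here ...
    }
    curl_multi_close($mh);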
I'm working on a few projects using node, and each line of code that I write spawns lots of ideas about how someone could destroy my node process.
Right now I'm thinking of this:
require('http').createServer(function (req, res) {
    // DEAL WITH REQUEST HERE
}).listen(port, net);
That's standard code for setting up a server and dealing with requests.
Let's say that I want to bring down that node process: I could send POST requests with loads of data in them, and node.js would spend lots of time (and bandwidth) receiving all of them.
Is there a way to avoid this?
PHP Pros: How do you normally deal with this?
Is there a way to tell node or PHP (maybe Apache) to just ignore requests from certain IPs?
You can limit the HTTP request size.
Here is middleware you can use:
https://github.com/senchalabs/connect/blob/master/lib/middleware/limit.js
http://www.senchalabs.org/connect/middleware-limit.html
P.S. This is possibly a duplicate of "maximum request lengths in node.js".
For getting the IP address in node.js, you can try request.connection.remoteAddress.
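On the PHP side of the question, the crude equivalent is to reject known-bad addresses before doing any real work. A sketch (the blocklist is made up; in practice the firewall or Apache config is a cheaper place to do this):

    <?php
    // Sketch: drop requests from blocked IPs before any expensive work.
    $blocked = array('198.51.100.7', '203.0.113.99');  // hypothetical addresses

    if (in_array($_SERVER['REMOTE_ADDR'], $blocked, true)) {
        http_response_code(403);
        exit;
    }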
Is there any way you can push data to a page rather than checking for it periodically?
Obviously you can check for it periodically with AJAX, but is there any way you can force the page to reload when a PHP script is executed?
Theoretically you can make the AJAX approach faster by keeping a table just for tracking when the AJAX function should run (update a value in that table whenever there is new data for the AJAX function to fetch from the database). But this still requires a sizable amount of memory and a MySQL connection, plus some waiting time while the query executes, even when there is no update and you don't want to run the AJAX function that retrieves database data.
Is there any way to either make this more efficient than querying the database and checking the table that stores the 'if updated' flag, OR to tell the AJAX function to execute from another page?
I guess node.js or HTML5 WebSocket could be a viable solution as well?
Or you could store 'if updated' data in a text file? Any suggestions are welcome.
You're basically talking about notifying the client (i.e. browser) of server-side events. It really comes down to two things:
What web server are you using? (are you limited to a particular language?)
What browsers do you need to support?
Your best option is using WebSockets to do the job; anything besides WebSockets is a hack. Still, many of these "hacks" work just fine: I suggest you try Comet or AJAX long-polling.
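For illustration, a minimal sketch of an AJAX long-polling endpoint in PHP (the data file path and timing values are made up):

    <?php
    // Sketch: hold the request open until a (hypothetical) data file
    // changes, then return its contents; otherwise give up after 30s.
    set_time_limit(40);                            // hard cap on the whole request
    $since    = isset($_GET['since']) ? (int) $_GET['since'] : 0;
    $dataFile = '/var/data/updates.json';          // illustrative path
    $deadline = time() + 30;

    while (time() < $deadline) {
        clearstatcache();                          // don't let PHP cache filemtime()
        if (filemtime($dataFile) > $since) {
            header('Content-Type: application/json');
            readfile($dataFile);
            exit;
        }
        usleep(250000);                            // re-check four times a second
    }

    http_response_code(304);                       // nothing new; the client reconnects

Note that each waiting client ties up a server worker for the whole poll, which is one reason event-driven servers like node.js scale this pattern better.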
There's a project called Atmosphere (and many more like it) that provides a solution suited to the web server you are using, and will automatically pick the best transport depending on the user's browser.
If you aren't limited by browser support and can pick your web stack, then I suggest using Socket.IO + node.js. That's just my preference right now; WebSockets is still in its infancy, and things are going to get interesting once it develops further. Sometimes my entire application isn't suited to node.js, so I'll offload just the data-push part to it.
Good luck.
Another possibility: if you can store the data in a simple format in a file, you update the file with the data and let the web server report its timestamp.
Then the browser can poll, making HEAD requests, which will check the update times on the file to see if it needs an updated copy.
This avoids making a DB call for anything that doesn't change the data, but at the expense of keeping file-system copies of important resources. It might be a good trade-off, though, if you can do this for active data and roll the files off after some time. You will need to ensure that the file is rewritten by any call that updates the data.
It shares the synchronization risks of any systems with multiple copies of the same data, but it might be worth investigating if the enhanced responsiveness is worth the risks.
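For example, the write side of this scheme might look like the following (the path and data shape are illustrative):

    <?php
    // Sketch: whenever a call changes the data, rewrite a static
    // snapshot; the web server then answers HEAD requests for it with
    // a fresh Last-Modified header automatically.
    function save_snapshot(array $data) {
        $tmp = '/var/www/html/snapshot.json.tmp';
        file_put_contents($tmp, json_encode($data));
        rename($tmp, '/var/www/html/snapshot.json');  // atomic swap, no torn reads
    }

The browser then sends HEAD requests for snapshot.json and compares the Last-Modified header against what it saw last time, only doing a full GET when the file has changed.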
There was once a technology called "server push" that kept a Web server process sitting there waiting for more output from your script and forwarding it on to the client when it appeared. This was the hot new technology of 1995 and, while you can probably still do it, nobody does because it's a freakishly terrible idea.
So yeah, you can, but when you get there you'll most likely wish you hadn't.
Well, you can (or soon will be able to) with HTML5 WebSockets.
This page has some great info about this technology:
http://www.html5rocks.com/en/tutorials/websockets/basics/