I'm working on a few projects using node, and every line of code I write makes me think of ways someone could take down my node process.
Right now I'm thinking of this:
require('http').createServer(function(req, res) {
  // DEAL WITH REQUEST HERE
}).listen(port);
That's standard code for setting up a server and dealing with requests.
Let's say I want to bring down that node process: I could send POST requests with loads of data in them, and node.js would spend a lot of time (and bandwidth) receiving all of them.
Is there a way to avoid this?
PHP Pros: How do you normally deal with this?
Is there a way to tell node or PHP (maybe Apache) to just ignore requests from certain IPs?
You can limit the HTTP request size.
Here is the Connect middleware you can use:
https://github.com/senchalabs/connect/blob/master/lib/middleware/limit.js
http://www.senchalabs.org/connect/middleware-limit.html
P.S. Possibly a duplicate of maximum request lengths in node.js.
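If you're not using Connect, here's a minimal sketch of the same idea in plain node; the 1 MB cap, the 413 response, and the port are illustrative choices, not anything prescribed by the middleware:

var http = require('http');

var MAX_BODY = 1024 * 1024; // illustrative 1 MB cap

http.createServer(function(req, res) {
  var received = 0;

  req.on('data', function(chunk) {
    received += chunk.length;
    if (received > MAX_BODY) {
      // Answer with 413 and cut the socket so the client
      // can't keep streaming data (and burning bandwidth).
      res.writeHead(413, {'Content-Type': 'text/plain'});
      res.end('Request entity too large');
      req.connection.destroy();
    }
  });

  req.on('end', function() {
    res.end('ok');
  });
}).listen(8080);

The important part is destroying the connection as soon as the limit is hit, rather than politely reading the rest of the body.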
For getting the IP address in node.js, you can try request.connection.remoteAddress.
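Building on that, a rough sketch of ignoring certain IPs at the node level; the blocklist here is a hypothetical in-memory object, and in production you'd usually do this at the firewall or reverse-proxy level instead:

var blocked = {'203.0.113.42': true}; // hypothetical blocklist

require('http').createServer(function(req, res) {
  if (blocked[req.connection.remoteAddress]) {
    // Drop the socket without spending time on a response.
    req.connection.destroy();
    return;
  }
  // DEAL WITH REQUEST HERE
  res.end('hello');
}).listen(8080);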
Related
We are developing a bookmarklet, and we use JSONP to communicate with the server. We have reached a phase where we must send parameters from the browser to the server that will exceed the well-known ~2000-character URL length limit.
We are looking for solutions to overcome this problem. Please note that the bookmarklet will be executed on third-party URLs, some of them HTTP and some of them HTTPS, and JSONP is limited to GET requests only.
The only thing I can think of would be to do multiple requests: throw an id in with each request, set up the state server-side in a persistent way, and then request the data.
Multiple requests are pretty ugly too: what if one message gets lost while another makes it, etc.
Unfortunately, JSONP doesn't have a lot of flexibility, since it's just simulating script loads, and there's really no way around this with current browser security standards.
With the known limitations, I see only three ways:
Send less data. Potentially you could compress it?
Use more than one request (see the sketch after this list). This can be complicated for blobs, but should be possible.
Extend the URL length limit - there are configs for that in your server
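For option 2, a rough sketch of splitting the payload across several JSONP GET requests keyed by an id; the endpoint, parameter names, and chunk size are all made up for illustration, and the seq/total fields are there so the server can detect lost or out-of-order chunks:

// Browser side: split a long payload across several JSONP requests.
function sendChunked(payload, callbackName) {
  var CHUNK = 1500; // stay well under URL length limits
  var id = new Date().getTime() + '-' + Math.random(); // hypothetical message id
  var total = Math.ceil(payload.length / CHUNK);

  for (var i = 0; i < total; i++) {
    var s = document.createElement('script');
    s.src = 'https://example.com/collect' + // hypothetical endpoint
      '?id=' + encodeURIComponent(id) +
      '&seq=' + i +
      '&total=' + total +
      '&data=' + encodeURIComponent(payload.slice(i * CHUNK, (i + 1) * CHUNK)) +
      '&callback=' + callbackName;
    document.body.appendChild(s);
  }
}

// The server buffers chunks by id and reassembles the payload
// once all `total` pieces have arrived.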
I'm developing a website where every front-end thing is written in JavaScript, and communication with the server happens through JSON. So I'm hesitant: is asking for every single piece of data with its own HTTP request OK, or is it completely unacceptable? (After all, many web developers combine multiple image requests into CSS sprites.)
Can you give me a hint please?
Thanks
It really depends upon the overall server load and bandwidth use.
If your site is very low traffic and is under no CPU or bandwidth burden, write your application in whatever manner is (a) most maintainable and (b) least likely to introduce bugs.
Of course, if the latency involved in making thirty HTTP requests for data is too awful, your users will hate you :) even if your server is very lightly loaded. Thirty times even 30 milliseconds equals an unhappy experience. So it depends very much on how much data each client will need to render each page or action.
If your application starts to suffer from too many HTTP connections, then you should look at bundling together the data that is always used together -- it wouldn't make sense to send your entire database to every client on every connection :) -- so try to hit the 'lowest hanging fruit' first, and combine the data together that is always used together, to reduce extra connections.
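For example, a minimal sketch of the bundling idea: instead of one request per item, a single endpoint returns everything the page always needs together (the endpoint, field names, and render function are hypothetical):

// Instead of separate requests for /user, /messages, /notifications...
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/page-data', true); // hypothetical bundled endpoint
xhr.onreadystatechange = function() {
  if (xhr.readyState !== 4 || xhr.status !== 200) return;
  var data = JSON.parse(xhr.responseText);
  // ...one round trip delivers all three pieces of data.
  render(data.user, data.messages, data.notifications); // hypothetical
};
xhr.send();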
If you can request multiple related things at once, do it.
But there's no real reason against sending multiple HTTP requests - that's how AJAX apps usually work. ;)
The reason for using sprites instead of single small images is to reduce loading times, since only one file has to be loaded instead of tons of small files - either all at once, or at the moment when it'd be desirable to have an image already available to display.
My personal philosophy is:
The initial page load should be AJAX-free.
The page should operate without JavaScript well enough for the user to do all basic tasks.
With JavaScript, use AJAX in response to user actions and to replace full page reloads with targeted AJAX calls. After that, use as many AJAX calls as seem reasonable.
What's the best way to write an online chat in JS? If I used AJAX and updated information about users and messages every 5 seconds, the HTTP requests and responses would generate a lot of traffic and a high server load.
But how else? Sockets? But how?
You seem to have a problem with the server load, so I'll compare the relevant technologies.
Ajax polling:
This is the most straightforward. You run a setTimeout loop every 5 seconds or so to check for new chat messages, or you set an iframe to reload. When you post a message, you also return new messages, and things shouldn't get out of order. The biggest drawback with this method is that you're unlikely to poll with a frequency corresponding to how often messages are posted. Either you'll poll too quickly and make a lot of extra requests, or you'll poll too slowly and get chunks of messages at a time instead of getting them in a real-time-ish way. This is by far the easiest method, though.
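A minimal sketch of that loop; the endpoint, the `since` parameter, and the render function are placeholders:

// Naive polling: ask for anything newer than the last message we saw.
var lastId = 0;

function poll() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/messages?since=' + lastId, true); // hypothetical endpoint
  xhr.onreadystatechange = function() {
    if (xhr.readyState !== 4) return;
    if (xhr.status === 200) {
      var msgs = JSON.parse(xhr.responseText);
      for (var i = 0; i < msgs.length; i++) {
        lastId = Math.max(lastId, msgs[i].id);
        appendToChat(msgs[i]); // hypothetical render function
      }
    }
    setTimeout(poll, 5000); // poll again in 5 seconds either way
  };
  xhr.send();
}

poll();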
HTTP Push
This is the idea that the server should tell the client when there are new messages, rather than the client continually bothering the server asking if there are any new ones yet. Think of a kid in the car asking "are we there yet?": instead of answering over and over, the parent can just tell the kid when they arrive.
There are a couple ways to both fake this and do it for real. WebSockets, which you mentioned, are actually creating a stream between the client and the server and sending data in real time. This is awesome, and for the 4 out of 10 users that have a browser that can do it, they'll be pretty psyched. Everyone else will have a broken page. Sorry. Maybe in a couple years.
You can also fake push tech with things like long polling. The idea is that you ask the server if there are any new messages, and the server doesn't answer until a new message has appeared or some preset limit (30 seconds or so) has been reached. This keeps the number of requests to a minimum while using known web technologies, so most browsers will work with it. You'll have high connection concurrency, but the connections really aren't doing anything, so it shouldn't have too high a server cost.
I've used all of these before, but ended up going with long polling myself. You can find out more about how to actually do it here: How do I implement basic "Long Polling"?
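For a feel of the server half, here's a minimal long-polling sketch in node; the in-memory message handling, the 30-second timeout, and the URLs are illustrative:

var http = require('http');

var waiting = []; // responses parked until a message arrives

http.createServer(function(req, res) {
  if (req.url === '/poll') {
    // Park the response instead of answering right away.
    waiting.push(res);
    // Give up after ~30s with an empty result so the client reconnects.
    setTimeout(function() {
      var i = waiting.indexOf(res);
      if (i !== -1) {
        waiting.splice(i, 1);
        res.end(JSON.stringify([]));
      }
    }, 30000);
  } else if (req.url.indexOf('/send') === 0) {
    var msg = {text: 'hi', time: Date.now()}; // in reality, parse from req
    // Flush every parked poller the moment a message shows up.
    waiting.forEach(function(r) { r.end(JSON.stringify([msg])); });
    waiting = [];
    res.end('ok');
  }
}).listen(8080);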
You should choose sockets rather than AJAX polling. However, there isn't much around about how to integrate socket-based chats with MySQL.
I have done a few tests and have a basic example working here: https://github.com/andrefigueira/PHP-MySQL-Sockets-Chat
It makes use of Ratchet (http://socketo.me/) for the creation of the chat server in PHP.
And you can send chat messages to the DB by sending the server JSON with the information about who is chatting (if, of course, you have user sessions).
There are a few ways to deliver messages to the client immediately:
HTML5 Websockets
good because you can use them like real sockets
bad because only a few browsers support it
Endlessly-loading frame
good because every browser supports it
not so cool because you have to do AJAX requests for sending stuff
you can send commands to the client by embedding <script> tags in the content
the script gets executed immediately, too!
...
So, in conclusion, I would choose the second way.
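A minimal node sketch of that second way; the parent-page callback name and the 2-second interval are made up. The server simply never ends the response, so the hidden iframe keeps "loading" and executes each script tag as it arrives:

var http = require('http');

http.createServer(function(req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.write('<html><body>'); // response is intentionally never ended

  var n = 0;
  var timer = setInterval(function() {
    // Each chunk executes in the iframe as soon as it arrives;
    // parent.onMessage is a hypothetical callback on the parent page.
    res.write('<script>parent.onMessage(' +
      JSON.stringify({n: n++}) + ');</script>');
  }, 2000);

  req.on('close', function() { clearInterval(timer); });
}).listen(8080);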
The typical approach is to use long polling, though you'd better not do this in PHP (PHP needs one process for each connection, drastically limiting the number of possible visitors to your site). Instead, use node.js. It's perfect for chats.
Soon I'm going to have 3 identical scripts on 3 different VPSes, and I want a fourth VPS which will control them all.
So for the sake of this question what I need to do is insert SQL rows and create files on the sub-servers, and the sub-servers also need to send statistical data back to the mother server. What is the most efficient way of doing this?
I was thinking of making scripts on the servers to do the jobs I need and using cURL to send requests to these scripts making use of URL parameters for the data which needs to be transferred, but perhaps there is a better way? Ideally I want it to be as fast as possible because they will most likely be sending requests to each other every second.
You could use XML-RPC, which exists in many manifestations:
http://us3.php.net/manual/en/book.xmlrpc.php
If you want dead-simple, just use plain HTTP(S) requests, provided you're careful about implementing it.
To perform a simple request, use cURL, file_get_contents, or fopen. This website is packed full of usage examples.
For simple communication (i.e. a script on server A triggers a script on server B), plain and simple HTTP queries work great. You can add basic authentication (htaccess) to keep unauthorized people from triggering your script, and stronger security by using HTTPS.
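Sticking with node for consistency with the rest of this thread, a minimal sketch of the mother server triggering a script on a sub-server over HTTPS with basic auth; the hostname, path, and credentials are placeholders:

var https = require('https');

var options = {
  hostname: 'sub1.example.com', // hypothetical sub-server
  path: '/jobs/insert-row?value=42', // data passed as URL parameters
  auth: 'motheruser:secret' // sent as HTTP basic auth
};

https.get(options, function(res) {
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() {
    console.log('sub-server replied:', body);
  });
});

The same request in PHP would be a one-liner with cURL or file_get_contents, as noted above.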
I was wondering if anyone has done any testing on the speed differences of cURL and XHR (in regards to the time it takes to complete a request, or series of requests).
Specifically, I'm wondering because I would like to use XHR to go to a PHP script, and use cURL from there to grab a resource. The PHP page will ensure the data is in the right format, and alter it if it isn't. I would like to avoid doing this on the JavaScript end because it's my understanding that if the user's computer is slow, it could take noticeably longer.
If it makes a difference, all data will be retrieved locally.
There is no speed difference between the two. You're comparing an HTTP request to an... HTTP request. For our purposes they both do the exact same thing, only one does it in JavaScript and one in PHP. Having a chain will take twice as long (probably more) since you're making a request to your server, and then your server is making a request to another server.
I don't understand why you wouldn't want to just get the resource with JavaScript and scrap the PHP middleman. I don't see any problem with doing it that way. (Unless your data is on another domain; then it gets trickier, but it's still doable.)
If I understand the question correctly, the difference is that XmlHttpRequest would be on the client side (JavaScript), and cURL would be on the server side (PHP).
This would impact performance one way or the other, depending on where the resource is (you say local), and how many concurrent requests you'll get.