XmlHttpRequest vs cURL - php

I was wondering if anyone has done any testing on the speed differences of cURL and XHR (in regards to the time it takes to complete a request, or series of requests).
Specifically, I'm wondering because I would like to use XHR to call a PHP script, and use cURL from there to grab a resource. The PHP page will ensure the data is in the right format, and alter it if it isn't. I would like to avoid doing this on the JavaScript end because it's my understanding that if the user's computer is slow it could take noticeably longer.
If it makes a difference, all data will be retrieved locally.

There is no speed difference between the two. You're comparing an HTTP request to an... HTTP request. For our purposes they both do the exact same thing, only one does it in JavaScript and one in PHP. Having a chain will take twice as long (probably more) since you're making a request to your server, and then your server is making a request to another server.
I don't understand why you wouldn't want to just get the resource with JavaScript and skip the PHP middleman. I don't see any problem with doing it that way. (Unless your data is on another domain, then it gets trickier, but it's still doable.)
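As a sketch of the direct approach the answer suggests, the format check the asker wanted to do in PHP can be done in the JavaScript response handler instead. normalizeData is a hypothetical validator; it assumes the resource is JSON with an "items" array, purely for illustration:

```javascript
// Hypothetical validator: ensure the fetched data has the expected shape,
// fixing it up if it doesn't - the job the asker planned to do in PHP.
function normalizeData(text) {
  var data;
  try {
    data = JSON.parse(text);
  } catch (e) {
    data = {}; // not valid JSON at all - start from an empty object
  }
  if (!Array.isArray(data.items)) {
    data.items = []; // alter the data so callers always see an array
  }
  return data;
}
```

This is cheap enough that even a slow client machine won't notice it; the "slow computer" concern matters for heavy processing, not a simple validation pass.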

If I understand the question correctly, the difference is that XmlHttpRequest runs on the client side (JavaScript), while cURL runs on the server side (PHP).
This would impact performance one way or the other, depending on where the resource is (you say local), and how many concurrent requests you'll get.

Related

node.js how to handle intentionally large (and bad) http requests

I'm working on a few projects using Node, and every line of code I write brings up lots of ideas about how someone could destroy my Node process.
Right now I'm thinking of this:
require('http').createServer(function(req, res) {
    // deal with request here
}).listen(port, host);
That's standard code for setting up a server and dealing with requests.
Let's say that I want to bring down that Node process. I could send POST requests with loads of data in them, and node.js would spend lots of time (and bandwidth) receiving all of them.
Is there a way to avoid this?
PHP Pros: How do you normally deal with this?
Is there a way to tell Node or PHP (maybe Apache) to just ignore requests from certain IPs?
You can limit the http request size.
Here is the middleware you can use.
https://github.com/senchalabs/connect/blob/master/lib/middleware/limit.js
http://www.senchalabs.org/connect/middleware-limit.html
P.S. Possibly a duplicate of maximum request lengths in node.js
For getting the IP address in node.js, you can try request.connection.remoteAddress.

Workaround for cross-site parameters reaching max URL limit

We are developing a bookmarklet, and we use JSONP to communicate with the server. We have reached a phase where we must send parameters from the browser to the server that will exceed the well-known ~2000-character URL length limit.
We are looking for solutions to overcome this problem. Please note that the bookmarklet will be executed on 3rd-party URLs, some of them HTTP and some of them HTTPS, and JSONP is limited to GET requests only.
The only thing I can think of would be to do multiple requests - throw an id in with the request and setup the state server side in a persistent way and then request the data.
Multiple requests are pretty ugly too - what if one message gets lost while another makes it, etc.
Unfortunately, JSONP doesn't have a lot of flexibility since it's just simulating script loads, and there's really no way around this with current browser security standards.
Given the known limitations, I see only three ways:
Send less data. Potentially you could compress it?
Use more than one request. This can be complicated for blobs, but should be possible.
Extend the URL length limit - most servers have configuration options for that.
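The multiple-request option above can be sketched as a chunking scheme: split the payload so each JSONP URL stays under the limit, tag every part with a shared id, and let the server reassemble. The 1800-character chunk size and the query-string layout are assumptions chosen to leave headroom under the ~2000-character limit:

```javascript
// Split a long payload into pieces small enough that each JSONP request
// stays under the URL length limit. Chunk size is an illustrative choice.
function chunkPayload(payload, chunkSize) {
  var chunks = [];
  for (var i = 0; i < payload.length; i += chunkSize) {
    chunks.push(payload.slice(i, i + chunkSize));
  }
  return chunks;
}

// Each chunk would then go out as its own JSONP request, e.g.:
//   ...?id=abc123&part=0&of=3&data=<chunk>
// The server buffers parts per id and reassembles once all have arrived.
// This also addresses the "what if one message gets lost" concern: the
// server can report which part numbers are missing so the client retries
// only those.
```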

How many XMLHttpRequests cause server strain?

I am making a simple PHP and JavaScript multiplayer game. Here is a quick rundown of how it will work: when a user does something, it is submitted with Ajax to a PHP script, which saves a few characters representing the action taken to a text file on the server. Every so often (to be exact, every 430 milliseconds), the other player's computer submits something to another PHP script, which checks for new content in that text file. If there is new content, it is returned to the client side and displayed on the other user's screen. The only thing is, I am new to Ajax, PHP, and anything server-side, and I do not want to crash the server. To make sure I do not crash the server, I need to know if submitting an XMLHttpRequest every 430 milliseconds is a potential cause of major server strain. Not only that, but BOTH players will submit an XMLHttpRequest every 430 milliseconds. I can easily raise it to 450 milliseconds, but anything beyond that will be a problem.
Well, that depends entirely on your server. If you're running it on a ZX80, I'd be concerned :-)
However, that's only four to six requests a second and modern servers should have no difficulty handling that sort of load.
Of course, if what happens on the server in response to your requests takes more time than the cycle time, you'll run into problems, especially with sustained traffic (no chance to slow down).
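To make the arithmetic concrete: two players each polling every 430 ms is roughly 4.7 requests per second. The loop below is a client-side sketch; /poll.php is the hypothetical endpoint from the question, and the inFlight guard handles exactly the case the answer warns about, where a response takes longer than the cycle time:

```javascript
// Back-of-the-envelope load check for the setup described above.
function requestsPerSecond(players, intervalMs) {
  return players * (1000 / intervalMs);
}

// Client polling loop sketch. The inFlight flag skips a tick rather than
// letting slow responses pile up into overlapping requests.
function startPolling(intervalMs) {
  var inFlight = false;
  return setInterval(function () {
    if (inFlight) return; // previous request still running - skip this tick
    inFlight = true;
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/poll.php'); // hypothetical endpoint name
    xhr.onload = xhr.onerror = function () { inFlight = false; };
    xhr.send();
  }, intervalMs);
}
```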
This will be inefficient with separate requests; I would suggest taking some time to understand COMET, a term that refers to a number of techniques for managing always-open, bidirectional connections over HTTP.
Here are some useful links I'd start with (I'm not too familiar with COMET for PHP, so I haven't vetted these resource recommendations myself).
http://en.wikipedia.org/wiki/Comet_%28programming%29
Using comet with PHP?
http://www.zeitoun.net/articles/comet_and_php/start

Guidance on the number of http calls; is too much AJAX a problem?

I am developing a website where everything on the front end is written in JavaScript, and communication with the server is done through JSON. So I am hesitant: is it OK to fetch every single piece of data with its own HTTP request, or is that completely unacceptable? (After all, many web developers replace multiple image requests with CSS sprites.)
Can you give me a hint please?
Thanks
It really depends upon the overall server load and bandwidth use.
If your site is very low traffic and is under no CPU or bandwidth burden, write your application in whatever manner is (a) most maintainable (b) lowest chance to introduce bugs.
Of course, if the latency involved in making thirty HTTP requests for data is too awful, your users will hate you :) even if your server is very lightly loaded. Thirty times even 30 milliseconds equals an unhappy experience. So it depends very much on how much data each client will need to render each page or action.
If your application starts to suffer from too many HTTP connections, then you should look at bundling together the data that is always used together -- it wouldn't make sense to send your entire database to every client on every connection :) -- so try to hit the 'lowest hanging fruit' first, and combine the data together that is always used together, to reduce extra connections.
If you can request multiple related things at once, do it.
But there's no real reason against sending multiple HTTP requests - that's how AJAX apps usually work. ;)
The reason for using sprites instead of single small images is to reduce loading times, since only one file has to be loaded instead of tons of small files - and the image is already available at the moment it needs to be displayed.
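The "bundle together data that is always used together" advice above can be sketched as a single batched request: instead of thirty calls, the client names everything it needs in one URL. The endpoint and field names here are purely illustrative:

```javascript
// Build one batched request URL from a list of data keys the page needs,
// e.g. '/api/data?fields=user,inbox,prefs' instead of three separate calls.
// Endpoint path and the 'fields' parameter name are assumptions.
function buildBatchUrl(base, keys) {
  return base + '?fields=' + keys.map(encodeURIComponent).join(',');
}
```

The server then answers with one JSON object keyed by field name, so one round trip replaces N of them - the HTTP analogue of a CSS sprite.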
My personal philosophy is:
The initial page load should be AJAX-free.
The page should operate without JavaScript well enough for the user to do all basic tasks.
With JavaScript, use AJAX in response to user actions and to replace full page reloads with targeted AJAX calls. After that, use as many AJAX calls as seem reasonable.

How does Gmail's periodic mail fetching work?

When I am using Gmail, new mails that I just received appear in the inbox even if I did not refresh the page. I believe that is done with Ajax.
Is there any good demo that works very similarly to it? Periodically checking the server and fetching new data (I am not sure if it is JSON data)?
Thanks!
The problem with periodic refresh is that while it is good for some things that aren't too time critical, like e-mail fetching, it is not instantaneous. Therefore you wouldn't want to use it for something like chat where waiting even five seconds for a response is too long. You could decrease the polling interval and make a request once a second or even half second, but then you'd quickly overload your browser and waste resources.
One solution for this is to use a technique called ajax long polling (known by others as 'comet' or 'reverse ajax'). With this technique, the browser makes a long duration ajax request, one that does not return until there is new data available. That request sits on the server (you need to be running special server side software to handle this in a scalable manner, but you could hack something together with php as a demonstration) until new data is available at which point it returns to the client with the new data. When the client receives the data it makes another long polling request to sit at the server until there is more data. This is the method that gmail uses, I believe.
That is the gist of long polling. You have to make a couple of modifications because most browsers will expire an Ajax request if it does not return after a long time, so if the request times out the client has to make another one (but the timeout is usually on the order of a minute or longer). But this is the main idea.
The server side implementation of this is far more complicated than the client side (client side requires just a few lines of js).
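The client-side "few lines of js" mentioned above can be sketched like this. The /updates endpoint name is hypothetical, and the transport is injected as requestFn so the same loop works with any XHR wrapper:

```javascript
// Long-polling client loop: issue a request the server holds open until
// new data exists, then immediately re-issue it - whether the request
// returned data, errored, or timed out, as described above.
function longPoll(url, onData, requestFn) {
  requestFn(url, function (err, data) {
    if (!err && data) onData(data); // hand new mail/data to the UI
    longPoll(url, onData, requestFn); // go back to waiting at the server
  });
}
```

In a real browser, requestFn would wrap an XMLHttpRequest with a generous timeout; the recursion is safe because each call only starts after the previous response (or timeout) arrives.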
While I'm not sure of the exact Gmail implementation, the AjaxPatterns site has a good overview of what they dub a Periodic Refresh: http://ajaxpatterns.org/Periodic_Refresh. I've always just referred to the style as a heartbeat.
The gist of their solution is:
The browser periodically issues an XMLHttpRequest Call to gain new information, e.g. one call every five seconds. The solution makes use of the browser's Event Scheduling capabilities to provide a means of keeping the user informed of latest changes.
They include some links to real-world examples and some sample code.
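A minimal heartbeat along the lines of that pattern uses the browser's event scheduling (setInterval) directly; the five-second interval comes from the description above, and checkForUpdates stands in for whatever XMLHttpRequest call fetches the latest changes:

```javascript
// Periodic-refresh ("heartbeat") sketch: run the update check on a fixed
// schedule. checkForUpdates is the caller-supplied XHR poll.
function startHeartbeat(checkForUpdates, intervalMs) {
  return setInterval(checkForUpdates, intervalMs);
}

// Stop polling, e.g. when the tab is hidden or the user logs out.
function stopHeartbeat(timer) {
  clearInterval(timer);
}
```

Compared with long polling this is simpler but less immediate: new data only shows up on the next tick, which is fine for mail but not for chat.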
