Hi all, I have a site developed in CodeIgniter.
In this site I have to make requests to 10 servers, for example with this JavaScript code:
var XML_req = new ActiveXObject("Microsoft.XMLHTTP"); // IE-only ActiveX XHR object
XML_req.open("POST", link_server, false);             // false = synchronous, blocks the page while waiting
XML_req.send(unescape(XMLdata));
I make a request and the result comes back in XML_req.responseText.
The problem is that:
The XML response can have thousands of nodes, and I have to make the same request 10 times, once for each server. So I end up with 10 requests (identical for every server) and 10 responses, each one a single big XML document.
I know these requests can be made asynchronously, but I don't know if there is a good way to handle them and their big responses, because multiple users can make requests at the same time. I have a well-specced server, but I don't know whether these requests and big responses will turn out to be very slow.
Is there a method to handle this?
I have thought about running a cron job every hour for every server and storing the results in a database, but then I don't have real-time data, and I would need a very big database because I could end up with millions of rows across many tables.
I don't have much experience with this kind of work involving very large amounts of data.
I hope someone can explain the right, best, and fastest way to handle these requests and responses.
Honestly, if you are returning millions of rows in a single request, something is wrong. Your queries should be small and should return only the bits of information people are looking for, not the full set:
Dealing with Large Data in Ajax
There is always the option to use local storage in the browser:
jQuery Ajax LocalStorage
Here is another answer that may provide some insight, too:
LocalStorage, several Ajax requests or huge Ajax request?
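To make that concrete, here is a very rough sketch of "only the bits people are looking for" on the PHP side; the table, columns, page size, and connection details are just assumptions, not anything from your app:

<?php
// Hypothetical search endpoint: filter and page on the server so each
// response stays small. Table and column names are assumptions.
$q      = isset($_GET['q'])    ? $_GET['q'] : '';
$page   = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$limit  = 50;
$offset = ($page - 1) * $limit;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$sql = sprintf(
    'SELECT id, name, price FROM products WHERE name LIKE ? ORDER BY name LIMIT %d OFFSET %d',
    $limit,
    $offset
);
$stmt = $pdo->prepare($sql);
$stmt->execute(array('%' . $q . '%'));

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The client then asks for page 2, page 3, and so on as the user scrolls or pages, instead of receiving the whole set at once.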
Good luck!
Related
I'm making an app which accesses a database and updates the data every few seconds through a PHP script. The problem is that it currently always updates all the data. I would like to know how to program something that updates data dynamically and decides what to update and what not, so it basically keeps track of changes somehow. How would I best go about doing something like this?
I think this question has probably been asked somewhere before, but I couldn't find it, so maybe someone can point me to where to look.
In general, you will need to use either XHR requests, web sockets, or HTTP/2 to solve this problem. Since HTTP/2 is not universally supported on the browser side, it may not work for you. Here is the outline of the solution:
1. Every few seconds, JavaScript you provide in the browser polls the server for updates using an XHR request. You can use the returned data to update the screen. If you only want to do some simple updates, like refreshing a few numbers, raw JavaScript or jQuery is fine. If your polling will result in complex screen updates, or you want to move a lot of functionality into the client, you probably want to redo your client using one of the JavaScript frameworks like React or Angular.
2. Use web sockets (or HTTP/2) to create a persistent connection to the server, and have the server push updates to the clients as the data changes. This will probably require some code in the application to broadcast or multicast the updates. The client code would be similar to case 1, except that the client would not poll for updates.
The polling solution is easier to implement, and would be a good choice as long as you don't have too many clients sending polls at too high a rate - you can overwhelm the servers this way.
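To sketch what the polling endpoint could look like on the PHP side (the items table, its updated_at Unix-timestamp column, and the connection details are all assumptions), the client sends the timestamp of its last poll and only the rows changed since then come back:

<?php
// Hypothetical polling endpoint: return only the rows changed since the
// client's last poll. Assumes updated_at holds a Unix timestamp.
$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT id, title, updated_at FROM items WHERE updated_at > ? ORDER BY updated_at'
);
$stmt->execute(array($since));

header('Content-Type: application/json');
echo json_encode(array(
    'now'  => time(),                            // the client stores this for its next poll
    'rows' => $stmt->fetchAll(PDO::FETCH_ASSOC), // empty when nothing changed
));

The JavaScript side just calls this every few seconds with the last "now" value it received and patches the screen with whatever comes back.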
I'm developing a website where every front-end piece is written in JavaScript, and communication with the server is done through JSON. So I'm unsure about one thing: is it OK that I'm asking for every single piece of data with its own HTTP request, or is that completely unacceptable? (After all, many web developers combine multiple image requests into CSS sprites.)
Can you give me a hint please?
Thanks
It really depends upon the overall server load and bandwidth use.
If your site is very low traffic and is under no CPU or bandwidth burden, write your application in whatever manner is (a) most maintainable and (b) least likely to introduce bugs.
Of course, if the latency involved in making thirty HTTP requests for data is too awful, your users will hate you :) even if your server is very lightly loaded. Thirty requests at even 30 milliseconds each adds up to an unhappy experience. So it depends very much on how much data each client needs to render each page or action.
If your application starts to suffer from too many HTTP connections, look at bundling together the data that is always used together -- it wouldn't make sense to send your entire database to every client on every connection :) -- so go for the lowest-hanging fruit first and combine that always-used-together data into a single request to cut out the extra connections.
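As a rough sketch of that bundling idea (the endpoint, tables, and fields here are just assumptions, not anything from your app), one handler can answer several of the small always-needed queries in a single JSON response:

<?php
// Hypothetical "bundle" endpoint: answer the queries every page needs
// anyway in one round trip instead of several separate requests.
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = isset($_GET['user_id']) ? (int) $_GET['user_id'] : 0;

$profile = $pdo->prepare('SELECT name, email FROM users WHERE id = ?');
$profile->execute(array($userId));

$unread = $pdo->prepare('SELECT COUNT(*) FROM messages WHERE user_id = ? AND read_at IS NULL');
$unread->execute(array($userId));

header('Content-Type: application/json');
echo json_encode(array(
    'profile'      => $profile->fetch(PDO::FETCH_ASSOC),
    'unread_count' => (int) $unread->fetchColumn(),
));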
If you can request multiple related things at once, do it.
But there's no real reason against sending multiple HTTP requests - that's how AJAX apps usually work. ;)
The reason for using sprites instead of individual small images is to reduce loading times, since only one file has to be loaded instead of tons of small files - and the image is already available at the moment it needs to be displayed.
My personal philosophy is:
The initial page load should be AJAX-free.
The page should operate without JavaScript well enough for the user to do all basic tasks.
With JavaScript, use AJAX in response to user actions and to replace full page reloads with targeted AJAX calls. After that, use as many AJAX calls as seem reasonable.
Alright, I'm going to create a fairly complex form that posts a lot of different types of information via AJAX to a PHP page, which will then parse the data and cURL the various data types into the correct tables in another database.
Usually I just send a HUGE POST request and then parse the information in the PHP page, making multiple cURL requests along the way to post the various elements.
The downside is the form isn't super responsive; it generally takes 5-10 seconds for the PHP page to give the a-ok. I'd like this new form to be snappier for better data entry, allowing the user to move on to the next entry without a hitch.
So I'm just looking for some professional advice: make 5 AJAX requests to 5 PHP pages, or stick with the one large POST?
In terms of scale, there generally won't be many users on it at the same time (it's internal to an organization); I'm trying to optimize for a single person submitting many entries over and over again with the same form.
Any and all advice is very welcome and appreciated.
PHP page which will then parse the data and CURL the various datatypes into the correct tables in another database.
Are you making an HTTP request then? If yes, you should stick to one big AJAX call, because otherwise you'd have to establish an HTTP connection 5 times, and establishing a connection takes time!
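For what it's worth, here is a rough sketch of the single-handler approach on the PHP side; the field names and target URLs are placeholders, not your actual app. Reusing one cURL handle also lets cURL keep the connection alive between the forwarded posts when the targets share a host:

<?php
// Hypothetical single endpoint: receive the whole form in one POST and
// forward each part to its target with cURL. Fields/URLs are placeholders.
$parts = array(
    'contacts' => 'https://internal.example/api/contacts',
    'orders'   => 'https://internal.example/api/orders',
);

$responses = array();
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);

foreach ($parts as $field => $url) {
    if (empty($_POST[$field])) {
        continue;                               // nothing entered for this section
    }
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $_POST[$field]);
    $responses[$field] = curl_exec($ch);
}
curl_close($ch);

header('Content-Type: application/json');
echo json_encode(array('status' => 'ok', 'results' => $responses));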
On a webpage with various fields, I tend to find it best to just send the data over as the user enters it, and jQuery makes this easy with its event handling, so that if there is a problem the user can be informed quickly.
But if the data must be processed as a group - for example, you have a 100x100 matrix and only when it is filled in can you do the mathematics - then one large POST works.
So it depends: if you can't validate until you have all the related information - for example, you don't know if an address is valid until you have street, city, and state - then wait until all three are entered before submitting the information.
Without more information about what is being done it is hard to really answer the question, but as a rule of thumb, submit the smallest amount of information that is useful, so you can give the fastest response back to the user if there is an error.
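As a small illustration of that "smallest useful group" idea (the field names and rules are placeholders, not your actual form), a handler for the address group could validate the three related fields together and report per-field errors:

<?php
// Hypothetical handler for the address group: validate the related fields
// together and return per-field errors the form can show immediately.
$errors = array();
foreach (array('street', 'city', 'state') as $field) {
    if (empty($_POST[$field])) {
        $errors[$field] = 'This field is required.';
    }
}
if (!isset($errors['state']) && strlen($_POST['state']) != 2) {
    $errors['state'] = 'Use the two-letter state code.';
}

header('Content-Type: application/json');
echo json_encode(array('valid' => empty($errors), 'errors' => $errors));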
One thing to keep in mind: if you're using file-based sessions and those "5 posts" are all handled by the same site/server, you won't be able to have those 5 POSTs running in parallel. PHP will slap an exclusive lock on the session file for each request, so they'll effectively be processed serially rather than in parallel.
Unless you do a session_write_close() in each one, you'd be better off doing one big POST instead and saving yourself the extra overhead of establishing and tearing down 5 connections.
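For reference, the session_write_close() pattern in one of those AJAX handlers looks roughly like this (the work after the lock is released is just a placeholder):

<?php
session_start();

// Read whatever you need from the session first...
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// ...then release the session lock so the other POSTs from the same
// browser aren't stuck waiting for this request to finish.
session_write_close();

// The slow work happens after the lock is released (placeholder).
sleep(2);

header('Content-Type: application/json');
echo json_encode(array('ok' => true, 'user' => $userId));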
I am developing a vertical search engine. When a user searches for an item, our site loads numerous feeds from various markets. Unfortunately, it takes a long time to load, parse, and order the contents of the feeds, so the user experiences some delay. I cannot save these feeds in the DB, nor can I cache them, because their contents are constantly changing.
Is there a way that I can process multiple feeds at the same time in PHP? Should I use popen(), or is there a better PHP parallel-processing method?
Thanks!
Russ
If you are using cURL to fetch the feeds, you could take a look at the function curl_multi_exec, which allows you to do several HTTP requests in parallel.
(The given example is too long to be copied here.)
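Roughly, a trimmed-down version of that pattern looks like this (the feed URLs are placeholders):

<?php
// Fetch several feeds in parallel with the curl_multi_* functions.
$urls = array(
    'http://market-one.example/feed.xml',
    'http://market-two.example/feed.xml',
);

$mh      = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all the requests at once.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);             // wait for activity instead of busy-looping
} while ($running > 0);

// Collect the responses and clean up.
$feeds = array();
foreach ($handles as $i => $ch) {
    $feeds[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);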
That would at least allow you to spend less time fetching the feeds...
Considering your server is doing almost nothing while it's waiting for an HTTP request to finish, parallelizing those shouldn't hurt, I guess.
Parallelizing the parsing of those feeds, on the other hand, might do some damage if it's a CPU-intensive operation (it might be, if it's XML parsing and all that).
As a side note: is it really not possible to cache some of this data? Even if it's only for a couple of minutes?
Using a cron job to fetch the most often used data and store it in a cache, for instance, might help a lot...
And I believe a website that responds fast matters more to users than results that are up to date to the very second... If your site doesn't respond, they'll go somewhere else!
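A cron'd script along these lines could keep a short-lived copy on disk that the search pages read instead of hitting the markets directly (the cache path, feed list, and schedule are all assumptions):

<?php
// Hypothetical cron script: refresh a short-lived file cache of the feeds
// so the search pages never wait on the remote markets themselves.
$feeds = array(
    'market_one' => 'http://market-one.example/feed.xml',   // placeholder URLs
    'market_two' => 'http://market-two.example/feed.xml',
);

if (!is_dir('/tmp/feed-cache')) {
    mkdir('/tmp/feed-cache', 0777, true);
}

foreach ($feeds as $name => $url) {
    $xml = file_get_contents($url);
    if ($xml !== false) {
        // Write to a temp file first, then rename over the old cache atomically.
        $tmp = "/tmp/feed-cache/{$name}.xml.tmp";
        file_put_contents($tmp, $xml);
        rename($tmp, "/tmp/feed-cache/{$name}.xml");
    }
}

Scheduled from crontab every couple of minutes (for example */2 * * * * php /path/to/refresh_feeds.php, with the path being whatever you name the script), the cached data is never more than a couple of minutes old.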
I agree, people will forgive the caching far sooner than they will forgive a sluggish response time. Just recache every couple of minutes.
You'll have to set up a results page that executes multiple simultaneous requests against the server via JavaScript. You can accomplish this with a simple AJAX request and then inject the returned data into the DOM once it has finished loading. PHP doesn't have any support for threading, currently; parallelizing the requests is the only solution at the moment.
Here's some examples using jQuery to load remote data from a website and inject it into the DOM:
http://docs.jquery.com/Ajax/load#urldatacallback
I was wondering if anyone has done any testing on the speed differences between cURL and XHR (with regard to the time it takes to complete a request, or a series of requests).
Specifically, I'm wondering because I would like to use XHR to go to a PHP script and use cURL from there to grab a resource. The PHP page will ensure the data is in the right format, and alter it if it isn't. I would like to avoid doing this on the JavaScript end, because it's my understanding that if the user's computer is slow it could take noticeably longer.
If it makes a difference, all data will be retrieved locally.
There is no speed difference between the two. You're comparing an HTTP request to an... HTTP request. For our purposes they both do the exact same thing, only one does it in JavaScript and one in PHP. Having a chain will take twice as long (probably more) since you're making a request to your server, and then your server is making a request to another server.
I don't understand why you wouldn't want to just get the resource with JavaScript and scrap the PHP middleman. I don't see any problem with doing it that way. (Unless your data is on another domain; then it gets trickier, but it's still doable.)
If I understand the question correctly, the difference will be that XMLHttpRequest would be on the client side (JavaScript), and cURL would be on the server side (PHP).
This would impact performance one way or the other, depending on where the resource is (you say local), and how many concurrent requests you'll get.
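If you do keep the PHP leg, the server-side piece is just a small cURL pass-through that tidies up the data before handing it back. A rough sketch, where the source URL and the "right format" check are placeholders:

<?php
// Hypothetical pass-through: the browser XHRs this script, the script cURLs
// the real resource, normalizes it, and returns it. The URL is a placeholder.
$ch = curl_init('http://localhost/data/source.json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
$raw = curl_exec($ch);
curl_close($ch);

// "Ensure the data is in the right format" - placeholder normalization step.
$data = ($raw !== false) ? json_decode($raw, true) : null;
if (!is_array($data)) {
    $data = array();    // fall back to an empty set if the source was malformed
}

header('Content-Type: application/json');
echo json_encode($data);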