I'm making 3 AJAX requests, one right after another, to a PHP file.
$.post("page/1");
$.post("page/2");
$.post("page/3");
They all seem to wait for one another to finish before firing. I can watch the activity in the network console and see that they really do wait for each other. I can't figure out how to get them to fire all at once.
I am using PHP sessions, which I've seen suggested here as a possible cause, but adding session_write_close() as recommended doesn't help. I don't think PHP is the issue, because in that case the browser would at least have fired the 3 AJAX calls and they would then queue up on the server. That's not the case: they wait for each other, one after another.
I've checked jQuery's $.ajaxSettings and async is set to true. I'm using jQuery 1.7.2 from the CDN.
Any ideas on how to get these to fire all at once?
Thanks!
By default, you can have about 4 concurrent HTTP requests; for XMLHttpRequests it's sometimes 6 (Firefox), but it varies by browser.
The limits appear to be applied per domain.
See this related question.
My use case is, for example, sending three Facebook and two GA/Firebase events from server-side code to the corresponding tracking endpoints, but doing so significantly increases the loading time of each page view.
I tried to implement it as a loop as described here, and tried wp_remote_post's blocking argument in combination with timeout => 0.01, but all of these still increase the page load significantly. Adding the redundant method argument to wp_remote_post, as mentioned here, helped a bit, which seems very confusing to me.
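Roughly what the non-blocking attempt looks like (the endpoint URL and payload here are just placeholders):
// Non-blocking "fire and forget" attempt; URL and body are placeholders.
wp_remote_post( 'https://example.com/tracking-endpoint', array(
    'method'   => 'POST',  // the "redundant" method argument mentioned above
    'blocking' => false,   // don't wait for the response body
    'timeout'  => 0.01,    // give up on the socket almost immediately
    'body'     => array( 'event' => 'page_view' ),
) );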
Without the post request the page loads in 70 ms; with it, load time goes up to 600 ms. Toggling the blocking argument made little difference in load time.
I also tried wp_schedule_single_event( time(), 'trigger_async_schedule_hook' );, but this didn't work in my local Docker setup, I guess due to missing cron abilities.
Since PHP's execution is tied to the HTTP request, this doesn't seem really possible at first glance. I'm used to Node, where you can send requests and await them in another context, i.e. after the page has loaded.
Which is the way to go here? I thought about an external queue outside of PHP execution, but that seems like overkill. Other WordPress async ideas would also be very helpful.
PS: I've searched all over but couldn't find an exact solution.
Cheers
The page really needs to load fast, but the DB is slow, so we split the work into two DB calls, one fast and one slow. The fast one runs first, and from its result we can serve a part of the page that is quite usable by itself.
But then we want the second request to go off, and we know it will ALWAYS be needed whenever the first one runs. So the first part of the page contains a script that fires off an HTTP request, which then makes the second DB call, and finally the rest of the page loads.
But this is a serial operation: the first part of the page load has to finish its DB call, return over HTTP, render in the browser, run the script, make another HTTP request, wait for the second DB call, and only then return the whole page to us.
How do you go about solving this in PHP? We don't have memcached, and I looked into FIFOs, but we don't have the posix_mkfifo function either.
I want to make both DB calls on the first request, serve the first part of the page, and let the second DB call keep running. When it finishes, I want to keep the result in /tmp/ or a buffer or wherever is fast, in memory, so that when the script asks for it, its HTTP request either has to wait a little longer or, if it's lucky, gets the result served from memory right away.
But where in memory do you keep it, across requests and PHP instances? Not in a global, not in the session, not in memcached. Where? Sockets? Should I fork and pipe?
EDIT: Thanks, everybody. I went with the two-async-http-requests route.
I think you could use AJAX.
The first time, send the HTML page with 2 JavaScript AJAX calls, one for each SQL query, triggered by page load.
Then fill in the page asynchronously with those results.
The problem is that your problem is too complex to solve without extra tools like memcached. Directly in PHP you can store short-lived data in SHM (System V shared memory), but that's not the best solution.
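For example, a rough sketch, assuming the sysvshm extension is available (the key derivation and segment size are arbitrary):
// Request A: store the slow query's result in a shared memory segment.
$key = ftok(__FILE__, 'a');          // derive an IPC key from this file
$shm = shm_attach($key, 65536);      // attach a 64 KB segment
shm_put_var($shm, 1, $slowResult);   // $slowResult comes from the slow DB call
shm_detach($shm);
// Request B: read it back, if it is there yet.
$shm = shm_attach($key, 65536);
if (shm_has_var($shm, 1)) {
    $slowResult = shm_get_var($shm, 1);
}
shm_detach($shm);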
The best solution is to build a better database structure, so you get a better result and a faster response from your database.
For better database performance you can look at MySQL MEMORY tables. But be careful: those tables are cleared after a restart, so only use them to cache data you can rebuild.
And you can send more than one request at a time with Ajax.
I'd like to use AJAX to asynchronously start one PHP script, and then use another AJAX request to get information about its progress... but
I can't: the second request waits for the first one. Is this caused by the nature of PHP? I guess it could also be caused by both scripts using the session. Am I right?
//edit
Regarding the questions in the comments: I use the Opera web browser. I will also check Firefox with Firebug, as you recommended.
Once you have created one XMLHttpRequest, it may be that you need another, separate one for polling the progress. There are some libraries available for this use case.
I'm working on a simple PHP application, using CouchDB and PHP-on-Couch to access some views, and it's working great. My next step is to introduce Ajax to update the frontend with data from the database.
I understand you can use the _changes notifications to detect any changes made to the database easily enough. So it's a matter of index.html monitoring for changes (through long polling), and calling loadView.php to update the page content.
Firstly, I hope the above is the correct method of going about it...
Secondly, when browsing to index.html, the page never seems to fully load (the page-load bar never completes). When a change is made, Firebug shows the results as expected, but not for any subsequent changes; by that time, the page seems to have stopped its infinite loading.
So far, I'm using jQuery to make the Ajax call...
$.getJSON('http://localhost:5984/db?callback=?', function(db) {
    console.log(db.update_seq);
    $.getJSON('http://localhost:5984/db/_changes?since=' + db.update_seq + '&feed=continuous&callback=?', function(changes) {
        console.log(changes);
    });
});
Any ideas what could be happening here?
I believe the answer is simple enough.
A longpoll query is AJAX, guaranteed to respond only once, like fetching HTML or an image. It may take a little while to respond while it waits for a change; or it may reply immediately if changes have already happened.
A continuous query is COMET. It will never "finish" the HTTP reply, it will keep the connection open forever (except for errors, crashes, etc). Every time a change happens, zoom, Couch sends it to you.
So, in other words, try changing feed=continuous to feed=longpoll and see if that solves it.
For background, I suggest the CouchDB Definitive Guide on change notifications and of course the excellent Couchbase Single Server changes API documentation.
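Since you're in PHP anyway, the difference is easy to see server-side too. A rough sketch with cURL (the database URL and since value are placeholders):
// Longpoll: curl_exec() blocks until one change arrives, then returns once.
// With feed=continuous it would never return, because the reply never ends.
$url = 'http://localhost:5984/db/_changes?feed=longpoll&since=0';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
$changes = json_decode($response, true);
print_r($changes);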
Morning,
I have some doubts about the way PHP works. I can't find the answer anywhere in books, so I thought I'd hit the Stack ;)
So here it goes:
Let's assume we have one single server with PHP + Apache installed. Here are my beliefs:
1 - PHP can handle one request at a time. It doesn't matter whether Apache can handle more than 1 thread at a time, because eventually the invoked PHP interpreter is single-threaded.
2 - From belief 1 it follows that if the server receives 4 calls at the very same time, these calls are queued up and executed one at a time. Whoever makes the request last gets the response last.
3 - From 1 and 2 it follows that if I cron-call a URL corresponding to a script that does some heavy-lifting/time-consuming work, I slow down the server until the moment the script returns.
What's true? What's false?
cheers
My crystal ball suggests that you are using PHP sessions and you are having simultaneous requests (either iframes or AJAX) getting queued. The problem is that the default session handler uses files, and session_start() locks the data file. You should read your session data quickly and then call session_write_close() to release the lock.
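A minimal sketch of that pattern (the long-running work here is just a stand-in):
<?php
session_start();
// Read whatever you need from the session while the lock is held.
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
// Release the session file lock so parallel requests from the same
// browser are no longer serialized behind this one.
session_write_close();
// The long-running work continues without blocking the other requests.
sleep(5);
echo 'done';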
I see no reason why PHP would be unable to handle multiple requests at the same time. That said, it may be semi-true for handling requests from a single client, depending on the type of script.
Many scripts use sessions. When session_start() is called, the session is opened and locked; when the script finishes, the session is closed and unlocked (this can also be done manually). When there are multiple requests for the same session, the first request opens and locks the session, and the second request has to wait until the session is unlocked.
This can give the impression that multiple PHP scripts cannot execute at the same time, but that is (partly) true only for requests that use the same session, in other words, requests from the same browser. Requests from two clients (browsers) may be processed in parallel as long as they don't use resources (files, DB tables, etc.) that are locked/unlocked by other requests.
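A quick way to see the lock in action (a sketch with two scripts, opened in two tabs of the same browser):
// slow.php: holds the session file lock while it sleeps.
session_start();
// session_write_close();  // uncommenting this releases the lock early
sleep(10);
echo 'slow done';
// fast.php: from the same browser this blocks inside session_start()
// until slow.php releases the session lock; from a different browser
// it returns immediately.
session_start();
echo 'fast done';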