How do I make scalable PHP for rapid Ajax requests? - php

I have a program in which an Ajax request goes out to a PHP script that does all the behind-the-scenes work of grabbing the data from the database and then returns it as responseText to be inserted into my HTML, updating the page without a refresh. I fire this request once every 250 milliseconds, so the PHP script gets 4 hits per second for every user on the HTML page firing the Ajax requests. I am already seeing PHP crash with just a few computers on at the same time, so I'm guessing the problem has something to do with the volume of requests hitting PHP. Is there a way to structure these requests so that a lot of users can get on without this scalability issue coming into play?

First of all, firing that many Ajax requests isn't a good idea: as the number of users increases, the load on the server serving those requests multiplies accordingly. Second, you need to consider how you will scale the application and the database. I guess you are already returning json_encoded data from the server; if not, make it so.

4 requests per second is nothing for PHP. So, in short: to make PHP scalable, make your code scalable.

Related

DB Connection on Ajax

I have a chat on my website, and it runs on AJAX calls. Knowing that the PHP script is being run 2-3 times per second, is it a bad idea to connect to a database and pull/insert data? I am wondering if it will slow down my PHP significantly, or not change it much at all.
Sorry, I can't comment yet, so I don't know if this is the answer you need.
Basically, yes, this will cause a lot of traffic to your database. Depending on the web server, that might not be a big deal, but a client on an old, slow machine can end up with a lagging tab, because their browser is constantly sending requests to your server and waiting for the answers.
Still, I think this is the easiest method to get the live data you need for a chat. In my opinion, though, I would suggest running the AJAX request only about every 2 seconds. I don't know exactly what your purpose is, but for a normal chat (not real-time data exchange) that will do.
By the way, I am also not sure how you are initializing your AJAX request, but I would suggest doing it with jQuery:
$(document).ready(function(){
    call_php_ajax();
});

function call_php_ajax(){
    $("#div_toinsert").load("ajax.php");
    setTimeout(call_php_ajax, 3000); // 3 seconds
}
And in your MySQL query file (ajax.php) you perform your queries.
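For reference, the ajax.php side of this polling loop can stay very small; here is a minimal sketch, assuming a PDO connection and a hypothetical `messages` table (none of these names come from the question):

```php
<?php
// ajax.php -- hypothetical endpoint the jQuery snippet above polls.
// Keeping the payload small (JSON, only new rows) keeps each poll cheap.

function render_messages(array $rows): string {
    return json_encode($rows);
}

if (PHP_SAPI !== 'cli') {
    // DSN, credentials, table and column names are all assumptions.
    $pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');

    // Only fetch messages newer than the id the client already has.
    $since = (int)($_GET['since'] ?? 0);
    $stmt  = $pdo->prepare(
        'SELECT id, author, body FROM messages WHERE id > ? ORDER BY id LIMIT 50'
    );
    $stmt->execute([$since]);

    header('Content-Type: application/json');
    echo render_messages($stmt->fetchAll(PDO::FETCH_ASSOC));
}
```

With this shape, the client can call `ajax.php?since=<last id>` instead of plain `ajax.php`, so repeated polls return only what changed since the last one.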

How to parallelize requests without memcache in PHP?

The page really needs to load fast, but the DB is slow, so we split the work into two DB calls, one faster and one slower. The faster one runs first, and with it we can serve a part of the page that is quite usable by itself.
But then we want the second request to go off, and we know it will ALWAYS be necessary whenever the first request goes off. So the first part of the page contains a script which fires off an HTTP request, which then makes a DB call, and finally the rest of the page loads.
But this is a serial operation: the first part of the page load has to finish its DB call, return over HTTP, render in the browser, and run the script; the script then makes its HTTP request, waits for the DB, and finally returns the whole page.
How do you go about solving this in PHP? We don't have memcache, and I looked into FIFOs, but we don't have the posix_mkfifo function either.
What I want is to make both DB calls on the first request, serve the first part of the page, and let the second DB call continue running. When it finishes, I want to keep the result somewhere fast, in memory: /tmp/, a buffer, wherever. When the second script asks for it, perhaps its HTTP request will need to wait a little longer, or perhaps it is lucky and gets it served from memory already.
But where in memory do you keep it, across requests and PHP instances? Not in a global, not in the session, not in memcached. Where? Sockets? Should I fork and pipe?
EDIT: Thanks, everybody. I went with the two-async-http-requests route.
I think you could use AJAX.
First, send the HTML page with two JavaScript AJAX calls, one for each SQL query, triggered by page load.
Then fill in the page asynchronously with those results.
The problem is that your problem is too complex to solve without an extra layer such as memcache. Directly in PHP you can keep short-lived data in shared memory (SHM), but that is not the best solution.
The best solution is to build a better database structure, so you get a better result and a faster response from your database.
For better performance you can also look at MySQL MEMORY tables. But be careful: those tables are cleared after a server restart, so only use them to hold data you can refill, i.e. as a cache.
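As a sketch of the MEMORY-table idea (every identifier here is made up; note that MEMORY tables don't support TEXT/BLOB columns, hence the VARCHAR):

```php
<?php
// Cache hot data in a MySQL MEMORY table. The contents live in RAM and
// are lost on server restart, so treat it strictly as a refillable cache.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$pdo->exec('CREATE TABLE IF NOT EXISTS hot_cache (
    cache_key VARCHAR(64) NOT NULL PRIMARY KEY,
    payload   VARCHAR(8000) NOT NULL,
    updated   INT NOT NULL
) ENGINE=MEMORY');

// Refill on startup or on a cache miss from the slow source tables
// (slow_source stands in for whatever the expensive query reads).
$pdo->exec("REPLACE INTO hot_cache (cache_key, payload, updated)
            SELECT 'front_page', payload, UNIX_TIMESTAMP()
            FROM slow_source LIMIT 1");
```

Reads then hit `hot_cache` instead of the slow tables, which is essentially a poor man's memcache living inside MySQL.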
And you can send more than one request at a time with Ajax.

Finish PHP script before executing

I'm making a multiplayer smartphone game with PHP as my backend. When a player makes a move, info is sent to my PHP script, which executes and at the end sends some info back to the smartphone. When a lot of players are playing at the same time (right now I have a workload of 2-4 requests per second) the response time gets a bit long... I have about 3-4 different SELECT queries and 4-5 UPDATE/INSERT queries in my script.
I have been looking into stored procedures and am thinking of maybe using AJAX, but I'm not sure. What I want to accomplish is to get the response time down on the smartphone by sending the data back as soon as possible in the script and then executing the rest afterwards!
What is the best way to approach this?
Thanks for all your advice in advance ;-)
Ajax is a good option because it doesn't re-render the whole page or rerun all the queries the full page needs.
Then, have you checked your queries: do they all use indexes? Have you avoided unnecessary joins?
So, two approaches: Ajax with a reduced response transfer (JSON carrying very little information is good), and optimizing your database.
If you run PHP as FastCGI (php-fpm), then you can return the response and continue the script in the "background".
See function fastcgi_finish_request
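Under php-fpm the pattern looks roughly like this; `fastcgi_finish_request()` is a real PHP function, while the queries and variables are placeholders standing in for the game logic:

```php
<?php
// Answer the smartphone as early as possible in the script.
header('Content-Type: application/json');
echo json_encode(['ok' => true, 'move' => 'accepted']);

// Flush the full response and close the connection (php-fpm only);
// the guard keeps the script from fataling under other SAPIs.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// The client already has its answer; the remaining UPDATE/INSERT
// queries run here without adding to the perceived response time.
// $pdo, $playerId, and the table/column names are hypothetical.
$pdo->prepare('UPDATE players SET last_move_at = NOW() WHERE id = ?')
    ->execute([$playerId]);
```

Everything before `fastcgi_finish_request()` determines the response time the phone sees; everything after it is free from the client's point of view.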

The best way to access data from a database every X seconds (asynchronously)

OK, I didn't really know how to formulate this question, and especially not the title. But I'll give it a try and hope I'm being specific enough while trying to keep it relevant to others.
If you want to run a PHP script in the background (via Ajax) every X seconds that returns data from a database, how do you do this the best way without using too much of the server's resources?
My solution looks like this:
A user visits a webpage; every X seconds that page runs a JavaScript function. The JavaScript calls a PHP script/file that queries the database, retrieves the data, and returns it to the JavaScript, which then prints the data to the page. My fear is that solving it this way will put a lot of pressure on the server if there are many (say 10,000) simultaneous visitors on the page. Is there another way to do this?
That sounds like the best way, given the spec/requirement you set out.
Another way is to add an intermediary step. If you are going to have a huge amount of traffic (otherwise this introduces no benefit and may, on the contrary, overcomplicate or slow the process), add another table that records the last time the dataset was pulled, plus a hard file (say, XML). If that "last time" is deemed too long ago, the file is regenerated from a fresh query, and this XML then feeds the result returned to the user.
So:
1. JavaScript calls the PHP script (AJAX).
2. PHP checks the DB table holding the last time the data was fully output.
3. If that time is too long ago, the "main" query is rerun and the XML file is regenerated from its output; otherwise skip to step 4.
4. Fetch the XML file and output it as appropriate for the returned AJAX.
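Sketched in PHP, with the staleness check split out as a small pure function (the file name, TTL, and the regeneration callback are all assumptions, and the file's mtime stands in for the "last time" DB table):

```php
<?php
// Step 3's "time is too great" test, kept pure so it is easy to verify.
function cache_is_stale(int $generatedAt, int $ttlSeconds, int $now): bool {
    return ($now - $generatedAt) > $ttlSeconds;
}

// Steps 1-4: serve the cached XML, regenerating it at most once per TTL.
function serve_cached(string $file, int $ttlSeconds, callable $regenerate): string {
    $mtime = @filemtime($file) ?: 0;              // when it was last generated
    if (cache_is_stale($mtime, $ttlSeconds, time())) {
        file_put_contents($file, $regenerate());  // rerun the 'main' query
    }
    return file_get_contents($file);              // step 4: return cached output
}
```

The payoff: no matter how many of the 10,000 visitors poll at once, the expensive query runs at most once per TTL window; everyone else gets the flat file.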
You can do it the other way around, contacting the client only when you need to and wasting fewer resources.
Comet is the way to go for this option:
Comet is a programming technique that enables web servers to send data to the client without any need for the client to request it. This technique will produce more responsive applications than classic AJAX. In classic AJAX applications, the web browser (client) cannot be notified in real time that the server's data model has changed. The user must create a request (for example by clicking on a link), or a periodic AJAX request must happen, in order to get new data from the server.

PHP display progress messages on the fly

I am working in a tool in PHP that processes a lot of data and takes a while to finish. I would like to keep the user updated with what is going on and the current task processed.
What is in your opinion the best way to do it? I've got some ideas but can't decide for the most effective one:
The old way: execute a small part of the script and display a page to the user with a Meta Redirect or a JavaScript timer to send a request to continue the script (like /script.php?step=2).
Sending AJAX requests constantly to read a server file that PHP keeps updating through fwrite().
Same as above but PHP updates a field in the database instead of saving a file.
Does any of those sound good? Any ideas?
Thanks!
Rather than writing to a static file you fetch with AJAX or to an extra database field, why not have another PHP script that simply returns a completion percentage for the specified task. Your page can then update the progress via a very lightweight AJAX request to said PHP script.
As for implementing this "progress" script, I could offer more advice if I had more insight as to what you mean by "processes a lot of data". If you are writing to a file, your "progress" script could simply check the file size and return the percentage complete. For more complex tasks, you might assign benchmarks to particular processes and return an estimated percentage complete based on which process has completed last or is currently running.
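The file-size variant of that "progress" script could look like this; the percentage math is the only real logic here, and the file path and expected total size are assumptions:

```php
<?php
// progress.php -- lightweight endpoint the page polls via AJAX.
function progress_percent(int $bytesWritten, int $totalBytes): int {
    if ($totalBytes <= 0) {
        return 0; // avoid division by zero before the job size is known
    }
    return (int)min(100, floor($bytesWritten * 100 / $totalBytes));
}

// Hypothetical usage: the worker appends to output.dat, and we assume the
// expected final size is known in advance.
$written = @filesize('/tmp/output.dat') ?: 0;
$total   = 5000000;
header('Content-Type: application/json');
echo json_encode(['percent' => progress_percent($written, $total)]);
```

Because this script does no heavy work itself, polling it every second or two adds almost no load next to the task it is monitoring.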
UPDATE
This is one suggested method to "check the progress" of an active script which is simply waiting for a response from a request. I have a data mining application that I use a similar method for.
In the script that makes the request you're waiting for (the script you want to check the progress of), you can store a progress variable for the process, either in a file or a database. I use a database, as I have hundreds of processes running at any time which all need to track their progress, plus another script that lets me monitor them. When the process begins, set this variable to 1. You can easily select an arbitrary number of "checkpoints" the script will pass and calculate the percentage from the current checkpoint.
For a large request, however, you might be more interested in the approximate percentage of the request itself that has completed. One possible solution is to know the size of the returned content and set your status variable according to the percentage received at any moment: if you receive the request data in a loop, update the status on each iteration, or if you are downloading to a flat file, poll the size of the file. This can be done less accurately with time rather than file size, if you know roughly how long the request should take, by comparing against the script's current execution time. Obviously neither of these is a perfect solution, but I hope they give you some insight into your options.
I suggest using the AJAX method, but not using a file or a database. You could probably use session values or something like that; that way you don't have to create a connection or open a file to do anything. One caveat: with PHP's default file-based session handler, the long-running script holds a lock on the session, so it should call session_write_close() after each status update, or the polling requests will block until it finishes.
In the past, I've just written messages out to the page and used flush() to flush the output buffer. Very simple, but it may not work correctly on every web server or with every web browser (as they may do their own internal buffering).
Personally, I like your second option the best. Should be reliable and fairly simple to implement.
I like option 2 - using AJAX to read a status file that PHP writes to periodically. This opens up a lot of different presentation options. If you write a JSON object to the file, you can easily parse it and display things like a progress bar, status messages, etc...
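On the PHP side, option 2 can be reduced to two tiny helpers; the file path and the fields in the JSON object are illustrative:

```php
<?php
// The long-running worker calls write_status() periodically; the AJAX
// endpoint just echoes read_status() back to the browser.
function write_status(string $file, int $percent, string $message): void {
    file_put_contents($file, json_encode([
        'percent' => $percent,
        'message' => $message,
    ]));
}

function read_status(string $file): array {
    $raw = @file_get_contents($file);
    // Before the worker has written anything, report a sane default.
    return $raw === false
        ? ['percent' => 0, 'message' => 'starting']
        : json_decode($raw, true);
}
```

The browser then parses the JSON and can drive a progress bar from `percent` and a status line from `message`, which is exactly the presentation flexibility mentioned above.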
A 'dirty' but quick-and-easy approach is to just echo out the status as the script runs along. As long as you don't have output buffering on, the browser will render the HTML as it receives it from the server (I know WordPress uses this technique for its auto-upgrade).
But yes, a 'better' approach would be AJAX, though I wouldn't say there's anything wrong with 'breaking it up' using redirects.
Why not incorporate 1 & 2, where AJAX sends a request to script.php?step=1, checks response, writes to the browser, then goes back for more at script.php?step=2 and so on?
If you can do away with IE, then use server-sent events. It's the ideal solution.
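In PHP, a server-sent events endpoint is just a long-lived script emitting `text/event-stream` frames, consumed in the browser with `new EventSource('events.php')`; a minimal sketch, where the once-per-second data lookup is a placeholder:

```php
<?php
// events.php -- push updates to the browser without client-side polling.
// Each SSE frame has the shape "data: <payload>\n\n".
function sse_frame(array $payload): string {
    return 'data: ' . json_encode($payload) . "\n\n";
}

if (PHP_SAPI !== 'cli') {
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    while (!connection_aborted()) {
        echo sse_frame(['time' => time()]); // replace with a real data check
        @ob_flush();
        flush();
        sleep(1);
    }
}
```

Compared with polling every 250 ms, the connection stays open and the server decides when to send, so idle users cost almost nothing.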
