Fastest way to check for updates using PHP, AJAX and MySQL

I need to check for updates at a (max) one-second interval.
I'm now looking for the fastest way to do that using AJAX for the requests, with PHP and MySQL on the server.
Solution 1
Every time new data that needs to be retrieved by other clients is added to the MySQL database, a file.txt is updated with a 1. AJAX makes a request to a PHP file which checks whether file.txt contains a 1 or a 0. If it contains a 1, it gets the data from the MySQL database and returns it to the client.
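For illustration, the PHP side of that check might look something like this (a sketch; file.txt and the getDataFromMysql() helper are placeholders):

<?php
// Solution 1: consult the flag file before touching MySQL at all
if (trim(file_get_contents('file.txt')) === '1') {
    file_put_contents('file.txt', '0');       // reset the flag
    echo json_encode(getDataFromMysql());     // hypothetical: fetch the new rows
} else {
    echo json_encode(null);                   // nothing new, no database hit
}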
Solution 2
Every AJAX request calls a PHP file which checks the MySQL database directly for new data.
Solution ..?
If there is any faster solution, I'd be happy to know! (Considering I can only use PHP/MySQL and AJAX.)

Avoiding the database will probably not make the process significantly faster, if at all.
You can use a comet-style ajax request to get near real-time polling. Basically, create an ajax request as usual to a php-script, but on the server side you poll the database and sleep for a short interval if there is nothing new. Repeat until there is something of interest for the client. If nothing appears within a timeframe of e.g. 60 seconds, close the connection down. On the client side, you only open a new connection once the first has terminated (either with a response or as a timeout).
See: https://en.wikipedia.org/wiki/Comet_(programming)
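A minimal sketch of that server side, assuming hasNewData() and fetchNewData() wrap your own MySQL queries:

<?php
// long-poll.php: hold the request open until there is news or 60 seconds pass
set_time_limit(70);                // let the script outlive PHP's default limit
$start = time();
while (time() - $start < 60) {
    if (hasNewData()) {            // hypothetical: e.g. SELECT ... WHERE id > $lastId
        echo json_encode(fetchNewData());
        exit;                      // respond the moment something appears
    }
    usleep(250000);                // sleep 250 ms between database polls
}
echo json_encode(null);            // timed out: the client simply reconnects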

Related

Show real-time data to users with the same response time

I have doubts regarding speed and latency when showing real-time data.
Let's assume I want to show real-time data to users by firing an ajax request every second that gets data from a MySQL table with a simple collection query.
Currently these two options come to mind:
MySql / Amazon Aurora
File system
Which of these options would be better? Or is there another solution?
As I checked practically, if we open one page in the browser, the ajax requests respond in less than 500 ms on a PHP, MySQL, Nginx stack.
But if we open more pages, the same ajax requests take more than 1 second to respond, when it should be less than 500 ms for every visitor.
So as visitors increase, the ajax responses get very poor.
I also tried Node.js + MySQL, with the same result.
Is it good to create JSON files for the records and fetch the data from a file? Or is there another solution?
Indeed, you have to use a database to store the actual data, but you can easily add a memory cache (it could be an internal dictionary or a separate component) to track when updates actually happen.
Then your typical ajax request will look something like:
Memcache, do we have anything new for user 123?
Last update was 10 minutes ago
Aha, so nothing new; let's return null.
When you write data:
Put the data into the database
Update the lastupdated time for clients in memcache
The actual key might be different, e.g. a chat room id. The idea is to read the database only when updates have actually happened.
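A rough sketch using the Memcached extension; the key scheme, $userId, $clientLastSeen and the fetchUpdatesFromDb() helper are all illustrative:

<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// write path: run this right after inserting new data into MySQL
$mc->set("lastupdated:$userId", time());

// read path (the ajax poll): consult the cache before touching MySQL
$lastUpdated = $mc->get("lastupdated:$userId");
if ($lastUpdated === false || $lastUpdated > $clientLastSeen) {
    echo json_encode(fetchUpdatesFromDb($userId));  // hypothetical helper
} else {
    echo json_encode(null);      // nothing new: no database query at all
}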
Level 2:
With a high number of calls you will burn out your web server and the clients' connections. You can do something like this (sketched in PHP; hasUpdates() and getUpdates() are placeholders):
$start = time();
while (time() - $start < 30) {     // hold the request for up to 30 seconds
    if (hasUpdates()) {
        return getUpdates();       // respond as soon as there is something new
    }
    usleep(100000);                // sleep 100 ms between checks
}
Then the client will call the server once every 30 seconds.
The client will get a response immediately when the server notices new data.

Insert JSON data into MySQL instantly on JSON change

I have a changing JSON file, http://91.205.205.18:8000/info.xsl?mount=/128 (it contains icecast2 statistics), and I can't work out how to detect when the JSON data changes so that I can parse it and insert the parsed data into the database immediately. Do I need to run some PHP script via cron every 3-5 minutes?
P.S. Sorry for my bad English ;)
Write a cron job that gets the JSON data from http://91.205.205.18:8000/info.xsl?mount=/128 and then checks the received data against your database: if the data is already present in the db, discard it; otherwise save it.
The first time, store a timestamp in the database alongside the imported JSON data (the time value is provided in the JSON). After that, on each run, compare only the JSON timestamp: if it differs, insert/update; otherwise ignore it. You can set up a cron job for this insert/update process.
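A sketch of such a cron script; the 'timestamp' field and the table/column names are assumptions about your data:

<?php
// fetch-icecast.php: run from cron, e.g. */5 * * * * php /path/to/fetch-icecast.php
$json = json_decode(file_get_contents('http://91.205.205.18:8000/info.xsl?mount=/128'), true);
$pdo  = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
$stored = $pdo->query('SELECT MAX(feed_time) FROM icecast_stats')->fetchColumn();
if ($json['timestamp'] != $stored) {        // only write when the feed has moved on
    $stmt = $pdo->prepare('INSERT INTO icecast_stats (feed_time, data) VALUES (?, ?)');
    $stmt->execute([$json['timestamp'], json_encode($json)]);
}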
You will have to set up a cron job to check at an interval. On a project I work on, we have a section of the program checking constantly (every 2 seconds) for some XML data to change on a website. This is the only solution if you don't have control of the source, as we do not.
If you truly need a form of push-back notification and you are in control of the source data, you can use a PubSubHubbub server:
https://code.google.com/p/pubsubhubbub/
I haven't set up a PubSubHubbub server, but it is the best solution if you need real-time data updates pushed from one output to a remote input.

Async web calls in PHP

I have a db with over 5 million rows, and for each row I have to do an HTTP POST to a server with some parameters, at a maximum rate of 500 concurrent connections. Each POST request takes 12 seconds to process, so as old connections complete I have to open new ones and maintain ~500 connections. I then have to update the DB with the values returned from these web calls.
How do I make the web calls described above?
My app is in PHP. Can I use PHP, or should I switch to something else for this?
Actually you can definitely do this with PHP using a technique called long-polling. Basically how it works is the client machine pings the server and says "Do you have anything for me" and the server sees that it does not. Instead of responding it holds onto the request and responds when it has something to send.
Long polling is a method that is used by both DrupalChat and the APE project (AJAX Push Engine).
http://drupal.org/project/drupalchat
http://www.ape-project.org/
Here is some more info on push tech: http://en.wikipedia.org/wiki/Push_technology and http://en.wikipedia.org/wiki/Comet_%28programming%29
And here is a stackoverflow post about it: How do I implement basic "Long Polling"?
Now, I have to say that 12 seconds is really dang long for a DB query to run. It sounds like either the query or the DB (or both) needs to be optimized. Have you normalized the database and set up good table and inter-table indexing?
Now, as for preventing DB update collisions, you need to use transactions (which both PostgreSQL and newer versions of MySQL offer, along with most enterprise DB systems). Transactions allow you to roll back db changes, reserve table IDs, and things like that.
http://en.wikipedia.org/wiki/Database_transaction
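With PDO, for example, wrapping an update in a transaction looks roughly like this (the connection details, table and column names are made up):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->beginTransaction();
try {
    $stmt = $pdo->prepare('UPDATE jobs SET result = ? WHERE id = ?');
    $stmt->execute([$result, $id]);
    $pdo->commit();       // make the change visible to other connections
} catch (Exception $e) {
    $pdo->rollBack();     // undo everything, leaving no half-applied update
    throw $e;
}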
PHP isn't the right tool for long-running scripts, since by default it has a maximum execution time which is pretty short. You might look into using Python for this task. Also note that you can call external scripts from PHP (such as Python scripts) using the system() function, if the only reason you're using PHP is to make the web front-end easy to integrate.
However, you can do this in PHP with a cron job, by simply having your PHP script handle a single row at a time and having the cron job call the PHP script every second. Just maintain the index into the table elsewhere (either elsewhere in the DB, or just write the number to a file).
If you wanted to saturate your 500-connection limit, have your script do 40 rows at a time: 40 rows per second is roughly 500 rows per 12 seconds.
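A sketch of that batching approach; note that plain cron fires at most once per minute, so a per-second schedule needs a small wrapper loop. The offset file, table and httpPost() helper are placeholders:

<?php
// batch.php: run once per second; each run claims the next 40 rows
$offset = (int) @file_get_contents('/tmp/batch.offset');
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows   = $pdo->query("SELECT id, params FROM jobs LIMIT $offset, 40")->fetchAll();
foreach ($rows as $row) {
    $response = httpPost($row['params']);              // hypothetical curl wrapper
    $pdo->prepare('UPDATE jobs SET result = ? WHERE id = ?')
        ->execute([$response, $row['id']]);
}
file_put_contents('/tmp/batch.offset', $offset + count($rows));  // remember progress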

PHP+AJAX with MySQL - Query every 2 seconds, too many in TIME_WAIT

I have a basic HTML file, using jQuery's ajax, that is connecting to my polling.php script every 2 seconds.
The polling.php script simply connects to MySQL, checks for IDs newer than my hidden, stored current ID, and then echoes if there is anything new. Since the javascript is connecting every 2 seconds, I am getting thousands of connections in TIME_WAIT, just for my client, because my script is re-connecting to MySQL over and over again. I have tried mysql_pconnect but it didn't help.
Is there any way I can get PHP to open one connection and keep querying over it, instead of reconnecting every single time and creating all these TIME_WAIT connections? I'm unsure what to do here to make this work properly.
I actually ended up doing basic long polling. I made a simple PHP script that runs an infinite while loop and queries every 2 seconds. If it finds something new, it echoes it out and breaks the loop. My jQuery simply ajax-connects to it and waits for a response; on response, it updates my page and restarts the polling. Very simple!
PS: the long-polling method also reduces browser memory issues, and it drastically reduces the TIME_WAIT connections on the server.
There's no trivial way of doing this, as pconnect doesn't work across multiple web page calls. However, some approaches to minimise the database throughput would be:
Reduce the polling frequency. (Polling every 2 seconds is perhaps a bit excessive?)
Have a "master" PHP script that runs every n seconds, extracts the data from the database and saves it in the appropriate format (serialised PHP array, XML, HTML data, etc.) in the filesystem. (I'd recommend writing to a temp file and then renaming it over the existing one, to minimise the risk of partial reads.) The Ajax-requested PHP page would then simply use the information in this data file.
In terms of executing the master PHP script, you could either use cron or simply have the first user who requests the page regenerate the file when its contents are deemed too stale. (You could use the data file's timestamp for this purpose, via the filemtime function.) I'd personally use the latter approach, as cron is overkill for this purpose.
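A rough sketch combining both ideas, with the write-to-temp-then-rename trick and the filemtime staleness check (the path, threshold and queryDatabase() helper are illustrative):

<?php
// data.php: the Ajax-requested page; regenerates the data file when it is stale
$file = '/tmp/poll-data.json';
if (!file_exists($file) || time() - filemtime($file) > 5) {
    $rows = queryDatabase();           // hypothetical: your MySQL extraction
    $tmp  = $file . '.tmp';
    file_put_contents($tmp, json_encode($rows));
    rename($tmp, $file);               // atomic swap: readers never see a partial file
}
header('Content-Type: application/json');
echo file_get_contents($file);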
You could take this even further and use memcached instead of a flat file, etc. if so required. (That said, it would perhaps be an over-complex solution at this stage of affairs.)

Is the PHP file instantiated on every AJAX call?

I was just wondering how the PHP is behaving in the background.
Say I have a PHP script which creates an array and populates it with names.
$names = Array("Anna", "Jackson" .... "Owen");
Then I have an input field which sends its value to the PHP script on every keypress, to check for names containing that value.
Will the array be created on every call? I also sort the array before looping through it, so the output is alphabetical. Will this take up time in the AJAX call?
If the answer is yes, is there some way to get around that, so the array is ready to be looped through on every call?
There's no difference between an AJAX request and a "normal" http request. So yes, a new php instance will be created for each request. If the time it takes to parse the script(s) is a problem you can use something like APC.
If those arrays are created at runtime and the time this takes is a problem, you might store and share the values between requests in something like memcache.
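With APCu (the successor to APC's user cache), for instance, the sorted array could be kept between requests like this; the cache key and the buildNames() helper are made up:

<?php
$names = apcu_fetch('sorted_names');
if ($names === false) {                    // first request: build and cache the array
    $names = buildNames();                 // hypothetical expensive construction
    sort($names);
    apcu_store('sorted_names', $names, 300);   // keep it for 5 minutes
}
// $names is now ready to be searched on every subsequent call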
No matter what method you use to create the array (whether it's in the code, pulled from a database, a text file or any other source), when the web server gets an http request (whether it's Ajax or not) it will start executing the PHP script, allocate its memory, and the array will be created.
There's only one entry point for a PHP script, and that's its first line, reached when an http request points to it (or when another script includes it, which is the same thing).
As far as I know, the array will have to be created each time, as the AJAX call makes a new server request on each keypress in the input field. Each server request will create the array if the script is written to do so.
A better method would be to use a database to store the names.
Yes, it will be created and destroyed every time you run the PHP script.
If this is a problem, you could look at persisting this data somewhere (e.g. in a session or in a database), but I would ask whether it is really causing you so much of a performance problem that you need to do this.
(It's not a direct answer to your question, but it can help if you are concerned about performance.)
You say this:
"Then I have an input field which will send the value on every keypress to the PHP"
In this case, it is common practice not to send the request to the server as soon as a key is pressed: you generally wait a short while (between 100 ms and 150 ms, I'd say) to see if there is another keypress in that interval:
if the user types several keys, they usually type faster than the delay you are waiting, so you only send a request for the last keypress, and not for every keypress
if the user types 4 letters, you only make 1 request instead of 4, which is great for the health of your server :-)
speaking of time from the user's point of view: waiting 100 ms plus the time to go to the server, run the script, and come back is almost the same as doing it without the initial 100 ms wait; so, not bad for the user
As a sidenote: if your list of data is not too big (20 names is definitely OK; 100 names would probably be OK; 1000 might be too much), you could store it directly as a Javascript array and not do an Ajax request at all: it's the fastest way (no client-server call), and it won't load your server at all.
