Current situation: I have a web page that uses AJAX/jQuery to refresh all the content on the page every 17 seconds. Every time this happens, the server queries the database for data from two tables, one of which is large (450 MiB, 11 columns), and processes all the data.
This is too resource intensive. I want:
1) The server queries the database only when one of the two tables has changed.
2) The page then refreshes itself through AJAX only when the tables have been updated and the server has re-processed the data.
I think this falls under the category of Comet programming, but I'm not sure.
2 is easy. The web page calls 'update.php' every 17 (or maybe fewer) seconds. The PHP script returns no data if no changes have been made; only if data is returned is the current page replaced with the new data. Please advise me if there is a better way.
As for 1, my googling tells me that every time one of my two tables is updated I can put a notification in a table (or maybe just a single byte in a file) to indicate that I must query the database again; then, the next time the web page sends an AJAX request, I return the data.
The problem is that I'm working with a rather large code base I didn't write, and I don't know all the places where either of the two tables may be updated. Is there an easier way to check when the database is modified?
I'm working with PHP, Apache, Drupal and MySQL.
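A minimal sketch of what that update.php could look like, assuming a hypothetical table_change_log notification table that something (e.g. a trigger) stamps whenever either table changes - none of these names come from the actual code base:

<?php
// update.php - returns re-processed data only if something changed.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$since = isset($_GET['since']) ? (int)$_GET['since'] : 0; // client's last fetch
$lastChange = (int)$pdo->query(
    "SELECT UNIX_TIMESTAMP(MAX(changed_at)) FROM table_change_log"
)->fetchColumn();

if ($lastChange <= $since) {
    http_response_code(204); // no body: the page keeps its current content
    exit;
}

// Something changed: re-query and re-process the two tables, then return
// the fresh payload plus the timestamp the client should send next time.
header('Content-Type: application/json');
echo json_encode(array('since' => $lastChange, 'html' => '...'));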
You can check out Server-Sent Events.
A server-sent event is when a web page automatically gets updates from a server.
There is an excellent article on HTML5rocks.com: Server-Sent Events.
All you have to do is create an EventSource object:
var source = new EventSource('xyz.php'); // your PHP file, which will return the updates
Once you have created the object, you can listen for events:
source.addEventListener('message', function(e) {
    console.log(e.data);
}, false);
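The server half isn't shown above. As a minimal sketch (not from the answer, assuming a plain PHP stack), xyz.php could look like the following; EventSource reconnects automatically when the script ends, so even this one-shot version produces a stream of updates:

<?php
// xyz.php - minimal Server-Sent Events endpoint (illustrative sketch).
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

// Placeholder payload; a real script would check its data source here.
$payload = json_encode(array('time' => date('H:i:s')));

// SSE wire format: "data: ..." lines terminated by a blank line.
echo "data: {$payload}\n\n";
flush();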
I have a doubt regarding speed and latency when showing real-time data.
Let's assume that I want to show real-time data to users by firing AJAX requests every second that get data from a MySQL table with a simple query.
For that, these two options currently come to mind:
MySQL / Amazon Aurora
File system
Which of these options would be better? Or is there another solution?
From practical testing: with one page open in the browser, the AJAX requests respond in under 500 ms on a PHP, MySQL and Nginx stack.
But with more pages open, the same AJAX requests take more than 1 second, when the response should stay under 500 ms for every visitor.
So as the number of visitors grows, AJAX response times become very poor.
I also tested with Node.js + MySQL, with the same result.
Is it a good idea to create JSON files for the records and fetch the data from those files? Or is there another solution?
Indeed, you have to use the database to store the actual data, but you can easily add a memory cache (either an in-process dictionary or a separate component) to track when updates happen.
Then your typical AJAX request will look something like:
Memcache, do we have anything new for user 123?
Last update was 10 minutes ago.
Aha, so nothing new; let's return null.
When you write data:
Put the data into the database.
Update the lastupdated time for clients in memcache.
The actual key might be different - e.g. a chat room ID. The idea is to read the database only when updates have actually happened.
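A minimal sketch of that read path in PHP, assuming the Memcached extension and an illustrative lastupdated:<user> key (these names are not from the answer):

<?php
// Cache-first read path: consult memcache before ever touching MySQL.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$userId = 123; // illustrative
$clientLastSeen = isset($_GET['last_seen']) ? (int)$_GET['last_seen'] : 0;
$lastUpdated = (int)$cache->get("lastupdated:$userId"); // stamped on every write

header('Content-Type: application/json');
if ($lastUpdated <= $clientLastSeen) {
    echo json_encode(null); // nothing new: answered from memory
    exit;
}
// Only now pay for the database query and return the fresh rows.

On the write path, after the INSERT/UPDATE succeeds, stamp the key with $cache->set("lastupdated:$userId", time());.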
Level 2:
Polling like this will burn your web server and the client's bandwidth with a high number of calls. You can do something like long polling instead:
DateTime start = DateTime.Now;
while (DateTime.Now.Subtract(TimeSpan.FromSeconds(30)) < start)
{
    if (hasUpdates) return updates;   // answer as soon as new data shows up
    Thread.Sleep(100);                // otherwise wait 100 ms and check again
}
return null;                          // nothing happened within 30 seconds
Then the client will call the server once per 30 seconds.
The client will get a response immediately when the server notices new data.
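In PHP terms (the asker's stack), the same long-poll loop might look roughly like this, reusing the illustrative memcache key from above:

<?php
// Long-poll sketch: hold the request open for up to 30 seconds.
set_time_limit(40); // headroom over the 30-second window
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$clientLastSeen = isset($_GET['last_seen']) ? (int)$_GET['last_seen'] : 0;
$deadline = time() + 30;

while (time() < $deadline) {
    $lastUpdated = (int)$cache->get('lastupdated:123'); // illustrative key
    if ($lastUpdated > $clientLastSeen) {
        echo json_encode(array('last_seen' => $lastUpdated)); // plus the fresh rows
        exit;
    }
    usleep(100000); // 100 ms, as in the pseudocode above
}
echo json_encode(null); // nothing in 30 seconds; the client simply re-polls

One caveat: each open long poll occupies a PHP worker (and a session lock if sessions are open), so this suits low-concurrency pages better than high-traffic ones.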
I need some advice on performance...
We have a long-running PHP script (it could potentially take over 10 minutes) with progress updates firing back to the UI via Server-Sent Events: http://www.html5rocks.com/en/tutorials/eventsource/basics/
It's all running fine, but I'm worried about it hammering the DB.
The rough flow is this:
1) A user goes to the 'publish' page of the app.
2) The UI opens an EventSource stream to a PHP script that monitors whether a publish is in progress, and reports progress events back if so - checking every second.
3) If the user initiates a publish, it fires an AJAX call to the long-running PHP script.
4) The EventSource will then report events back for this publish.
The monitoring is done by storing progress in a MySQL table.
The long-running script writes progress to the DB, and the event script checks it every second.
As I said, it all works, except that it's hitting the database every second for a lookup - for every page left open on the publish page.
It is a low-user (sub-100 at the moment, though this could increase), low-frequency application, so it's unlikely there will be more than a handful of people on that page at the same time - but still, it doesn't take much.
It's all hosted on AWS - a micro DB instance at the moment.
So I suppose my questions are:
1) Hitting the DB every second for each publish page session - is this bad? Should I be worried?
2) What are the alternatives to hitting the DB - writing to a file or to memory instead? (Bad if we ever load balance.)
3) There is no way to get the PHP event script notified when the MySQL table updates, is there?
If I extend the interval to more than a second, the UI progress is pretty lame (it skips too much info).
I could dynamically change the update interval: every second when a publish is in progress, dropping down to every 5 or 10 seconds when it's not?
Rewriting in Node or using a separate notification server is not really an option - I just want simple progress events for a long-running script!
Any advice much appreciated.
m
1) Hitting the DB every second for each publish page session - is this bad? Should I be worried?
This will probably kill your DB quickly if your user base grows.
2) What are the alternatives to hitting the DB - writing to a file or to memory instead? (Bad if we ever load balance.)
A file would be easiest, although you could potentially use SQLite and take your primary database out of the equation for this particular issue.
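A sketch of that SQLite variant, with an illustrative path and schema (not from the answer):

<?php
// Keep publish progress in a local SQLite file instead of the main DB.
$db = new PDO('sqlite:/tmp/publish_progress.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS progress (job TEXT PRIMARY KEY, pct INTEGER, msg TEXT)');

// The long-running script writes:
$stmt = $db->prepare('REPLACE INTO progress (job, pct, msg) VALUES (?, ?, ?)');
$stmt->execute(array('publish', 42, 'Uploading assets...'));

// The event script reads:
$row = $db->query("SELECT pct, msg FROM progress WHERE job = 'publish'")->fetch();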
3) There is no way to get the PHP event script notified when the MySQL table updates, is there?
Sure there is, in a roundabout way: have the event script listen for changes to a cache file that gets written to whenever your MySQL database updates.
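A sketch of that roundabout route, with an illustrative marker file that the long-running script (or whatever writes to MySQL) touches after each update:

<?php
// SSE script: poll a marker file's mtime instead of querying MySQL.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$marker = '/tmp/publish_progress.touch'; // touched after every DB write
$lastMtime = 0;

while (!connection_aborted()) {
    clearstatcache(true, $marker); // PHP caches stat results otherwise
    $mtime = @filemtime($marker);
    if ($mtime && $mtime > $lastMtime) {
        $lastMtime = $mtime;
        // Only now hit the DB for the actual progress row.
        echo "data: " . json_encode(array('changed' => $mtime)) . "\n\n";
    } else {
        echo ": heartbeat\n\n"; // SSE comment; also lets PHP notice closed tabs
    }
    flush();
    sleep(1);
}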
Many desktop applications (e.g., those built with Delphi) use "database-aware components", such as a grid, which display the contents of a database - usually the result of a query - and automatically update their display when the database contents change.
Does such a thing exist for PHP (maybe displaying the result of a query in an HTML table, and keeping it updated)?
If not, how would we go about creating it?
(Note: This seemingly similar question didn't help me)
It's not technically possible to update an HTML page once rendered with pure PHP, because of the request/response nature of the HTTP protocol, so any solution will have to include JavaScript and AJAX calls.
Emulating it by using AJAX to re-render the table every 5 seconds
It wouldn't be hard to emulate, though: just make a PHP page which runs the database query and puts the results in a table, then use jQuery's .load() function to fetch that table and render it in a DIV of your choice on an interval of 5 seconds or whatever.
Something like:
var url = 'renderTable.php';

function updateTable() {
    $('#tableDiv').load(url);
}

setInterval(updateTable, 5000);
You can put this in any PHP (or HTML) page with a DIV with id tableDiv, and it will render the output of renderTable.php in that div every 5 seconds without refreshing the page.
Actually monitoring the database
This is possible: you'd have to set up a PHP file on a cron every 5 seconds (or an AJAX call every 5 seconds) to run something like SELECT MD5(CONCAT(rows,'_',modified)) AS checksum FROM information_schema.innodb_table_stats WHERE table_schema='db' AND table_name='some_table'; (that table ships with Percona/XtraDB builds; stock MySQL 5.6+ exposes the similar mysql.innodb_table_stats with n_rows and last_update columns). You'd then compare this to the previous checksum, bearing in mind that InnoDB refreshes these statistics lazily, so the check is approximate.
If the checksums are identical, you could return 'false' from your modified AJAX endpoint, which would tell the client not to render anything over the table. If they differ, you could return the HTML of the new table to render in its place.
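A sketch of the server side of that comparison, assuming stock MySQL 5.6+ (so mysql.innodb_table_stats, which needs read privileges on the mysql schema) and the session as the place to remember the previous checksum:

<?php
// renderTable.php sketch: re-render only when the table checksum moves.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=db', 'user', 'pass');

$sql = "SELECT MD5(CONCAT(n_rows, '_', last_update)) FROM mysql.innodb_table_stats
        WHERE database_name = 'db' AND table_name = 'some_table'";
$checksum = $pdo->query($sql)->fetchColumn();

if (isset($_SESSION['checksum']) && $_SESSION['checksum'] === $checksum) {
    echo 'false'; // unchanged: the client skips the re-render
    exit;
}
$_SESSION['checksum'] = $checksum;
// ...query the real table and echo the fresh <table> HTML here...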
It could be done, but with a mix of different technologies.
If I wanted to monitor changes made in a database in real time, I would think about triggers and sockets: a trigger in the database should call (on insert or update) a function that adds an event to a message queue. Here's an example function for PostgreSQL (plsh is a custom shell-language handler):
CREATE FUNCTION propagate_event(tablename text) RETURNS text AS '
#!/bin/sh
# execute a script that adds an event to the message queue
/var/scripts/change_in_table.sh "$1"
' LANGUAGE plsh;
The client connects to the socket and receives those events in real time.
Currently I have a data file in Dropbox that is uploaded every 15 seconds. I want to take this data, which has several different data types, and graph the real-time data that the user selects on a website. I have a data server, but my data is not on there. Is there any way for me to take this data from the file and graph it, while also having a control panel that selects which data I want to graph?
You can refresh your web page using AJAX. Note that if your refresh is set to every 15 seconds and your data comes in every 15 seconds, the worst case is that you will show data that's almost 30 seconds old, if the timing of the data update and the AJAX refresh is unfortunate.
You probably want to check for new data using AJAX more frequently, depending on your specific needs. On the server side, cache the result of the AJAX update to avoid too much duplicate processing.
To create the data that you return from the AJAX query, open and process the data file; there is no need for MySQL. You can use the timestamp of the file to invalidate the result cache suggested in the previous paragraph.
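A sketch of that caching idea, with illustrative paths and a CSV data file assumed:

<?php
// Serve a cached JSON result until the data file's mtime changes.
header('Content-Type: application/json');
$dataFile  = '/path/to/dropbox/data.csv';
$cacheFile = '/tmp/data_cache.json';

clearstatcache();
if (file_exists($cacheFile) && filemtime($cacheFile) >= filemtime($dataFile)) {
    readfile($cacheFile); // cache is as fresh as the data file
    exit;
}

// The data file changed: parse it once and refresh the cache for all visitors.
$rows = array_map('str_getcsv', file($dataFile, FILE_IGNORE_NEW_LINES));
file_put_contents($cacheFile, json_encode($rows));
readfile($cacheFile);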
There are many JavaScript-based charting libraries that can update via AJAX. Here's a good starting point:
Graphing JavaScript Library
Is there a way that I can query an Oracle 10g database and display the results in a dynamically refreshed HTML file every 3 minutes, for example?
Here is my predicament: I have several queries whose results I would LOVE to display to a whole organization on a basic HTML web page with some CSS. The problem is that I do NOT want a user to be able to constantly refresh the page in his/her browser and thus severely bog down the database. I have no problem writing the queries, or writing the HTML and CSS needed to display the tables. It's almost as if I would like to run the queries, export the results to XML every 3 minutes, and have an HTML or PHP file constantly pointing at the dynamically updated XML file. I am open to other options as well...
I have basic user access to the Oracle DB - nothing admin-like. I do have access to a server, though, and have experience with PHP, PL/SQL, and HTML. Perhaps I would have to bring in another language like Python? I am kind of stuck here. Any kind of help would be appreciated!
You can also execute an AJAX request every 3 minutes using JavaScript's setInterval() function (setTimeout() only fires once).
Using the jQuery framework:
$(document).ready(function() {
    setInterval(getFeed, 180000); // 180000 ms = 3 minutes
});

function getFeed() {
    // AJAX request here
}
For more info on AJAX you can go here: http://api.jquery.com/jQuery.ajax/
Set up a materialized view (MV), point your app at this MV, and then set up a scheduler job to refresh it at whatever frequency you like.
See dbms_scheduler for setting up scheduler jobs in Oracle.
One note: you may want to use atomic_refresh=>true so that the refresh does deletes/inserts into the MV instead of truncate/insert (with atomic_refresh=>false, there will be 0 rows in the MV until the refresh completes).
A simple example MV creation:
create materialized view MY_MV
tablespace MY_TS
build immediate
refresh complete on demand
with primary key
as
SELECT a.foo, b.bar
from table_a a, table_b b
where a.col1 = b.col2
and a.baz='BLAH'
;
An example refresh call:
dbms_mview.refresh('MY_MV', 'C', atomic_refresh=>true);