I have an ajax application where the client might look up a bunch of data frequently (say, on every keystroke), and the data gets updated on the server side once or twice a day at fixed times by a daemon process. To avoid hitting the server on every lookup, I store the data in an XML file: the client downloads it once when the page first loads, then looks the data up from the local file via javascript.
But the user might load the page shortly before the data changes and then keep using it without ever refreshing, so the data file never gets updated and the app keeps telling the user the new data is not available.
How do I solve this issue?
You should set the appropriate HTTP cache headers for that generated XML file so any client request past that time will get the new version and cache it locally, like any other static content.
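A minimal sketch of what that could look like in PHP, assuming the XML is served by a script and the daemon runs at a fixed hour (the 18:00 cutoff and file path below are illustrative assumptions, not from the original setup):

<?php
// serve-data.php -- hypothetical endpoint that returns the generated XML
// Assumption: the daemon refreshes the data every day at 18:00 server time.
$nextUpdate = strtotime('today 18:00');
if ($nextUpdate <= time()) {
    $nextUpdate = strtotime('tomorrow 18:00');
}

header('Content-Type: application/xml');
// Let clients (and proxies) cache the file until the next scheduled update.
header('Expires: ' . gmdate('D, d M Y H:i:s', $nextUpdate) . ' GMT');
header('Cache-Control: public, max-age=' . ($nextUpdate - time()));

readfile('/path/to/generated-data.xml');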
If the data is not very large... Include the data in the main document as an XML island. Either emit it during document generation (aspx, php, whatever) or fill in a reserved document node (via ajax calls) upon loading. This way, your user always has the latest data, you do not have to worry about caching, and life is much simpler.
If it is large, fill in that node as needed via ajax calls.
One obvious option is to add some AJAX that polls the server every x minutes. If the data needs refreshing, just show a non-blocking message somewhere obvious on the page notifying the user that fresh data is available, and provide a link to refresh the page. As an extra, you might want to provide a button the user can click to check for fresh data themselves (rather than waiting for x minutes to elapse).
If you use a HEAD request you can just check the Last-Modified header.
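A rough client-side sketch of that check, assuming the data lives at a hypothetical /data.xml and using fetch (XMLHttpRequest would work the same way); showRefreshNotice() is a hypothetical stand-in for your "fresh data available" message:

var lastModified = null;

function checkForFreshData() {
    // HEAD request: headers only, no body, so it is cheap for the server.
    fetch('/data.xml', { method: 'HEAD' }).then(function (response) {
        var modified = response.headers.get('Last-Modified');
        if (lastModified && modified !== lastModified) {
            showRefreshNotice(); // hypothetical: tell the user to refresh
        }
        lastModified = modified;
    });
}

setInterval(checkForFreshData, 5 * 60 * 1000); // poll every 5 minutes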
You said that the update time is FIXED? So when the user visits your page SHORTLY before the update is due, you can set a javascript variable in the page that indicates how many minutes remain until the next update, and run a client-side timer such as:
// minutesToUpdate is written into the page by the server when it renders the page
var startVisitTime = Date.now();
var updateTime = startVisitTime + minutesToUpdate * 60 * 1000;

var timer = setInterval(function() {
    if (Date.now() >= updateTime) {
        // make ajax request here to re-download the XML file
        clearInterval(timer);
    }
}, 60 * 1000); // check once a minute; you can tune this since it runs client-side
Do not use POLLING in this situation, because calling the server over and over is wasteful and puts unnecessary stress on it.
You can set some SESSION variable to help this run more smoothly and accurately.
Justin
What runtime said...
Or, since update times are fixed and infrequent, when you serve your XML also include a cache expiration time as an element or custom header. This way, if your user visits the site 1 minute before the XML update, you can code your client to expire its cache and update itself on the next request made after that 1-minute mark.
I am currently creating a stock market simulation and am working on the moment that the user logs into the simulation. I have a PHP script that will generate a certain price for a company four times and update it into my MySQL database while running. I currently have the following code:
PHP:
if (isset($_SESSION['userId']))
{
    $isPlaying = 0;
    while ($isPlaying <= 3)
    {
        $priceTemp = (rand(3300, 3700) / 100);
        $sql = "UPDATE pricestemp SET price = $priceTemp WHERE companyName = 'Bawden';";
        mysqli_query($conn, $sql);
        sleep(1);
        $isPlaying++;
    }
    echo '<h1>Welcome to the simulation</h1>';
}
I am aiming for these updates to happen in the background once the user has logged into the simulation. When I refresh my database every second, the updated prices are shown, which is one of my objectives. However, what I would like it to do is still load the HTML onto the page (to say "Welcome to the simulation") while updating the database every second with an updated price.
So far, when I log in, I have to wait 4 seconds before the HTML will load. In the future, I hope to have it consistently updating until a certain condition is met, but when I set up an infinite loop earlier, the HTML never loaded.
What do I have to do to allow the HTML to load once logged in and have the prices being generated and updated in the MySQL database in the background with no delay in either of these tasks happening?
You have a fundamental misunderstanding of how web-based requests work.
What you need to understand is that PHP is a server-side language. PHP generates any combination of HTML, CSS, JavaScript, JSON, or any other forms of data you want and sends it to your web browser when it's finished. While it's doing that, it can also manage data within a database or perform any other number of actions, but it will never send what the web browser can make use of until it finishes setting everything up. So if you're within an infinite loop, it will never finish and therefore nothing will be sent back to the web browser.
To remedy this, you need to use something called "asynchronous JavaScript", more commonly referred to as "ajax". Specifically, you first send some initial HTML to the web browser in one request and let the request end immediately. This allows the user to see something without waiting around for an indefinite period of time. Then, on the web browser end, you can use JavaScript to automatically send a second request to the server. During this second request to the server, you can perform your data processing and send back some data when you're finished to display to the user.
If you want to periodically update what you show the user, then you would repeat that second request to refresh what is shown on the user's webpage.
Any time you see some kind of "real-time" updating on a website, it's not coming from a single, persistently open connection to the web server--it's actually a series of repeated, broken up requests that periodically refresh what you see.
Broken down, standard web request workflows look something like this:
Web browser asks the web server for the webpage. Web browser waits for a reply.
Web server generates the webpage and sends the webpage to the web browser. Web server is done.
Web browser receives the webpage and shows it to the user. Web browser stops waiting for a reply.
Web browser runs any JavaScript it needs to run and requests data from the web server. Web browser waits for a reply.
Web server processes the request and sends the requested data back to the web browser. Web server is done.
Web browser receives the requested data and updates the HTML on the webpage so the user can see it. Web browser stops waiting for a reply.
As you can see, each series of requests is 1) initiated by the web browser, 2) processed by the web server, and 3) any replies from the web server are then handled by the web browser after the web server is finished up. So each and every request goes browser -> server -> browser. If we add steps 7., 8., and 9. to the above, we will see them repeat the exact same pattern.
If you want to avoid adding JavaScript into the mix, preferring to refresh the entire page every time, then keep your data processing short. Optimize your database calls, fix your infrastructure (make sure your server and database have a LAN connection, that your hardware is good enough, etc.), make your code more efficient... do whatever you need to do to keep the processing time to a minimum.
This is all incredibly simplified and not 100% accurate, but should hopefully help you with your specific problem. The short version of all of this is: you can't show your HTML and process your data at the same time the way you're doing things now. You need to fundamentally change your workflow.
You have to do this in 2 network calls. The first call fetches the HTML. Then you use JavaScript to fire another call that updates your data; once that API call returns, it updates the HTML.
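A minimal client-side sketch of that second call, assuming a hypothetical update_prices.php endpoint that performs one price update and returns the new price (the endpoint name and the #price element are assumptions, not part of the original code):

// Runs after the initial HTML has already been delivered and rendered.
setInterval(function () {
    fetch('update_prices.php')               // second request: do the data work here
        .then(function (response) { return response.text(); })
        .then(function (price) {
            // hypothetical element in the page that shows the latest price
            document.getElementById('price').textContent = price;
        });
}, 1000); // repeat every second to keep the page fresh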
The scheduling model to manage the frequency of a background operation based on the frequency of requests at the front end is a very difficult problem. It's also a problem you don't need to solve. The data doesn't need to be changed when nobody is looking at it. You just need to store when the data was last looked at and apply greater deltas to older data.
I have a chat/forum application that checks for new messages using periodic polling (every 15 seconds) via jquery ajax. I was wondering if I can get around the issue of users who try to be 'funny' by opening several instances of the same browser, with lots of tabs, all pointing to the same application. Each tab sends an ajax request, which could potentially overwhelm the server if several users start doing the same thing.
I do store sessions in a table, along with the last access time and IP address, which works fine as long as users don't use the same browser. I could store a unique identifier that is sent with the ajax POST or GET request, but that would cause problems if a regular (non-abusing) user refreshes his page, which would then create a new identifier.
This is not a real problem yet, but better to catch it before someone thinks of abusing the system like this :) Any idea how to do this?
One option could be to fetch data like so:
Your script is preparing to poll data. Before executing the request, write (with localStorage) a value saying that you're going to fetch data: localStorage.setItem("last-request-timestamp", new Date().getTime());
Poll for data. You get a result. Write that result to localStorage: localStorage.setItem("latest-messages", ajax_result);
Check whether a page is preparing to poll data by checking whether localStorage.getItem("last-request-timestamp") is longer than 15 seconds ago. If so, go to step 1. If not, wait 15 seconds and check again.
Regardless if the current page polled for data or not, check the latest-messages variable and update the page.
Other pages will of course share the localStorage data. They won't get data if another page is fetching at the moment. If page #1 is closed, one of the other pages will continue to fetch data.
I haven't used LocalStorage before, but browser support seems decent enough. You should also be able to just use it as a key-value array: localStorage["last-request-timestamp"].
You can only store strings in localStorage, but you can of course serialize it into JSON.
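A rough sketch of that coordination, assuming a hypothetical fetchMessages() function that performs the actual ajax call and a hypothetical renderMessages() function that updates the page:

var POLL_INTERVAL = 15000; // 15 seconds

function tick() {
    var last = parseInt(localStorage.getItem('last-request-timestamp') || '0', 10);

    if (Date.now() - last > POLL_INTERVAL) {
        // No other tab has polled recently, so this tab claims the slot...
        localStorage.setItem('last-request-timestamp', String(Date.now()));
        fetchMessages(function (ajaxResult) {          // hypothetical ajax wrapper
            localStorage.setItem('latest-messages', JSON.stringify(ajaxResult));
        });
    }

    // ...but every tab renders whatever the latest shared result is.
    var latest = localStorage.getItem('latest-messages');
    if (latest) {
        renderMessages(JSON.parse(latest));            // hypothetical page update
    }
}

setInterval(tick, POLL_INTERVAL);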
Not sure if it is doable in javascript, but could you check whether the tab is active, and only do the ajax on the active tab?
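For what it's worth, modern browsers expose this through the Page Visibility API; a minimal sketch, where checkForNewMessages() is a hypothetical stand-in for the existing jquery ajax poll:

setInterval(function () {
    // document.hidden is true for background tabs, so only the visible tab polls.
    if (!document.hidden) {
        checkForNewMessages(); // hypothetical: the existing polling call
    }
}, 15000);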
I have a similar problem. Now I force all users to log in (which means I have their e-mails). I also set up a connection limit per account and a request limit per connection; after 5 overflows I ask the user to enter a captcha, then I block the account for 30 minutes and send an e-mail with a password recovery link. It's not a clean solution, but for now it works for me.
UPD:
The simplest way to do this is to use cookie or session storage. I use cookies. The algorithm is simple:
User logs in on the web.
Check whether there is already an open session for this user; if there is, either delete the other session, trigger an exception, or switch to that session. You have to decide on the desired behavior yourself.
Create a session id for the user and store it in the database.
Increase a sessions counter field for that specific user to detect open sessions, so it no longer matters whether one browser is in use or many.
Update the last access mark (I use microtime(true) + $delay and a MySQL decimal(14,4) column).
Send the id to the client.
On each request:
Search for the session by the id passed in $_COOKIE.
Check the last access mark. If the current microtime(true) is still below the mark, the client is sending requests too frequently, so decide yourself what to do: push the mark further out (for example to microtime(true) + $delay + $penalty), drop the whole session, or trigger an error. The behavior depends on your application.
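A rough PHP sketch of that per-request check, assuming a hypothetical sessions table with id and last_access_mark columns and an open mysqli connection in $conn (the $delay and $penalty values are placeholders):

<?php
$delay   = 15.0;  // minimum seconds between polls
$penalty = 30.0;  // extra wait applied to clients that poll too fast

$sessionId = $_COOKIE['session_id'] ?? '';
$stmt = $conn->prepare('SELECT last_access_mark FROM sessions WHERE id = ?');
$stmt->bind_param('s', $sessionId);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc();

if ($row === null) {
    http_response_code(403);               // unknown session
    exit;
}

$now = microtime(true);
if ($now < (float) $row['last_access_mark']) {
    // Still before the allowed time: the client is polling too frequently.
    $mark = $now + $delay + $penalty;
    http_response_code(429);
} else {
    $mark = $now + $delay;                 // normal request: schedule the next allowed time
}

$stmt = $conn->prepare('UPDATE sessions SET last_access_mark = ? WHERE id = ?');
$stmt->bind_param('ds', $mark, $sessionId);
$stmt->execute();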
Why not throw something like Memcached/Redis at the problem? Cache a response with a 10-15s lifetime and avoid as much processing as possible.
Here is the scenario:
I have a page that is logging data to MYSQL. I have another page that reads that data and allows it to be viewed. When a new piece of data is logged I would like to have the first script check and see if the viewing page is open in the browser, and if so append the newest data to the end of the view. Also - could anyone point to some info giving an overview of how PHP and the browser interact? I think I have the concept of the DOM down for javascript...but as far as PHP it just appears that once the page is sent, that's it...
You're correct in that once the PHP is sent, that's it.
There is no way to send data to a PHP page once the page is loaded. There is another slightly nastier method, but the easiest way of doing this is going to be polling the page via Ajax.
So, have a script that, every 20 seconds, sends a message to another PHP script containing the timestamp of the last MySQL record you received, then have that script return all the data that has been added since that time.
I'm unsure how new you are to JavaScript, but the easiest way of doing that is probably using JQuery's $.ajax and encoding the new MySQL records as JSON.
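A minimal sketch of that polling loop with jQuery, assuming a hypothetical get_logs.php endpoint that returns the new records as JSON and a hypothetical appendLogRow() function that adds a record to the view:

var lastTimestamp = 0;

setInterval(function () {
    $.ajax({
        url: 'get_logs.php',                   // hypothetical endpoint
        data: { since: lastTimestamp },        // timestamp of the last record we have
        dataType: 'json',
        success: function (records) {
            records.forEach(function (record) {
                appendLogRow(record);          // hypothetical: add the row to the view
                lastTimestamp = record.timestamp;
            });
        }
    });
}, 20000); // every 20 seconds, as suggested above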
No this isn't possible as you describe. The viewing page will have to poll the server for changes, either by periodically reloading itself, or by javascript / AJAX.
You are right that once the page is sent by PHP it can have no further influence. In fact the PHP execution thread on the server is killed as soon as output is complete, so the thing that generated the page no longer even exists.
To expand on Dolondro's suggestion, rather than periodically polling the server for updates, you could use Server-Sent-Events (newly supported in modern browsers).
With these, you basically just send 1 ajax request to the server, and the connection is held open. Then, the server can send updates whenever it wants. When the browser receives an event, it can add the data to the screen. Even still, the connection is held open, and the server can send additional events/updates as they occur.
W3C page:
http://dev.w3.org/html5/eventsource/
Wikipedia:
http://en.wikipedia.org/wiki/Server-sent_events
More Info:
https://www.google.com/search?ix=hcb&sourceid=chrome&ie=UTF-8&q=server+sent+events
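A minimal client-side sketch of that, assuming a hypothetical events.php endpoint that emits text/event-stream data whenever a new record is logged, and a hypothetical appendToView() function on the page:

// One request; the browser keeps the connection open and the server
// pushes an event whenever it has new data.
var source = new EventSource('events.php');   // hypothetical SSE endpoint

source.onmessage = function (event) {
    // event.data is whatever the server sent for this update
    appendToView(event.data);                  // hypothetical: add it to the page
};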
I'm at the conceptualizing stage of developing something but not quite sure of a certain feature.
I have a DIV in a form, let's call it id='divComments'. This div contains all of the comments on a particular title. It retrieves all of the data from the database, which is easy to do.
Now when the page is refreshed, this div is populated with all of the comments. If another user adds a comment, all of the other users will see it when they log on (after that point in time) or when they refresh the page.
What if I want this div to feed from the database and refresh automatically when something is inserted into the relation/table in the database? Say I have my page open (not refreshing it, just staring at it) showing x, and someone else adds a tuple, let's call it y, to that table; my div should now show x and y. In other words, it updates in real time from the database without refreshing.
Anyone has any idea how to go about doing something like this?
HTTP is stateless. Once an asset has finished downloading through a HTTP connection, the connection is destroyed and the server no longer has any knowledge of what the client is doing next.
There are ways of fudging stateful behaviour, using cookies and sessions and the like, but these still require a new connection to the server to fetch fresh data.
There are technologies in development that can allow a web server to "push" new data to the client the instant it becomes available (websockets, server-sent events, etc), but these are still at the draft stage for the most part and browser support is spotty at best.
The only real choice you have is polling the server with a refresh meta tag (EXTREMELY inefficient!), polling the server with AJAX (Better, in that you can design it to only fetch the data that's changed, but still not ideal), or using a long-lasting AJAX connection that remains idle until new data becomes available, at which point the data is downloaded, the connection is closed, and a new connection is opened to sit idle for more data (will allow immediate response, but difficult to set up properly).
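A rough sketch of that last, long-polling option, assuming a hypothetical comments_poll.php endpoint that holds the request open until there is something new to report, and a hypothetical addCommentToDiv() function that appends to #divComments:

function longPoll() {
    fetch('comments_poll.php')                  // server holds this open until new data exists
        .then(function (response) { return response.json(); })
        .then(function (newComments) {
            newComments.forEach(function (comment) {
                addCommentToDiv(comment);       // hypothetical: append to #divComments
            });
        })
        .catch(function () { /* ignore timeouts/errors and just reconnect */ })
        .then(function () {
            longPoll();                         // immediately open the next idle connection
        });
}

longPoll();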
I'm making some statistics code for my website (I'm a PHP developer). I want to calculate how many seconds/minutes a web user stays on any page (like Google Analytics does), but I have no idea how to do this. Thanks for any help or scripts!
How are you gathering the data? The common options would be instrumenting the page using javascript, looking at webserver log files, hooking into the server-side request handler, or sniffing the TCP/IP traffic.
Doing it "like Google Analytics" implies the former, in which case the way to do it would be to grab a timestamp as soon as possible when the page loads (rather than waiting for the page ready / onload event) and compare that value with the previous timestamp (so you'd probably store that in a cookie). Then you need some way to send this back server-side, and a way of recording and reporting on the data.
Note that trying to fire an ajax call as the user leaves the page, e.g. via onunload, will not work reliably (the page launching the request is at the end of its lifecycle). The important thing here is the ASYNCHRONOUS part. And making a synchronous call will just have the effect of slowing down the website.
You might want to have a look at Yahoo Boomerang - although it doesn't support dwell time measurements out of the box, it's easy to extend. For a backend, you could do a lot worse than Graphite
You can handle the unload event in javascript when the user leaves the page and send an Ajax request to your server from it. Since this may not work in all browsers, especially if the network latency is high, also have a ping script (also with Ajax) that calls your statistics system once in a while for as long as the user stays on the page (for example, every 10-60 seconds depending on the resolution you want).
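A small sketch of the ping half of that, assuming a hypothetical track.php endpoint that records (or extends) the visit on the server:

var visitStart = Date.now();

// Heartbeat: while the tab stays open, keep telling the server we're still here.
setInterval(function () {
    fetch('track.php?seconds=' + Math.round((Date.now() - visitStart) / 1000));
}, 30000); // every 30 seconds; pick whatever resolution you need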
If you want to do it server-side, i.e. in PHP, then you would probably need a table allocated for this, say "analytic".
First you need to add a script to every page that inserts these data into the analytic table: $_SERVER['HTTP_REFERER'], the current timestamp, the remote address, and the current page URL.
Now the calculation part.
Basically, when a user first lands on your page, $_SERVER['HTTP_REFERER'] won't be from your domain. Keep that timestamp as the start time.
Now check the next record's timestamp. If its HTTP_REFERER is the same as the previous record's page URL, then the difference between the two timestamps tells you how long the user stayed on that page.
More or less, what I am trying to say is: find the time between each request from the user.
Disadvantage of this method: when the user lands on a page and closes it, it's impossible to find the time spent on your site.
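A rough sketch of the per-page logging described above, assuming a hypothetical analytic table with referer, page_url, remote_addr, and created_at columns and an open mysqli connection in $conn:

<?php
// Include this at the top of every tracked page.
$referer = $_SERVER['HTTP_REFERER'] ?? '';
$pageUrl = $_SERVER['REQUEST_URI'];
$remote  = $_SERVER['REMOTE_ADDR'];

$stmt = $conn->prepare(
    'INSERT INTO analytic (referer, page_url, remote_addr, created_at) VALUES (?, ?, ?, NOW())'
);
$stmt->bind_param('sss', $referer, $pageUrl, $remote);
$stmt->execute();

// Later, time on a page = difference between this row's created_at and the
// next row from the same remote_addr whose referer equals this row's page_url.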
A quick and easy method I came up with is pretty useful.
On every page of a site where I want to track time on page, I include a tracker script.
I grab as much info as I can, and make a database entry, including the referrer, the requested/loaded page, user-agent, ip, timestamp, etc.
These timestamps, in conjunction with the user's ip, can be used to determine the time the user was on the previous page (including load time of current page).
The only drawback is that I can't determine time on the last page they visit (which isn't always a bad thing, I can reduce tracking idle time).
Bounces are identified by single entries by a given ip within a specified time period (an hour would probably be sufficient).
At page load create a Date object, then when the page unloads create another and subtract the two. After that you can do an AJAX request to your tracking server, sending the elapsed time.
var startTime = new Date();
var endTime;

window.onunload = function()
{
    endTime = new Date();
    // getTime() returns milliseconds, so divide by 1000 to get seconds
    var elapsedSeconds = (endTime.getTime() - startTime.getTime()) / 1000;
    // Do the ajax request, sending elapsedSeconds
};