I'm wondering if it is possible to push an XML file update from the server to all client browsers?
Basically my proposed situation is that my server holds an XML file. When a user loads a page that uses said XML file, they can request to change it; if the change is allowed (determined client-side by the page), then the XML file is updated on the server side. I'm fine up to this point (well, I have plenty of reading to get me to this point). Then I want all connected pages to refresh every element that relies on the XML file, without refreshing the whole page.
In other words, all pages using the file should update their copy of the data if the copy on the server is newer than theirs. Is this possible via a server push, or do I have to constantly poll the server to compare files? (That seems sloppy to me...) And if it is possible, what's the best way to go about it?
Thanks for any points in the right direction.
Because a webpage is stateless you cannot push data to it. You need to poll the server for updates. Think about a small AJAX script that polls, say, every 5 minutes; when the content has been updated, that script calls something to update the page. You will need a fair amount of AJAX to do this; use jQuery or something similar to accomplish it.
You may try an AJAX call to some PHP script like:
set_time_limit(3600); // one hour, or as long as your session timeout is

// Keep repeating this to prevent PHP from stopping the script
while (true)
{
    sleep(5); // 5 seconds between polling the server

    // ... check the XML and echo any updates here ...

    ob_flush();
    flush();
}
The connection will stay open, and every 5 seconds the XML will be checked and the client updated.
If you don't want to spend a lot of time and resources pulling the data when it hasn't changed, you may use APC, memcache or any other server-side stored variable that tells you whether the XML has changed:
if (apc_fetch('xml_updated') == 1)
{
    // do the XML pull
}
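For completeness, a rough sketch of the writer side (the file path and request handling here are placeholders, not from the original answer): whichever script accepts the change would set that flag when it saves the new XML.

// update_xml.php - illustrative sketch only
$xmlFile = '/path/to/data.xml';                // placeholder path
$newXml  = $_POST['xml'];                      // however the client submits the change

// ... validate that the change is allowed here ...

file_put_contents($xmlFile, $newXml, LOCK_EX); // save the new version
apc_store('xml_updated', 1);                   // tell the polling script something changed
apc_store('xml_updated_at', time());           // optional: record when it changed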
You may test what pulling the data every second does in terms of resources; in my opinion it's best to use a longer delay.
Hope it helps!
I've got the following problem at hand:
I'm having users on two separate pages, but saving page input to the same text file. While one user is editing, the other can't. I'm keeping track of this with sessions, writing the changes and whose turn it is to edit into a file.
Works fine so far; the output in the end is quite similar to a chat. However, right now I'm having users manually refresh their page and reload the file. What I'd like to do is have the page execute a redirect when the file timestamp changes (to indicate that the last user has saved their edits and it's another user's turn). I've looked into JavaScript short polling a little, but then found the PHP filemtime function and it looks much easier to use. Well - here's what I got:
while (true) {
    clearstatcache(); // filemtime() results are cached within a request
    $oldtimestamp = filemtime("msks/" . $session['user']['kampfnr'] . ".txt");

    $waittimer = 2;
    $waittimer++;
    sleep($waittimer);

    clearstatcache();
    $newtimestamp = filemtime("msks/" . $session['user']['kampfnr'] . ".txt");

    if ($newtimestamp > $oldtimestamp) {
        addnav("", "kampf_ms.php?op=akt");
        redirect("kampf_ms.php?op=akt");
    }
}
In theory, while the user sees the output "it's ... turn to edit the file", this should loop in the background, checking whether the file has already been updated, and if yes, redirect the user.
In practice this heavily affects server performance (I'm on shared hosting) until it breaks with a memory-exceeded error message.
Is something wrong with the code? Or is it generally a bad idea to use a while loop in this case?
Thanks in advance!
PHP should only be used to generate web content (the client makes a request to the server => the server runs the required script and returns the response to the client).
Once the page is built and sent to the client, the connection is closed; whatever happens on the server afterwards, the client isn't informed...
So with an infinite loop, not only can the client end up waiting an arbitrarily long time for the response, but the server can also be heavily impacted by the load... It really is a bad idea :)
PHP can't be used for bidirectional communication: it is only invoked to build the pages the client asks for, so it can't do anything "in the background" (not directly; you can call an external script, but not to notify a client...).
More generally, PHP and "regular" HTTP are not a good fit for bidirectional communication because of the client/server architecture (the server only answers client requests, it is passive).
I can suggest using the WebSocket protocol to build a chat application:
http://socket.io/
https://en.wikipedia.org/wiki/WebSocket
But for that, you need to use an "active" server solution, such as node.js or Ruby (depending on your server's capabilities...).
The other way, if you want to stay in PHP, is for the client to make an AJAX request every 10 seconds or so to a PHP script which checks the file and sends back a message if the file has been updated; but that is really wasteful in terms of performance, so forget it immediately.
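For completeness, if you did go the polling route anyway, the checking script could be kept very small. A hedged sketch (the file path and the 'since' parameter are purely illustrative):

// check.php - illustrative sketch of the file-check script
$file        = "msks/example.txt";                                // placeholder: the file being edited
$clientStamp = isset($_GET['since']) ? (int) $_GET['since'] : 0;  // timestamp the client last saw

clearstatcache();                                                 // filemtime() results are cached
$serverStamp = file_exists($file) ? filemtime($file) : 0;

header('Content-Type: application/json');
echo json_encode(array(
    'changed'   => $serverStamp > $clientStamp,
    'timestamp' => $serverStamp,
));

The client would call this every 10 seconds and trigger the redirect itself when 'changed' comes back true.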
I have recently updated my site with the use of AJAX calls to improve the end-user experience. Some calls are set to poll the DB repeatedly, others are called to alter the database upon user interaction, i.e. completing a task or cancelling a cart item.
Now I am getting server errors resulting from reaching my server's open file limit.
Here is an example of the sort of code I am using: (credit goes to every tutorial found on google...)
function checkForNewData() {
    $.get('checkForNewData.php', false, function(data){
        if (data.length) {
            $('#newData').html(data);
        }
    });
}

$(function(){
    checkForNewData();
    setInterval('checkForNewData()',10000);
});
I realize that by using "setInterval('checkForNewData()',10000);" the file is requested every 10,000 ms (every 10 seconds) for every user that has this page open.
Here are my questions regarding my ignorance of ajax:
Does a unix server record each ajax call (of this manner) as a page load or open file?
If the page loads behind the scenes, do I have to close it?
Is there a better way to keep a site up-to-date than the repetitiously polling of my db.
Thanks for your time and assistance.
Does a unix server record each ajax call (of this manner) as a page load or open file?
Every time a PHP file is run, it is logged. Executed PHP of any kind is recorded. That's why you can see errors in your error log if anything goes wrong during AJAX calls.
If the page loads behind the scenes, do I have to close it?
Which page? "checkForNewData.php"? No, you don't. The AJAX call waits for the script to execute and finish, and then gets the response.
Is there a better way to keep a site up-to-date than the repetitiously polling of my db?
Yes, there is. I would:
On the server
Use cache (maybe APC cache)
Run a DB check once every minute/ two minutes/ five minutes only
Store/ Update the results in an XML file
On the client
Get the timestamp of the most recent update on client-side page load
Get AJAX to check the timestamp (stored in the XML) of the last update
If timestamp of the AJAX response differs from the first-load timestamp, get new HTML from the XML file
Use AJAX headers or AJAX POST data to request a specific function (like asking for a timestamp update vs. getting HTML data) - a rough sketch of such an endpoint follows below.
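This is one way that endpoint could look (the file names, cache key and 60-second window are assumptions for illustration, not part of the answer):

// status.php - illustrative sketch: cheap timestamp check, heavier DB work at most once a minute
$cacheFile = 'cache/data.xml';                    // placeholder path for the stored results
$lastCheck = apc_fetch('last_db_check');

if ($lastCheck === false || time() - $lastCheck > 60) {
    // Run the real DB query at most once per minute and rewrite the cache file.
    // $xml = ... build the XML/HTML from the query results ...
    // file_put_contents($cacheFile, $xml, LOCK_EX);
    apc_store('last_db_check', time());
}

header('Content-Type: application/json');
if (isset($_POST['want']) && $_POST['want'] === 'html') {
    // The client decided its copy is stale: send the cached markup back.
    echo json_encode(file_get_contents($cacheFile), JSON_HEX_QUOT|JSON_HEX_TAG|JSON_HEX_AMP|JSON_HEX_APOS);
} else {
    // Default: just report when the cached data last changed.
    clearstatcache();
    echo json_encode(array('updated' => filemtime($cacheFile)));
}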
Remember to use the correct flags for json_encode.
echo json_encode($html, JSON_HEX_QUOT | JSON_HEX_TAG | JSON_HEX_AMP | JSON_HEX_APOS);
Also remember to zip the data.
ob_start('ob_gzhandler');
It is a best practice to have as few DB calls as possible.
I am making a Warehouse management system.
The orders come in a CSV in the morning that my script then executes.
It places a PHP-generated barcode at the top of each order. The sample CSV I am using has around 100 unique orders in it, so when I load the page that will then print the orders, the server is getting 100+ requests and (I'm guessing) some of the images time out.
When I view source and open the link to one of the ones that don't work, the image loads, leading me to think I need to somehow disable the timeout in the browser.
My only other idea is to load the barcodes through javascript.
Any suggestions?
I think what enygma may be getting at is the limited processing time php scripts have. Sometimes they get cut off after 30 seconds. Generating all of those images at one time might run over, causing your script to be killed on the server and stop sending data. Your idea of loading them in javascript is probably your best bet, as long as you only do a few at a time or do them serially.
If you start a session in php, the session is locked and cannot be accessed by another php script until released.
Since you are generating the images with PHP, that's quite likely the cause of what you see.
There are other questions which go into a bit more detail about how PHP and sessions work, but most likely that's the direct cause of some of your images not being received: the requests sit in a single, serial queue and are processed in turn, because each script reads the session and doesn't release it until it has finished. The requests at the end of the queue hit a time limit one way or another and return nothing.
Therefore, ensure that you call:
session_write_close();
as soon as you can in all scripts that need access to the session, to prevent them from blocking all other PHP requests - or better still, don't use the session at all (e.g. if you're using the session for authorization, just include a hash in the URL and compare against that for image requests).
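In the barcode scenario, each image script would release the session as early as possible, e.g. right after any authorization check. A hedged sketch (the auth check and the GD stand-in are placeholders for whatever your scripts actually do):

// barcode.php - illustrative sketch only
session_start();

if (empty($_SESSION['logged_in'])) {      // placeholder authorization check
    header('HTTP/1.1 403 Forbidden');
    exit;
}

session_write_close();                    // release the session lock so other image requests aren't queued

header('Content-Type: image/png');
$code = isset($_GET['code']) ? $_GET['code'] : '';
$im   = imagecreatetruecolor(200, 80);    // stand-in for your actual barcode library
imagestring($im, 5, 10, 30, $code, imagecolorallocate($im, 255, 255, 255));
imagepng($im);
imagedestroy($im);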
OK, I didn't really know how to formulate this question, and especially not the title. But I'll give it a try and hope I'm being specific enough while trying to keep it relevant to others.
If you want to run a PHP script in the background (via AJAX) every X seconds that returns data from a database, how do you do this the best way without using too much of the server's resources?
My solution looks like this:
A user visits a webpage; every X seconds that page runs a JavaScript function. The JavaScript calls a PHP script/file that queries the database, retrieves the data and returns it to the JavaScript. The JavaScript then prints the data to the page. My fear is that this way of solving it will put a lot of pressure on the server if there are a lot (10,000) of simultaneous visitors on the page. Is there another way to do this?
That sounds like the best way, given the spec/requirement you set out.
Another way is to have an intermediary step. If you are going to have a huge amount of traffic (otherwise this does not introduce any benefit, and may on the contrary overcomplicate/slow the process), add another table that records the last time a dataset was pulled, plus a hard file (say, XML) which, if the 'last time' is deemed too long ago, is regenerated from a new query; this XML then feeds the result returned to the user.
So:
1. JavaScript calls the PHP script (AJAX)
2. PHP pings the DB table which contains the last time the data was fully output
3. If that time is too long ago, the 'main' query is rerun and the XML file is regenerated from the output; ELSE skip to 4
4. Fetch the XML file and output it as appropriate for the returned AJAX
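A loose sketch of steps 2-4, simplified by using the cache file's own mtime as the 'last time' record instead of a DB table (connection details, query and the 60-second threshold are all placeholders):

// data.php - illustrative sketch of the intermediary cache
$cacheFile = 'cache/result.xml';      // placeholder
$maxAge    = 60;                      // regenerate at most once a minute (assumption)

clearstatcache();
if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
    // The 'main' query only runs when the cache is stale.
    $db  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // placeholder credentials
    $xml = new SimpleXMLElement('<rows/>');
    foreach ($db->query('SELECT id, name FROM items') as $row) {         // placeholder query
        $node = $xml->addChild('row');
        $node->addChild('id', $row['id']);
        $node->addChild('name', htmlspecialchars($row['name']));
    }
    file_put_contents($cacheFile, $xml->asXML(), LOCK_EX);
}

// Every request, stale or fresh, is answered from the file.
header('Content-Type: text/xml');
readfile($cacheFile);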
You can do it the other way around, contacting the client only when you need to and wasting fewer resources.
Comet is the way to go for this option:
Comet is a programming technique that enables web servers to send data to the client without having any need for the client to request it. This technique will produce more responsive applications than classic AJAX. In classic AJAX applications, the web browser (client) cannot be notified in real time that the server data model has changed. The user must create a request (for example by clicking on a link), or a periodic AJAX request must happen, in order to get new data from the server.
I am working on a tool in PHP that processes a lot of data and takes a while to finish. I would like to keep the user updated with what is going on and the current task being processed.
What is in your opinion the best way to do it? I've got some ideas but can't decide for the most effective one:
The old way: execute a small part of the script and display a page to the user with a Meta Redirect or a JavaScript timer to send a request to continue the script (like /script.php?step=2).
Sending AJAX requests constantly to read a server file that PHP keeps updating through fwrite().
Same as above but PHP updates a field in the database instead of saving a file.
Does any of those sound good? Any ideas?
Thanks!
Rather than writing to a static file you fetch with AJAX, or to an extra database field, why not have another PHP script that simply returns a completion percentage for the specified task? Your page can then update the progress via a very lightweight AJAX request to said PHP script.
As for implementing this "progress" script, I could offer more advice if I had more insight as to what you mean by "processes a lot of data". If you are writing to a file, your "progress" script could simply check the file size and return the percentage complete. For more complex tasks, you might assign benchmarks to particular processes and return an estimated percentage complete based on which process has completed last or is currently running.
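A minimal sketch of such a progress script, assuming the worker writes its percentage to a small status file (the file naming scheme and JSON format are assumptions):

// progress.php - illustrative sketch: a very cheap endpoint for the page to poll
$taskId     = isset($_GET['task']) ? (int) $_GET['task'] : 0;
$statusFile = 'status/task_' . $taskId . '.json';   // placeholder naming scheme

header('Content-Type: application/json');
if (is_readable($statusFile)) {
    readfile($statusFile);               // e.g. {"percent":42,"message":"importing rows"}
} else {
    echo json_encode(array('percent' => 0, 'message' => 'not started'));
}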
UPDATE
This is one suggested method to "check the progress" of an active script which is simply waiting for a response from a request. I have a data mining application that I use a similar method for.
In the script that makes the request you're waiting for (the script you want to check the progress of), you can store a progress variable for the process, either in a file or a database. (I use a database, as I have hundreds of processes running at any time which all need to track their progress, and I have another script that allows me to monitor them.) When the process begins, set this to 1.

You can easily select an arbitrary number of 'checkpoints' the script will pass and calculate the percentage given the current checkpoint. For a large request, however, you might be more interested in knowing the approximate percentage the request has completed. One possible solution would be to know the size of the returned content and set your status variable according to the percentage received at any moment, i.e. if you receive the request data in a loop, each iteration could update the status. Or, if you are downloading to a flat file, you could poll the size of the file.

This could be done less accurately with time (rather than file size) if you know the approximate time the request should take to complete and simply compare against the script's current execution time. Obviously neither of these is a perfect solution, but I hope they'll give you some insight into your options.
I suggest using the AJAX method, but not using a file or a database. You could probably use session values or something like that; that way you don't have to open a connection or a file to do anything.
In the past, I've just written messages out to the page and used flush() to flush the output buffer. Very simple, but it may not work correctly on every web server or with every web browser (as they may do their own internal buffering).
Personally, I like your second option the best. Should be reliable and fairly simple to implement.
I like option 2 - using AJAX to read a status file that PHP writes to periodically. This opens up a lot of different presentation options. If you write a JSON object to the file, you can easily parse it and display things like a progress bar, status messages, etc...
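On the writing side, the long-running script could update that JSON file at each checkpoint. A sketch to match the progress.php idea above (the checkpoints and file name are made up for illustration):

// inside the long-running task - illustrative checkpoints only
function set_progress($file, $percent, $message)
{
    file_put_contents($file, json_encode(array(
        'percent' => $percent,
        'message' => $message,
    )), LOCK_EX);
}

$statusFile = 'status/task_1.json';                 // placeholder

set_progress($statusFile, 0, 'starting');
// ... load the input ...
set_progress($statusFile, 25, 'input loaded');
// ... process the data ...
set_progress($statusFile, 75, 'data processed');
// ... write the results ...
set_progress($statusFile, 100, 'done');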
A 'dirty' but quick-and-easy approach is to just echo out the status as the script runs along. As long as you don't have output buffering on, the browser will render the HTML as it receives it from the server (I know WordPress uses this technique for its auto-upgrade).
But yes, a 'better' approach would be AJAX, though I wouldn't say there's anything wrong with 'breaking it up' using redirects.
Why not incorporate 1 & 2, where AJAX sends a request to script.php?step=1, checks response, writes to the browser, then goes back for more at script.php?step=2 and so on?
If you can do away with IE, then use Server-Sent Events. It's the ideal solution.
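A bare-bones PHP endpoint for Server-Sent Events might look like the sketch below (the 2-second interval and payload are illustrative); on the browser side you would consume it with the standard EventSource API:

// events.php - illustrative SSE sketch
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);

while (true) {
    // ... check whatever state you care about (file mtime, APC flag, DB row) ...
    $payload = json_encode(array('time' => time()));   // placeholder payload

    echo "data: " . $payload . "\n\n";
    @ob_flush();
    flush();

    if (connection_aborted()) {
        break;                                          // stop when the browser disconnects
    }
    sleep(2);
}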