So I have a PHP file that runs sets of SQL queries for a list of IDs. This file may take some time to run.
While that is running, I want to have a progress bar.
I have it so that the running file updates a session variable indicating the percentage of completion after each set of queries. Then in my front end, I'm doing an AJAX call to another file that reads the session variable value and returns it. I then update the width of the progress bar fill with the returned percentage.
The problem:
The progress bar isn't updated until all the queries are completely done. So I see my progress bar sit at 0% for some seconds, then it suddenly jumps to 100%. It seems that the session is locked until the script completes. I tried using session_write_close() but cannot get it to work.
Thanks.
I had a similar problem, and the session solution did not work. My solution was to write the current status into a temporary .txt file, and then read that file with an AJAX call.
The problem with doing it this way is that session variables are not stored at runtime; by default they are only saved once the script has completed. One solution would be to store the progress temporarily in a database or a file and access the data periodically with AJAX.
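For what it's worth, a minimal sketch of the file-based approach (the loop, file name, and helper function are assumptions for illustration, not the poster's actual code):

<?php
// worker script (sketch): write the percentage to a temp file after each chunk of work
$total = count($ids);
foreach ($ids as $i => $id) {
    runQueriesFor($id); // hypothetical stand-in for the real per-id work
    $percent = (int) round((($i + 1) / $total) * 100);
    file_put_contents(sys_get_temp_dir() . '/progress.txt', $percent, LOCK_EX);
}

<?php
// progress.php (sketch): the file the AJAX call polls; it just echoes the stored percentage
echo (int) @file_get_contents(sys_get_temp_dir() . '/progress.txt');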
Another solution would be to use server-side JavaScript such as Google's V8 engine (Node.js) and fire events based on your progress. Your page would then update based on the JavaScript events being fired.
Related
I have a PHP script that scrapes data from a bunch of websites and stores it in a DB. Instead of having the PHP run at every connection, I want to run it on a 10-minute interval and store the data it gets into the DB, so I can retrieve the info instantly rather than having the PHP run every time, which takes up time. I don't know AJAX well and would like to keep this as PHP/MySQL as possible. Any help is appreciated.
TL;DR: I want PHP to save data to a DB every 10 minutes, then serve that stored data until it gets overwritten, instead of loading new data on every refresh.
Basic options are as follows. No need (or use) for AJAX here.
Make a cron job / scheduled task (Linux / Windows) that calls your script at intervals.
Add a timed javascript browser refresh to your PHP script. See here for how.
Use a browser plugin like "Easy Auto Refresh" (Chrome) or "ReloadEvery" (Firefox).
The first one is the cleanest way, and it spares you from having to keep a browser tab open.
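A rough sketch of the cron-job option, assuming a hypothetical scrape.php and a single-row cache table (all names here are invented for illustration):

// crontab entry (every 10 minutes):  */10 * * * * php /path/to/scrape.php
<?php
// scrape.php (sketch): does the slow scraping and overwrites the cached row
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$data = scrapeWebsites(); // stand-in for your existing scraping code
$stmt = $pdo->prepare('REPLACE INTO scraped_cache (id, payload, updated_at) VALUES (1, ?, NOW())');
$stmt->execute(array(json_encode($data)));

<?php
// page.php (sketch): what visitors hit; it only reads the cached row, never scrapes
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$row = $pdo->query('SELECT payload FROM scraped_cache WHERE id = 1')->fetch();
echo $row ? htmlspecialchars($row['payload']) : 'No data yet.';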
I have recently updated my site with the use of AJAX calls to improve the end-user experience. Some calls are set to poll the DB repeatedly; others are called to alter the database upon user interaction, e.g. completing a task or cancelling a cart item.
Now I am getting server errors resulting from reaching my servers open file limit.
Here is an example of the sort of code I am using (credit goes to every tutorial found on Google):
function checkForNewData() {
    $.get('checkForNewData.php', false, function(data) {
        if (data.length) {
            $('#newData').html(data);
        }
    });
}

$(function() {
    checkForNewData();
    setInterval('checkForNewData()',10000);
});
I realize that by using "setInterval('checkForNewData()',10000);", that file is requested every 10000 ms for every user who has this page open.
Here are my questions regarding my ignorance of ajax:
Does a Unix server record each AJAX call (of this manner) as a page load or open file?
If the page loads behind the scenes, do I have to close it?
Is there a better way to keep a site up to date than repetitiously polling my DB?
Thanks for your time and assistance.
Does a Unix server record each AJAX call (of this manner) as a page load or open file?
Every time a PHP file is run, it is logged; PHP executed in any manner is recorded. That's why you can see errors in your error log if anything goes wrong during AJAX calls.
If the page loads behind the scenes, do I have to close it?
Which page? "checkForNewData.php"? No, you don't. The AJAX call waits for the script to execute and finish, and then gets the response.
Is there a better way to keep a site up to date than repetitiously polling my DB?
Yes, there is. I would:
On the server
Use cache (maybe APC cache)
Run a DB check only once every minute / two minutes / five minutes
Store/update the results in an XML file
On the client
Get the timestamp of the most recent update on client-side page load
Get AJAX to check the timestamp (stored in the XML) of the last update
If the timestamp in the AJAX response differs from the first-load timestamp, get the new HTML from the XML file
Use AJAX headers or AJAX POST data to request a specific function (like asking for a timestamp update vs. getting the HTML data).
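A rough sketch of what that endpoint could look like, assuming a hypothetical cache.xml (with <updated> and <html> elements) written by the periodic DB job, and a POST field named mode to select the function:

<?php
// checkForNewData.php (sketch only; file and field names are assumptions)
ob_start('ob_gzhandler');                    // gzip the response
$xml = simplexml_load_file('cache.xml');     // written by the periodic DB job

if (isset($_POST['mode']) && $_POST['mode'] === 'timestamp') {
    // Lightweight call: the client compares this with the timestamp it loaded with.
    $out = array('timestamp' => (string) $xml->updated);
} else {
    // Full call: only made when the client has decided its copy is stale.
    $out = array('timestamp' => (string) $xml->updated, 'html' => (string) $xml->html);
}
echo json_encode($out, JSON_HEX_QUOT | JSON_HEX_TAG | JSON_HEX_AMP | JSON_HEX_APOS);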
Remember to use the correct flags for json_encode.
print_r(json_encode($html, JSON_HEX_QUOT | JSON_HEX_TAG | JSON_HEX_AMP | JSON_HEX_APOS));
Also remember to gzip the data.
ob_start('ob_gzhandler');
It is best practice to make as few DB calls as possible.
Part of the PHP web app I'm developing needs to do the following:
Make an AJAX request to a PHP script, which could potentially take from one second to one hour, and display the output on the page when finished.
Periodically update a loading bar on the web page, defined by a status variable in the long running PHP script.
Allow the long running PHP script to detect if the AJAX request is cancelled, so it can shut down properly and in a timely fashion.
My current solution:
client.php: Creates an AJAX request to request.php, followed by one request per second to status.php until the initial request is complete. Generates and passes along a unique identifier (uid) in case multiple instances of the app are running.
request.php: Each time progress is made, saves the current progress percentage to $_SESSION["progressBar"][uid]. (It must run session_start() and session_write_close() each time.) When finished, returns the data that client.php needs.
status.php: Runs session_start(), returns $_SESSION["progressBar"][uid], and runs session_write_close().
Where it falls short:
My solution fulfills my first two requirements. For the third, I would like to use connection_aborted() in request.php to know if the request is cancelled. BUT, the docs say:
PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client. Simply using an echo statement does not guarantee that information is sent, see flush().
I could simply give meaningless output, but PHP must send a cookie every time I call session_start(). I want to use the same session, BUT the docs say:
When using session cookies, specifying an id for session_id() will always send a new cookie when session_start() is called, regardless of if the current session id is identical to the one being set.
My ideas for solutions, none of which I'm happy with:
A status database, or writing to temp files, or a task management system. This just seems more complicated than what I need!
A custom session handler. This is basically the same as the above solution.
Stream both progress data and result data in one request. This solves everything, but I would essentially be re-implementing AJAX. That can't be right.
Please tell me I'm missing something! Why doesn't PHP know immediately when a connection terminates? Why must PHP resend the cookie, even when it is exactly the same? An answer to any of these questions will be a big help!
My sincere thanks.
Why not set a second session variable from status.php, consisting of the unique request identifier and an access timestamp?
If the client is closed it stops getting updates from status.php and the session variable stops being updated, which triggers a clean close in request.php if the variable isn't updated in a certain amount of time.
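A minimal sketch of that heartbeat idea, reusing the question's progressBar key plus a made-up lastPoll key (how $uid, $percentDone and cleanUp() are obtained is assumed):

<?php
// status.php (sketch): besides returning progress, record when the client last polled
session_start();
$_SESSION['lastPoll'][$uid] = time();
$progress = isset($_SESSION['progressBar'][$uid]) ? $_SESSION['progressBar'][$uid] : 0;
session_write_close();
echo $progress;

// request.php (sketch, inside the work loop): treat a stale heartbeat as a cancelled request
session_start();
$_SESSION['progressBar'][$uid] = $percentDone;
$lastPoll = isset($_SESSION['lastPoll'][$uid]) ? $_SESSION['lastPoll'][$uid] : time();
session_write_close();

if (time() - $lastPoll > 5) {   // no poll for ~5 seconds: assume the client is gone
    cleanUp();                  // hypothetical shutdown routine
    exit;
}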
I am working in a tool in PHP that processes a lot of data and takes a while to finish. I would like to keep the user updated with what is going on and the current task processed.
What is in your opinion the best way to do it? I've got some ideas but can't decide for the most effective one:
The old way: execute a small part of the script and display a page to the user with a Meta Redirect or a JavaScript timer to send a request to continue the script (like /script.php?step=2).
Sending AJAX requests constantly to read a server file that PHP keeps updating through fwrite().
Same as above but PHP updates a field in the database instead of saving a file.
Does any of those sound good? Any ideas?
Thanks!
Rather than writing to a static file you fetch with AJAX or to an extra database field, why not have another PHP script that simply returns a completion percentage for the specified task. Your page can then update the progress via a very lightweight AJAX request to said PHP script.
As for implementing this "progress" script, I could offer more advice if I had more insight as to what you mean by "processes a lot of data". If you are writing to a file, your "progress" script could simply check the file size and return the percentage complete. For more complex tasks, you might assign benchmarks to particular processes and return an estimated percentage complete based on which process has completed last or is currently running.
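For the file-size case, the "progress" script can be as small as this (the output path and the expected final size are assumptions):

<?php
// progress.php (sketch): report completion based on how much of the output file exists so far
$expectedBytes = 50 * 1024 * 1024;                  // estimated final size of the output
clearstatcache();                                   // don't use a cached file size
$currentBytes = (int) @filesize('/tmp/output.dat'); // file the long task is writing to
echo min(100, (int) round(($currentBytes / $expectedBytes) * 100));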
UPDATE
This is one suggested method to "check the progress" of an active script which is simply waiting for a response from a request. I have a data mining application that I use a similar method for.
In your script that makes the request you're waiting for (the script you want to check the progress of), you can store a progress variable for the process, either in a file or a database. (I use a database, as I have hundreds of processes running at any time which all need to track their progress, and I have another script that allows me to monitor them.) When the process begins, set this to 1.

You can easily select an arbitrary number of 'checkpoints' the script will pass and calculate the percentage given the current checkpoint. For a large request, however, you might be more interested in knowing the approximate percentage of the request that has completed. One possible solution would be to know the size of the returned content and set your status variable according to the percentage received at any moment; i.e. if you receive the request data in a loop, you could update the status on each iteration. Or, if you are downloading to a flat file, you could poll the size of the file.

This could be done less accurately with time (rather than file size) if you know the approximate time the request should take to complete and simply compare against the script's current execution time. Obviously neither of these is a perfect solution, but I hope they'll give you some insight into your options.
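As a concrete sketch of the checkpoint idea (the table, column, and stage names are invented for illustration):

<?php
// worker sketch: bump a stored percentage as each stage of the job completes
function setProgress(PDO $pdo, $processId, $percent) {
    $stmt = $pdo->prepare('UPDATE process_status SET percent = ? WHERE process_id = ?');
    $stmt->execute(array($percent, $processId));
}

setProgress($pdo, $pid, 1);     // process has started
downloadData();                 // checkpoint 1
setProgress($pdo, $pid, 40);
parseData();                    // checkpoint 2
setProgress($pdo, $pid, 80);
saveResults();                  // checkpoint 3
setProgress($pdo, $pid, 100);   // done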
I suggest using the AJAX method, but not using a file or a database. You could probably use session values or something like that; that way you don't have to open a database connection or a file to do anything.
In the past, I've just written messages out to the page and used flush() to flush the output buffer. Very simple, but it may not work correctly on every web server or with every web browser (as they may do their own internal buffering).
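For reference, that approach looks roughly like this; whether it actually streams depends on server and browser buffering, and the loop and helper are stand-ins:

<?php
// naive streaming: print a status line after each unit of work and push it out
foreach ($ids as $i => $id) {
    process($id);                                              // stand-in for the real work
    echo 'Processed ' . ($i + 1) . ' of ' . count($ids) . "<br>\n";
    @ob_flush();                                               // flush PHP's buffer if one is active
    flush();                                                   // then flush to the browser
}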
Personally, I like your second option the best. Should be reliable and fairly simple to implement.
I like option 2 - using AJAX to read a status file that PHP writes to periodically. This opens up a lot of different presentation options. If you write a JSON object to the file, you can easily parse it and display things like a progress bar, status messages, etc...
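A sketch of that JSON variant, assuming a hypothetical status.json the worker overwrites after each step; on the client, something like jQuery's $.getJSON('status.json', ...) can parse it directly:

<?php
// worker side (sketch): overwrite the status file after each step
file_put_contents('status.json', json_encode(array(
    'percent' => $percent,               // e.g. 37
    'message' => 'Resizing images...',   // example status text
    'updated' => time(),
)), LOCK_EX);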
A 'dirty' but quick-and-easy approach is to just echo out the status as the script runs along. So long as you don't have output buffering on, the browser will render the HTML as it receives it from the server (I know WordPress uses this technique for its auto-upgrade).
But yes, a 'better' approach would be AJAX, though I wouldn't say there's anything wrong with 'breaking it up' using redirects.
Why not incorporate 1 & 2, where AJAX sends a request to script.php?step=1, checks response, writes to the browser, then goes back for more at script.php?step=2 and so on?
If you can do away with IE, then use server-sent events. It's the ideal solution.
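A minimal server-sent events sketch on the PHP side (the loop is a stand-in for real work; the client would listen with new EventSource('events.php')):

<?php
// events.php (sketch): stream progress updates to the browser as they happen
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

for ($done = 0; $done <= 100; $done += 10) {   // stand-in for the real long-running job
    echo "data: {$done}\n\n";                  // SSE frame format: "data: ...\n\n"
    @ob_flush();
    flush();
    sleep(1);
}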
I currently have a class Status, and I call it throughout my code as I perform various tasks, for example - when I upload an image, I call $statusHandler = new Status(121), when I resize it, I call $statusHandler = new Status(122).
Every statusID corresponds to a certain text stored in the database.
Class status retrieves the text, and stores in $_SESSION.
I have another file, getstatus.php, that returns the current status.
My idea was to call getstatus.php every 500 milliseconds with AJAX (via jQuery), and append the text to the webpage.
This way the user gets (almost) real-time data about what calculations are going on in the background.
The problem is that I only seem to be getting the last status.
I thought it was just a result of things happening too quickly, so I called sleep() after calling new Status(). This delayed the entire output of the page, meaning PHP didn't output any text until it had finished running through the code.
Does PHP echo data only after it finishes running through all the code, or does it do it real-time, line-by-line?
If so, can anyone offer any workarounds so that I can achieve what I want?
Thanks.
The default session implementation (flat file) sets a file lock on the session file when session_start() is called. This lock is not released until the session mechanism shuts down, i.e. when the script has ended or session_write_close() is executed. Another request/thread that wants to access the same session has to wait until this lock has been released.
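In practice that means the long-running script has to re-open the session, write, and release it around each update, roughly like this (a sketch only; the loop, key, and file names are assumptions):

<?php
// long-running script: hold the session lock only for the instant it takes to store progress
foreach ($ids as $i => $id) {
    runQueries($id);                    // the existing slow work, done without the lock

    session_start();                    // re-acquire the session (and its lock)...
    $_SESSION['progress'] = (int) round((($i + 1) / count($ids)) * 100);
    session_write_close();              // ...and release it immediately
}

<?php
// getprogress.php (sketch): the file the AJAX call polls
session_start();
echo isset($_SESSION['progress']) ? $_SESSION['progress'] : 0;
session_write_close();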