I need to provide interaction between JavaScript on an HTML page and a PHP script.
- Use AJAX.
OK. But the problem is that the PHP script runs for a long time, and I need to know the state of its processing (e.g. 60% complete).
What should I do? Create two PHP scripts (client & server) and make an AJAX request to client.php, which in turn queries server.php via sockets or something similar?
Are there more elegant solutions?
What if you had the script doing the processing write its status to a file once in a while? Then make a second script that reads the file and returns the status of the original one.
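A minimal sketch of that idea (file locations and names are just placeholders):
// worker.php - the long-running script, writing its progress as it goes
$statusFile = '/tmp/job_status.txt';   // assumed location for the status file
$totalSteps = 20;
for ($i = 1; $i <= $totalSteps; $i++) {
    // ... do one chunk of the real work here ...
    file_put_contents($statusFile, round($i / $totalSteps * 100) . '%');
}
// status.php - the second script that the AJAX call polls
$statusFile = '/tmp/job_status.txt';
echo file_exists($statusFile) ? file_get_contents($statusFile) : '0%';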
You should never have a long-running process being executed entirely within an HTTP session.
A simple and common approach to this problem is message queuing. Basically, you have your UI queue up the request into a database table and then have external daemon(s) process the queue.
To provide feedback, have the daemon periodically update the table with the status for the row it's currently working on. Then, your javascript code can make AJAX requests to a script that retrieves the status for that work item from the database and displays it to the user.
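For illustration, a rough sketch of the status-retrieval script (table and column names are assumptions, not a prescribed schema):
// status.php - returns the queue row's progress for the AJAX poll
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // placeholder credentials
$stmt = $pdo->prepare('SELECT status, percent_done FROM job_queue WHERE id = ?');
$stmt->execute(array((int) $_GET['job_id']));
header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));
// the external daemon is what UPDATEs status/percent_done as it works through the queue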
See: Dealing with long server-side operations using ajax?
The AJAX call starts the PHP script and returns the information that the script is running.
The main script creates lock.file.
A script called from cron checks whether lock.file exists and, if so, runs the actual worker script.
The worker script saves its current progress into progress.txt.
AJAX reads progress.txt, and once progress reaches 100% it reports that processing is finished.
edited: Thanks to Justin for pointing out the timeout problem ;)
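A rough sketch of those steps (all file names and paths here are assumptions):
// start.php - hit by the AJAX call; just requests a run and returns immediately
touch('/tmp/lock.file');
echo 'script is running';

// cron_runner.php - called from cron; runs the worker only when lock.file exists
if (file_exists('/tmp/lock.file')) {
    unlink('/tmp/lock.file');                       // consume the request
    $total = 10;
    for ($i = 1; $i <= $total; $i++) {
        // ... one unit of the real work ...
        file_put_contents('/tmp/progress.txt', ($i / $total) * 100);
    }
}
// AJAX keeps polling a tiny script that echoes progress.txt and treats 100 as "finished".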
If you want to be really fancy, write output from the PHP script to stdout and capture it via a pipe. This would require running the PHP script using exec() or proc_open() (http://php.net/manual/en/function.proc-open.php) and piping the output to a file (or, if you want to be extra-extra fancy, using node.JS to listen for that data).
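A hedged sketch of the proc_open() variant (the command and file names are made up):
$spec = array(
    0 => array('pipe', 'r'),   // child's stdin
    1 => array('pipe', 'w'),   // child's stdout - progress lines are read from here
    2 => array('pipe', 'w'),   // child's stderr
);
$proc = proc_open('php worker.php', $spec, $pipes);
if (is_resource($proc)) {
    while (($line = fgets($pipes[1])) !== false) {
        // keep only the latest status line where a poller can read it
        file_put_contents('/tmp/progress.txt', trim($line));
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
}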
There are quite a few ways to accomplish this:
Node.JS
An Ajax query every x seconds
A META/Javascript page reload
An iframe that is routinely reloading with the status in it.
Good luck!
You could use PHP's output buffering (see ob_flush) to flush the contents at certain points in your script and tailor your JavaScript so that it uses the flushed contents. I believe readyState in your AJAX call won't be set to 4 on flushes, so that's where you'll have to handle things yourself (see this article). I think it's a much nicer way than writing to a file and checking its contents.
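As a minimal, untested sketch of the server side (names invented; the client would read xhr.responseText each time readyState hits 3):
// long_task.php - streams progress to the browser by flushing as it goes
header('Content-Type: text/plain');
for ($step = 1; $step <= 10; $step++) {
    // ... do one chunk of the real work here ...
    sleep(1);                          // stands in for slow work
    echo 'progress: ', ($step * 10), "%\n";
    if (ob_get_level() > 0) {
        ob_flush();                    // push PHP's output buffer, if one is active
    }
    flush();                           // push the web server's buffer
}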
on your process.php:
session_start();
// 1st task
$_SESSION['progress'] = 0;
// your code for the first task here ...
// 2nd task
$_SESSION['progress'] = 10;
// your code for the 2nd task ...
// 3rd task
$_SESSION['progress'] = 17;
// continue ...
// everything finished?
$_SESSION['progress'] = 100;
on your progress.php:
session_start();
// simply output the current progress
echo $_SESSION['progress'];
Now, from your client side, just make a request to progress.php, receive the number, and feed it to your progress bar ...
I didn't check this myself, but I hope it works! :)
Related
I am developing a web application using PHP and MySQL along with AJAX. One of my scripts fetches data from a MySQL table. But what if I want to cancel the execution of the PHP script that I'm calling to get the data, in the middle of its execution? Let me explain further: say an AJAX call takes 30 minutes to complete due to a heavy loop, and I want to exit from that call before completion by clicking some button. How can I achieve that goal? Otherwise, my script runs well, except that the application hangs if I don't want to wait for the final AJAX response text and try to switch to another page of the web application.
You can create a script like this:
$someStorage->set($sessionId . '_someFuncStop', true);
Call it through AJAX when the STOP button is pressed.
In your script with the loop, check that flag from storage on each iteration:
while (true) {
    // ... do one unit of work ...
    if ($someStorage->get($sessionId . '_someFuncStop') === true) break;
}
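$someStorage is left abstract above; purely as an illustration, a file-based stand-in could look like this (function names are hypothetical):
// stop_flag.php - trivial file-backed flag keyed by session id
function setStopFlag($sessionId) {
    file_put_contents(sys_get_temp_dir() . '/' . $sessionId . '_someFuncStop', '1');
}
function stopFlagIsSet($sessionId) {
    return file_exists(sys_get_temp_dir() . '/' . $sessionId . '_someFuncStop');
}
// stop.php (hit by the STOP button's AJAX call):  session_start(); setStopFlag(session_id());
// inside the long loop:                           if (stopFlagIsSet(session_id())) break;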
To the best of my knowledge, PHP doesn't support event listeners that could interrupt a running script from an external cause.
There are 2 paths you might want to consider (if you don't want to write shell scripts on the server that would terminate the system processes executing the script):
The ignore_user_abort function, though it is not 100% reliable (see the sketch after this list):
http://php.net/manual/en/function.ignore-user-abort.php
Inside the loop you wish to terminate, make a database call (or read from a file) where you can set some kind of flag, and if that flag is set, run a break/return/die command inside the script. The button you mentioned can then write to the database/file and set the interrupt flag.
(In general this is not really useful for script interruption, since most scripts run in tens of milliseconds and the flag would not be set fast enough to terminate them; for a script running tens of minutes, however, this is a viable solution.)
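A sketch of the first option, assuming the STOP button aborts the XHR (e.g. via xhr.abort()) so the connection actually closes:
ignore_user_abort(true);        // keep running after the disconnect so we can clean up ourselves
while (true) {
    // ... one chunk of the heavy work ...
    echo ' ';                   // the abort is only detected when output is attempted
    flush();
    if (connection_aborted()) {
        // the client disconnected: release resources and bail out
        break;
    }
}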
Requirement:
I need to run a background process (per a user request) that takes about 30 to 60 seconds to complete. I'd like to give the user some status feedback. Note: Toly is right, 'Background' is not required.
What's working:
The process prints about 20 status messages during this time, and I retrieve them with proc_open, listening on a read pipe using fgets. I can save those messages into a session variable, and using timestamps (to help debug) I can see that the session array is being written to with these messages as the process progresses.
The Trouble:
My plan was to poll the server with AJAX calls (every second) to retrieve these session vars for display in the DOM. The bottleneck seems to be that the server cannot service the AJAX request while it's still running the background process; everything dumps out at once when the background process completes. From what I can tell, the issue is not output buffering, because the (debugging) timestamps saved with each process message show that the server is writing to the session var sequentially, which is how I know the proc_open and pipe reads are working as I expect. The issue appears to be that the server cannot give the AJAX request its JSON object until it is done with the process, or, probably more accurately, done with the loop that is reading the pipe.
Obvious Misconception:
I thought sending a process to the background (using &) might give me a solution here. Apparently I do not know the difference between a background process and a forked process. What benefit is gained - if any - by running a process in the background when doing so appears to make no difference to me in this scenario?
Possible Solutions:
I do not expect the user-initiated process that runs this scenario to be that heavy, but if there's something I can build into this solution that would help with a heavy load, I would like to do that now.
Is this a multi-threading (pthreads) or a multi-process (fork) solution?
Or, should I save a process id, stop polling it with the while( .. fgets ..) loop, and then come back to the process after the server has serviced the AJAX request?
I suppose I could run fake status messages and then respond accurately when the results come back after completion. The time to process the request is not dependent upon the user, so my fake timing could be pretty accurate. However, I would like to know what the solution would be for providing real-time feedback.
After googling for a day for a technique to get the same behavior you are describing here, I came up with an easy solution for my project.
A bit of important theory:
- session_start() and a set like $_SESSION["x"] = "y" will always lock the session file.
Case scenario:
- A - process.php - running through an AJAX call
- B - get_session.php - a second AJAX call
The main problem is/was that even if you set a $_SESSION value inside a process that is being run through AJAX, the other request will always have to wait for the session file to get unlocked, and it results in a sync between the two processes (A + B) - both finishing at the same time!
So, the easiest way to fix this matter and get a good result is by using session_write_close() after each set. E.g.:
$_SESSION["A"] = "B";
$_SESSION["x"] = "y";
session_write_close();
PS: The best approach is to have a custom set of functions to handle the sessions.
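For example, a pair of such helpers might look like this (a sketch; re-opening the session this way can emit header notices if output has already been sent):
function sessionSet($key, $value) {
    session_start();             // (re)acquire the session lock
    $_SESSION[$key] = $value;
    session_write_close();       // release the lock straight away
}
function sessionGet($key) {
    session_start();
    $value = isset($_SESSION[$key]) ? $_SESSION[$key] : null;
    session_write_close();
    return $value;
}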
Sorry for the mark-up; I just created a Stack Overflow account.
Why would you think that you need a background process? Also, where did you get the idea that you needed one?
A normal PHP script, with a sufficient timeout set and flush() called at every step of the way, will give you the output you need for your AJAX.
What's even easier, since you use sessions: make an AJAX request to a separate handler that just checks what's in the session and, if there is something new, returns the new part.
$_SESSION['progress'] = array();
inside process.php
$_SESSION['progress'][] = 'Done 5%';
// complete some commands
$_SESSION['progress'][] = 'Done 10%';
inside ajax.php
session_start();
$laststep = (int) $_GET['laststep'];
if (count($_SESSION['progress']) > $laststep) {
    // echo only the messages the client hasn't seen yet
    echo implode("\n", array_slice($_SESSION['progress'], $laststep));
}
inside your normal page
$.ajax({ url: 'ajax.php', type: 'GET', data: 'laststep=1', success: function(data){ show(data); } });
Something like that should work.
Situation:
My PHP/HTML page retrieves the contents of another page on a different domain every 5-10 minutes or so. I use a JavaScript setInterval() and a jQuery .load() to request content from the other domain into an element on my page. Each time it retrieves content, JavaScript compares the new content with the previous content, and then I make an AJAX call to a PHP script that sends me an email describing the changes.
Problem:
It's all working fine and dandy except for the fact that I need a browser constantly open, requesting the updates.
Question:
Is there a way to accomplish this with some sort of 'self-executing' script on the server? Something that I would only have to start once, and that continues to run on its own, without needing a browser to be open, for as long as I want the script to run?
Thanks in advance!
P.S. I'm not a PHP/JavaScript expert by any means, but I can find my way around.
I believe the thing you are looking for is a cron job.
If your script relies on Javascript for proper execution, you will need to use a browser to accomplish your goals.
However, if you can alter your script to perform all of the functionality via PHP, perhaps using cURL to request the necessary data, you can use a cron job to execute the script at regular intervals.
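A rough sketch of what the cron-driven version could look like (the URL, paths, address, and 5-minute schedule are all placeholders):
// Example crontab entry, every 5 minutes:
//   */5 * * * * php /path/to/fetch_and_compare.php

// fetch_and_compare.php
$ch = curl_init('http://example.com/page-to-watch');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$new = curl_exec($ch);
curl_close($ch);

$cache = '/tmp/last_content.html';
$old = file_exists($cache) ? file_get_contents($cache) : '';
if ($new !== false && $new !== $old) {
    mail('you@example.com', 'Page changed', 'The watched page has changed.');   // or reuse your existing mail script
    file_put_contents($cache, $new);
}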
If you're running a script at an interval, I would instead recommend a bash script that runs in the background.
#!/bin/bash
while [ 1 ]
do
php "script.php"
sleep 300
done
Then you can run the script with something like nohup ./bash.sh & (300 seconds = 5 minutes).
I'm calling a Java program with a PHP system call. The Java program takes a while to run but will eventually produce a PDF file with a known filename.
I need to keep checking for this file until it exists and then serve up a link to it. I assume a while loop will be involved but I don't want it to be too resource intensive. What's a good way of doing this?
Basically you got it right
while (!file_exists($filename)) sleep(1);
print 'download PDF';
The sleep gives one second between checks, so it won't stress your CPU for nothing.
This will do the work, but you may want to specify an additional timeout so it cannot wait forever, e.g.:
$timeout = 300;   // give up after 5 minutes (adjust as needed)
$start   = time();
while (!file_exists($pathToFile) && (time() - $start) < $timeout)
{
    sleep(1);
}
If you need to send it back to the browser, you should probably investigate using an AJAX call on a setInterval timer and a PHP script that checks for the file's existence. You can do this in two ways:
flush() HTML back to the browser that includes JavaScript which starts a polling process: AJAX on the browser's poll side, and your PHP script with an AJAX handler to process each poll.
If flush() doesn't work, then you should return the HTML of your PHP script BEFORE setting off your Java process. In that code, put two AJAX calls: one that starts the actual Java process and one that starts a polling service looking for the file.
Long-running scripts may time out the browser before you can get a response from your Java application, which is why you'll likely need the browser to work asynchronously from your Java process.
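For the poll side, a minimal sketch of the PHP endpoint such an AJAX call could hit (the file path and JSON shape are assumptions):
// checkpdf.php - polled by the browser every few seconds
$pdf = '/path/to/output/report.pdf';    // the known filename the Java program will produce
header('Content-Type: application/json');
echo json_encode(array('ready' => file_exists($pdf)));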
On the other hand, if this is a pure PHP script, or the Java process takes less than a typical browser timeout, you can just use something like:
$fileExists = false;
while (!$fileExists) {               // loop until your file is there
    $fileExists = checkFileExists(); // check to see if your file is there
    sleep(5);                        // sleep for 5 seconds before running the loop again
}
You didn't mention whether this would be a high-traffic call (for lots of public users) or a reporting-type application. If high traffic, I would recommend the AJAX route, but if low traffic, then the code above.
I am working on a tool in PHP that processes a lot of data and takes a while to finish. I would like to keep the user updated on what is going on and the current task being processed.
What is, in your opinion, the best way to do it? I've got some ideas but can't decide on the most effective one:
The old way: execute a small part of the script and display a page to the user with a Meta Redirect or a JavaScript timer to send a request to continue the script (like /script.php?step=2).
Sending AJAX requests constantly to read a server file that PHP keeps updating through fwrite().
Same as above but PHP updates a field in the database instead of saving a file.
Does any of those sound good? Any ideas?
Thanks!
Rather than writing to a static file you fetch with AJAX, or to an extra database field, why not have another PHP script that simply returns a completion percentage for the specified task? Your page can then update the progress via a very lightweight AJAX request to said PHP script.
As for implementing this "progress" script, I could offer more advice if I had more insight as to what you mean by "processes a lot of data". If you are writing to a file, your "progress" script could simply check the file size and return the percentage complete. For more complex tasks, you might assign benchmarks to particular processes and return an estimated percentage complete based on which process has completed last or is currently running.
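As a sketch of the file-size variant (the file name and expected size are assumptions):
// progress.php - polled by the page's lightweight AJAX request
$outputFile   = '/tmp/export.csv';   // the file the long-running task is writing
$expectedSize = 5000000;             // rough final size in bytes, estimated beforehand
$current = file_exists($outputFile) ? filesize($outputFile) : 0;
echo min(100, round($current / $expectedSize * 100));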
UPDATE
This is one suggested method to "check the progress" of an active script which is simply waiting for a response from a request. I have a data mining application that I use a similar method for.
In the script that makes the request you're waiting for (the script you want to check the progress of), you can store a progress variable for the process, either in a file or a database. I use a database, as I have hundreds of processes running at any time which all need to track their progress, and I have another script that allows me to monitor them. When the process begins, set the variable to 1.
You can easily select an arbitrary number of 'checkpoints' the script will pass and calculate the percentage given the current checkpoint. For a large request, however, you might be more interested in knowing the approximate percent of the request that has completed. One possible solution is to know the size of the returned content and set your status variable according to the percentage received at any moment: if you receive the request data in a loop, each iteration you could update the status, or if you are downloading to a flat file you could poll the size of the file.
This could be done less accurately with time (rather than file size) if you know the approximate time the request should take to complete and simply compare against the script's current execution time. Obviously neither of these is a perfect solution, but I hope they give you some insight into your options.
I suggest using the AJAX method, but not with a file or a database. You could probably use session values or something like that; that way you don't have to create a connection or open a file to do anything.
In the past, I've just written messages out to the page and used flush() to flush the output buffer. Very simple, but it may not work correctly on every web server or with every web browser (as they may do their own internal buffering).
Personally, I like your second option the best. Should be reliable and fairly simple to implement.
I like option 2 - using AJAX to read a status file that PHP writes to periodically. This opens up a lot of different presentation options. If you write a JSON object to the file, you can easily parse it and display things like a progress bar, status messages, etc...
A 'dirty' but quick-and-easy approach is to just echo out the status as the script runs along. So long as you don't have output buffering on, the browser will render the HTML as it receives it from the server (I know WordPress uses this technique for its auto-upgrade).
But yes, a 'better' approach would be AJAX, though I wouldn't say there's anything wrong with 'breaking it up' using redirects.
Why not incorporate 1 & 2, where AJAX sends a request to script.php?step=1, checks the response, writes to the browser, then goes back for more at script.php?step=2, and so on?
If you can do away with IE, then use Server-Sent Events. It's the ideal solution.
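A minimal sketch of an SSE endpoint in PHP (the file name is made up; the page would attach with new EventSource('events.php')):
// events.php - pushes progress to the browser as Server-Sent Events
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
for ($pct = 0; $pct <= 100; $pct += 10) {
    // ... do the next slice of the real work here ...
    echo "data: {$pct}%\n\n";       // each SSE message is "data: ..." plus a blank line
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
    sleep(1);                        // stands in for slow work
}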