I have a form in page1.php which submits to page2.php and also triggers an AJAX post to page3.php (with no success handler). page2.php might need to run for an hour, but the user doesn't need its results. I do need the user to see page2.php, but they might navigate away.
Do I need to use both of these two functions in page2.php? Or just one of them? Or none? I want to make sure the script in page2.php runs to the end.
Page1.php
<form id="form" action="page2.php" method="post">
    <!-- form input elements with names -->
</form>
<script>
$('#form').submit(function() {
    $.post('page3.php', {
        var1: $('input[name="name1"]').val(),
        var2: $('input[name="name2"]').val()
    });
});
</script>
Page2.php
<?php
ignore_user_abort(true); // Allow user to navigate away
set_time_limit(0); // Allow script to run indefinitely
// a lot of code which will run for a while - between 3 minutes and an hour
?>
Page3.php
<html>
<!--some code here including links to go to page4.php-->
</html>
I am asking partly because I thought there was no need for either of these functions, but I was told to use them. When I try using them, even though there is a die(); and the script stops, it still seems to be processing something, and because of this "indefinitely" I'm afraid it will be too much for the server. I don't want to add unnecessary load.
Yes, you would need both of those functions to accomplish your current criteria. My suggestion, though, would be to move this out of the HTTP protocol entirely. Depending on what your script actually does, if it requires no further interaction from the client it would be best run from the command line.
One approach would be to create a cron script that is called at the needed intervals; it would then read from a queue which page2.php populates.
If there is anything in the queue, the cron script processes it just as page2.php currently does. Since your script runs for a long period of time, I would suggest using a locking mechanism for the cron script; see php.net/flock for a simple file-system lock. You check the file: if it's locked, the script is already running.
Here is a simple example that you put into a standalone script for processing via cron:
$fp = fopen(DATA_PATH . '/locks/isLocked', 'w+'); // DATA_PATH is a constant defined elsewhere in your app
if (!flock($fp, LOCK_EX | LOCK_NB)) { // try to take an exclusive, non-blocking lock
    $logger->info('Already Running'); // $logger is whatever logging object you use
    exit(0);
}
fwrite($fp, time()); // write the start time so we can inspect when it last ran if needed
try {
    if (hasQueue()) { // checks to see if any jobs are waiting in mysql
        run();        // the processing normally done by page2.php
    }
} catch (Exception $e) {
    // something went wrong; you could log it / email yourself, etc.
}
flock($fp, LOCK_UN); // unlock the file
fclose($fp);
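To schedule it, a crontab entry along these lines (the script path here is an assumption; use wherever you put the standalone script) would invoke the worker every minute, while the flock() check above keeps overlapping runs from doing double work:

* * * * * php /path/to/process_queue.php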
Related
I have some PHP code that does some tasks.
Let's say someone executes it by requesting https://localhost/code.php.
I have an employee who executes the script over cURL from a separate server. What is the best way to prevent him from launching the script a second time before the already-running script has actually completed?
TL;DR: I'm looking for a way to make the second invocation wait until the currently running task completes (sleep for a few seconds, or until the first task finishes).
Any ideas? Thanks.
While a session won't work with cURL, the idea is valid -- you need to store something persistent outside of your script. So how about writing to a local file, or to a database?
if (file_exists('lock.txt')) {
    die; // another run is already in progress
}
file_put_contents('lock.txt', 'This file prevents script execution', LOCK_EX);

// ... your script code here ...

unlink('lock.txt');
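Note that there is a small race window between the file_exists() check and the write. A stricter variant (a sketch, not part of the original answer) uses fopen() with mode 'x', which fails atomically if the file already exists:

$fp = @fopen('lock.txt', 'x'); // 'x' mode: create the file, fail if it already exists
if ($fp === false) {
    die('Already running');
}
fwrite($fp, 'This file prevents script execution');
fclose($fp);

// ... your script code here ...

unlink('lock.txt');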
If you know that there is only one user who will hit your server you can simply use session data.
<?php
session_start();
// note the parentheses: ?? has lower precedence than ===,
// so without them the unset index would still raise a notice
if (true === ($_SESSION["NOT_FINISHED"] ?? false)) {
    die("Previous job is not finished yet!");
} else {
    $_SESSION["NOT_FINISHED"] = true;
    // start whatever job needs to be done here
    // ...
    // when the job is done and finished, release our busy flag
    unset($_SESSION["NOT_FINISHED"]);
}
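One wrinkle worth noting (an addition, not part of the original answer): PHP holds a lock on the session file for the duration of the request, so a second request would simply block inside session_start() until the first one ends, regardless of the flag. Releasing the lock with session_write_close() before the long job lets the flag do the work; a sketch, with runLongJob() as a hypothetical stand-in for the actual work:

<?php
session_start();
if ($_SESSION["NOT_FINISHED"] ?? false) {
    die("Previous job is not finished yet!");
}
$_SESSION["NOT_FINISHED"] = true;
session_write_close(); // release the session file lock so other requests aren't blocked

runLongJob(); // hypothetical placeholder for the long-running work

session_start(); // re-open the session to clear the flag
unset($_SESSION["NOT_FINISHED"]);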
I've searched a lot and didn't find a simple answer. All I need is to not allow the user to run the same PHP script too quickly: if they submit the form, I need to block the same form from being run again for 10 seconds.
I tried using:
if (isset($_SESSION['var'])) {
    exit;
}
and setting the variable at the start of the script and then unsetting it when the script finishes, but that only stops him from running the script while it's currently running. Is there any way to do this?
You can use a timeout to prevent the user from running the script too quickly, storing the timestamp in the session like this:
session_start(); // at the beginning of the PHP file
define("TIMEOUT", 10); // 10 sec

// Check timeout: clear the flag once TIMEOUT seconds have passed
// (the original compared $_SESSION['expire']-time(), which is always negative)
if (isset($_SESSION['expire'])) {
    if (time() - $_SESSION['expire'] > TIMEOUT) {
        unset($_SESSION['expire']);
    }
}

// Allow the submit only after 10 sec
if (!isset($_SESSION['expire'])) {
    $_SESSION['expire'] = time();
    // Your submit code
}
I am building a web service using PHP:
Basically,
The user sends a request to the server via HTTP, e.g. request.php.
The server starts PHP code asynchronously, e.g. update.php.
The connection with the user is finished.
The code in update.php is still running, and will finish after some time.
The code in update.php is finished.
The problem is getting PHP to run some external code asynchronously.
Is that possible? Is there another way to do it? With shell_exec?
Please, I need insights! An elegant way is preferable.
Thank you!
The best approach is using a message queue like RabbitMQ, or even a simple MySQL table.
Each time you add a new task in the front controller, it goes into the queue. Then update.php, run by a cron job, fetches it from the queue, processes it, saves the results and marks the task as finished.
This will also help you distribute load over time, preventing a DoS caused by your own script.
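A minimal sketch of the MySQL-table variant (the jobs table, its columns, the credentials and the process() function are all assumptions, not part of the original answer):

<?php
// request.php - enqueue a job in a hypothetical `jobs` table (id, payload, status)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode($_POST)]);

<?php
// update.php - run from cron; claim one pending job and process it
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->beginTransaction();
$job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'
                    ORDER BY id LIMIT 1 FOR UPDATE")->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    $pdo->commit();
    process(json_decode($job['payload'], true)); // hypothetical long-running work
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
} else {
    $pdo->commit();
}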
You could have the user connect to update.php, generate some sort of unique ID to keep track of the process, and then call fsockopen() on itself with a special GET variable to signify that it's doing the heavy lifting rather than user interaction. Close that connection immediately, and then print out the appropriate response to the user.
Meanwhile, look for the special GET variable you specified, and when present call ignore_user_abort() and proceed with whatever operations you need in that branch of the if clause. So here's a rough skeleton of what your update.php file would look like:
<?php
if (isset($_GET['asynch'])) {
    ignore_user_abort();
    // check for $_GET['id'] and validate,
    // then execute long-running code here
} else {
    // generate $id here
    $host = $_SERVER['SERVER_NAME'];
    $url = "/update.php?asynch&id={$id}";
    if ($handle = fsockopen($host, 80, $errno, $errstr, 5)) {
        $data = "GET {$url} HTTP/1.0\r\nHost: {$host}\r\n\r\n";
        fwrite($handle, $data);
        fclose($handle);
    }
    // return a response to the user
    echo 'Response goes here';
}
?>
You could build a service with PHP.
Or launch a PHP script from the shell: system("php myScript.php param param2 > /dev/null 2>&1 &"); note that the output must be redirected, otherwise system() will wait for the script to finish instead of returning immediately.
Look into worker processes with Redis (resque) or Gearman.
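A minimal sketch of the Gearman route, using the standard pecl/gearman extension (the 'update' task name and the run_update() function are assumptions):

<?php
// client.php - fire the job and return immediately
$client = new GearmanClient();
$client->addServer(); // default localhost:4730
$client->doBackground('update', json_encode($_POST)); // queue the job, don't wait

<?php
// worker.php - long-lived process started from the command line
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction('update', function (GearmanJob $job) {
    run_update(json_decode($job->workload(), true)); // hypothetical worker logic
});
while ($worker->work()); // block, handling jobs as they arrive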
I have something like this:
<?php
$perform_check = 1; # Checks data with ID: 1
while (true)
{
    # code
}
?>
But some data must be updated while this process runs. Is it possible to update this data from another script?
I tried something like this:
index.php
<?php
setcookie("data", 19, time()+3600);
?>
and
loop.php
<?php
while (true)
{
    if ($perform_check != $_COOKIE['data']) $perform_check = $_COOKIE['data'];
    # rest of code
    flush();
    usleep(300000); // sleep() only takes whole seconds; usleep() allows 0.3 s
}
?>
But it doesn't work. I also tried $_SESSION but the page crashes on session_start().
Is it somehow possible?
Cookies are sent as an HTTP header when PHP sends a response through the web server (for example Apache).
All HTTP headers are sent before any output; if you output anything, the headers (including the Set-Cookie header) are sent first.
After you flush() the first time, you can no longer set cookies or other headers.
If you want a progress indicator or updates, you need to initiate the operation from javascript and poll at an interval. In the process with the loop, save the progress to shared memory, a file or a database (in that order of preference), then read that data from the process started by the javascript progress/update check.
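A minimal file-based sketch of that idea (the file path and the doChunkOfWork() function are assumptions):

<?php
// loop.php - the long-running worker writes its progress to a file
$progressFile = '/tmp/job_progress.json';
for ($i = 1; $i <= 100; $i++) {
    doChunkOfWork($i); // hypothetical unit of work
    file_put_contents($progressFile, json_encode(['percent' => $i]), LOCK_EX);
}

<?php
// progress.php - polled from javascript at an interval
header('Content-Type: application/json');
echo @file_get_contents('/tmp/job_progress.json') ?: '{"percent":0}';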
You could use the existence of a file to flag the ending of the process. For example, create a lock file,
$lock_file = '/tmp/my_job.lock'; // some unique name (hypothetical path)
fopen($lock_file, 'w') or die("can't open file"); // create the flag file

while (file_exists($lock_file)) {
    // ...
    doStuff();
    // ...
}
If the file is removed by some other process, it should terminate.
I think a while loop isn't doing any good here. You should look into a WebSocket implementation for PHP: it keeps an open connection with your user, so you can react with listeners on both sides.
If you want the value to change while your code is looping, you need to check if the value has changed within the loop, not before.
A cookie or session will only work if the same user/browser is running both scripts. Writing to a file or database is the more usual approach.
I'm not quite sure what you really want to do here, but I guess you could do this:
<?php
$perform_check = 1; # Checks data with ID: 1
while ($perform_check) : ?>
$.ajax({
    type: 'post',
    data: // your data to be passed to the ajax script
    url: // the script in which you want to run the query
    success: function(data) {
        if (data == 'test') {
            // if you got the data you want, you could stop the loop
            <?php $perform_check = false; ?>
        }
    }
});
<?php endwhile; ?>
Hope this helps.
If you need to check very often, you should use a cron job that calls the function you use to check.
I have a PHP script (let's call it execute.php) that draws the whole page (HTML tags, body tags etc.) at the beginning and, after that, executes some commands (C++ programs) in the background. It then waits for these programs to terminate (some depend on the results of others, so they may be executed sequentially) and then runs a JavaScript snippet that auto-submits a form to another PHP script (which we will call results.php), because results.php needs the POST information from the previous script.
execute.php:
<?php
print "
<html>
<body>
Some HTML code here
</body>
</html>
";

// Here come some C++ program calls
$pid_program1 = run_in_background($program1);
$pid_program2 = run_in_background($program2);
while (is_running($pid_program1) or is_running($pid_program2))
{
    //echo(".");
    sleep(1);
}

// Here come some later C++ program calls that execute quickly
$pid_program3 = run_in_background($program3);
$pid_program4 = run_in_background($program4);
while (is_running($pid_program3) or is_running($pid_program4))
{
    sleep(1);
}
...
// We are now finished
print "
<form action=\"results.php\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
<script type=\"text/javascript\">
AutoSubmitForm( 'go_to_results' );
</script>
";
This works nicely if the C++ programs 1 and 2 execute quickly. However, when they take their time (around 25 minutes in total), the PHP script seems to fail to continue. Interestingly, the C++ programs 3 and 4 are nevertheless executed and produce the expected outputs.
However, when I put an echo("."); in the first while loop before the sleep(), it works and continues until the JavaScript autosubmit.
So it seems to me that the remaining PHP code (including the autosubmit) is, for whatever reason, not sent when there is no output in the first while loop.
I have also tried using set_time_limit(0) and ignore_user_abort(true), and other things like writing to an output buffer instead of the echo (I don't want to clutter the already displayed webpage), but none of these work.
When I run the same scripts on a machine with multiple cores, so that program1 and 2 can be executed in parallel, it also works, without the echo(".").
So I am currently very confused and can't find any error messages in the apache log or PHP log and thus would really appreciate your thoughts on this one.
EDIT
Thanks again for your suggestions so far.
I have now adopted a solution involving (really simple) AJAX and it's definitely nicer this way.
However, if the C++ program executions take "longer", it does not autosubmit to the results page (which this time is actually created; before, it failed to be).
Basically what I have done is:
process.php:
<?php
$params = "someparam=1";
?>
<html>
<body>
<script type="text/javascript">
function run_analyses(params){
    // Use AJAX to execute the programs independently in the background.
    // Allows the user to close the process page and come back later to the
    // results link, without needing to wait.
    if (window.XMLHttpRequest)
    {
        http_request = new XMLHttpRequest();
    }
    else
    {
        // Fallback for IE5 and IE6, as these don't support the above
        http_request = new ActiveXObject("Microsoft.XMLHTTP");
    }
    // Is http_request still false?
    if (!http_request)
    {
        alert('Giving up :( Cannot create an XMLHTTP instance');
    }
    http_request.onreadystatechange = function(){
        if (http_request.readyState == 4 && http_request.status == 200){
            // Could be used to display the progress of the execution
            //document.getElementById("output").innerHTML = http_request.responseText;
            // The program calls are finished -> go to the results page
            document.getElementById( "go_to_results" ).submit();
        }
    };
    http_request.open("POST", "execute.php", true);
    // Send the proper header information along with the request
    http_request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    http_request.setRequestHeader("Content-length", params.length);
    http_request.setRequestHeader("Connection", "close");
    http_request.send(params);
};
</script>
<?php
// Do some HTML markup
...
// Start the programs!
print "
<script type=\"text/javascript\">
run_analyses('".$params."');
</script>
<form action=\"results.html\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
";
?>
</body>
</html>
and execute.php contains the C++ program calls, the waiting routines and finally, via include("results.php"), the creation of the results page.
Again, for "not so long" program executions, the autosubmission works as expected, but not if it takes "longer". By "longer" I mean around 25 minutes.
I have absolutely no idea what could cause this as again, there are no error-messages to be found.
Am I missing a crucial configuration option there (apache, php, etc.)?
EDIT
As it turned out, letting the requested PHP script "echo" something repeatedly prevents the timeout. So it is basically the same as the PHP solution without AJAX, but this time, since the responseText of the AJAX request is not necessarily needed, the progress page is not cluttered, and it may be used as a workaround. Specifically, I would not necessarily recommend it as a general solution or good practice.
It occurs to me that a better approach would be to:
Output the complete HTML page
Show a loading message to the user
Send an AJAX request to start the external program
Wait for callback (waiting for external program to finish)
Repeat steps 3 and 4 until all programs have been executed
Update the page to tell the user what is going on
Submit the form
This way, you get the HTML to the user as quickly as possible, then you execute the programs sequentially in an orderly and controlled fashion without worrying about hitting the max_execution_time threshold. This also enables you to keep your user informed - after each AJAX callback, you can tell the user that "program ABC has completed, starting DEF..." and so on.
EDIT
Per request, I'll add an outline of how this could be implemented. A caveat, too: if you are going to be adding more javascript-driven functionality to your page, you'll want to consider using a library like jQuery or mootools (my personal favorite). This is a decision you should make right away: if you aren't going to be doing a lot of javascript beyond this, a library will only bloat your project, but if you are going to be adding a lot of javascript, you don't want to have to come back later and re-write your code because you added a library 3/4 of the way through the project.
I've used mootools to create this demonstration, but it isn't necessary or even advisable to add mootools if this is the only thing you're going to use it for. It is simply easier for me to write an example really quick without having to stop and think :)
First, the main page. We'll call this page view.php. This should contain your initial HTML as well as the javascript that will fire off the AJAX requests. Basically, this entire jsFiddle would be view.php: http://jsfiddle.net/WPnEy/1/
Now, execute.php looks like this:
$program_name = isset($_POST['program_name']) ? $_POST['program_name'] : false;
switch ($program_name) {
    case 'program1':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 1';
        break;
    case 'program2':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 2';
        break;
    case 'program3':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 3';
        break;
    case 'program4':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 4';
        break;
    default:
        die(json_encode(array(
            'program_name'=>'Invalid',
            'status'=>'FAILED',
            'error'=>true,
            'error_msg'=>'Invalid program'
        )));
        break;
}

$pid = run_in_background($program_path);
while (is_running($pid)) {
    sleep(1);
}

// check here for errors, get any error messages you might have
$error = false;
$error_msg = '';
// use this for failures that are not necessarily errors...
$status = 'OK';

die(json_encode(array(
    'program_name'=>$friendly_name,
    'status'=>$status,
    'error'=>$error,
    'error_msg'=>$error_msg
)));
execute.php would then be called once for each program. The $friendly_name variable gives you a way to send back something for the user to see. The switch statement makes sure the script isn't asked to execute anything you aren't expecting. The program is executed, and when it is done you send back a little package of information with the status, the friendly name, any errors, etc. This arrives in the javascript on view.php, which then decides whether there are more programs to run. If so, it calls execute.php again; if not, it submits the form.
This seems rather convoluted... And very risky. Any network glitch, the user's browser closing for whatever reason, or even a firewall timeout, and this script is aborted.
Why not run the whole thing in the background?
<?php
session_start();
$_SESSION['background_run_is_done'] = false;
session_write_close(); // release session file lock

set_time_limit(0);
ignore_user_abort(true); // allow job to keep running even if client disconnects

// ... your external stuff here ...

if ($successfully_completed) {
    session_start(); // re-open session file to update value
    $_SESSION['background_run_is_done'] = TRUE;
}

// ... use curl to submit the job-completion post here ...
?>
This disconnects the state of the user's browser from the processing of the jobs. You then just have your client-side code ping the server occasionally to monitor the job's progress.
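The polling endpoint can be as small as this (a sketch; the file name is an assumption):

<?php
// check_status.php - polled by the client to see whether the job has finished
session_start();
header('Content-Type: application/json');
echo json_encode(['done' => !empty($_SESSION['background_run_is_done'])]);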
Launching and managing multiple and long-running processes from a webserver PHP process is fraught with complications and complexity. It's also very different on different platforms (you didn't say which you are using).
Handling the invocation of these processes synchronously from your PHP execution is not the way to address this. You really need to run the programs in a separate session group, and use (e.g.) AJAX or Comet to poll their status.
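On a POSIX system, a minimal sketch of detaching a program into its own session (the program path and log file are assumptions) might look like:

<?php
// launch the program in a new session so it survives the PHP request;
// redirecting stdin/stdout/stderr lets shell_exec() return immediately
shell_exec('setsid /path/to/program > /tmp/program.log 2>&1 < /dev/null &');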