I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has a set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
Let me explain the situation better:
I have a PHP file which uses stream_copy_to_stream($src, $dest) to retrieve a stream from my local network. The function has to run for as long as I want: I can stop it at the end of the stream or earlier. So I want one button to start it and one button to stop it. The problem is the new instance created by the AJAX call: I can't act on the function that is actually recording, because the call reaches a different instance. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop that checks a certain condition on every pass, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required); see the sketch below.
It looks like a bad idea, however. Why do you want to do it?
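For illustration, a minimal sketch of that file-flag idea (do_some_work() and the flag file name are placeholders, not the asker's actual code):
set_time_limit(0);

while (true) {
    do_some_work(); // placeholder for the long-running work (e.g. the stream copy)

    // stop as soon as someone has created the flag file
    if (file_exists(__DIR__ . '/stop.flag')) {
        unlink(__DIR__ . '/stop.flag'); // clean up so the next run can start fresh
        break;
    }
}
A second script, hit by the stop button via AJAX, would then only need to call touch(__DIR__ . '/stop.flag');.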
var running = true;
function doSomething(){
    //do something........
}
setInterval(function(){ if(running){ doSomething(); } }, 2000); // this runs doSomething every 2 seconds
On button click, simply set running = false;
Your code looks like:
set_time_limit(0);
while (true) { // infinite loop
    doSomething(); // your code
}
Let's upgrade it
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php can change the flag

function should_i_stop_loop(){
    session_start(); // re-read the session data on every check
    if( $_SESSION['do_a_loop'] == false ) {
        //let's stop the loop
        exit();
    }
    session_write_close(); // release the lock again
}

while (true) {
    doSomething();
    should_i_stop_loop(); //your new function
}
Create a new file, stopit.php:
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is make a request to the stopit.php file (with AJAX or something similar).
Edit the code according to your needs; this is just one of many possible solutions.
Sorry for my English
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you had previously started.
There are a number of workarounds.
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times (a rough sketch appears after this list).
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can be remotely controlled by what you write to the file/database.
Try another programming language (this is a silly option, but I'm a fan of node). Node.js does this sort of thing very very easily.
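For the socket option above, a very rough sketch (not production code; do_some_work() and the port number are made up) of a single long-running CLI process that owns the work and listens for a "stop" command:
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if (!$server) {
    die("Could not start server: $errstr ($errno)\n");
}

$stop = false;
while (!$stop) {
    do_some_work(); // placeholder for the actual long-running task

    // briefly check whether someone asked us to stop
    // (@ suppresses the warning stream_socket_accept() emits on timeout)
    $conn = @stream_socket_accept($server, 0.1);
    if ($conn) {
        $command = trim(fgets($conn));
        if ($command === 'stop') {
            $stop = true;
        }
        fclose($conn);
    }
}
fclose($server);
Your web-facing script would then connect with stream_socket_client('tcp://127.0.0.1:9000') and write "stop\n" when the button is pressed.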
Related
I have a PHP script that does some tasks.
Let's say someone executes the script by visiting https://localhost/code.php.
I have an employee who executes the script over cURL from a separate server. What is the best way to prevent him from launching the script a second time before the already-running script has actually completed?
TL;DR: I need a way to make the second invocation wait (sleep for a few seconds, or until the first task completes) while the task that is currently running finishes.
TL;DR 2: I'm looking for a function that does what the title says.
Any ideas? Thanks.
While a session won't work with cURL, the idea is valid -- you need to set something persistent outside of your script. So, how about writing to a local file, or writing to a database?
if (file_exists('lock.txt')) die;
file_put_contents('lock.txt', 'This file prevents script execution', LOCK_EX);
// (... your script code here...)
unlink('lock.txt');
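As a variation on the same idea, flock() avoids the small race window between the file_exists() check and the write. A sketch, not a drop-in replacement for the code above:
$fp = fopen('lock.txt', 'c'); // create the file if needed, never truncate it
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    die('Another run is still in progress');
}

// (... your script code here...)

flock($fp, LOCK_UN);
fclose($fp);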
If you know that there is only one user who will hit your server you can simply use session data.
<?php
session_start();
if ($_SESSION["NOT_FINISHED"] ?? false) {
    die("Previous job is not finished yet!");
} else {
    $_SESSION["NOT_FINISHED"] = true;
    // start whatever job needs to be done here
    ...
    // when the job is done and finished, let's release our busy flag
    unset($_SESSION["NOT_FINISHED"]);
}
I'm using DHTMLX Scheduler on the front end and DHTMLX Connector on the backend as part of my radio automation app. Every time a user edits the calendar, an AJAX call is made to a file that looks like this:
require_once("dhtmlxScheduler_v4/connector/scheduler_connector.php");
require_once('QDRAconf.php');
$res = mysql_connect($QDRAconf['mysqlHost'], $QDRAconf['mysqlUser'], $QDRAconf['mysqlPass']);
mysql_select_db($QDRAconf['mysqlDb']);
// init the schedulerconnector
$conn = new SchedulerConnector($res);
// render the table
$conn->render_table("events","id","start_date,end_date,text");
This file is my "shim" that hooks up the fronted to the back end. I want to run another PHP script that writes the changes to my crontab, but it needs to happen after the DHTMLX library has updated the database. Trouble is, the DHTMLX library will automatically exit whenever it thinks it's done: sometimes it might not get past the first require_once('...') line so I can't just put require_once('cronwriter.php'); at the last line of the script.
My solution to this was to create a class with a destructor that updates the crontab with the latest changes. Since the php manual states that destructors will still be run if the exit() or die() function is called, I added a dummy class with a destructor that runs cronwriter.php script: (I added this to the beginning of the file.)
class ExitCatcher
{
    function __destruct()
    {
        require_once('cronwriter.php');
    }
}
//init the class
$ExitCatcher = new ExitCatcher;
For some reason, it doesn't work.
register_shutdown_function may offer a quick solution (a minimal sketch follows the checklist below); but you might save yourself some future trouble by inspecting the cause of that library's sporadic process halts.
A good place to start might be...
your browser's JS console for JS errors
your JS console's network tab for AJAX errors
your server's error logs for PHP errors
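For what it is worth, a minimal sketch of the register_shutdown_function route, assuming cronwriter.php is safe to run unconditionally at the end of every request:
// register this at the very top of the shim, before any DHTMLX code runs;
// the callback fires even when the library calls exit() or die()
register_shutdown_function(function () {
    require_once('cronwriter.php');
});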
I have a problem where I call a PHP function on page load. The function checks to see if a file exists and, if so, returns the filename; if the file doesn't exist, it runs a script which is fairly resource-intensive and takes time: converting an audio file into a waveform image. The problem is that the audio files are large, so creating the image can take some time, and if the audio file doesn't have this image file associated with it yet, the page load takes as long as the process does.
What I'm after is for this function to return a placeholder image if one doesn't exist, but carry on with the process after the page is loaded - or in the background. So in theory when the page is reloaded at a later date the correct image will be there.
I can get the return of the placeholder image currently but then the process stops and the image doesn't get generated. Here's what I have so far:
function example($file_path, $file_name) {
    if ($file_path) {
        if (file_exists("/path/to/folder/{$file_name}.png")) {
            return "/path/to/folder/{$file_name}.png";
        }
        if (!file_exists("/path/to/folder/{$file_name}.png")) {
            return "/path/to/folder/processing.png";
            // Some stuff in here
            // return $new_image;
        }
    }
    return FALSE;
}
As you can see, this just stops when the file doesn't exist, but I want the "stuff in here" to continue in the background. Is it possible, or do I need a different approach, like a cron job or something? Any help appreciated.
You might try a queuing system like Resque: https://github.com/chrisboulton/php-resque
You can then create a job that processes the information and quickly return with the "processing" image.
With this approach you won't know when it is finished though.
In my experience this is still easier than arguing with the operations guys to compile PHP with multi-threading support.
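A rough sketch of what the enqueueing side could look like with php-resque (the queue name, job class, and file paths are made up for illustration):
require 'lib/Resque.php';               // path depends on how php-resque is installed
Resque::setBackend('127.0.0.1:6379');   // Redis server backing the queue

// enqueue the slow conversion and return immediately with the placeholder image
$args = array(
    'audio_file' => '/path/to/audio.mp3',
    'image_file' => '/path/to/folder/file.png',
);
Resque::enqueue('waveforms', 'GenerateWaveform_Job', $args);

// the job class, which the worker must be able to load (e.g. via APP_INCLUDE):
class GenerateWaveform_Job
{
    public function perform()
    {
        // $this->args holds the array passed to Resque::enqueue()
        // ... run the slow waveform conversion here and write the PNG ...
    }
}
A worker started separately with something like QUEUE=waveforms php resque.php then processes the jobs in the background.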
I'd do it with AJAX. If the image is found, just put it there.
Otherwise, put the placeholder, and add a JS flag with data to load the waveform image.
In the PHP code that generates the HTML document, no conversion happens. You then have another request handler to handle requests coming from JS, which performs the conversion with the supplied data.
The data originally created by the HTML-document-generation code is passed to JS, which uses it to send a request for the conversion. While JS waits for the response you handle the loading state, and when the response arrives you put the image in the placeholder.
If you're running on FastCGI / FPM you could consider doing the following:
You put a regular <img> tag with the src attribute pointing to your script.
If your script needs to regenerate, you make the browser redirect to a processing image.
If the image is ready, you redirect to the created image (you could do an AJAX poll on the page as well)
How to do step 2?
Normally, the browser has to wait for your script to end before performing a render or redirect; but FastCGI (PHP-FPM) has a special function for this: fastcgi_finish_request. It's largely undocumented, but its use is simple:
if ($need_to_process) {
    header('Location: /path/to/processing.png');
    fastcgi_finish_request();
    // do processing here
} else {
    header('Location: /path/to/final_image.png');
}
Alternative
You can apply it to your existing process as well if you have a template that you can immediately render just before doing fastcgi_finish_request().
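A sketch of that alternative, assuming you have something like a render_template() helper and a generate_waveform() function (both hypothetical names):
// send the finished page (with the placeholder image) to the browser right away
echo render_template('player.html', array('waveform' => '/path/to/folder/processing.png'));
fastcgi_finish_request(); // the client receives the full response here

// the request keeps running server-side, invisible to the client
generate_waveform('/path/to/audio.mp3', '/path/to/folder/file.png');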
Yet another alternative
Use a task scheduler like Gearman.
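If you go the Gearman route, a minimal sketch could look like this (it assumes the pecl gearman extension and a gearmand server on localhost; the function name generate_waveform is made up):
// web request side: queue the job and return the placeholder immediately
$client = new GearmanClient();
$client->addServer(); // 127.0.0.1:4730 by default
$client->doBackground('generate_waveform', json_encode(array(
    'audio' => '/path/to/audio.mp3',
    'image' => '/path/to/folder/file.png',
)));

// worker side: a long-running CLI process started separately
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction('generate_waveform', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    // ... run the slow waveform conversion and write $params['image'] ...
});
while ($worker->work());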
you can use "try" and "finally"
try {
    return "hello world";
} finally {
    // do something
}
I am not able to comment because my reputation is below 50, but I wanted to note something on mohammadhasan's answer. It seems to work, but avoid having a 'return' statement in both the try and finally blocks:
try {
    return "hello world";
} finally {
    // do not put return here
}
Example:
function runner() {
    try {
        return "I am the trial runner";
    } finally {
        return "I am the default runner";
    }
}
echo runner();
Will only show "I am the default runner".
I have a PHP script (let's call it execute.php) that draws the whole page (HTML tags and body tags etc.) at the beginning and, after that, executes some commands (C++ programs) in the background. It then waits for these programs to terminate (some depend on the results of others, so they may be executed sequentially) and then has a JavaScript that auto-submits a form to another PHP script (which we will call results.php), because results.php needs the POST information from the previous script.
execute.php:
<?php
print"
<html>
<body>
Some HTML code here
</body>
</html>
";
// Here come some C++-program calls
$pid_program1 = run_in_background($program1);
$pid_program2 = run_in_background($program2);
while (is_running($pid_program1) or is_running($pid_program2))
{
    //echo(".");
    sleep(1);
}
// Here come some later C++-program calls that execute quickly
$pid_program3 = run_in_background($program3);
$pid_program4 = run_in_background($program4);
while (is_running($pid_program3) or is_running($pid_program4))
{
    sleep(1);
}
...
// We are now finished
print "
<form action=\"results.php\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
<script type=\"text/javascript\">
AutoSubmitForm( 'go_to_results' );
</script>
";
This works nicely if the C++ programs 1 and 2 execute quickly. However, when they take their time (around 25 minutes in total), the PHP script seems to fail to continue. Interestingly the C++ programs 3 and 4 are nevertheless executed and produce the expected outputs etc.
However, when I put an echo("."); in the first while loop before the sleep(), it works and continues until the JavaScript auto-submit.
So it seems to me that the remaining PHP code (including the auto-submit) is, for whatever reason, not sent when there is no output in the first while loop.
I have also tried using set_time_limit(0) and ignore_user_abort(true) and various other things, like writing to an output buffer (I don't want to clutter the already-displayed web page) instead of the echo, but none of these work.
When I run the same scripts on a machine with multiple cores, so that program1 and 2 can be executed in parallel, it also works, without the echo(".").
So I am currently very confused and can't find any error messages in the apache log or PHP log and thus would really appreciate your thoughts on this one.
EDIT
Thanks again for your suggestions so far.
I have now adopted a solution involving (really simple) AJAX and it's definitely nicer this way.
However, if the C++ program executions take "longer", it does not auto-submit to the results page, which is actually created this time (it failed to do so before).
Basically what I have done is:
process.php:
<?php
$params = "someparam=1";
?>
<html>
<body>
<script type="text/javascript">
function run_analyses(params){
// Use AJAX to execute the programs independently in the background
// Allows the user to close the process page and come back to the results link later, without needing to wait.
if (window.XMLHttpRequest)
{
http_request = new XMLHttpRequest();
}
else
{
//Fallback for IE5 and IE6, as these don't support the above writing/code
http_request = new ActiveXObject("Microsoft.XMLHTTP");
}
//Is http_request still false
if (!http_request)
{
alert('Giving up :( Cannot create an XMLHTTP instance');
}
http_request.onreadystatechange=function(){
if (http_request.readyState==4 && http_request.status==200){
// Maybe used to display the progress of the execution
//document.getElementById("output").innerHTML=http_request.responseText;
// Call of programs is finished -> Go to the results-page
document.getElementById( "go_to_results" ).submit();
}
};
http_request.open("POST","execute.php",true);
//Send the proper header information along with the request
http_request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
http_request.setRequestHeader("Content-length", params.length);
http_request.setRequestHeader("Connection", "close");
http_request.send(params);
};
</script>
<?php
// Do some HTML-markup
...
// Start the programs!
print "
<script type=\"text/javascript\">
run_analyses('".$params."');
</script>
<form action=\"results.html" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
";
?>
</body>
</html>
and execute.php contains the C++ program calls, the waiting routines and finally, via include("results.php"), the creation of the results page.
Again, for "not so long" program executions, the autosubmission works as expected, but not if it takes "longer". By "longer" I mean around 25 minutes.
I have absolutely no idea what could cause this as again, there are no error-messages to be found.
Am I missing a crucial configuration option there (apache, php, etc.)?
EDIT
As it turned out, letting the requested PHP script "echo" something repeatedly prevents the timeout. So it is basically the same as the PHP solution without AJAX, but this time, since the responseText of the AJAX request is not necessarily needed, the progress page is not cluttered and it may be used as a workaround. Specifically, I would not necessarily recommend it as a general solution or good practice.
It occurs to me that a better approach would be to:
Output the complete HTML page
Show a loading message to the user
Send an AJAX request to start the external program
Wait for callback (waiting for external program to finish)
Repeat steps 3 and 4 until all programs have been executed
Update the page to tell the user what is going on
Submit the form
This way, you get the HTML to the user as quickly as possible, then you execute the programs sequentially in an orderly and controlled fashion without worrying about hitting the max_execution_time threshold. This also enables you to keep your user informed - after each AJAX callback, you can tell the user that "program ABC has completed, starting DEF..." and so on.
EDIT
Per request, I'll add an outline of how this could be implemented. A caveat, too: If you are going to be adding more javascript-derived functionality to your page, you'll want to consider using a library like jQuery or mootools (my personal favorite). This is a decision you should make right away - if you aren't going to be doing a lot of javascript except this, then a library will only bloat your project, but if you are going to be adding a lot of javascript, you don't want to have to come back later and re-write your code because you add a library 3/4 of the way through the project.
I've used mootools to create this demonstration, but it isn't necessary or even advisable to add mootools if this is the only thing you're going to use it for. It is simply easier for me to write an example really quick without having to stop and think :)
First, the main page. We'll call this page view.php. This should contain your initial HTML as well as the javascript that will fire off the AJAX requests. Basically, this entire jsFiddle would be view.php: http://jsfiddle.net/WPnEy/1/
Now, execute.php looks like this:
$program_name = isset($_POST['program_name']) ? $_POST['program_name'] : false;
switch ($program_name) {
case 'program1':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 1';
break;
case 'program2':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 2';
break;
case 'program3':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 3';
break;
case 'program4':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 4';
break;
default:
die(json_encode(array(
'program_name'=>'Invalid',
'status'=>'FAILED',
'error'=>true,
'error_msg'=>'Invalid program'
)));
break;
}
$pid = run_in_background($program_path);
while (is_running($pid)) {
sleep(1);
}
// check here for errors, get any error messages you might have
$error = false;
$error_msg = '';
// use this for failures that are not necessarily errors...
$status = 'OK';
die(json_encode(array(
'program_name'=>$friendly_name,
'status'=>$status,
'error'=>$error,
'error_msg'=>$error_msg
)));
execute.php would then be called once for each program. The $friendly_name variable gives you a way to send back something for the user to see. The switch statement there makes sure that the script isn't being asked to execute anything you aren't expecting. The program is executed, and when it is done you send along a little package of information with the status, the friendly name, any errors, etc. This comes into the javascript on view.php, which then decides if there are more programs to run. If so, it will call execute.php again... if not, it will submit the form.
This seems rather convoluted... And very risky. Any network glitches, the user's browser closing for whatever reason, and even a firewall timing out, and this script is aborted.
Why not run the whole thing in the background?
<?php
session_start();
$_SESSION['background_run_is_done'] = false;
session_write_close(); // release session file lock
set_time_limit(0);
ignore_user_abort(true); // allow job to keep running even if client disconnects.
// ... your external stuff here ...
if ($successfully_completed) {
session_start(); // re-open session file to update value
$_SESSION['background_run_is_done'] = TRUE;
}
// ... use curl to submit a job-completion POST here ...
?>
This disconnects the state of the user's browser from the processing of the jobs. You then just have your client-side code ping the server occasionally to monitor the job's progress.
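The polling endpoint can be trivial; a hypothetical status.php (the file name and JSON shape are just for illustration) only has to read the flag the background script sets:
<?php
session_start();
session_write_close(); // read-only, so release the session lock immediately
header('Content-Type: application/json');
echo json_encode(array(
    'done' => !empty($_SESSION['background_run_is_done']),
));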
Launching and managing multiple and long-running processes from a webserver PHP process is fraught with complications and complexity. It's also very different on different platforms (you didn't say which you are using).
Handling the invocation of these processes synchronously from the execution of your PHP is not the way to address this. You really need to run the programs in a separate session group and use (e.g.) AJAX or Comet to poll their status.
I have a countdown timer, and I tried to block a simple bypass that lets people download without waiting.
So in the main class I declared the boolean:
$allow_download = false;
and in JavaScript, when the time has elapsed:
else
{
textDLShow.style.display = 'none';
divDLShow.style.display = '';
"<?php $allow_download = true;?>";
}
and in the second class:
if($allow_download == false)
echo "Test";
Well, when the time has elapsed, the boolean is not set to a positive value. Any suggestions?
Thanks for your time!
JavaScript cannot set the value of a PHP variable since the JavaScript interpreter cannot parse PHP (nor is the variable in the same interpreter anyway, since the JavaScript is run on the client's browser rather than on the server.)
In order to do this you will need to make a new request with JavaScript that your PHP code can read in order to set $allow_download to true and then serve up the download.
You'll want to read up on ajax, document.createElement (because one way to do this might be to create an iframe pointing at the download location after the time has elapsed) and setTimeout.
PHP is run on your server, the variables you create are no longer in scope when your client side javascript runs. There is no way to fix this unless you want to cook up some heavier stuff like storing in a database which IP addresses can download which files at what time.
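A hedged sketch of keeping the check entirely server-side, in the spirit of the answers above (the file names, the session key and the 30-second wait are assumptions): the page that renders the timer sets $_SESSION['countdown_started_at'] = time(); when it is served, and the download script refuses to serve the file until the wait has elapsed.
<?php
// download.php (hypothetical) - only serves the file after the countdown
session_start();

$wait_seconds = 30; // must match the JavaScript countdown length
if (!isset($_SESSION['countdown_started_at'])) {
    die('No countdown was started');
}
if (time() - $_SESSION['countdown_started_at'] < $wait_seconds) {
    die('Please wait for the timer to finish');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.zip"');
readfile('/path/to/file.zip');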
PHP can output JavaScript, but JavaScript can never run PHP; take that for granted.
The best thing you can do is use an AJAX request to call the PHP in the background.
But really, why bother with all that? You can include the JavaScript in a PHP function and let it do the job.