It seems PHP scripts continue running even after I close or stop a page.
For example, if my script is run via
/localhost/test.php
and I then close the page, it sometimes keeps running. I'm pretty sure restarting Apache clears it out, but is there a better way to terminate a PHP script after it has started?
You could use ignore_user_abort, but I think it only applies if you are running PHP as a command-line script.
When running scripts from the browser, I have seen them keep running even after the browser was closed. I did not investigate how long they kept running.
For scripts that do heavy processing and can run for many minutes, I used to add a control like the one below:
A flag variable is stored in a database table.
A way is provided to set the flag On or Off (1 or 0).
The script checks the flag and stops running if the flag is found to be Off (0).
One way to check the flag is inside a loop, as below:
while (true)
{
    /* Your code here, which probably does a lot of processing */
    // Re-read $flag from the database on each iteration before checking it
    if ($flag === false) break; // Or exit if you prefer
}
If you have code that needs to run after the loop, use break. If you want to stop the script abruptly, use exit in its place instead.
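As a rough sketch of that pattern (my own illustration; the table and column names are made up), the flag can be re-read from the database on every pass:
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
while (true) {
    /* Your code here, which probably does a lot of processing */

    // Re-read the flag; stop when it has been switched Off (0)
    $flag = (int) $pdo->query("SELECT value FROM script_flags WHERE name = 'keep_running'")->fetchColumn();
    if ($flag === 0) break; // Or exit if you prefer
}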
Check this out: http://php.net/manual/en/function.ignore-user-abort.php
Sets whether a client disconnect should cause a script to be aborted.
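As a rough illustration (my own sketch, not from the manual): PHP only notices the disconnect when it tries to send output, so a long-running script can be made to stop shortly after the client leaves by emitting and flushing a byte between units of work. doMoreWork() below is a hypothetical placeholder:
ignore_user_abort(false); // the default: let a failed send to the client abort the script
while (doMoreWork()) {    // doMoreWork() is a hypothetical unit of processing
    echo ' ';
    if (ob_get_level() > 0) {
        ob_flush();       // flush PHP's output buffer if one is active
    }
    flush();              // try to send the byte; if the client is gone, the script is aborted
}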
Related
I have PHP code that does some tasks.
Let's say someone executes the code by visiting https://localhost/code.php.
I have an employee who executes the script over cURL from a separate server. What is the best way to prevent him from launching the script a second time before the already running script has actually completed?
TLDR: I need a way to make the second launch wait (sleep for a few seconds, or until the first task completes) while the currently running task finishes.
TLDR2: Looking for a function that does this.
Any ideas? Thanks.
While a session won't work with cURL, the idea is valid -- you need to set something persistent outside of your script. So, how about writing to a local file, or writing to a database?
if (file_exists('lock.txt')) die;
file_put_contents('lock.txt', 'This file prevents script execution', LOCK_EX);
// ... your script code here ...
unlink('lock.txt');
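A variation (my own sketch, not part of the original answer) avoids the race between the file_exists() check and the write by holding an exclusive flock() for the duration of the script:
$lock = fopen('lock.txt', 'c');            // create the lock file if it does not exist
if (!flock($lock, LOCK_EX | LOCK_NB)) {    // try to take an exclusive, non-blocking lock
    die('Already running!');
}
// ... your script code here ...
flock($lock, LOCK_UN);                     // release the lock
fclose($lock);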
If you know that there is only one user who will hit your server, you can simply use session data.
<?php
session_start();
if (($_SESSION["NOT_FINISHED"] ?? false) === true) {
    die("Previous job is not finished yet!");
} else {
    $_SESSION["NOT_FINISHED"] = true;
    // start whatever job needs to be done here
    // ...
    // when the job is done and finished, release the busy flag
    unset($_SESSION["NOT_FINISHED"]);
}
I need a function that executes by itself in PHP without the help of cron. I have come up with the following code, which works well for me, but since it is a never-ending loop, will it cause any problems for my server or script? If so, could you give me some suggestions or alternatives, please? Thanks.
$interval = 60; // minutes
set_time_limit(0);
while (1) {
    $now = time();
    # do the routine job, trigger a php function and what not.
    sleep($interval * 60 - (time() - $now));
}
We have used an infinite loop in a live system environment to wait for incoming SMS and then process it. We found that doing it this way makes the server resource-intensive over time, and we had to restart the server in order to free up memory.
Another issue we encountered: when you execute a script with an infinite loop from your browser, it will continue to run even if you hit the stop button, unless you restart Apache.
while (1) { // infinite loop
    // write code to insert text into a file;
    // the file size will continue to grow
    // even when you click 'stop' in your browser
}
The solution is to run the PHP script as a daemon on the command line. Here's how:
nohup php myscript.php &
The & puts your process in the background.
Not only did we find this method to be less memory-intensive, but you can also kill it without restarting Apache by running the following command:
kill processid
Edit: As Dagon pointed out, this is not really the true way of running PHP as a 'daemon', but using the nohup command can be considered the poor man's way of running a process as a daemon.
You can use the time_sleep_until() function. It makes the script sleep until the given timestamp and returns TRUE on success or FALSE on failure.
$interval = 60; // minutes
set_time_limit(0);
while (1) {
    // compute the next wake-up timestamp; the loop pauses until that time
    // and runs again once it finishes sleeping
    $wakeup = time() + $interval * 60;
    # do the routine job, trigger a php function and what not.
    time_sleep_until($wakeup);
}
There are many ways to create a daemon in PHP, and there have been for a very long time.
Just running something in the background isn't good enough. If it tries to print something after the console has closed, for example, the program dies.
One method I have used on Linux is pcntl_fork() in a php-cli script, which basically splits your script into two PIDs. Have the parent process kill itself, then have the child process fork itself again and have that parent kill itself too. The child process is now completely divorced and can happily hang out in the background doing whatever you want it to do.
$i = 0;
do{
$pid = pcntl_fork();
if( $pid == -1 ){
die( "Could not fork, exiting.\n" );
}else if ( $pid != 0 ){
// We are the parent
die( "Level $i forking worked, exiting.\n" );
}else{
// We are the child.
++$i;
}
}while( $i < 2 );
// This is the daemon child, do your thing here.
Unfortunately, this model has no way to restart itself if it crashes, or if the server is rebooted. (This can be resolved through creativity, but...)
To get the robustness of respawning, try an Upstart script (if you are on Ubuntu.) Here is a tutorial - but I have not yet tried this method.
while(1) means it is an infinite loop. If you want to break out of it, you should use break with a condition.
e.g.:
while (1) { // infinite loop
    $now = time();
    # do the routine job, trigger a php function and what not.
    sleep($interval * 60 - (time() - $now));
    if ($condition) break; // it will break out of the loop when $condition (your stop condition) is true
}
I have a generic PHP maintenance script that sets a database value to prevent running at the same time as another instance. Usually run via cronjob.
I cleared the database value and tried running this script manually with the cronjob turned off. Every time I run it from the browser, it terminates immediately stating it is already running.
The script will run for about 30 seconds as a background process and then terminate automatically, as if PHP detected that the browser was closed (it should take about 15 minutes to complete).
So I added code to echo when the database value is set or read. It never echoes when it's set, only when it's read, but I can see the database value is stored each time.
The script always finishes as expected if run from cron.
What could be going on? Could the server be executing scripts twice on each browser-based request?
The server runs Hive, so different directories can have different PHP versions. I don't know whether this could have something to do with it.
PHP 5.2.17 (default)
PHP 5.3.27 (dir this script is in)
Apache 2.2.25
The code that dictates whether it runs is simply this:
$DB = new DbConnector($db_name, $db_user, $db_pass);
if ($DB->queryOne("SELECT COUNT(*) FROM data_vars WHERE name = 'maintenance_running'")) {
exit('Already running!');
} else {
$DB->query("INSERT INTO data_vars (name, value) VALUES ('maintenance_running', 1)");
}
At the end of the script, the value is cleared. Again, this problem only happens when run from the browser.
You should set a longer execution time limit with:
set_time_limit(30 * 60); // 30 minutes
Also, echo probably is working, but the response is not sent to the browser until the whole script finishes executing. That is why you can't see the output: the script is killed by the execution limit before it ends and sends the response.
You can try sending the response in parts with the help of ob_start and ob_flush. You can read more about the ob_* functions here.
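A rough sketch of partial flushing (my own illustration, not from the original answer; $workItems and process() are hypothetical):
ob_start();                       // start an output buffer
foreach ($workItems as $item) {   // $workItems and process() are hypothetical placeholders
    process($item);
    echo "Processed one item<br>";
    ob_flush();                   // flush PHP's buffer
    flush();                      // push the partial output to the browser
}
ob_end_flush();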
I've been completely unsuccessful finding an answer to this question. Hopefully someone here can help.
I have a PHP script (a WordPress template, to be specific) that automatically imports and processes images when a user hits it. The problem is that the image processing takes up a lot of memory, particularly if multiple users are accessing the template at the same time and initiating the image processing. My server crashed multiple times because of this.
My solution to this was to not execute the image-processing function if it was already running. Before the function started running, I would check a database entry named image_import_running to see if it was set to false. If it was, the function then ran. The very first thing the function did was set image_import_running to true. Then, after it was all finished, I set it back to false.
It worked great -- in theory. The site hasn't crashed since, I can tell you that. But there are two major problems with it:
If the user closes the page while it's loading, the script never finishes processing the images and therefore never sets image_import_running back to false. The template will never process images again until it's manually set to false.
If the script times out while it's processing images -- and that's a strong possibility if there are many images in the queue -- you have essentially the same problem as No. 1: the script never gets to the point where it sets image_import_running back to false.
To handle No. 1 (the first of the two problems I noticed), I added ignore_user_abort(true) to the script. Did it work? I don't know, because No. 2 is still an issue. That's where I'm stumped.
If I could ask the server whether the script was running or not, I could do something like this:
if($import_running && $script_not_running) {
$import_running = false;
}
But how do I set that $script_not_running variable? Beats me.
I've shared this entire story with you just in case you have some other brilliant solution.
Try using ignore_user_abort(true); the script will continue to run even if the person leaves and closes the browser.
You might also want to put a number instead of true/false in the DB record and set a maximum number of processes that can run together, as sketched below.
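For example, a counter-based limit could look roughly like this (my own sketch; the option name is made up, and note that the read-modify-write below is not atomic, so it is only a rough guard):
$max     = 3; // maximum number of concurrent imports allowed
$running = (int) get_option('image_imports_running', 0); // hypothetical option name
if ($running >= $max) {
    exit('Too many imports are already running');
}
update_option('image_imports_running', $running + 1);
// ... do the image processing ...
update_option('image_imports_running', max(0, (int) get_option('image_imports_running', 1) - 1));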
As others have suggested, it would be best to move the image processing out of the request itself.
As an interim "fix", store a timestamp alongside image_import_running when a processing job begins (e.g., image_import_commenced). This is a very crude mechanism, but if you know the maximum time that a job can run before timing out, the script can check whether that period of time has elapsed.
e.g., if image_import_running is still true but the current time is more than 10 minutes since image_import_commenced, run the processing anyway.
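A minimal sketch of that stale-lock check, assuming the two values are stored as WordPress options (the 10-minute limit is just an example):
$running   = get_option('image_import_running', false);
$commenced = (int) get_option('image_import_commenced', 0);
$max_age   = 10 * 60; // assume no legitimate job runs longer than 10 minutes

if (!$running || (time() - $commenced) > $max_age) {
    update_option('image_import_running', true);
    update_option('image_import_commenced', time());
    // ... run the image processing ...
    update_option('image_import_running', false);
}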
What about setting a transient with an expiry time that would throttle the operation?
if (!get_transient('import_running')) {
    set_transient('import_running', true, 30); // set a 30-second transient on the import
    run_the_import_function();
}
I would rather store the job in the database, flag it as pending, and set up a cron job to execute the processing one job at a time.
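A rough sketch of such a cron-driven worker (the table, columns, and credentials are made up):
// worker.php, run from cron (e.g. once a minute)
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$job = $pdo->query("SELECT id, image_url FROM import_jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    $pdo->prepare("UPDATE import_jobs SET status = 'processing' WHERE id = ?")->execute(array($job['id']));
    // ... process $job['image_url'] here ...
    $pdo->prepare("UPDATE import_jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
}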
For me, I just use this simple idea with a text document, for example a run.txt file.
At the top of the script use:
if (file_get_contents('run.txt') != 'run') { // here the script will do its work
    $file = fopen('run.txt', 'w+');
    fwrite($file, 'run');
    fclose($file);
} else {
    exit(); // if it finds 'run' in run.txt, the script will stop
}
And add this at the end of your script file:
$file = fopen('run.txt', 'w+');
fwrite($file, ''); // will clear the 'run' word for the next try ;)
fclose($file);
This checks whether the script is already working by looking at the contents of run.txt; if the word 'run' exists in run.txt, the script will not run.
Running a cron job would definitely be a better solution. The idea of storing the URL in a table is a good one.
To answer the original question, you may run a ps auxwww command with exec (check this page: How to get list of running php scripts using PHP exec()?) and move your function into a separate PHP file.
exec("ps auxwww|grep myfunction.php|grep -v grep", $output);
Just add the following at the top of your script.
<?php
// Ensures only a single instance of the script runs at a time.
$fileName = basename(__FILE__);
$output = shell_exec("ps -ef | grep -v grep | grep $fileName | wc -l");
//echo $output;
if ($output > 2)
{
    echo "Already running - $fileName\n";
    exit;
}
// Your php script code.
?>
The problem is that, for a long process, the PHP script keeps executing whether or not the client browser is still connected. Is there any way to have the script terminate on the server when the client has terminated the Ajax call to it?
As pointed out by @metadings, PHP does have a function to check for a connection abort, named connection_aborted(). It returns 1 if the connection is terminated, otherwise 0.
In a long server-side process, one may need to know whether the client has disconnected from the server or closed the browser, so that the server can safely shut down the process.
This is especially true when the application uses PHP sessions: if we leave the long process running after the client has disconnected, the server becomes unresponsive for that session, and any other request from the same client will wait until the earlier process completes. The reason is that the session file is locked while the process is running. You can intentionally call session_write_close() to unlock it, but this is not feasible in all scenarios; maybe you need to write something to the session at the end of the process.
Now, if we only call connection_aborted() in a loop, it will always return 0 whether the connection is closed or not (0 means the connection is not aborted), which is misleading. However, after research and experiments, I have discovered that the output buffer in PHP is the reason.
First of all, in order to check for aborts, the developer must send some output to the client inside the loop by echoing some text. For example:
print " ";
While the process is still running, this output will not actually be sent to the client. To send it, we need to flush the output buffer:
flush ();
ob_flush ();
If we then check for aborts, we get correct results:
if (connection_aborted () != 0) {
die();
}
The following is a working example; it will work even if you are using a PHP session:
session_start ();
ignore_user_abort ( TRUE );
file_put_contents ( "con-status.txt", "Process started..\n\n" );
for($i = 1; $i <= 15; $i ++) {
print " ";
file_put_contents ( "con-status.txt", "Running process unit $i \n", FILE_APPEND );
sleep ( 1 );
// Send output to client
flush ();
ob_flush ();
// Check for connection abort
if (connection_aborted () != 0) {
file_put_contents ( "con-status.txt", "\nUser terminated the process", FILE_APPEND );
die ();
}
}
file_put_contents ( "con-status.txt", "\nAll units completed.", FILE_APPEND );
EDIT 07-APR-2017
If someone is using FastCGI on Windows, they can actually terminate the CGI thread in memory when the connection is aborted, using the following code:
if (connection_aborted () != 0) {
apache_child_terminate();
exit;
}
Check out PHP's connection_aborted() function. While doing your processing, you can periodically check for an aborted connection to gracefully cancel the work, as one would do in an interactive threading model.
The easiest way I've found to handle this time-out issue is as follows (a rough sketch of the status endpoint follows the list):
1: Set a value on the server to 'processing'. Start an independent thread to do the processing.
2: The initial Ajax call returns success.
3: The JavaScript on the page goes into 'waiting mode', sending a new Ajax request every 10, 30, or 60 seconds, or every five or ten minutes, or whatever (depending on your situation), to find out whether the value on the server is still set to 'processing'.
4: The independent thread completes. It sets the value on the server to 'done'.
5: The JavaScript on the page makes its next waiting-mode query, which returns 'done' and the appropriate data.
4b: If an obscene amount of time goes by without a 'done', it registers as a failure. How much time is obscene depends upon your situation. Send an Ajax call updating the value from 'processing' to 'cancel'.
5b: The independent thread periodically checks the status to make sure it's still set to 'processing'. If it sees a shift to 'cancel', it cancels itself.
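A minimal sketch of the status endpoint that the waiting-mode JavaScript could poll (the file name, the status store, and the JSON shape are all assumptions, not part of the original answer):
// status.php -- polled by the page's waiting-mode Ajax requests
header('Content-Type: application/json');
$statusFile = 'job-status.txt';   // hypothetical shared status store written by the worker
$status = file_exists($statusFile) ? trim(file_get_contents($statusFile)) : 'unknown';

if (isset($_GET['cancel']) && $status === 'processing') {
    file_put_contents($statusFile, 'cancel');   // the worker checks this periodically (step 5b)
    $status = 'cancel';
}
echo json_encode(array('status' => $status));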
This is what you're looking for:
http://php.net/manual/en/function.ignore-user-abort.php
Stopping a script in the middle of execution can lead to unexpected results, so caveat emptor.