How long does ignore_user_abort(true); persist?

I made a script that shouldn't return anything to the browser (no echo, print, or interruptions of the code with blank space, like ?> <?), and that uses ignore_user_abort(true); to prevent the process from stopping once the browser window is closed.
Thus, once the script is launched, it should run to the end.
The script is designed for a newsletter, and it sends one email every 5 seconds through mail(); to respect my provider's spam policies.
That said, what happens is that after about 20 minutes of running (the total number of emails is 1002), the script "collapses", with no error returned.
Hence my question: is there a lifetime limit for scripts running with ignore_user_abort(true);?
EDIT
Following Hanky's suggestion (below), I added the line:
set_time_limit(0);
but the issue persists.

So whilst ignore_user_abort(true); will prevent the script from stopping after a visitor browses away from the page, it is set_time_limit(0); that removes the time limit. You can also change the PHP memory_limit in your php.ini, or by setting something like php_value memory_limit 2048M in your .htaccess file.
To check the current defaults you can run echo ini_get('max_execution_time'); (in seconds) or echo ini_get('memory_limit'); (e.g. 128M).
This being said, it sounds like your PHP script is better suited to being run from the CLI. From the command line you can run PHP scripts without serving anything to a web browser, which fits your usage: from what you have described, the script operates a background process and doesn't really need to return a front-end to the user.
You can run a file from the command line simply by running php script.php or php -f script.php.
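For illustration, a minimal sketch of such a CLI-run mailer, assuming a hypothetical recipients.txt file with one address per line (the filenames and message body are placeholders, not part of the original question):
<?php
// send_newsletter.php - run with: php -f send_newsletter.php
set_time_limit(0);       // remove the time limit (already 0 under the CLI)
ignore_user_abort(true); // only matters if the script is ever run via a browser

// Hypothetical input file: one recipient address per line
$recipients = file('recipients.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($recipients as $address) {
    mail($address, 'Newsletter', 'Placeholder newsletter body.');
    sleep(5); // one email every 5 seconds, per the provider's spam policy
}

error_log('Newsletter finished: ' . count($recipients) . ' emails sent.');
?>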

Initially there was no way to solve the issue, and the provider is still investigating.
Meanwhile, following your suggestions, I was able to make it run. I created a TEST file and fired it to verify:
exec("/php5.5/bin/php -f /web/htdocs/www.mydomain.tld/home/test.php > /dev/null 2>&1 &");
It worked. I set up a sleep(600); and sent 6 emails, plus one that informs me when the process is really finished.
It runs transparently till the end.
Thank you so much for your support

Related

Detect when a script delays and allow it to skip or continue in PHP

I have a file, let's say file1.php, that within the script executes a file using exec("php-cli -f _DAEMON.php"). After executing the exec() command it needs to run more code, but the problem is that _DAEMON.php, as its name says, is a daemon and will never stop running, so it freezes file1.php without allowing the rest of the code to run.
Is there a way to allow the code to continue executing even if exec("php-cli -f _DAEMON.php") has not finished? Or to detect if the code delays for more than x seconds/milliseconds, and then continue?
Thanks.
Maybe try using a socket (curl might work with a low timeout; not sure offhand whether it'll kill the script, though). Not ideal, and it will add some overhead.
http://phplens.com/phpeverywhere/?q=node/view/254
Also, doriana_gd was probably referring to something like node.js, server side javascript
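Alternatively, a minimal sketch of the fire-and-forget pattern used elsewhere on this page, assuming a Unix-like server: redirecting the daemon's output and backgrounding it with & lets exec() return immediately, so file1.php keeps going.
// file1.php
// Redirecting stdout/stderr is essential: exec() otherwise waits for the
// daemon's output streams to close, which for a daemon is never.
exec('php-cli -f _DAEMON.php > /dev/null 2>&1 &');

// The rest of file1.php now runs without waiting for _DAEMON.php.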

php script execution for heavy script file

I have a project which I developed in a PHP framework named CodeIgniter. It contains a script file which creates 300,000 PDFs of DB records for my client. The problem is that I have to run the code in chunks, because otherwise it gives errors: sometimes the server times out, sometimes it runs out of memory. I tried fixing the errors by changing php.ini settings, but in vain; I still have to make the PDFs in chunks, using a limit of 7000 with an offset. If I increase the limit/offset it fails to run and gives me an error. It's very hard to sit through 300,000 records and generate PDFs for them. I just want to run it in one single chunk. Please give me a solution, I really need it. Thanks in advance.
I would recommend using a separate shell script for this, or if you insist on using a PHP script, use it as you would use a shell script. You can just "fire and forget" the script so it runs in the background, without affecting anything else and without eating resources off your web application. That would be something like:
shell_exec('yourscript.sh > /dev/null 2>/dev/null &');
or with a PHP script:
shell_exec('php yourscript.php > /dev/null 2>/dev/null &');
Note that in the above I'm redirecting both stdout and stderr to /dev/null. If you want them, you'd rather redirect them somewhere else.
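For instance, a sketch that appends both streams to a log file instead (the log path is just an example):
shell_exec('php yourscript.php >> /tmp/yourscript.log 2>&1 &');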

Not Waiting for Response from an AJAX Request

Suppose I make an AJAX HTTP Request from jQuery to a backend PHP script. The request is made, the PHP script starts running and doing its magic. Suppose I then change to another website, away from the site where the original AJAX Request was made. As well, I do this before the PHP script finishes and has time to do a HTTP Response back. Does the PHP script finish running and doing its thing even though I've switched to another website before I got the HTTP Response?
So the order is this.
I'm on website www.xyz.com
I have a jQuery handler that kicks off an AJAX request to blah.php
blah.php starts running
I go to website www.abc.com soon after without waiting for a response from blah.php
What's going on with blah.php? Is execution still going on? Did it stop? I mean it didn't get a chance to respond so...
This may depend on your server configuration, but in general the script will continue to execute despite a closed HTTP connection.
I have tested this with Apache 2 + PHP 5 as mod_php. I would expect similar behaviour with PHP as CGI and with other webservers but do not know for certain.
The best way to determine for certain on your configuration is, as #tdammers suggests: set up a test script something like the following and monitor the log.
<?php
error_log('Test script started.');
for ($i = 1; $i < 13; $i++) {
    sleep(10);
    error_log('Test script got to ' . (10 * $i) . ' seconds.');
}
error_log('Test script got to the end.');
?>
Access this script (at /test.php or whatever) then before you get any results, hit stop on your browser. This is equivalent to navigating away before your XHR returns. You could even have it as the target of an XHR and navigate away.
Then check your error log: you should have a start and then messages every 10 seconds for two minutes and an end. You can modify how high $i gets to ensure your script will reach its anticipated maximum execution time if you'd like to test that too.
You don't have to use error_log() - you could write to a file, or make some other persistent change on the server that can be checked without needing to keep the client connection open.
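For example, a variant of the same test that appends to a file instead, assuming /tmp/abort-test.log is writable by the webserver (the path is illustrative):
<?php
// Same idea as above, but the evidence ends up in a plain file.
file_put_contents('/tmp/abort-test.log', "started\n", FILE_APPEND);
for ($i = 1; $i < 13; $i++) {
    sleep(10);
    file_put_contents('/tmp/abort-test.log', 'got to ' . (10 * $i) . " seconds\n", FILE_APPEND);
}
file_put_contents('/tmp/abort-test.log', "finished\n", FILE_APPEND);
?>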
Script execution may also stop before then because of the max_execution_time php.ini directive - but in any case that limit is distinct from the webserver's own timeout.
Try
ignore_user_abort(true);
It should not abort the processing of your code.
You might want to check out the answers to this question.
Basically, when your AJAX call hits a PHP script which calls exec() as shown in the answers to that question, you'll get an AJAX response almost immediately, since the PHP script doesn't actually need to process anything itself. This way, it shouldn't matter if the user leaves the page.
Here's a small example:
AJAX call in the HTML file: $.ajax({url: 'blah.php'});
blah.php file: exec('bash -c "exec nohup setsid php really_slow_script.php > /dev/null 2>&1 &"');
And then finally in really_slow_script.php, just include the actual code you want to run.
I successfully used this kind of logic to allow users to post an already-uploaded video from their account on my website to YouTube. (The video had to be sent to YouTube, and since videos are generally large files, I didn't want the user to have to wait while the upload took place.)
Navigating away will trigger a disconnect message on the server. The implications of that depend entirely on what your server has been configured to do.
By default, the server will be set up so that a disconnect will not interrupt the way the program functions. It is possible, however, to make it so that a user disconnect triggers the function registered with register_shutdown_function, garbage collection occurs, and the script terminates.
Because it is something which can be configured in several different places, it might be easiest to just run a test, but this is a php.ini directive. If you want to configure this on a global level, you can set ignore_user_abort = Off in php.ini. If you want this on a site-specific level, you can use php_value ignore_user_abort off in the .htaccess file in the parent directory of the current site. Otherwise you can use ignore_user_abort(false);.
Of course, there is no guarantee on a shared server that you have control of htaccess or php.ini, so you might just need to use ignore_user_abort(false);.
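A minimal sketch for observing the disconnect behaviour yourself, assuming a setup where aborts are honoured (the log message and loop length are arbitrary):
<?php
ignore_user_abort(false); // let a client disconnect terminate the script

register_shutdown_function(function () {
    // Runs on normal completion and, with ignore_user_abort off,
    // when the script is killed by a client disconnect.
    error_log('Shutdown; aborted=' . (connection_aborted() ? 'yes' : 'no'));
});

for ($i = 0; $i < 60; $i++) {
    echo "tick\n";
    flush(); // PHP only notices the disconnect when it tries to send output
    sleep(1);
}
?>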

PHP CLI script not timing out

We have a Node.js script that executes the following command:
/usr/local/bin/php -q /home/www/441.php {"id":"325241"}
This script does a lot of things, however it does not seem to respect the time limit. The first line of this file is:
set_time_limit(1800);
Yet if we check what processes are running on the server (ps -aux | grep php), we see a lot of these commands that have been open since last week.
Any ideas on how we can clean this up?
I found the following comment on the PHP user guide for max_execution_time:
"Keep in mind that for CLI SAPI max_execution_time is hardcoded to 0. So it seems to be changed by ini_set or set_time_limit but it isn't, actually. The only references I've found to this strange decision are deep in bugtracker (http://bugs.php.net/37306) and in php.ini (comments for 'max_execution_time' directive)."
So it would seem that there's a bug in the CLI module that means max_execution_time is effectively ignored.
The commenter mentioned a page in the bug tracker about this at http://bugs.php.net/37306 but the tracker seems to be down.
set_time_limit only has meaning for the PHP part of the program. If you had a query on a database that takes 5 hours to finish, those 5 hours are not counted by PHP, so they fall outside the scope of the set_time_limit limitation. Having said that, it seems weird that a PHP process is still running after a week, unless it is calling another program that runs forever (which set_time_limit does not affect either).
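A quick sketch that makes this visible, assuming a Unix-like system and a web-served script (recall the CLI quirk quoted above); on Unix, time spent sleeping or waiting on another process is not charged against the limit:
<?php
set_time_limit(2); // two seconds of script execution time

sleep(10);              // not counted: the time passes inside a system call
shell_exec('sleep 10'); // not counted either: the time is spent in a child process

echo "Still alive after 20+ wall-clock seconds.\n";
?>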
Also, what does the -q flag do? I can't find it in man php, in php --help, or in PHP's command-line options.
If you start the script from Node.js, why not kill it from there too, after 1800 seconds?
var pid = startPHPProcess();
setTimeout(function () {
    killPHPProcess(pid);
}, 1800 * 1000); // setTimeout takes milliseconds, not seconds

Run a PHP-script from a PHP-script without blocking

I'm building a spider which will traverse various sites and mine data from them.
Since I need to fetch each page separately, this could take a VERY long time (maybe 100 pages).
I've already set set_time_limit to 2 minutes per page, but it seems like Apache will kill the script after 5 minutes no matter what.
This isn't usually a problem since it will run from cron or something similar which does not have this time limit. However, I would also like the admins to be able to start a fetch manually via an HTTP interface.
It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while with AJAX.
My problem is how to start the fetch from within a PHP script without the fetch being terminated when the script calling it dies.
Maybe I could use system('script.php &') but I'm not sure it will do the trick.
Any other ideas?
$cmd = "php myscript.php $params > /dev/null 2>/dev/null &";
# when we call this particular command, the rest of the script
# will keep executing, not waiting for a response
shell_exec($cmd);
What this does is send all of STDOUT and STDERR to /dev/null, and your script keeps executing. Even if the 'parent' script finishes before myscript.php, myscript.php will finish executing.
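Since the asker plans to check back with AJAX, one option is a sketch like the following, where the background script writes its progress to a status file that a polled front-end script can read (the filenames and page count are illustrative, not from the original answer):
<?php
// myscript.php - the long-running fetch, launched in the background
$total = 100; // e.g. number of pages to fetch
for ($page = 1; $page <= $total; $page++) {
    // ... fetch and mine one page here ...
    file_put_contents('/tmp/spider-status.txt', "$page/$total");
    sleep(1); // placeholder for the real per-page fetch time
}
file_put_contents('/tmp/spider-status.txt', 'done');
?>
A status script polled by AJAX can then simply echo file_get_contents('/tmp/spider-status.txt');.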
If you don't want to use exec(), you can use a PHP built-in function:
ignore_user_abort(true);
This will tell the script to keep running even if the connection between the browser and the server is dropped. ;)
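Putting the pieces from this page together, a minimal sketch of a browser-launched job that survives the visitor leaving (the loop body is a placeholder for real work):
<?php
ignore_user_abort(true); // keep running after the visitor closes the tab
set_time_limit(0);       // remove the execution time limit

echo "Job started.\n";
flush(); // let the browser see something before the long work begins

for ($step = 1; $step <= 100; $step++) {
    sleep(1); // placeholder for the real per-step work
}

error_log('Background job finished.'); // verify completion without a browser
?>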
