I know people usually complain about scripts not working, but here is a case where a script keeps working even when I want it to stop.
I have a CSV parser that analyzes lines and inserts entries into a DB table. I am using PDO and Zend Framework for the project. The code works fine... too fine, in fact.
public function save()
{
    $memory_limit = ini_get('memory_limit');
    ini_set('memory_limit', '512M');

    $sql = "
        INSERT INTO my_table (
            date_start,
            timeframe,
            type,
            country_to,
            country_from,
            code,
            weight,
            value
        ) VALUES (?,?,?,?,?,?,?,?)
        ON DUPLICATE KEY UPDATE
            weight = VALUES(weight),
            value = VALUES(value)
    ";

    if ($this->test_mode) {
        echo $sql;
        return;
    }

    $stmt = new Zend_Db_Statement_Pdo($this->_db, $sql);
    foreach ($this->parsed_data as $entry) {
        $stmt->execute(array_values($entry));
        $affected_rows = $stmt->rowCount();
        if ($affected_rows) {
            $this->_success = true;
        }
    }

    unset($this->parsed_data, $stmt, $sql);
    ini_set('memory_limit', $memory_limit);
}
The script takes several seconds to complete, as I am parsing a big file. The problem appears when I try to stop the script, with ESC or even by closing the page. The script does not stop until it finishes inserting all the entries. Not even an Apache reload fixes this; a full restart probably would.
I am thinking that this is not normal behaviour and maybe I am doing something wrong so I am asking for suggestions.
Thanks.
UPDATE
ignore_user_abort is off (the default behaviour), so the user abort should be taken into account.
I'm pretty sure that's standard PHP behaviour - just because the browser goes away doesn't mean PHP will stop processing the script. (Although restarting Apache, etc. will achieve this goal.)
To change this behaviour, you can use ignore_user_abort.
That said, "PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client", which I suspect may be the issue you're experiencing.
See the above link and the PHP runtime configuration information for more info.
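If instead you want the loop to notice the disconnect, one option is to push a byte to the client every so often and check connection_aborted() - a minimal sketch, where insert_entry() is a hypothetical stand-in for the $stmt->execute() call:

<?php
// Sketch: stop a long-running insert loop when the client disconnects.
// PHP only notices the disconnect while sending output, so we push a
// byte to the client every 100 iterations and then check the flag.
ignore_user_abort(false); // the default: the script may be aborted

$i = 0;
foreach ($entries as $entry) {
    insert_entry($entry); // hypothetical: stands in for $stmt->execute()

    if (++$i % 100 === 0) {
        echo ' ';  // send something so PHP re-checks the connection
        flush();   // push it out past PHP's output buffer
        if (connection_aborted()) {
            break; // client went away: stop inserting
        }
    }
}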
You are not doing anything wrong. Your attempts don't work because:
ESC - it is totally unrelated to the workings of the page; most browsers don't actually react to it
closing (or refreshing) the page - again, not related - the SERVER is doing the work, and PHP will NOT stop when the client side goes away; the server can't actually tell whether the client closed or refreshed the page
Apache reload - a reload won't kill the forked PHP process
an Apache restart WOULD do it - that kills the PHP processes - although it is rather heavy-handed
The way to handle this (if the long execution is undesirable) is to set an execution time limit with the PHP function set_time_limit(), or to make the parsing more efficient (if it isn't already).
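For instance (the 120-second cap is an arbitrary figure, not something from the question):

<?php
// Sketch: cap the runtime so a runaway import dies on its own.
set_time_limit(120); // fatal error once ~120s of execution time is used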
Related
Well, this shouldn't be happening - I'm sure there's a reason, but I can't find it!
I have a script that can take around 10 minutes to execute. It does a lot of communicating with an API on a service that we use, pulling a fingerprint of everything every 24 hours, so what exactly it does is beside the point. The problem I'm finding is that the script stops executing somewhat randomly!
I can't find any errors that would cause my script to stop executing, even with
//for debugging
error_reporting(E_ALL);
ini_set('display_errors', '1');
on for debugging, it's all clean. I've also used
set_time_limit(0);
so that it shouldn't ever time out.
With that said, I'm not sure how to get any more debug info to figure out why it's stopping. I can say that the script should NOT be hitting any memory limits or anything - that should throw an error anyway - and I've gone through and cleaned this script up as much as I can see to clean it up.
So my Question is: What are common causes for a cron ending when it shouldn't? How can I debug this more effectively?
You could try using register_shutdown_function() to define a block of code that will execute when the script shuts down. Then update a variable at the main execution points in the cron with details of what is going on; in the shutdown function, write that variable into a log and check the log to see what state the program was in when it stopped. Of course, this is based on the assumption that your code is not totally erroring out.
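A minimal sketch of that idea, assuming a writable log path; the state strings and file name are placeholders:

<?php
// Sketch: record the last known state when the script stops,
// whether it finished normally or died on a fatal error.
$GLOBALS['cron_state'] = 'starting';

function logShutdownState()
{
    $error = error_get_last(); // last PHP error, if any
    $line  = date('c') . ' stopped at: ' . $GLOBALS['cron_state']
           . ($error ? ' | last error: ' . $error['message'] : '') . "\n";
    file_put_contents('/tmp/cron_state.log', $line, FILE_APPEND);
}
register_shutdown_function('logShutdownState');

$GLOBALS['cron_state'] = 'calling API';
// ... the API work ...
$GLOBALS['cron_state'] = 'finished';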
You could also redirect the standard echo statements and logs into a log file:
php /path/to/cron.php > /path/to/log.txt 2>&1
2>&1 indicates that standard error (2) is redirected to the same file descriptor pointed to by standard output (&1). So both standard output and standard error end up in /path/to/log.txt.
UPDATE:
Below is a function/flow that I usually use in my crons:
function addLog($msg)
{
    if (empty($msg)) return;
    $handle = fopen('log.txt', 'a');
    $msg = $msg . "\r\n";
    fwrite($handle, $msg);
    fclose($handle);
}
Then I use it like so:
addLog("Initializing...");
init();
addLog("Finished initializing...");
addLog("Calling blah-blah API...");
$result = callBlahBlah();
addLog("blah-blah API returned value". $result);
It is more tedious to have all these logs, but when cron messes up, it really helps!
For example, when you look at your log.txt and you see something like:
Initializing...
Finished initializing...
Calling blah-blah API...
and there is no entry saying "blah-blah API returned value", then you know that the function call to blah-blah messed up.
What are common causes for a cron ending when it shouldn't?
The most common in my experience is that the cron user has different permissions or different environment variables than the way that you're executing it from the command line.
Make your cronned program dump its environment to a temporary file and see if it's what you expect.
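For example, a line like this at the top of the script (the path is arbitrary) lets you diff what cron sees against what your shell sees:

<?php
// Sketch: dump the environment the script actually runs with.
// Run once under cron and once from your shell, then compare the files.
file_put_contents('/tmp/cron_env.txt', print_r($_SERVER, true));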
I am unable to understand how to write and run a simple PHP script in FCGI mode. I am learning both Perl and PHP, and I got the Perl version of the FastCGI example below to work as expected.
Perl FastCGI counter:
#!/usr/bin/perl
use FCGI;

$count = 0;
while (FCGI::accept() >= 0) {
    print("Content-type: text/html\r\n\r\n",
          "<title>FastCGI Hello! (Perl)</title>\n",
          "<h1>FastCGI Hello! (Perl)</h1>\n",
          "Request number ", ++$count,
          " running on host <i>$ENV{'SERVER_NAME'}</i>");
}
Searching for something similar in PHP turned up talk about "fastcgi_finish_request", but I have no clue how to accomplish the counter example in PHP. Here is what I tried:
<?php
header("content-type: text/html");
$counter++;
echo "Counter: $counter ";
//http://www.php.net/manual/en/intro.fpm.php
fastcgi_finish_request(); //If you remove this line, then you will see that the browser has to wait 5 seconds
sleep(5);
?>
Perl is not PHP. That doesn't mean you can't often interchange things and port code between the two, but when it comes to runtime environments there are bigger differences that can't simply be swapped.
FastCGI operates at the request/protocol level, which is fully abstracted away by the PHP runtime, so in PHP you don't get the level of control you have in Perl with use FCGI;
Therefore you cannot just port that code.
Beyond that, fastcgi_finish_request is totally unrelated to the Perl code. You must have confused it with something, or thrown it in on the off chance it would help; either way, it's not really useful in this counter example.
PHP and HTTP are stateless.
All data is only relevant to the current, ongoing request.
If you need to keep state, consider storing the data in a cookie, session, cache or database.
So the implementation of this "counter" example will necessarily differ between Perl and PHP, and your usage of fastcgi_finish_request won't bring the functionality you expect from Perl.
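A minimal sketch of the counter using a session - note it counts per visitor (the session rides on a cookie), not per FastCGI process as the Perl version does:

<?php
// Sketch: the request counter, done the PHP way with a session.
session_start();
$_SESSION['count'] = isset($_SESSION['count']) ? $_SESSION['count'] + 1 : 1;

header('Content-Type: text/html');
echo 'Request number ' . $_SESSION['count'] .
     ' running on host <i>' . $_SERVER['SERVER_NAME'] . '</i>';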
fastcgi_finish_request solves a different problem. Think of a long-running calculation where you output data in the middle: with fastcgi_finish_request, the data produced so far is pushed to the browser while the long-running task keeps running.
The FastCGI connection is opened together with the PHP request. Normally the connection stays open until PHP finishes and is then closed, unless you hit PHP's execution timeout or FastCGI's connection timeout first. fastcgi_finish_request handles the case where the FastCGI connection to the browser is closed BEFORE PHP finishes execution.
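A minimal sketch of that case - fastcgi_finish_request() only exists under PHP-FPM, and do_long_running_task() is a hypothetical placeholder:

<?php
// Sketch: answer the browser right away, keep working afterwards.
echo 'Job accepted, crunching in the background...';
fastcgi_finish_request(); // response is flushed; connection is closed

// The client is gone, but this process keeps running to completion.
do_long_running_task(); // hypothetical placeholder for the real work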
Simple Hit Counter Example for PHP
<?php
$hit_count = #file_get_contents('count.txt'); // read count from file
$hit_count++; // increment hit count by 1
echo $hit_count; // display
#file_put_contents('count.txt', $hit_count); // store the new hit count
?>
Honestly, that's not even how you should do it using Perl either.
Instead, I'd recommend using CGI::Session to track session information:
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use CGI::Carp qw(fatalsToBrowser);
use CGI::Session;
my $q = CGI->new;
my $session = CGI::Session->new($q) or die CGI->Session->errstr;
print $session->header();
# Page View Count
my $count = 1 + ($session->param('count') // 0);
$session->param('count' => $count);
# HTML
print qq{<html>
<head><title>Hello! (Perl)</title></head>
<body>
<h1>Hello! (Perl)</h1>
<p>Request number $count running on host <i>$ENV{SERVER_NAME}</i></p>
</body>
</html>};
Alternatively, if you really want to go barebones, you could keep a local file as demonstrated in: I still don't get locking. I just want to increment the number in the file. How can I do this?
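The same concern applies to the PHP hit counter above; a sketch of that file-based counter with flock(), so two simultaneous requests can't both write the same value:

<?php
// Sketch: hit counter with an exclusive lock around read-modify-write.
$fp = fopen('count.txt', 'c+'); // create if missing, don't truncate
if ($fp && flock($fp, LOCK_EX)) {
    $count = (int) stream_get_contents($fp) + 1;
    ftruncate($fp, 0); // wipe the old value
    rewind($fp);
    fwrite($fp, (string) $count);
    flock($fp, LOCK_UN);
    fclose($fp);
    echo $count;
}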
I have a generic PHP maintenance script that sets a database value to prevent it running at the same time as another instance. It is usually run via cronjob.
I cleared the database value and tried running the script manually with the cronjob turned off. Every time I run it from the browser, it terminates immediately, stating it is already running.
The script will run for about 30 seconds as a background process and then terminate automatically, as if PHP detected that the browser was closed (it should take about 15 minutes to complete).
So I added code to echo when the database value is set or read. It never echoes when the value is set, only when it is read, yet I can see the database value being stored each time.
The script always finishes as expected when run from cron.
What could be going on? Could the server be executing scripts twice on each browser-based request?
The server runs Hive, so different directories can have different PHP versions. I don't know if this could have something to do with it.
PHP 5.2.17 (default)
PHP 5.3.27 (dir this script is in)
Apache 2.2.25
The code that dictates whether it runs is simply this:
$DB = new DbConnector($db_name, $db_user, $db_pass);

if ($DB->queryOne("SELECT COUNT(*) FROM data_vars WHERE name = 'maintenance_running'")) {
    exit('Already running!');
} else {
    $DB->query("INSERT INTO data_vars (name, value) VALUES ('maintenance_running', 1)");
}
At the end of the script, the value is cleared. Again, this problem only happens when run from the browser.
You should set a longer execution time limit with:
set_time_limit(30*60); // 30 minutes
Also, the echo calls probably are working, but the response is not sent to the browser until the whole script finishes executing. That is why you can't see the output: the script is killed by the execution limit before it ends and sends the response.
You can try sending parts of the response early with the help of ob_start and ob_flush. You can read more about the ob_* functions here.
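A sketch of pushing partial output while the script keeps running - do_chunk_of_work() is a hypothetical slice of the long task, and note that server-side buffering (e.g. gzip) can still delay what the browser sees:

<?php
// Sketch: stream progress lines to the browser during a long run.
ob_start();
for ($i = 1; $i <= 10; $i++) {
    do_chunk_of_work($i);          // hypothetical slice of the real job
    echo "Finished chunk $i<br>\n";
    ob_flush();                    // hand PHP's buffer to the web server
    flush();                       // ask the server to push it onwards
}
ob_end_flush();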
I'm using a recursive function to do some calculations for every user and reward them when certain conditions are met. Everything was working fine, but now that my user count has grown to 20000+, my script no longer completes. My code looks like this:
function give_award($mid) {
    $qry = mysql_query("SELECT * FROM users WHERE uid='$mid'");
    while ($row = mysql_fetch_array($qry)) {
        $refer = $row["refer"];
    }
    //some conditions applied
    //query for user award
    //if succeed
    $mid = $mid + 1;
    give_award($mid);
}
give_award($mid);
I'm sure a single call to give_award() isn't what hits the timeout. Is there a way to reset the time limit each time before the function is recursively (re)called?
Ways I've tried:
set_time_limit(0);
changing the timeout limit in .htaccess
changing apache timeout limit in php.ini
NB: No fatal error is shown... but I have to restart Apache every time I try to run this on my local server. Using Zend Server Community Edition on Win7 32-bit.
Please help
I don't know where to find the recursion limit, but as far as I can tell this isn't a place to use recursion anyway. A quick test here:
<?php
function KillingRecurse($r){
echo $r.PHP_EOL;
KillingRecurse(++$r);
}
KillingRecurse(0);
Result:
...
20916
Segmentation fault
I'm not surprised that a segfault causes the server to need a restart, and the numbers roughly match up.
Tell us more about what you are trying to do here, because recursing through all users seems like madness...
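If the uids really are sequential, a plain loop does the same work without growing the call stack - a rough sketch in the question's own (legacy) mysql_* style, where $max_uid is a hypothetical upper bound:

function give_awards_up_to($mid, $max_uid)
{
    for (; $mid <= $max_uid; $mid++) {
        $qry = mysql_query("SELECT * FROM users WHERE uid='$mid'");
        while ($row = mysql_fetch_array($qry)) {
            $refer = $row["refer"];
        }
        // ... same conditions and award query as before ...
    }
}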
I have set up a cronjob to run a script daily. This script pulls a list of IDs from a database, loops through each one to get more data from the database, and generates an XML file based on the data retrieved.
This seemed to run fine for the first few days; however, the list of IDs is getting bigger, and today I noticed that not all of the XML files had been generated. It seems to be random IDs that were skipped. I manually ran the script for some of the missing IDs individually and they ran without any issues.
I am not sure how to locate the problem, as the cron job is definitely running, but it does not always generate all of the XML files. Any ideas on how I can pinpoint the problem and quickly find out which files have not been generated?
I thought about adding timestart and timeend fields to the database and writing the values at the start and end of each XML generation run; that way I could see what had run and what hadn't. But I wondered if there was a better way.
set_time_limit(0);

//connect to database
$db = new msSqlConnect('dbconnect');

$select = "SELECT id FROM ProductFeeds WHERE enabled = 'True' ";
$run = mssql_query($select);

while ($row = mssql_fetch_array($run)) {
    $arg = $row['id'];
    //echo $arg . '<br />';
    exec("php index.php \"$arg\"", $output);
    //print_r($output);
}
My suggestion would be to add some logging to the script. A simple
error_log("Passing ID:".$arg."\n",3,"log.txt");
can give you some info on whether the ID is being passed. If you find that it is, you can add logging to index.php to evaluate the problem further.
By the way, can you explain why you are using exec() to run a PHP script? Why not execute a function in the loop? This could well be the source of the problem.
Because with exec I think the process will run in the background and the loop will continue, so you could really choke your server that way - maybe that's worth testing as well. (I think this also depends on how the output is handled:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Maybe some other users can comment on this.)
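If backgrounding is actually what you want, the manual's advice translates to something like this sketch (untested, *nix-style redirection):

<?php
// Sketch: launch each worker in the background and move on at once.
// Redirecting the output is what lets exec() return without waiting.
exec('php index.php ' . escapeshellarg($arg) . ' > /dev/null 2>&1 &');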
It turned out that Apache was timing out, so it was nothing to do with using a function versus the exec() call.