I have a quite long loop running in a function, but it does not finish all of its iterations and stops without giving any errors:
function my_function(){
foreach (range(100,999) as $art_id){
$current++;//see bottom flush functions...
outputProgress($current, $art_id );//see bottom flush functions...
// do a lot of stuff on remote URL...
// including download images ,
// scraping HTMl etc ..
}
}
I am using an output-progress function with flush() to track the progress:
function outputProgress($current, $total) {
// echo "<span style='background:red;font-size:1.5em;color:#fff;'>" . round($current / $total * 100) . "% </span>";
echo "<span style='background:red;font-size:1.5em;color:#fff;'>" . $current .'/'. $total . "% </span>";
myFlush();
sleep(1);
}
and
function myFlush() {
echo(str_repeat(' ', 256));
if (@ob_get_contents()) {
@ob_end_flush();
}
flush();
}
(never mind the percentage calculation, it is disabled for now and the output just shows the ID of the iteration)
I have noticed that most of the time I execute the loop,
it will just STOP after 20-25 iterations, sometimes as few as 10.
My first suspects were the time limit and max_execution_time, so I added:
set_time_limit(99999);
ini_set('max_execution_time', 99999);
function my_function(){
foreach (range(410,499) as $art_id){ // done 500-600
set_time_limit(99999);
ini_set('max_execution_time', 99999);
// do a lot of stuff on remote URL...
// including download images ,
// scraping HTMl etc ..
}
}
As you can see, I have added those both INSIDE and OUTSIDE the function itself, just in case.
But it does not help much, and the loop still stops.
My next suspect was the memory limit, so I added:
ini_set('memory_limit','128M');
and since I am working on wp, I also tried
define('WP_MEMORY_LIMIT', '128M');
but to no avail. The script still stops after a small number of iterations.
What are the other possible causes for this behavior, and the possible remedies?
Mind you - the script does not give any errors, it just stops at a certain loop.
EDIT I
I have pasted the script HERE.
It is actually a slightly modified scrap_slashdot() function from the examples included with the simplehtmldom library.
It is modified to insert wordpress posts while also downloading images and attaching them.
EDIT II
Using @Allendar's comment, echo ini_get('memory_limit'); seems to work, and it is set to 128M.
I've sped the wait up to usleep(50000); to test. This code takes a fraction of a second to complete on PHP 5.4 and causes no memory leakage:
ini_set('memory_limit', '32M'); // Force back to default X/W/L/M/AMP
function my_function(){
$current = 0;
foreach (range(100,999) as $art_id){
$current++;
outputProgress($current, $art_id );
}
}
function outputProgress($current, $total) {
echo "<span style='background:red;font-size:1.5em;color:#fff;'>" . $current .'/'. $total . "% </span>";
myFlush();
usleep(50000);
}
function myFlush() {
echo(str_repeat(' ', 256));
if (@ob_get_contents()) {
@ob_end_flush();
}
flush();
}
my_function();
echo memory_get_usage();
I've added $current = 0; to cancel out a warning given by Xdebug.
The memory usage comes out to only 282304 bytes (about 275.69 kilobytes).
It might be that the 1-second wait on each cycle causes the script to abort on execution time.
ini_set('max_execution_time', 0);
.. will fix this, but is not recommended ;)
If you still find the script stopping suddenly, the cause really must be in the part you only describe with comments, where you say your real code is. That code might be heavy enough on the PHP daemon to make it abort. Besides that, there are also hosts (if the script runs online) that prevent you from setting ini values and may even kill the PHP process if it "zombies" around for too long.
First of all, this is not a "long" script. I have been working with arrays - actually 16 arrays, each with more than 650 indexes (= 14 x 650 = 9100 indexes, never mind if my arithmetic is off) - and it loads in a fraction of a second, so that in itself seems to be no problem. I am sure you are doing something else seriously wrong. Your code works fine as far as I can tell (tested here online, on PHP 5), even without ini_set() (disabled by that website), and the memory usage was 63072 bytes (~63 KB ~ 0.063 MB, well under 128 MB).
Also, where do you set $current? Your my_function() has no parameters. I would also recommend turning on error reporting with:
error_reporting(E_ALL); ini_set('display_errors', '1');
There may be a problem with the online interpreter you are using; try the one I used, or install a local Apache server. You can also try some free hosts.
Related
I'd like to limit a specific section of PHP to X seconds - if it takes longer, kill the currently executing code (just the section, not the entire script) and run alternative code instead.
Pseudo-code example (the example use case here is an unstable API which is sometimes fast and other times is a black hole):
$completed = 1;
$seconds = 60;
while ($completed != -1 && $completed < 5) {
limit ($seconds) {
$api = new SomeAPI('user','secret','key');
$data = $api->getStuff('user="bob"');
$completed = -1;
} catch () {
$completed++;
sleep(10);
}
}
if ($completed === 5) echo "Error: API black-hole'd 5 times.\n";
else {
//Notice: data processing is OUTSIDE of the time limit
foreach ($data as $row) {
echo $row['name'].': '.$row['message']."\n";
}
}
HOWEVER, this should work for anything. Not just API/HTTP requests. E.g. an intensive database procedure.
In case you're reading too fast: set_time_limit and max_execution_time are not the answer as they affect the time limit for the entire script rather than just a section (unless I'm wrong on how those work, of course).
In the case of an API call, I would suggest using cURL, for which you can set a specific timeout for the API call.
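For instance, a minimal sketch of the cURL approach (the URL and the 5-second values are placeholders, not anything from your code):
$ch = curl_init('https://api.example.com/stuff?user=bob'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 5); // max seconds for the whole request
$data = curl_exec($ch);
if ($data === false) {
// timed out or another cURL error; check curl_error($ch), count the attempt, sleep and retry
}
curl_close($ch);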
For generic use, you can look at forking processes, which would give you the ability to time each process and kill it if it exceeds the expected time.
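As a rough illustration only (this needs the pcntl and posix extensions, so it is realistic for CLI scripts rather than a typical web request), a parent process can fork the risky section into a child and kill it when the budget runs out:
function run_with_timeout(callable $work, $timeout) {
$pid = pcntl_fork();
if ($pid === -1) {
return false; // fork failed
}
if ($pid === 0) {
$work(); // child: run the slow section
exit(0);
}
$start = time();
while (time() - $start < $timeout) {
if (pcntl_waitpid($pid, $status, WNOHANG) === $pid) {
return true; // child finished within the limit
}
usleep(100000); // poll every 0.1 s
}
posix_kill($pid, SIGKILL); // over budget: kill the child
pcntl_waitpid($pid, $status); // reap it
return false;
}
Note that the child cannot hand $data back to the parent directly; you would need a file, database, or socket if the section produces output you need afterwards.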
Of course if the section of code might be subject to long execution times due to a highly repetitive loop structure, you can provide your own timers to break out of the loop after a specified time interval.
I might not have directly answered your question, but really the point I wanted to get to is that you might have to use a different approach depending on what the code block actually does.
I am currently implementing a long polling function in Codeigniter and have come up a problem.
Let's say I have a normal PHP controller:
function longpolling()
{
//PHP Timelimit infinite
set_time_limit(0);
while(true){
echo "test";
//Sleep 3 Seconds
sleep(3);
}
}
The page just says "loading" when called and never returns "test"; instead you get a 404 error after a while.
What am I doing wrong?
Thank you!
You aren't doing anything 'wrong'; it's just that PHP doesn't work the way you're expecting it to.
If you did it like this:
$i = 0;
while ($i < 10)
{
echo "Hi There!";
sleep(2);
$i++;
}
It will eventually output lots of "Hi There!", but not one at a time; rather, it will all display at the end of the while loop.
You could even throw a flush() in there
$i = 0;
while ($i < 10)
{
echo "Hi There!";
flush();
sleep(2);
$i++;
}
And you still won't get anything until the very end.
Because your while(true) never ends, you will never see any output; I assume the browser timeout kicks in, or the max_execution_time setting is reached?
Something that just popped into my head: it might work if you wrote some data to a file in the infinite loop - I have never tried it myself, but a rough sketch of the idea follows.
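Purely as an illustration of that idea (the file name is arbitrary), appending a heartbeat line on each pass would at least let you follow progress with tail -f even though the browser shows nothing:
set_time_limit(0);
while (true) {
file_put_contents('longpoll.log', date('H:i:s') . " still alive\n", FILE_APPEND);
sleep(3);
}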
I've run into issues like this myself. You'll have to look into flushing the output, as PHP and/or the web server might be buffering the data until a certain threshold is met. I had a horrible time struggling with IIS over this; I think Apache is a lot easier to manage. Plus there's telling the web server what to do as well. For Apache, here's a snippet found on php.net:
I just had some problems with flush() and ob_flush(). What I did to
resolve this problem took me some time to figure out so I'd like to
share what I came up with.
The main problem is the php setting "output_buffering" which can be
set too large and will prevent your text from outputting. To change
this value you can either set it in php.ini or you can add the line
php_value output_buffering "0"
to your .htaccess file. It will not work with ini_set() since it is
PHP_INI_PERDIR.
This is combined with the flush() function used before sleep(). I also had to output over a number of characters before it started flushing properly:
public function longpolling()
{
echo str_repeat(" ", 1024); flush();
for( $i = 0; $i < 10; $i++) {
echo $i."<br/>";
flush();
sleep(1);
}
}
Also: I just tried this on my server, and it wouldn't work until I added the php_value line to my .htaccess file. Once I did, it worked as expected.
The page will keep loading until the PHP file execution has reached the end of the file. PHP doesn't work like C or C++. You make a request and when everything is done you get the output. Once the page is loaded no PHP is executing anymore.
And sleep() is just used to slow PHP down in some cases. In this case:
echo "Something";
sleep(30);
echo " else";
"Something" and " else" will be printed at the same moment while the total execution will take 30 seconds more.
I've been playing around with a system I'm developing and managed to get it to cause this:
Fatal error: Maximum execution time of 30 seconds exceeded
It happened when I was doing something unrealistic, but nevertheless it could happen with a user.
Does anyone know if there is a way to catch this exception? I've read around but everyone seems to suggest upping the time allowed.
How about trying what the PHP documentation (well... at least one of its readers) suggests:
<?php
function shutdown()
{
$a = error_get_last();
if ($a == null) {echo "No errors";}
else {print_r($a);}
}
register_shutdown_function('shutdown');
ini_set('max_execution_time', 1);
sleep(3);
?>
Have a look at the following links:
http://www.php.net/manual/en/function.set-error-handler.php#106061
http://www.php.net/manual/en/function.register-shutdown-function.php
Your only options are to increase the allowed execution time (setting it to 0 makes it infinite, but that is not recommended) of the script or spawn a new thread and hope for the best.
The reason that this isn't catchable is that it isn't really thrown. No one line of the code actually triggered the error, rather PHP said, "Nope, sorry, this is too long. Time to shut down now." And that makes sense. Imagine having a script with a max execution time of 30 seconds catching that error and taking another 30 seconds... in a poorly designed program, that opens up some rather nasty opportunities to exploit. At a minimum, it will create opportunities for DOS attacks.
This isn't an exception, it's an error. There are important differences between exceptions and errors, first and foremost errors can't be caught with try/catch semantics.
PHP scripts are built around a paradigm of short execution times, so PHP is configured by default to assume that if a script has been running for longer than 30 seconds it must be caught in an infinite loop and therefore should be terminated. This is to prevent an errant PHP script causing a denial of service, either by accident or by malicious intent.
However, scripts do sometimes need more running time than they are allocated by default.
You can try changing the maximum execution time, either by using set_time_limit() or by altering the value of max_execution_time in the php.ini file to raise the limit. You can also remove the limit entirely by setting the execution time to 0, though this isn't recommended.
set_time_limit() may be disabled by mechanisms such as disable_functions so it might not be available to you, likewise you might not have access to php.ini. If both of these are the case then you should contact your host for help.
One exception is PHP scripts run from the command line. Under these running conditions, PHP scripts may be interactive and need to spend a long time processing data or waiting for input. For this reason there isn't a max_execution_time limit on scripts run from the command line by default.
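You can confirm that quickly from a shell; under the CLI SAPI the reported limit is 0, i.e. unlimited:
$ php -r 'var_dump(ini_get("max_execution_time"));'
string(1) "0"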
EDIT TO ADD: PHP 7's error handling had a major overhaul. Errors and exceptions now both implement the Throwable interface. This may make the above no longer relevant for PHP 7+, though I'll have to look more closely into the specifics of how error handling works now to be sure.
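For what it's worth, a small PHP 7 sketch of that change: many engine errors become \Error objects that you can catch via \Throwable, although the maximum-execution-time fatal is, as far as I know, still not one of them:
try {
intdiv(1, 0); // throws DivisionByZeroError in PHP 7+
} catch (\Throwable $t) {
echo get_class($t) . ': ' . $t->getMessage(); // DivisionByZeroError: Division by zero
}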
There is nothing you can do about it, but you can have a graceful shutdown using register_shutdown_function:
<?php
ini_set('display_errors', '0');
ini_set("max_execution_time",15 ); //you can use this if you know your script should not take longer than 15 seconds to finish
register_shutdown_function('shutdown');
function shutdown()
{
$error = error_get_last();
if ($error['type'] === E_ERROR) {
//do your shutdown stuff here
//be careful not to call any other function from within the shutdown function,
//as PHP may not wait until that function finishes.
//It's strange behavior: during testing I realized that if a function is called
//from here, that function may or may not finish, and code below that function
//call may or may not get executed. Every time I had a different result.
// e.g.
other_function();
//code below this function may not get executed
}
}
while(true)
{
}
function other_function()
{
//code in this function may not get executed if this function
//called from shutdown function
}
?>
Yeah, I tested the solution by TheJanOnline. sleep() does not count towards PHP execution time, so here is a WORKING version with an infinite loop:
<?php
function shutdown()
{
$a=error_get_last();
if($a==null)
echo "No errors";
else
print_r($a);
}
register_shutdown_function('shutdown');
ini_set('max_execution_time',1 );
while(1) {/*nothing*/}
// will die after 1 sec and print error
?>
There is a slightly tricky way to handle "Fatal error: Maximum execution time of 30 seconds exceeded" as an exception in certain cases:
function time_sublimit($k = 0.8) {
$limit = ini_get('max_execution_time'); // changes even when you set_time_limit()
$sub_limit = (int) round($limit * $k); // cast so the strict comparison below can match
if($sub_limit === 0) {
$sub_limit = INF;
}
return $sub_limit;
}
In your code you must measure the execution time and throw an exception before the timeout fatal error can be triggered. $k = 0.8 is 80% of the allowed execution time, so you have 20% of the time left to handle the exception.
try{
$t1 = time(); // start measuring time
while (true) { // put your long-time loop condition here
$time_spent = time() - $t1;
if ($time_spent >= time_sublimit()) {
throw new Exception('Time sublimit reached');
}
// do work here
}
} catch(Exception $e) {
// catch exception here
}
I came up with this based on the answer @pinkal-vansia gave, so I'm not claiming an original answer, just one with a practical application. I needed a way for the page to refresh itself in the event of a timeout. I have observed enough timeouts of my cURL script to know the code is working, but that sometimes, for whatever reason, it fails to connect to the remote server or to read the served HTML fully, and that upon refresh the problem goes away. So I am OK with the script refreshing itself to "cure" a maximum-execution-time error.
<?php //script name: scrape_script.php
ini_set('max_execution_time', 300);
register_shutdown_function('shutdown');
function shutdown()
{
?><meta http-equiv="refresh" content="0; url=scrape_script.php"><?php
// just do a meta refresh. Haven't tested with header location, but
// this works fine.
}
FYI, 300 seconds is not too long for the scraping script I'm running, which takes just a little less than that to extract the data from the kinds of pages I'm scraping. Sometimes it goes over by just a few seconds only due to connection irregularities. Knowing that it's connection times that sometimes fail, rather than script processing, it's better to not increase the timeout, but rather just automatically refresh the page and try again.
I faced a similar problem and here was how I solved it:
<?php
function shutdown() {
if (!is_null($error = error_get_last())) {
if (strpos($error['message'], 'Maximum execution time') === false) {
echo 'Other error: ' . print_r($error, true);
} else {
echo "Timeout!\n";
}
}
}
ini_set('display_errors', 0);
register_shutdown_function('shutdown');
set_time_limit(1);
echo "Starting...\n";
$i = 0;
while (++$i < 100000001) {
if ($i % 100000 == 0) {
echo ($i / 100000), "\n";
}
}
echo "done.\n";
?>
This script, as is, is going to print Timeout! at the end.
You can modify the line $i = 0; to $i = 1 / 0; and it is going to print:
Other error: Array
(
[type] => 2
[message] => Division by zero
[file] => /home/user/test.php
[line] => 17
)
References:
PHP: register_shutdown_function - Manual
PHP: set_time_limit - Manual
PHP: error_get_last - Manual
I've got sleep(n) in a loop that is intended to do output on a periodic cycle.
But when I run the loop, nothing happens until all of the seconds in the intended loop duration accrue collectively, after which all of the output comes spilling out at once.
Help. Thanks.
Try this:
ob_end_flush (); // just in case
while (1) {
echo 'wait for it<br/>'.PHP_EOL;
flush ();
sleep (2);
}
Maybe you need to flush() the output buffer after each piece of output?
Argh, the site's not letting me add a comment to mathroc's latest. So I'll put it here:
It didn't work for me. But the following is really weird: I accidentally stumbled upon some other sleep code on the web that I stuck in front of what I've got:
<HTML>
<BODY>
<?php
// Note: $chunks is assumed to be defined earlier; it is not shown in this paste.
$c = 0;
while ($c < $chunks) {
$rand = rand(2000000, 6000000);
echo '<br> . . . sleeping for ' . round(($rand / 1000000), 2) . ' seconds . . . zzzzzzzzzzzzzz<br>';
flush();
usleep($rand);
$c++;
}
?>
WHAT I'VE GOT BEGINS HERE:
<br />
<br />
This page is loading.<br />
<?php
for($i=0;$i<5;$i++){
flush(); sleep(2);
?>
Almost there...<br />
<?php
}
?>
<?php flush(); sleep(2); ?>
Done.<br />
</BODY>
</HTML>
...and now the lower block of code sleeps fine, sequentially. Output is properly staggered (instead of arriving all in a lump at the end of 10 secs).
It's weird because I don't know what the above is doing that would make everything in the block below work all right. If I remove it, my block doesn't work (i.e., the output accumulates and then spills en masse at the end). If I remove only bits and pieces of the code above, then my thing wants to jump forward a little (but sequentially outputs the rest fine).
I have no idea what the preceding code is doing that makes my (latter block) work the way it should, or how to abbreviate it so that it still makes the latter block fully work, or even how to make the above code invisible on the page while still allowing the latter block to work accurately.
(I've tested the script on both Windows 7 Caucho Resin PHP 5 and Linux Apache CGI-BIN PHP 4 platforms. Identical results.)
It sounds like you should be using flush() instead of sleep().
http://us3.php.net/manual/en/function.flush.php
I'm including a local class that requests a file from a remote server. This process is rather unreliable — because the remote server is often overloaded — and I sometimes have to wait 20 or so seconds before the include gives up and continues.
I would like to have a limit on the execution time of the included script; say, five seconds.
Current code:
include('siteclass.class.php');
Update:
My code inside the class:
$movie = str_replace(" ","+",$movie);
$string = join('',file($siteurl.$l.'/moviename-'.$movie));
if(!$i) { static $i = 1;}
if($file_array = $string)
{
$result = Return_Substrings($file_array, '<item>', '</item>');
foreach($result as $res) {
That's basically it, as far as the loading goes. The internal processing takes about 0.1 s. I guess that's pretty doable.
Note that I didn't test this code; take it as a suggestion:
$fp = fopen('siteclass.class.php', 'r');
$timeout = 2;
stream_set_timeout($fp, $timeout);
$info = stream_get_meta_data($fp);
if ($info['timed_out']) {
echo "Connection Timed Out!";
} else {
$file = '';
while (!feof($fp)) {
$file .= fgets($fp);
}
eval($file);
}
The timeout is set in seconds, so the example sets it to two seconds.
This isn't an exact fit to what you're looking for, but this will set the time limit for the include and execution to a total of 25 seconds. If the time limit is reached, it throws a fatal error.
set_time_limit(25);
It sounds like set_time_limit() might do what you want:
PHP Manual for that function
Fix the included code to have a timeout on the HTTP Request and then recover nicely, instead of just aborting by setting a time limit on the script itself.
My advice would be to get to the root of the problem instead of looking for a workaround.
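For example (a sketch only, reusing the variable names from the snippet above and assuming a 5-second budget), the remote fetch inside the class could use a stream context with its own timeout and recover gracefully:
$ctx = stream_context_create(array('http' => array('timeout' => 5))); // 5 s is an assumption
$raw = @file_get_contents($siteurl . $l . '/moviename-' . $movie, false, $ctx);
if ($raw === false) {
// the remote server did not answer in time: log it, retry, or skip this item
} else {
$string = $raw; // same data the original join('', file(...)) produced
}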