PHP infinite loop stops after about 630 to 660 seconds

I wrote a site crawler that uses a while loop to crawl a whole web site, but the loop stops after about 660 seconds.
I set set_time_limit(0), use flush() to echo output, and use the sleep() function. I thought my code was wrong, so I tested a simple while loop instead:
while (1) {
    sleep(30);
    echo "Crawling on the go ..." . time();
    echo "<br />";
    echo str_pad(" ", 4096);
    flush();
}
But this simple while loop also stops after about 660 seconds! I don't know what the problem is. I checked the WHM Process Manager on my server and saw that my process was killed.
Does the server firewall kill my process because of something like CPU or RAM usage?
Please help!

You need to write set_time_limit(0); at the top of the page, right after the PHP opening tag. This has worked for me; I hope it works for you too.

set_time_limit() may be denied (it has no effect if PHP is in safe_mode). On a side note, sleep() time is not counted against the time limit, so if you remove the sleep and the length of time the script keeps running changes, the execution time limit is the likely culprit. To check whether it really is the time limit (another possibility is a CPU-time limit), register a shutdown function and check the connection status; a value of 2 means the script timed out.
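A minimal sketch of that check, assuming PHP 5.3+ for the closure; the log messages are placeholders, and CONNECTION_TIMEOUT (value 2) is the constant that signals a time-out:
<?php
set_time_limit(0);

register_shutdown_function(function () {
    // connection_status() is a bitfield: CONNECTION_NORMAL = 0,
    // CONNECTION_ABORTED = 1, CONNECTION_TIMEOUT = 2.
    if (connection_status() & CONNECTION_TIMEOUT) {
        error_log('Stopped by the execution time limit.');
    } else {
        error_log('Stopped for another reason, status: ' . connection_status());
    }
});

// ... crawler loop goes here ...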

You can also put the following line in your .htaccess file:
php_value max_execution_time <timeinseconds>
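Note that this typically only takes effect when PHP runs as an Apache module (mod_php); under CGI/FastCGI the directive is ignored or triggers a server error. For example, with a 600-second limit (the value is just an example):
php_value max_execution_time 600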

Related

"No data received" error when I have a mysql query in while loop

I have this PHP code:
<?php
include_once("connect_to_mysql.php");
$max = 300;
while ($max--)
{
    sleep(1);
    doMyThings();
}
?>
It is supposed to repeat a MySQL query 300 times with a gap of 1 second between each run. The problem is that after a minute or so the browser shows this message: No data received. Unable to load the webpage because the server sent no data.
The problem is the following: your code will last at least 300 seconds (not counting the time needed by doMyThings()). Most PHP environments set the default maximum script running time to about 60 seconds; when that limit is reached, the script stops and nothing is printed out.
The next issue (assuming the execution time limit is raised enough to allow long-running scripts) is output buffering: data is normally written to the output stream only once the script has finished, i.e. after ~300 seconds. Until then, you won't see any output.
To work around both problems, see this code:
<?php
// If allowed, remove the script execution time limit
set_time_limit(0);
// End output buffering so output reaches the browser immediately
if (ob_get_level() > 0) {
    ob_end_flush();
}
include_once("connect_to_mysql.php");
$max = 300;
// IE and Safari workaround: they only display the page once it is
// completely loaded or at least about 5000 bytes have been received,
// so pad the output first.
for ($i = 0; $i < 5000; $i++)
{
    echo ' ';
}
while ($max > 0)
{
    sleep(1);
    doMyThings();
    $max--;
    // Push each iteration's output to the browser
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
?>
Maybe this post is also of interest to you: Outputting exec() ping result progressively
The browser will not wait a whole 5 minutes for you to complete your queries.
You need to find a different solution. Consider executing the PHP script in CLI.
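For example, from a shell on the server (script.php stands in for your own file name; nohup is optional but keeps the process running after you log out):
nohup php script.php > output.log 2>&1 &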
It seems that you hit a timeout while executing doMyThings() 300 times.
You can try set_time_limit(0);
It sets the number of seconds a script is allowed to run. If this limit is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in php.ini.
When you run long-running PHP code on the server side, you need to raise the max_execution_time directive in php.ini. But the browser will not wait as long as you want, so you should use an asynchronous technique such as AJAX.
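A rough sketch of that idea, under the assumption that the work can be split into chunks: the browser calls an endpoint such as process.php?offset=0 repeatedly via AJAX, so no single request has to outlive max_execution_time (the endpoint name, chunk size, and response fields are illustrative; doMyThings() and connect_to_mysql.php come from the question):
<?php
include_once("connect_to_mysql.php");

$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunkSize = 50;          // iterations handled per request (assumption)
$total     = 300;         // total iterations, as in the question

for ($i = 0; $i < $chunkSize && ($offset + $i) < $total; $i++) {
    doMyThings();         // same work as in the original loop
}

// Tell the JavaScript caller where to resume.
header('Content-Type: application/json');
echo json_encode(array(
    'nextOffset' => $offset + $chunkSize,
    'done'       => ($offset + $chunkSize) >= $total,
));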

CodeIgniter PHP script does not run for more than 2 minutes?

I have a PHP script in my CodeIgniter application.
It runs on the server and fetches some data, but it does not run for more than about 2 minutes.
When I run it without CodeIgniter it works properly. What may be the reason behind this?
Thanks @air4x, it works by setting set_time_limit(300) in system/core/CodeIgniter.php:
if (function_exists("set_time_limit") == TRUE AND #ini_get("safe_mode") == 0)
{
#set_time_limit(300);
}
After setting this, the script runs well.
Try adding this before you run your code: set_time_limit(0);
More info: http://php.net/manual/en/function.set-time-limit.php
If that doesn't work, you'll need to share what code you are running and what happens when it stops running.
It must be set_time_limit(), which sets the maximum execution time of a script:
bool set_time_limit ( int $seconds )
When set_time_limit() is called, it restarts the timeout counter from zero. In other words, if the default limit is 30 seconds and the call set_time_limit(20) is made after 25 seconds of script execution, the script will run for a total of 45 seconds before timing out.
Please visit http://php.net/manual/fr/function.set-time-limit.php for more information.
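A small illustration of that counter behaviour, using the same 30 s / 25 s / 20 s numbers; the two work functions are hypothetical placeholders:
<?php
set_time_limit(30);            // allow 30 seconds from this point

doTwentyFiveSecondsOfWork();   // hypothetical helper that runs for ~25 s

set_time_limit(20);            // counter restarts: ~20 more seconds allowed
doRemainingWork();             // hypothetical helper; total wall time can reach ~45 s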

PHP timeout doesn't stop the script?

I was expecting some error like 500 or timeout page with the following code:
<?php
ini_set('max_execution_time',5);
set_time_limit(5);
echo 'start';
sleep(10);
echo '<br/>hi';
However, I get something like this:
start
hi
Did I do something incorrect?
All I want is to see the script stopped when it times out at the 5th second, so the second echo should not be executed (I know this is quite a weird requirement).
Could anyone shed some light on this? Thanks.
PS: it seems the sleep() part is quite a distraction; how about I change the code like this:
<?php
ini_set('max_execution_time', 5);
set_time_limit(5);
echo 'start';
for ($i = 1; $i < 100000000; $i++) {
    if ($i % 100 == 2) echo $i;
    else echo '--';
}
echo '<br/>hi';
According to this comment on php.net, you must be using Unix...
"Please note that, under Linux, sleeping time is ignored, but under Windows, it counts as execution time." (Sleep is not counted as part of the execution time on Unix/Linux.)
Additionally, to make it time out, simply loop forever:
<?php
while (true) {
    // this will error out after max_execution_time
}
To see how long that is, you can either measure the execution time using microtime() or read back the max_execution_time value you just set:
$max_time = ini_get("max_execution_time");
echo $max_time;
?>
UPDATE
With your updated code, the output is as expected.
Fatal error: Maximum execution time of 5 seconds exceeded in newfile.php on line 7

PHP resets variables

I'm trying to create a script that creates unique codes and writes them to a textfile.
I've managed to generate the codes, and write them to the file.
Now my problem is that the loop keeps running, resulting in over 92 000 codes being written to the file before the server times out.
I've done some logging, and it seems that everything works fine; it's just that after a certain number of seconds, all my variables are reset and everything starts from scratch. The interval after which this happens varies from run to run.
I've already set ini_set('memory_limit', '200M'); ini_set('max_execution_time', 0); at the top of my script. Maybe there's a PHP time-out setting I'm missing?
The script is a function in a controller, and I call ini_set at the beginning of that function. This is the loop I'm going through:
public function generateAction() {
    ini_set('memory_limit', '200M');
    ini_set('max_execution_time', 0);

    $codeArray = array();
    $numberOfCodes = 78000;
    $codeLength = 8;
    $totaalAantal = 0;

    $file = fopen("codes.txt", "a+");

    while (count($codeArray) < $numberOfCodes) {
        $code = self::newCode($codeLength);
        if (!in_array($code, $codeArray)) {
            $totaalAantal++;
            $codeArray[] = $code;
            fwrite($file, 'total: ' . $totaalAantal . "\r\n");
        }
    }

    fclose($file);
}
In the file this would give something like this:
total: 1
total: 2
total: ...
total: 41999
total: 42000
total: 1
total: 2
total: ...
total: 41999
total: 42000
Thanks.
Edit: so far we've established that the generateAction() is called 2 or 3 times, before the end of the script, when it should only be called once.
I already found the solution for this problem.
The host's script limit was set to 90 seconds, and because this script had to run for longer, I had to run it via the command line.
Taking the test with uniqid() into account, we can say that the variables are not reset, but that the method generateAction() is called several times.
Since your code is presumably synchronous, generateAction() is most likely called several times because the main script is called several times.
What happens in detail?
Because of the nature of your algorithm, each pass through the loop is slower than the previous one, so running generateAction() may take quite a long time.
You probably don't wait for the end: you stop the process, or even start it again from a new page. Nevertheless, the process doesn't really stop right away; it keeps running in the background. I've observed this behavior on my local WAMP/LAMP installation: the script is not actually stopped even if I stop the page, close the page, close the browser, or restart Apache.
So what is happening is that several script processes are writing to the codes.txt file simultaneously.
To avoid this, you can, for example, lock the file for the duration of the loop using flock(), as sketched below.
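A minimal sketch of that flock() idea, wrapped around the loop inside generateAction() (the variable names are reused from the question's code):
$file = fopen("codes.txt", "a+");

if (flock($file, LOCK_EX)) {           // exclusive lock: other processes block here
    while (count($codeArray) < $numberOfCodes) {
        $code = self::newCode($codeLength);
        if (!in_array($code, $codeArray)) {
            $totaalAantal++;
            $codeArray[] = $code;
            fwrite($file, 'total: ' . $totaalAantal . "\r\n");
        }
    }
    fflush($file);                     // make sure everything is on disk
    flock($file, LOCK_UN);             // release the lock
}

fclose($file);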

Importing csv timeout issue

I'm trying to modify a CSV import function that times out after 60 seconds of importing. For every line, images are resized and some other code is executed.
I know the VPS can handle this in batches, because I have another website on the same server that runs a different CSV program that does the same thing. That program can import 8000 lines and resize images as well; its settings are: process 10 lines, wait 3 seconds, repeat.
Settings I raised:
set_time_limit
max_execution_time
Browser http keep alive timeout
I have tried calling sleep() every 10th line, but this only makes the process import fewer lines:
if( (($current_line % 10) == 0) && ($current_line != 0) )
{
    sleep(3);
}
This is how the script loops through the file
for ($current_line = 0; $line = fgetcsv($handle, MAX_LINE_SIZE, Tools::getValue('separator')); $current_line++)
{
    // code here
}
Server:
Apache
PHP 5.3.3
MYSQL
Varnish cache
What can I do to make this work?
The first thing to try when your script times out is to run it using the PHP CLI. There is no execution time limit for scripts run from the command line.
If this doesn't solve your problem, then you know it wasn't the execution time limit.
The second thing to try is to print out regular status messages, including memory_get_usage(), so that you can rule out memory leaks as the cause of the crash. This may also help you identify whether the script is dying on some particular input.
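For example, inside the existing import loop (MAX_LINE_SIZE and Tools::getValue() come from the question; the logging interval of 100 lines is just an assumption):
for ($current_line = 0; $line = fgetcsv($handle, MAX_LINE_SIZE, Tools::getValue('separator')); $current_line++)
{
    // ... existing import and image-resize code ...

    if ($current_line % 100 === 0) {
        // Log progress and memory usage so leaks or a crashing row show up.
        error_log(sprintf('line %d, memory %d bytes', $current_line, memory_get_usage()));
    }
}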
You can override the default timeout:
set_time_limit(0);
Using sleep() will make it import fewer lines; the script is timing out because it takes over 60 seconds, and adding sleep just means less gets done within those 60 seconds.
If this is a critical script, I'd look at moving it to another programming language that can execute it faster. If it's just a one-off, or not mission critical, try set_time_limit(0), which makes it never time out. Also try running php scriptname from the command line instead of through the browser.
Try outputting something to the browser to keep the connection alive. IE times out after 1 minute of inactivity; Firefox after 3 minutes.
<?php
if( (($current_line % 10) == 0) && ($current_line != 0) )
{
    sleep(3);
    echo '. ';
}
?>
