I'm trying to modify a CSV import function that times out after 60 seconds of importing. For every line there are images that are resized, and some other code is executed.
I know the VPS can handle this in batches, because another website on the same server runs a different CSV program that does the same thing. That program can import 8,000 lines and resize images as well. Its settings are: process 10 lines, wait 3 seconds, repeat.
Settings I raised:
set_time_limit
max_execution_time
Browser HTTP keep-alive timeout
I have tried calling sleep() on every 10th line, but this only makes the process import fewer lines:
if( (($current_line % 10) == 0) && ($current_line != 0) )
{
sleep(3);
}
This is how the script loops through the file:
for ($current_line = 0; $line = fgetcsv($handle, MAX_LINE_SIZE, Tools::getValue('separator')); $current_line++)
{
//code here
}
Server:
Apache
PHP 5.3.3
MySQL
Varnish cache
What can I do to make this work?
The first thing to try when your script times out is to run it using the PHP CLI. There is no execution time limit for scripts run through the command line.
If this doesn't solve your problem, then you know it wasn't the execution time limit.
The second thing to try is to print regular status messages, including the output of memory_get_usage(), so that you can rule out memory leaks as the cause of the crash. This may also help you identify whether your script is dying on some particular input.
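A minimal sketch of what that can look like, assuming a hypothetical import.php that wraps the question's loop (the file name, line size, and separator are placeholders):

<?php
// Run with: php import.php
// In CLI mode max_execution_time defaults to 0, i.e. no limit.
$handle = fopen('products.csv', 'r');

for ($current_line = 0; $line = fgetcsv($handle, 4096, ';'); $current_line++) {
    // ... import the row and resize its images here ...

    // Print a status line every 100 rows to watch for memory leaks.
    if ($current_line > 0 && $current_line % 100 == 0) {
        echo "line $current_line, memory: " . memory_get_usage() . " bytes\n";
    }
}

fclose($handle);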
You can override the default timeout:
set_time_limit(0);
Using sleep() will make it import fewer lines. The script is timing out because it is taking over 60 seconds, and adding sleep() just means it gets less done in those 60 seconds.
If this is a critical script, I'd look at moving it to another programming language that can execute it faster. If it's just a one-off, or not mission-critical, try set_time_limit(0), which makes it never time out. Also try running php scriptname from the command line to execute the script via the CLI.
Try outputting something to the browser to keep the connection alive. IE times out after 1 minute of inactivity; Firefox after 3 minutes.
<?php
if ((($current_line % 10) == 0) && ($current_line != 0))
{
    sleep(3);
    echo '. ';
    flush(); // push the dot out to the browser immediately
}
?>
Related
In PHP, I want to put a delay of a number of seconds on each iteration of the loop:
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
//sleep for 3 seconds
}
How can I do this?
Use PHP's sleep() function: http://php.net/manual/en/function.sleep.php
This pauses execution for the given number of seconds before the next loop iteration runs. So something like this:
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
sleep(3); // this should halt for 3 seconds for every loop
}
I see what you are doing... you're delaying a script so it can repeatedly check for a file on the filesystem (one that is being uploaded or written by another script, I assume). This is a BAD way to do it.
Your script will run slowly, choking the server if several users are running it at once.
Your server may time out for some users.
HDD access is a costly resource.
There are better ways to do this.
You could use Ajax, with a timeout that calls your PHP script every few seconds. This avoids the slow page load, and you can keep polling indefinitely (the current for loop only runs for about 33 seconds and then stops).
You could use a database. In some cases database access is faster than HDD access, especially with views and caching. The script creating or uploading the file can set a flag in a table (e.g. file_exists), and another script can check that field; see the sketch after this list.
You can use sleep(3), which sleeps the thread for 3 seconds.
Correction: PHP's sleep() takes its argument in seconds.
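A minimal sketch of the database-flag idea, assuming a hypothetical uploads table with a file_ready column (the table, column, and credentials are all placeholders):

<?php
// Poll a database flag instead of hitting the filesystem on every request.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$stmt = $pdo->prepare('SELECT file_ready FROM uploads WHERE filename = ?');
$stmt->execute(array($filename));
$ready = (bool) $stmt->fetchColumn();

// Return a tiny JSON response that an Ajax poller can act on.
header('Content-Type: application/json');
echo json_encode(array('ready' => $ready));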
Here are two ways to pause a PHP script for some period of time. When you want to pause your script partway through, use these functions.
In these examples, the first part of the code runs as soon as the script does, and the second part runs after the delay.
With the sleep() function you define the sleep time in seconds.
Example:
echo "Message 1";
// The first part of code.
$timeInSeconds = 3;
sleep($timeInSeconds);
// The second part of code.
echo "Message 2";
This pauses the PHP script for 3 seconds. With this function you can only sleep for a whole number (integer) of seconds.
With the usleep() function you define the sleep time in microseconds. This is convenient for intervals that require more precision than one second.
Example:
echo "Message 1";
// The first part of code.
$timeInMicroSeconds = 2487147;
usleep($timeInMicroSeconds);
// The second part of code.
echo "Message 2";
Use this function if you want to sleep PHP for less than a whole second (a float value). In this example I have put the script to sleep for 2.487147 seconds.
Have you considered using a PHP daemon script with supervisord? I use it for multiple tasks that are required to be running all the time.
The catch is making sure that while your script is running you keep checking memory usage. If it's too high, stop the process and let supervisord start it up again.
I have successfully used this approach for a worker that constantly checks database records for tasks to process.
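A rough sketch of that worker loop, assuming a hypothetical worker.php configured in supervisord with autorestart=true (the 100 MB limit and processTask() are placeholders):

<?php
// worker.php - run under supervisord, which restarts it whenever it exits.
set_time_limit(0);

while (true) {
    // Fetch and process one pending task from the database.
    processTask();

    // Exit cleanly when memory climbs too high; supervisord restarts us.
    if (memory_get_usage() > 100 * 1024 * 1024) {
        exit(0);
    }

    sleep(1); // avoid hammering the database between tasks
}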
It might be overkill but worth considering.
I have this PHP code:
<?php
include_once("connect_to_mysql.php");
$max=300;
while($max--)
{
sleep(1);
doMyThings();
}
?>
It is supposed to repeat a MySQL query 300 times with a gap of 1 second between each repetition. But the problem is that after a minute or so the browser shows: "No Data Received. Unable to load the webpage because the server sent no data."
The problem is the following: your code will run for at least 300 seconds (not even counting the time needed by doMyThings()). Most PHP environments set the default maximum script running time to about 60 seconds; the script is stopped at that point and nothing is printed out.
Next (if the execution time limit is set high enough to allow long-running scripts), the script has to run to completion (~300 seconds) before the buffered data is written to the output stream. Until then, you won't see any output.
To circumvent those two problems, see this code:
<?php
// If allowed, unlimited script execution time
set_time_limit(0);

// End output buffering
ob_end_flush();

include_once("connect_to_mysql.php");

$max = 300;

// IE and Safari workaround: they will only display the webpage once it is
// completely loaded or at least 5000 bytes have been "printed".
for ($i = 0; $i < 5000; $i++)
{
    echo ' ';
}

while ($max > 0)
{
    sleep(1);
    doMyThings();
    $max--;

    // Manually flush the output buffers
    ob_flush();
    flush();
}
?>
Maybe this post is also of interest to you: Outputting exec() ping result progressively
The browser will not wait a whole 5 minutes for you to complete your queries.
You need to find a different solution. Consider executing the PHP script in CLI.
It seems that you have a timeout while executing doMyThings() 300 times.
You can try set_time_limit(0);
Set the number of seconds a script is allowed to run. If this is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in the php.ini.
When you execute long-running PHP code on the server side, you need to change the max_execution_time directive in php.ini. But the browser will not wait as long as you want, so you need to use an asynchronous technique such as AJAX.
I have a PHP script in my CodeIgniter application.
It runs on the server and fetches some data, but it stops after roughly 2 minutes.
When I run it without CodeIgniter it works properly. What may be the reason behind this?
Thanks @air4x, it works, by setting set_time_limit(300) in system/core/CodeIgniter.php:
if (function_exists("set_time_limit") == TRUE AND @ini_get("safe_mode") == 0)
{
    @set_time_limit(300);
}
After setting this, the script runs well.
Try adding this before you run your code: set_time_limit(0);
More info: http://php.net/manual/en/function.set-time-limit.php
If that doesn't work, you'll need to share what code you are running and what happens when it stops running.
There must be a time limit in effect. set_time_limit() sets the maximum execution time of a script:
bool set_time_limit ( int $seconds )
When set_time_limit() is called, it restarts the timeout counter from zero. In other words, if the default limit is 30 seconds, and after 25 seconds of script execution set_time_limit(20) is called, then the script will run for a total of 45 seconds before finishing.
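A small sketch illustrating that behavior, using a hypothetical busyWork() helper that burns CPU (note that on Linux, time spent in sleep() does not count toward the limit, so real work is needed to trigger it):

<?php
// Assume the default limit from php.ini is 30 seconds.
function busyWork($seconds) {
    // Spin so the elapsed time counts toward max_execution_time.
    $end = microtime(true) + $seconds;
    while (microtime(true) < $end) {
        // busy loop
    }
}

busyWork(25);       // 25 seconds used under the original 30-second limit
set_time_limit(20); // the counter restarts: 20 more seconds are allowed
busyWork(19);       // finishes at ~44 seconds total, just under the limit
echo "done\n";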
Please visit http://php.net/manual/fr/function.set-time-limit.php for more information.
I'm trying to create a script that creates unique codes and writes them to a textfile.
I've managed to generate the codes, and write them to the file.
Now my problem is the fact that my loop keeps running, resulting in over 92,000 codes being written to the file before the server times out.
I've done some logging, and it seems that everything works fine; it's just that after a certain number of seconds all my variables are reset and everything starts from scratch. The time interval after which this happens varies from run to run.
I've already set ini_set('memory_limit', '200M'); ini_set('max_execution_time', 0); at the top of my script. Maybe there's a PHP timeout setting I'm missing?
The script is a function in a controller; the ini_set() calls are at the beginning of this function. This is the loop I'm going through:
public function generateAction() {
    ini_set('memory_limit', '200M');
    ini_set('max_execution_time', 0);
    $codeArray = array();
    $numberOfCodes = 78000;
    $codeLength = 8;
    $totaalAantal = 0;
    $file = fopen("codes.txt", "a+");
    while (count($codeArray) < $numberOfCodes) {
        $code = self::newCode($codeLength);
        if (!in_array($code, $codeArray))
        {
            $totaalAantal++;
            $codeArray[] = $code;
            fwrite($file, 'total: ' . $totaalAantal . "\r\n");
        }
    }
    fclose($file);
}
In the file this would give something like this:
total: 1
total: 2
total: ...
total: 41999
total: 42000
total: 1
total: 2
total: ...
total: 41999
total: 42000
Thanks.
Edit: so far we've established that generateAction() is called 2 or 3 times before the end of the script, when it should only be called once.
I already found the solution to this problem.
The host's script limit was set to 90 seconds, and because this script had to run for longer, I had to run it via the command line.
Taking into account the test with uniqid(), we can say that the variables are not reset, but that the method generateAction() is called several times.
Since your code is presumably synchronous, generateAction() is called several times because the main script is called several times.
What happens in detail?
Because of the nature of your algorithm, each pass through the loop is slower than the previous one (in_array() scans a growing array), so executing generateAction() can take quite a long time.
You probably don't wait for the end, and you stop the process or even start it again from a new page. Nevertheless, the process doesn't actually stop that soon; it keeps running in the background. I've observed this behavior on my local WAMP/LAMP installation: the script is not actually stopped even if I stop the page, close the page, close the browser, or restart Apache.
So what happens to you is that several script processes are writing simultaneously to the codes.txt file.
To avoid this, you can, for example, lock the file for the duration of the loop using flock().
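A minimal sketch of that idea, wrapped around the loop above (a sketch, not a drop-in fix; the non-blocking lock makes a second overlapping process bail out instead of interleaving writes):

<?php
$file = fopen("codes.txt", "a+");

// Try to take an exclusive, non-blocking lock on the file.
if (!flock($file, LOCK_EX | LOCK_NB)) {
    // Another process is already generating codes; don't start a second run.
    fclose($file);
    exit("Generation already in progress.\n");
}

// ... the while loop that writes the codes goes here ...

flock($file, LOCK_UN); // release the lock
fclose($file);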
I have a script that takes very long to execute, so when I run it, it hits the max execution time on my web server and ends up timing out.
To illustrate, imagine I have a for loop that performs some pretty intensive manipulation one million times. How could I spread this loop's execution over several parts so that I don't hit the web server's max execution time?
Many thanks,
If you have an application that is going to loop a known number of times (i.e. you are sure it will eventually finish), you can reset the time limit inside the loop:
foreach ($data as $row) {
set_time_limit(10);
// do your stuff here
}
This solution protects you from having one runaway iteration, but lets your whole script run undisturbed for as long as you need.
The best solution is to use set_time_limit() (http://php.net/manual/en/function.set-time-limit.php) to change the timeout. Otherwise, you can use redirects to send the browser to an updated URL whenever a time threshold is reached:
$threshold = 10; // seconds of work per request
$start = microtime(true);
$i = isset($_GET['i']) ? (int) $_GET['i'] : 0;
for (; $i < 10000000; $i++)
{
    if (microtime(true) - $start > $threshold)
    {
        // Hand the current position to the next request and stop.
        header('Location: http://www.example.com/?i=' . $i);
        exit;
    }
    // Your code
}
The browser will only follow a limited number of redirects before it gives up; you're better off using JavaScript to force a page reload.
I once used a technique where I split the work from one file into three parts. It was just an array of 120,000 elements with an intensive operation per element. I created a splitter script which stored the elements in a database in chunks of 40,000. Then I created an HTML file with a redirect to the first PHP file to compute the first 40,000 elements. After computing the first 40,000 elements, there was another HTML forward to the next PHP file, and so on.
Not very elegant, but it worked :-)
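A rough sketch of that kind of HTML forward, assuming hypothetical part1.php and part2.php scripts and a computeChunk() placeholder:

<?php
// part1.php - compute the first chunk, then forward the browser to part 2.
computeChunk(1); // placeholder for processing the first 40,000 elements

echo '<html><head>';
echo '<meta http-equiv="refresh" content="0;url=part2.php">';
echo '</head><body>Part 1 done, continuing...</body></html>';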
If you have the right permissions on your hosting server, you could invoke the PHP interpreter directly and have the script run in the background.
See Asynchronous shell exec in PHP.
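A common sketch of that approach on Linux, assuming a hypothetical long_task.php and that exec() is not disabled on the host (nohup plus the output redirection keeps the child process alive after the web request ends):

<?php
// Launch the heavy script in the background and return immediately;
// redirecting output and appending & stops exec() from blocking.
exec('nohup php /path/to/long_task.php > /dev/null 2>&1 &');
echo "Import started in the background.\n";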
If you are running a script that needs to execute for an unknown length of time, you can use:
set_time_limit(0);
If possible, you can make the script handle a portion of the wanted operations at a time. Once it completes, say, 10%, you call the script again via AJAX to execute the next 10%, as sketched below. There are circumstances where this is not an ideal solution; it really depends on what you are doing.
I used this method to create a web-based crawler which only ran on my computer, for instance. If it had to do all the operations at once it would time out as well. So it was split into 200 "tasks", each called via Ajax once the previous one completed. It works perfectly, and it's been over a year since it started running (crawling?).
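A minimal sketch of such a chunked endpoint, assuming a hypothetical processItem() and a client-side poller that keeps requesting the returned offset until done (all names and sizes are placeholders):

<?php
// chunk.php?offset=0 - process one slice of the work per request.
$chunkSize = 500;
$total     = 100000; // total number of operations
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$end = min($offset + $chunkSize, $total);
for ($i = $offset; $i < $end; $i++) {
    processItem($i); // placeholder for the real work
}

// Tell the client where to resume, or that the job is finished.
header('Content-Type: application/json');
echo json_encode(array(
    'offset' => $end,
    'done'   => $end >= $total,
));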