I have made a queue using MySQL and PHP. The PHP script first retrieves all the tasks to be done from the database and then executes them one by one in a loop. But since there are many tasks and each task requires a lot of time, the result is a 'Maximum execution time exceeded' error.
How can I fix this? Please don't suggest editing php.ini. I tested this in my browser, but the PHP script will be invoked by cron.
You can put this line at the top of your PHP code:
ini_set('max_execution_time', 0);
Or create a .htaccess file in the same folder with this line in it:
php_value max_execution_time 0
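When the script runs under a web server it can be worth confirming that the change actually took effect, since ini_set() can silently fail (for example when the directive is locked down by the host). A minimal sketch:

```php
<?php
// Lift the limit at the top of the script, then confirm it took effect;
// ini_set() returns false when the change was rejected.
if (ini_set('max_execution_time', '0') === false) {
    fwrite(STDERR, "Could not change max_execution_time\n");
}
echo ini_get('max_execution_time'), "\n"; // prints 0 once the change applies
```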
PHP has a function called set_time_limit() (and a config option, max_execution_time) that lets you specify a time limit; the script exits when the limit is reached. However, this limit does not count time spent in external calls. That means that if (for example) the PHP script makes a database call that does something dumb and runs for hours, the time limit will never trigger.
If I want to guarantee that my PHP script's execution time will not exceed a specific time, is there some way to do this? I'm running the script directly in PHP (rather than via apache or similar), so I'm considering writing a bash script to monitor ps and kill it after a certain time. It occurs to me that I can't be the first person with this problem though, so is there some already-built solution available for this?
Note: This question is NOT about the PHP error from exceeding its max execution time. This is about the case where PHP exceeds a time limit without triggering that error. This question is NOT just about limiting the time of a PHP script; it is about limiting the time allowed to a PHP script making external calls.
When running PHP from the command line, the default "max_execution_time" setting is 0 (no limit).
You can try something like the following to solve your problem.
$cmd = "(sleep 10 && kill -9 " . getmypid() . ") > /dev/null &";
exec($cmd); // issue a shell command to force-kill this process in 10 seconds

$i = 0;
while (1) {
    echo $i++ . "\n";
    sleep(1); // simulating a long-running query
}
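The same kill trick can be wrapped in a small helper so the timeout and PID aren't hard-coded. The watchdogCommand() name below is just an illustration, and it assumes a POSIX shell with sleep and kill available:

```php
<?php
// Build a shell command that force-kills a process after a delay.
// Run the result with exec() to start the watchdog in the background.
function watchdogCommand(int $seconds, int $pid): string {
    return sprintf('(sleep %d && kill -9 %d) > /dev/null 2>&1 &', $seconds, $pid);
}

// Usage (commented out so this sketch doesn't kill itself):
// exec(watchdogCommand(10, getmypid()));
```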
I want to insert about 19,000 records into a MySQL database with PHP, but whether I load the page or run it as a cron job, only about 2,000 get inserted. How can I insert all the data?

foreach ($inventories as $item) {
    $this->model->AddInventory($item);
}
A cron job and a page load are basically the same thing; the cron job just hits the URL for you after a defined interval of time.
In your case the likely cause is a PHP execution timeout (the default execution time is 30 seconds). You now have two options:
increase max_execution_time in the php.ini file
execute your script via the command line
I would recommend using the command line; altering max_execution_time is not the right approach.
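Besides moving to the command line, the loop itself can be made much faster by batching the inserts instead of issuing one query per item. This sketch uses an in-memory SQLite database and made-up table/column names to stay self-contained; with MySQL the same multi-row INSERT pattern applies:

```php
<?php
// Batch inserts: one multi-row INSERT per chunk instead of one query
// per item. The table, column, and $inventories data are illustrative.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE inventory (name TEXT)');

$inventories = array_map(fn($i) => "item-$i", range(1, 500));

foreach (array_chunk($inventories, 100) as $chunk) {
    // Build "(?),(?),..." with one placeholder per row in this chunk.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO inventory (name) VALUES $placeholders");
    $stmt->execute($chunk);
}

echo $pdo->query('SELECT COUNT(*) FROM inventory')->fetchColumn(), "\n"; // 500
```

Fewer round-trips to the database means far less time per record, which matters when the script has a 30-second budget.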
I made a script that shouldn't return anything to the browser (no echo, print, or interruptions of the code with blank space, like ?> <?), and that uses ignore_user_abort(true); so that the process does not stop once the browser window is closed.
Thus once the script is launched, it should run to the end.
The script is designed for a newsletter and sends one email every 5 seconds through mail(); to respect my provider's spam policies.
That said, what's happening is that after about 20 minutes of work (the total number of emails is 1002), the script "collapses", with no error returned.
Hence my question: is there a lifetime limit for scripts running with ignore_user_abort(true);?
EDIT
Following Hanky's suggestion (below) I added the line:
set_time_limit(0);
but the issue persists.
So whilst ignore_user_abort(true); will prevent the script stopping after a visitor browses away from a page, it is set_time_limit(0); that will remove the time limit. You can also change the PHP memory_limit in your php.ini or by setting something like php_value memory_limit 2048M in your .htaccess file.
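Putting those two calls together, a sketch of the mailing loop might look like this. The recipient list and the sendBatch() helper are made up for illustration, and the real mail() call is left commented out:

```php
<?php
ignore_user_abort(true); // keep running if the browser disconnects
set_time_limit(0);       // remove PHP's execution time limit

// Send one mail per recipient, pausing between sends to respect the
// provider's rate limit (the question uses a 5-second delay).
function sendBatch(array $recipients, int $delaySeconds): int {
    $sent = 0;
    foreach ($recipients as $to) {
        // mail($to, 'Newsletter', $body); // real send omitted in this sketch
        $sent++;
        sleep($delaySeconds);
    }
    return $sent;
}

echo sendBatch(['a@example.com', 'b@example.com'], 0), "\n"; // prints 2
```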
In order to list the default max_execution_time you can run echo ini_get('max_execution_time'); (seconds) or echo ini_get('memory_limit'); (megabytes).
This being said, it sounds like your PHP script is better suited to being run from the CLI. From what you have described, the script doesn't really need to serve anything to the web browser, so this method fits your usage: it is better for PHP scripts that run a background process than for scripts that return a front-end to the user.
You can run a file from the command line simply by running php script.php or php -f script.php.
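For a script like this, the cron entry would point at the PHP binary rather than a URL; something along these lines, where the schedule, paths, and log file are placeholders:

```
# Run every Sunday at 03:00; paths are illustrative
0 3 * * 0 /usr/bin/php -f /path/to/script.php >> /var/log/script.log 2>&1
```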
Initially there was no way to solve the issue, and the provider is still investigating.
Meanwhile, following your suggestions, I was able to make it run. I created a TEST file and ran it to verify:
exec("/php5.5/bin/php -f /web/htdocs/www.mydomain.tld/home/test.php > /dev/null 2>&1 &");
It worked. I set up a sleep(600); and sent 6 emails, plus one that informs me when the process has really finished.
It runs transparently all the way to the end.
Thank you so much for your support
I am trying to run a PHP file through a cron job. It starts running well, but after a certain time period the server terminates its execution.
Because of that I can't get my desired output. I am writing the output to a text file: after the cron starts, it stores some output in the text file, but the process is terminated before execution completes.
I also call the mail function at the beginning and at the end of the file, but I only ever get the beginning message.
I set max_execution_time and memory_limit to unlimited and checked the settings with phpinfo().
Everything is OK there, but the file does not complete its execution. I am able to run the file through the browser, but it takes a very long time.
So I must do it another way, like cron. If anyone can provide a better solution in this regard, I will be grateful.
Turn off safe_mode in php.ini and remove the maximum execution time limit by adding
set_time_limit(0);
at the beginning of your script.
php set_time_limit function
I want to run a PHP script weekly using a cron job; however, the script may take a few minutes or more.
Is there any way I can allow a greater max_execution_time just for this script?
You don't need to set a higher max_execution_time if you use PHP CLI: http://nl3.php.net/manual/en/features.commandline.differences.php
Maybe you should try these answers:
How do you get a Cronjob executing a PHP script to run longer than 30 seconds.
PHP command line: max_execution_time and memory_limit
But of course, using ini_set('max_execution_time', 60) as the first PHP line in your job's script should do the trick.
You can use set_time_limit(). If you want to disable timeout overall, pass it 0 as an argument. Otherwise pass it the number of seconds of max execution time.
You can use set_time_limit(0) at the start of your code: this removes the execution time limit altogether for this script. Note that it means that the script could run "forever", so put some checks in place in case it hangs.
set_time_limit
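As a closing sketch, the unlimited-time approach and the "put some checks in place" advice can be combined like this; the task queue and the cap value are illustrative:

```php
<?php
// Remove PHP's time limit, but keep our own safety cap so the script
// cannot loop forever if the queue misbehaves.
set_time_limit(0);

$maxTasks = 10000;     // illustrative safety cap
$queue = range(1, 25); // stand-in for tasks fetched from a database
$processed = 0;

foreach ($queue as $task) {
    if ($processed >= $maxTasks) {
        fwrite(STDERR, "Safety cap reached, aborting\n");
        break;
    }
    // ... process $task here ...
    $processed++;
}

echo $processed, "\n"; // prints 25
```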