PHP is running as an Apache module.
The script starts with: ini_set('max_execution_time', 300);
It basically connects to a database, runs a big SELECT query, and loops through the results, writing them to a file and echoing "result ok" after each write with an explicit flush();
There is no sleep() call.
This is a "test" script made from a co-worker of mine for backup purposes and is intended to run up to a few hours! I thought I was aware of script execution time limit and thought his script would time out after 300 seconds...
But it didn't !
It's invoked from a web browser. The page is left open and we can see the results being echoed in real time.
Why doesn't it time out?
Even stranger, one of the tests did issue a "Maximum execution time of 300 seconds exceeded" error, but it appeared only after at least 2 hours of execution!
What's going on here? Is there something to understand about how max_execution_time interacts with flush() or with a browser window being kept open?
As the manual page for the set_time_limit function explains, the execution time you set only affects the script itself. Time spent on database queries or any other external calls is not counted (as long as the OS is not Windows).
The only thing I can think of that might cause this is PHP running in safe mode, where the time limit cannot be changed. ini_get('max_execution_time') should tell you whether it's actually being set.
Edit: Spotted your comment above.
echo ini_get('max_execution_time'); // says 300
If 300 is reported and you're not on Windows, #mishu is likely right: your SELECT query is probably taking hours.
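A minimal sketch that illustrates the point (my own example, assuming a non-Windows host; sleep() stands in for time spent waiting on the database):

ini_set('max_execution_time', 5);
$start = microtime(true);
sleep(10); // time spent waiting outside the script does not count towards the limit
echo 'still alive after ', round(microtime(true) - $start), " seconds\n";
// By contrast, a busy loop doing actual PHP work would be killed:
// while (true) {} // -> "Maximum execution time of 5 seconds exceeded"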
Related
I have a PHP script whose runtime should be about 20-25 minutes, and approximately 15 minutes into execution it simply re-executes for no apparent reason.
I gave it a token to make sure it kills the second execution before it runs again, but that's not the solution I prefer to use in the long run.
Some details about the script:
It starts after a socket connection (the socket neither times out nor restarts at any point during the session).
I gave the PHP script ignore_user_abort(true) in order to prevent it from ending when the user goes somewhere else.
I don't get any runtime errors, or any errors at all... On the client side I get a GET error for the script after 15 minutes, which is actually the second execution attempt. The first is still running and the socket is still delivering data, though it appears the script has ended.
There are no two separate requests.
I'm not sure what the problem could be; hopefully one of the masters here could hint with a clue?
Much thanks!
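For reference, the kind of guard described above could be sketched roughly like this (my own sketch; the flock()-based lock and the file path are hypothetical):

ignore_user_abort(true); // keep running if the client goes away
set_time_limit(0);       // no PHP-side execution limit

$lock = fopen('/tmp/myscript.lock', 'c'); // hypothetical lock file path
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit('Another instance is already running');
}
// ... long-running socket work here ...
flock($lock, LOCK_UN);
fclose($lock);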
I have a script that loads a CSV via cURL; once I have the CSV, it adds each of the records to the database and, when finished, displays the total number of records added.
With fewer than 500 records it executes just fine. The problem is that whenever the number of records is too big, execution is interrupted at some point and the browser displays a download dialog for a file named like the last part of my URL, without an extension and containing nothing: no warning, error, or message of any kind. The database shows that it added some of the records, and if I run the script several times it adds a small amount more each time.
I have tried to find someone with a similar situation but haven't yet.
I would appreciate any insight into the matter; I'm not sure if this is a Symfony2 problem, a server configuration problem, or what.
Thanks in advance
Your script is probably reaching the maximum PHP execution time, which is 30 seconds by default. You can change it in the controller doing the lengthy operation with PHP's set_time_limit() function. For example:
set_time_limit(300); // 300 seconds = 5 minutes
That's more a limitation of the webserver/environment PHP is running in.
Increase max_execution_time to allow your webserver to run the request longer. An alternative would be writing a console command, since the CLI environment isn't restricted this way in most cases.
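A minimal sketch of such a console command, assuming Symfony2's Console component (the bundle namespace, command name, and importer service are hypothetical):

namespace Acme\ImportBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ImportCsvCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('acme:import-csv')
             ->setDescription('Imports the CSV without the web request time limit');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // On the CLI, max_execution_time defaults to 0 (unlimited),
        // so the long-running import is not cut short.
        $importer = $this->getContainer()->get('acme.csv_importer'); // hypothetical service
        $output->writeln(sprintf('%d records added', $importer->run()));
    }
}

It would then be run as app/console acme:import-csv instead of through the browser.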
I have a painfully slow script that gets lots of data from MySQL and builds a large report out of it, which it serves to the user at the end as application/force-download.
Long story short: on the production server it keeps terminating after about 30 seconds (quite consistently) and spitting out an empty file instead. On the development server it works fine, but it does take significantly longer to execute, about 90 seconds. Just to be 'safe', I set max_execution_time = 2000 in my php.ini file and also run set_time_limit(4000) at the beginning of my script (numbers way over the expected completion time, but just to be sure ;)).
What could be causing my web server to ignore the time limits I set and quit on me after only 30 seconds?
Edit: one thing I know for sure is that the MySQL portion of the code takes 8-9 seconds to complete, and the script successfully gets past that point every time.
Maybe PHP safe_mode is enabled; in safe mode, set_time_limit() has no effect.
Try doing a
die(ini_get('max_execution_time'));
after you have called set_time_limit(0); to see whether the value actually gets overwritten.
If it gets overwritten to 0 and your script still dies, then the cause could be somewhere else in your code.
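A quick diagnostic along those lines (a sketch; safe_mode only exists on PHP 5.3 and earlier):

var_dump(ini_get('safe_mode'));     // a truthy value means the limit is locked
set_time_limit(0);
die(ini_get('max_execution_time')); // should print 0 if the call took effect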
I've got a PHP script that queries an external API that allows a limited number of queries in a time period. To handle this, I sleep my script for 60 seconds if I get a message back that I've hit this limit, then wake up, check again, and repeat until I can get more data back from the API.
The problem seems to be that sometimes during that sleep (somewhat rarely) the script will either crash or never return. I'm not certain which yet; all I can tell is that the script doesn't pick back up and resume processing.
I'm looking for any tips or ideas on what could be happening so I have an idea of what to look for to fix this.
Thanks.
Can you post some of the code?
Since you are not sure whether the script is crashing or never returning, it is possible that another part of your script isn't hitting the sleep command and just finishes executing.
Check the max execution time in php.ini and bail out just before it would be reached:

$time = (int) ini_get('max_execution_time');
$slept = 0;
while (!dataready()) { // dataready() stands for your "can I query the API again?" check
    sleep(1);
    $slept++;
    if ($slept >= $time - 4) {
        die("script timed out");
    }
}
I have a few doubts about the maximum execution time set in php.ini.
Assuming max_execution_time is 3 minutes, consider the following cases:
I have a process which will end in 2 minutes,
but it's in a loop and has to run 5 times, so it becomes 10 minutes.
Will the script run properly without showing a timeout error? Why?
The PHP function just prints the data and takes only 2 minutes,
but the query execution takes 5 minutes.
Will the script run without error? Why?
My single PHP process itself takes 5 minutes,
but I am calling the script from the command line.
Will it work properly? Why?
How are memory allowance and execution time related?
If the execution time for a script is very high
but it returns a small amount of data,
will it affect memory or not? Why?
I want to learn what is happening internally; that is why I am asking.
I don't want to just increase the time limit and memory limit.
The rules on max_execution_time are relatively simple.
Execution time starts counting when the file is interpreted. Time needed beforehand to prepare the request, handle uploaded files, the web server doing its thing, etc. does not count towards the execution time.
The execution time is the total time the script itself spends executing, regardless of whether it's running in loops or not. So in the first case, the script will terminate with a timeout error, because that's the defined behaviour of max_execution_time. The second case depends on the platform: on Windows the query time counts towards the limit, while on other systems time spent waiting on database queries is not counted (see the next point), so the script may well finish without an error.
External system calls using exec() and the like do not count towards the execution time, except on Windows. (Source) That means you could run an external program that takes longer than max_execution_time.
When PHP is called from the command line, max_execution_time defaults to 0. (Source) So in the third case, your script should run without errors.
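You can verify that from a shell (assuming the php binary is on your PATH):

php -r 'echo ini_get("max_execution_time");' # prints 0 on the CLI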
Execution time and memory usage have nothing to do with each other. A script can run for hours without reaching the memory limit. If it does reach it, that is often due to a loop where variables are not unset and previously reserved memory is not freed properly.
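A small sketch of that last point (my own illustration, with arbitrary numbers):

set_time_limit(0);

// Long runtime, flat memory: $row is overwritten on each pass,
// so the previous string is freed.
for ($i = 0; $i < 10000000; $i++) {
    $row = str_repeat('x', 1024);
}
echo memory_get_peak_usage(true); // stays small despite the long runtime

// Accumulating every row instead grows memory until memory_limit is hit:
// $all = array();
// while (true) { $all[] = str_repeat('x', 1024); } // "Allowed memory size ... exhausted"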