I have a painfully slow script that pulls a lot of data from MySQL and builds a large report out of it, which it serves to the user at the end as application/force-download.
Long story short, on the production server it keeps terminating after about 30 seconds (quite consistently) and spitting out an empty file instead. On the development server it works fine, but it does take significantly longer to execute - about 90 seconds. Just to be 'safe', I set max_execution_time = 2000 in my php.ini file and also call set_time_limit(4000) at the beginning of my script (numbers way over the expected completion time, but just to be sure ;)).
What could be causing my web server to ignore the time limits I set and quit on me after only 30 seconds?
Edit: one thing I know for sure is that it takes the MySQL portion of the code 8-9 seconds to complete, and it successfully gets past that point every time.
Maybe it's PHP's safe_mode.
Try doing a
die(ini_get('max_execution_time'));
to read the value after you have called set_time_limit(0); and see whether it actually gets overwritten.
If it does get overwritten to 0 and your script still dies, then the cause is probably somewhere else in your code.
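A minimal diagnostic sketch along those lines (output format is just for illustration):
set_time_limit(0);                        // try to lift the limit
var_dump(ini_get('safe_mode'));           // empty if safe mode is off
die(ini_get('max_execution_time'));       // should print 0 if the override took effect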
Related
I have a PHP script whose runtime should be about 20-25 minutes, and approximately 15 minutes into execution it simply re-executes for no apparent reason.
I gave it a token to make sure it kills the second execution before it runs again, but it's not the solution I prefer to use in the long run.
Some details about the script:
It is triggered by a socket connection (the socket does not time out or restart at any point during the session).
I gave the PHP script ignore_user_abort(true) to prevent it from ending when the user navigates away.
I don't get any runtime errors, or any errors at all... On the client side I get a GET error for the script after 15 minutes, which is actually the second execution attempt. The script is still running and the socket is still delivering data, even though it appears the script has ended.
There are not two separate requests.
I'm not sure what the problem could be; hopefully someone here can offer a clue.
Many thanks!
PHP is running as an Apache module.
The script starts with: ini_set('max_execution_time', 300);
What it does is basically connect to a database, run a big SELECT query, and loop through the results, writing them to a file and echoing a "result ok" after each write with an explicit flush();
There is no sleep() call.
This is a "test" script made by a co-worker of mine for backup purposes and is intended to run for up to a few hours! I thought I understood the script execution time limit and expected his script to time out after 300 seconds...
But it didn't !
It's invoked from a web browser. The page is left open and we can see the results being echoed in real-time.
Why doesn't it time out?
Even stranger, one of the tests issued a "Maximum execution time of 300 seconds exceeded" error, but it appeared only after at least 2 hours of execution!
What's going on here? Is there some interaction between max_execution_time and flush(), or the browser window being kept open?
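For reference, the loop described above would look roughly like this (table, file name, and credentials are made up for illustration):
ini_set('max_execution_time', 300);
$db = new mysqli('localhost', 'user', 'pass', 'backup_db');   // hypothetical credentials
$result = $db->query('SELECT * FROM big_table');              // the long-running SELECT
$fh = fopen('/tmp/backup.csv', 'w');
while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);
    echo "result ok\n";
    flush();                                                   // push output to the browser right away
}
fclose($fh);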
As you can see on the manual page for the set_time_limit function, the execution time you are setting only affects the script itself. The time spent on database queries or any other external calls is not counted (unless the OS is Windows).
The only thing I can think of that may cause this is if PHP is running in safe mode. ini_get('max_execution_time') should tell you if it's actually being set.
Edit: Spotted your comment above.
echo ini_get('max_execution_time'); // says 300
If 300 is reported, and you're not on Windows, #mishu is likely right: your SELECT query is probably taking hours.
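A quick way to see this behaviour for yourself on Linux (connection details are made up):
set_time_limit(5);
$db = new mysqli('localhost', 'user', 'pass', 'test');   // hypothetical credentials
$db->query('SELECT SLEEP(60)');                          // 60 seconds spent inside MySQL, outside the PHP timer
echo 'still alive';                                      // reached even though wall-clock time far exceeded 5 seconds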
I have a script that updates my database with listings from eBay. The amount of sellers it grabs items from is always different and there are some sellers who have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script, but it still keeps failing at about the 45-second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script and look for particularly long operations (> 45 seconds), for example by timing them with microtime() as sketched below. Try to break those operations into smaller chunks.
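A very simple way to time a suspect operation (purely illustrative):
$start = microtime(true);
// ... one suspect operation, e.g. the call out to eBay or a big query ...
$elapsed = microtime(true) - $start;
error_log(sprintf('operation took %.2f seconds', $elapsed));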
Well, as it turns out, I had overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process via mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from shell and CRON works just fine.
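A small guard like the following (hypothetical) makes that distinction explicit and would have surfaced the problem sooner:
if (php_sapi_name() !== 'cli') {
    die('Run this importer from the shell or cron, not through the web server.');
}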
I have a script that loads a CSV via cURL; once it has the CSV, it adds each of the records to the database and, when finished, displays the total number of records added.
With fewer than 500 records it executes just fine. The problem is that whenever the number of records is too big, execution is interrupted at some point and the browser displays the download dialog with an empty, extension-less file named after the last part of my URL. There is no warning, error, or any kind of message. The database shows that it added some of the records, and if I run the script several times it adds a few more each time.
I have tried to find someone with a similar situation but haven't yet.
I would appreciate any insight into the matter; I'm not sure whether this is a Symfony2 problem, a server configuration problem, or something else.
Thanks in advance.
Your script is probably hitting the maximum PHP execution time, which is 30 seconds by default. You can change it in the controller that performs the lengthy operation with PHP's set_time_limit() function. For example:
set_time_limit (300); //300 seconds = 5 minutes
That's more a limitation of the web server/environment PHP is running in.
Increase max_execution_time to allow your web server to run the request longer - an alternative would be writing a console command, since the CLI environment isn't restricted in most cases.
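A rough skeleton of such a console command (the class name and command name are invented; the import logic would go inside execute()):
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ImportCsvCommand extends Command
{
    protected function configure()
    {
        $this->setName('app:import-csv');      // invented command name
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ... fetch the CSV with cURL and insert the records here ...
        $output->writeln('done');
    }
}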
I have a massive amount of data that needs to be read from MySQL, analyzed, and, based on the results, split up and stored in new records.
Five records take about 20 seconds, but the records vary in length so I can't really estimate how long the program will take. However, I have calculated that the process should not take much longer than 5 hours, so I'd like to run it overnight and feel quite sure that when I come back to the office the next morning the program is done.
Assuming the code is fail-safe (I know, right ;), how should I set up the Apache / PHP / MySQL settings so that when I execute the script I can be sure the program will not time out and/or run out of RAM?
(It basically runs in a loop fetching sets of 100 rows until there are none left, so I am hoping the fact that the variables are reset at the beginning of each iteration will keep memory usage constant.)
The actual size of the database when dumped is 14 MB, so the volume of data is not that high.
(On a side note, it might also be that I haven't assigned enough resources in the server settings, and maybe that's why it takes 20 seconds to run 5 records.)
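The loop described above might look roughly like this (connection details and table name are made up):
set_time_limit(0);
$pdo = new PDO('mysql:host=localhost;dbname=report', 'user', 'pass');   // hypothetical connection
$offset = 0;
while (true) {
    $rows = $pdo->query('SELECT * FROM records LIMIT 100 OFFSET ' . (int) $offset)
                ->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break;                       // nothing left to process
    }
    // ... analyze each row, split it up and insert the new records here ...
    $offset += count($rows);
    unset($rows);                    // release the batch before fetching the next one
}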
Make sure you have removed any max_execution_time limits by setting this to 0 (unlimited) in your PHP.ini or by calling set_time_limit(0). This will ensure that PHP doesn't stop the script mid-execution.
If at all possible, you should run the script from the CLI so that you don't have to worry about Apache timing your request out (it shouldn't, but it might).
Since you are working with only about 15 MB of data, I wouldn't worry about memory usage (128 MB is the default in PHP). If you are really worried, you can remove memory limits in PHP by setting memory_limit to either a higher number or -1 (unlimited memory).
Keep in mind modifying the PHP.ini will affect all scripts that are interpreted by that installation. I prefer to use the appropriate ini setting functions at the top of my scripts to prevent dangerous global changes.
On a side note: This doesn't really sound like a job for PHP. I'm not trying to discourage your use of PHP here, but there are other languages that are better suited for command line usage.
Better to make your script exit and then restart it, storing the point where it left off last time. This ensures you do not have memory leaks, that the script does not run out of memory due to some error in garbage collection, and that execution continues after an unexpected failure.
A simple shell command would be:
while [ 1 ]; do php myPhpScript.php a; done
You can add other checks to ensure it keeps running properly.
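A rough sketch of the checkpoint idea (the file name, table, and column are invented):
$checkpointFile = __DIR__ . '/last_id.txt';
$lastId = is_file($checkpointFile) ? (int) file_get_contents($checkpointFile) : 0;
$pdo = new PDO('mysql:host=localhost;dbname=report', 'user', 'pass');   // hypothetical connection
$rows = $pdo->query('SELECT * FROM records WHERE id > ' . $lastId . ' ORDER BY id LIMIT 100')
            ->fetchAll(PDO::FETCH_ASSOC);
if (!$rows) {
    exit(0);                                          // nothing left to process in this run
}
foreach ($rows as $row) {
    // ... process one record ...
    file_put_contents($checkpointFile, $row['id']);   // remember how far we got
}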
I'd like to point out that scripts run via the CLI in PHP default to having no time limit, unlike scripts run through CGI, mod_php, etc.
And as stated avoid running this via Apache.
However, if you MUST do this, consider breaking it down. You can make a page that processes 5-10 results, appends to the dump file, then prints either a meta refresh or some JavaScript to reload the page with a parameter telling it where it's up to, and continues until done.
Not recommended though.
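A very rough sketch of that approach (the offset parameter and the process_next_chunk() helper are invented):
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$done = process_next_chunk($offset, 10);   // hypothetical helper: handles 10 rows, returns true when finished
if ($done) {
    echo 'All done.';
} else {
    echo '<meta http-equiv="refresh" content="1;url=?offset=' . ($offset + 10) . '">';
}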
Adding to some of the other good options here, you might want to look at http://www.electrictoolbox.com/article/php/process-forking/ and also consider sending some requests to /dev/null if you don't need them to give back feedback.
Don't do this using a web interface. Run it from the command line, but also look at whether your code can be optimised, or set break points and do it in "chunks".
First of all, put set_time_limit(0); at the beginning of the script (see http://php.net/manual/en/function.set-time-limit.php).
As for memory, take care of it by unsetting any variables, arrays, or references that you do not need on each iteration.
Better to run the script from the shell (CLI) or as a cron job.
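One way to sanity-check that unsetting keeps memory flat between iterations (fetch_next_batch() is a hypothetical stand-in for your own fetch logic):
while ($rows = fetch_next_batch()) {                                 // hypothetical helper
    // ... process the batch ...
    unset($rows);                                                    // drop the batch before the next fetch
    error_log('memory in use: ' . memory_get_usage(true) . ' bytes');
}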
As far as I know MySQL connections do not time out, so you should be safe by setting:
php_value max_execution_time X
in a .htaccess file, or by placing set_time_limit(X) at the beginning of your script, where X is a comfortable value in seconds.