I have a WordPress blog, and there is a plugin that does around 20,000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that max_execution_time was 30 seconds, so I increased it to 90, but the request still times out at 30 seconds (I even logged what ini_get('max_execution_time') returns, and it says "30"). Then I checked whether there are any Apache directives that limit request time and found the "TimeOut" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60 and I increased it to 90 as well, but the problem persists: the request times out after 30 seconds, just as it did before I changed anything.
As a note: I restart the server after every modification.
By modifying your PHP Settings
That's not always easy, as you need access to your server, or a way to change your PHP settings. If you have access to your php.ini, look for the max_execution_time directive and set it to the number of seconds you would like, for example 60 seconds.
max_execution_time = 60
If that doesn't work, or you can't access your php.ini, you can also try to set this value in the .htaccess file at the root of your WordPress install by adding this line:
php_value max_execution_time 60
If you set the value to 0 (instead of 60, here), the process will be allowed to run forever. Don't do this; you will run into much bigger issues that are extremely difficult to resolve.
By calling a function in PHP
Doing this is generally not recommended, but if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could add this call in your theme, in functions.php for example:
set_time_limit(100);
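If you prefer not to raise the limit for every request, a minimal sketch of a more targeted approach (the action name my_bulk_insert is a placeholder, not something from your plugin):
// functions.php: raise the limit only for the one long-running request.
// 'my_bulk_insert' is a hypothetical action name; use whatever request
// your plugin actually makes.
add_action( 'init', function () {
    if ( isset( $_REQUEST['action'] ) && 'my_bulk_insert' === $_REQUEST['action'] ) {
        set_time_limit( 100 );
    }
} );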
Related
I'm trying to copy a 10GB file to another directory on my local disk using this code:
Storage::copy( 'file/test.txt', 'file2/dest.txt' );
But when I check the destination path, only 1.7GB out of the 10GB has been copied.
It didn't show any timeout errors at all.
Is there any workaround for this?
As long as a PHP script is invoked by a web request, malicious code could send the system into an infinite loop, or an attacker could make a script run for a very long time (denial of service).
To prevent this, PHP ships with default settings that stop a script from running for too long: it aborts execution if a certain amount of time passes without the script returning a response.
Usually these limits are sufficient; having to copy a 10 GB file within a single script run is an unusual case.
I suppose you have a regular HDD with a speed of about 60 MB/s. With an execution time of 30 seconds, it will copy about 1800 MB, which matches what you are seeing.
Temporary fix
10,000 MB / 60 MB/s ≈ 167 seconds; I went with 180 seconds to leave a margin.
In your php.ini, in the Resource Limits section, there are parameters for the timeout.
Uncomment them and set them to 180 seconds:
max_execution_time = 180
max_input_time = 180
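If raising the limits is not an option, one alternative sketch (not part of the answer above, and assuming the files live on Laravel's default local disk under storage/app) is to copy the file in chunks and reset the timer as you go:
// Minimal sketch: stream the copy in chunks; set_time_limit() restarts
// the timeout counter each time it is called, so no single chunk can
// hit the limit. Paths assume Laravel's default local disk.
$src  = fopen(storage_path('app/file/test.txt'), 'rb');
$dest = fopen(storage_path('app/file2/dest.txt'), 'wb');

while (!feof($src)) {
    set_time_limit(30);                          // reset the timer per chunk
    fwrite($dest, fread($src, 8 * 1024 * 1024)); // copy 8 MB at a time
}

fclose($src);
fclose($dest);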
I need to run a really slow PHP/MySQL script once, on my local server.
The problem is that Laravel times out after 60 seconds with the message "Maximum execution time of 60 seconds exceeded".
I have set
max_execution_time = 360
and
max_input_time = 360
in my php.ini. The settings are there (checked phpinfo()) but Laravel still times out after 60 seconds. Is there anything in Laravel that I can set as well?
I don't think Laravel will override PHP settings. After changing the settings in your ini file, you have to restart the server for them to take effect.
So check whether you restarted your server after changing the settings.
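If a restart alone does not help, a fallback (my suggestion, not something Laravel requires) is to raise the limit from the script itself, just for that one run:
// Placed at the top of the slow script or route: affects only the
// current request, without touching php.ini.
ini_set('max_execution_time', 360);
set_time_limit(360);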
I'm calling an MLS service that responds with 4000+ records ... and I need to process each and every one of them, as well as insert all of the meta data per listing.
I'm able to get to about 135 (* 150 meta records) and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help. Do I need to process the data in chunks, or is there another way to ensure that the script will indeed finish?
You should probably enable display_errors and error_reporting to get a better analysis of why the script isn't processing.
However, you should also consider making sure the time limit isn't being hit by calling:
set_time_limit( 0 );
This will give you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes)
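Combining both suggestions, a minimal debugging header for the top of the import script could look like this (the values are only examples):
// Surface errors and lift the time limit while investigating.
ini_set('display_errors', '1');
error_reporting(E_ALL);
set_time_limit(0); // or something relatively high, e.g. 600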
It isn't the memory; it's most likely the script execution time.
Try adding this to your .htaccess, then restart Apache:
php_value max_execution_time 259200
I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes less. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says is the sensible way round it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limits on cpu time.
If your cpu is limited, you might have to process your import in batches.
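As a rough sketch of what batching could look like (fetchRecords() and importRecord() are placeholders for your own code, not an existing API), run one batch per process so no single run exceeds the CPU limit:
// Invoked as e.g. `php import.php 3` to process batch number 3.
$batch = isset($argv[1]) ? (int) $argv[1] : 0;
$size  = 500;

foreach (fetchRecords($batch * $size, $size) as $record) {
    importRecord($record); // placeholder for the per-record insert work
}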
First, you should be running the script from the command line if it's going to take a while. At the very least your browser would time out after about 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-Type: text/html") before the import kicks off.
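For example (a sketch of the browser-run variant; not needed if you use the command line):
// Send output early and flush it so the connection is not dropped
// for being silent while the import runs.
header('Content-Type: text/html');
echo "Import started...<br>\n";
flush();
// ... long-running import, echoing and flushing progress periodically ...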
If you are on a shared host, then it's possible there are restrictions on the system when any long running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for non-web running scripts. Thus, running it from the command line would help.
The 45 seconds could be a coincidence; it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like this:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What DB server are you using?
For me, mssql times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all, max_input_time and set_time_limit(0) will generally only work on a VPS or dedicated server. Instead, you can follow some rules in your implementation, like below (a sketch of the server side follows these steps).
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint with 10 entries each time, and echo something to the browser after each call. With this method your script will never time out.
Repeat until all the CSV rows are finished.
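A minimal sketch of what the server side of such an endpoint might look like (import_chunk.php, the parameter names, and insertRow() are all placeholders, not an existing API):
// import_chunk.php: each AJAX call imports a small slice of the CSV,
// so no single request runs long enough to time out.
$offset = isset($_POST['offset']) ? (int) $_POST['offset'] : 0;
$limit  = 10;

$handle = fopen(__DIR__ . '/upload.csv', 'rb');
for ($i = 0; $i < $offset && fgetcsv($handle) !== false; $i++) {
    // skip rows that earlier calls already imported
}

$imported = 0;
while ($imported < $limit && ($row = fgetcsv($handle)) !== false) {
    insertRow($row); // placeholder for the actual DB insert
    $imported++;
}
fclose($handle);

header('Content-Type: application/json');
echo json_encode([
    'imported'   => $imported,
    'nextOffset' => $offset + $imported,
    'done'       => $imported < $limit,
]);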
How can I rewrite this into a cron job that will run every day for longer than 30 seconds? Also, do I need to edit the .htaccess or php.ini file in the cron.php directory to say something? Over the browser it runs just fine for longer than 30 seconds; over the shell, it runs just fine too. But as a cron task, it dies after 30 seconds. I'm on 1and1 shared hosting.
0 12 * * * php5 /this/is/the/file/cron.php
There are several things that could be terminating your script. One could be the maximum execution time set in the php.ini file. If that's the case, you can override it in your script with set_time_limit(0); where zero means no limit and any number greater than zero is the number of seconds to allow the script to run for before being terminated. It's important to note that this time does NOT include the time it takes for the browser to make the request, so file upload time wouldn't count here.
If you're in a shared hosting environment (like Dreamhost), they have process watchers that will kill off any PHP process after a set time limit. You cannot get around these. You would need to contact the hosting provider to see what you need to do to get access to run the script for longer (for Dreamhost, they want you to use their PS offering).
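For example, at the top of cron.php (a sketch; the exact values are up to you):
// Lift the PHP limit for this run only: 0 means no limit,
// any positive number is a cap in seconds.
set_time_limit(0);
ignore_user_abort(true); // keep running even if the caller disconnects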
Use this syntax to start php:
php -c /path/to/another/php.ini /this/is/the/file/cron.php
Then you can specify a different timeout (or no timeout) in a different php.ini file.
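The alternate php.ini (the path is just the placeholder from the command above) only needs to override the limit, for example:
; /path/to/another/php.ini
max_execution_time = 0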
ini_set('max_execution_time', 600);
Add this to the top of your PHP file and it will be allowed to run for up to 600 seconds. Anything more is not recommended, but you can have a go if you want.
You don't need to set a higher max_execution_time if you use PHP CLI:
CLI SAPI default value for "max_execution_time" is set to unlimited.
http://nl3.php.net/manual/en/features.commandline.differences.php
I would just use wget http://path.to.myscript.php
If it's dying after 30 seconds you may need to set max_execution_time = 60 in your php.ini to allow the script to run longer than 30 seconds.
You could also use ini_set('max_execution_time', 60)
But as the manual page says, in some cases (i.e. running in safe mode) this will have no effect at all: http://uk.php.net/manual/en/info.configuration.php#ini.max-execution-time
It is also possible that the php.ini used for the command line has different max_execution_time values than the one used for the browser. I have seen that happen sometimes.
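You can check which value the CLI is actually using with a quick one-liner:
php -r 'echo ini_get("max_execution_time"), PHP_EOL;'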