PHP Max execution time is 0 but still times out - php

I have a PHP script that continues to time out at 45 seconds. I have done everything I can find with execution time and currently have the following.
In my php.ini:
max_execution_time = 0
In my .htaccess:
# set timeout to 10 minutes
<IfModule mod_php5.c>
php_value max_execution_time 600
</IfModule>
And in the script that times out:
ini_set("memory_limit", "-1");
set_time_limit(0);
In the script, when I echo ini_get('max_execution_time') I get 0, so it looks like everything is right. Are there other resource limits at play that are keeping the script from running? I've researched memory_limit, max_input_time, etc. but am thinking there's something here I don't know about.
The script does a while loop against a table and then crawls different sites according to each record. When I limit the return to 1 or 2 records it works fine, but with any more than that it goes to a 404 Page Not Found. To me this means it times out, but does the 404 error indicate something else is going on? Thanks

A 404 error does not mean your script timed out; it just means that the URLs you hit were Not Found.
You need to evaluate your script and maybe send a HEAD request to check the status code.
See here for a list of HTTP status codes and their meanings.
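For example, a rough sketch of such a HEAD check with cURL (the URL here is just a placeholder) could look like this:
<?php
// Sketch: send a HEAD request and report the HTTP status code.
$ch = curl_init('https://example.com/page-to-crawl');
curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request, skip the body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
echo "HTTP status: $status\n"; // 404 here means the page itself is missing, not a timeout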

Related

Can I increase Apache's TimeOut directive for a specific PHP script?

I have a PHP script that will be executed by requests from the application admins. It does lots of stuff and takes at least 20 minutes (depending on the database size).
The Apache TimeOut directive is set to 300 (5 minutes), which closes the connection and returns a 500 status code if my PHP script takes longer than that to execute.
Setting the PHP ini max_execution_time to a long value for this script alone is useless:
<?php
// long script
ini_set("max_execution_time", 3600);// 1 hour
// Apache still responses with the same "Connection: close" header and 500 status code
And I don't want to raise the Apache TimeOut directive globally just for those couple of scripts, because if I did, any request would be allowed to take a very long time, which opens up scope for DoS vulnerabilities. Is this right?
Is there any way to allow only this script to run longer at the Apache level?
Have you tried PHP's set_time_limit() function?
https://www.php.net/manual/en/function.set-time-limit.php
In addition to setting the initial execution time, the manual says that each call resets the time already expended to zero and starts the counter again from the limit provided.
So if you want to be sure, you could call set_time_limit(0) (0 == no limit) regularly throughout your script to make sure you never hit a limit (even though you're already setting an unlimited time by passing in 0).
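As an illustration only ($records and process_record() are placeholders for your real loop), that could look like:
<?php
set_time_limit(0);               // no limit to start with
foreach ($records as $record) {
    set_time_limit(0);           // restart the counter on every iteration
    process_record($record);     // placeholder for the real work
}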

WordPress times out at 30 seconds

I have a WordPress blog and there is a plugin that does around 20,000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that PHP's max_execution_time was 30 seconds, so I increased it to 90, but the request keeps timing out at 30 seconds (I even logged what ini_get('max_execution_time') returns and it says "30"). Then I checked whether there are any Apache directives that limit request time and found that there is a "TimeOut" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60 and I increased it to 90 as well, but the problem persists: the request times out after 30 seconds, just as it did before I changed anything.
As a note: I restart the server after any modification.
By modifying your PHP Settings
That’s not easy, as you need to have access to your server, or a way to change your PHP settings. If you have access to your php.ini, you need to look for the max_execution_time variable and set it to the number of seconds you would like, 60 seconds for example.
max_execution_time = 60
If that doesn't work, or you can't access your php.ini, you can also try setting this variable in the .htaccess file (at the root of your WordPress install) by adding this line:
php_value max_execution_time 60
If you set the value to 0 (instead of 60 here), the process will be allowed to run forever. Don't do this; you will run into much bigger issues that are extremely difficult to resolve.
By calling a function in PHP
Doing this is generally not recommended, but if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could place this call in your theme, in functions.php for example:
set_time_limit(100);
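If you would rather not raise the limit for every page load, a possible sketch (assuming a made-up long_import query flag that marks the slow request) would be:
// In functions.php: raise the limit only for the flagged request.
add_action('init', function () {
    if (isset($_GET['long_import'])) {
        set_time_limit(600); // 10 minutes, only when the flag is present
    }
});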

cURL loop giving a 500 response

So I have a script that loops through multiple cURL calls. After about 7-9 minutes it randomly stops execution. I have set the .user.ini file to adjust these settings:
max_execution_time = 30000
max_input_time = 200
I believe I have FastCGI but can't for the life of me figure out why this keeps dying on me. I have a submit form on the front end and I just get a 500 when it dies, with nothing in the error log. Anything else I could be missing? Some PHP setting somewhere limiting the number of cURL calls or the execution time?
EDIT: So this issue was definitely FastCGI limiting my time with the "FcgidBusyTimeout" parameter. My hosting company upped it for me as a test and everything worked great. The issue now is that because I'm on shared hosting they won't raise FastCGI timeouts for people. I'm going to try to loop my script onto itself (kind of like a function loop where it calls itself again) and see if the new processes get me past the timeout issue.
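A rough sketch of that self-chaining idea (import.php, $batchSize and fetch_batch() are all made-up names): each request handles one slice of the work and then reloads itself at the next offset, so no single request outlives the FastCGI timeout.
<?php
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batchSize = 25;
$rows = fetch_batch($offset, $batchSize); // assumed helper returning up to $batchSize rows
foreach ($rows as $row) {
    // ... the cURL call for this row ...
}
if (count($rows) === $batchSize) {
    // More work left: have the browser reload this script at the next offset.
    echo '<meta http-equiv="refresh" content="0;url=import.php?offset=' . ($offset + $batchSize) . '">';
} else {
    echo 'Done.';
}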
FastCGI has its own timeout.
<IfModule mod_fcgid.c>
IPCConnectTimeout 20
IPCCommTimeout 120
FcgidBusyTimeout 200
</IfModule>
So even if your PHP timeout is high enough, it's possible that your FastCGI process was killed after that time.
If you have heavy scripts, it's better to run them over the CLI; then only the PHP timeout applies.
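As a small illustration of the CLI route, a script can check which SAPI it is running under and lift the PHP limits explicitly when started with php script.php (a sketch only; the CLI already defaults to no execution time limit):
<?php
if (php_sapi_name() === 'cli') {
    set_time_limit(0);             // explicit, though the CLI default is already 0
    ini_set('memory_limit', '-1');
}
// ... heavy cURL loop here, with no Apache/FastCGI timeout involved ...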

PHP: Processing thousands of entries, script dies after certain amount

I'm calling an MLS service that responds with 4000+ records ... and I need to process each and every one of them, as well as insert all of the meta data per listing.
I'm able to get to about 135 (* 150 meta records) and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help at all. Do I need to process chunks of the data at a time, or is there another way to ensure that the script will actually finish?
You should probably enable display_errors and error_reporting to get a better analysis of why the script isn't processing.
However, you should also consider making sure the time limit isn't being hit by calling:
set_time_limit( 0 );
This will give you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes)
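Putting those two suggestions together, the top of the script could look like this sketch:
<?php
ini_set('display_errors', '1');
error_reporting(E_ALL);   // surface every notice, warning and error
set_time_limit(0);        // or something high like 600 for a 10-minute cap
// ... fetch and process the 4000+ records here ...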
It isn't the memory; it's most likely the script execution time.
Try adding this to your htaccess, then restart apache:
php_value max_execution_time 259200

PHP Script Times out after 45 seconds

I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes less. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says, is the sensible way round it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 Internal Server Error (as I found when rebuilding my phpBB search index), then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limits on CPU time.
If your CPU time is limited, you might have to process your import in batches.
First, you should be running the script from the command line if it's going to take a while. At the very least, your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-type: text/html") before the import kicks off.
If you are on a shared host, then it's possible there are restrictions on the system where any long-running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for non-web scripts, so running it from the command line would help.
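If it has to stay in the browser, one variation of the header() advice above (a sketch only; $batches and import_batch() are placeholders) is to flush a little output after each chunk so the browser never sits waiting for the first byte:
<?php
header('Content-type: text/html');
foreach ($batches as $i => $batch) {
    import_batch($batch);          // placeholder for the real import work
    echo "Processed batch $i<br>\n";
    if (ob_get_level() > 0) {
        ob_flush();                // flush PHP's output buffer if one is active
    }
    flush();                       // ask the server to send the output now
}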
The 45 seconds could be a coincidence; it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like this:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What DB server are you using?
For me, mssql times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all, max_input_time and set_time_limit(0) will generally only work on a VPS or dedicated server where you control the configuration. Instead, you can structure your implementation along these lines (see the sketch after these steps):
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint with 10 entries each time and echo something out to the browser after each call. This way your script will never time out.
Follow the same method until the CSV rows are finished.
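The PHP side of that idea could look like this sketch; import_chunk.php, data.csv, read_csv_rows() and insert_listing() are all made-up names, and the AJAX caller is expected to pass back the offset it receives in the response:
<?php
// import_chunk.php: insert one small slice per request, so each AJAX call
// finishes well inside any execution or proxy time limit.
$offset = isset($_POST['offset']) ? (int) $_POST['offset'] : 0;
$limit  = 10;
$rows = array_slice(read_csv_rows('data.csv'), $offset, $limit); // assumed helper
foreach ($rows as $row) {
    insert_listing($row); // placeholder for the real DB insert
}
// Tell the JavaScript caller whether to request the next chunk.
header('Content-Type: application/json');
echo json_encode([
    'imported' => count($rows),
    'next'     => count($rows) === $limit ? $offset + $limit : null,
]);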
