I'm connecting to a web service using SOAP in PHP. Everything worked fine with XAMPP on my computer, but when I moved the PHP code to a web server, my timeout problems started.
First I got a timeout after 30 seconds; raising max_execution_time fixed that.
Then I got a timeout after 60 seconds; raising default_socket_timeout fixed that.
Now I get a 503 error after exactly 5 minutes, and I am not sure how to fix that. I know the reason is simply that I am receiving a lot of data, and the fact that it is always exactly 5 minutes (plus 1 to 2 seconds) means it is a timeout.
But what can I do about it? Do I need a third time extender?
Edit
I also got the 503 error when this was my only code:
<?php
echo "hello";
sleep(305);
?>
So I assume there must be a timeout for something that I don't know about. But I have no idea how to check for those timeouts or if I can actually change those.
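One way to narrow this down is to print the PHP-side limits and rule them out one by one; a minimal sketch (if all of these are above 300 seconds and the 503 still appears, the cutoff is enforced outside PHP, e.g. by Apache, FastCGI, or a proxy in front of the server):

```php
<?php
// Print the PHP-level timeout settings so they can be ruled out one
// by one; anything still enforcing a 5-minute cutoff after these are
// raised must live outside PHP (Apache, FastCGI, proxy, load balancer).
foreach (['max_execution_time', 'default_socket_timeout', 'max_input_time'] as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}
```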
Related
I use MySQL through the phpMyAdmin interface. I have no problem with the Apache server; it responds as it did before. But when I try to access the phpMyAdmin page, it takes a huge amount of time to load. After a long time it comes back with this message:
`Fatal error: Maximum execution time of 30 seconds exceeded in
C:\xampp\phpMyAdmin\libraries\classes\Dbi\DbiMysqli.php on line 213
` I have changed the value of the variable
$cfg['ExecTimeLimit']
from 300 to 1200. I think that is why I am now able to see the loaded page, but after loading I can't do anything with the interface, as it takes too long to respond.
I have tried the things mentioned in the following link:
WAMP/XAMPP is responding very slow over localhost
Can anyone help me get rid of this problem? It has been wasting a huge amount of my time for several days.
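For reference, the setting above lives in phpMyAdmin's config.inc.php; a minimal sketch (0 disables phpMyAdmin's own time limit entirely, which may be convenient locally but is risky on shared hosts):

```php
<?php
// In phpMyAdmin's config.inc.php:
// 0 means no time limit for phpMyAdmin's own scripts.
$cfg['ExecTimeLimit'] = 0;
```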
I was having the same problem. You could edit your php.ini file and set max_execution_time = 120, but that isn't always enough. Another suggestion is to put these two lines in your code; this solved my problem. They raise the execution time limit of the script itself, so it gets more time to run than the default 30 seconds:
ini_set('max_execution_time', 300);
set_time_limit(0);
I don't know if you already use this, but it is the easiest way to make the errors visible:
error_reporting(E_ALL);
ini_set('display_errors', 1);
On my page, there is a script which takes a long time to execute fully. While it is running, I get a 502 Bad Gateway error after 30 seconds. I have searched for this and it seems to be related to the KeepAlive feature of Apache. I've tried a few things to keep the connection alive, such as:
set_time_limit(-1);
header("Connection: Keep-Alive");
header("Keep-Alive: timeout=600, max=100");
ini_set('max_execution_time', 30000);
ini_set('memory_limit', '-1');
I have also called an Ajax function that hits a page on the server every 5 seconds, but nothing worked for me.
I'm using PHP + MySQL + Apache on a Linux server.
If you are using some type of hosting, it is quite possible that between your client and your server there is a proxy or a load balancer with a connection time limit set to 30 seconds. It's quite a common setup.
Try to investigate the logs to find which service returns the 502.
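A sketch of that investigation, assuming common default log locations (the actual paths depend on your distribution and on which proxy sits in front of Apache):

```shell
# Whichever log contains the 502 entries identifies the layer that
# gave up first; the timeout must be raised on the layer behind it.
for log in /var/log/nginx/access.log /var/log/apache2/access.log; do
    if [ -r "$log" ]; then
        echo "== $log =="
        grep ' 502 ' "$log" | tail -n 5
    fi
done
```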
When I use set_time_limit and the script runs for any amount of time greater than 360 seconds, it throws a 500 error.
At 359 seconds, nothing; at 360 and above, the error.
I don't have access to php.ini, how can I fix this bug?
script runs for any amount of time greater than 360 seconds, it throws a 500 error.
It sounds like you're hitting another timeout somewhere. If your server uses FastCGI, for example, Apache and/or the FastCGI process could be configured to only wait for six minutes (360 seconds) before timing out. It also could be that there's a reverse proxy sitting between you and Apache with the same timeout, though proxy timeouts are usually 504s, not 500s.
Please examine your server configuration. If you're on shared hosting, ask your host about the timeout.
If your script needs to execute for an extended time, you may wish to find another way to run it.
If you use Apache, you can change the maximum execution time via .htaccess with this line:
php_value max_execution_time 200
I am trying to extend the connection/request timeout on our allotted server space.
The reason is that some operations in my application take more than 120 seconds, and the server does not wait for them to complete: it returns a 500 Internal Server Error after exactly 120 seconds.
To test it i placed the below script on server:
<?php
sleep(119);
echo "TEST";
?>
It returns TEST to the browser after 119 seconds.
But when I place the script below:
<?php
sleep(121);
echo "TEST";
?>
It returns a 500 Internal Server Error after 120 seconds.
We have set max_execution_time = 360 in php.ini, but the problem still exists.
We have Apache installed with FastCGI.
I am trying to extend the timeout to 360 seconds using .htaccess, because that is the only way I can in shared hosting.
Any solutions or suggestions? Thanks in advance.
FastCGI is a different beast; using set_time_limit will not solve the problem. I'm not sure what you can do with .htaccess, but the normal setting you're looking for is called IPCCommTimeout; you can try to change that in the .htaccess, though I'm not sure if it's allowed.
See the directives on the Apache mod_fcgid page; if you're using a newer version of mod_fcgid, the directive is named FcgidIOTimeout instead.
I would suggest that 120 seconds is far too long for a user to wait for a request over a web server; if things take this long to run, try running your script from the command line with PHP CLI instead.
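A minimal sketch of that CLI approach; the `sleep` here is a stand-in for `php -f yourscript.php`, and nohup detaches the job from the terminal so no HTTP-layer timeout applies:

```shell
# Run the long job in the background, detached from the terminal;
# its output goes to job.log instead of a browser.
nohup sh -c 'sleep 1; echo done' > job.log 2>/dev/null &
wait            # in practice you would simply log out instead
cat job.log     # prints "done"
```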
Try this, hope it will work:
set_time_limit(0); // 0 removes the limit; or pass the number of seconds you need
I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes fewer. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says is the sensible way round it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limits on cpu time.
If your cpu is limited, you might have to process your import in batches.
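A sketch of processing the import in batches under such a limit; `loadBatch()` and `insertBatch()` are hypothetical placeholders for your own read and write code:

```php
<?php
// Process ~200k records in fixed-size batches so no single stretch
// of work runs long enough to trip a CPU or wall-clock limit.
// loadBatch()/insertBatch() are placeholders for your own code.
$total     = 200000;
$batchSize = 500;
$batches   = 0;

for ($offset = 0; $offset < $total; $offset += $batchSize) {
    // $rows = loadBatch($offset, $batchSize);
    // insertBatch($rows);
    set_time_limit(30); // restart PHP's own timer for every batch
    $batches++;
}

echo $batches, PHP_EOL; // 400
```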
First, you should be running the script from the command line if it's going to take a while. At the very least, your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-type: text/html"); before the import kicks off.
If you are on a shared host, then it's possible there are restrictions on the system when any long running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for non-web running scripts. Thus, running it from the command line would help.
The 45 seconds could be a coincidence -- it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What database server are you using?
For me, MSSQL times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all,
max_input_time and
set_time_limit(0)
will only work on a VPS or dedicated server. Instead, you can follow some rules in your implementation, like the ones below:
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint with 10 entries each time, and after each call echo something to the browser. With this method your script will never time out.
Follow the same method until the CSV rows are finished.
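The chunking step above can be sketched with PHP's array_chunk; sending each batch in its own AJAX request is left to the front end:

```php
<?php
// Slice the parsed CSV rows into batches of 10; each batch would be
// imported by a separate AJAX request, so no single request runs
// long enough to hit a server timeout.
function makeBatches(array $rows, int $size = 10): array {
    return array_chunk($rows, $size);
}

$rows    = range(1, 25);          // stand-in for 25 parsed CSV rows
$batches = makeBatches($rows);
echo count($batches), PHP_EOL;    // 3 (two full batches of 10, one of 5)
```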