I have a PHP application that has multiple "nested" include() calls. For some reason the application stops after 60 seconds. I'm using set_time_limit(0), and I have tested this without the include() call in the file; in that case it runs forever. I'm not sure what the issue is.
Working:
set_time_limit(0);
while(1 < 2){
echo 'hello';
}
Not Working:
//MASTER FILE
set_time_limit(0);
while(1 < 2){
include('file.php');
}
//INCLUDED FILE 'file.php'
echo 'hello';
First, it's bad practice to write infinite loops, especially in response to a web request. In general you also want your web requests to respond as quickly as possible and have long-running processes run separately.
That said, assuming you're running PHP behind Apache, you'll want to adjust your Apache TimeOut config. It defaults to 60 seconds.
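If the Apache timeout is the culprit, the fix is a one-line change (a sketch only, assuming you can edit httpd.conf; 300 is just an example value):
# httpd.conf - raise the time Apache will wait before aborting a request
TimeOut 300
Restart Apache afterwards and check whether the script still dies at the 60-second mark. Note that this is a server-wide setting, so it affects every request, not just this script.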
Following situation:
Apache2 Webserver
PHP Version 5.5.38 (Module) (Upgrade not possible for the moment)
Test environment reachable over http://www.test-environment.com
Productive environment reachable over https://www.productive-environment.com
(Fictional domains)
Both systems are located on the same machine:
/www/htdocs/myuser/test (Webroot of test environment, reachable over http)
/www/htdocs/myuser/productive (Webroot of productive environment, forced HTTPS)
I ran the following code on both systems, and they behave differently:
session_write_close_test.php
<?php
session_start();
// Some logic here, but doesn't matter in the test case...
session_write_close();
set_time_limit(30);
$counter = 30;
while ($counter > 0) {
sleep(1);
$counter -= 1;
}
Calling the script on my test environment (HTTP only) via http://www.test-environment.com/session_write_close_test.php and then opening http://www.test-environment.com in another tab within those 30 seconds, everything works fine. No blocking. session_write_close() seems to work.
Calling https://www.productive-environment.com/session_write_close_test.php and then opening https://www.productive-environment.com in another tab within those 30 seconds: nothing loads until the test script ends. session_write_close() doesn't seem to work.
Any explanation for this behavior? Both systems use the same configuration. I already tried changing the session save path to rule out any file write permission issues, sadly with no effect.
Thanks in advance
I wrote PHP code on IIS to serve a file for download with a speed limit, so I need to use the sleep() function to throttle the transfer.
Here are a few lines of my code:
set_time_limit(0);
while (!feof($file)) {
    echo fread($file, 1024 * 10); // send a 10 KB chunk
    ob_flush();
    flush();
    sleep(1); // wait one second between chunks to limit the speed
    if (connection_status() != 0) {
        fclose($file); // client disconnected: close the file and stop
        exit;
    }
}
But the browser just says 'Waiting for mysite'. If I remove sleep(1), everything works. I also tested on Apache, and everything works there too.
So the problem seems to be with the sleep() function under IIS.
You need to have your server properly configured for that. To be honest, you should use something on the server side to do the rate limiting rather than relying on PHP. The sleep(1) approach makes the script send a chunk, pause, send a chunk, pause, and so on. It doesn't maintain a steady 10 KB/s; it jumps from something like 500 kbps for a moment down to 0 for a second. That may average out to 10 KB/s, but it is not the same thing, and some clients won't handle it correctly and may terminate the download. You should look into QoS instead (see "How to Limit Download Speeds from my Website on my IIS Windows Server?").
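If you do keep it in PHP, one way to smooth out the bursts (just a sketch, assuming $file is an already-opened handle and a target rate of roughly 10 KB/s) is to send smaller chunks with shorter pauses:
$bytesPerSecond = 10 * 1024; // target transfer rate
$chunkSize = 1024; // 1 KB per write instead of 10 KB
$delay = (int) (1000000 * $chunkSize / $bytesPerSecond); // microseconds to wait per chunk
while (!feof($file)) {
    echo fread($file, $chunkSize);
    ob_flush();
    flush();
    usleep($delay); // ~0.1 s between 1 KB chunks, so the rate stays close to 10 KB/s
    if (connection_status() != 0) {
        fclose($file);
        exit;
    }
}
This still pulses, just on a much finer scale, so download clients see a far steadier stream than with sleep(1).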
What exactly is the problem with IIS? Note that waiting 1 second per chunk means your script may exceed the timeout limit (which can be as low as 30 seconds), so IIS will kill your script.
If you want to serve large files, I recommend serving them directly from IIS and using IIS' built-in rate limiter rather than via PHP.
See here: http://www.iis.net/configreference/system.applicationhost/sites/site/limits
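For reference, the site-level bandwidth limit described at that link is set on the <limits> element in applicationHost.config, roughly like this (a sketch only; the site name is made up, and, if I remember the schema correctly, maxBandwidth is in bytes per second - check the linked reference for your IIS version):
<sites>
  <site name="MySite" id="1">
    <limits maxBandwidth="10240" />
  </site>
</sites>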
I need to read a large file to find some labels and create a dynamic form. I cannot use file() or file_get_contents() because of the file size.
If I read the file line by line with the following code:
set_time_limit(0);
$handle = fopen($file, 'r');
set_time_limit(0);
if ($handle) {
while (!feof($handle)) {
$line = fgets($handle);
if ($line) {
//do something.
}
}
}
echo 'Read complete';
I get the following error in Chrome:
Error 101 (net::ERR_CONNECTION_RESET)
This error occurs after several minutes, so I don't think the max_input_time setting is the problem (it is set to 60).
What web server software do you use? Apache, nginx? You should set the maximum accepted file upload to something higher than 500 MB. Furthermore, the maximum upload size in php.ini should be bigger than 500 MB too, and I think PHP must also be allowed to use more than 500 MB of memory (check this in your PHP config).
Set the memory limit:
ini_set("memory_limit", "600M");
You also need to lift the time limit:
set_time_limit(0);
Generally, long-running processes should not be executed while the user waits for them to complete.
I'd recommend using a background job oriented tool that can handle this type of work and can be queried about the status of the job (running/finished/error).
My first guess is that something in the middle breaks the connection because of a timeout. Whether it's a timeout in the web server (which PHP cannot know about) or some firewall, it doesn't really matter; PHP gets a signal to close the connection and the script stops running. You could circumvent this behaviour by using ignore_user_abort(true); this, along with set_time_limit(0), should do the trick.
The caveat is that whatever caused the connection to be aborted will still do it, though the script would still finish its job. One very annoying side effect is that this script could end up being executed multiple times in parallel, with none of them ever completing.
Again, I recommend using some background task to do it and an interface for the end-user (browser) to verify the status of that task. You could also implement a basic one yourself via cron jobs and database/text files that hold the status.
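A minimal sketch of that do-it-yourself variant (the file names, paths and cron command are assumptions for illustration only): a cron job starts the worker, the worker records its progress in a small status file, and the browser polls a tiny status script instead of keeping one long request open.
<?php
// worker.php - started by cron (e.g. php /path/to/worker.php), not by the browser
$statusFile = __DIR__ . '/job-status.json';
file_put_contents($statusFile, json_encode(['state' => 'running', 'lines' => 0]));
$handle = fopen('/path/to/large-file.txt', 'r'); // hypothetical input file
$lines = 0;
while (($line = fgets($handle)) !== false) {
    // ... look for the labels and build the form data here ...
    $lines++;
    if ($lines % 10000 === 0) { // update progress every 10k lines
        file_put_contents($statusFile, json_encode(['state' => 'running', 'lines' => $lines]));
    }
}
fclose($handle);
file_put_contents($statusFile, json_encode(['state' => 'finished', 'lines' => $lines]));

<?php
// status.php - polled by the browser, returns immediately
header('Content-Type: application/json');
echo file_get_contents(__DIR__ . '/job-status.json');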
I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment, and you don’t have access to the php.ini file, or you want to set php parameters on a per-site basis, you can use the .htaccess file (when running on an Apache webserver).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP) and add this line:
php_value max_execution_time 300
where 300 is the number of seconds you wish to set the maximum execution time for a php script.
There is also another way, using the ini_set() function in the PHP file itself.
E.g. to set the execution time to 5 minutes (300 seconds), you can use:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit() comes to mind, but it may still be limited by php.ini settings:
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The limits you're running into exist because you're using an HTTP request, which means you can hit a time-out. A better solution is to implement this as a "cronjob", or what Microsoft calls a "Scheduled Task". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about time-outs any more, but you're still at risk of running into memory issues.
If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)
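For example, a crontab entry along these lines runs the backup every night via the PHP CLI (the paths are placeholders; adjust them for your server):
# run the backup at 02:00 every night, outside the web server entirely
0 2 * * * /usr/bin/php /path/to/backup.php >> /path/to/backup.log 2>&1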
You can use the following at the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 means no time limit
}
?>
And at the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort() might be what you're looking for. It tells PHP not to stop the script if communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with a set_time_limit(0), it can run forever.
In shared hosting environments where changing the max_execution_time directive is disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) way around this; very often the most practical option is to use the backup facility provided by the host, if there is one.
Try the function:
set_time_limit(300);
On Windows, there is a chance that your web host allows you to override settings by uploading a php.ini file to the root directory of your web server. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP script can redirect to the backup.
Option 2: Split the backup script into multiple parts, perhaps use some AJAX to call the script a few times in a row, give the user a nice progress bar, combine the results of those calls into a zip with PHP, and offer that as a download (a sketch follows below).
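A rough server-side sketch of Option 2 (the paths, batch size and response format are assumptions for illustration; error handling omitted): each request adds a limited number of files to the zip and reports how far it got, so the client keeps calling it until it reports done.
<?php
// backup_step.php - called repeatedly (e.g. via AJAX) until 'done' is true
$batchSize = 50; // files per request, keeps each call well under the time limit
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$files = glob('/path/to/site/*'); // hypothetical file list; recurse into subfolders as needed
$zip = new ZipArchive();
$zip->open('/path/to/backup.zip', ZipArchive::CREATE);
$slice = array_slice($files, $offset, $batchSize);
foreach ($slice as $file) {
    if (is_file($file)) {
        $zip->addFile($file, basename($file)); // add this batch to the archive
    }
}
$zip->close();
$next = $offset + count($slice);
header('Content-Type: application/json');
echo json_encode([
    'done' => $next >= count($files),
    'offset' => $next,
    'total' => count($files),
]);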
I am trying to extend the connection/request timeout on our allotted server space.
The reason I am trying to do this is that some operations in my application take more than 120 seconds, and the server does not wait for them to complete. It returns a 500 Internal Server Error after exactly 120 seconds.
To test it, I placed the script below on the server:
<?php
sleep(119);
echo "TEST";
?>
It returns TEST to the browser after 119 seconds.
But when I place the script below:
<?php
sleep(121);
echo "TEST";
?>
It returns a 500 Internal Server Error after 120 seconds.
We have set max_execution_time = 360 in php.ini, but the problem still exists.
We have Apache installed with FastCGI.
I am trying to extend it to 360 seconds using .htaccess, because that is the only option I have on shared hosting.
Any solutions or suggestions? Thanks in advance.
FastCGI is a different beast; using set_time_limit() will not solve the problem. I'm not sure what you can do with .htaccess, but the setting you're normally looking for is called IPCCommTimeout; you can try changing that in .htaccess, though I'm not sure whether it's allowed there.
See the directives on the Apache mod_fcgid page; on newer mod_fcgid versions (2.3 and later) the directive is named FcgidIOTimeout instead.
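If the host does allow it, the .htaccess entry would look roughly like this (a sketch only; whether mod_fcgid accepts these directives in .htaccess depends on the server setup, otherwise they belong in the virtual host configuration):
# raise the FastCGI I/O timeout to 360 seconds
<IfModule mod_fcgid.c>
    FcgidIOTimeout 360
    # older mod_fcgid versions (before 2.3) use the old directive name instead:
    # IPCCommTimeout 360
</IfModule>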
I would suggest that 120 seconds is far too long for a user to wait for a request over a web server; if things take this long to run, try running your script from the command line with PHP CLI instead.
Try this, hope it will work:
set_time_limit(360); // the argument is the limit in seconds; 0 means no limit