I am running a PHP script as a cron job that might take a very long time to finish. It will create a massive XML file and save it.
What should I think about when I implement it?
One thing I do is set max_execution_time to a long duration:
ini_set('max_execution_time', 300);
Is there anything else I should do? Increase the memory limit?
Does it help if I send a "keep-alive" header?
What can I do to make sure the script will always run until everything necessary is done?
You can remove the execution time limit by using the set_time_limit() function, passing 0 as the parameter:
set_time_limit(0);
Adding HTTP headers won't help: since this is a cron job script, you are not dealing with a browser.
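For a cron job like this, a rough sketch of the script might look like the following; the output path and fetch_next_row() are placeholders for your own code. Streaming the XML to disk with XMLWriter usually matters more than raising memory_limit, since you never hold the whole document in memory:
<?php
set_time_limit(0);                       // no execution time limit
ini_set('memory_limit', '512M');         // raise only if you really need to

$writer = new XMLWriter();
$writer->openUri('/path/to/output.xml'); // placeholder output path
$writer->setIndent(true);
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('items');

while ($row = fetch_next_row()) {        // placeholder for your data source
    $writer->startElement('item');
    $writer->writeElement('name', $row['name']);
    $writer->endElement();
    $writer->flush();                    // write buffered XML out to the file
}

$writer->endElement();
$writer->endDocument();
$writer->flush();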
Related
I have a site hosted on Rackspace Cloud Sites. I have a troubleshooting script I am trying to run to figure out some problems with the site.
Cloud Sites has a 30-second timeout, and it is timing out before the results page can load. I spoke with their support and they advised me to put a page-loading script at the top of the PHP file to keep the connection open, but I have no idea how to do that, and the googling I have done hasn't been much help.
The script I am trying to run is too long to include here, but if anyone needs it, you can find it here: http://forum.joomla.org/viewtopic.php?f=621&t=582860
Edit: No matter what I set the execution time to in the script, the load balancers Rackspace uses will still time out after 30 seconds. They have told me to run a 'page loading' script at the beginning of the script to keep the connection open, so I am about to start looking into how to do that.
You could try the set_time_limit() function:
http://php.net/manual/en/function.set-time-limit.php
By default, a PHP script times out after 30 seconds.
Use the set_time_limit( int $seconds ) function to extend the maximum execution time.
You can also use ini_set() and set the max_execution_time:
ini_set("max_execution_time", 300);
EDIT
If the above doesn't work, then they probably use a secondary mechanism to time out blocking connections. What you could try in that situation is to flush some data to the client at a regular interval.
ob_start(); // enable output buffering

// inside your long-running loop, output something at a regular interval
echo " ";
ob_flush();
flush(); // ob_flush() alone may not reach the client without flush()

// at the end of the script
ob_end_flush();
Hope this helps.
First of all, the script takes some time to execute, and it stops after 30 seconds. That says the maximum execution time is about 30 seconds (I know we can change this parameter in httpd.conf), but I don't know how long the script needs; the maximum execution time may be 1 hour or more.
So I want to know: is there a way to have no execution time limit in PHP?
Second question: how can I show the content in the browser as soon as it's ready and continue the execution? Currently I always have to wait for the script to end in order to see the content displayed in the browser.
Next time you post a question, make sure to research it better. Both questions are well documented on Stack Overflow and the internet.
Now, to your answer...
To let it run until finished regardless of the time it takes:
http://www.php.net/manual/en/function.set-time-limit.php
set_time_limit(0);
To display content right after it is generated:
http://www.php.net/manual/en/function.flush.php
ob_flush();
flush();
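Put together, a minimal sketch might look like this (whether the browser actually renders the output incrementally also depends on output_buffering and zlib.output_compression in php.ini, plus any proxies in between):
<?php
set_time_limit(0); // run until finished, however long it takes

while (ob_get_level() > 0) {
    ob_end_flush(); // close any existing output buffers first
}

for ($i = 1; $i <= 10; $i++) {
    echo "Processed chunk $i<br>\n";
    flush();  // push the generated output to the browser now
    sleep(1); // stands in for the real work
}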
A script that might run that long should be run from the console; browser timeouts could also interfere with running it via the browser.
If you use a console script, you could route the output into a database and query it from there.
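As a rough sketch of that idea (the job_log table and the connection details are assumptions to adapt):
<?php
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO job_log (created_at, message) VALUES (NOW(), ?)');

$stmt->execute(array('Job started'));

// ... long-running work, logging progress as it goes ...
$stmt->execute(array('Halfway done'));

$stmt->execute(array('Job finished'));
You can then query job_log from anywhere to follow the script's progress.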
In php.ini you have this setting. I am not sure you can set it anywhere other than php.ini.
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
You can't do that in PHP. The script runs until it finishes execution (normally, or with an error).
I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment and you don't have access to the php.ini file, or you want to set PHP parameters on a per-site basis, you can use the .htaccess file (when running on an Apache web server).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP) and add this line:
php_value max_execution_time 300
where 300 is the maximum execution time, in seconds, that you wish to allow a PHP script.
There is also another way, using the ini_set() function in the PHP file itself.
E.g., to set the execution time to 5 minutes, you can use:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit() comes to mind, but it may still be limited by php.ini settings:
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The limits you're experiencing exist because you're using an HTTP request, which means you can hit a time-out. A better solution would be to implement this using a "cronjob", or what Microsoft calls "Scheduled Tasks". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about the time-outs any more, but you're still at risk of running into memory issues.
If you have a decent hosting provider, though, why doesn't it provide daily backups to start with? :)
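For example, on a Linux host a crontab entry to run such a script nightly at 02:00 might look like this (the paths are assumptions):
0 2 * * * /usr/bin/php /path/to/backup.php >> /var/log/backup.log 2>&1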
You can use the following in the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 seconds, i.e. no time limit
}
?>
And at the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort() might be what you're looking for. It tells PHP not to stop script execution if the communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
That said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cron job is what you should be using. Along with set_time_limit(0), it can run forever.
In shared hosting environments where a change to the max_execution_time directive might be disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem, and the simplest solution is very often to use the backup facility provided by the host, if any.
Try the function:
set_time_limit(300);
On Windows, there is a slight possibility that your web host allows you to override settings by uploading a php.ini file to the root directory of your web server. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, call phpinfo() and check the Local Value for max_execution_time.
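You can also verify it programmatically, e.g.:
<?php
var_dump(ini_get('max_execution_time')); // should print string(3) "300" if the override took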
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP file can redirect to the backup.
Option 2: Split the backup script into multiple parts, perhaps use some AJAX to call the script a few times in a row, give the user a nice progress bar, and combine the results of the script calls into a zip with PHP, offering that as a download (see the sketch below).
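A rough sketch of option 2 (the file list, batch size, and archive name here are all assumptions): each AJAX call processes one batch and returns JSON the client can use to drive a progress bar and decide when to stop.
<?php
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batch  = 50; // files per request, tuned to stay under the 30-second limit

$files = json_decode(file_get_contents('filelist.json'), true); // list prepared earlier

$zip = new ZipArchive();
$zip->open('backup.zip', $offset === 0 ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0);
foreach (array_slice($files, $offset, $batch) as $file) {
    $zip->addFile($file); // add this batch to the growing archive
}
$zip->close();

header('Content-Type: application/json');
echo json_encode(array(
    'done' => $offset + $batch >= count($files), // tells the client when to stop
    'next' => $offset + $batch,                  // offset for the next call
));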
I have an upload form that uploads MP3s to my site. I have some intermittent issues with some users, which I suspect are caused by slow upload connections...
But anyway, the first line of code is set_time_limit(0);, which did fix it for SOME users whose connections were taking a while to upload, but some are still getting timed out and I have no idea why.
It says the script has exceeded the maximum execution time of 60 seconds. The script has no loops, so it's not like it's some kind of infinite loop.
The weird thing is that no matter what line of code is on the first line, it will always say "error on line one, two, etc.", even if it's set_time_limit(0);. I tried erasing it, and the very first line of code always seems to be the error; it doesn't even give me a hint of why it can't execute the PHP page.
This is an issue only a few users are experiencing and no one else seems to be affected. Could anyone throw out some ideas as to why this could be happening?
set_time_limit() will only affect the actual execution of the PHP code on the page. You want to set the PHP directive max_input_time, which controls how long the script will accept input (like file uploads). The catch is that you need to set this in php.ini: if the default max_input_time is exceeded, the request never reaches the script that is attempting to change it with ini_set().
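For example, in php.ini (max_input_time is a PHP_INI_PERDIR directive, so on Apache hosts that allow it, php_value max_input_time 600 in .htaccess works too):
; give slow uploads up to 10 minutes to arrive
max_input_time = 600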
Sure, there are a couple of things noted in the PHP manual.
Make sure PHP is not running in safe mode; set_time_limit() has no effect when PHP is running in safe_mode.
Second, and this is where I assume your problem lies:
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
So your stream may be the culprit.
Can you post a little of your upload script? Are you calling a separate file to handle the upload using headers?
Try ini_set('max_execution_time', 0); instead.
I want to have a simple PHP script that loops to do something every ten minutes. It would be hosted offsite, and I would activate it via my browser. I don't have access to the server other than my web space, so 'cron' as such isn't an option.
(I'm happy to have this stop after a certain time or number of job cycles. I just need it to continue running after I point the browser away from the page script.)
Is such a thing possible? Thanks.
It's possible, see ignore_user_abort():
set_time_limit(0);
ignore_user_abort(true);
while (true) { // forever
    // your code
}
You can use these two functions in combination with sleep(), usleep(), time_nanosleep() or, even better, time_sleep_until() to achieve a cron-like effect.
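For the ten-minute cadence from the question, a minimal sketch (do_something() is a placeholder for your actual job):
<?php
set_time_limit(0);
ignore_user_abort(true); // keep going after the browser is pointed away

$next = time();
for ($cycle = 0; $cycle < 6; $cycle++) { // e.g. stop after six runs
    do_something();                      // placeholder: your actual task
    $next += 600;                        // schedule the next run 10 minutes out
    time_sleep_until($next);             // sleep until that moment
}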
PHP scripts time out after a certain amount of time; they're not designed for long-running programs. You'll have to find some way to prod the script every ten minutes.
Have a look at set_time_limit.
This is from the above page:
You can do set_time_limit(0); so that the script will run forever; however, this is not recommended, and your web server might catch you out with an imposed HTTP timeout (usually around 5 minutes).
Maybe you can write another script on a computer to which you have access, and then make that script request the other one periodically.
You can look at pcntl_fork().
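Note that pcntl_fork() is normally available only in the CLI SAPI, not under a web server. A bare-bones sketch (do_long_job() is a placeholder):
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('Could not fork');
} elseif ($pid === 0) {
    do_long_job(); // child process: placeholder for the long-running work
    exit(0);
} else {
    echo "Started worker with PID $pid\n"; // parent returns immediately
}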
Here's a hack for your problem:
// Anything before disconnecting, but nothing output to the client yet!
ignore_user_abort(true); // without this, PHP may kill the script once the client is gone
ob_end_clean();
header('Connection: close');
ob_start();

// Here you can output anything before disconnecting
echo "Bla bla bla";

$outsize = ob_get_length();
header('Content-Length: ' . $outsize);
ob_end_flush();
flush();

// Do your background processing here
// and feel free to quit anytime you want.
A way to do this might be to launch a new php process from the web page, e.g.
<?php
exec("php script_that_runs_for_a_while.php > /dev/null 2>&1 &");
?>
Redirecting the output to /dev/null and adding the trailing & (on a Linux system) means the exec() call returns immediately instead of waiting for the launched script to finish.
The script that launches it can then do whatever it likes, since the long-running script is basically just a new process running on the server.
Note that at the start of your long-running script, you will want to use the set_time_limit() function to set the max execution time to some large value (or 0 for no limit).
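The first lines of the launched script might then look like this sketch:
<?php
// script_that_runs_for_a_while.php
set_time_limit(0);       // or some large value such as 3600
ignore_user_abort(true); // keep going even though nothing is watching

// ... the long-running work ...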