I have a site hosted on Rackspace Cloud Sites, and a troubleshooting script I am trying to run to figure out some problems with the site.
Cloud Sites has a 30-second timeout, and the connection times out before the results page can load. I spoke with their support and they advised me to put a page-loading script at the top of the PHP file to keep the connection open, but I have no idea how to do that and the googling I have done hasn't been much help.
The script I am trying to run is too long to include here, but if anyone needs it you can find it at http://forum.joomla.org/viewtopic.php?f=621&t=582860
Edit: No matter what I set the execution time to in the script, the load balancers Rackspace uses will still time out after 30 seconds. They have told me to run a 'page loading' script at the beginning of the script to keep the connection open, so I am about to start looking into how to do that.
You could try the set_time_limit() function:
http://php.net/manual/en/function.set-time-limit.php
By default, a PHP script times out after 30 seconds.
Use the set_time_limit( int $seconds ) function to extend the maximum execution time.
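For example, near the top of the script (300 seconds here is just an illustrative value):
set_time_limit(300); // allow this script to run for up to 5 minutes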
You can also use ini_set() and set the max_execution_time:
ini_set("max_execution_time", 300);
EDIT
If the above doesn't work, then they probably use a secondary mechanism to time out long-running connections. What you could try in this situation is to flush some data to the client at a regular interval.
ob_start(); // enable output buffering

// output something at a regular interval
echo " ";
ob_flush();
flush(); // push the buffered output all the way to the client

// at the end of the script
ob_end_flush();
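Put together, a minimal sketch of that idea for your troubleshooting script might look like this (assuming the work can be broken into steps; $steps and run_step() are hypothetical placeholders):
<?php
set_time_limit(0);            // no PHP-side limit; the 30-second cap is on their side
while (ob_get_level() > 0) {  // drop any existing output buffers
    ob_end_flush();
}
ob_implicit_flush(true);      // flush to the client after every echo

foreach ($steps as $step) {   // one entry per chunk of the diagnostics
    run_step($step);          // do one chunk of the work
    echo " ";                 // keep-alive byte so the load balancer sees activity
}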
Hope this helps.
I want to upload a local file via PHP to Google Drive. If I upload a small file (<5MB) everything works. If I try to upload a larger file I get a "Fatal error: Maximum execution time of 30 seconds exceeded". I can set max_execution_time in my php.ini to a very high value, but that seems like a bad solution. If I use set_time_limit(seconds), I can't upload multiple files simultaneously (I don't know why).
So my question is how to upload a large file without setting a bad execution time in my php.ini. How is this supposed to be solved in PHP? I would like to use a simple cURL command-line call, but that does not work with Google Drive because cURL can't handle OAuth.
Setting the max_execution_time with set_time_limit is the right solution. If your script needs more than 30 seconds to upload the file, it has to be allowed to run for more than 30 seconds.
set_time_limit(600); //File upload should not take longer than 10 minutes
do_your_file_upload();
set_time_limit(30); //Change it back to 30 seconds for further processing
I can't upload more files simultaneously
Can you specify what you mean?
If you cannot reach your PHP script while another one is running, it is often because of PHP session management. When you start the session with session_start, PHP (creates and) opens the session file and locks it for writing. The file stays locked until your script calls session_write_close or the script execution ends. To work with two or more sessions simultaneously, you have to call session_write_close:
session_start();
//do stuff
session_write_close();
set_time_limit(600); //File upload should not take longer than 10 minutes
do_your_file_upload();
set_time_limit(30); //Change it back to 30 seconds for further processing
session_start();
//do more stuff
session_write_close();
It's a bad idea to use plain PHP code to upload a very large file.
You should use JavaScript to upload large files, because it can use AJAX methods. It will also allow you to display a progress bar.
I did a little research. Here are some nice scripts using jQuery:
http://designscrazed.org/html5-jquery-file-upload-scripts/
Or another script which does not use jQuery (I don't know if it's a good one):
http://igstan.ro/posts/2009-01-11-ajax-file-upload-with-pure-javascript.html
Good luck!
I am running a PHP script as a cron job that might take a very long time to finish. It will create a massive XML file and save it.
What should I think about when implementing it?
One thing I do is set max_execution_time for a long time:
ini_set('max_execution_time', 300);
Is there anything else I should do? Increase memory limit?
Does it help if I send a "keep-alive" header?
What can I do to make sure the script will always run until everything necessary is done?
You can remove the execution time limit by using the set_time_limit function, passing 0 as the parameter:
set_time_limit(0);
Adding HTTP headers won't help because, as this is a cronjob script, you are not dealing with a browser.
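Putting it together, the top of the cron script could look something like this (the memory limit value is just an illustration; size it for your XML):
<?php
set_time_limit(0);               // no execution time limit
ini_set('memory_limit', '512M'); // only needed if the whole XML is built in memory

// ... build and save the massive XML file here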
I'm running a really long task in PHP. It's a website crawler, and it has to be polite and sleep for 5 seconds after each page to prevent server overloading. The script starts with:
ignore_user_abort(1);
session_write_close();
ob_end_clean();
while (@ob_end_flush());
set_time_limit(0);
ini_set('max_execution_time',0);
After a few hours (between 3 and 7) the script dies without any visible reason.
I've checked:
the Apache error log (nothing)
php_errors.log (nothing)
the output for errors (10,578,467 bytes of debug output, no errors)
memory consumption (stable, around 3M from memory_get_usage(true) checked every 5 seconds, limit set to 512M)
It's not the browser, because I checked with both wget and Chrome and got the same result.
Output is sent to the browser every 2-3 seconds, so I don't think that's the fault, plus I ignore user abort.
Is there any other place I can check to find the issue?
I think there's a problem in the rest of your script, not Apache.
Try profiling your application using register_tick_function with a light profiler like this one, and log the memory usage; the problem may show up there.
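If you don't want to pull in a full profiler, a minimal sketch of the tick-based logging idea could look like this (the log path and tick interval are assumptions):
<?php
declare(ticks=1000); // call the tick function every 1000 low-level statements

function log_memory_usage()
{
    $line = date('c') . ' ' . memory_get_usage(true) . " bytes\n";
    file_put_contents('/tmp/crawler-profile.log', $line, FILE_APPEND);
}

register_tick_function('log_memory_usage');

// ... the crawler runs here; the last logged line shows roughly where it died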
I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment and you don't have access to the php.ini file, or you want to set PHP parameters on a per-site basis, you can use the .htaccess file (when running on an Apache webserver).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP) and add this line:
php_value max_execution_time 300
where 300 is the number of seconds you wish to set the maximum execution time for a php script.
There is also another way, using the ini_set function in the PHP file.
E.g. to set the execution time to 5 minutes, you can use:
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit comes to mind, but it may still be limited by php.ini settings:
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The limits you're running into exist because you're using an HTTP request, which means you can hit a time-out. A better solution would be to implement this using a "cronjob", or what Microsoft calls "Scheduled Tasks". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about time-outs any more, but you're still at risk of running into memory issues.
If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)
You can use the following in the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 seconds means no time limit
}
?>
And at the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort might be what you're looking for. It tells PHP not to stop the script execution if the communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with set_time_limit(0), it can run forever.
In shared hosting environments where a change to the max_execution_time directive might be disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem, and the simplest solution is very often to use the backup facility provided by the host, if any.
Try the function:
set_time_limit(300);
On Windows, there is a slight possibility that your webhost allows you to override settings by uploading a php.ini file to the root directory of your webserver. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP file can redirect the backup.
Option 2: Split the backup script into multiple parts, perhaps use some AJAX to call the script a few times in a row, give the user a nice progress bar, and combine the results of the script calls into a zip with PHP and offer that as a download; see the rough sketch below.
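A rough PHP-side sketch of option 2 (directory names and the part parameter are hypothetical; the client-side AJAX would call this URL with part=0, 1, 2, ... and update the progress bar):
<?php
// Each request adds one directory to the archive, so no single request runs long.
$parts = array('images', 'css', 'js', 'uploads'); // directories to back up
$index = isset($_GET['part']) ? (int) $_GET['part'] : 0;

$zip = new ZipArchive();
$zip->open('backup.zip', $index === 0 ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0);

$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($parts[$index]));
foreach ($files as $file) {
    if ($file->isFile()) {
        $zip->addFile($file->getPathname());
    }
}
$zip->close();

echo json_encode(array('done' => $index + 1 >= count($parts)));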
I want to have a simple PHP script that loops to do something every ten minutes. It would be hosted offsite, and I would activate it via my browser. I don't have access to the server other than my web space, so 'cron' as such isn't an option.
(I'm happy to have this stop after a certain time or number of job cycles. I just need it to continue running after I point the browser away from the page script.)
Is such a thing possible? Thanks.
It's possible, see ignore_user_abort():
set_time_limit(0);
ignore_user_abort(true);

while (true) // forever
{
    // your code
}
You can use these two functions in combination with sleep(), usleep(), time_nanosleep() or, even better, time_sleep_until() to achieve a cron-like effect.
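For the ten-minute cadence described in the question, a minimal sketch could look like this (do_the_job() and the cycle count are hypothetical):
<?php
set_time_limit(0);
ignore_user_abort(true);

$cycles = 144;  // stop after roughly one day of ten-minute runs
$next   = time();

for ($i = 0; $i < $cycles; $i++) {
    do_the_job();             // hypothetical placeholder for your task
    $next += 600;             // schedule the next run ten minutes later
    time_sleep_until($next);  // sleep until then, however long the job itself took
}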
PHP scripts time out after a certain amount of time - they're not designed for long-running programs. You'll have to find some way to prod the script every ten minutes.
Have a look at set_time_limit.
This is from the above page:
You can do set_time_limit(0); so that the script will run forever - however this is not recommended and your web server might catch you out with an imposed HTTP timeout (usually around 5 minutes).
Maybe you can write another script on a computer you do have access to, and then make that script request the other one periodically.
You can also look at pcntl_fork.
Here's a hack for your problem:
// Anything before disconnecting, but nothing to be output to the client!
ob_end_clean();
header('Connection: close');
ob_start();
// Here you can output anything before disconnecting
echo "Bla bla bla";
$outsize = ob_get_length();
header('Content-Length: '.$outsize);
ob_end_flush();
flush();
// Do your background processing here
// and feel free to quit anytime you want.
A way to do this might be to launch a new php process from the web page, e.g.
<?php
exec("php script_that_runs_for_a_while.php > /dev/null");
?>
Redirecting the output to /dev/null and adding the trailing & means (on a Linux system) that the exec call returns immediately, rather than waiting for the launched script to finish.
The launched script can then do whatever it likes, since it is basically just a new process running on the server.
Note that at the start of your long running script, you will want to use the set_time_limit function to set the max execution time to some large value.
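For example, at the top of script_that_runs_for_a_while.php:
set_time_limit(0); // 0 = no limit; the CLI usually defaults to this already, but being explicit doesn't hurt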