I have a problem with router/modem/ISP timeouts where my page ends up as a blank page with timeout warnings. It does not seem to be an issue on the server side, since the page will eventually load, given enough time, on a different router/modem/ISP.
Let us assume that I have no other way to optimize the running time, and the page will need to run as long as it does. Is there any way to 100% prevent timeouts in the client's browser? I am coding in PHP.
If you are running a resource-intensive script, consider increasing your timeout limit using the set_time_limit() function.
set_time_limit(60); // 60 seconds
If you hate your server, you can even use 0 for no limit:
set_time_limit(0); // to stop this, you would need to restart your web server
If you want your script to continue working even if the request is aborted, such as when the web browser is closed by the user, you may be interested in ignore_user_abort().
<?php
set_time_limit(60);
ignore_user_abort(true);
echo 'Hello world! Wait for it...';
sleep(30);
file_put_contents('testing.txt','Even if you closed your browser, this file will be created.');
echo 'Thanks for waiting!';
?>
Reminder
Use of these functions may NOT be good practice. Consider revising your script. If this is a script that will be accessed by multiple users, it could crash your server.
Related
I have a long-running script that can run for a while (it sends an email every 5 seconds to many users). This script is triggered via an AJAX request.
If the response is never received, such as when the browser is closed, will the script continue to run? (It appears it does, but are there any conditions under which it won't?)
Does sleep count towards the max execution time? (It appears it does not.)
1.
The short answer is: it depends.
In fact, it can be configured both in PHP and in the web server you use, and it depends on the mode PHP runs in (module, CGI, or something else).
You can configure it sometimes, though. There is an option in php.ini:
; If enabled, the request will be allowed to complete even if the user aborts
; the request. Consider enabling it if executing long requests, which may end up
; being interrupted by the user or a browser timing out. PHP's default behavior
; is to disable this feature.
; http://php.net/ignore-user-abort
;ignore_user_abort = On
2. Sleep almost always does count. There are conditions under which it does not, but in those cases it is not wall-clock execution time that is measured but CPU execution time. IIS counts CPU usage by app pool; I am not sure how that applies to PHP scripts.
It is true that PHP does not kill a script while it is sleeping. That means the script will be killed once the sleep is over (easy test: add sleep(15); to your PHP and set the max execution time to 10. You will get an error about the time limit, but after 15 seconds, not 10).
In general, you cannot rely on freely using sleeps in a script, and you should not rely on a script running without a user (browser) attached inside a web server, either. You are apparently solving the problem with the wrong methods: you really should consider using cron jobs or separate tasks.
This depends on the server. Some servers could terminate the script when the socket is closed (which will probably happen when the browser is closed), others could let the script execute until the timeout is reached.
Again, this would depend on the server. I can easily see an implementation looking at the time the script puts load on a CPU; then again, just measuring how long ago the script was started is an equally good approach. It all depends on what the person(s) making the server software were after.
If you want definite answers, I would suggest sifting through the documentation for the web server and PHP implementation your script is running on.
I have made a PHP script which will probably take about 3 hours to complete. I run it from the browser, and after about 45 minutes it stops doing anything. I know this because it polls certain web addresses and then saves some data to a database; it simply stops writing data to the database, which leads me to conclude that it has stopped. The browser still shows the page as loading, but it never finishes.
There aren't any errors, so it is probably some kind of timeout, but where it occurs is a mystery, as is how to prevent it. In my case I can't use the CLI; I must use a browser client to initiate the script.
I have tried to put
set_time_limit(0);
But it had no apparent effect. Any suggestions as to what could cause the timeout, and a fix for it?
Try this:
set_time_limit(0); // remove PHP's script time limit
ignore_user_abort(true); // keep running even if the browser disconnects
ini_set('max_execution_time', 0); // same effect as set_time_limit(0), for good measure
Most webhosts kill processes that run for a certain length of time. This is intended as a failsafe against infinite loops.
Ask your host about this and see if there's any way it can be disabled for this particular script of yours. In some cases, the killer doesn't apply to Cron tasks, or processes run by SSH. However, this varies from host to host.
It might be the browser that's timing out; I'm not sure whether browsers do that or not, but then I've also never had a page require so much time.
Suggestion: I'm assuming you are running a loop. Load a page, then run each iteration of the loop as an AJAX call to another page, not firing the next iteration until the previous one returns (see the sketch below).
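A minimal sketch of the server side of that idea, assuming a hypothetical work.php endpoint that performs one iteration per request (the file name, parameter, and iteration count are all made up):
<?php
// work.php: hypothetical endpoint; each request performs one loop iteration.
$step = isset($_GET['step']) ? (int) $_GET['step'] : 0;

// ... do one unit of work for iteration $step here ...

header('Content-Type: application/json');
echo json_encode(array(
    'step' => $step,
    'done' => $step >= 99, // e.g. a 100-iteration job
));
The client-side loop would then request work.php?step=0, wait for the JSON response, and request step 1 only after step 0 returns, so no single request runs long enough to time out.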
There's a setting in PHP that kills processes after some time. This is especially important on shared servers (you do not want one process slowing down the whole server).
You need to ask your host whether you can make modifications to php.ini (through .htaccess), in particular the max_execution_time setting.
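If the host allows .htaccess overrides and PHP runs as an Apache module, a minimal sketch of such an override would be (the 300-second value is an arbitrary example):
php_value max_execution_time 300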
If you are using sessions, then you would need to look at session.cookie_lifetime rather than set_time_limit(). If you are accumulating results in an array, the array might also exhaust the memory limit.
Without more info on how your script handles the task, the cause is difficult to identify.
I'm making a project in PHP that pretty much doesn't involve any human intervention; in other words, I'm making a special script that should run for a long time (a user visits my website and the script keeps running until the user exits...). The one problem concerning me is:
Won't the sessions expire when the script runs like this for a while? If so, how could I bypass that so it could run for a very long time?
Oh, and one more question: could my PHP script start on its own, without any user intervention? If so, how could I configure it to do so?
You can run PHP from the command line. Those scripts do not have a timeout like user-called scripts. Otherwise, you can set the timeout with set_time_limit like
set_time_limit(0); // 0 means no limit
See http://php.net/manual/en/function.set-time-limit.php
You can use set_time_limit(0) to avoid the timeout.
For the PHP script to start on its own, you could use a cron job.
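A minimal sketch of a crontab entry (the schedule and paths are made-up examples):
# Run the script every 5 minutes, independent of any browser.
*/5 * * * * /usr/bin/php /path/to/script.php >> /var/log/script.log 2>&1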
Is there a way in PHP to close the connection (essentially tell a browser that there's no more data to come) but continue processing? The specific circumstance I'm thinking of is that I would want to serve up cached data, and if the cache had expired, I would still serve the cached data for a fast response, close the connection, but continue processing to regenerate and cache new data. Essentially, the only purpose is to make the site appear more responsive, as there wouldn't be the occasional delay while a user waits for content to be regenerated.
UPDATE:
PLuS has the closest answer to what I was looking for. To clarify for a couple of people, I'm looking for something that enables the following steps:
1. User requests page
2. Connection opens to server
3. PHP checks if the cache has expired; if still fresh, serve cache and close connection (END HERE). If expired, continue to 4.
4. Serve expired cache
5. Close connection so the browser knows it's not waiting for more data.
6. PHP regenerates fresh data and caches it.
7. PHP shuts down.
UPDATE:
This is important, it must be a purely PHP solution. Installing other software is not an option.
If running under FastCGI you can use the very nifty:
fastcgi_finish_request();
http://php.net/manual/en/function.fastcgi-finish-request.php
More detailed information is available in a duplicate answer.
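A minimal sketch of how it can be used, assuming PHP runs under PHP-FPM (the file path and contents are made up):
<?php
echo 'Text the user will see';
fastcgi_finish_request(); // the response is sent and the connection is closed here

// Everything below runs after the client already has the full response.
sleep(30);
file_put_contents('/tmp/regenerated-cache.txt', 'fresh data'); // made-up path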
I finally found a solution (thanks to Google, I just had to keep trying different combinations of search terms). Thanks to the comment from arr1 on this page (it's about two thirds of the way down the page).
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true);
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // All output buffers must be flushed here
flush(); // Force output to client
// Do processing here
sleep(30);
echo('Text user will never see');
I have yet to actually test this, but, in short, you send two headers: one that tells the browser exactly how much data to expect, and one that tells the browser to close the connection (which it will only do after receiving the expected amount of content).
You can do that by setting the time limit to unlimited and ignoring the user abort:
<?php
ignore_user_abort(true);
set_time_limit(0);
see also: http://www.php.net/manual/en/features.connection-handling.php
PHP doesn't have such persistence (by default). The only way I can think of is to run cron jobs to pre-fill the cache.
Can compile and run programs from PHP-CLI (not on shared hosting; VPS and up)
Caching
For caching, I would not do it that way. I would use Redis as my LRU cache. It is going to be very fast (see its benchmarks), especially when you compile it with a client library written in C.
Offline processing
When you install the beanstalkd message queue, you can also do delayed puts. But I would use Redis brpop/rpush to do the rest of the message queuing, because Redis is going to be faster, especially if you use a PHP client library written in C.
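A minimal sketch of that rpush/brpop pattern, assuming the phpredis extension and a Redis server on localhost (the queue name and job fields are made up):
<?php
// Producer (e.g. in the web request): push a job onto a Redis list.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('jobs', json_encode(array('task' => 'send_mail', 'to' => 'user@example.com')));

// Consumer (a separate CLI worker): block until a job arrives.
while (true) {
    $item = $redis->brPop(array('jobs'), 5); // wait up to 5 seconds for a job
    if ($item) {
        $job = json_decode($item[1], true); // $item[0] is the list name
        // ... process $job here ...
    }
}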
Can NOT compile or run programs from PHP-CLI (on shared hosting)
set_time_limit
Most of the time, set_time_limit is not available to set to 0 (because of safe mode or the max_execution_time directive), at least on shared hosting. Also, shared hosting providers really don't like users holding up PHP processes for a long time. Most of the time the default limit is set to 30 seconds.
Cron
Use cron to write data to disk using Cache_lite. Some Stack Overflow topics already explain this:
crontab with wget - why is it running twice?
Bash commands not executed when through cron job - PHP
How can I debug a PHP CRON script that does not appear to be running?
Also rather easy, but still hacky. I think you should upgrade (to a VPS or better) when you have to resort to such hacking.
Asynchronous request
As a last resort you could do an asynchronous request, caching data with Cache_lite for example. Be aware that shared hosting providers do not like you holding up a lot of long-running PHP processes. I would use only one background process, which calls another one when it reaches the max_execution_time directive: note the time the script starts and, between a couple of cache calls, measure the time spent; when it gets near the limit, fire another asynchronous request. I would use locking to make sure only one process is running (see the sketch below). This way you will not anger the provider, and it can be done. On the other hand, I don't think I would write any of this, because it is kind of hacky if you ask me. When I got to that scale, I would upgrade to a VPS.
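A minimal sketch of that locking idea, using flock() on a made-up lock-file path:
<?php
// Ensure only one background worker runs at a time.
$lock = fopen('/tmp/cache-worker.lock', 'c'); // made-up path
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another instance already holds the lock
}
// ... regenerate the cache here ...
flock($lock, LOCK_UN);
fclose($lock);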
As far as I know, unless you're running FastCGI, you can't drop the connection and continue execution (unless you got Endophage's answer to work, which I failed to do). So you can:
Use cron or anything like that to schedule this kind of tasks
Use a child process to finish the job
But it gets worse. Even if you spawn a child process with proc_open(), PHP will wait for it to finish before closing the connection, even after calling exit(), die(), or some_undefined_function_causing_fatal_error(). The only workaround I found is to spawn a child process that itself spawns a child process, like this:
function doInBackground ($_variables, $_code)
{
proc_open (
'php -r ' .
escapeshellarg ("if (pcntl_fork() === 0) { extract (unserialize (\$argv [1])); $_code }") .
' ' . escapeshellarg (serialize ($_variables)),
array(), $pipes
);
}
$message = 'Hello world!';
$filename = tempnam (sys_get_temp_dir(), 'php_test_workaround');
$delay = 10;
doInBackground (compact ('message', 'filename', 'delay'), <<< 'THE_NOWDOC_STRING'
// Your actual code goes here:
sleep ($delay);
file_put_contents ($filename, $message);
THE_NOWDOC_STRING
);
If you are doing this to cache content, you may instead want to consider using an existing caching solution such as memcached.
No. As far as the web server is concerned, the request from the browser is handled by the PHP engine, and that's that. The request lasts as long as the PHP process handling it.
You might be able to fork() though.
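A hedged sketch of what that could look like; note that pcntl_fork() requires the pcntl extension and is generally only available from the CLI, not under a web server SAPI:
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('fork failed');
} elseif ($pid === 0) {
    // Child process: carry on with the long-running work.
    sleep(600);
    exit(0);
}
// Parent process: reply to the user immediately.
echo 'Job started.';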
For example, there is a very simple PHP script which updates some tables in a database, but this process takes a long time (maybe 10 minutes). Therefore, I want this script to continue processing even if the user closes the browser, because sometimes users do not wait: they close the browser or go to another web page.
If the task takes 10 minutes, do not use a browser to execute it directly. You have lots of other options:
- Use a cron job to execute the task periodically.
- Have the browser request insert a new row into a database table, so that a regular cron job can process the new row and execute the PHP script with the appropriate arguments (see the sketch after this list).
- Have the browser request write a message to a queue system, which has a subscriber listening for such events (which then executes the script).
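A minimal sketch of the database-table option, where every name (database, table, columns, file names) is made up for illustration:
<?php
// enqueue.php: called by the browser; returns immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
$stmt->execute(array(json_encode(array('table' => 'stats'))));
echo 'Job queued.';

<?php
// worker.php: run by cron (e.g. every minute); processes any pending jobs.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'") as $job) {
    // ... do the slow work described by $job['payload'] here ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
}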
While some of these suggestions are probably overkill for your situation, the key, unifying feature is to decouple the browser request from the execution of the job, so that the job can be completed asynchronously.
If you need the browser window updated with progress, you will need to use a periodically-executed AJAX request to retrieve the job status.
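A minimal sketch of such a status endpoint, reusing the hypothetical jobs table from the sketch above (all names are made up):
<?php
// status.php: polled by the browser via AJAX, e.g. status.php?id=42
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT status FROM jobs WHERE id = ?');
$stmt->execute(array(isset($_GET['id']) ? (int) $_GET['id'] : 0));
header('Content-Type: application/json');
echo json_encode(array('status' => $stmt->fetchColumn() ?: 'unknown'));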
To answer your question directly, see ignore_user_abort
More broadly, you probably have an architecture problem here.
If many users can initiate this stuff, you'll want the web application to add jobs to some kind of queue, and have a set number of background processes that chew through all the work.
The PHP script will keep running after the client terminates the connection (not doing so would be a security risk), but only up to max_execution_time (set in php.ini or through a PHP script; generally 30 seconds by default).
For example:
<?php
$fh = fopen("bluh.txt", 'w');
for($i=0; $i<20; $i++) {
echo $i."<br/>";
fwrite($fh,$i."\n");
sleep(1);
}
fclose($fh);
?>
Start running that in your browser and close the browser before it completes. You'll find that after 20 seconds the file contains all of the values of $i.
Change the upper bound of the for loop to 100 instead of 20, and you'll find it only runs from 0 to 29. Because of PHP's max_execution_time the script times out and dies.
If the script is completely server-based (no feedback to the user), it will run to completion even if the client is closed.
The general architecture of PHP is that a client sends a request to a script, which gives a reply to the user. If nothing is given back to the user, the script will still execute even if the user is no longer on the other side. Put more simply: there is no constant connection between server and client in a regular script.
You can make the PHP script run every 20 minutes using a crontab file, which contains the schedule and the command to run; in this case, the command is the PHP script.
Yes. The server doesn't know if the user closed the browser. At least it doesn't notice that immediately.
No: the server probably (depending on how it is configured) won't allow a PHP script to run for 10 minutes. On cheap shared hosting, I wouldn't rely on a script running for longer than a reasonable response time.
A server-side script will go on with what it is doing regardless of what the client is doing.
EDIT: By the way, are you sure that you want pages that take 10 minutes to open? I suggest you employ a task queue (whose items are executed by cron on a schedule) and redirect the user to an 'OK, I am on it' page.