This question already has answers here: How do I close a connection early? (20 answers)
Closed 9 years ago.
Is there a way in PHP to close the connection (essentially tell the browser that there's no more data to come) but continue processing? The specific circumstance I'm thinking of is that I would want to serve up cached data, and if the cache had expired, still serve the cached data for a fast response, close the connection, but continue processing to regenerate and cache new data. Essentially the only purpose is to make the site appear more responsive, as there wouldn't be the occasional delay while a user waits for content to be regenerated.
UPDATE:
PLuS has the closest answer to what I was looking for. To clarify for a couple of people, I'm looking for something that enables the following steps:
1. User requests page.
2. Connection opens to server.
3. PHP checks if cache has expired; if still fresh, serve cache and close connection (END HERE). If expired, continue to 4.
4. Serve expired cache.
5. Close connection so browser knows it's not waiting for more data.
6. PHP regenerates fresh data and caches it.
7. PHP shuts down.
UPDATE:
This is important: it must be a purely PHP solution. Installing other software is not an option.
If running under fastcgi you can use the very nifty:
fastcgi_finish_request();
http://php.net/manual/en/function.fastcgi-finish-request.php
More detailed information is available in a duplicate answer.
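A minimal sketch of how it is typically used under PHP-FPM / FastCGI (the cache-regeneration call is a hypothetical placeholder, not part of any API):

<?php
echo 'Cached content the user sees immediately';

// Flush the response and end the request as far as the client is concerned
// (only available under PHP-FPM / FastCGI).
fastcgi_finish_request();

// Execution continues here with the connection already closed.
regenerate_and_store_cache(); // hypothetical long-running work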
I finally found a solution (thanks to Google, I just had to keep trying different combinations of search terms). Thanks to the comment from arr1 on this page (it's about two thirds of the way down the page).
<?php
ob_end_clean();              // discard any default output buffering
header("Connection: close"); // tell the client to close once the body is received
ignore_user_abort(true);     // keep running even after the client disconnects
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size"); // client knows exactly how much body to expect
ob_end_flush(); // All output buffers must be flushed here
flush();        // Force output to client
// At this point the client can close the connection
// Do processing here
sleep(30);
echo('Text user will never see');
I have yet to actually test this but, in short, you send two headers: one that tells the browser exactly how much data to expect, then one to tell the browser to close the connection (which it will only do after receiving the expected amount of content).
You can do that by setting the time limit to unlimited and ignoring whether the user aborts the connection:
<?php
ignore_user_abort(true);
set_time_limit(0);
see also: http://www.php.net/manual/en/features.connection-handling.php
PHP doesn't have such persistence (by default). The only way I can think of is to run cron jobs to pre-fill the cache.
Can compile and run programs from PHP-CLI (not on shared hosting; VPS or better)
Caching
For caching I would not do it that way. I would use Redis as my LRU cache. It is going to be very fast (see the benchmarks), especially when you use the client library written in C.
Offline processing
When you install the beanstalkd message queue you can also do delayed puts. But I would use Redis BRPOP/RPUSH for the rest of the message queuing because Redis is going to be faster, especially if you use the PHP client library written in C.
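A rough sketch of that queuing pattern, assuming the phpredis extension and a made-up list name 'jobs' (the job payload is just an example):

<?php
// Producer (inside the web request): push a job description onto a Redis list.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('jobs', json_encode(array('action' => 'regenerate_cache', 'key' => 'homepage')));

// Worker (a separate CLI process): block until a job arrives, then process it.
while (true) {
    $job = $redis->brPop(array('jobs'), 5); // wait up to 5 seconds for a job
    if ($job) {
        $payload = json_decode($job[1], true);
        // ... do the slow work described by $payload here ...
    }
}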
Can NOT compile or run programs from PHP-CLI (on shared hosting)
set_time_limit
Most of the time set_time_limit is not available (because of safe mode or the max_execution_time directive), at least not to set it to 0, when on shared hosting. Shared hosting providers also don't like users holding up PHP processes for a long time. Most of the time the default limit is set to 30 seconds.
Cron
Use cron to write data to disk using Cache_Lite (a rough sketch follows below the links). Some Stack Overflow topics already explain this:
crontab with wget - why is it running twice?
Bash commands not executed when through cron job - PHP
How can I debug a PHP CRON script that does not appear to be running?
Also rather easy, but still hacky. I think you should upgrade (to a VPS or better) when you have to do such hacking.
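Roughly, the cron-driven script could look like this, assuming the PEAR Cache_Lite package and a writable cache directory; the regeneration function is made up:

<?php
require_once 'Cache/Lite.php';

$cache = new Cache_Lite(array(
    'cacheDir' => '/tmp/',  // must be writable by the cron user
    'lifeTime' => 3600,     // one hour
));

// Regenerate the cached content and write it to disk; the web request
// then only calls $cache->get('homepage') and returns instantly.
$data = regenerate_homepage_content(); // hypothetical expensive call
$cache->save($data, 'homepage');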
Asynchronous request
As a last resort you could do an asynchronous request, caching the data using Cache_Lite for example. Be aware that shared hosting providers do not like you holding up a lot of long-running PHP processes. I would use only one background process which calls another one when it reaches the max_execution_time directive. I would note the time when the script starts, measure the time spent between a couple of cache calls, and when it gets near the limit I would fire another asynchronous request. I would use locking to make sure only one process is running. This way I will not piss off the provider and it can be done. On the other hand, I don't think I would write any of this because it is kind of hacky if you ask me. When I get to that scale I would upgrade to a VPS.
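A sketch of that locking and self-re-triggering idea; the host, path, lock file and timings are placeholders, not a tested recipe:

<?php
$start = time();
$lock  = fopen('/tmp/cache-worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another worker is already running
}

while (there_is_work_left()) {        // hypothetical check
    regenerate_one_cache_entry();     // hypothetical unit of work
    if (time() - $start > 25) {       // stay under a 30-second limit
        flock($lock, LOCK_UN);        // release the lock first
        // Fire-and-forget request to start a fresh worker, then stop this one.
        $fp = fsockopen('example.com', 80, $errno, $errstr, 5);
        fwrite($fp, "GET /cache-worker.php HTTP/1.1\r\nHost: example.com\r\nConnection: Close\r\n\r\n");
        fclose($fp);
        exit;
    }
}
flock($lock, LOCK_UN);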
As far as I know, unless you're running FastCGI, you can't drop the connection and continue execution (unless you got Endophage's answer to work, which I didn't). So you can:
Use cron or anything like that to schedule these kinds of tasks
Use a child process to finish the job
But it gets worse. Even if you spawn a child process with proc_open(), PHP will wait for it to finish before closing the connection, even after calling exit(), die(), or some_undefined_function_causing_fatal_error(). The only workaround I found is to spawn a child process that itself spawns a child process, like this:
function doInBackground ($_variables, $_code)
{
proc_open (
'php -r ' .
escapeshellarg ("if (pcntl_fork() === 0) { extract (unserialize (\$argv [1])); $_code }") .
' ' . escapeshellarg (serialize ($_variables)),
array(), $pipes
);
}
$message = 'Hello world!';
$filename = tempnam (sys_get_temp_dir(), 'php_test_workaround');
$delay = 10;
doInBackground (compact ('message', 'filename', 'delay'), <<< 'THE_NOWDOC_STRING'
// Your actual code goes here:
sleep ($delay);
file_put_contents ($filename, $message);
THE_NOWDOC_STRING
);
If you are doing this to cache content, you may instead want to consider using an existing caching solution such as memcached.
No. As far as the webserver is concerned, the request from the browser is handled by the PHP engine, and that's that. The request lasts as long as the PHP script does.
You might be able to fork() though.
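If the pcntl extension is available (usually only in CLI builds, and generally discouraged inside a web-server module), a fork could look roughly like this; the output path is made up:

<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('could not fork');
} elseif ($pid === 0) {
    // Child: carry on with the slow work.
    sleep(30);
    file_put_contents('/tmp/result.txt', 'done'); // hypothetical result file
    exit(0);
}
// Parent: respond to the user and finish immediately.
echo "Work started in background (pid $pid)";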
Related
I have a problem with router/modem/ISP timeouts where my page comes back blank with timeout warnings. It seems not to be an issue on the server side, since the page will eventually load, given enough time, on a different router/modem/ISP.
Let us assume that I have no other way to optimize the running time, and the page will need to run as long as it does. Is there any way to 100% prevent timeouts in the client's browser? I coded it in PHP.
If you are running a resource-intensive script, consider increasing your timeout limit using the set_time_limit() function.
set_time_limit(60); // 60 seconds
If you hate your server, you can even use 0 for no limit:
set_time_limit(0); // to stop this, you would need to restart your web server
If you want your script to continue working even if the request is aborted, such as when the web browser is closed by the user, you may be interested in using ignore_user_abort().
<?php
set_time_limit(60);
ignore_user_abort(true);
echo 'Hello world! Wait for it...';
sleep(30);
file_put_contents('testing.txt','Even if you closed your browser, this file will be created.');
echo 'Thanks for waiting!';
?>
Reminder
Use of these functions may NOT be a good practice. Consider revising your script. If this is a script that will be accessed by multiple users, this could crash your server.
This question occurred to me when I encountered a bug that caused my PHP program to loop infinitely. Here is an example situation:
Suppose I have a PHP webpage that receives picture uploads (the page is perhaps the response page for an image upload form). On the server, the script should store the image in a temporary file. The script should then output a confirmation message to the client and stop sending data so that the client does not wait. The script should then continue executing, processing the image (for example resizing it) before ending.
I think this "technique" could be useful so that the client does not wait during time-consuming processes, thereby preventing timeouts.
Also, could this be solved using HTTP methods?
Yes.
This can easily be done without any asynchronous processing if you correctly utilize HTTP headers.
Under normal conditions PHP will stop processing as soon as the client on the other end closes the connection. If you want to continue processing after this event, you need to do one thing: tell PHP to ignore user aborts. How?
ignore_user_abort()
This will allow your script to keep running even after the client gets the heck out of dodge. But we're also faced with the problem of how to tell the client that the request they made is finished so that it will close the connection. Normally, PHP transparently handles sending these headers for us if we don't specify them. Here, though, we need to do it explicitly or the client won't know when we want them to stop reading the response.
To do this, we have to send the appropriate HTTP headers to tell the client when to close:
Connection: close
Content-Length: 42
This combination of headers tells the client that once it reads 42 bytes of entity body response that the message is finished and that they should close the connection. There are a couple of consequences to this method:
You have to generate your response BEFORE sending any output because you have to determine its content length in bytes so you can send the correct header.
You have to actually send these headers BEFORE you echo any output.
So your script might look something like this:
<?php
ignore_user_abort(true);
// do work to determine the response you want to send ($responseBody)
$contentLength = strlen($responseBody);
header('Connection: close');
header("Content-Length: $contentLength");
echo $responseBody;
flush(); // push the headers and body out to the client
// --- client will now disconnect and you can continue processing here ---
The big "Gotchya" with this method is that when you're running PHP in a web SAPI you can easily run up against the max time limit directive if you do time-consuming processing after the end user client closes the connection. If this is a problem, you may need to consider an asynchronous processing option using cron because there is no time limit when PHP runs in a CLI environment. Alternatively, you could just up the time limit of your scripts in the web environment using set_time_limit.
It's worth mentioning that if you do something like this, you may also want to add a check to connection_aborted() while generating your response body so that you can avoid the additional processing if the user aborts before completing the transfer.
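For example, a check like this could short-circuit the response generation if the client has already gone away (the chunk generator is hypothetical):

$responseBody = '';
while ($chunk = get_next_chunk()) {  // hypothetical source of response data
    if (connection_aborted()) {
        break;                       // client left; skip the remaining work
    }
    $responseBody .= $chunk;
}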
I faced the same problem when uploading images to Twitter & Facebook from an iPhone through a PHP web service.
If the processing time of the image upload is not much, then check the comment from @Musa; that may help you. But if it takes too much time to process, then try these steps (a rough sketch of the cron part follows below):
1. Store the image in a folder.
2. Fetch the image from the folder using cron.
3. Run the cron every 2 minutes in the background.
This will decrease your processing time.
Hope this helps.
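A rough sketch of the cron-driven part; the folder paths and the resize helper are made up:

<?php
// Called from cron, e.g.: */2 * * * * php /path/to/process_uploads.php
foreach (glob('/path/to/uploads/*.jpg') as $file) {
    resize_image($file); // hypothetical helper doing the heavy work
    rename($file, '/path/to/processed/' . basename($file));
}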
It is advisable to do this asynchronously. That is, make another script which only processes the previously created tmp files, and run it with cron (don't even involve Apache). When PHP is running as a web-server module, it should be dedicated to quickly forming a response and then going away to free up resources for the next request.
You are doing the right thing by thinking this way; just keep going one small architectural step further, and fully decouple the request from the heavy lifting that needs to take place.
You can do it in several ways:
1.
ob_start();
// ... generate output here ...
header("Content-Length: ".ob_get_length());
header("Connection: close");
ob_end_flush();
flush(); // make sure the buffered output actually reaches the client
// do other stuff
2.
Close the process using PHP's system() or exec().
3.
Close the process using a shell script.
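For options 2 and 3 the usual trick is to hand the work off to a detached process, roughly like this (the worker script path is made up):

// Fire-and-forget: start a CLI PHP process in the background so the
// current request can return immediately.
exec('php /path/to/worker.php > /dev/null 2>&1 &');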
You can use ob_implicit_flush(). It turns implicit flushing on or off. Implicit flushing results in a flush operation after every output call, so that explicit calls to flush() are no longer needed.
refer to
How do i implement this scenario using PHP?
OR
You could create a standalone cron job, which will run after a specific amount of time and do the work asynchronously, without letting the user know what processing is going on or making the user wait. This way you will even be able to detect the failed cases.
And you should also try to minimize the loading time.
I am looking for the PHP equivalent for VB doevents.
I have written a realtime analysis package in VB and used doevents to release to the operating system.
Doevents allows me to stay in memory and run continuously without filling up memory and allows me to respond to user input.
I have rewritten the package in PHP and I am looking for that same doevents feature.
If it doesn't exist I could reschedule myself and exit.
But I currently don't know how to do that and I think that would add a lot more overhead.
Thank you, gerardg
usleep is what you are looking for. It delays program execution for the given number of microseconds.
http://php.net/manual/en/function.usleep.php
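For example, a polling loop could yield with usleep between iterations (the event-processing call is a placeholder):

<?php
while (true) {
    process_pending_events(); // hypothetical unit of work
    usleep(100000);           // sleep 100 ms so the process doesn't spin the CPU
}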
It's been almost 10 years since I last wrote anything in VB and, as I recall, the DoEvents() function allowed the application to yield to the processor during intensive processing (usually to allow other system events to fire - the most common being WM_PAINT so that your UI won't appear hung).
I don't think PHP has such functionality - your script will run as a single process and end (either when it's done or when it hits the default 30 second timeout).
If you are thinking in terms of threads (as most Windows programmers tend to do) and needing to spawn more than one instance of your script, perhaps you should take a look at PHP's Process Control functions as a start.
I'm not entirely sure which aspects of doevents you're looking to emulate, so here's pretty much everything that could be useful for you.
You can use ob_implicit_flush(true) at the top of your script to enable implicit output buffer flushing. That means that whenever your script calls echo or print or whatever you use to display stuff, PHP will automatically send it all to the user's browser. You could also just use ob_flush() after each call to display something, which acts more like Application.DoEvents() in VB with regards to keeping your UI active, but must be called each time something is output.
Naturally if your script uses the output buffer already, you could build a copy of the buffer before flushing, with ob_get_contents().
If you need to allow the script to run for more time than usual, you can set a longer timeout with set_time_limit($time). If you need more memory, and you have access to edit your .htaccess file, place the following code and edit the value:
php_value memory_limit 64M
That sets the memory limit to 64 megabytes.
For running multiple scripts at once, you can use pcntl_exec to start another one running.
If I am missing something important about DoEvents(), let me know and I will try to help you make it work.
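Putting a few of those pieces together, a long-running request that keeps reporting progress might look roughly like this; the limit value and the work function are assumptions:

<?php
ob_implicit_flush(true); // send output to the browser as soon as it is produced
set_time_limit(300);     // allow up to 5 minutes for this run (assumed value)

for ($i = 0; $i < 100; $i++) {
    heavy_work_step($i);           // hypothetical long-running step
    echo "finished step $i<br>\n"; // reaches the browser immediately
}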
PHP is designed for asynchronous on demand processing. However it can be forced to become a background task with a little hackery.
As PHP is running as a single thread you do not have to worry about letting the CPU do other things, as that is already taken care of. If this were not the case then a web server would only be able to serve up one page at a time and all other requests would have to sit in a queue. You will need to write some sort of loop that never exits until some detectable condition happens (like the "now please exit" message you set in the DB or something).
As pointed out by others you will need to call set_time_limit($something); with perhaps usleep stopping the code from running "too fast" if it eats very much CPU each loop. However, if you are also using a database connection, most of your script time is actually spent waiting for the database (by far the biggest overhead for a script).
I have seen PHP worker threads created by using screen and detaching it as a background task. Other approaches also work so long as you do not have a session that will time out or exit (say, when the web browser is closed). A cron job that starts a script every x minutes or hours to check whether the worker is still running gives you automatic recovery from forced exits and/or system restarts.
TL;DR: doevents is "baked in" to PHP and you don't have to worry about it.
I have a PHP script that grabs a chunk of data from a database, processes it, and then looks to see if there is more data. This processes runs indefinitely and I run several of these at a time on a single server.
It looks something like:
<?php
while($shouldStillRun)
{
// do stuff
}
logThatWeExitedLoop();
?>
The problem is, after some time, something causes the process to stop running and I haven't been able to debug it and determine the cause.
Here is what I'm using to get information so far:
error_log - Logging all errors, but no errors are shown in the error log.
register_shutdown_function - Registered a custom shutdown function. This does get called so I know the process isn't being killed by the server, it's being allowed to finish. (or at least I assume that is the case with this being called?)
debug_backtrace - Logged a debug_backtrace() in my custom shutdown function. This shows only one call and it's my custom shutdown function.
Log if reaches the end of script - Outside of the loop, I have a function that logs that the script exited the loop (and therefore would be reaching the end of the source file normally). When the script dies randomly, it's not logging this, so whatever kills it, kills it while it's in the middle of processing.
What other debugging methods would you suggest for finding the culprit?
Note: I should add that this is not an issue with max_execution_time, which is disabled for these scripts. The time before being killed is inconsistent. It could run for 10 seconds or 12 hours before it dies.
Update/Solution: Thank you all for your suggestions. By logging the output, I discovered that when a MySql query failed, the script was set to die(). D'oh. Updated it to log the mysql errors and then terminate. Got it working now like a charm!
I'd log memory usage of your script. Maybe it acquires too much memory, hits memory limit and dies?
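A cheap way to do that is to append memory_get_usage() to a log file on every iteration (the log path is arbitrary):

// Inside the long-running loop:
file_put_contents(
    '/tmp/worker-memory.log',
    date('c') . ' ' . memory_get_usage(true) . " bytes\n",
    FILE_APPEND
);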
Remember, PHP has a directive in the ini file that says how long a script should run: max_execution_time.
Make sure that you are not going over this, or use set_time_limit() to increase the execution time. Is this program running through a web server or via the CLI?
Adding: My Bad Experiences with PHP. Looking through some background scripts I wrote earlier this year. Sorry, but PHP is a terrible scripting language for doing anything for long lengths of time. I see that newer PHP (which we haven't upgraded to) adds the functionality to force the GC to run. The problem I've been having is using too much memory because the GC almost never runs to clean up after itself. If you use things that recursively reference themselves, they also will never be freed.
Creating an array of 100,000 items allocates memory, but then setting the array to an empty array or splicing it all out does NOT free it immediately, and doesn't mark it as unused (i.e. making a new 100,000 element array increases memory).
My personal solution was to write a perl script that ran forever, and system("php my_php.php"); when needed, so that the interpreter would free completely. I'm currently supporting 5.1.6, this might be fixed in 5.3+ or at the very least, now they have GC commands that you can use to force the GC to cleanup.
Simple script
#!/usr/bin/perl -w
use strict;
while(1) {
if( system("php /to/php/script.php") != 0 ) {
sleep(30);
}
}
then in your php script
<?php
// do a single processing block
if( $moreblockstodo ) {
exit(0);
} else {
// no? then lets sleep for a bit until we get more
exit(1);
}
?>
I'd log the state of the function to a file in a few different places in each loop.
You can get the contents of most variables as a string with var_export, using the var_export($varname,true) form.
You could just log this to a certain file, and keep an eye on it. The latest state of the function before the log ends should provide some clues.
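For example (the variable names beyond $shouldStillRun are placeholders):

// Inside the loop, at the points of interest:
file_put_contents(
    '/tmp/worker-state.log',
    date('c') . ' ' . var_export(compact('shouldStillRun', 'currentItem'), true) . "\n",
    FILE_APPEND
);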
Sounds like whatever is happening is not a standard PHP error. You should be able to throw your own errors using a try... catch statement, which could then be logged. I don't have more details other than that because I'm on my phone away from a PC.
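Something along these lines, where the unit of work is hypothetical:

try {
    process_chunk($row); // hypothetical unit of work inside the loop
} catch (Exception $e) {
    error_log('worker failed: ' . $e->getMessage());
    throw $e;            // or handle it and continue, as appropriate
}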
I've encountered this before on one of our projects at work. We have a similar setup - a PHP script checks the DB if there are tasks to be done (such as sending out an email, updating records, processing some data as well). The PHP script has a while loop inside, which is set to
while(true) {
//do something
}
After a while, the script will also be killed somehow. I've already tried most of what has been said here, like setting max_execution_time, using var_export to log all output, placing a try/catch, redirecting the script output (php ... > output.txt), etc., and we've never been able to find out what the problem is.
I think PHP just isn't built to do background tasks by itself. I know it's not answering your question (how to debug this), but the way we worked around this is that we used a cron job to call the PHP file every 5 minutes. This is similar to Jeremy's answer of using a Perl script; it ensures that the interpreter is free after the execution is done.
If this is on Linux, try to look into system logs - the process could be killed by the OOM (out-of-memory) killer (unlikely, you'd also see other problems if this was happening), or a segmentation fault (some versions of PHP don't like some versions of extensions, resulting in weird crashes).
I'm trying to make a sort of PHP bot. The idea is to have two PHP files, named a.php and b.php. a.php does something, then sleeps 30 seconds and calls b.php; b.php ends the HTTP request, does some processing, and then calls a.php, which ends the HTTP request, and so on.
The only problem now is how to end the HTTP request made using cURL. I've tried the code below:
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Will not work
flush(); // Unless both are called !
// At this point, the browser has closed connection to the web server
// Do processing here
echo('Text user will never see');
The slight problem is that it doesn't work, and I actually see "Text user will never see". I've tried cron jobs and such, but the host doesn't allow it. I can't set the script timeout limit either. So my only option is to create repeating PHP scripts. So how would I send the HTTP request?
Based on the new understanding of your problem: you are creating a system that checks a remote URL every 30 seconds to monitor a fragment of content. For this I recommend a cron, which can either be server based: http://en.wikipedia.org/wiki/Cron or web based if your host does not permit it: http://www.webbasedcron.com/ (example).
PHP scripts in this case run in the context of a web server request, therefore you can't stop talking to the web connection and then continue doing stuff, which is what I think you're attempting to do with the connection close.
The reason you're seeing the output at the end is because at the end of a script PHP will call an implicit flush (see ob_implicit_flush in the manual), but you close the connection to the browser by ending the PHP script.
Ways around this:
You might be able to use set_time_limit to extend the execution limit. DO NOT USE ZERO. It's tempting to say "take all the time you need" on a post-process script, but that way lies madness and bitter sysadmins, plus remember you're still running on curl's timeout stopwatch (though you can extend that as an option). set_time_limit(5) will give you five more seconds, so doing that periodically (see the sketch after this list) will allow you to do your post-processing but - if you're careful - still protect you from infinite loops. Infinite loops with no execution limits in the context of Apache requests are also likely to make you unpopular with your sysadmin.
It might be possible to build a shell script in your application, save it to disk, execute that in the background and have it delete itself after. That way it will run outside the web-request context, and if the script still exists when you next do the request, you can know that the other processing is still happening. Be really careful about things that might take longer than your gap between executions, as that way leads to sorrow and more bitter sysadmins. This course of action would get you thrown off my hosting environment if you did it without talking to me about it first, though, as it's a terrible hack with a myriad of possible security issues.
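As a sketch of the first workaround above, periodically extending the limit instead of disabling it (the work function is a placeholder):

foreach ($workItems as $item) {
    set_time_limit(5);   // reset the clock: five more seconds from this point
    process_item($item); // hypothetical post-processing step
}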
But you appear to be attempting to run a regular batch process on a system where they don't want you to do that - or they'd have given you access to cron - so your best and most reliable method is to find a host that actually supports the thing you're trying to do.