Run cleanup code after die() - PHP

I'm working on speeding up the response time of some PHP code that generates HTML. One of the issues with the code is that when it determines a piece of information does not need to be displayed, it makes an SQL call to delete the item from the database. This isn't visible to the user, and the change won't be visible to the server until the next time the page is loaded, so that SQL query does not need to be run as soon as the system knows that it should be run.
What I would like to do is return the response to the user, with the generated HTML, and then make the SQL queries. I was trying this with flush() and ob_flush(), but the page response is still not loaded until I make a call to die().
Is there any way in PHP to run code after a call to die(), so that the user gets their data and then I can run my database cleanup code, with the client no longer waiting on me to close the connection?

You can register shutdown functions using register_shutdown_function:
register_shutdown_function(function () {
    // cleanup stuff
});
Or in older versions of PHP:
function myFunc() {
    // cleanup stuff
}
register_shutdown_function("myFunc");

Thanks to @robbrit and @Luis Siquot. I was looking at register_shutdown_function, and because of Luis' comment I was reading the comments on that page, where I came across "When using php-fpm, fastcgi_finish_request() should be used instead of register_shutdown_function() and exit()",
which led me to fastcgi_finish_request, which says:
"This function flushes all response data to the client and finishes the request. This allows for time consuming tasks to be performed without leaving the connection to the client open."
So it looks like fastcgi_finish_request() is what I'm looking for, not register_shutdown_function().
Edit: It seems that fastcgi_finish_request() is only available when PHP runs under PHP-FPM, so instead use:
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // just to be safe
ob_start();
echo "The client will see this!";
$size = ob_get_length();
header("Content-Length: $size");
//Both of these flush methods must be called, otherwise strange things happen.
ob_end_flush();
flush();
echo "The client will never see this";

Related

register_shutdown_function Or ignore_user_abort?

Hey, I've seen people recommend each of them. One claimed register_shutdown_function to be better, but without explanation.
I'm talking about which is better for sending a response back and still performing other tasks.
I wondered which really is the better method, and why.
EDIT:
In the register_shutdown_function documentation, someone published the following method:
<?php
function endOutput($endMessage){
    ignore_user_abort(true);
    set_time_limit(0);
    header("Connection: close");
    header("Content-Length: ".strlen($endMessage));
    echo $endMessage;
    echo str_repeat("\r\n", 10); // just to be sure
    flush();
}
// Must be called before any output
endOutput("thank you for visiting, have a nice day");
sleep(100);
mail("you@yourmail.com", "ping", "im here");
?>
Could it be better than any of the functions I stated?
ignore_user_abort() tells PHP/Apache to not terminate execution when the user disconnects. register_shutdown_function simply allows you to do some cleanup while PHP is in the process of shutting down.
register_shutdown_function is only useful if you need to do some cleanup that PHP's normal shutdown routines wouldn't take care of, e.g. removing a manually created lock file, flipping a bit in a DB record somewhere, etc...
In older versions of PHP (<4.1.0 under Apache), register_shutdown_function() would ensure that the connection was closed before your shutdown functions ran. This is no longer the case. The endOutput() function in your edit should indeed do what you want, provided you don't have any output buffers open. Though, it does set the script to be able to run forever if necessary, which could be annoying if it goes into an infinite loop (especially during debugging). You might want to change set_time_limit() to use a value that actually reflects how many seconds the script should take.
It's probably best to avoid register_shutdown_function() if you don't need it, since it has some other odd behavior (such as not being able to add a second layer of shutdown functions to run if the first shutdown function calls exit()).
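For illustration, a minimal sketch of the kind of cleanup mentioned above (removing a manually created lock file); the path is hypothetical:
<?php
// Hypothetical lock file created earlier in the request
$lockFile = '/tmp/myjob.lock';
touch($lockFile);

// The registered callback runs during PHP shutdown, even after exit()/die()
register_shutdown_function(function () use ($lockFile) {
    if (file_exists($lockFile)) {
        unlink($lockFile); // cleanup PHP wouldn't do on its own
    }
});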

How to flush data to browser but continue executing

I have an ob_start() and a corresponding ob_flush(). I would like to flush a portion of the data and continue executing the rest. Using ob_flush() didn't help. Also, if possible, the rest needs to happen without the browser showing that it is still loading.
EDIT:
I don't want to use ajax
I have done this in the past and this is how I solved it:
ob_start();
/*
* Generate your output here
*/
// Ignore connection-closing by the client/user
ignore_user_abort(true);
// Set your time limit to a length long enough for your script to run,
// but not so long it will bog down your server in case multiple versions run
// or this script gets into an endless loop.
if (
    !ini_get('safe_mode')
    && strpos(ini_get('disable_functions'), 'set_time_limit') === FALSE
){
    set_time_limit(60);
}
// Get your output and send it to the client
$content = ob_get_contents(); // Get the content of the output buffer
ob_end_clean(); // Close current output buffer
$len = strlen($content); // Get the length
header('Connection: close'); // Tell the client to close connection
header("Content-Length: $len"); // Close connection after $len characters
echo $content; // Output content
flush(); // Force php-output-cache to flush to browser.
// See caveats below.
// Optional: kill all other output buffering
while (ob_get_level() > 0) {
    ob_end_clean();
}
As I said in a couple of comments before, you should watch out for gzipping your content, since that will alter the length of your content but not change the header about it. It can also buffer your output, so it won't get sent to the client instantly.
You could try letting Apache know not to gzip your content by using apache_setenv('no-gzip', '1');. But this will not work if you use rewrite rules to reach your page, since then it will also modify those environment variables. At least, it did so for me.
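If you need to rule compression out, a hedged sketch (whether these calls have any effect depends on your SAPI and server modules):
<?php
// Ask mod_deflate not to gzip this response (apache_setenv exists under mod_php only)
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1');
}
// Turn off PHP's own output compression before any output is sent
@ini_set('zlib.output_compression', 'Off');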
See more caveats about flushing your content to the user in the manual.
ob_flush writes the buffer. In other words, ob_flush tells PHP to give Apache (or nginx/lighttpd/whatever) the output and then for PHP to forget about it. Once Apache has the output, it does whatever it wants with it. (In other words, after ob_flush it's out of your control whether or not it gets immediately written to the browser).
So, short answer: There's no guaranteed way to do that.
Just a guess, you're likely looking for AJAX. Whenever people are trying to manipulate when page content loads as you're doing, AJAX is almost always the correct path.
If you want to continue a task in the background, you can use ignore_user_abort, as detailed here, however, that is often not the optimal approach. You essentially lose control over that thread, and in my opinion, a web server thread is not where heavy processing belongs.
I would try to extract it out of the web-facing stuff. This could mean a cron entry or just spawning a background process from inside of PHP (a process that, though started from inside script execution, will not die with the script, and the script will not wait for it to finish before exiting).
If you do go that route, it will mean that you can even make some kind of status system if necessary. Then you could monitor the execution and give the user periodic updates on the progress. (Technically you could make a status system with an ignore_user_abort-ed script too, but it doesn't seem as clean to me.)
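For example, a minimal sketch of spawning a detached worker on a Unix-like host (the worker path and argument are hypothetical):
<?php
$searchId = 42; // hypothetical work item
// nohup + & + redirected output: exec() returns immediately,
// and the worker keeps running after this request ends
$cmd = sprintf(
    'nohup php %s %d > /dev/null 2>&1 &',
    escapeshellarg('/path/to/worker.php'),
    $searchId
);
exec($cmd);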
This is my function:
function bg_process($fn, $arr = array()) {
    $call = function($fn, $arr){
        header('Connection: close');
        header('Content-length: '.ob_get_length());
        ob_flush();
        flush();
        call_user_func_array($fn, $arr);
    };
    register_shutdown_function($call, $fn, $arr);
}
It wraps the function to be executed at the end, after PHP closes the connection, and of course the browser will stop loading.
function test() {
    while (true) {
        echo 'this text will never be seen by the user';
    }
}
This is how to call the function:
bg_process('test');
The first argument is the callable; the second argument is an indexed array of arguments to be passed to the 'test' function.
Note : I don't use ob_start() at the beginning of the script.
I have an article explaining how this can be achieved using apache/mod_php on my blog here: http://codehackit.blogspot.com/2011/07/how-to-kill-http-connection-and.html Hope this helps, cheers
If you are using PHP-FPM:
ignore_user_abort(true);
fastcgi_finish_request();
The two functions above are the key pieces: ignore_user_abort() keeps the script from being terminated, and fastcgi_finish_request() closes the client connection.
fastcgi_finish_request
This function flushes all response data to the client and finishes the request. This allows for time consuming tasks to be performed without leaving the connection to the client open.
Not available on Apache. (PHP 5 >= 5.3.3, PHP 7)
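A minimal sketch of the PHP-FPM pattern, assuming the slow work is the deferred cleanup mentioned above:
<?php
ignore_user_abort(true);
echo "The client sees this immediately";
fastcgi_finish_request(); // response is flushed, connection is closed
// The client is no longer waiting; do the slow work here
sleep(10);
// e.g. run the deferred SQL DELETE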
Use:
header("Content-Length: $len");
...where $len is the length of the data to be flushed to the client.
I don't have the background to know when and where this is going to work, but I tried on a few browsers, and all returned instantly with:
<?php
header("Content-Length: 5");
echo "this is more than 5";
sleep(5);
?>
Edit: Chrome, IE, and Opera showed only "this " (the first 5 bytes), while Firefox showed "this is more than 5". All of them closed the request after that, though.

Fire-and-forget in PHP

Final update
Seems like I did make a very simple error. Since I already have a stream implementation I can just not start reading from the stream :D
I'm trying to achieve fire-and-forget like functionality in PHP.
From php.net
<?php
ignore_user_abort(true);
header("Content-Length: 4");
header("Connection: Close");
echo "abcd";
flush();
sleep(5);
echo "Text user should not see"; // because it should have terminated
?>
This works if I open the script with a browser. (shows "abcd").
But if I open it with file_get_contents or some stream library it will wait for ~5 seconds and show the second text as well.
I'm using PHP 5.2.11 / Apache 2.0
Update
It seems there is some confusion about what I'm trying to accomplish.
I don't want to hide output using output buffers (that's stupid). I want to have the client terminate before the server starts a possibly lengthy process (sleep(5)), and I don't want the client to wait for it (this is what fire-and-forget means, sort of).
The use of output buffers is merely a side effect. I've amended the sample code without the use of output buffers.
What I don't understand is: why does this script behave differently when accessing it from the browser vs. fetching it in PHP with file_get_contents("http://dev/test.php") or some stream library? What I've seen in testing is that, for instance, stream_get_contents will actually block for 5 seconds before it returns any output at all, which is quite the opposite of what I want.
Update2
Some more results:
The browser somehow responds to the flush(). I can't figure out how to replicate this behavior with streams in PHP, my streams keep blocking.
I've tried fread and found that it behaves similarly to stream_get_contents.
Specifying a maxlength has no effect, it will still block for ~5 seconds.
Changing the blocking mode has no effect (other than generating a bunch more calls to stream_get_contents()). It will wait ~5 seconds before returning anything.
stream_set_read_buffer has no effect (tested on a PHP 5.3.5 server)
The second portion of text is showing up because you're stopping output buffering with ob_end_flush() and ob_end_clean(). When that happens PHP outputs content as normal. Try something like the following:
<?php
ob_start(); // turn on output buffering
print "Text the user will see.";
ob_flush(); // send above output to the user and keep output buffering on
print "Text the user will never see";
ob_end_clean(); // empty the buffer and turn off output buffering. your script should end here.
?>
It's important for ob_end_clean() to appear at the end of the script. It empties the buffer and does not send its contents to the user, thus keeping everything after ob_flush() hidden.
How do you access the script using file_get_contents? How do you access it with your browser? If you access the script without "http://", of course it will never get executed. Use the same URL as in the browser.
Edit:
The browser will render the page even before the connection is closed. Even if you flush, I don't think the connection is closed. You can fire up Wireshark and check. stream_get_contents and file_get_contents will block until they have all the output. Even if you flushed, they can't be sure that there isn't more content. Since the Content-Length header didn't seem to make {file,stream}_get_contents return earlier, you probably need to implement your own buffering, à la fopen, read, fclose.
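One way around the high-level wrappers is to not read at all: open a socket, write the request, and close. A sketch (host and path are the ones from the question; the target script should call ignore_user_abort(true)):
<?php
$fp = @fsockopen('dev', 80, $errno, $errstr, 1); // 1-second connect timeout
if ($fp) {
    $request  = "GET /test.php HTTP/1.1\r\n";
    $request .= "Host: dev\r\n";
    $request .= "Connection: Close\r\n\r\n";
    fwrite($fp, $request);
    fclose($fp); // never read the response; the remote script keeps running
}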
Seems like I did make a very simple error. Since I already have a stream implementation I can just not start reading from the stream :D

PHP Async Execution

Scenario is as follows:
Call to a specified URL including the Id of a known SearchDefinition should create a new Search record in a db and return the new Search.Id.
Before returning the Id, I need to spawn a new process / start async execution of a PHP file which takes in the new Search.Id and does the searching.
The UI then polls a 3rd PHP script to get status of the search (2nd script keeps updating search record in the Db).
This gives me a problem around spawning the 2nd PHP script in an async manner.
I'm going to be running this on a 3rd party server so have little control over permissions. As such, I'd prefer to avoid a cron job/similar polling for new Search records (and I don't really like polling if I can avoid it). I'm not a great fan of having to use a web server for work which is not web-related but to avoid permissions issues it may be required.
This seems to leave me 2 options:
Calling the 1st script returns the Id and closes the connection but continues executing and actually does the search (i.e. stick script 2 at the end of script 1, but close the response at the append point).
Launch a second PHP script in an asynchronous manner.
I'm not sure how either of the above could be accomplished. The first still feels nasty.
If it's necessary to use cURL or similar to fake a web call, I'll do it, but I was hoping for some kind of convenient multi-threading approach where I simply spawn a new thread and point it at the appropriate function, and permissions would be inherited from the caller (i.e. the web server user).
I'd rather use option 1. This would also keep related functionality closer to each other.
Here is a hint on how to send something to the user, then close the connection and continue executing:
(by tom ********* at gmail dot com, source: http://www.php.net/manual/en/features.connection-handling.php#93441)
<?php
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true); // optional
ob_start();
echo ('Text user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean();
//do processing here
sleep(5);
echo('Text user will never see');
//do some processing
?>
swoole: asynchronous & concurrent extension.
https://github.com/matyhtf/swoole
event-driven
full asynchronous non-blocking
multi-thread reactor
multi-process worker
millisecond timer
async MySQL
async task
async read/write file system
async dns lookup

Can a PHP script trick the browser into thinking the HTTP request is over?

I first configure my script to run even after the HTTP request is over
ignore_user_abort(true);
then flush out some text.
echo "Thats all folks!";
flush();
Now how can I trick the browser into thinking the HTTP request is over, so I can continue doing my own work without the browser showing "page loading"?
header(??) // something like this?
Here's how to do it. You tell the browser to read in the first N characters of output and then close the connection, while your script keeps running until it's done.
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Will not work
flush(); // Unless both are called !
// At this point, the browser has closed connection to the web server
// Do processing here
echo('Text user will never see');
?>
Headers won't work (they're headers, so they come first)
I don't know of any way to close the http connection without terminating the script, though I suppose there's some obscure way of doing it.
Telling us what you want to do after the request is done would help us give better suggestions.
But generally, I'd be thinking about one of the following:
1) Execute some simple command-line script (using exec()) that looks like:
#!/bin/sh
php myscript.php <arg1> <arg2> .. <argN> &
Then kick that off from your http-bound script like:
<?php
exec('/path/to/my/script.sh');
?>
Or:
2) Write another program (possibly a continuously-running daemon, or just some script that is cronned ever so often), and figure out how your in-request code can pass it instructions. You could have a database table that queues work, or try to make it work with a flat file of some sort. You could also have your web-based script call some command-line command that causes your out-of-request script to queue some work.
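As a sketch of the database-queue variant (table and column names are hypothetical):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$searchId = 42; // hypothetical work item
$stmt = $pdo->prepare(
    'INSERT INTO job_queue (type, payload, status, created_at)
     VALUES (?, ?, ?, NOW())'
);
$stmt->execute(array('search', json_encode(array('search_id' => $searchId)), 'pending'));
// A cron job or daemon polls job_queue for "pending" rows and does the real work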
At the end of the day, you don't want your script to keep executing after the http request. Assuming you're using mod_php, that means you'll be tying up an apache process until the script terminates.
Maybe this particular comment on php.net manual page will help: http://www.php.net/manual/en/features.connection-handling.php#71172
Theoretically, if HTTP 1.1 keep-alive is enabled and the client receives the amount of characters it expects from the server, it should treat it as the end of the response and go ahead and render the page (while keeping the connection still open.) Try sending these headers (if you can't enable them another way):
Connection: keep-alive
Content-Length: n
Where n is the number of characters that you've sent in the response body (output buffering can help you count that). I'm sorry that I don't have the time to test this out myself. I'm just throwing in the suggestion in case it works.
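Untested, but a sketch of that idea using output buffering to count the bytes:
<?php
ob_start();
echo "Thats all folks!";
$body = ob_get_clean(); // grab and count the body
header('Connection: keep-alive');
header('Content-Length: ' . strlen($body));
echo $body;
flush();
// Continue with post-response work here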
The best way to accomplish this is using output buffering. PHP sends the headers when it's good and ready, but if you wrap your output to the browser with ob_* you can control the headers every step of the way.
You can hold a rendered page in the buffer if you want and send headers till the sun comes up in China. This practice is why you may see a lot of opening <?php tags, but no closing tags, nowadays. It keeps the script from sending any headers prematurely, since there might be some includes to consider.
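A tiny illustration of that point:
<?php
ob_start();                         // nothing reaches the client yet
echo "Rendered page content";
header('X-Generated-By: example');  // still fine: headers have not been sent
ob_end_flush();                     // headers and buffered body go out together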
