Is it possible to disconnect an HTTP client from PHP?

In PHP, is it possible (and is it safe) to close the HTTP connection without returning any HTTP status code? My server is Apache.
Will this be logged to the access log or error log?

I use the following code:
/**
 * Make the script run in the background and
 * return a message to the browser.
 * @param string $response
 */
function freeUserBrowser($response)
{
    // let's free the user, but continue running the
    // script in the background
    ignore_user_abort(true);
    header("Connection: close");
    header("Content-Length: " . strlen($response)); // byte count, not character count
    echo $response;
    flush();
}

I don't think this is possible: Ultimately it's Apache who closes the connection, and returns a status code if PHP doesn't emit one.
We had a question some time ago about whether http connections can be forcibly cut off from within PHP. IIRC, the consensus was that except for shooting down the web server thread responsible for the current request, this was not possible. Looking for the question now... Update: Can't find it right now, sorry.

I don't think so, short of killing the Apache worker itself, which would certainly not be a good idea.
It may be possible when using PHP as an Apache module. There may be some Apache internal function available to modules that you could use for this, but I don't know enough of Apache's internals to tell for sure.

In a nutshell, Apache is calling the shots, and you cannot change Apache's behaviour from the outside, for security reasons.
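That said, the closest best-effort approximation is a variation of the question's own code, sketched here with no guarantees: whether the client actually disconnects depends on the SAPI, Apache's settings, and any proxies in between.
ignore_user_abort(true);          // keep running if the client disconnects
set_time_limit(0);                // allow the background work to finish
ob_start();                       // buffer the response so its size can be measured
echo 'Request accepted';          // whatever short message the user should see
header('Connection: close');
header('Content-Length: ' . ob_get_length()); // byte length of the buffer
ob_end_flush();                   // push PHP's buffer to the SAPI
flush();                          // ask the SAPI to send it to the client
// ...long-running work continues here without the client waiting...
Most clients will drop the connection once Content-Length bytes have arrived, but a client that ignores those headers can legally keep the socket open.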

Related

How do I end a PHP connection with a client and still log to an external service?

I wanted to log to Keen.IO (an external logging service) after a PHP session was over, but I didn't want the user to have to wait for that process to finish. Is there a way to end the connection with the client before this happens? Additionally, I didn't want the client session to hang if for some reason the Keen.IO service goes down.
You can forcefully close the connection with fastcgi_finish_request() if you're using PHP-FPM. See this answer for more details.
<?php
echo 'Page content';
fastcgi_finish_request(); // cut connection
do_logging();
I have tried other methods before (e.g. setting HTTP headers to disable keep-alive and defining the response length), but none of them worked. So if you're not using FPM, you're out of luck.

Why echo does not show anything while PHP is busy

PHP's echo and print do not show anything when PHP is busy with something (like fetching a page with cURL or something like that).
Later I discovered PHP does show the output when you execute your script on the command line:
php myscript.php
But right now I don't get any output from the command line either!
Are there any tricks or settings that will make PHP show the output?
Chances are, the output is being buffered (both in PHP and the web server) and not actually sent to the browser yet. The best suggestion I can give is this chunk from my code:
/**
 * Act as a "breakpoint" in the code, forcing everything to be flushed to the browser
 */
function breakpoint() {
    global $debug;
    if ($debug) { // so that it doesn't slow down extracts
        ob_flush();
        flush();
        usleep(100000); // 0.1 s; sleep() only accepts whole seconds
    }
}
The $debug stuff is if we're running the page in debug mode, specific to our website.
The two main things you need are ob_flush(), which sends PHP's buffer to the web server's buffer, and flush(), which causes the server to dump it to the browser. (Note: if the browser buffers the response before displaying anything, nothing server-side can prevent that.) The pause is there to help make sure things don't get overloaded and have a chance to flush properly.
See:
http://ca.php.net/manual/en/function.ob-flush.php
and
http://ca.php.net/manual/en/function.flush.php
Both PHP and your web server are likely to be buffering the output of echo and print. This will often result in no output until the script completes.
Try looking at flush() to force the output out of PHP, but it may still get held up at the web server, so it may not help...
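As a minimal sketch of that flushing approach (whether each line actually appears immediately still depends on the web server and the browser):
// disable PHP-level buffering, then flush after every chunk of output
while (ob_get_level()) {
    ob_end_flush();
}
for ($i = 1; $i <= 5; $i++) {
    echo "processing step $i...\n";
    flush();  // ask the web server to forward the output now
    sleep(1); // stand-in for the real work
}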

Serve a large file via Zend Framework

In our application, authentication is handled via a set of Controller Plugins that validate the user etc.
I want to serve a large (video) file only to authenticated users - the obvious way to do this is via readfile() in the controller, but I'm finding it hits the PHP memory limit - presumably the output from the controller is buffered somewhere.
How can I turn off buffering just for this one controller?
EDIT: Thanks for all the useful tips about flushing any existing output buffering - I guess I was specifically looking for a way of doing this within the framework though?
Interesting problem... You could try:
// ...
public function largeFileAction()
{
    // this clears all active output buffers
    while (ob_get_level()) {
        ob_end_clean();
    }
    readfile('path/to/large/file');
    exit; // to prevent further request handling
}
// ...
OK, I might be totally wrong here, but I think I read somewhere that output buffering has to be enabled for Zend_Layout and the placeholder helpers to work, so you'd have to disable them for the download action (you probably aren't going to need them for serving the file anyway).
Would something like this achieve what you want to do?
class DownloadController extends Zend_Controller_Action
{
    public function downloadAction()
    {
        $this->_helper->layout()->disableLayout();
        $this->_helper->viewRenderer->setNoRender(true);
        // authenticate user if not done elsewhere already
        header( /* ... the usual stuff ... */ );
        readfile(/* some path outside webroot */);
        exit;
    }
}
As Tyson wrote, your best choice (if you have full control over the server) is to validate the user's credentials and redirect them (302 temporary redirect) to the URL where they can download the file.
To prevent reuse of these URLs we are using Lighttpd and its mod_secdownload, which allows you to generate a hash that is valid for a specified amount of time.
nginx has X-Accel-Redirect and Apache has mod_xsendfile.
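For instance, with mod_xsendfile installed, the controller action only has to authenticate and emit a header; Apache then streams the file itself. A hedged sketch (the paths and filenames below are made up):
public function downloadAction()
{
    $this->_helper->layout()->disableLayout();
    $this->_helper->viewRenderer->setNoRender(true);
    // ...authenticate the user here...
    header('Content-Type: video/mp4');
    header('Content-Disposition: attachment; filename="video.mp4"');
    header('X-Sendfile: /data/outside-webroot/video.mp4'); // Apache takes over from here
    // no readfile(), so the PHP memory limit never comes into play
}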
If you decide to implement a separate lightweight web server there are other benefits as well (mainly lower memory consumption while serving static files and faster response times).
If you decide to go this route you will have to add another IP address to the server, binding Apache to one address and the other server (lighty or nginx) to the other, because both are web servers and both listen on port 80. Changing the port for one of the servers is not a good idea, because a lot of people do not have access to higher ports.
If adding another IP address is not an option you can install nginx on port 80 and use it as a reverse proxy to pass the dynamic requests to Apache which can listen on another port and serve all of the static files.
Consider using an external script to output the file, streaming it to the browser with PHP's passthru function.
If on a Linux-based system, you could try something like passthru("cat video_file.flv");
However, a better practice is to avoid this streaming (from within PHP) altogether and issue the client a 301 HTTP redirection to the URL of the actual static resource so that the webserver can handle streaming it directly.
I don't think you can, actually. As far as I know, PHP buffers all output before sending it to the requester.
You could increase the memory limit using ini_set().
$handle = fopen('/path/to/file', 'rb'); // 'b' for binary-safe reading
$chunk_size = 8192;
while ($chunk = fread($handle, $chunk_size)) {
    echo $chunk;
    ob_flush();
    flush();
}
fclose($handle);
This will probably need some tweaking, such as adding correct headers and reading in binary mode if necessary, but the basic idea is sound. I have used this method successfully to send 50+ MB files, with a 16 MB PHP memory limit.

PHP simultaneous file downloads from the same browser and the same PHP script

<?php
$filename= './get/me/me_'.rand(1,100).'.zip';
header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');
readfile($filename);
?>
Hi,
I have this simple code that forces a random file download; my problem is that if I call the script two or more times from the same browser, the second download won't start until the first has completed or been interrupted. Thus I can download only one file at a time.
Do you have any clue?
This may be related to PHP's session handling.
Using the default session handler, when a PHP script opens a session it locks it. Subsequent scripts that need to access it have to wait until the first script is finished with it and unlocks it (which happens automatically at shutdown, or by session_write_close() ). This will manifest as the script not doing anything till the previous one finishes in exactly the same way you describe.
Clearly you aren't starting the session explicitly, but there's a config flag that causes the session to start automatically: session.auto_start - http://www.php.net/manual/en/session.configuration.php
Either use phpinfo() to determine if this is set to true, or look in your config. You could also try adding session_write_close() to the top of the script, see if it makes the issue go away.
Just guesses; there could be different reasons.
First, your server could restrict the number of connections or children in parallel, but I guess this isn't the problem.
Second, it is more likely that the client restricts the number of connections. A "normal" browser opens only two connections at a time to a given server; modern browsers allow up to 8 (?) connections. This is a simple restriction in order to avoid problems that could occur with slow servers.
One workaround could be to place every download on a "virtual" subdomain.
Give it a try!
Just to say that session_write_close(); solved the problem for me.
I was using session_destroy(); (which worked), but that was not much good if I needed to keep session data :)
All you need to do is place session_write_close(); just before you start streaming the file data.
Example:
<?php
$filename= './get/me/me_'.rand(1,100).'.zip';
session_write_close();
header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');
readfile($filename);
?>
I'd further investigate Ralf's suggestion about the server restrictions and start with checking the logfiles to ensure that the second request is received by the server at all. With that knowledge you can eliminate one of the possibilities and at least see which side the problem resides on.
From the client's browser - you didn't mention which one it is - if Firefox, try installing the Live HTTP Headers extension to see what happens to the request you send and whether the browser receives any response from the server side.
As far as I can find, there is no php configuration setting that restricts max downloads or anything like that - besides, such a configuration is outside the scope of php.
Therefore, I can come to only two conclusions:
The first is that this is browser behaviour; see if the problem is repeated across multiple browsers (let me know if it is). The HTTP spec does say that only two connections to the same domain should be active at any one time, but I wasn't aware that affected file downloads as well as page downloads. A way of getting round such a limitation is to allocate a number of sub-domains to the same site (or use a catch-all subdomain DNS entry), and when generating a link to the download, select a random subdomain to download from, as sketched below. This should work around the multiple request issue if it is a browser problem.
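Something along these lines, assuming the wildcard DNS entry is in place (the dl prefix and example.com are placeholders):
// spread download links across subdomains so the browser's
// per-host connection limit doesn't serialize the downloads
$sub  = 'dl' . rand(1, 4);
$file = 'me_' . rand(1, 100) . '.zip';
$url  = 'http://' . $sub . '.example.com/download.php?file=' . urlencode($file);
echo '<a href="' . htmlspecialchars($url) . '">Download</a>';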
A second and much more unlikely option is that (and this only applies if you are using Apache) your MaxKeepAliveRequests configuration option is set to something ridiculously low and KeepAlives are enabled. However, I highly doubt that is the issue, so I suggest investigating the browser possibility.
Are you getting an error message from the browser when the second download is initiated, or does it just hang? If it just hangs, this suggests it is a browser issue.

PHP Background Processes

I'm trying to make a PHP script; I have the script finished, but it takes about 10 minutes to finish the process it is designed to do. This is not a problem, but I presume I have to keep the page loaded all this time, which is annoying. Can I have it so that I start the process, and then come back 10 minutes later and just view the log file it has generated?
Well, you can use ignore_user_abort(true).
So the script will continue to work (keep an eye on the script duration; perhaps add set_time_limit(0)).
But a warning here: You will not be able to stop a script with these two lines:
ignore_user_abort(true);
set_time_limit(0);
Except by directly accessing the server and killing the process there! (Been there, done an endless loop calling itself over and over again, made the server come to a screeching stop, got shouted at...)
Sounds like you should have a queue and an external script for processing the queue.
For example, your PHP script should put an entry into a database table and return right away. Then, a cron running every minute checks the queue and forks a process for each job.
The advantage here is that you don't tie up an Apache thread for 10 minutes.
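A minimal sketch of that split, assuming a jobs table with payload and status columns (the table, columns, and credentials are all made up):
// enqueue.php - called from the web request; returns immediately
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO jobs (payload, status) VALUES (?, "pending")');
$stmt->execute(array(json_encode(array('user_id' => 42))));
echo 'Job queued';

// worker.php - run from cron, e.g. every minute
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query('SELECT id FROM jobs WHERE status = "pending"') as $job) {
    $id = (int) $job['id'];
    $pdo->exec("UPDATE jobs SET status = 'running' WHERE id = $id");
    // ...the 10-minute processing goes here...
    $pdo->exec("UPDATE jobs SET status = 'done' WHERE id = $id");
}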
I had lots of issues with this sort of process under Windows; my situation was a little different in that I didn't care about the response of the "script" - I wanted the script to start and allow other page requests to go through while it was busy working away.
For some reason, I had issues with it either hanging other requests or timing out after about 60 seconds (both Apache and PHP were set to time out after about 20 minutes); it also turns out that Firefox times out after 5 minutes (by default) anyway, so after that point you can't know what's going on through the browser without changing settings in Firefox.
I ended up using the process open and process close functions to launch PHP in CLI mode, like so:
pclose(popen("start php myscript.php", "r"));
This would (using start) launch the PHP process and then kill the start process, leaving PHP running for however long it needed - again, you'd need to kill the process manually to shut it down. It didn't require setting any timeouts, and you could let the current page that called it continue and output some more details.
The only issue with this is that if you need to send the script any data, you'd either do it via another source or pass it along the command line as parameters (see the sketch below), which isn't so secure.
Worked nicely for what we needed though and ensures the script always starts and is allowed to run without any interruptions.
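If you do have to pass data on the command line, at least run it through escapeshellarg() first. A sketch; myscript.php and the job id are hypothetical:
// hand a parameter to the detached Windows-CLI process
$jobId = 42; // hypothetical piece of data the worker needs
pclose(popen('start php myscript.php ' . escapeshellarg((string) $jobId), 'r'));
// inside myscript.php it comes back via $argv:
// $jobId = isset($argv[1]) ? $argv[1] : null;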
I think the shell_exec command is what you are looking for.
However, it is disabled in safe mode.
The PHP manual article about it is here: http://php.net/shell_exec
There is an article about it here: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
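The pattern described there boils down to something like the following (Unix shells only; worker.php and the log path are placeholders):
// detach the worker: redirect its output and background it with '&'
// so shell_exec() returns immediately instead of waiting
shell_exec('php worker.php > /tmp/worker.log 2>&1 &');
echo 'Started; check /tmp/worker.log in ten minutes.';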
There is another option you can use: run the script from the CLI. It will run in the background, and you can even run it as a cronjob if you want.
e.g.
#!/usr/bin/php -q
<?php
// process logs
?>
This can be set up as a cronjob and will execute with no time limitation... this example is for Unix-based operating systems though.
FYI:
I have a PHP script with an infinite loop which does some processing and has been running non-stop for the past 3 months.
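The skeleton of such a long-running CLI worker is roughly this (the actual work is left out; the sleep interval is arbitrary):
#!/usr/bin/php -q
<?php
// runs forever under the CLI, where max_execution_time defaults to 0
while (true) {
    // ...poll a queue, process logs, etc....
    sleep(10); // avoid spinning at 100% CPU between batches
}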
You could use ignore_user_abort() - that way the script will continue to run even if you close your browser or go to a different page.
Think about Gearman
Gearman is a generic application framework for farming out work to
multiple machines or processes. It allows applications to complete
tasks in parallel, to load balance processing, and to call functions
between languages. The framework can be used in a variety of
applications, from high-availability web sites to the transport of
database replication events.
This extension provides classes for writing Gearman clients and
workers.
- Source: PHP manual
Official website of Gearman
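With the pecl gearman extension installed, the split looks roughly like this (the function name "long_task" and the payload are invented):
// client.php - hands the job off and returns immediately
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('long_task', json_encode(array('user_id' => 42)));
echo 'Task handed off';

// worker.php - kept running permanently, e.g. under supervisord
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('long_task', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ...the 10-minute processing goes here...
});
while ($worker->work());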
In addition to bastiandoeen's answer, you can combine ignore_user_abort(true); with a cURL request.
Fake a request abortion by setting a low CURLOPT_TIMEOUT_MS and keep processing after the connection is closed:
function async_curl($background_process = '')
{
    //-------------get curl contents----------------
    $ch = curl_init($background_process);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_NOSIGNAL       => 1,  // to time out immediately if the value is < 1000 ms
        CURLOPT_TIMEOUT_MS     => 50, // the maximum number of milliseconds to allow cURL functions to execute
        CURLOPT_VERBOSE        => 1,
        CURLOPT_HEADER         => 1,
    ));
    $out = curl_exec($ch);
    //-------------parse curl contents----------------
    //$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    //$header = substr($out, 0, $header_size);
    //$body = substr($out, $header_size);
    curl_close($ch);
    return true;
}
async_curl('http://example.com/background_process_1.php');
async_curl('http://example.com/background_process_1.php');
NB
If you want cURL to timeout in less than one second, you can use
CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like
systems" that causes libcurl to timeout immediately if the value is <
1000 ms with the error "cURL Error (28): Timeout was reached". The
explanation for this behavior is:
[...]
The solution is to disable signals using CURLOPT_NOSIGNAL
Pros
No need to switch methods (compatible with both Windows and Linux)
No need to implement connection handling via headers and buffers (independent of browser and PHP version)
Cons
Needs the curl extension
Resources
curl timeout less than 1000ms always fails?
http://www.php.net/manual/en/function.curl-setopt.php#104597
http://php.net/manual/en/features.connection-handling.php
Zuk, I'm pretty sure this will work:
<?php
pclose(popen('php /path/to/file/server.php &'));
echo "Server started. [OK]";
?>
The '&' is important. It tells the shell not to wait for the process to exit.
Also, you can use this code in your PHP (as bastiandoeen said):
ignore_user_abort(true);
set_time_limit(0);
In your server stop command:
<?php
$output = array();
exec('ps aux | grep -ie /path/to/file/server.php | awk \'{print $2}\' | xargs kill -9', $output);
echo "Server stopped. [OK]";
?>
Just call StartBuffer() before any output, and EndBuffer() when you want the client to close the connection. The code after the call to EndBuffer() will be executed on the server without a client connection.
private function StartBuffer() {
    @ini_set('zlib.output_compression', 0);
    @ini_set('implicit_flush', 1);
    @ob_end_clean();
    @set_time_limit(0);
    @ob_implicit_flush(1);
    @ob_start();
}
private function EndBuffer() {
    $size = ob_get_length();
    header("Content-Length: $size");
    header('Connection: close');
    ob_flush();
    ob_implicit_flush(1);
}
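Usage would look something like this, assuming both methods live in the class handling the request (do_long_running_work() is a hypothetical stand-in):
$this->StartBuffer();
echo 'Request accepted';  // everything before EndBuffer() reaches the client
$this->EndBuffer();       // the client is released here
do_long_running_work();   // hypothetical; runs with no client attached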
