PHP kill script after disconnection of client

Currently I am working on a system in PHP (using the Lithium framework) for syncing data between webservices, which makes multiple requests to different webservices through curl.
Every request can produce quite a few responses (let's say 100), and each one is logged. However, when the client disconnects (e.g. presses stop in the browser) the script continues running (the log lines keep coming in the log file) long after the client has disconnected.
From what I've read, PHP only detects a client disconnect when it tries to send output to the browser. So now I flush some data to the browser after every request (and after every line written to the log file), but the PHP script continues to run.
I now have the following code after each log call; however, it does not seem to work.
// Die when the client is no longer connected
ignore_user_abort(false);
echo chr(0);
ob_flush(); // flush PHP's output buffer first...
flush();    // ...then push it out to the client
if (connection_aborted()) {
    die();
}
The script just keeps running. Is there anything I can do to make sure the script stops (or at least stops soon) after the client disconnects?
edit: Why do I always find the solution right after posting a question? Adding ob_flush() seems to do the trick. Thanks anyway to anyone who has looked into this.

Try using ob_flush() as well as flush(); that will flush PHP's output buffer too.
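Put together, the pattern from the question and this answer looks roughly like the sketch below; processItem() and $items are placeholders for the real curl requests and log calls:

```php
<?php
// Sketch of the flush-then-check pattern; assumes the default
// ignore_user_abort(false) so the script aborts on disconnect.
ignore_user_abort(false);

function keepAliveOrDie(): void
{
    echo ' ';               // send a byte so PHP can notice a disconnect
    if (ob_get_level() > 0) {
        ob_flush();         // flush PHP's output buffer first...
    }
    flush();                // ...then the SAPI/webserver buffer
    if (connection_aborted()) {
        die();              // stop as soon as the client is gone
    }
}

// Placeholders for the real work (one curl request + log line per item)
function processItem(int $item): void {}
$items = range(1, 100);

foreach ($items as $item) {
    processItem($item);
    keepAliveOrDie();       // give PHP a chance to detect the abort
}
```

On the CLI connection_aborted() reports the connection as alive, so this loop simply runs to completion; under a web SAPI the die() should fire shortly after the client presses stop.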

Related

How do I view the status of my MySQL query?

I recently executed a MySQL query via Chrome and then closed the tab. How exactly does a browser stop a PHP script with the stop button? I thought PHP was a server-side language and could not be controlled from a client.
*UPDATE* I'm now aware of SHOW PROCESSLIST, but this only shows which threads are running. Is there a SQL command I can use to view an executed query in more detail?
A client (Chrome) has nothing to do with the execution of scripts (PHP) on the server, which in turn has no control over database processes (the MySQL query).
Look at your server's process list to see what's going on in general (Apache processes).
Or even better: use SHOW PROCESSLIST; on the MySQL console to find the long-running query. You can terminate it with KILL ###ID_OF_QUERY###;.
No, you don't need to keep it open. If you exit a running car, does the car turn off? No.
Sorry, that came off a little snotty, but it wasn't intended to.
The browser, in your case Chrome, is not actually running the code; the server is. Thus, once the instruction has been sent, closing the browser no longer matters, as the request has already been handed to the server.
Two functions are essential for executing time-consuming PHP scripts.
It has nothing to do with the browser (as other users have already pointed out).
Look up ignore_user_abort() and set_time_limit().
The script will continue to execute regardless of the browser being closed. You can free up the browser by sending the response and letting the PHP process carry on.
ignore_user_abort(true);
$response = "Processing!";
header("Connection: close");
// Content-Length must be a byte count, so use strlen() rather than mb_strlen()
header("Content-Length: " . strlen($response));
echo $response;
flush();
// Insert your lengthy query here
The answer is: it depends. As others have mentioned, you can check what is running on the MySQL server with SHOW PROCESSLIST;.
If it is a single query that takes a long time, then it will most likely carry on running after the browser has closed. PHP will have sent the request to the database and will in effect be sitting there waiting for it to complete; in turn, the browser will be waiting for the webserver to finish building the page/resource at that URL.
So the chain is: browser <-> web server (<-> PHP) <-> MySQL. In an ideal world, if the user cancels the request, everything would tidy itself up nicely; sadly, in my experience that is not the case. If one link in the chain decides not to wait, the process it was waiting on doesn't necessarily know until it tries to send the response back and fails.
Come on guys, this is PHP 101. Quoting the manual:
You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour is however for your script to be aborted when the remote client disconnects.
Execution will stop at the next tickable event after the connection flag is set to ABORTED, which will be detected when PHP attempts to flush output to the client.
The current MySQL query will finish executing (the next event PHP has control over doesn't occur until after the query has completed), but your script won't make it past that unless you explicitly set ignore_user_abort. It's always important to account for this when writing code.
There are two ways around this:
1. Set ignore_user_abort to true for the invocation of your script.
2. Do not print anything back to the client until all of your processing is complete, since a closed connection won't be detected until output is flushed.
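A minimal sketch of the second option; longRunningWork() is a placeholder for the real processing:

```php
<?php
// Option 2: buffer all output so nothing is flushed to the client
// (and no disconnect is detected) until processing has finished.
function longRunningWork(): void {} // placeholder for the real processing

ob_start();
echo "progress line 1\n";   // goes into the buffer, not to the client
longRunningWork();
echo "done\n";
$response = ob_get_clean(); // nothing has reached the client until now
echo $response;             // one final send at the very end
```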

How do I implement Server Sent Events in PHP?

I had set up a server-sent events script with PHP and a while loop; I did not want the script to have to keep closing and re-polling, so I put it all in a while loop.
The issue was that the script was getting stuck, so I had to abandon that route and went with a Node.js websocket backend instead.
My question is: if I ever went back to making a server-sent events PHP script, how would I implement it?
While loops do not seem to cut it, as they hang the script, and if the script just connects and disconnects every second, it is no different from long polling. So how do I create a PHP script that will not hang while still sending the SSE messages?
You seem to have an issue with PHP output buffering. Try adding these lines at the end of your while loop:
ob_flush();
flush();
This flushes the output buffers so messages reach the client immediately.
EDIT: You can also terminate the script after some time (e.g. 10 minutes) to reduce server load.
I've created a library for you to do it very easily. Check it here.
Second Edit
Do you have a reverse proxy such as nginx or Varnish? This may be the reason: the proxy tries to buffer the output of the script, but the SSE script never ends until you stop it, so the whole thing hangs. Other things that capture the output, such as mod_deflate, may have similar results.
Third edit
If you have a reverse proxy, you can try to turn off caching to allow SSE to work.
There are other ways in PHP to disable output buffering. See the code below:
<?php
// Close every active output buffer
while (ob_get_level() > 0) {
    ob_end_flush();
}
@apache_setenv('no-gzip', 1); // @ suppresses the error when not under Apache
@ini_set('implicit_flush', 1);
ob_implicit_flush(true);
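Putting those pieces together, a minimal SSE endpoint might look like the sketch below; fetchMessage() is a placeholder for however new data is obtained (database poll, queue, ...):

```php
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('X-Accel-Buffering: no'); // ask nginx not to buffer this response

// Format one SSE message: "event:"/"data:" lines ended by a blank line
function sseFormat(string $event, string $data): string
{
    return "event: {$event}\ndata: {$data}\n\n";
}

// Placeholder message source; a real endpoint would poll a DB or queue
function fetchMessage(): ?string
{
    static $pending = ['hello', 'world'];
    return array_shift($pending);
}

while (ob_get_level() > 0) {
    ob_end_flush();          // drop PHP's output buffers first
}

while (($msg = fetchMessage()) !== null) {
    echo sseFormat('update', $msg);
    flush();
    if (connection_aborted()) {
        break;               // client went away; stop cleanly
    }
    // sleep(1);             // in a real endpoint, pause between polls
}
```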

Send data with opened connection in PHP

I'm trying to find a solution to my problem of sending data to a client with PHP. The biggest issue is that I'm trying to keep sending data over a single connection from a PHP script. I'm sure there are other ways, but currently I don't know how to solve this.
What I'm trying to do: a client connects to a web server and keeps the connection open, so the TCP connection stays established. It will keep making, for example, GET requests every X seconds to keep the connection alive.
Within this connection, on a certain event, I want to send the client some data without the client making a request! That means the event is triggered on the server side, not on the client side.
There is no possibility of using any JavaScript or any other client-side technique, as my client is an Arduino module. I can keep the connection open, but I need to pass data to the client over HTTP.
I have a database set up on the server, and PHP will send data to the client when something changes inside the database.
I was trying to play with PHP's flush() running in a loop in the script, but that doesn't do it the way I want.
So any advice is appreciated.
Thank you.
edit: it would be perfect if the solution also worked on a Windows machine!
edit2: There will be multiple clients, not just one (e.g. hundreds)
As long as you don't have lots of clients, server-sent events sound like they could work for you.
http://dsheiko.com/weblog/html5-and-server-sent-events
Just read that you will have hundreds of clients; in that case you probably won't want to use PHP, but node.js instead.
How about cron jobs?
http://en.wikipedia.org/wiki/Cron
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
I think that might be the solution for your project. If I understand cron correctly, what it does is execute a given script at given intervals. So that is basically what you want: scripts executing every X seconds. And inside your script you have your function working with the database.
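For example, a crontab entry that runs a PHP worker every two minutes (the binary and script paths here are illustrative placeholders):

```crontab
# m h dom mon dow  command -- paths are placeholders
*/2 * * * * /usr/bin/php /var/www/sync.php >> /var/log/sync.log 2>&1
```

Note that cron's smallest interval is one minute, so a true "every X seconds" schedule needs a loop inside the script or a long-running daemon instead.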
I think what you are looking for is IPC - Inter-Process Communication. In your case I would suggest a message queue (or multiple queues).
(On the client)
Open connection to foo.php
When receiving a new line, process it.
If connection times out, re-open it
(On the server - foo.php)
Open a Message Queue (You will have to register a message queue for each user!)
Register it so that your bar.php knows about it.
Start a blocking receive.
When a message is received, send whatever you want to send, FLUSH OUTPUT BUFFERS, go back to 2.
If anything times out, back to 2.
(On the server - bar.php)
When the database changes, send message to all active queues
There are a few problems with this approach:
The server side only really works on Linux / Unix (that includes Macs)
There is a limited number of message queues
You may have to do some housekeeping, removing old queues, etc.
The benefits:
This is application-independent. Message queues are an operating system feature, so your bar.php could really be, say, a Java application.
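As a rough sketch of the queue plumbing (assuming PHP's sysvmsg extension, which is Unix-only; the key and message type are arbitrary illustrative choices), with both ends shown in one script:

```php
<?php
// Both ends of the queue in one script for illustration; in the scheme
// above, msg_send() lives in bar.php and msg_receive() in foo.php.
// Requires the sysvmsg extension (Unix only).
$key   = ftok(__FILE__, 'q'); // derive an IPC key from this file
$queue = msg_get_queue($key); // open (or create) the message queue

// bar.php side: push a notification when the database changes
msg_send($queue, 1, 'database changed');

// foo.php side: block until a message arrives, then flush it out
if (msg_receive($queue, 1, $type, 1024, $message)) {
    echo $message, "\n";
    flush(); // flush output buffers, as step 4 says
}

msg_remove_queue($queue); // housekeeping: remove the queue when done
```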
OK, so I think I found the way I want it to work. The reason flush() wasn't working is that I didn't reach the buffer size limit before flushing. I'm also using an Nginx server, and I disabled gzip (just in case).
My test code, which absolutely works, looks like this:
<?php
ob_implicit_flush(1);
for ($i = 0; $i < 10; $i++) {
    echo $i;
    // padding so the output reaches the minimum buffer size needed to flush
    echo str_repeat(' ', 1024 * 64);
    sleep(1);
}
?>
Found my answer here: PHP Flush that works... even in Nginx
I will test it with my Arduinos to see if they can accept such output. Thanks all for your help.

How do I stop PHP from sending data to client while still running PHP code in server?

This question came up when I encountered a bug that caused my PHP program to loop infinitely. Here is an example situation:
Suppose I have a PHP webpage that receives picture uploads (perhaps the response page for an image-upload form). On the server, the script stores the image in a temporary file. The script should then output a confirmation message to the client and stop sending data, so that the client does not wait. The script should then continue executing, processing the image (for example resizing it) before ending.
I think this technique could be useful so that the client does not wait during time-consuming processes, thereby preventing time-outs.
Also, could this be solved using HTTP methods?
Yes.
This can easily be done without any asynchronous processing if you correctly utilize HTTP headers.
Under normal conditions PHP will stop processing as soon as the client on the other end closes the connection. If you want to continue processing after this event, you need to do one thing: tell PHP to ignore user aborts. How?
ignore_user_abort()
This will allow your script to keep running even after the client gets the heck out of dodge. But we're also faced with the problem of how to tell the client that the request they made is finished so that it will close the connection. Normally, PHP transparently handles sending these headers for us if we don't specify them. Here, though, we need to do it explicitly or the client won't know when we want them to stop reading the response.
To do this, we have to send the appropriate HTTP headers to tell the client when to close:
Connection: close
Content-Length: 42
This combination of headers tells the client that once it has read 42 bytes of entity body, the message is finished and it should close the connection. There are a couple of consequences to this method:
You have to generate your response BEFORE sending any output, because you have to determine its size in bytes to send the correct header.
You have to actually send these headers BEFORE you echo any output.
So your script might look something like this:
<?php
ignore_user_abort();
// do work to determine the response you want to send ($responseBody)
$contentLength = strlen($responseBody);
header('Connection: close');
header("Content-Length: $contentLength");
echo $responseBody;
flush(); // push the response out so the client can disconnect
// --- client will now disconnect and you can continue processing here ---
The big "gotcha" with this method is that when you're running PHP in a web SAPI you can easily run up against the max execution time directive if you do time-consuming processing after the end-user client closes the connection. If this is a problem, you may need to consider an asynchronous processing option using cron, because there is no time limit when PHP runs in a CLI environment. Alternatively, you could just raise the time limit of your scripts in the web environment using set_time_limit().
It's worth mentioning that if you do something like this, you may also want to add a check on connection_aborted() while generating your response body, so that you can avoid the additional processing if the user aborts before completing the transfer.
I faced the same problem when uploading images to Twitter & Facebook from an iPhone through a PHP web service.
If the processing time of the image upload is not long, the comment by @Musa may help you, but if it takes too much time to process, try these steps:
1. Store the image in a folder
2. Fetch the image from the folder using cron
3. Run the cron every 2 minutes in the background
This will decrease your processing time.
Hope this helps.
It is advisable to do this asynchronously. That is, make another script which only processes the previously created tmp files, and run it with cron (don't even involve Apache). When PHP is running as a web-server module, it should be dedicated to quickly forming a response and then going away to free up resources for the next request.
You are doing the right thing by thinking this way; just keep going one small architectural step further, and fully decouple the request from the heavy lifting that needs to take place.
You can do it in several ways:
1 #
ob_start();
// ... generate your output here ...
header("Content-Length: " . ob_get_length());
header("Connection: close");
ob_end_flush();
flush(); // make sure the buffered output actually reaches the client
// do other stuff
2 #
Using system() or exec() in PHP, close the process.
3 #
Close the process using a shell script.
You can use ob_implicit_flush(); it turns implicit flushing on or off. With implicit flushing, a flush operation happens after every output call, so explicit calls to flush() are no longer needed.
refer to
How do i implement this scenario using PHP?
OR
You could create a standalone cron job which runs after a specific amount of time and does the work asynchronously, without making the user wait or even know what processing is going on. This way you will also be able to detect the failed cases.
And you should also try to minimize the loading time.

Does php execution stop after a user leaves the page?

I want to run a relatively time-consuming script based on some form input, but I'd rather not resort to cron, so I'm wondering if a PHP page requested through AJAX will continue to execute until completion, or if it will halt when the user leaves the page.
It doesn't actually output anything to the browser until a json_encode at the end of the file, so would everything before that still execute?
It depends.
From http://us3.php.net/manual/en/features.connection-handling.php:
When a PHP script is running normally the NORMAL state, is active. If the remote client disconnects the ABORTED state flag is turned on. A remote client disconnect is usually caused by the user hitting his STOP button.

You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour is however for your script to be aborted when the remote client disconnects. This behaviour can be set via the ignore_user_abort php.ini directive as well as through the corresponding php_value ignore_user_abort Apache httpd.conf directive or with the ignore_user_abort() function.
That would seem to say the answer to your question is "Yes, the script will terminate if the user leaves the page".
However, realize that depending on the backend SAPI being used (e.g. mod_php), PHP cannot detect that the client has aborted the connection until an attempt is made to send information to the client. If your long-running script does not issue a flush(), it may keep on running even though the user has closed the connection.
Complicating things, even if you do issue periodic calls to flush(), having output buffering on will trap that output and won't send it to the client until the script completes anyway!
Further complicating things, if you have installed Apache handlers that buffer the response (for example mod_gzip), then once again PHP will not detect that the connection is closed and the script will keep on trucking.
Phew.
It depends on your settings - usually it will stop but you can use ignore_user_abort() to make it carry on.
Depending on the configuration of the web server and/or PHP, the PHP process may or may not kill the thread when the user terminates the HTTP connection. If an AJAX request is pending when the user walks away from the page, it depends on the browser killing the request (not guaranteed), on top of your server config (not guaranteed). Not the answer you want to hear!
I would recommend creating a work queue in a flat file or database that a constantly running PHP daemon can poll for jobs. It doesn't suffer from cron delay but keeps CPU/memory usage at a reasonable level. Once a job is complete, place the results in the flat file/database for AJAX to fetch. Or promise to e-mail the user once the job is finished (my preferred method).
Hope that helps
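A rough sketch of that daemon's inner loop, assuming a newline-delimited flat-file queue; the file path, job names and handleJob() are made up for illustration:

```php
<?php
$queueFile = sys_get_temp_dir() . '/work-queue.txt';

// Placeholder for the real time-consuming job
function handleJob(string $job): string
{
    return strtoupper($job);
}

// Grab all pending jobs and naively "claim" them by truncating the file
function drainQueue(string $queueFile): array
{
    if (!is_file($queueFile)) {
        return [];
    }
    $jobs = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($queueFile, '');
    return array_map('handleJob', $jobs);
}

// Stand-in for jobs written by the AJAX-facing frontend
file_put_contents($queueFile, "resize-image-42\nsend-mail-7\n");

// The real daemon would wrap this in `while (true) { ...; sleep(1); }`
// and write $results somewhere the AJAX poll (or the e-mail) can pick up
$results = drainQueue($queueFile);
print_r($results);
```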
If the client/user/downloader/viewer aborts or disconnects, the script will keep running until something tries to flush new data to the client. Unless you have used ignore_user_abort(), the script will die there.
By the same token, PHP is unable to determine whether the client is still there without trying to flush data to the httpd.
Found the actual solution for my case of the connection not terminating: the session on my Apache/PHP server needed to close before the next request could start.
The browser waits for the AJAX call to complete after an abort.
