I have a php/apache page that takes a long time to load. Basically, it looks like this:
<?php
doHeavyStuff_1();
doHeavyStuff_2();
doHeavyStuff_3();
printResults();
?>
It happens from time to time that the client disconnects in the middle of the processing, say, between step1 and step2. Is there a way in php to check if the client is still connected and to stop further processing if it isn't? I'd like my code to be like this:
<?php
doHeavyStuff_1();
if(<clientDisconnected>) die;
doHeavyStuff_2();
if(<clientDisconnected>) die;
doHeavyStuff_3();
if(<clientDisconnected>) die;
printResults();
?>
Look into using the connection_aborted function.
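A minimal sketch of what that could look like with the question's placeholder doHeavyStuff_* functions; note that PHP typically only notices a dropped connection when it tries to send output, so an echo/flush before each check helps:
<?php
doHeavyStuff_1();

// Push a byte to the client so PHP can detect a dropped connection;
// connection_aborted() is only updated when output is actually sent.
echo ' ';
flush();
if (connection_aborted()) exit;

doHeavyStuff_2();

echo ' ';
flush();
if (connection_aborted()) exit;

doHeavyStuff_3();

printResults();
?>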
Normally PHP (as an apache module) should stop automatically if the user disconnects. No work is necessary.
Just in case you are interested: if you want PHP to continue processing after the client disconnects, there is the function ignore_user_abort(). Read the manual page, and especially its comments, to see how it can be used and which problems can occur.
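For the opposite behaviour, a rough sketch (the loop body is just a stand-in for real work): with ignore_user_abort() the script keeps running after a disconnect, and you can still poll connection_aborted() to decide for yourself what to do:
<?php
// Keep the script alive even if the client disconnects.
ignore_user_abort(true);
set_time_limit(0); // no execution time limit for a long-running job

for ($i = 0; $i < 10; $i++) {
    echo '.';
    flush();

    if (connection_aborted()) {
        // The client is gone, but the script keeps running because of
        // ignore_user_abort(); decide here whether to finish or bail out.
        error_log("Client disconnected at step $i, continuing anyway");
    }

    sleep(1); // placeholder for real work
}
?>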
Related
I'm using server-sent events in my project: the JS calls a PHP page, say eventserver.php, which consists basically of an infinite loop that checks for the existence of an event in a $_SESSION variable.
In my first implementation this caused my website to hang, because the event server took the lock on the session and did not release it until the timeout expired; however, I managed to work around that by repeatedly unlocking/relocking the session inside the loop with session_write_close() and session_start().
This is now causing a lot of PHP warnings (in the Apache error.log) saying "cannot send session cache limiter - headers already sent", "cannot send session cookie" and so on.
Posting some code here:
session_start();
header('Cache-Control: no-cache');
header('Content-Type: text/event-stream');

class EventServer
{
    public function WaitForEvents( $eventType )
    {
        // ... do stuff
        while( true )
        {
            // lock the session to this instance
            session_start();
            // ...check/output the event
            ob_flush();
            flush();
            // unlock the session
            session_write_close();
            sleep( 1 );
        }
    }
}
Why is this happening?
I am doing the same thing as the OP and ran into the same issue. Some of these answers don't understand how EventSource is supposed to work. My code is identical to yours and uses a session variable to know which view the user is on, which drives what data to return when a server-side event fires. It's part of a realtime collaboration app.
I simply prepended an # to the session_start() to suppress the warnings in the log. Not really a fix, but it keeps the log from filling up.
Alternatively, not sure how well it would work for your application, but you could use ajax to write the session variable you are monitoring to the database, then your eventSource script can monitor for a change in the DB instead of having to start sessions.
This is not a good idea. HTTP is a request-response protocol, so if you want server-client communication to be bi-directional you will need to look into WebSockets or something similar. There are also techniques like "long polling" and "heartbeating".
If you want an event loop, try something like servlets in Apache Tomcat.
You will grapple with issues for hours because of this design.
Also check out AJAX if you just want to shoot messages from JavaScript to PHP.
Make sure you have an overview of the tech stack you are working with :)
You don't need an infinite loop with SSE. The EventSource keeps an open connection to the server, and any update to the server-side data will be read by the client.
Check out basic usage of SSE here
It's probably because you start the session twice in your code. Don't restart the session at the beginning of the loop, but after the sleep().
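If it helps, here is a rough sketch of the loop rearranged along those lines. The warnings come from session_start() trying to resend the session cookie and cache-limiter headers after output has already gone out, so disabling the cache limiter is also worth a try (an assumption on my part; it may not silence every warning on every PHP version):
<?php
session_cache_limiter('');        // assumption: stops the "cache limiter" headers on re-starts
session_start();
header('Cache-Control: no-cache');
header('Content-Type: text/event-stream');

while( true )
{
    // ...check the event in $_SESSION and echo it as an SSE message...
    ob_flush();
    flush();

    // release the lock so other requests for this session can proceed
    session_write_close();

    sleep( 1 );

    // re-acquire the lock only after sleeping, as suggested above
    session_start();
}
?>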
I'm sorry if my wording is going to be off or my question is a little too vague.
Let's say I have a simple script sitting on an Ubuntu box running Apache with mod_php. This is the entirety of my script:
<?php
echo 'Hello, World!';
?>
What happens when I call echo? Does the text get written to a buffer somewhere and then sent to the client when the script ends? I'd like to get a handle on something low level like that.
Usually, the script output (notice my phrasing) gets sent directly to the client during the parsing of the script.
When you want to store (read: buffer) the output before sending it, you can use output buffering, like Yazmat already mentioned.
I think you are looking for this: Output Buffering
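To make that concrete, a minimal illustration of output buffering: with ob_start() the echo goes into a buffer that you can inspect or discard instead of being sent straight to the client:
<?php
ob_start();                      // start buffering; output is held in memory

echo 'Hello, World!';            // nothing reaches the client yet

$buffered = ob_get_contents();   // peek at what has been buffered so far
// ... inspect, log, or rewrite $buffered here if you like ...

ob_end_flush();                  // send the buffer to the client and stop buffering
?>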
The way I picture it: when a client requests a hosted PHP file, the server sends back the output of the executed PHP script, so what reaches the client is not a PHP script anymore.
I have been trying to fix this weird PHP session issue for some time now.
Setup: running on IIS V6.0, PHP for Windows V5.2.6
Issue:
At totally random times, the PHP line right after session_start() times out.
I have a file, auth.php, that gets included on every page of an extranet site (to check for valid logins).
auth.php
session_start();
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) { <---- times out here
do something ...
}
...
When using the site, I get random "maximum execution time of 30 seconds exceeded" errors at line 2: if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
If I modify this script to
session_start();
echo 'testing'; <---- times out here
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
do something ...
}
...
The random error now happens on line 2 as well (echo 'testing'), which is a simple echo statement, strange.
It looks like session_start() is randomly causing issues that make the line of code right after it hit the timeout (even a simple echo statement).
This is happening on all sorts of pages on the site (db-intensive, relatively static, ...), which makes it difficult to troubleshoot. I have been tweaking the session variables and timeouts in php.ini without any luck.
Has anyone encountered something like this, or could you suggest possible places to look?
Thanks!
A quick search suggests that you should use session_write_close() to close the session as soon as you are done with it if you are on an NTFS file system. Starting a session locks the session file, so no other request can access it while your code is running. For some reason, that lock sometimes isn't released reliably on Windows/NTFS, so you should close the session manually when you are done with it.
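Along those lines, a hedged sketch of what auth.php could look like if it releases the session lock as soon as it has read what it needs:
<?php
session_start();

// Copy what we need out of the session, then release the lock immediately
// so concurrent requests from the same user aren't serialized behind it.
$isAuthenticated = isset($_SESSION['auth']) && $_SESSION['auth'] == 1;
session_write_close();

if ($isAuthenticated) {
    // do something ...
}
?>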
If I'm generating a stream of data to send out to a browser, and the user closes the browser, can I tell within PHP that I don't need to bother generating or sending the rest of the stream? I'd like to insert something into this loop:
while (!feof($pipes[1])) {
echo fgets($pipes[1]);
}
My fallback plan is to have the browser use a JavaScript onunload to hit another PHP page to kill the process that's generating the data, but it would be cleaner if PHP could tell when I'm echoing to nowhere.
By default PHP will abort the script if the user navigates away. There are, however, times when you don't want this to happen, so PHP has a config setting for it called ignore_user_abort.
http://php.net/manual/en/misc.configuration.php
There's also a function called register_shutdown_function() that is supposedly executed when execution halts. I've never actually used it, so I won't vouch for how well it works, but I thought I'd mention it for completeness.
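For reference, registering a shutdown callback looks roughly like this (the logging is only illustrative):
<?php
register_shutdown_function(function () {
    // Runs when the script ends: normal completion, fatal error, or client abort.
    error_log('Shutdown; connection_aborted() = ' . connection_aborted());
});

echo "Streaming...\n";
flush();
?>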
I believe the script will abort automatically when the page is loaded normally (no Ajax). But if you want to implement some sort of long polling via PHP using XMLHttpRequest, I think you will have to handle it with some JavaScript as well, because PHP can't detect the disconnect then. I'd also like to know the precise use case.
These answers pointed me towards what I was looking for. The underlying process needed special attention to kill it. I needed to jump out of the loop. Thanks again, Stack Overflow.
while (!feof($pipes[1]) && !connection_aborted())
{
    echo fgets($pipes[1]);
}
if (connection_aborted())
{
    // $mypid is assumed to hold the PID of the process feeding $pipes[1]
    exec('kill -4 '.$mypid);
}
I first configure my script to run even after the HTTP request is over
ignore_user_abort(true);
then flush out some text.
echo "Thats all folks!";
flush();
Now, how can I trick the browser into thinking the HTTP request is over, so I can continue doing my own work without the browser showing "page loading"?
header(??) // something like this?
Here's how to do it. You tell the browser to read in the first N characters of output and then close the connection, while your script keeps running until it's done.
<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // optional
ob_start();
echo ('Text the user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // These two calls must both happen,
flush();        // or the connection will not close!
// At this point, the browser has closed connection to the web server
// Do processing here
echo('Text user will never see');
?>
Headers won't work (they're headers, so they come first)
I don't know of any way to close the http connection without terminating the script, though I suppose there's some obscure way of doing it.
Telling us what you want to do after the request is done would help us give better suggestions.
But generally, I'd be thinking about one of the following:
1) Execute some simple command-line script (using exec()) that looks like:
#!/bin/sh
php myscript.php <arg1> <arg2> .. <argN> &
Then kick that off from your http-bound script like:
<?php
exec('/path/to/my/script.sh');
?>
Or:
2) Write another program (possibly a continuously-running daemon, or just a script that is cronned every so often), and figure out how your in-request code can pass it instructions. You could have a database table that queues work, or try to make it work with a flat file of some sort. You could also have your web-based script call some command-line command that causes your out-of-request script to queue some work.
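As a very rough sketch of the queue-table idea (the jobs table, its columns, and the connection details are all made up for illustration):
<?php
// Inside the HTTP request: just record the work and return immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO jobs (payload, status) VALUES (?, ?)');
$stmt->execute(array(json_encode(array('task' => 'heavy-report', 'userId' => 42)), 'pending'));

// A separate cronned or daemonized script would then do something like:
//   SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1;
//   ... run the heavy work ...
//   UPDATE jobs SET status = 'done' WHERE id = ?;
?>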
At the end of the day, you don't want your script to keep executing after the HTTP request. Assuming you're using mod_php, that would tie up an Apache process until the script terminates.
Maybe this particular comment on php.net manual page will help: http://www.php.net/manual/en/features.connection-handling.php#71172
Theoretically, if HTTP 1.1 keep-alive is enabled and the client receives the amount of characters it expects from the server, it should treat it as the end of the response and go ahead and render the page (while keeping the connection still open.) Try sending these headers (if you can't enable them another way):
Connection: keep-alive
Content-Length: n
Where n is the number of characters you've sent in the response body (output buffering can help you count that). I'm sorry I don't have the time to test this out myself; I'm just throwing in the suggestion in case it works.
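In code, that suggestion might look roughly like this (untested, as noted above; the follow-up work is just a placeholder comment):
<?php
ob_start();

echo 'The complete response body goes here.';

$length = ob_get_length();          // count the bytes we are about to send

header('Connection: keep-alive');
header('Content-Length: ' . $length);

ob_end_flush();                     // send headers + body
flush();

// If the browser honours Content-Length, it treats the response as complete
// at this point, and the script can keep working below this line.
// ... continue with server-side work here ...
?>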
The best way to accomplish this is by using output buffering. PHP sends the headers when it's good and ready, but if you wrap your output to the browser with the ob_* functions you can control the headers every step of the way.
You can hold a rendered page in the buffer if you want and send headers till the sun comes up in China. This practice is why you may see a lot of opening <?php tags but no closing tags nowadays: it keeps the script from sending any headers prematurely, since there might be some includes to consider.
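As a small illustration of that point, header() can still be called after "output" as long as that output is sitting in a buffer (the custom header is purely for illustration):
<?php
ob_start();                       // nothing is sent until we say so

echo '<html><body>Rendered page goes here</body></html>';

// The markup above is still in the buffer, so headers can still be changed.
header('X-Example-Header: still-allowed');   // hypothetical header, for illustration

ob_end_flush();                   // headers go out first, then the buffered page
?>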