php: flush data and end client connection

I have a PHP script (in a normal LAMP environment) that runs a couple of housekeeping tasks at the end of the script.
I use flush() to push all the data to the client, which works fine (the page is fully loaded), but the browser still waits for data (indicated by the "loading" animation). That is confusing for the user, although understandable: Apache cannot know whether PHP will generate more output after flush(). In my case it never does, however.
Is there a way to tell the client that the output is finished and the HTTP connection should be closed immediately, even though the script keeps running?

It sounds like you have a long-running script performing various tasks, and in particular that the script goes on doing things after it has sent the reply to the client. This is a design that opens up a whole lot of potential problems. You should re-think your architecture.
Keep housekeeping tasks and client communication strictly separate. For example, you could have the client request processed and then trigger internal sub-requests (which you can detach from), or delegate tasks to a cron-like system. Then offer a second view to the client which visualizes the progress and result of those tasks. This approach is much safer, more flexible and easier to extend when required. And your problem at hand is solved, too :-)
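A rough sketch of the cron-style delegation mentioned above; all paths, file names and the queue format here are assumptions for illustration, not part of the answer. The request handler only appends a job marker, and a script run by cron works through the queue:
<?php
// At the end of the client request: instead of doing the housekeeping inline,
// just record that it needs to happen (hypothetical queue file).
file_put_contents('/var/spool/myapp/jobs.txt', "cleanup\n", FILE_APPEND | LOCK_EX);
?>

<?php
// housekeeping.php - run by cron, e.g.:  * * * * * php /path/to/housekeeping.php
$queue = '/var/spool/myapp/jobs.txt';
if (is_file($queue)) {
    $jobs = file($queue, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($queue, '', LOCK_EX);   // truncate the queue (a real queue tool avoids this read/truncate race)
    foreach ($jobs as $job) {
        // ... perform the housekeeping task described by $job ...
    }
}
?>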

You can use the special function fastcgi_finish_request() to finish the request and flush all data to the client while continuing to do something time-consuming (video converting, stats processing, etc.). Note that it is only available when PHP runs under PHP-FPM, so you need to install FPM for it: http://php.net/manual/en/install.fpm.php. For example:
<?php
echo "You can see this from the browser immediately.<br>";
fastcgi_finish_request();
sleep(10);
echo "You can't see this form the browser.";
?>
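If you are on plain Apache with mod_php (no FPM), a commonly used workaround is to buffer the page, send a Content-Length header together with Connection: close, and then flush, so the browser knows the response is complete while the script keeps running. This is only a sketch and assumes no output compression (mod_deflate, zlib.output_compression) or extra buffering is interfering:
<?php
ob_start();
echo "Page content the user should see immediately.";
$size = ob_get_length();
header("Content-Length: $size");
header("Connection: close");
ob_end_flush();            // send the buffered page
flush();                   // push it out through Apache to the client
ignore_user_abort(true);   // keep running even though the client has gone away
// ... housekeeping tasks continue here, the browser already shows the page ...
sleep(10);
?>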

Related

PHP while(true) loop for file updates

I've got the following problem at hand:
I'm having users on two separate pages, but saving page input to the same text file. While one user is editing, the other can't. I'm keeping track of this with sessions, and I write the changes and whose turn it is to edit into a file.
It works fine so far; the output in the end is quite similar to a chat. However, right now the users have to manually refresh their page and reload the file. What I'd like to do is have the page execute a redirect when the file timestamp changes (to indicate that the last user has saved their edits and it's another user's turn). I've looked into JavaScript short polling a little, but then found the PHP filemtime() function and it looks much easier to use. Well - here's what I got:
while (true) {
    $file = "msks/" . $session['user']['kampfnr'] . ".txt";   // the file name must be a quoted/concatenated string
    $oldtimestamp = filemtime($file);
    $waittimer = 2;
    $waittimer++;
    sleep($waittimer);
    clearstatcache();                                          // filemtime() results are cached between calls
    $newtimestamp = filemtime($file);
    if ($newtimestamp > $oldtimestamp) {                       // compare the timestamps that were actually read
        addnav("", "kampf_ms.php?op=akt");
        redirect("kampf_ms.php?op=akt");
    }
}
In theory, while the user sees the output "it's ... turn to edit the file." this should loop in the background, checking if the file has already been updated, and if yes, redirect the user.
Practically, this heavily affects server performance (I'm on shared hosting) until it breaks with a memory-exceeded error message.
Is something wrong with the code? Or is it generally a bad idea to use a while loop in this case?
Thanks in advance!
PHP should only be used to generate web content: the client makes a request to the server, the server calls the required script and returns the response to the client.
Once the page is built and delivered to the client, the connection is closed; whatever happens afterwards, the client isn't informed.
So with an infinite loop, not only can the client end up waiting for the response indefinitely, the server may also be heavily impacted by the load. It really is a bad idea :)
PHP can't be used for bidirectional communication: it is just called to build the pages the client asks for, so it can't do anything "in the background" (not directly, anyway - you can call an external script, but not to notify a client).
More generally, PHP over "regular" HTTP is not a good fit for bidirectional communication because of the client/server architecture: the server only answers client requests, it is passive.
For a chat-like application I can suggest the WebSocket protocol:
http://socket.io/
https://en.wikipedia.org/wiki/WebSocket
But for that, you need to use an "active" server solution, such as node.js or ruby (depends of your server capabilities...)
The other way if you want to stay in php is that client makes Ajax request every 10 seconds, for example, to call a php script which check the file, and send back a message to the client if file is updated, but it is really deprecated, because of heavy performance loss, so forget it immediately.
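For completeness, a minimal sketch of such a polling endpoint (the file name, query parameters and path handling are assumptions); the performance caveat above still applies:
<?php
// check_update.php - hypothetical endpoint called from the page via Ajax.
// Returns 1 if the file changed since the timestamp the client last saw, else 0.
$file = 'msks/' . basename($_GET['kampfnr']) . '.txt';   // basename() avoids path traversal
$last = (int) $_GET['last'];                             // timestamp the client already knows
clearstatcache();
echo (is_file($file) && filemtime($file) > $last) ? 1 : 0;
?>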

PHP, how to kill the self-request?

Let's imagine a request is made which lasts for a while; while it is running, PHP is echoing content. To flush the content, I use:
echo str_repeat(' ', 99999).str_repeat(' ', 99999); ob_implicit_flush(true);
So, while the request is being processed, we can actually see the output.
Now I would like to have a button like "stop it" so that from PHP I could kill this process (the Apache process, I guess). How?
Now I would like to have a button like "stop it" so that from Php I could kill this process
It's not clear what you're asking here. Normally PHP runs in a single thread (of execution, that is - not talking about lightweight processes here). Furthermore, for any language running as CGI/FastCGI/mod_php there are no input events - input from the HTTP channel is only read once at the beginning of execution.
It is possible (depending on whether the thread of execution is regularly re-entering the PHP interpreter) to ask PHP to run a function at intervals (register_tick_function()) which could poll for some event communicated via another channel (e.g. a different HTTP request setting a semaphore).
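As a minimal sketch of that tick-based polling (the flag-file path is an assumption; the "stop it" button would point at a separate request that simply creates this file):
<?php
declare(ticks=100);   // run registered tick functions every 100 low-level statements

register_tick_function(function () {
    if (file_exists('/tmp/stop-job-12345.flag')) {   // hypothetical semaphore set by another request
        exit("Aborted by user.\n");
    }
});

// The long-running work; the tick function is invoked while this executes.
for ($i = 0; $i < 1000000; $i++) {
    // ... do a unit of work ...
}
?>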
Sending a stream of undefined and potentially very large length to the browser is a really bad idea. The right solution (your example is somewhat contrived) may be to spawn a background process on the webserver and poll its output via Ajax. You would still need to implement some sort of control channel, though.
Sometimes the thread of execution leaves PHP and stays there for a long time. In many cases, if the user terminates a PHP script which has a long-running database query, PHP may stop running but the SQL will keep on running until completion. There are solutions - but you didn't say whether that was the problem.

PHP - Display status of loop

I have a PHP script something like:
$i = 0;
for (; $i < 500; ++$i) {
    // Do some operation with files numbered 0 to 500
}
The thing is, the script works and displays the end results, but the operation takes a while, and watching a blank screen can be frustrating. I was wondering if there is some way I can continuously update the page on the client's end, showing which file is currently being worked on. That is, can I display and continuously update the current value of $i?
The Solution
Thanks everyone! The output buffering is working as suggested. However, David has offered valuable insight and I am considering that approach as well.
You can buffer and control the output from the PHP script.
However, you may want to consider the scalability of this design. In general, heavy processes shouldn't be done online. Your particular case may be an edge in that the wait is acceptable, but consider something like this as an alternative for an improved user experience:
The user kicks off a process. This can be as simple as setting a flag on a record in the database or inserting some "to be processed" records into the database.
The user is immediately directed to a page indicating that the process has been queued.
An offline process (either kicked off by the PHP script on the server or scheduled to run regularly) checks the data and does the heavy processing.
In the meantime, the user can refresh the page (manually, by navigating elsewhere and coming back to check, or even use an AJAX polling mechanism to update the page) to check the status of the processing. In this case, it sounds like you'd have several hundred records in a database table queued for processing. As each one finishes, it can be flagged as done. The page can just check how many are left, which one is current, etc. from the data.
When the processing is completed, the page shows the result.
In general this is a better user experience because it doesn't force the user to wait. The user can navigate around the site and check back on progress as desired. Additionally, this approach scales better. If your heavy processing is done directly on the page, what happens when you have many users or the data processing load increases? Will the page start to time out? Will users have to wait longer? By making the process happen outside of the scope of the website you can offload it to better hardware if needed, ensure that records are processed in serial/parallel as business rules demand (avoid race conditions), save processing for off-peak hours, etc.
Check out PHP's Output Buffering.
Try to use:
flush();
http://php.net/manual/ru/function.flush.php
Try the flush() function. Calling this function forces PHP to send whatever output it has so far to the client, instead of waiting for the script to end.
However, some web servers will only send the output once the entire page is done being built, so calling flush() would have no effect in this case.
Also, browsers themselves buffer input, so you may run into problems there. For example, certain versions of IE won't start displaying the page until 256 bytes have been received.
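A minimal sketch of the flush() approach in the asker's loop, under the assumption that no server-side compression or extra buffering gets in the way (the padding and ob_flush() are workarounds for the buffering issues mentioned above and may not be needed in every setup):
<?php
ob_implicit_flush(true);
echo str_repeat(' ', 1024);       // padding to defeat browser-side buffering in some browsers
for ($i = 0; $i < 500; $i++) {
    // ... do some operation with file number $i ...
    echo "Processing file $i of 500<br>\n";
    if (ob_get_level() > 0) {
        ob_flush();               // flush PHP's own output buffer if one is active
    }
    flush();                      // push the output through the web server to the client
}
echo "Done.";
?>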

break up recursive function in php

What is the best way to break up a recursive function that is using a ton of resources?
For example:
function do_a_lot() {
    // a lot of code and processing is done here
    // it takes a lot of execution time
    if ($true) {
        // if true we have to do all of that processing again
        do_a_lot();
    }
}
Is there anyway to make the server only have to take the brunt of the first execution and then break up the recursion into separate processes? Or am I dreaming?
Honestly, if your function is using up that much of your system's resources, I'd most likely refactor the code. That said, it's not truly multithreading, but you could perhaps look at using popen() to fork off a separate process.
One of the rules of PHP is "share nothing". That means every PHP process is independent and shares nothing with the others. So if you want to split your work across several PHP processes you'll have to store the data somewhere: memcached, a database, or the session, as you prefer.
Then you'll need to 'fork' your PHP process. There are solutions available to get this done on the server side; IMHO they are all hacks - dangerous and not in the spirit of PHP and the web - with the exception of 'work queue' tools.
I think the nicest way is to break your task up with Ajax. This gives you a clean user interface and avoids long response timeouts in the web process. That is: show a 'working' zone to the user, then ask the server via Ajax for the first step of the job, store the response on the server side, ask for the next step, store the new response, and so on (see the sketch after this answer). You can even add a 'stop that stuff' button on the client side.
You can also search for 'php work queue'.
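A minimal sketch of that Ajax-driven stepping, assuming the work can be cut into chunks and that progress is kept in the session (the endpoint name, the chunk count and the JSON shape are all assumptions):
<?php
// step.php - hypothetical endpoint; each Ajax call performs one chunk of the work.
session_start();

if (!isset($_SESSION['step'])) {
    $_SESSION['step'] = 0;
}
$step = $_SESSION['step'];

if ($step < 10) {                                  // assume the job is split into 10 chunks
    // ... do chunk number $step of the heavy processing, store its result server-side ...
    $_SESSION['step'] = $step + 1;
    echo json_encode(array('done' => false, 'step' => $step + 1));
} else {
    unset($_SESSION['step']);
    echo json_encode(array('done' => true));
}
?>
The client-side JavaScript would simply call step.php repeatedly until it receives done = true, updating the 'working zone' after each response.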
If it's a long running task, divide and conquer with gearman

php asynchronous call and getting response from the background job

I have done some google search on this topic and couldn't find the answer to my question.
What I want to achieve is the following:
the client makes an asynchronous call to a function on the server
the server runs that function in the background (because that function is time-consuming), and the client is not left hanging in the meantime
the client repeatedly asks the server for the status of the background job
Can you please give me some advice on resolving my issue?
Thank you very much! ^-^
You are not specifying what language the asynchronous call is in, but I'm assuming PHP on both ends.
I think the most elegant way would be this:
HTML page loads, defines a random key for the operation (e.g. using rand() or an already available session ID [be careful though that the same user could be starting two operations])
HTML page makes an Ajax call to start_process.php
start_process.php uses exec() to run /path/to/long_process.php and start the process in the background; see the User Contributed Notes on exec() for suggestions on how to do that. Which one is right for you depends mainly on your OS.
long_process.php frequently writes its status into a status file, named after the random key that your Ajax page generated
HTML page makes frequent calls to show_status.php, which reads the status file and returns the progress (a rough sketch of these two scripts follows below).
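The sketch below assumes the status file lives in the system temp directory and simply contains a percentage; both are assumptions for illustration:
<?php
// long_process.php <key> - started via exec() by start_process.php (CLI context)
$key  = preg_replace('/[^a-z0-9]/i', '', $argv[1]);   // the random key generated by the page
$file = sys_get_temp_dir() . "/status_$key.txt";
for ($done = 0; $done <= 100; $done += 10) {
    // ... perform a slice of the real work ...
    file_put_contents($file, $done);                   // e.g. percent complete
    sleep(1);
}
?>

<?php
// show_status.php?key=... - polled from the HTML page via Ajax
$key  = preg_replace('/[^a-z0-9]/i', '', $_GET['key']);
$file = sys_get_temp_dir() . "/status_$key.txt";
echo is_file($file) ? file_get_contents($file) : '0';
?>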
Have a google for long-running PHP processes (be warned that there's a lot of bad advice out there on the topic - including the note referred to by Pekka - which will work on Microsoft but will fail in unpredictable ways on anything else).
You could develop a service which responds to requests over a socket (your client would use fsockopen to connect). A simple way of achieving this is to use Aleksey Zapparov's socket server (http://www.phpclasses.org/browse/package/5758.html), which handles requests coming in via a socket; however, since it runs as a single thread it may not be very appropriate for something which requires a lot of processing. Alternatively, if you are using a non-Microsoft system, you could hang your script off [x]inetd; however, you'll need to do some clever stuff to prevent it terminating when the client disconnects.
To keep the thing running after your client disconnects, the PHP code must be run by the standalone PHP executable (not via the webserver). Spawn a process in a new process group (see posix_setsid() and pcntl_fork()). To enable the client to come back and check on progress, the easiest way is to have the server write its status out to somewhere the client can read.
C.
The Ajax call runs longRunningMethod() and gets back an identifier (e.g. an id)
The server runs the method and sets a key in, e.g., shared memory
The client calls checkTask(id)
The server looks up the key in shared memory and checks for a ready status
[repeat steps 3 & 4 until the long-running method has finished]
longRunningMethod() finishes and sets the state to finished in shared memory.
All Ajax calls are by definition asynchronous.
You could (although not a strictly necessary step) use AJAX to instantiate the call, and the script could then create a reference to the status of the background job in shared memory (or even a temporary entry in an SQL table, or even a temp file), in the form of a unique job id.
The script could then kick off your background process and immediately return the job ID to the client.
The client could then call the server repeatedly (via another AJAX interface, for example) to query the status of the job, e.g. "in progress", "complete".
If the background process to be executed is itself written in PHP (e.g. a command line PHP script) then you could pass the job id to it and it could provide meaningful progress updates back to the client (by writing to the same shared memory area, or database table).
If the process to be executed is not itself written in PHP, then I suggest wrapping it in a command-line PHP script so that it can monitor when the process being executed has finished running (and check the output to see if it was successful) and update the status entry for that task appropriately.
Note: Using shared memory for this is best practice, but may not be available if you are using shared hosting, for example. Don't forget you want to have a means to clean up old status entries, so I would store "started_on"/"completed_on" timestamps values for each one, and have it delete entries for stale data (e.g. that have a completed_on timestamp of more than X minutes - and, ideally, that also checks for jobs that started some time ago but were never marked as completed and raises an alert about them).
