PHP Server Sent Event reduce traffic - php

I just started using SSE (Server-Sent Events) in my PHP pages. I can very easily send new data with this code:
while (true) {
    if (isset($_GET["selectedName"]) && $_GET["selectedName"] != "empty") {
        echo "data:current timestamp for user ".$_GET["selectedName"]." is ".time().PHP_EOL;
        echo PHP_EOL;
        ob_flush();
        flush();
    }
    sleep(1);
}
But this means that I'm sending data to the client every second and I have to run a loop. The string in the example is not much data, but later I want to connect to a database and pull data from there.
If I pulled something like a whole article (or message, or whatever), I would produce a very large amount of data.
Now my question: how can I tell the script to provide new data and send it to the client from another PHP script?
I created a little iOS app which uses a small API to send status updates to the server. I'd like one of the API scripts to tell the web interface's event-source script to send the new data.
And only when I tell it to do so.
How can I achieve this?
Thanks for your help. Kind regards, Julian

Since I don't have much to send at once (a state number from 0 to 4 and a short message string), I decided to keep the last sent value in a variable and check whether anything has changed:
$oldStateString = ""; // "static" is only valid inside functions; a plain variable is enough here
while (true) {
    $result = mysql_query("SELECT * FROM st_workers");
    $currentStateString = "";
    while ($user = mysql_fetch_array($result)) {
        $currentStateString .= $user["stateIndex"].":--;".$user["state"].":--;".$user["humanReadableName"].":__;";
    }
    if ($currentStateString != $oldStateString) {
        echo "data:".$currentStateString.PHP_EOL;
        echo PHP_EOL;
        ob_flush();
        flush();
        $oldStateString = $currentStateString;
    }
    sleep(1);
}
BUT: I noticed that you have to start the event handler a little later, not in <body onload="">. Starting it on load causes problems in most browsers except Safari 6 and the current Firefox: the browser doesn't recognize that the page has finished loading, so the loading spinner keeps spinning for 5 minutes or more.
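To come back to the original question (having one of the API scripts tell the event-source script that there is something new) without querying the whole table every second, one option is a simple change flag that both scripts share. This is only a hedged sketch; the file name is an assumption, and a timestamp column in the database would work just as well:

// --- in the API script, right after it has stored the new status ---
touch("/tmp/st_workers.changed");          // assumed flag file; any path both scripts can reach works

// --- in the SSE script ---
$lastChange = 0;
while (true) {
    clearstatcache();                      // filemtime() results are cached per request
    $mtime = @filemtime("/tmp/st_workers.changed");
    if ($mtime !== false && $mtime > $lastChange) {
        $lastChange = $mtime;
        // only now query st_workers and echo the "data:" lines as above,
        // followed by ob_flush(); flush();
    }
    sleep(1);
}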

Related

How to get data from Wordpress website (LAMP stack) to Android app in nearly realtime?

I have a WordPress website with a working order system. Now I want to make an Android app that displays every new order in a list view as soon as the order is placed.
Over the last two days I have thought about the following solutions:
1. Simple HTTP GET requests every 10 seconds
2. Websockets
3. MySQL binary log + Pusher
4. Server-Sent Events
My thoughts (working with a LAMP stack):
1. Simple HTTP requests are obviously the most ineffective solution.
2. I figured out that websockets and Apache don't work well together.
3. This feels quite hacky, and I want to avoid any third-party service if I can.
4. This looks like the optimal way for me; however, from what I have experienced, there are some problems with Apache/PHP and Server-Sent Events.
I tried to implement a simple demo script, but I don't understand why some examples use an infinite while loop to keep the connection open and others don't.
Here is an example without a loop and here with an infinite loop, also here
In addition, when I tested the variant with the infinite loop, my whole page wouldn't load because of that sleep() call. It looks like the whole server freezes whenever I use it.
Does anyone have an idea how to fix that? Or do you have other suggestions?
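For comparison, the loop-free variant linked above roughly works like this: the script sends whatever is new and exits, and the browser's EventSource reconnects on its own. A minimal sketch; fetch_orders_since() is a hypothetical helper that would query your orders table:

<?php
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

// The browser resends the id of the last event it received on every reconnect.
$lastEventId = isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? (int)$_SERVER["HTTP_LAST_EVENT_ID"] : 0;

echo "retry: 2000\n"; // ask the client to reconnect after 2 seconds

// fetch_orders_since() is assumed here; it would return orders newer than $lastEventId.
foreach (fetch_orders_since($lastEventId) as $order) {
    echo "id: " . $order["id"] . "\n";
    echo "data: " . json_encode($order) . "\n\n";
}
flush();
// No while(true), no sleep(): the script finishes and the client reconnects by itself.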
This is the looping code that causes the trouble (copied from the linked example, with a missing curly bracket added):
<?php
// make session read-only
session_start();
session_write_close();
// disable default disconnect checks
ignore_user_abort(true);
// set headers for stream
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: *");
// Is this a new stream or an existing one?
$lastEventId = floatval(isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : 0);
if ($lastEventId == 0) {
    $lastEventId = floatval(isset($_GET["lastEventId"]) ? $_GET["lastEventId"] : 0);
}
echo ":" . str_repeat(" ", 2048) . "\n"; // 2 kB padding for IE
echo "retry: 2000\n";
// start stream
while (true) {
    if (connection_aborted()) {
        exit();
    } else {
        // here you will want to get the latest event id you have created on the server,
        // but for now we will increment and force an update
        $latestEventId = $lastEventId + 1;
        if ($lastEventId < $latestEventId) {
            echo "id: " . $latestEventId . "\n";
            echo "data: Howdy (" . $latestEventId . ") \n\n";
            $lastEventId = $latestEventId;
            ob_flush();
            flush();
        } else {
            // no new data to send
            echo ": heartbeat\n\n";
            ob_flush();
            flush();
        }
    }
    // 2 second sleep then carry on
    sleep(2);
}
?>
I'm thankful for any advice I can get! :)
EDIT:
The main idea is to check my MySQL database frequently for new entries and, if a new order is present, format the data nicely and send it over SSE to my Android application.
I have already found libraries to receive SSEs on Android; the main problem is on the server side.
Based on your question, I think you could implement SSE (Server-Sent Events), which is part of the HTML5 standard. It is one-way communication from server to client and needs HTML/JavaScript plus a backend language, e.g. PHP.
The client subscribes to events, and once the subscription is up and running the server sends any updates to the input data. By default the client reconnects roughly every 3 seconds, but this can be adjusted.
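For example, the reconnection interval can be adjusted from the PHP side by sending a retry field (in milliseconds) before the data. A minimal sketch that assumes nothing about your data source; the 10-second value is just an example:

<?php
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

echo "retry: 10000\n";   // ask the browser to wait ~10 s before reconnecting instead of the default ~3 s
echo "data: " . json_encode(array("ts" => time())) . "\n\n";
flush();
// The script may simply end here: the client's EventSource object
// reconnects automatically after the retry interval.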
I would recommend first creating a basic, functioning web-browser client. Only when it works as you expect would I judge the effort of building the client as an app.
You would probably need to add client-side functions, such as starting/stopping the subscription.
My understanding of why people do not recommend combining Server-Sent Events with Apache is the lack of control over how many connections are open and over the continuous need to close them. This can lead to severe server performance problems.
Using, for example, node.js would not cause that problem.
Here are some links to get started:
MDN:
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
Stream Updates with Server-Sent Events:
https://www.html5rocks.com/en/tutorials/eventsource/basics/

How to auto update data from json api with php?

I'm experimenting with JSON APIs in PHP.
I'm using a free Bitcoin price ticker API from Blockchain.
It's working, but to refresh the data I need to refresh the page.
Would it be possible to auto-update the data without refreshing the page?
This is what I have now (it's working):
<?php
$json = file_get_contents('https://blockchain.info/ticker');
$data = json_decode($json,true);
$priceUSD = $data['USD']['last'];
echo $priceUSD;
Thanks in advance, have a nice day!
Kind regards,
L Kenselaar
In order to refresh the data in your PHP array, you'll have to run a new HTTP request against the API from within your PHP code.
Without refreshing the page your PHP renders, you would need to keep the connection open, and it only lasts as long as your php.ini max_execution_time allows. PHP also can't edit data it has already sent, so the closest you could get is a news-ticker style page that appends new lines at the bottom.
If all you want is a self-refreshing website, you'll have to use JavaScript, which can run indefinitely and request new data from your PHP backend at regular intervals. Look into AJAX and XMLHttpRequest in general.
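The PHP side of such a setup can stay very small: a separate endpoint that the page requests every few seconds and that returns only the current price. A sketch under assumptions (ticker.php is a hypothetical file name; the ?? operator needs PHP 7+):

<?php
// ticker.php - fetch the Blockchain ticker server-side and return just the USD price as JSON
header('Content-Type: application/json');

$json = file_get_contents('https://blockchain.info/ticker');
if ($json === false) {
    http_response_code(502);   // upstream request failed
    echo json_encode(array('error' => 'could not reach blockchain.info'));
    exit;
}

$data = json_decode($json, true);
echo json_encode(array('usd' => $data['USD']['last'] ?? null));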
If you must stick to PHP, you might want to run an independent process in the background (check out nohup or disown on Linux/Unix).
Your script would do something like:
<?php
while (true) {
    $json = file_get_contents('https://blockchain.info/ticker');
    if ($json === false) {
        // file_get_contents() does not throw exceptions; it returns false on failure
        echo "Error: could not fetch the ticker\n";
    } else {
        $data = json_decode($json, true);
        $priceUSD = $data['USD']['last'];
        // Do the internal handling:
        // update your database, etc.
    }
    // wait for 5 seconds
    sleep(5);
}
Keep in mind that PHP code runs in a blocking thread, which means this process has to run alongside your web server.
However, if you wanted to do both tasks at the same time (fetching and serving requests) in one process, you would have to consider alternatives like Node.js.

PHP script unexpectedly not continuing

I have a PHP script (let's call it execute.php) that prints the whole page (HTML and body tags etc.) at the beginning and after that executes some commands (C++ programs) in the background. It then waits for these programs to terminate (some depend on the results of others, so they may be executed sequentially) and finally uses JavaScript to auto-submit a form to another PHP script (which we will call results.php), because results.php needs the POST information from the previous script.
execute.php:
<?php
print "
<html>
<body>
Some HTML code here
</body>
</html>
";
// Here come some C++-program calls
$pid_program1 = run_in_background($program1);
$pid_program2 = run_in_background($program2);
while (is_running($pid_program1) or is_running($pid_program2))
{
    //echo(".");
    sleep(1);
}
// Here come some later C++-program calls that execute quickly
$pid_program3 = run_in_background($program3);
$pid_program4 = run_in_background($program4);
while (is_running($pid_program3) or is_running($pid_program4))
{
    sleep(1);
}
...
// We are now finished
print "
<form action=\"results.php\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
<script type=\"text/javascript\">
AutoSubmitForm( 'go_to_results' );
</script>
";
This works nicely if C++ programs 1 and 2 execute quickly. However, when they take their time (around 25 minutes in total), the PHP script seems to fail to continue. Interestingly, C++ programs 3 and 4 are nevertheless executed and produce the expected outputs.
However, when I put an echo("."); in the first while loop before the sleep(), it works and continues until the JavaScript auto-submit.
So it seems to me that the remaining PHP code (including the auto-submit) is, for whatever reason, not sent when there is no output in the first while loop.
I have also tried using set_time_limit(0) and ignore_user_abort(true) and various other things, like writing to an output buffer instead of echoing (I don't want to clutter the already displayed web page), but none of these work.
When I run the same scripts on a machine with multiple cores, so that programs 1 and 2 can be executed in parallel, it also works without the echo(".").
So I am currently very confused and can't find any error messages in the Apache log or PHP log, and would really appreciate your thoughts on this one.
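For reference, the keep-alive effect of the echo(".") can be had without cluttering the page, since browsers collapse plain whitespace. A sketch that reuses the run_in_background()/is_running() helpers from the script above:

$pid_program1 = run_in_background($program1);
while (is_running($pid_program1)) {
    echo str_repeat(" ", 1024); // whitespace padding: keeps bytes flowing, invisible in the rendered page
    @ob_flush();                // flush PHP's output buffer if one is active
    flush();                    // push the padding through the web server to the browser
    sleep(1);
}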
EDIT
Thanks again for your suggestions so far.
I have now adopted a solution involving (really simple) AJAX and it's definitely nicer this way.
However, if the C++ program executions take "longer", it does not auto-submit to the results page, which this time is actually created (it failed to be created before).
Basically what I have done is:
process.php:
<?php
$params = "someparam=1";
?>
<html>
<body>
<script type="text/javascript">
function run_analyses(params){
    // Use AJAX to execute the programs independently in the background.
    // Allows the user to close the process page and come back to the results link later, without needing to wait.
    if (window.XMLHttpRequest)
    {
        http_request = new XMLHttpRequest();
    }
    else
    {
        // Fallback for IE5 and IE6, as these don't support the above
        http_request = new ActiveXObject("Microsoft.XMLHTTP");
    }
    // Is http_request still false?
    if (!http_request)
    {
        alert('Cannot create an XMLHTTP instance');
    }
    http_request.onreadystatechange = function(){
        if (http_request.readyState == 4 && http_request.status == 200){
            // Maybe used to display the progress of the execution
            //document.getElementById("output").innerHTML = http_request.responseText;
            // Call of the programs is finished -> go to the results page
            document.getElementById( "go_to_results" ).submit();
        }
    };
    http_request.open("POST", "execute.php", true);
    // Send the proper header information along with the request
    http_request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    http_request.setRequestHeader("Content-length", params.length);
    http_request.setRequestHeader("Connection", "close");
    http_request.send(params);
};
</script>
<?php
// Do some HTML-markup
...
// Start the programs!
print "
<script type=\"text/javascript\">
run_analyses('".$params."');
</script>
<form action=\"results.html\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
";
?>
</body>
</html>
and execute.php contains the C++ program calls, the waiting routines and finally, via include("results.php"), the creation of the results page.
Again, for "not so long" program executions the auto-submission works as expected, but not if it takes "longer". By "longer" I mean around 25 minutes.
I have absolutely no idea what could cause this, as again there are no error messages to be found.
Am I missing a crucial configuration option somewhere (Apache, PHP, etc.)?
EDIT
As it turned out, letting the requested PHP script "echo" something repeatedly prevents the timeout. So it is basically the same as for the PHP solution without AJAX, but this time, since the responseText of the AJAX request is not necessarily needed, the progress page is not cluttered and it may be used as a workaround. That said, I would not necessarily recommend it as a general solution or good practice.
It occurs to me that a better approach would be to:
1. Output the complete HTML page
2. Show a loading message to the user
3. Send an AJAX request to start the external program
4. Wait for the callback (waiting for the external program to finish)
5. Repeat steps 3 and 4 until all programs have been executed
6. Update the page to tell the user what is going on
7. Submit the form
This way, you get the HTML to the user as quickly as possible, then you execute the programs sequentially in an orderly and controlled fashion without worrying about hitting the max_execution_time threshold. This also enables you to keep your user informed - after each AJAX callback, you can tell the user that "program ABC has completed, starting DEF..." and so on.
EDIT
Per request, I'll add an outline of how this could be implemented. A caveat, too: if you are going to be adding more JavaScript-driven functionality to your page, you'll want to consider using a library like jQuery or mootools (my personal favorite). This is a decision you should make right away: if you aren't going to be doing a lot of JavaScript except this, then a library will only bloat your project, but if you are going to be adding a lot of JavaScript, you don't want to have to come back later and re-write your code because you added a library 3/4 of the way through the project.
I've used mootools to create this demonstration, but it isn't necessary or even advisable to add mootools if this is the only thing you're going to use it for. It is simply easier for me to write an example really quickly without having to stop and think :)
First, the main page. We'll call this page view.php. This should contain your initial HTML as well as the javascript that will fire off the AJAX requests. Basically, this entire jsFiddle would be view.php: http://jsfiddle.net/WPnEy/1/
Now, execute.php looks like this:
<?php
$program_name = isset($_POST['program_name']) ? $_POST['program_name'] : false;
switch ($program_name) {
    case 'program1':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 1';
        break;
    case 'program2':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 2';
        break;
    case 'program3':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 3';
        break;
    case 'program4':
        $program_path = '/path/to/executable/';
        $friendly_name = 'Some program 4';
        break;
    default:
        die(json_encode(array(
            'program_name' => 'Invalid',
            'status' => 'FAILED',
            'error' => true,
            'error_msg' => 'Invalid program'
        )));
        break;
}
$pid = run_in_background($program_path);
while (is_running($pid)) {
    sleep(1);
}
// check here for errors, get any error messages you might have
$error = false;
$error_msg = '';
// use this for failures that are not necessarily errors...
$status = 'OK';
die(json_encode(array(
    'program_name' => $friendly_name,
    'status' => $status,
    'error' => $error,
    'error_msg' => $error_msg
)));
execute.php would then be called once for each program. The $friendly_name variable gives you a way to send back something for the user to see. The switch statement makes sure that the script isn't being asked to execute anything you aren't expecting. The program is executed, and when it is done you send back a little package of information with the status, the friendly name, any errors, etc. This comes into the JavaScript on view.php, which then decides whether there are more programs to run. If so, it will call execute.php again; if not, it will submit the form.
This seems rather convoluted... and very risky. Any network glitch, the user's browser closing for whatever reason, or even a firewall timing out will abort this script.
Why not run the whole thing in the background?
<?php
session_start();
$_SESSION['background_run_is_done'] = false;
session_write_close(); // release session file lock
set_time_limit(0);
ignore_user_abort(true); // allow job to keep running even if client disconnects.

.... your external stuff here ...

if ($successfully_completed) {
    session_start(); // re-open session file to update value
    $_SESSION['background_run_is_done'] = TRUE;
}

... use curl to submit job completion post here ...
?>
This disconnects the state of the user's browser from the processing of the jobs. You then just have your client-side code ping the server occasionally to monitor the job's progress.
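Such a ping could hit a tiny status script that only reads the session flag set above and returns it as JSON. A minimal sketch (status.php is a hypothetical file name):

<?php
// status.php - polled by the client to see whether the background run has finished
session_start();
$done = !empty($_SESSION['background_run_is_done']);
session_write_close();                      // release the session lock immediately

header('Content-Type: application/json');
echo json_encode(array('done' => $done));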
Launching and managing multiple and long-running processes from a webserver PHP process is fraught with complications and complexity. It's also very different on different platforms (you didn't say which you are using).
Handling the invocation of these processes synchronously from the execution of your PHP is not the way to address this. You really need to run the programs in a separate session group and use (e.g.) Ajax or Comet to poll their status.
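On Linux, launching a program in its own session so that it survives the PHP request could look roughly like the sketch below; the program path and log location are assumptions:

<?php
$cmd = '/path/to/program1';                 // assumed path to the C++ program
$log = '/tmp/program1.log';                 // assumed log location

// nohup plus the trailing "&" detach the job from the PHP process; "echo $!" returns its PID.
$pid = (int) shell_exec(sprintf(
    'nohup %s > %s 2>&1 & echo $!',
    escapeshellcmd($cmd),
    escapeshellarg($log)
));

// Store $pid (session, DB, file) so a status request can later check whether the job
// is still alive, e.g. with posix_kill($pid, 0) if the posix extension is available.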

Ajax long polling (comet) + php on lighttpd v1.4.22 multiple instances problem

I am new to this site, so I really hope I will provide all the necessary information regarding my question.
I've been trying to create a "new message arrived" notification using long polling. Currently I am initiating the polling request from the window.onload event of each page on my site.
On the server side I have an infinite loop:
while (1) {
    if (NewMessageArrived($current_user)) break;
    sleep(10);
}
echo $newMessageCount;
On the client side I have the following (simplified) ajax functions:
function poll_new_messages(){
    xmlhttp = GetXmlHttpObject();
    //...
    xmlhttp.onreadystatechange = got_new_message_count;
    //...
    xmlhttp.send();
}
function got_new_message_count(){
    if (xmlhttp.readyState == 4){
        updateMessageCount(xmlhttp.responseText);
        //...
        poll_new_messages();
    }
}
The problem is that with each page load, the above loop starts again. The result is multiple infinite loops for each user that eventually make my server hang.
*The NewMessageArrived() function queries the MySQL DB for new unread messages.
*At the beginning of the PHP script I run session_start() in order to obtain the $current_user value.
I am currently the only user of this site, so it is easy for me to debug this behavior by writing time() to a file inside the loop. What I see is that the file is being written more often than once every 10 seconds, but this only starts when I go from page to page.
Please let me know if any additional information might help.
Thank you.
I think I found a solution to my problem. I would appreciate it if anyone could tell me whether this is the technique used in COMET and how scalable this solution is.
I used a per-user semaphore, like this:
$sem_id = sem_get($current_user); // note: sem_get() expects an integer key, so $current_user must be numeric (or hashed to an int, e.g. with crc32())
sem_acquire($sem_id);
while (1) {
    if (NewMessageArrived($current_user)) break;
    sleep(10);
}
sem_release($sem_id);
echo $newMessageCount;
It seems common for long-polling requests to time out after 30 seconds. So in your while loop you could echo 'CLOSE' after 30 seconds.
$newMessageCount = 0;
$timer = 0;
while (!$newMessageCount && $timer < 30) {
    $newMessageCount = NewMessageArrived($current_user);
    if (!$newMessageCount) {
        sleep(10);
        $timer += 10;
    }
}
if ($newMessageCount) {
    echo $newMessageCount;
} else {
    echo 'CLOSE';
}
In the Javascript, you can listen for the CLOSE.
function poll_new_messages(){
    xmlhttp = GetXmlHttpObject();
    //...
    xmlhttp.onreadystatechange = got_new_message_count;
    //...
    xmlhttp.send();
}
function got_new_message_count(){
    if (xmlhttp.readyState == 4){
        if (xmlhttp.responseText != 'CLOSE') {
            updateMessageCount(xmlhttp.responseText);
        }
        //...
        poll_new_messages();
    }
}
Now, the PHP will return a response within 30 seconds, no matter what. If your user stays on the page and you receive a CLOSE, you just don't update the count on the page and ask again.
If the user moves to a new page, your PHP instance will stop the loop within 30 seconds regardless and return a response. Being on a new page, though, the XHR that cared about that connection no longer exists, so it won't start another loop.
You might try checking connection_aborted() periodically. Note that connection_aborted() might not pick up on the fact that the connection has in fact been aborted until you've written some output and done a flush().
In fact, just producing some output periodically may be sufficient for php to notice the connection close itself, and automatically kill your script.
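Applied to the loop from the question, that advice could look like the sketch below. It assumes NewMessageArrived() returns the number of new messages (truthy when there is something to send), and the client would need to trim the whitespace padding from the response:

ignore_user_abort(true);                      // we decide ourselves when to stop
$newMessageCount = 0;
while (!($newMessageCount = NewMessageArrived($current_user))) {
    echo " ";                                 // a byte of output so the dead connection can be noticed
    @ob_flush();
    flush();
    if (connection_aborted()) {
        exit;                                 // the browser is gone: stop this long-poll worker
    }
    sleep(10);
}
echo $newMessageCount;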

How to display HTML to the browser incrementally over a long period of time?

Do I need to pass back any HTTP headers to tell the browser that my server won't be closing the connection immediately and that it should display the HTML as it is received? Is anything else, like flush(), necessary to get the HTML to display incrementally?
This technique used to be used for things like chat, but I'm thinking about using it for a COMET type application.
Long polling is a common technique to do something like this; to briefly summarise, it works as follows:
The client sends an XHR to the server.
If there is data ready, the server returns this immediately.
If not, the server keeps the connection open until data does become available, then it returns this.
If the request times out, go back to 1).
The page running on the client receives this data, and does what it does with it.
Go back to 1)
This is how Facebook implements its chat feature.
This article also clears up some of the misconceptions of long-polling, and details some of the benefits of doing so.
The client will close the connection when it does not receive any data for a certain time. This timeout cannot be influenced by HTTP headers. It is client-specific and usually set to 120 seconds IIRC.
So all you have to do is send small amounts of data regularly to avoid hitting the timeout.
I think a more robust solution is a page with a Javascript timer that polls the server for new data. Keeping the response open is not something the HTTP protocol was designed for.
I would just echo/print the HTML as I go. There are a few different ways to have the script pause before sending the next bit. You shouldn't need to do anything with headers or any special code to tell the browser to wait; as long as your script is still running, the browser will render the HTML it receives from it.
echo "<HTML><HEAD.../HEAD><BODY>";
while (running)
{
echo "printing html... </br>";
}
echo "</BODY></HTML>"; //all done
Try a forever frame (like in Gmail).
All of these techniques are just hacks; HTTP isn't designed to do this.
At the end of your script, use something like this (assuming you have output buffering on by putting ob_start() at the top of your page):
<?php
set_time_limit(0); // Stop PHP from closing the script after 30 seconds
ob_start();
echo str_pad('', 1024 * 1024, 'x'); // Dummy 1 megabyte string
$buffer = ob_get_clean();
while (isset($buffer[0])) {
    $send = substr($buffer, 0, 1024 * 30); // Take 30 kB from the buffer
    $buffer = substr($buffer, 1024 * 30);  // Shorten the buffer
    echo $send;      // Send the chunk
    echo '<br />';   // forces the browser to render the new content
    ob_flush();      // Flush output to the browser
    flush();
    sleep(1);        // Sleep for 1 second
}
?>
That script basically outputs 1 megabyte of text at a (simulated) 30 kB/s, no matter how fast the connection between user and server is.
Depending on what you are doing, you could just echo as your script proceeds; this will send the HTML to the browser as it is echoed.
I would suggest you investigate implementing such functionality using Ajax rather than plain old HTML. This allows you much more flexibility in terms of architectural design and user interface.
