I'm using server-sent events in my project: the JS calls a PHP page, say eventserver.php, which consists basically of an infinite loop that checks for an event in a $_SESSION variable.
My first implementation caused my website to hang, because the event server took the lock on the session and did not release it until the timeout expired; however, I managed to resolve that by locking/unlocking the session with session_write_close() and
session_start() continuously in the loop.
This is now causing a lot of PHP warnings in the Apache error.log, such as "Cannot send session cache limiter - headers already sent", "Cannot send session cookie" and so on.
Here is some of the code:
session_start();
header('Cache-Control: no-cache');
header('Content-Type: text/event-stream');

class EventServer
{
    public function WaitForEvents( $eventType )
    {
        // ... do stuff
        while( true )
        {
            // lock the session to this instance
            session_start();
            // ...check/output the event
            ob_flush();
            flush();
            // unlock the session
            session_write_close();
            sleep( 1 );
        }
    }
}
Why is this happening?
I am doing the same thing as the OP and ran into the same issue. Some of these answers don't seem to understand how EventSource is meant to work. My code is identical to yours: it uses a session variable to know which view the user is on, which determines what data to return when the server triggers an event. It's part of a realtime collaboration app.
I simply prepended an @ to the session_start() calls to suppress the warnings in the log. Not really a fix, but it keeps the log from filling up.
Alternatively (not sure how well it would work for your application), you could use Ajax to write the session variable you are monitoring to the database, so that your EventSource script can watch for a change in the DB instead of having to restart the session.
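Something along those lines might look like this inside the streaming loop, assuming a PDO connection ($pdo), the user's id ($userId) read from the session once at the top, a $lastSeen timestamp kept between iterations, and a hypothetical user_state table that the Ajax call writes to:

// poll the database instead of re-opening the session on every iteration
$stmt = $pdo->prepare('SELECT current_view, updated_at FROM user_state WHERE user_id = ?');
$stmt->execute(array($userId));
$state = $stmt->fetch(PDO::FETCH_ASSOC);

if ($state && $state['updated_at'] > $lastSeen) {
    echo 'data: ' . json_encode($state) . "\n\n";
    $lastSeen = $state['updated_at'];
    ob_flush();
    flush();
}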
This is not a good idea. HTTP is a request-response protocol, so if you want server-client communication to be bi-directional you will need to look into WebSockets or something similar. There are also techniques like long polling and heartbeating.
If you want an event loop, try something like servlets in Apache Tomcat.
You will grapple with issues for hours because of this design.
Also check out Ajax if you just want to send messages from JavaScript to PHP.
Make sure you have an overview of the tech stack you are working with :)
You don't need an infinite loop with SSE. The EventSource keeps an open connection to the server, and any update of the server-side data will be read by the client.
Check out the basic usage of SSE here.
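For what it's worth, the loop-free approach leans on the browser's automatic reconnection: the script sends whatever is new, sets the retry interval, and exits; the EventSource then reconnects on its own and resends the last event id. A rough sketch (fetch_new_events() is just a placeholder for however you load fresh data):

<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

// ask the browser to reconnect 5 seconds after this script ends
echo "retry: 5000\n\n";

$lastEventId = isset($_SERVER['HTTP_LAST_EVENT_ID']) ? (int)$_SERVER['HTTP_LAST_EVENT_ID'] : 0;

// fetch_new_events() stands in for your own data lookup
foreach (fetch_new_events($lastEventId) as $id => $data) {
    echo "id: $id\n";
    echo "data: $data\n\n";
}
flush();
// the script ends here; the EventSource reconnects and sends the Last-Event-ID header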
It's probably because you start the session twice in your code. Don't restart the session at the beginning of the loop; do it after the sleep().
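In other words, make sure the initial session_start() is paired with a session_write_close() before any output, so only one start/close pair is in play at a time. A sketch combining that with the @ suppression mentioned above ('event' is a placeholder session key, and this is an untested arrangement, not the exact code from the answer):

<?php
// read the session once, before any output, then release the lock
session_start();
session_write_close();

header('Cache-Control: no-cache');
header('Content-Type: text/event-stream');

$event = null;
while (true) {
    if ($event !== null) {
        echo 'data: ' . json_encode($event) . "\n\n";   // push whatever was found
    }
    @ob_flush();
    flush();

    sleep(1);

    // reopen the session only after the sleep, and close it again immediately;
    // the @ suppresses the residual header warnings, as in the answer above
    @session_start();
    $event = isset($_SESSION['event']) ? $_SESSION['event'] : null;
    session_write_close();
}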
Related
I have a WordPress website with a working order system. Now I want to make an Android app that displays every new order in a list view as soon as the order is made.
For the last two days I have been thinking about the following solutions:
1. Simple HTTP GET requests every 10 seconds
2. WebSockets
3. MySQL binary log + Pusher Link
4. Server-Sent Events
My thoughts (working with a LAMP stack):
1. Simple HTTP requests are obviously the most inefficient solution.
2. I figured out that WebSockets and Apache don't work well together.
3. Feels quite hacky, and I want to avoid any 3rd-party service if I can.
4. This looks like the optimal way for me; however, there are some problems with Apache/PHP and Server-Sent Events from what I have experienced.
I tried to implement a simple demo script, but I don't understand why some examples use an infinite while loop to keep the connection open and others don't.
Here is an example without a loop, and here is one with an infinite loop, and also here.
In addition, when I tested the variant with the infinite loop, my whole page wouldn't load because of the sleep() call. It looks like the whole server freezes whenever I use it.
Does anyone have an idea how to fix that? Or do you have other suggestions?
This is the code that causes trouble (copied from here, with a missing curly bracket added):
<?php
// make session read-only
session_start();
session_write_close();

// disable default disconnect checks
ignore_user_abort(true);

// set headers for stream
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: *");

// Is this a new stream or an existing one?
$lastEventId = floatval(isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : 0);
if ($lastEventId == 0) {
    $lastEventId = floatval(isset($_GET["lastEventId"]) ? $_GET["lastEventId"] : 0);
}

echo ":" . str_repeat(" ", 2048) . "\n"; // 2 kB padding for IE
echo "retry: 2000\n";

// start stream
while (true) {
    if (connection_aborted()) {
        exit();
    } else {
        // here you will want to get the latest event id you have created on the server,
        // but for now we will increment and force an update
        $latestEventId = $lastEventId + 1;
        if ($lastEventId < $latestEventId) {
            echo "id: " . $latestEventId . "\n";
            echo "data: Howdy (" . $latestEventId . ") \n\n";
            $lastEventId = $latestEventId;
            ob_flush();
            flush();
        } else {
            // no new data to send
            echo ": heartbeat\n\n";
            ob_flush();
            flush();
        }
    }
    // 2 second sleep then carry on
    sleep(2);
}
?>
I'm thankful for every piece of advice I can get! :)
EDIT:
The main idea is to frequently check my MySQL database for new entries; if a new order is present, format the data nicely and send it over SSE to my Android application.
I have already found libraries to receive SSEs on Android; the main problem is on the server side.
Based on your question I think you could implement SSE (Server-Sent Events), which is part of the HTML5 standard. It is one-way communication from server to client. It needs HTML/JavaScript and a backend language, e.g. PHP.
The client subscribes to events, and once the subscription is up and running the server sends updates whenever the input data changes. By default an update is delivered roughly every 3 seconds (the browser's default retry interval), though this can be adjusted.
I would recommend first building a basic, functioning web-browser client. When (and if) it works as you expect, only then judge the effort of building the client as an app.
You would probably need to add functions on the client side, such as starting/stopping the subscription.
My understanding of why people do not recommend combining Server-Sent Events with Apache is the lack of control over how many connections are open and over how connections get continuously closed; this can lead to severe server performance problems.
Using, for example, Node.js would not have that problem.
Here are some links to get started:
MDN:
https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
Stream Updates with Server-Sent Events:
https://www.html5rocks.com/en/tutorials/eventsource/basics/
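To make that concrete for the order case described above, a minimal endpoint might look roughly like this; the orders table, its columns, and the DSN/credentials are placeholders, not the actual WordPress schema:

<?php
session_start();
session_write_close();               // don't hold the session lock while streaming
ignore_user_abort(true);
set_time_limit(0);                   // the default max_execution_time would kill the loop

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$pdo = new PDO('mysql:host=localhost;dbname=wordpress', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_DEFAULT_FETCH_MODE, PDO::FETCH_ASSOC);
$lastOrderId = isset($_SERVER['HTTP_LAST_EVENT_ID']) ? (int)$_SERVER['HTTP_LAST_EVENT_ID'] : 0;

while (!connection_aborted()) {
    $stmt = $pdo->prepare('SELECT id, customer, total FROM orders WHERE id > ? ORDER BY id');
    $stmt->execute(array($lastOrderId));
    foreach ($stmt as $order) {
        echo 'id: ' . $order['id'] . "\n";
        echo 'data: ' . json_encode($order) . "\n\n";   // formatted as JSON for the app
        $lastOrderId = $order['id'];
    }
    echo ": heartbeat\n\n";          // comment line keeps the connection from idling out
    @ob_flush();
    flush();
    sleep(2);
}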
I have a PHP/Apache page that takes a long time to load. Basically, it looks like this:
<?php
doHeavyStuff_1();
doHeavyStuff_2();
doHeavyStuff_3();
printResults();
?>
It happens from time to time that the client disconnects in the middle of the processing, say, between step 1 and step 2. Is there a way in PHP to check whether the client is still connected and to stop further processing if it isn't? I'd like my code to be like this:
<?php
doHeavyStuff_1();
if(<clientDisconnected>) die;
doHeavyStuff_2();
if(<clientDisconnected>) die;
doHeavyStuff_3();
if(<clientDisconnected>) die;
printResults();
?>
Look into using the connection_aborted() function.
Normally PHP (as an Apache module) will stop automatically if the user disconnects; no extra work is necessary.
Just in case you are interested: if you want PHP to continue processing, there is the ignore_user_abort() function. You should read the manual page, and especially the comments, to see how it can be used and which problems can occur.
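One caveat, in case it matters here: PHP only notices the disconnect when it next tries to send output to the client, so a check placed between heavy steps usually pushes a little output out first. A rough sketch along the lines of the pseudocode above (it assumes a response format where a stray space is harmless):

<?php
ignore_user_abort(true);   // keep running so we decide ourselves when to stop

function clientGone()
{
    // connection_aborted() is only updated when PHP actually writes to the client,
    // so push a harmless byte out before checking
    echo ' ';
    @ob_flush();
    flush();
    return connection_aborted();
}

doHeavyStuff_1();
if (clientGone()) die;
doHeavyStuff_2();
if (clientGone()) die;
doHeavyStuff_3();
if (clientGone()) die;
printResults();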
I'm interested in using the mark-as-read method for emails and in-app notifications, as specified here.
I'm using Symfony 1.4 and I can't find anything within sfContext or sfWebRequest that can be used to detect if the connection has been terminated. Is there anyway to do this within the Symfony framework?
Would I be able to use the PHP functions connection_status() or connection_aborted()?
In a typical PHP script, execution stops the next time the script tries to send output to the client after the user has terminated the connection.
Because of this, you shouldn't really need to do anything for this to work, unless you have changed the ignore_user_abort setting in your php.ini or someone has called ignore_user_abort() in code.
But if you want to be safe, you could do the following:
<?php
// Whatever you need here
sleep(5);

if (false == connection_aborted()) {
    // Mark the notification as read
}
Below is the code that I ended up with for a successful comet implementation.
$lastmodif = isset($this->params['form']['timestamp']) ? $this->params['form']['timestamp'] : 0;
$currentmodif = $already_updated[0]['Update']['lastmodified'];

while ($currentmodif <= $lastmodif) {
    usleep(5000000);
    clearstatcache();
    $already_updated_new = $this->Update->find('all', array(
        'conditions' => array(
            'Update.receiver_id' => $this->Auth->user('id'),
            'Update.table_name'  => "request_responses"
        )
    ));
    $currentmodif = $already_updated_new[0]['Update']['lastmodified'];
}
$already_updated[0]['Update']['lastmodified'] is the query result holding the last-updated timestamp of the table.
In the code above, $lastmodif and $currentmodif are the timestamps passed along after every successful comet response.
The problem now is that when I click on other links on the same page nothing happens; only after a very long wait does the page finally redirect.
I think usleep() is blocking the other HTTP requests.
I am using MySQL and CakePHP; please guide me on what I should do to solve this issue.
I have tried to flush when the page is called, but it shows a "cannot modify header" error because output has already been sent.
Thanks.
I've run into a similar situation several times. It looks like the session is locked by your sleeping script.
How to solve it in CakePHP:
Call session_write_close(); at the start of your script.
There is no way to do that via Cake's Session Component or Helper.
Note: if something inside the script uses the session, Cake will reopen it and again hang all requests that use the same session. In that case you will need to close the session before the sleep, or before any operation that takes a long time to finish.
If your script uses sessions then you can see this behavior: PHP locks the session file until the script completes.
This means that once a script starts a session, any other script that attempts to start a session with the same session ID is blocked until the previous script releases the lock (or terminates).
The workaround is to release the session lock before any lengthy processing (see the sketch below):
1. call session_start()
2. read/write any session variables
3. call session_write_close()
4. do the lengthy processing
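Put into code, those four steps might look roughly like this (somethingNew() is just a placeholder for whatever check the lengthy part performs):

<?php
session_start();                        // 1. open (and lock) the session
$userId = $_SESSION['user_id'];         // 2. read what the long request needs...
$_SESSION['last_poll'] = time();        //    ...and write anything you have to
session_write_close();                  // 3. release the lock for other requests

// 4. lengthy processing; other requests from the same browser are no longer blocked
while (!somethingNew($userId)) {
    usleep(500000);                     // half a second between checks
}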
Yes, the usleep() is blocking further requests. Depending on your hosting environment, you probably have a limited number of processes available. I assume you have multiple users in your chat; they all occupy blocking processes until none are available, which is why your other "links" time out.
I would suggest implementing the wait on the client (browser) side, e.g.
setTimeout(function() {
    fetchAndPrintTheNewChats();
}, 5000); // 5 seconds
Any approach to do this within your PHP code will result in the same problem.
Can you share which version of CakePHP you are using, in case someone else who comes along has a solution?
Cake has a session component: http://book.cakephp.org/2.0/en/core-libraries/components/sessions.html
and a session helper: http://book.cakephp.org/2.0/en/core-libraries/helpers/session.html
The scenario is as follows:
A call to a specified URL, including the Id of a known SearchDefinition, should create a new Search record in a DB and return the new Search.Id.
Before returning the Id, I need to spawn a new process / start async execution of a PHP file that takes the new Search.Id and does the searching.
The UI then polls a 3rd PHP script to get the status of the search (the 2nd script keeps updating the Search record in the DB).
This gives me a problem around spawning the 2nd PHP script in an async manner.
I'm going to be running this on a 3rd-party server, so I have little control over permissions. As such, I'd prefer to avoid a cron job or similar polling for new Search records (and I don't really like polling if I can avoid it). I'm not a great fan of having to use a web server for work that is not web-related, but to avoid permission issues it may be required.
This seems to leave me 2 options:
1. Calling the 1st script returns the Id and closes the connection, but the script continues executing and actually does the search (i.e. stick script 2 at the end of script 1, but close the response at the appropriate point).
2. Launch a second PHP script in an asynchronous manner.
I'm not sure how either of the above could be accomplished. The first still feels nasty.
If it's necessary to use cURL or similar to fake a web call, I'll do it, but I was hoping for some kind of convenient multi-threading approach where I simply spawn a new thread and point it at the appropriate function, with permissions inherited from the caller (i.e. the web server user).
I'd rather use option 1. This would also keep related functionality closer together.
Here is a hint on how to send something to the user, then close the connection and continue executing
(by tom ********* at gmail dot com, source: http://www.php.net/manual/en/features.connection-handling.php#93441):
<?php
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true); // optional
ob_start();
echo ('Text user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean();
//do processing here
sleep(5);
echo('Text user will never see');
//do some processing
?>
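For completeness, if option 2 ever looks more attractive: the usual low-tech way to start the second script asynchronously on a LAMP host is to launch it as a detached CLI process. Whether exec() is even allowed on the 3rd-party server is an open question, so treat this purely as a sketch (the worker path and its argument handling are assumptions):

<?php
// $searchId is assumed to hold the Id of the Search record you just created.
// Fire-and-forget: start the worker with that Id and return immediately.
$cmd = 'php ' . escapeshellarg('/path/to/search_worker.php')
     . ' ' . escapeshellarg($searchId)
     . ' > /dev/null 2>&1 &';           // discard output and run in the background
exec($cmd);
// exec() returns as soon as the shell does, because nothing keeps the pipe open
echo json_encode(array('searchId' => $searchId));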
Swoole: an asynchronous & concurrent extension for PHP.
https://github.com/matyhtf/swoole
event-driven
fully asynchronous, non-blocking
multi-threaded reactor
multi-process workers
millisecond timers
async MySQL
async tasks
async file system read/write
async DNS lookup