PHP websocket fallback to AJAX best practices

Everything is working perfectly with this setup:
I'm using docker-compose running some containers: nginx:1.22.0-alpine, php:8.1-cli, php:8.1.10-fpm, redis:7.0.5, [...].
My folders within the server are like:
-var/www/web
-var/www/web/sharedPHPScript.php
-var/www/web/public
-var/www/web/public/index.php
-var/www/web/core/ajax
-var/www/web/core/ajax/ajax.parser.php
-var/www/web/core/websocket
-var/www/web/core/websocket/websocket.php
-var/www/web/core/websocket/websocket.parser.php
[...]
I just mounted the same host /var/www/web into both containers: php:8.1-cli, php:8.1.10-fpm.
This way I only need to code on my host and the code state is replicated in both containers.
I picked php:8.1-cli to run my websocket server because I was benchmarking for performance, but I do not know whether it would be better to use php-fpm for it.
In docker-compose.yml:
version: '3.7'
services:
[...]
fpm:
[...]
volumes:
- ./web:/web:delegated
swoole:
[...]
volumes:
- ./web:/web:delegated
index.php includes a controller that calls ajax.php whenever I send a fetch request from the client:
index.php>controller.php>include('/web/sharedPHPScript.php')
websocket.php keeps running like a thread, listening for connections to spawn file descriptors on the server:
use Swoole\WebSocket\Server;
// Init redis
$redis = new Redis();
$redis->connect(REDIS['host'], REDIS['port']);
// Create websocket instance
$server = new Server("0.0.0.0", 8443);
[...]
// Register the "onOpen" event handler
$server->on("open", function (Server $server, $request) use(&$redis) {
[...]
include('/web/sharedPHPScript.php');
});
[...]
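(For orientation only: the elided parts of the server are not shown in the question. The handler bodies below are my assumptions, not the original code, but a minimal Swoole WebSocket server of this shape usually also registers a message handler and then blocks on start():)
// Illustrative sketch: typical remaining pieces of a Swoole WebSocket server.
$server->on("message", function (Server $server, Swoole\WebSocket\Frame $frame) {
    // $frame->fd is the client's file descriptor, $frame->data the raw payload.
    $server->push($frame->fd, json_encode(['echo' => $frame->data]));
});
$server->on("close", function (Server $server, int $fd) {
    // Clean up any per-connection state (e.g. Redis keys) for $fd here.
});
// Blocks the php-cli process and starts the event loop.
$server->start();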
On load, my client (JavaScript) creates a WebSocket connection to the server: server-ip/websocket > nginx:1.22.0-alpine > php:8.1-cli.
If the WebSocket connection is not working, it simply starts using an AJAX fetch instead.
function sendJSON(obj){
// check if ws is active
console.log('socket.readyState',socket.readyState);
// use Websocket
if(socket.readyState === 1){
wsSend(obj); // some wrapper for socket.send();
return;
}
// use AJAX
ajaxSend(obj); // some wrapper for ajax fetch().then()
}
My question is:
I want to use this /web/sharedPHPScript.php to handle both the fetches coming from php-fpm and the socket messages coming from php-cli, for easier debugging, since there is then no need to write two different data flows and database connections. So I'm creating some wrapper functions that decide which action to take depending on the origin (websocket.parser.php or ajax.parser.php). Inside /web/sharedPHPScript.php:
if($continue){ sendGlobal($req,$server,$request); $continue = dieGlobal($req); }
if($continue){
sendGlobal($object,$server,$request);
$continue = dieGlobal($req);
}
if(!$continue){return;}
Classes are autoloaded via vendor/autoload.php in each container:
composer.json:
[...]
"autoload":{
"psr-4":{
"Classes\\":"../web/core/!classes/"
}
}
/web/core/!classes/sharedFunctions.php:
// Requests
// used for ajax
function sendAJAX($obj){
header("Content-type: application/json");
echo json_encode($obj);
}
// used for ws
function sendWS($obj,$server,$request){
if($server && $request){$server->push($request->fd, json_encode($obj));}
}
// used for both ajax and ws (Global)
function sendGlobal($obj,$server=0,$request=0){
if(isset($obj->ajax)){ sendAJAX($obj); }
if(isset($obj->ws)){ sendWS($obj,$server,$request); }
}
function dieGlobal($obj){
if(isset($obj->ajax)){ die; }
if(isset($obj->ws)){ return false; }
}
from ajax.parser.php:
$ajax = true;
$req = (object)$_POST;
$continue = true;
$server=null;
$request=null;
include('/web/sharedPHPScript.php');
from websocket.parser.php:
$ws=true;
include('/web/sharedPHPScript.php');
The problem is that, since php-cli runs a loop, I cannot use PHP constructs like die;, header(), or echo to send the HTTP response back to the client. And from php-fpm, I do not have a $server or $request object, etc. Not to mention the wrapper needed for $_SESSION...
So I'm wondering whether there is a better/cleaner/simpler (in terms of best practice) way to do this WebSocket fallback to AJAX, because it is working like a charm and performing very nicely, but sharedPHPScript.php is looking ugly...
I tried separate flows for the WebSocket and AJAX connections, but I realised that, as the project grows, I would need to maintain two slightly different pieces of code, one per container, doing the same task...
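For illustration only (none of this is from the question; the interface and class names are hypothetical), one way to keep a single shared script while hiding the transport differences is to pass a small responder object into sharedPHPScript.php instead of flags:
<?php
// Hypothetical sketch: one responder per transport, so the shared script never
// needs to know whether it was reached via AJAX (php-fpm) or WebSocket (php-cli).
interface ResponderInterface
{
    public function send(object $payload): void;
    public function finish(): bool; // true = stop processing (replaces die;)
}
class AjaxResponder implements ResponderInterface
{
    public function send(object $payload): void
    {
        header('Content-Type: application/json');
        echo json_encode($payload);
    }
    public function finish(): bool
    {
        return true; // the FPM request ends after the response is sent
    }
}
class SwooleResponder implements ResponderInterface
{
    public function __construct(private \Swoole\WebSocket\Server $server, private int $fd) {}
    public function send(object $payload): void
    {
        $this->server->push($this->fd, json_encode($payload));
    }
    public function finish(): bool
    {
        return false; // the CLI event loop keeps running
    }
}
// ajax.parser.php:      $responder = new AjaxResponder();
// websocket.parser.php: $responder = new SwooleResponder($server, $request->fd);
// sharedPHPScript.php then only calls $responder->send($obj) and $responder->finish().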

Related

How can we call url() with a port number in a Laravel controller? The Node server is running on a different port (8001) and Laravel is on port 8000.

My Laravel application is running on 127.0.0.1:8000, but my Node server is running on 127.0.0.1:8001 and has a Node script that scrapes data. How can I call the URL function to point at 127.0.0.1:8001 from a Laravel controller in order to run that script?
for example, check the comment in the index() in DataScrapingController
class DataScrapingController extends Controller
{
public function index()
{
// I want to call the URL below when a user calls this index() controller method:
// http://127.0.0.1:8001/scrape
// After the scrape triggered by that URL finishes, the data is saved in output.json;
// here I read output.json and return the data to the data-scraping blade file.
$data = json_decode(file_get_contents('output.json'));
return view('data-scraping', compact('data'));
}
}
[Image: the Node directory hierarchy inside the Laravel application]
Thanks in advance.
Use the Laravel Http facade, which wraps Guzzle:
$response = Http::get("http://127.0.0.1:8001/scrape");
Please see https://laravel.com/docs/9.x/http-client#main-content
Make sure you are not running php artisan serve, as it can only handle one connection at a time.
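For illustration, the controller from the question could then look roughly like this (a minimal sketch; it assumes the Node /scrape endpoint only responds after output.json has been written):
<?php
use Illuminate\Support\Facades\Http;
class DataScrapingController extends Controller
{
    public function index()
    {
        // Trigger the Node scraper on port 8001 and wait for it to respond.
        Http::get('http://127.0.0.1:8001/scrape');
        // Read the file the scraper wrote and pass it to the Blade view.
        $data = json_decode(file_get_contents('output.json'));
        return view('data-scraping', compact('data'));
    }
}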

Amphp : Run many async loops with same connection (eventstore client)

I'm using an event store client which uses amphp. I need to reuse the connection in many parts of my application.
So I created a connection provider:
public function getConnection(): EventStoreConnection
{
if ($this->connection) {
return $this->connection;
}
$this->connection = $this->createConnection();
wait($this->connection->connectAsync());
return $this->connection;
}
And then I use this connection at many places:
\Amp\Loop::run(function () use ($eventStoreEvents, $streamName) {
$connection = $this->connectionProvider->getConnection();
// Creation of an event stream
yield $connection->appendToStreamAsync($streamName, ExpectedVersion::ANY, $eventStoreEvents);
// sleep(10); // This sleep does not work; the code continues as if nothing happened
});
\Amp\Loop::run(function () use ($streamName, $aggregateFqcn, &$aggregateRoot) {
$start = 0;
$count = \Prooph\EventStore\Internal\Consts::MAX_READ_SIZE;
$connection = $this->connectionProvider->getConnection();
do {
$events = [];
/** @var StreamEventsSlice $streamEventsSlice */
$streamEventsSlice = yield $connection
->readStreamEventsForwardAsync(
$streamName,
$start,
$count,
true
);
if (!$streamEventsSlice->status()->equals(SliceReadStatus::success())) {
dump($streamEventsSlice); // Event stream does not exist
// Error here: the event stream doesn't exist at this point.
throw new RuntimeException('Impossible to generate the aggregate');
}
} while (! $streamEventsSlice->isEndOfStream());
});
The problem: it seems that the second loop starts before the first request has finished. Uncommenting the sleep has no effect!
But the event stream is finally created with the related events inside, so the first request worked.
If I start a connection, then close it, then start a new one, it works. But it's slow, due to the handshake overhead of each new connection.
I tried a similar example with the Amphp WebSocket library and it worked. Do you see anything wrong?
Here is my test with websocket that worked:
$connection = \Amp\Promise\wait(connect('ws://localhost:8080'));
Amp\Loop::run(function () use ($connection) {
/** @var Connection $connection */
yield $connection->send("Hello...");
sleep(10); // This sleep works!
});
Amp\Loop::run(function () use ($connection) {
/** @var Connection $connection */
yield $connection->send("... World !");
});
$connection->close();
What you are trying to do makes no sense. You should read amphp's documentation.
Amp uses a global accessor for the event loop as there’s only one event loop for each application. It doesn’t make sense to have two loops running at the same time, as they would just have to schedule each other in a busy waiting manner to operate correctly.
That said, there is literally NO SECOND LOOP.
The Prooph event store library is based on amphp but doesn't follow all of its principles: you can't wait for the connection to be ready. It gets even worse if you try to use it at scale, so don't try to wait for the promise to complete.
As an alternative, you can set up a promise to be resolved later and check if the connection is null. That's what the library actually does internally to process further steps.
For my part, I decided to stop using this library. As an alternative, you can use the library based on the HTTP client, which is also from the prooph team.
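To make the "one event loop" point concrete, here is a sketch of the question's two operations combined into a single Loop::run() (assuming the same Amp v2 / prooph async client API used above, and that the connection is obtained before the loop starts, since Amp\Promise\wait() cannot run inside an already-running loop):
$connection = $this->connectionProvider->getConnection(); // connect before the loop runs
Amp\Loop::run(function () use ($connection, $eventStoreEvents, $streamName) {
    // The append only counts as finished once this yield resolves...
    yield $connection->appendToStreamAsync($streamName, ExpectedVersion::ANY, $eventStoreEvents);
    // ...so reading the stream back here, inside the same loop, sees the new events.
    $streamEventsSlice = yield $connection->readStreamEventsForwardAsync(
        $streamName,
        0,
        \Prooph\EventStore\Internal\Consts::MAX_READ_SIZE,
        true
    );
    // $streamEventsSlice now holds the freshly appended events.
});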

Using reactive PHP in a blocking application

I'm currently working on a PHP application that will be using some websocket connections to talk to another service.
To talk to this websocket service, we are using Ratchet - which is a PHP library based on react PHP.
This piece of code needs to send and respond to a couple of requests, and after that, should return the information to the "main thread".
Example flow:
HTTP request -> controller -> Starts a service which opens a websocket client -> websocket client is talking to server -> once its done it should return the outcome to the controller code -> controller outputs to user
The issue I'm having is that I'm not familiar with Reactive PHP and am not sure how to handle this.
I've tried;
$service = new WebsocketService();
$startTimer = time();
$service->getList(44);
while($service->getResponse() == null) {
usleep(500);
if (time() > $startTimer + 10) {
break; // Timeout after 10 seconds
}
}
var_dump($service->getResponse());
The service code would set its response variable to something other than null once it's done. This obviously fails, because the sleep call is blocking the thread. Even without it, the while loop blocks I/O and the reactive code fails.
A solution would be to open up a new thread and run the websocket code there, but I wouldn't be happy with that.
I feel like I need to implement some sort of "watcher" around the websocket process, but I'm not sure how to do that.
Our Websocket service client code looks like this;
private $response = null;
/**
* @return null|object
*/
public function getResponse() {
return $this->response;
}
public function getList($accountId) {
$this->response = null;
\Ratchet\Client\connect('ws://192.168.56.1:8080')->then(function(\Ratchet\Client\WebSocket $conn) use ($accountId) {
$login = new \stdClass();
$login->action = 'login';
$conn->on('message', function($msg) use ($conn, $login, $accountId) {
try {
$response = json_decode($msg);
if ($response->result_id == 100) {
//Succesfully logged in to websocket server
//Do our request now.
$message = new \stdClass();
$message->target = 'test';
$conn->send(json_encode($message));
}
if (isset($response->reply) && $response->reply == 'list') {
$this->response = $response; //This is the content I need returned in the controller
$conn->close(); //Dont need it anymore
}
} catch (\Exception $e) {
echo 'response exception!';
//Do nothing for now
}
});
$conn->send(json_encode($login));
}, function ($e) {
echo "Could not connect: {$e->getMessage()}\n";
});
}
Running the code like this also does not work:
$service = new WebsocketService();
$service->getList(44);
echo 'Test';
var_dump($service->getResponse());
because the "test" echo comes before I even get a response from the websocket server.
Please, enlighten me! I'm not sure what to search for.
PHP and websockets still seem to be a bit experimental. Nevertheless, I have found a great tutorial on medium.com, written by Adam Winnipass, which should be really helpful for solving your problem: https://medium.com/@winni4eva/php-websockets-with-ratchet-5e76bacd7548
The only difference is that they implement their websocket client in JavaScript instead of PHP. But in the end there should not be much of a difference, because as soon as the WebSocket connection is open on each end, both applications have to send and also wait to receive notifications.
It seems that one way to create a successful WebSocket connection is to implement the MessageComponentInterface
use Ratchet\MessageComponentInterface;
which also requires
use Ratchet\ConnectionInterface;
The message component interface defines the following methods:
onOpen
onMessage
onClose
onError
And I think this is how the Ratchet library implements it. This is how they finally start their server:
use Ratchet\Server\IoServer;
use MyApp\MyCustomMessageComponentInterface;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;
require dirname(__DIR__) . '/vendor/autoload.php';
$server = IoServer::factory(
new HttpServer(
new WsServer(
new MyCustomMessageComponentInterface()
)
),
8080
);
$server->run();
With this architecture you already can receive (onMessage) and sending is also possible with the send() method.
I cannot solve the exact problem with your existing code. But I guess that if you use the pre-built classes and interfaces of the library as intended (and demonstrated here), you should be able to achieve what you want by adding your code to the corresponding methods.
More information and examples can be found in the docs:
http://socketo.me/docs/server
http://socketo.me/api/namespace-Ratchet.html
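For reference, a minimal component implementing those four methods (essentially Ratchet's hello-world chat example; the class name matches the bootstrap snippet above, and the broadcast logic is just an illustration) could look something like this:
<?php
namespace MyApp;
use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
class MyCustomMessageComponentInterface implements MessageComponentInterface
{
    /** @var \SplObjectStorage Currently connected clients */
    protected $clients;
    public function __construct()
    {
        $this->clients = new \SplObjectStorage();
    }
    public function onOpen(ConnectionInterface $conn)
    {
        $this->clients->attach($conn);
    }
    public function onMessage(ConnectionInterface $from, $msg)
    {
        // Broadcast the incoming message to every other connected client.
        foreach ($this->clients as $client) {
            if ($client !== $from) {
                $client->send($msg);
            }
        }
    }
    public function onClose(ConnectionInterface $conn)
    {
        $this->clients->detach($conn);
    }
    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}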
Are you extending your class from WsServer? That might be the issue if you are getting fatal errors; I am not sure whether you are getting fatal errors or warnings. I also notice that the public onOpen() function opens a connection. Please try referring to this document, it might be useful: http://socketo.me/api/class-Ratchet.WebSocket.WsServer.html

Websocket Testing [Functional]

I am using Ratchet for websockets in PHP. I was able to write unit (phpspec) and acceptance (behat) tests for websockets, but I cannot find a way to test the connection to the websocket server with a functional phpunit test. I think a test which checks that the connection is up and running would be very important. I thought of something like the following:
Create a (ratchet) client in phpunit
Connect to the ws url (e.g. client->connect(host, port, ...))
ping websocket / send / receive some messages (call methods of client, e.g. client->push(..)..)
The problem is that I don't know which class in Ratchet is responsible for establishing the connection (creating a client which can request the websocket) and what such a test would then look like. How can I create a Ratchet client so that I can connect to and request a websocket in a functional phpunit test (similar to, e.g., a web client in a standard phpunit functional test)?
As an example, within a functional test for a feature, I could do the following:
$client = static::createClient();
$client->request('GET', '/book/3', array(), array(), array('HTTP_X-Requested-With' => 'XMLHttpRequest'));
$response = $client->getResponse();
$this->assertEquals(
440,
$response->getStatusCode()
);
Or e.g. create an authenticated client instead of an anonymous. How would it be possible, to "translate" this functional test into a ratchet websocket one?
I hope this partially answers your question of which part of the Ratchet WebSocket stack is responsible for making a connection.
The test itself will probably not work as-is; I am not that experienced with testing in PHP, but it should put you on the right path.
public function testCanConnect()
{
\Ratchet\Client\connect('ws://127.0.0.1:8080')->then(function($conn) {
$conn->on('message', function($msg) use ($conn) {
print $msg."\n";
$this->assertEquals('true',true);
$this->assertEquals("Hello World",$msg);
$conn->close();
});
$conn->send('Hello World!');
}, function ($e) {
echo "Could not connect: {$e->getMessage()}\n";
});
}
If you have any further questions please let me know.

How to use Ratchet to respond to HTML5 server-side events?

(Note: I've intentionally used the not-quite-adequate websocket tag here, as WebSocket experts have the best chance of knowing Ratchet's architecture.)
I'm about to implement HTML5 server-side (server-sent) events, and what I need is a server-side solution. Since hanging one Apache process per connection (connection pool limit, memory consumption...) is out of the question, I was hoping the Ratchet project could help, since it is the most maintained project and it ships an HTTP server coupled with its other components.
My question is: how can I use it? Not for upgrading an HTTP request (the default usage), but for serving dynamically generated content.
What have I tried so far?
installed Ratchet as explained in the tutorial
tested the WebSocket functionality - it works properly
followed the very basic set of instructions given on the page that describes the HTTP server component:
/bin/http-server.php
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
require dirname(__DIR__) . '/vendor/autoload.php';
$http = new HttpServer(new MyWebPage);
$server = IoServer::factory($http);
$server->run();
One does not need to be an expert to figure out that the MyWebPage class here needs to be declared in order for the server to work, but how?
The Ratchet documentation does not seem to cover this.
Your MyWebPage class needs to implement HttpServerInterface. Since it's just going to be a simple request/response you need to send a response and then close the connection within the onOpen() method of your class:
<?php
use Guzzle\Http\Message\RequestInterface;
use Guzzle\Http\Message\Response;
use Ratchet\ConnectionInterface;
use Ratchet\Http\HttpServerInterface;
class MyWebPage implements HttpServerInterface
{
protected $response;
public function onOpen(ConnectionInterface $conn, RequestInterface $request = null)
{
$this->response = new Response(200, [
'Content-Type' => 'text/html; charset=utf-8',
]);
$this->response->setBody('Hello World!');
$this->close($conn);
}
public function onClose(ConnectionInterface $conn)
{
}
public function onError(ConnectionInterface $conn, \Exception $e)
{
}
public function onMessage(ConnectionInterface $from, $msg)
{
}
protected function close(ConnectionInterface $conn)
{
$conn->send($this->response);
$conn->close();
}
}
I ended up using the Ratchet\App class instead of Ratchet\Http\HttpServer because it allows you to set up routing among other things, so your /bin/http-server.php would then look like this:
<?php
use Ratchet\App;
require dirname(__DIR__) . '/vendor/autoload.php';
$app = new App('localhost', 8080, '127.0.0.1');
$app->route('/', new MyWebPage(), ['*']);
$app->run();
When you run php bin/http-server.php and visit http://localhost:8080 you should see the Hello World! response in your browser.
This is all you need for a basic request/response system, but it could be extended further by implementing HTML templates and things like that. I've implemented this myself in a little test project which I've uploaded to github along with a lot of other things, including an abstract controller which I can extend for different pages.
Chat server using Ratchet - Basic
Chat server using Ratchet - Advanced
Check the links above. The author there uses Ratchet to build a real-time chat server. He basically stores usernames initially and then sends/broadcasts to everyone. You can modify it to check, at send time, whether a certain username or uid is currently active and send data only to them. You can generate data dynamically and send it to particular users or to all. Maybe this will help.
