How do I use Kafka and Redis in a Laravel project? - php

I have been having problems configuring Kafka and Redis together in Laravel.
I am able to run Redis as the in-memory database, so Redis works fine.
$redis = app()->make('redis');
return $redis->get('name1'); // it runs fine returning value of "name1"
I am able to configure Kafka on my Windows system, where I can produce and consume messages in terminals.
I have successfully configured rdkafka as the PHP client library, along with its extensions.
The packages I am using in Laravel for Kafka are "superbalist/laravel-pubsub": "^3.0" and "superbalist/php-pubsub-kafka": "^2.0".
The code below subscribes and consumes the message:
$pubsub = app('pubsub');
$pubsub->subscribe('test1', function ($message) {
    var_dump($message); // the code just gets stuck here
});
The browser just keeps loading and won't stop. I tried looking into the code inside the vendor directory, but I couldn't make sense of what happens there.
My .env, as required by the package:
REDIS_HOST=localhost
REDIS_PASSWORD=null
REDIS_PORT=6379
PUBSUB_CONNECTION=redis
KAFKA_BROKERS=localhost
GOOGLE_CLOUD_PROJECT_ID=your-project-id-here
GOOGLE_CLOUD_KEY_FILE=path/to/your/gcloud-key.json
HTTP_PUBSUB_URI=null
HTTP_PUBSUB_SUBSCRIBE_CONNECTION=redis
If the local Redis server and client terminals are closed, the error I get is:
Error while reading line from the server [tcp://localhost:9092]
Please let me know if anyone has been able to configure them both in Laravel.

The call to the subscribe() method is blocking, which means the script will never finish; that is why your browser never stops loading.
The PHP script containing the call to subscribe() needs to be run from the CLI rather than the browser, because that code consumes Kafka messages and has to stay alive. If you want to publish messages to Kafka, use the publish() method.
From the documentation:
// consume messages
// note: this is a blocking call
$adapter->subscribe('my_channel', function ($message) {
    var_dump($message);
});

// publish messages
$adapter->publish('my_channel', 'HELLO WORLD');
$adapter->publish('my_channel', ['hello' => 'world']);
$adapter->publish('my_channel', 1);
$adapter->publish('my_channel', false);
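One way to run the consumer from the CLI is to wrap the subscribe() call in an Artisan console command. The sketch below is only an illustration, assuming a Laravel 5.x app with the laravel-pubsub service provider registered; the command class, its signature and the default channel name are hypothetical, not part of the package:
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class ConsumePubSubMessages extends Command
{
    // Hypothetical command name and channel argument
    protected $signature = 'pubsub:consume {channel=test1}';
    protected $description = 'Consume pub/sub messages (blocking; run from the CLI)';

    public function handle()
    {
        $pubsub = app('pubsub');

        // subscribe() blocks forever, which is fine in a long-running CLI process
        $pubsub->subscribe($this->argument('channel'), function ($message) {
            $this->info(print_r($message, true));
        });
    }
}
After registering the command in app/Console/Kernel.php you would run it with php artisan pubsub:consume test1, ideally under a process supervisor (e.g. Supervisor) so the consumer is restarted if it dies. The browser-facing route should only ever call publish().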

Related

Laravel Caching with Redis is very slow

I'm taking my first steps with Redis on Laravel and there is something odd I've noticed.
When using Redis as the cache driver in my setup, it takes far too much time to load a page.
How do I know? When I use the Redis facade directly instead of the Cache facade, response times are just a fraction of that. I set up a Laravel installation from scratch and built a migration and seeder for a simple Article model.
At first I thought the items were not stored in Redis, as redis-cli didn't show them when searching with KEYS *. Then I figured out that the cache is stored in a different DB, selected by REDIS_CACHE_DB as found in config/database.php.
INFO keyspace in redis-cli lists those two DBs, named 0 and 1.
I thought the problem could be caused by my localhost setup with MAMP Pro, so I switched over to the Laravel Homestead box and uploaded my project there. Same result.
Here's the code I'm using:
routes/web.php
use Illuminate\Support\Facades\Redis;
use Illuminate\Support\Facades\Cache;
use Illuminate\Http\Request;
use App\Article;

Route::get('/get-articles-mysql', function (Request $request) {
    return response()->json(Article::take(20000)->get());
});

Route::get('/get-articles-cache', function (Request $request) {
    return Cache::remember('posts', 60, function () {
        return Article::take(20000)->get();
    });
});

Route::get('/get-articles-redis', function (Request $request) {
    if ($posts = Redis::get('posts.all')) {
        return response()->json(json_decode($posts));
    }
    $posts = Article::take(20000)->get();
    Redis::set('posts.all', Article::take(20000)->get());
    return response()->json($posts);
});
I'm using Postman to measure the response times. I made several runs, since the caching routes should be slow on the first request while the cache is still empty. On average I get this:
http://laravel-echo.local/get-articles-mysql 583ms
http://laravel-echo.local/get-articles-redis 62ms
http://laravel-echo.local/get-articles-cache 730ms
I'm not getting this. Using the Redis facade directly is super fast, but why is caching so slow? Yes, I double-checked my .env files. There is CACHE_DRIVER=redis, so I'm not using the file system by accident. And I used both php artisan config:clear and php artisan cache:clear to avoid mistakes while debugging.
I see a key called "laravel_cache:posts" in redis-cli, so the cached posts are there. It just takes ages to load them. I also tested the requests in Chrome; the response times are much longer there, but caching still takes longer than the plain MySQL query.
So, any suggestions as to what could be going on here?
I know this thread is already very old, but I was still seeing the same behaviour.
I am using Laragon for local development, and Redis was making my API requests 4x slower.
EDIT:
I just found the problem.
In my .env file I had REDIS_HOST=localhost, and that was exactly the problem.
After I changed it to REDIS_HOST=127.0.0.1, everything runs fast.
Try it and let me know.
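For reference, a minimal sketch of the relevant Redis entries in config/database.php (assuming a recent Laravel and the predis client; the values shown are illustrative, not taken from the original post). The key point is defaulting the host to 127.0.0.1 rather than localhost, since resolving "localhost" can trigger a slow DNS/IPv6 lookup on every connection on some setups, notably Windows:
// config/database.php (sketch)
'redis' => [
    'client' => 'predis',

    'default' => [
        'host'     => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port'     => env('REDIS_PORT', 6379),
        'database' => 0,
    ],

    'cache' => [
        'host'     => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port'     => env('REDIS_PORT', 6379),
        'database' => env('REDIS_CACHE_DB', 1),
    ],
],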

PHP Secure Websocket Client Trouble, Needs to be Non Blocking

I'm building a dashboard that allows me to visualise my crontab as it runs (think of a queue of upcoming tasks, those currently running, and those that have finished along with whether the outcome was successful). To do this I need to send messages from the tasks (run or monitored by PHP) on my server to the client browsers that run the dashboard in JavaScript. It also has to be secure.
To solve this I implemented a Twisted/Autobahn socket server in Python, which worked fine once I had paid for proper security certificates. The problem, however, is getting the PHP running the crontasks to send messages to the WebSocket server, which then passes them on to the client browsers. So far I have hacked this by writing a Python client that accepts the message to send as an argument and running it via exec from PHP.
Obviously this is not a robust solution (and it is relatively slow to execute), and I'd now like to send logfile entries from the crontasks over WebSockets to my dashboards so I can see what's happening on my servers as tasks run. I've been looking for a while and have tried various approaches (most are too long to post); they range from tutorials, to snippets from the PHP website, to libraries such as Thruway (which seems over-engineered for my use case, too specialised and hard to adapt).
The best progress I have made so far is with Pawl; using the following code I'm able to successfully send three messages to the Python socket server over wss:
<?php
require __DIR__ . '/vendor/autoload.php';

\Ratchet\Client\connect('wss://127.0.0.1:9000')->then(function ($conn) {
    $conn->on('message', function ($msg) use ($conn) {
        echo "Received: {$msg}\n";
        $conn->close();
    });

    $conn->send('MSG 1');
    $conn->send('MSG 2');
    $conn->send('MSG 3');
}, function ($e) {
    echo "Could not connect: {$e->getMessage()}\n";
});
?>
(Note that this depends on the libraries found here)
The problem is that I would like to open the connection, send messages, and close the connection as separate steps. In this example (which I've had difficulty adapting), open, send and close are all wrapped inside the then() call and its anonymous function, so I cannot call these methods separately. Ideally I'd like to open the connection at the beginning of my crontask, call the send method each time a message is logged, and close the connection at the end, without wasting time opening and closing a connection to my socket server for every single message. Please note that listening for replies isn't necessary.
Also, I'm happy to consider any solution that works against 127.0.0.1:9000 over WSS, whether it needs no libraries or uses a different one. Please also note (after seeing other posts) that this question specifically refers to a WebSocket client, not a server.
Many thanks,
James
Leaving this here in case anybody else finds the final solution useful:
In the end I wrapped a module called Textalk (textalk/websocket, by Fredrik Liljegren et al.) in a small class to make it more accessible, and this solved my issue.
Here is the code I used in the end:
require('vendor/autoload.php');

use WebSocket\Client;

class secureSocketClient {
    private $OClient;

    function __construct($VProtocol, $VLocation, $VPort, $VDir) {
        // Open the connection once, e.g. "wss://127.0.0.1:9000"
        $this->OClient = new Client("$VProtocol://$VLocation:$VPort" . ($VDir != null ? "/$VDir" : ""));
    }

    function sendMessage($ORequestData) {
        // Every call reuses the same open connection
        $VLocalMessage = json_encode($ORequestData);
        $this->OClient->send($VLocalMessage);
    }

    function __destruct() {
        // The connection is closed when the object goes out of scope
        $this->OClient->close();
    }
}
It can be invoked like so:
require_once <class location>
$this->OSecureSocketClient = new secureSocketClient("wss", "127.0.0.1", "9000", null);
$this->OSecureSocketClient->sendMessage($OMSG1);
$this->OSecureSocketClient->sendMessage($OMSG2);
$this->OSecureSocketClient->sendMessage($OMSG3);
To install textalk/websocket (on Linux), you can run the following command in the directory where the class will reside:
curl -sS https://getcomposer.org/installer | php
Add the following to composer.json (in the same directory):
{
    "require": {
        "textalk/websocket": "1.0.*"
    }
}
then run the following:
sudo php composer.phar install
Regards,
James

How to stop a running PHP process on a Heroku server?

I have written a Slack auto-message service on Heroku that automatically sends a message on certain events. However, I accidentally created an infinite loop and it keeps sending error messages to my Slack account. (The loop is caused by retrying on error; I forgot to add a counter to that.)
I have tried restarting the server with the command heroku restart (run in that directory, so the app name can be omitted), as well as git-pushing my corrected version, which should restart the server. I even turned off the server by scaling it to 0 dynos. None of these worked and I still keep receiving messages on Slack.
I am quite sure I have turned off my Heroku server, so I think there must be another way to stop the process. The PHP process uses a shell exec to trigger PhantomJS; not sure if that is related to the current problem. Does anyone have any suggestions?
P.S. As requested, this is my code, with important information hidden.
<?php
sendSlackMessageToChannel("Request received. Loading, please wait.");
startPhantomJS();

function startPhantomJS() {
    $return_value = bash_exec("path/to/phantomjs myscript.js");
    sendSlackMessageToMyself($return_value);
    if ($return_value == "error")
        startPhantomJS();
    else
        sendSlackMessageToChannel($return_value);
}
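For what it's worth, here is a minimal sketch of the retry counter the question says is missing. The sendSlackMessage* helpers are the poster's own hidden functions, shell_exec stands in for the hidden bash_exec wrapper, and the cap of three attempts is an arbitrary assumption:
<?php
sendSlackMessageToChannel("Request received. Loading, please wait.");
startPhantomJS();

function startPhantomJS($attempt = 1, $maxAttempts = 3) {
    // shell_exec used here in place of the poster's hidden bash_exec wrapper
    $return_value = shell_exec("path/to/phantomjs myscript.js");
    sendSlackMessageToMyself($return_value);

    if ($return_value == "error" && $attempt < $maxAttempts) {
        startPhantomJS($attempt + 1, $maxAttempts); // bounded retry
    } elseif ($return_value == "error") {
        sendSlackMessageToMyself("Giving up after {$maxAttempts} attempts.");
    } else {
        sendSlackMessageToChannel($return_value);
    }
}
On the Heroku side, heroku ps lists the running dynos, and heroku ps:stop <dyno> (or heroku ps:scale web=0 / worker=0) stops them; note that a one-off dyno started with heroku run:detached keeps running until it finishes or is stopped explicitly.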

How to connect to ESL from a remote server?

I would like to build a web interface in PHP to see FreeSWITCH activity (calls, etc.), possibly hosted on a different server than the one where FS is running.
I've checked the server status on the FS server itself using the command line (php single_command.php status), but now I would like to see this status from another server.
When I copy the ESL.php file to this remote server and try to check the status, I get this error message:
Fatal error: Call to undefined function new_ESLconnection() in
/var/www/freeswitch/ESL.php on line 127
This is my index.php file:
<?php
ini_set('display_errors', 1);

$password = "ClueCon";
$port = "8021";
$host = "192.168.2.12";

require_once('ESL.php');

set_time_limit(0); // Remove the PHP time limit of 30 seconds, since we loop watching events

// Connect to FreeSWITCH
$sock = new ESLconnection($host, $port, $password);

// We want all Events (probably will want to change this depending on your needs)
$sock->sendRecv("status");

// Grab Events until process is killed
while ($sock->connected()) {
    $event = $sock->recvEvent();
    print_r($event->serialize());
}
?>
I understand that the web server doesn't have FreeSWITCH installed, so the error message is to be expected, but I don't see how to access this information from the web server.
Thank you for your help.
Depending on your needs you can use either an inbound or an outbound socket. I do not know much about PHP and the FS event socket, but I have experimented plenty with Python. I highly recommend going through this link.
If you just want to do small tasks like initiating a call or bridging two given numbers, I think you should go with an inbound socket (sending CLI commands from your web server to the FreeSWITCH server) or mod_xml_rpc.
If you want full control of everything that happens on the FS server, such as showing live call status and modifying call states, or a complete interactive telephony dashboard, then you should go with an outbound socket (your FS server will send all events to your web server).
However, in your case I think the problem is that you did not properly build the PHP ESL module.
This link might help you install ESL.
Rather than using ESL, you might want to consider using XML-RPC. The connection is very straightforward:
https://wiki.freeswitch.org/wiki/Freeswitch_XML-RPC
The credentials for XML-RPC are in your autoload_configs/xml_rpc.conf.xml.
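A minimal sketch of querying FreeSWITCH from PHP over mod_xml_rpc with cURL (this assumes mod_xml_rpc is enabled on the FS box, the default port 8080, and the auth-user/auth-pass values from xml_rpc.conf.xml; adjust these to your own config):
<?php
$host = '192.168.2.12';   // the FreeSWITCH server
$port = 8080;             // mod_xml_rpc default port
$user = 'freeswitch';     // <param name="auth-user" .../> in xml_rpc.conf.xml
$pass = 'works';          // <param name="auth-pass" .../> in xml_rpc.conf.xml

// mod_xml_rpc also exposes plain HTTP endpoints such as /webapi/<api command>
$ch = curl_init("http://$host:$port/webapi/status");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");

$status = curl_exec($ch);
curl_close($ch);

echo $status; // same output as running "status" in fs_cli
?>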

Gearman return value not working

I am trying to get Gearman to return a value to PHP from a function after PHP makes a request to it using $gmclient->do("somefunction", "somedata"). However, the PHP client simply times out. The exact code I am using is straight from the PHP manual: I am using example #1 from http://docs.php.net/manual/en/gearmanclient.do.php
The browser gives me this message:
This webpage is not available.
The webpage at http://yoursite.com/client.php might be temporarily down or it may have moved permanently to a new web address.
More information on this error. Below is the original error message:
Error 324 (net::ERR_EMPTY_RESPONSE): Unknown error.
The browser is Chrome, if that helps to interpret the error message.
In case it makes a difference, the worker.php file is being executed in a terminal window using the command php worker.php. I am running Ubuntu 9.10 Karmic Koala. I installed Gearman using the directions found at http://blog.stuartherbert.com/php/2010/02/26/getting-gearman-up-and-running-on-ubuntu-karmic/
I checked the terminal window: Gearman is getting the request and echoes the result into the terminal - it is just not sent back to the client.
The end goal is to get Gearman to return the function's return value to the client and display that value to the user.
UPDATE:
As requested, the code is below:
worker.php (the worker)
<?php
echo "Starting\n";

# Create our worker object.
$gmworker = new GearmanWorker();

# Add default server (localhost).
$gmworker->addServer();

# Register function "reverse" with the server. Change the worker function to
# "reverse_fn_fast" for a faster worker with no output.
$gmworker->addFunction("reverse", "reverse_fn");

print "Waiting for job...\n";
while ($gmworker->work())
{
    if ($gmworker->returnCode() != GEARMAN_SUCCESS)
    {
        echo "return_code: " . $gmworker->returnCode() . "\n";
        break;
    }
}

function reverse_fn($job)
{
    return strrev($job->workload());
}
?>
client.php (client code - this is the page I am loading in the browser)
<?php
# Client code
echo "Starting\n";
# Create our client object.
$gmclient= new GearmanClient();
# Add default server (localhost).
$gmclient->addServer();
echo "Sending job\n";
$result = $gmclient->do("reverse", "Hello!");
echo "Success: $result\n";
?>
Regarding the comments below where I said it was working: I repeat, it was NOT working. It only appeared to work because I changed $gmclient->do to $gmclient->doBackground, which outputs the job ID, not the actual result from the function.
FINAL UPDATE (WITH SOLUTION)
After some work, I've figured out that it was not a coding error; Gearman was improperly installed. Instead of using apt-get install, I decided to do things manually. I downloaded gearmand (the C server) from the Gearman site (http://gearman.org/index.php?id=download). I then followed the tutorials on the Gearman site as well, starting with http://gearman.org/index.php?id=getting_started and then http://gearman.org/index.php?id=gearman_php_extension
Change your code to use GearmanClient::addTask. You can use its return value to implement a monitoring process. Also check the other functions available on the GearmanTask class.
//Wrong
$gmworker->addServer();
//Correct
$gmworker->addServer("localhost",4730);
