I'm taking my first steps with Redis on Laravel and I've noticed something odd.
When using Redis as the cache driver in my setup, pages take far too much time to load.
How do I know? When I use the Redis facade directly instead of the Cache facade, response times are just a fraction of that. I set up a Laravel installation from scratch and built a migration and seeder for a simple Article model.
First I thought the items were not stored in Redis at all, as redis-cli didn't show them when searching with KEYS *. Then I figured out the cache is stored in a different DB, selected by REDIS_CACHE_DB as found in config/database.php.
INFO keyspace in redis-cli lists those two DBs, named 0 and 1.
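To see the cached keys yourself, select that DB first in redis-cli (assuming the cache DB is 1, as in the default config):
redis-cli
127.0.0.1:6379> SELECT 1
OK
127.0.0.1:6379[1]> KEYS laravel_cache:*
1) "laravel_cache:posts"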
I thought the problem could be caused by my localhost setup with MAMP Pro, so I switched over to the Laravel Homestead box and uploaded my project there. Same result.
Here's the code I'm using:
routes/web.php
use Illuminate\Support\Facades\Redis;
use Illuminate\Support\Facades\Cache;
use Illuminate\Http\Request;
use App\Article;
Route::get('/get-articles-mysql', function (Request $request) {
    return response()->json(Article::take(20000)->get());
});
Route::get('/get-articles-cache', function (Request $request) {
    return Cache::remember('posts', 60, function () {
        return Article::take(20000)->get();
    });
});
Route::get('/get-articles-redis', function (Request $request) {
    if ($posts = Redis::get('posts.all')) {
        return response()->json(json_decode($posts));
    }
    $posts = Article::take(20000)->get();
    Redis::set('posts.all', $posts->toJson()); // reuse the fetched collection instead of querying twice
    return response()->json($posts);
});
I'm using Postman to measure the response times. I made several runs, since the caching routes are expected to be slow on the first request while the cache is still empty. But what I get on average is this:
http://laravel-echo.local/get-articles-mysql 583ms
http://laravel-echo.local/get-articles-redis 62ms
http://laravel-echo.local/get-articles-cache 730ms
I'm not getting this. Using the Redis facade directly is super fast, but why is caching so slow? Yes, I double-checked my .env files. CACHE_DRIVER=redis is there, so I'm not using the file system by accident. And I used both php artisan config:clear and php artisan cache:clear to avoid mistakes while debugging.
I see a key called "laravel_cache:posts" in redis-cli, so the cached posts are there. It just takes ages to load them. I also tested the requests in Chrome; the response times are much longer overall, but caching still takes longer than plain MySQL querying.
So, any suggestions as to what could be going on here?
I know this thread is already very old, but I am still getting the same issue.
I am using Laragon for local development and Redis makes my API requests 4x slower.
EDIT:
OMFG... I just found the problem.
In my .env file I had "REDIS_HOST=localhost", and that was exactly the problem.
After I changed it to "REDIS_HOST=127.0.0.1", everything ran fast. Most likely "localhost" goes through hostname resolution (and on some stacks an IPv6 attempt first) on every new connection, while 127.0.0.1 skips resolution entirely.
Try it and let me know.
use Symfony\Contracts\Cache\ItemInterface;
use Symfony\Component\Cache\Adapter\FilesystemAdapter;
$cache = new FilesystemAdapter();
$value = $cache->get('my_cache_key', function (ItemInterface $item) {
    $item->expiresAfter(3600);
    // ... do some HTTP request or heavy computations
    $computedValue = 'foobar';
    return $computedValue;
});
I use Symfony 5.4 and the Cache Contracts in an application, and some cache expirations are quite long. My problem is that some values need to change, and to do it properly I would need to purge the cache from the command line on my production server, to be sure to have correct data.
I can make a custom command, e.g. php bin/console app:cache:custom-clear, that invalidates some tags, but I'm surprised there is no native command to do this cache purge globally.
Maybe it's simple and I'm just missing something, but I don't see much in the docs on this point.
If anyone has a lead, I'm interested.
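For reference, the kind of custom command I mean would look roughly like this (a minimal sketch: the command name and tags are placeholders, and it assumes a tag-aware cache pool is wired up):
// src/Command/CustomCacheClearCommand.php
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Contracts\Cache\TagAwareCacheInterface;

class CustomCacheClearCommand extends Command
{
    protected static $defaultName = 'app:cache:custom-clear';

    private TagAwareCacheInterface $cache;

    public function __construct(TagAwareCacheInterface $cache)
    {
        $this->cache = $cache;
        parent::__construct();
    }

    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        // Invalidate every cache entry saved with these (hypothetical) tags.
        $this->cache->invalidateTags(['products', 'categories']);
        $output->writeln('Tagged cache entries purged.');
        return Command::SUCCESS;
    }
}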
Octane Version: 1.0.8
Laravel Version: 8.50.0
PHP Version: 8.0
Server & Version: Swoole 4.6.7
Database Driver & Version: MySQL 8.0.25
Everything works as expected when using Redis, for example:
cache()->store('redis')->remember("test_key", now()->addMinutes(), fn() => 'test_value');
The Cache::remember() method does not store the value when using the Laravel Octane cache (it returns null):
cache()->store('octane')->remember("test_key", now()->addMinutes(), fn() => 'test_value');
I did some more tests and it seems the Octane store is not persistent. If I put and then get immediately, I receive the value; if I put and then refresh the page, the value is null. This happens only with the Octane driver; the Redis store works fine.
cache()->store('octane')->put("test_key", 'test_value', now()->addMinutes());
cache()->store('octane')->get("test_key"); // returns null
Redis works as expected.
cache()->store('redis')->put("test_key", 'test_value', now()->addMinutes());
cache()->store('redis')->get("test_key"); // returns test_value
I just ran into the same issue. It seems the "octane" cache does not work if you try to use it from the console. When I use it inside my controller (or anywhere the process starts from a web request) it works fine.
I think if you run any command of your application, the OS runs it as a separate process. In that case you can't see the Swoole cache (because it lives in a different process) and also can't use SwooleTable (Exception: Tables may only be accessed when using the Swoole server.)
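A quick way to see this (a sketch: the route path is mine) is that the same calls behave differently depending on which process runs them:
// routes/web.php: runs inside the Swoole server process,
// so the Octane cache (a Swoole table) is visible.
Route::get('/octane-cache-test', function () {
    cache()->store('octane')->put('test_key', 'test_value', now()->addMinutes());
    return cache()->store('octane')->get('test_key'); // "test_value"
});

// php artisan tinker: a separate OS process, so the same
// lookup cannot reach the Swoole table.
// cache()->store('octane')->get('test_key'); // null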
I'm building a web app that retrieves dynamically generated content through Puppeteer. I have set up (Apache + PHP) Docker containers: one for the p5js project that generates an SVG based on a (large, 2MB) JSON file, and one container with PHP that retrieves that SVG. Docker runs behind an Nginx config (Nginx for routing, Apache for quicker PHP handling). I'm using the cheapest CentOS server available on DigitalOcean, so upgrading would definitely help.
I don't want the javascript in the p5js project to be exposed to the public, so I thought a nodejs solution would be best in this scenario.
The PHP page does a shell_exec("node pup.js"). It runs in approx. 1-3 seconds, which is perfect.
The problem is that when I test a multi-user scenario and open 5 tabs running this PHP page, the load time climbs to 10+ seconds, which is a killer for my app.
So the question is how to set up this architecture (PHP calling a node command) for a multi-user environment.
===
I've tried several frameworks like x-ray, nightmare, jsdom, cheerio, axios, zombie, and phantom, just trying to replace Puppeteer. Some of the frameworks returned nothing, and some just didn't work out for me. I think I simply need a headless browser solution to be able to execute the p5js. In the end Puppeteer gets the job done, just not in a multi-user environment (I think due to my current PHP shell_exec Puppeteer architecture).
Maybe my shell_exec workflow was the bottleneck, so I built a simple node example.js which waits 5 seconds before finishing (not using Puppeteer), and I ran it in several tabs simultaneously. Works like a charm: all tabs load in about 5-6 seconds.
I've also tried pm2 to test whether my node command was the bottleneck. I did some testing on the command line with no notable results, and I couldn't get PHP to run a pm2 command, so I dropped that test.
I've tried setting PuPHPeteer up, but couldn't get it to run.
At some point I thought it had something to do with launching multiple Puppeteer browsers, but I've read that this should be no problem.
The PHP looks like:
<?php
$puppeteer_command = "node /var/www/pup.js 2>&1"; // redirect stderr into stdout so node errors show up in $result
$result = shell_exec($puppeteer_command);
echo $result;
?>
My puppeteer code:
const puppeteer = require('puppeteer');

let url = "http://the-other-dockercontainer/";
let time = Date.now();

let scrape = async () => {
    const browser = await puppeteer.launch({
        args: ['--no-sandbox']
    });
    const page = await browser.newPage();
    await page.goto(url);
    await page.waitForSelector('svg', { timeout: 5000 });
    let svgImage = await page.$('svg');
    await svgImage.screenshot({
        path: `${time}.png`,
        omitBackground: true,
    });
    await browser.close();
    return time;
};

scrape().then((value) => {
    console.log(value); // Success!
});
I was thinking about building the entire app in nodejs if that is the best solution, but I've put so many hours into this PHP infrastructure that I'd really like to get some advice first :)
Since I have full control over the target and destination sites, one brainfart would be to have node run a server which accepts a JSON file and returns the SVG based on a local p5js site, but I don't know (yet) if this would be any different.
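Building on that idea, the PHP side would stop shelling out entirely and just make an HTTP call to the long-running node service (a sketch: the port, endpoint, and file path are made up):
<?php
// Hypothetical long-running node/puppeteer service that keeps one
// browser open and renders an SVG per request.
$payload = file_get_contents('/var/www/data/input.json'); // the 2MB JSON
$ch = curl_init('http://localhost:3000/render');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $payload,
    CURLOPT_HTTPHEADER => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
$svg = curl_exec($ch);
curl_close($ch);
echo $svg;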
UPDATE
So, thanks to some comments, I've tried a new approach: not using p5js, but native Processing code (Java). I've exported the Processing code to a Linux 64-bit application and created this little nodejs example:
var exec = require('child_process').exec;
var cmd = '/var/www/application.linux64/minimal';

exec(cmd, processing);

// Callback for the command line process
function processing(error, stdout, stderr) {
    // I could do some error checking here
    console.log(stdout);
}
When I call this node example.js within a shell_exec in PHP, the first call takes about 2 seconds. But when I hit a lot of refreshes, the time builds up again by many seconds. So clearly my understanding of multithreading is not that good, or am I missing something crucial in my testing?
I have a setup similar to yours. The biggest problem is your droplet. You should use at minimum the cheapest CPU-optimized droplet ($40 per month, from memory). The cheapest generic droplets on DO sit in a shared environment (noisy neighbours create performance fluctuations). You can easily test this by making a snapshot of your server and cloning your droplet.
Next, as someone else suggested, reduce the cold starts. On my server a cold start adds around 2 extra seconds. I reuse the same browser for 10 screenshots before opening a new one; anything more than that and you may run into memory issues.
If you are trying to use Puppeteer with PHP, first make sure you have Composer installed, then open a terminal inside the project folder and run composer require nesk/puphpeteer, and install nesk/rialto as well; then require the autoloader and everything should work. When porting Puppeteer code to PHP: where you would assign a const, use a $ variable, skip the await keyword, and replace "." with "->".
<?php
require 'vendor/autoload.php';
use Nesk\Puphpeteer\Puppeteer;
use Nesk\Rialto\Data\JsFunction;
$puppeteer = new Puppeteer;
$browser = $puppeteer->launch([
    'headless' => false,
    'args' => ['--proxy-server=000.00.0.0:80'], // Chromium flags like the proxy go in "args"
]);
$bot = $browser->newPage();
$bot->goto('https://google.com'); // goes to google.com
$bot->waitForTimeout(3000); // waits
$bot->type('input[name="q"]', 'cellphonemega'); // types into the search box
$bot->keyboard->press('Enter'); // presses Enter
$bot->waitForTimeout(8000); // waits for the results
$bot->click('h3'); // clicks the first result heading
$bot->waitForTimeout(8000); // waits while the site loads
$data = $bot->evaluate(JsFunction::createWithBody('return document.documentElement.innerHTML')); // grabs the page HTML
$urlPath = $bot->url(); // current URL
$bot->screenshot(['path' => 'screenshot.png']); // TAKES SCREENSHOT!
$browser->close(); // shuts down
I have been having problems configuring Kafka and Redis together in Laravel.
I am able to run Redis as the in-memory database, so Redis works fine:
$redis = app()->make('redis');
return $redis->get('name1'); // runs fine, returning the value of "name1"
I am able to configure Kafka on my Windows system, where I can produce and consume messages in terminals. I also successfully configured rdkafka as the PHP client library, along with its extensions.
The packages I am using in Laravel for Kafka are "superbalist/laravel-pubsub": "^3.0" and "superbalist/php-pubsub-kafka": "^2.0".
The code below subscribes and consumes the messages:
$pubsub = app('pubsub');
$pubsub->subscribe('test1', function ($message) {
    var_dump($message); // the code just gets stuck here
});
The browser just keeps loading and won't stop. I tried to look into the code in the vendor directory, but I couldn't make sense of the response.
My .env, as requested by the package:
REDIS_HOST=localhost
REDIS_PASSWORD=null
REDIS_PORT=6379
PUBSUB_CONNECTION=redis
KAFKA_BROKERS=localhost
GOOGLE_CLOUD_PROJECT_ID=your-project-id-here
GOOGLE_CLOUD_KEY_FILE=path/to/your/gcloud-key.json
HTTP_PUBSUB_URI=null
HTTP_PUBSUB_SUBSCRIBE_CONNECTION=redis
If the local Redis server and client terminals are closed, the error I get is:
Error while reading line from the server [tcp://localhost:9092]
Please let me know if anyone has been able to configure them both in Laravel.
The call to the subscribe() method is blocking, which means the script will never finish; hence your browser never stops loading.
The PHP script containing the call to subscribe() needs to be run from the CLI, not the browser, because that code consumes Kafka messages and needs to stay alive permanently. If you want to publish messages to Kafka, use the publish() method instead.
From the documentation:
// consume messages
// note: this is a blocking call
$adapter->subscribe('my_channel', function ($message) {
    var_dump($message);
});
// publish messages
$adapter->publish('my_channel', 'HELLO WORLD');
$adapter->publish('my_channel', ['hello' => 'world']);
$adapter->publish('my_channel', 1);
$adapter->publish('my_channel', false);
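In Laravel, one way to keep the consumer on the CLI is to wrap it in an artisan command and run it with php artisan kafka:consume (a minimal sketch: the command and channel names are mine, and it assumes the same pubsub binding as above):
use Illuminate\Console\Command;

class ConsumeKafkaMessages extends Command
{
    protected $signature = 'kafka:consume';
    protected $description = 'Consume Kafka messages (blocking call)';

    public function handle()
    {
        // subscribe() blocks forever, so keep this process alive
        // with something like supervisor rather than a web request.
        app('pubsub')->subscribe('test1', function ($message) {
            $this->info(var_export($message, true));
        });
    }
}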
I have noticed that when my laptop is connected to the internet, my PHPUnit tests take between ~90 and ~200 seconds to finish. But when I disconnect it from the internet, they run in less than 20 seconds!! That makes me happy and sad at the same time!
In both cases all the tests pass, and I'm sure I'm mocking every request to external APIs.
I'm using Laravel, with MySQL for real data storage and in-memory SQLite for the test environment. My development environment all runs on Docker.
Is this something related to PHPUnit or to my code? Does anyone have an idea what's going on? Thanks.
More Info
The domain I'm using is something.dev and my APIs use api.something.dev. Every test makes at least one call to each API endpoint.
DNS!
If you think this is due to DNS lookups: I changed the domain and all subdomains to 127.0.0.1 just to test it, and it didn't help; the tests are still slow. Shouldn't this eliminate the possibility of DNS lookups?
In addition, I tried mocking DNS using the PHPUnit Bridge, but I guess I couldn't make it work due to the lack of documentation: I didn't know what to pass as a parameter to DnsMock::withMockedHosts([here!!]) after calling it from my setUp() function.
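For reference, the documented shape of that parameter is a map of hostname to fake DNS records; a sketch using the domains from above (note DnsMock only intercepts PHP's own DNS functions such as gethostbyname()):
use Symfony\Bridge\PhpUnit\DnsMock;

protected function setUp(): void
{
    parent::setUp();
    // Each hostname maps to a list of fake DNS records.
    DnsMock::withMockedHosts([
        'something.dev' => [['type' => 'A', 'ip' => '127.0.0.1']],
        'api.something.dev' => [['type' => 'A', 'ip' => '127.0.0.1']],
    ]);
}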
Something else
I think the problem is related to the data storage, because the delay happens before and after querying the database, mostly when storing data.
Wow, that wasn't expected. It turns out my tests are slow because of the image() function provided by the PHP Faker package: $faker->image().
I was using it in one of my factories to prepare a fake image for the DB. I didn't know it literally downloads images and stores them in a folder like /private/var/folders/51/5ybn3kjn8f332jfrsx7nmam00000gn/T/.
I was able to find that by monitoring what the PHP process was doing while the tests were running: it had an open .jpg file in that directory. So I looked through my code for anything related to images and discovered this, after about 6 hours of debugging. Happy coding :)
Never use $faker->image('storage/app', 640, 480, null, false) in a factory: it is very time-consuming.
Transform
$factory->define(Model::class, function (Faker $faker) {
    return [
        'name' => $faker->name,
        'description' => $faker->text,
        'image' => $faker->image('storage/app', 640, 480, null, false),
    ];
});
into
$factory->define(Model::class, function (Faker $faker) {
    return [
        'name' => $faker->name,
        'description' => $faker->text,
        'image' => 'test.png',
    ];
});
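If a factory really needs an image file on disk, one network-free alternative is Laravel's fake upload helper, which generates the PNG locally via GD (a sketch: the file name and stored path are placeholders):
use Illuminate\Http\UploadedFile;

$factory->define(Model::class, function (Faker $faker) {
    // Creates a real PNG in a temp dir using GD, with no HTTP download,
    // unlike $faker->image() which fetches from a remote placeholder service.
    $image = UploadedFile::fake()->image('test.png', 640, 480);
    return [
        'name' => $faker->name,
        'description' => $faker->text,
        'image' => $image->store('images'), // hypothetical disk path
    ];
});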