I am using Redis as the session driver and I want to clear the cache while keeping the session data, so basically users stay logged in. Any suggestions on restructuring or handling this situation?
Note: I don't want to use a separate Redis instance for sessions and other cache data.
Intro
By default, Redis gives you 16 separate databases, but out of the box Laravel will try to use database 0 for both sessions and cache.
Our solution is to keep the cache on database 0 and use database 1 for sessions, thereby solving the problem of sessions being cleared by running php artisan cache:clear.
1. Setting up Session Redis connection
Modify config/database.php and add a session key to the redis option:
'redis' => [
    'cluster' => false,
    'default' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
    'session' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 1,
    ],
],
2. Make use of the session connection
Modify config/session.php, change the following:
'connection' => null,
to:
'connection' => 'session',
3. Using Redis as session driver
Modify .env, change SESSION_DRIVER:
SESSION_DRIVER=redis
4. Testing out
Execute the following artisan command, then check your login state:
php artisan cache:clear
If the login state persists, voilà!
I don't know Laravel, but in general the best two options would be:
Change the format of the cache keys. Use versioned cache keys, e.g. prefix them with "cache.1.", so that incrementing the version makes all existing keys irrelevant at once (see the sketch after this list).
Move the cache to a different db number in the same Redis instance. That way you can also later run FLUSHDB on that db number to clear the cache.
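A minimal sketch of the versioned-key idea, assuming Laravel's Cache and Redis facades; the cache:version counter key and the versionedKey() helper are made up for illustration:

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Redis;

// Build a key that embeds the current cache version, e.g. "cache.1.users.count".
function versionedKey($key)
{
    $version = Redis::connection('default')->get('cache:version') ?: 1;

    return "cache.{$version}.{$key}";
}

// Normal reads and writes go through the versioned key.
Cache::put(versionedKey('users.count'), 42, 600); // TTL units depend on your Laravel version
$count = Cache::get(versionedKey('users.count'));

// "Clearing" the cache is just bumping the version; old keys become unreachable.
Redis::connection('default')->incr('cache:version');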
With either option, if the cache keys are not time-expiring, you should afterwards create a script that uses SCAN to remove the old keys. See http://redis.io/commands/scan
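A rough sketch of such a cleanup script, assuming the Predis client (whose scan() takes a cursor plus a MATCH/COUNT options array and returns the new cursor and a batch of keys) and a hypothetical old-key prefix of cache.0.*:

use Illuminate\Support\Facades\Redis;

$redis = Redis::connection('default');
$cursor = 0;

do {
    // SCAN walks the keyspace in small batches, so Redis is never blocked the way KEYS would block it.
    list($cursor, $keys) = $redis->scan($cursor, ['MATCH' => 'cache.0.*', 'COUNT' => 1000]);

    if (! empty($keys)) {
        $redis->del($keys);
    }
} while ((int) $cursor !== 0);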
As a side note, it's usually a bad idea to keep the cache and other things in the same Redis instance, since caches usually rely on LRU-based eviction and you don't want to mix that with less volatile keys.
https://laravel.com/docs/5.2/redis#configuration
'redis' => [
    'cluster' => false,
    'default' => [
        'host' => '127.0.0.1',
        'port' => 6379,
        'database' => 0,
    ],
],
There is a 'database' key in the Redis connection options; just select different databases for the session and the cache. I just hope that the Redis cache driver uses FLUSHDB, not FLUSHALL, for flushing :).
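For completeness, the cache side of that choice lives in config/cache.php, where the redis store names the connection (and therefore the database number) it uses; in a stock Laravel install it points at 'default':

'stores' => [
    'redis' => [
        'driver' => 'redis',
        'connection' => 'default', // the connection from config/database.php that holds the cache database
    ],
],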
Laravel Cache::clear() sends the Redis flushall command, which will dump everything, so it's not very useful in my experience. You will need to extend the cache class and create a custom set to index the cache data you want to be able to clear. Then build another function that reads the set and issues a Redis del() for each key in the set. Post some working code and I will detail further if necessary.
Notagolfer's suggestion of separating cache and session into different Redis databases isn't a bad call, but you will still need to extend the cache class to implement the Redis database config switch.
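Since no working code has been posted yet, here is only a rough sketch of the indexing idea, written as standalone helpers rather than a full cache-class extension; the cache:index set key and the helper names are invented for illustration:

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Redis;

// Store a value and remember its key in a Redis set so it can be cleared selectively later.
function cachePutIndexed($key, $value, $ttl)
{
    Cache::put($key, $value, $ttl); // TTL units depend on the Laravel version (minutes before 5.8, seconds after)
    Redis::connection('default')->sadd('cache:index', $key);
}

// Delete only the keys recorded in the index set, leaving sessions and everything else alone.
function cacheFlushIndexed()
{
    $redis = Redis::connection('default');

    foreach ($redis->smembers('cache:index') as $key) {
        Cache::forget($key);
    }

    $redis->del('cache:index');
}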
For Laravel 9 (it is similar for older versions):
1. Update the session configuration
Put this in .env:
SESSION_CONNECTION=session
That configuration is then loaded in config/session.php:
'connection' => env('SESSION_CONNECTION'),
That tells Laravel to store sessions in a separate Redis connection/database.
2. Update the Redis configuration
Add a new Redis connection for sessions in config/database.php:
'redis' => [
    ...
    'session' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 1,
    ],
    ...
]
You can use any database number from 0 to 15 (Redis ships with 16 databases by default). Use a different database number for each connection.
3. Clear the cache (without touching sessions)
Now you can clear the Redis cache with this piece of code:
$redisConnection = \Illuminate\Support\Facades\Redis::connection('default');
$redisConnection->flushDB();
TIP: You can put that code in a custom command and run it with artisan.
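A minimal sketch of such a command (the class name and signature are made up; in Laravel 9 anything placed in app/Console/Commands is picked up automatically by the default console kernel):

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Redis;

class FlushCacheDatabase extends Command
{
    // Run it with: php artisan redis:flush-cache
    protected $signature = 'redis:flush-cache';

    protected $description = 'Flush only the Redis database used by the cache';

    public function handle()
    {
        // Flushes the cache connection's database; the session database is untouched.
        Redis::connection('default')->flushDB();

        $this->info('Cache database flushed.');

        return self::SUCCESS;
    }
}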
BONUS: You can inspect and clear all Redis data for each connection with:
$redisSession = \Illuminate\Support\Facades\Redis::connection('session');//session|queue|default
$redisSessionKeys = $redisSession->keys('*');
$redisSession->flushDB();
dd($redisSessionKeys);
BONUS 2: You can also add a Redis database for queues; then queue jobs, sessions, and cache are separated and you can clear only one of them:
'redis' => [
    ...
    'default' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
    'session' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 1,
    ],
    'queue' => [
        'host' => env('REDIS_HOST', 'localhost'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 2,
    ],
]
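For the queue entry to actually be used, the redis queue driver in config/queue.php also has to name that connection (shown here with the rest of a stock entry, which may differ slightly between Laravel versions):

// config/queue.php
'connections' => [
    'redis' => [
        'driver' => 'redis',
        'connection' => 'queue', // the dedicated Redis connection/database from config/database.php
        'queue' => env('REDIS_QUEUE', 'default'),
        'retry_after' => 90,
    ],
],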
Related
I am trying to connect a Laravel application to a remote database. The connection works successfully on my local machine but does not work in the production environment. The Laravel application is connected to its own database hosted in the cloud, but I want to access another database hosted by another provider. Hence, I did the following:
In the config/database.php file of the Laravel project, I added another MySQL connection
and configured the .env file as well.
'mysql' => [
    'driver' => 'mysql',
    'host' => env('DB_HOST', '127.0.0.1'),
    'port' => env('DB_PORT', '3306'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'unix_socket' => env('DB_SOCKET', ''),
    'charset' => 'utf8mb4',
    'collation' => 'utf8mb4_unicode_ci',
    'prefix' => '',
    'strict' => false,
    'engine' => null,
],

'mysql3' => [
    'driver' => 'mysql',
    'host' => env('DB_HOST_THIRD', 'localhost'),
    'port' => env('DB_PORT_THIRD'),
    'database' => env('DB_DATABASE_THIRD', ''),
    'username' => env('DB_USERNAME_THIRD', ''),
    'password' => env('DB_PASSWORD_THIRD', ''),
],
Then in .env file:
DB_CONNECTION=mysql3
DB_HOST_THIRD=remote IP
DB_PORT_THIRD=3306
DB_DATABASE_THIRD=Remote DB Name
DB_USERNAME_THIRD=Username
DB_PASSWORD_THIRD=password
With the above settings, I can connect to the remote database successfully on my local machine.
Upon deployment, I am unable to connect to this remote database, even though a designated user has been created and access granted through the control panel. This remote database is hosted by GoDaddy; I have called them, but they said nothing is blocking my application and everything is fine on their end. Below is what I have done:
I am able to connect to the remote database successfully using SQLyog and the MySQL client.
I connected to the terminal in the remote cPanel and tried to grant all access to the user at its IP, but the response is: "Access denied for user 'username#Remote IP' (using password: YES)".
From the web application, the error is: "SQLSTATE[HY000] [2002] Connection refused".
I have telnetted to the remote IP on port 3306, which responds fine. Can someone help me? Perhaps there is something I am missing, or is this kind of issue peculiar to GoDaddy shared hosting? I have checked the following on Stack Overflow:
Enable remote MySQL connection: ERROR 1045 (28000): Access denied for user
I have a Heroku app where I'm hosting my Laravel app. I started the development initially with MySQL, so I wanted to continue doing so using Amazon's RDS service. I create the instance there and managed to successfully connect via my MySQL client, the console etc.
The problem is that the Laravel app can't connect to the database, despite my numerous desperate attempts to fix it. I found some articles suggesting that use of the DATABASE_URL environment variable is mandatory, so I added it via the Heroku app settings. It looks like so:
mysql://myusername:mypass#myhostnamefromamazon/mydb?sslca=/app/storage/certs/amazon-rds-ca-cert.pem
I found this solution on Heroku's website. I have placed the amazon-rds-ca-cert.pem file on my Laravel's storage folder, like so: /app/storage/certs/amazon-rds-ca-cert.pem
This didn't solve my issue, so I kept looking and found a Stack Overflow question describing this issue on Lumen. I adjusted my config/database.php according to the answer, but it's still not working for me!
<?php

$credentials = get_db_credentials();

$config = [
    'default' => env('DB_CONNECTION', 'mysql'),
    'connections' => [
        'mysql' => [
            'driver' => 'mysql',
            'host' => env('DB_HOST', $credentials->host),
            'port' => env('DB_PORT', '3306'),
            'database' => env('DB_DATABASE', $credentials->database),
            'username' => env('DB_USERNAME', $credentials->username),
            'password' => env('DB_PASSWORD', $credentials->password),
            'unix_socket' => env('DB_SOCKET', ''),
            'charset' => 'utf8mb4',
            'collation' => 'utf8mb4_unicode_ci',
            'prefix' => '',
            'strict' => true,
            'engine' => null,
        ],
    ],
];

if (env('APP_ENV') == 'production') {
    $config['connections']['mysql']['options'] = [PDO::MYSQL_ATTR_SSL_CA => '../storage/certs/amazon-rds-ca-cert.pem'];
}

return $config;
The get_db_credentials() function simply parses the DATABASE_URL environment variable.
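The asker's implementation isn't shown, but such a helper might look roughly like this, using parse_url() on the DATABASE_URL value (the property names are chosen to match the config above):

// Hypothetical stand-in for get_db_credentials(): pull host, database and
// credentials out of a URL like mysql://user:pass@host/dbname?sslca=...
function get_db_credentials()
{
    $parts = parse_url(getenv('DATABASE_URL'));

    return (object) [
        'host'     => isset($parts['host']) ? $parts['host'] : '127.0.0.1',
        'database' => isset($parts['path']) ? ltrim($parts['path'], '/') : '',
        'username' => isset($parts['user']) ? $parts['user'] : '',
        'password' => isset($parts['pass']) ? $parts['pass'] : '',
    ];
}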
The exact exception that I get is:
[2018-10-25 19:32:16] production.ERROR: SQLSTATE[HY000] [2002] Connection timed out {"exception":"[object] (Doctrine\\DBAL\\Driver\\PDOException(code: 2002): SQLSTATE[HY000] [2002] Connection timed out at /tmp/build_05920c42a6de0a378402b798320d3f04/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/PDOConnection.php:50
I'm totally lost on this and unsure how to proceed.
Your AWS Security Groups must permit traffic from Heroku's IP address range.
https://devcenter.heroku.com/articles/amazon-rds
You must grant Heroku dynos access to your RDS instance. The recommended way to do this is to configure the RDS instance to only accept SSL-encrypted connections from authorized users, and to configure the security group for your instance to permit ingress from all IPs, e.g. 0.0.0.0/0.
Hi, the following is the error I am getting:
PDOException: SQLSTATE[HY000]: General error: 20003 Adaptive Server connection timed out [20003] (severity
6) [(null)] in /var/www/html/web/vendor/laravel/framework/src/Illuminate/Database/Connection.php:335
I am trying to connect to MS SQL Server on Windows from a Laravel 5.2 code snippet on Linux.
The firewall is disabled on the Windows machine.
I am able to telnet to the Windows IP on the default MS SQL port.
I am not using FreeTDS, so this is not a duplicate of the FreeTDS question, and that question is not answered anyway.
CentOS 7, able to ping the IP; there is no connection issue.
I am running the script from the terminal, so there should be no timeout issue.
Initially I was trying to fetch 5k records and thought the query was taking too long, but even after reducing the limit to 100 I get the same error.
My network connection is fast and the server hardware is well specified.
I have tried all the solutions given at the link below and they succeed, but when my script runs I face the above issue:
https://blogs.msdn.microsoft.com/sql_protocols/2008/04/30/steps-to-troubleshoot-sql-connectivity-issues/
My script runs well locally, but I face the issue when I promote the code to dev.
The problem seems to be related to the execution time of your query.
I was having this same issue and it was solved after I changed the timeout settings in the config/database.php file.
Use the code below:
'options' => [
    PDO::ATTR_TIMEOUT => 300, // up to 5 minutes
],
The complete config should be:
'sqlsrv' => [
    'driver' => 'sqlsrv',
    'host' => env('DB_HOST', 'localhost'),
    'port' => env('DB_PORT', '3306'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'prefix' => '',
    'options' => [
        PDO::ATTR_TIMEOUT => 300,
    ],
],
I want to use a database (currently MySQL) to store sessions, but when I tried to configure it, the sessions table still has no records.
In file config/session.php when I change 'driver' => env('SESSION_DRIVER', 'file'),
to 'driver' => env('SESSION_DRIVER', 'database'),
or 'driver' => env('SESSION_DRIVER', 'redis'),
What should I change for 'connection' => null,?
I tried:
'connection' => 'mysql', where mysql is configured in app/database.php
'connection' => 'database.connections.mysql',
'connection' => Config::get('database.connections.mysql'), which gave a class-not-found error
All you need to do is specify the right driver in the environment file, and generate the schema as outlined in the documentation.
This line:
'driver' => env('SESSION_DRIVER', 'database')
This is telling the config to get the SESSION_DRIVER variable from your environment file, and use database as a default. I can only assume you have forgotten to update your .env file.
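For the database driver, the sessions table itself also has to exist; Laravel ships artisan commands to generate and run that migration:

php artisan session:table
php artisan migrate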
I had to change SESSION_DRIVER=database in the .env file;
changes to config/session.php alone didn't work for me.
I have added DB credentials as environment variables (using nginx), so that I can use them like so:
return array(
    'connections' => array(
        'mysql' => array(
            'driver' => 'mysql',
            'host' => 'localhost',
            'database' => getenv('DB_NAME'),
            'username' => getenv('DB_USER'),
            'password' => getenv('DB_PASS'),
            'charset' => 'utf8',
            'collation' => 'utf8_unicode_ci',
            'prefix' => '',
        )
    )
);
The problem is that when I use artisan the environment variables do not seem to be available, so when I run migrate or seed I get errors.
Is there a way around that, or should I just write my DB credentials directly in my config file?
To edit my previous answer (sorry for the misunderstanding):
Yes, the environment variables are created by the server, so they can't be reached or modified from the CLI. Before deploying, the server generates those variables so they can be "injected" into the application at runtime.
I am thinking it might be possible to reach those variables through a remote Laravel package and SSH? For example, the php artisan tail command reads errors locally from the server side.