Can't put a cache entry from a generated Excel file in Laravel 8 - php

Hope you are doing well.
I'm having a problem with the Laravel cache.
First, this only happens on my production server (Windows 10 Server).
It seems that the app can't put a cache entry that comes from a PhpSpreadsheet Excel object.
I already tried it with arrays and plain PHP objects, and they all work fine.
Cache::put($cacheKey, $spreadsheet, now()->addMinutes(2));
Cache settings:
'default' => env('CACHE_DRIVER', 'file'),
'file' => [
    'driver' => 'file',
    'path' => storage_path('framework/cache/data'),
],
I assume that if I can cache arrays and PHP objects, there is nothing wrong with the folder permissions.
How can I solve this?
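One thing worth checking (this is an assumption, not a confirmed cause): the file cache driver serializes every value it stores, and a full PhpSpreadsheet Spreadsheet object may fail to serialize or produce a very large payload. A minimal diagnostic sketch, with a hypothetical fallback that caches the written file's path instead of the object:

use Illuminate\Support\Facades\Cache;
use PhpOffice\PhpSpreadsheet\IOFactory;

try {
    // The file driver serializes the value, so a non-serializable
    // Spreadsheet will surface its real error here.
    $stored = Cache::put($cacheKey, $spreadsheet, now()->addMinutes(2));
} catch (\Throwable $e) {
    logger()->error('Could not cache spreadsheet: ' . $e->getMessage());
    $stored = false;
}

if (! $stored) {
    // Hypothetical fallback: save the workbook to disk and cache only its path.
    $path = storage_path('framework/cache/excel/' . $cacheKey . '.xlsx');
    if (! is_dir(dirname($path))) {
        mkdir(dirname($path), 0775, true);
    }
    IOFactory::createWriter($spreadsheet, 'Xlsx')->save($path);
    Cache::put($cacheKey, $path, now()->addMinutes(2));
}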

Related

Directus Storage Adapter AWS S3 error: No permission to write

I decided to use Directus as a headless CMS in one of our projects. For storing media files we decided to use AWS S3 as the storage adapter. I have followed all the instructions from the Directus documentation on how to set up the configuration for S3, but I keep getting this error:
message: "No permission to write: 122d303f-b69a-48f8-8138-8f3404b3c992.jpg"
I have tried the AWS S3 bucket from the CLI, and I can upload files there and list the files, but from Directus I keep getting the "no permission to write" error.
Storage configuration in Directus looks like this:
'storage' => [
    'adapter' => 's3', // What storage adapter to use for files
    // Defaults to the local filesystem. Other natively supported
    // options include: Amazon S3, Aliyun OSS, Azure
    // You'll need to require the correct flysystem adapters through Composer
    // See https://docs.directus.io/extensions/storage-adapters.html#using-aws-s3
    'root' => '/originals', // Where files are stored on disk
    'thumb_root' => '/thumbnails', // Where thumbnails are stored on disk
    'root_url' => '##S3_URL##/originals', // Where files are accessed over the web
    // 'proxy_downloads' => false, // Use an internal proxy for downloading all files
    // S3
    ////////////////////////////////////////
    'key' => '##S3_KEY##',
    'secret' => '##S3_SECRET##',
    'region' => '##S3_REGION##',
    'version' => 'latest',
    'bucket' => '##S3_BUCKET##',
    'options' => [
        'ACL' => 'public-read',
        'CacheControl' => 'max-age=604800'
    ],
],
Can anyone give any insight on this?
Thanks in advance.
You need to add the following:
'options' => [
    'ACL' => 'private',
],
I found that I did not need the bucket policy update. You just need to add this to config.php when uploading to a private bucket.
I had the same issue.
In my case, I allowed "s3:PutObjectAcl" on the bucket and it started working :)
It seems that this is a Flysystem issue.
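For reference, a minimal IAM policy sketch that grants the write and ACL permissions mentioned above (the bucket name is a placeholder and the exact action list should be adapted to your needs):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::your-bucket-name"
        }
    ]
}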

Laravel under a load balancer + centralized redis session server

I have 2 Laravel nodes running on separate servers behind a load balancer, and a dedicated Redis server for session and cache storage.
I configured the session and cache drivers accordingly, setting both to "redis", and it connects just fine; I can see keys being stored on the Redis server.
The issue is that when I try to log in, the page just gets refreshed without printing the "invalid credentials" errors that are normally stored in the session.
Since the load balancer keeps redirecting from one node to another, the session is somehow getting lost. As a single instance it works just fine, though. Is anyone else having the same issue with Laravel and load balancing?
If there is a possible fix without configuring the balancer to use sticky sessions, that would be great!
Thanks in advance!
I think the TrustedProxy package solves your issue. Install it and then just add your load balancer's IP to config/trustedproxy.php:
return [
    'proxies' => [
        '192.168.10.10',
    ],
    // These are defaults already set in the config:
    'headers' => [
        (defined('Illuminate\Http\Request::HEADER_FORWARDED') ? Illuminate\Http\Request::HEADER_FORWARDED : 'forwarded') => 'FORWARDED',
        \Illuminate\Http\Request::HEADER_CLIENT_IP => 'X_FORWARDED_FOR',
        \Illuminate\Http\Request::HEADER_CLIENT_HOST => 'X_FORWARDED_HOST',
        \Illuminate\Http\Request::HEADER_CLIENT_PROTO => 'X_FORWARDED_PROTO',
        \Illuminate\Http\Request::HEADER_CLIENT_PORT => 'X_FORWARDED_PORT',
    ]
];
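Beyond trusting the proxy, login loops like this can also happen if the two nodes do not share identical session settings. A sketch of the values you would typically want to match on both servers (the keys are standard Laravel .env names, the values are placeholders):

.env (identical on both nodes)
APP_KEY=base64:the-same-key-on-both-nodes
SESSION_DRIVER=redis
CACHE_DRIVER=redis
REDIS_HOST=your-redis-host
SESSION_DOMAIN=.your-app.example

In particular, APP_KEY must be the same on both nodes; otherwise a session cookie encrypted by one server cannot be decrypted by the other, and the login state is lost on every switch.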

How to properly keep config files in Laravel?

I've built a piece of software which uses Laravel's config module to consume certain settings.
Thing is, we're using Laravel Forge to auto-deploy, and every time we deploy, the configs are reset to the "blank" state, which breaks some things on every deploy.
I have added the files to .gitignore, but that doesn't seem to do the trick.
Can anyone point me in the right direction in order to keep these config files without having to re-set them every time we deploy?
Thanks everyone!
It would be helpful to have an example of your config and .env files.
Config files for multiple environments rely on the .env file in each environment.
env() returns either the matching variable from your .env file or the default value specified.
So env('QUEUE_DRIVER', 'sqs') would look in the .env file for the QUEUE_DRIVER variable; if it can't find the variable, it returns the default 'sqs'.
An example of a queue config file might look like this:
config/queue.php
<?php
return [
    'default' => env('QUEUE_DRIVER', 'sqs'),
    'connections' => [
        'sync' => [
            'driver' => 'sync',
        ],
        'sqs' => [
            'driver' => 'sqs',
            'key' => env('SQS_KEY'),
            'secret' => env('SQS_SECRET'),
            'prefix' => env('SQS_URL'),
            'queue' => 'general_queue',
            'region' => 'us-east-1',
        ],
    ],
];
You would then set your variables in your .env file for each environment.
Production may look like this.
.env
QUEUE_DRIVER=sqs
SQS_KEY=yoursqskey
SQS_SECRET=yoursqssecret
SQS_URL=yoursqsurl
Your local environment might look like this.
.env
QUEUE_DRIVER=sync
You can edit your .env file in Forge under Sites > Site Details > Environment.

Configure Cakephp 2.6.0 with Redis Engine

I am trying to configure CakePHP 2.6.0 to use the Redis engine by default, but somehow I am not able to make it work.
Things which I have tried so far:
I configured the 2 files in the app/Config folder, core.php and bootstrap.php, according to the guidelines provided in this blog (configure cake with redis) and in this blog too (Another cake-redis config setup),
but I keep getting errors like:
Fatal error: Uncaught exception 'CacheException' with message 'Cache engine session is not properly configured.' in C:\wamp\www\project\cakephp\cakephp_2.6.0\lib\Cake\Cache\Cache.php on line 181
CacheException: Cache engine session is not properly configured. in C:\wamp\www\project\cakephp\cakephp_2.6.0\lib\Cake\Cache\Cache.php on line 181
Any help will be highly appreciated.
I was having the same exact issue today while trying to set up CakePHP to use Redis as the cache engine.
Coincidentally, I also read the same setup instructions from the two blogs you linked to.
The reason was that I had copy-pasted the Configure::write(...) code block from the Another cake-redis config setup blog post as-is into the file, without first commenting out the Configure::write(...) block that was already in core.php.
I'm assuming that you have already successfully set up Redis on Windows and have installed the PHPRedis extension without any issues.
I am using the instructions from Another cake-redis config setup here.
In your app/Config/core.php file, comment out the following block (this started at line 218 in my core.php):
Configure::write('Session', array(
    'defaults' => 'php'
));
Instead, put this in (you can change the values to suit your particular needs):
Configure::write('Session', array(
    'defaults' => 'cache',
    'timeout' => 100,
    'start' => true,
    'checkAgent' => false,
    'handler' => array(
        'config' => 'session'
    )
));
After this, change the value of $engine to 'Redis', so it becomes:
$engine = 'Redis';
Then add this code; I put it at the very end of the file (again, your values can be different depending on your setup):
Cache::config('session', array(
    'Engine' => $engine,
    'Prefix' => $prefix . 'cake_session_',
    'Duration' => $duration
));
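For context (an assumption based on the stock CakePHP 2 core.php, not something spelled out in the answer), the $prefix and $duration variables used above are the ones already defined further up in core.php, roughly like this:

// Near the bottom of the stock app/Config/core.php
$engine = 'Redis'; // changed from the default 'File' as described above

// In development mode, caches should expire quickly.
$duration = '+999 days';
if (Configure::read('debug') > 0) {
    $duration = '+10 seconds';
}

// Prefix each application on the same server with a different string,
// to avoid cache key collisions.
$prefix = 'myapp_';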
And that's it. You're done! No need to change anything else.
To make sure that Redis is working properly with CakePHP, I ran the RedisEngine Test Suite that comes with CakePHP. You need to have PHPUnit installed for this to work.
It can be accessed via http://your-cakephp-project/test.php
Click on 'Tests' under Core and then click on 'Cache/Engine/RedisEngine'
If everything is working successfully, you should see all the tests pass.
Alternatively, you can use redis-cli at the command prompt to confirm that Redis is storing keys properly.
Once you have logged in by typing redis-cli, type KEYS *
This should give you a list of keys related to your CakePHP setup.
An example would be the "myapp_cake_core_object_map" key.
Hope this helps.

database.php file from environment directory not being loaded in Laravel 4

I am trying to load the local version of my database.php file, located in my app/config/local directory.
I have set up my $env environment detection in start.php for local.
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name')
));
The correct environment appears to be detected correctly, since
App::environment('local')
evaluates to TRUE; however, it only loads the default database.php in the app/config directory.
Just in case it was still related to detection, I tried hard-coding local as the environment, but it had the same effect.
$env = $app->detectEnvironment(function()
{
    return 'local';
});
I also tried using a different name for the environment in case 'local' had some undocumented reserved meaning.
I have successfully set up environments in other projects using Laravel 4, and I can't work out for the life of me what is different in this case. I'd appreciate any suggestions on what could lead to this behaviour.
The local environment runs on MAMP.
I just did an additional test and found that the app.php file in my local folder loads correctly. It is only the local database.php file that does not appear to be loading.
Production config files are loaded first and then merged with overrides from the other environments. If the production file generates an error (e.g. an undefined index), the config loading bails out early without loading the overrides. Check whether it is generating an error in the log. If so, in the production config file check that the value is set before attempting to use it, and the local config file will then load correctly.
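As a concrete illustration of that advice (a sketch only; the DB_HOST server variable and the values are hypothetical), a production app/config/database.php that reads an external value defensively might look like this:

// File: app/config/database.php (production)
// Guard the lookup so a missing value can't trigger an "undefined index"
// error, which would abort config loading before the local overrides merge in.
$dbHost = isset($_SERVER['DB_HOST']) ? $_SERVER['DB_HOST'] : 'localhost';

return array(
    'connections' => array(
        'mysql' => array(
            'driver'    => 'mysql',
            'host'      => $dbHost,
            'database'  => 'production_db',
            'username'  => 'root',
            'password'  => '',
            'charset'   => 'utf8',
            'collation' => 'utf8_unicode_ci',
            'prefix'    => '',
        ),
    ),
);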
In your project, inside the app/config directory, you should create a directory named local; then in that directory create a file named database.php, and in that file provide the configuration, for example:
// File: app/config/local/database.php
return array(
    'connections' => array(
        'mysql' => array(
            'driver' => 'mysql',
            'host' => 'localhost',
            'database' => 'myTestDb', // myTestDb is a db name for example
            'username' => 'root',
            'password' => '',
            'charset' => 'utf8',
            'collation' => 'utf8_unicode_ci',
            'prefix' => '',
        ),
        'pgsql' => array(
            'driver' => 'pgsql',
            //...
        ),
    ),
);
Then in your project_folder/bootstrap/start.php file use something like this:
$env = $app->detectEnvironment(array(
    'local' => array('*.dev', gethostname()),
    'production' => array('*.com', '*.net')
));
You may also check this answer for environment setup. This should work.
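If you want to confirm at runtime which environment and which database config actually got loaded, a quick throwaway check could look like this (the route name is just an illustration, not part of the original answer):

// Temporary debug route, e.g. in app/routes.php (Laravel 4)
Route::get('env-check', function()
{
    // Dump the detected environment and the active mysql database name.
    dd(App::environment(), Config::get('database.connections.mysql.database'));
});

Remove the route once you have verified that the local database.php override is being picked up.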
