Laravel file storage URLs work locally but 404 on a real server

I'm working with a custom file storage disk and it works as expected on my local machine, but on my Forge server the generated asset URLs don't work.
My filesystem config looks like this:
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL').'/storage',
        'visibility' => 'public',
    ],
    'assets' => [
        'driver' => 'local',
        'root' => storage_path('app/public/assets'),
        'url' => env('APP_URL').'/assets',
        'visibility' => 'public',
    ],
],
I have a file at {myapp root}/storage/app/public/assets/file.jpg and I'm generating URLs with Storage::disk('assets')->url('file.jpg'). This works fine locally: it gives me the expected URL, myapp.test/assets/file.jpg. In production I get the correct URL as well, myapp.com/assets/file.jpg, but that link 404s. If I insert "storage" into the URL, so that it's myapp.com/storage/assets/file.jpg, it works.
I've run php artisan storage:link in both environments and everything else is exactly the same, but on the real server the URLs all 404.
UPDATE SEP 2020
I have a much simpler filesystem setup now, but IIRC the solution here in the end was just to symlink public/assets to storage/app/public/assets on the Forge server. At the time this was ugly to do automatically, but in Laravel 7+ (maybe earlier too) there's a new links key in config/filesystems.php that makes this very easy.
In general, I wouldn't recommend nesting public disks/symlinks like this though. I should have just put things in storage/app/assets and kept it flatter overall.
It's still a mystery to me why this all appeared to work fine locally...
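For reference, the `links` key mentioned in the update lets `php artisan storage:link` create additional symlinks declaratively. A minimal sketch of what that entry could look like in config/filesystems.php; the second entry mirrors this question's nested-assets setup and is an assumption, not a Laravel default:

```php
<?php
// config/filesystems.php (Laravel 7+)
// Each entry maps a public path to a storage path; `php artisan storage:link`
// creates one symlink per entry.
return [
    // ...
    'links' => [
        public_path('storage') => storage_path('app/public'),
        // Hypothetical extra link matching this question's setup:
        public_path('assets')  => storage_path('app/public/assets'),
    ],
];
```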

Related

Laravel Horizon failure on high availability setup

I'm running two Laravel apps in a high availability setup, with a load balancer directing traffic between them.
Both apps use the same Redis DB (AWS ElastiCache) for queues, and both are set up with Laravel Horizon.
Both apps have the same configuration and 3 workers each: "high", "medium" and "low".
Most jobs are running fine, but there's one job that takes longer than others and is causing an issue.
The failing job is running on the 'low' worker.
So the job is picked up by one Horizon instance and starts processing. After 1 minute and 30 seconds, the second Horizon instance picks up the same job and starts processing it too. Since this job can't run in parallel, it fails.
It looks like the locking system isn't working properly, since both Horizon instances are taking the job.
Does Horizon have a locking system, or do I have to implement my own?
I also have no idea why the second Horizon takes the job 90 seconds after the first one picks it up.
config/horizon.php
'environments' => [
    'production' => [
        'supervisor-1' => [
            'connection' => 'redis',
            'queue' => ['high', 'default', 'low'],
            'balance' => 'auto',
            'processes' => 1,
            'tries' => 1,
            'timeout' => 1250,
            'memory' => 2048,
        ],
    ],
],
config/queue.php
'connections' => [
    'redis' => [
        'driver' => 'redis',
        'connection' => 'default',
        'queue' => 'default',
        'retry_after' => 1260,
        'block_for' => null,
    ],
],
The WithoutOverlapping job middleware should help you with that:
https://laravel.com/docs/10.x/queues#preventing-job-overlaps
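Incidentally, the 90-second gap is suspicious: 90 seconds is Laravel's default retry_after for the Redis queue connection, so it may be worth confirming that the deployed (and cached) config really uses the 1260 value shown above. As for the middleware, here is a minimal sketch of a job using it; the job name and lock key are made up for illustration:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;

// Hypothetical job name for illustration.
class ProcessLargeReport implements ShouldQueue
{
    use Queueable;

    public function middleware(): array
    {
        // The lock key 'large-report' is an assumption; use something that
        // identifies the resource that must not be processed concurrently.
        // releaseAfter() re-queues an overlapping job instead of dropping it.
        return [(new WithoutOverlapping('large-report'))->releaseAfter(60)];
    }

    public function handle(): void
    {
        // long-running work...
    }
}
```

Since both Horizon instances share the same Redis connection, the lock is shared between them as well.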

Saving files to Scaleway storage with PHP

How can I upload an image to Scaleway storage with Laravel or plain PHP?
Laravel uses Flysystem under the hood to abstract file storage. It provides several drivers out of the box, including S3, Rackspace, FTP, etc.
If you want to support Scaleway, you would need to write a custom driver, which you can read more about here.
Edit: It seems from Scaleway's documentation that it is compatible with AWS S3 clients, which means this should be quite easy to support in Flysystem. I tried the following and it worked.
I added a new driver in config/filesystems.php as follows:
'scaleway' => [
    'driver' => 's3',
    'key' => '####',
    'secret' => '#####',
    'region' => 'nl-ams',
    'bucket' => 'test-bucket-name',
    'endpoint' => 'https://s3.nl-ams.scw.cloud',
],
and then, to use the disk, I did the following:
\Storage::disk('scaleway')->put('file.txt', 'Contents');
My file was uploaded.
EDIT: I also made a PR to get Scaleway accepted into the list of adapters for League's Flysystem. It got merged. You can see it live here.
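Since the disk is just Flysystem's S3 adapter, the rest of Laravel's Storage API works against it the same way. A quick sketch of a few common calls (the file name matches the example above):

```php
<?php
// Anything the S3 driver supports works against the Scaleway disk too.
$disk = \Storage::disk('scaleway');

$disk->put('file.txt', 'Contents');    // upload
$exists = $disk->exists('file.txt');   // true after the put
$contents = $disk->get('file.txt');    // "Contents"
$url = $disk->url('file.txt');         // public object URL
```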

Laravel | How to confirm which caching driver is being used?

I am trying to use Redis in my application, but I am not sure whether my app is using the redis or file driver, as I can't create tags but I can create normal keys fine.
I have set CACHE_DRIVER=redis and also in my cache.php I have:
'default' => env('CACHE_DRIVER', 'redis'),
also in my database.php there is:
'redis' => [
    'client' => 'predis',
    'default' => [
        'host' => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
],
The reasons for my suspicion are that I cannot create tags, and that running redis-cli flushall under Homestead (SSH) does not seem to get rid of the cache; I had to use Cache::flush() in Laravel instead.
So how can I effectively find out which cache driver my application is using?
It's pretty simple: you can use the Redis monitor command to check whether the GETs/SETs are happening:
redis-cli monitor
Then try the application. If you can see the keys coming through, the Redis cache is running.
You can also check the Redis keys with the following command:
redis-cli
Then enter:
keys *
I hope this is useful.
You should simply query your Redis DB like so:
redis-cli
Then, being on redis console:
SELECT <DB-NUMBER>
KEYS *
If you see some keys like
1) PREFIX:tag:TAG:key
2) PREFIX:tag:DIFFERENT-TAG:key
it is quite a hint that Laravel is using Redis as its cache backend. Otherwise, take a look at
<YOUR-APP-DIR>/storage/framework/cache
If you find some files/sub-folders in there, it is quite a hint that Laravel is using file-based caching.
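Besides inspecting Redis or the filesystem, you can also ask Laravel directly which store it resolved. A small sketch you could run in php artisan tinker:

```php
<?php
// In `php artisan tinker`, ask the cache manager what it actually resolved:
config('cache.default');       // e.g. "redis" or "file"
get_class(Cache::getStore());  // e.g. Illuminate\Cache\RedisStore

// Tags only work on stores that support them (redis, memcached, array);
// on the file store this throws a BadMethodCallException, which would
// explain why creating tags fails:
Cache::tags(['example'])->put('key', 'value', 60);
```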

How to manage big files upload in L5 class Storage using Flysystem package?

In my Laravel 5 setup, I included the Flysystem package and configured an FTP disk in config/filesystems.php:
'ftp' => [
    'driver' => 'ftp',
    'host' => env('FTP_HOST'),
    'username' => env('FTP_USER'),
    'password' => env('FTP_PASS'),
    'passive' => true,
],
Then I'm able to perform FTP uploads and downloads with the following instructions:
Storage::disk('ftp')->put($filePath, $contents);
Storage::disk('ftp')->get($filePath);
Until here everything is fine.
Problems start when I'm uploading big files, above 200 MB: the PHP memory limit is reached and execution stops with a fatal error. In fact, when Storage::put is called, my PC's memory usage increases dramatically.
I've read somewhere that a solution might be to use streams to read from and write to my "virtual" disk, but I'm still missing how to implement that in order to optimize memory usage during these operations.
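Flysystem can read from a stream, which keeps memory usage flat regardless of file size: pass an open file handle to put() instead of the file's contents. A sketch, with hypothetical local paths:

```php
<?php
// Upload: pass a resource instead of a string so Flysystem copies the
// file in small chunks rather than loading ~200MB into memory at once.
$stream = fopen('/path/to/big-file.zip', 'rb'); // hypothetical local path

Storage::disk('ftp')->put($filePath, $stream);

if (is_resource($stream)) {
    fclose($stream);
}

// Download: reading a stream instead of ->get() avoids the same problem
// in the other direction. (readStream() was added to Laravel's Storage
// facade in later 5.x releases; on earlier versions you can reach the
// underlying Flysystem instance via ->getDriver()->readStream().)
$remote = Storage::disk('ftp')->readStream($filePath);
$local  = fopen('/path/to/destination.zip', 'wb');
stream_copy_to_stream($remote, $local);
fclose($remote);
fclose($local);
```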

Laravel not creating log file

I'm learning Laravel 4.0 to develop a web server. I'm using a LAMP stack (Apache 2, PHP 5.5).
I can't find the log file that Log::error() calls write to.
As far as I know it's supposed to be app/storage/logs/log-cli-.txt, but there are no files there.
Here is the code:
app/commands/MsgCommand.php
public function fire()
{
    Log::error('messages - log');
}
It's called from artisan:
app/start/artisan.php
Artisan::add(new MsgCommand());
Am I looking in the right place?
How can I check that this is indeed the right folder (i.e. where is it configured)? To check for faulty installation or setup.
Thanks to marcanuy, I am now sure it is writing to app/storage/logs.
Also, I found out it writes fine if I call the command through artisan. Running under Apache 2, though, nothing happens. I'm starting to think I set up the command wrong.
By default app/storage is the location for log files; the storage folder is configured in bootstrap/paths.php:
'storage' => __DIR__.'/../app/storage',
Also be sure that this folder is writable by the web server.
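For instance, assuming a Debian/Ubuntu box where Apache runs as www-data (adjust the user and path to your setup), the storage directory can be made writable like this:

```shell
# Path assumed to be Laravel 4's default storage location; adjust as needed.
STORAGE=app/storage

mkdir -p "$STORAGE/logs"   # ensure the logs directory exists
chmod -R 775 "$STORAGE"    # owner and group writable

# On Debian/Ubuntu, Apache runs as www-data, so additionally:
#   sudo chown -R www-data:www-data app/storage
```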
The problem was permissions:
user permissions for the www-data user on ../app/storage,
and MySQL settings: creating a user corresponding to what is set in app/config/database.php:
'mysql' => array(
    'driver' => 'mysql',
    'host' => 'your host',
    'database' => 'your db',
    'username' => 'your user',
    'password' => 'your pass',
),
