I'm hosting my own S3 server on DigitalOcean using the Docker image scality/s3server.
Previously I was using Amazon S3, so I used the league/flysystem-aws-s3-v3 library for the connection.
Now, because I'm using my own server, I need to change the S3 endpoint that the connector uses.
I have tried this configuration in config/filesystems.php:
'disks' => [
    's3' => [
        'driver' => 's3',
        'key' => 'accessKey1',
        'secret' => 'verySecretKey1',
        'bucket' => 'mybucket',
        'base_url' => 'http://my_digitalocean_ip:8000',
    ],
],
The connector still attempts to access https://s3.amazonaws.com/mybucket.
Does anyone know how to do that?
I found the answer myself.
The configuration option should be "endpoint" instead of "base_url".
'disks' => [
    's3' => [
        'driver' => 's3',
        'key' => 'accessKey1',
        'secret' => 'verySecretKey1',
        'bucket' => 'mybucket',
        'endpoint' => 'http://my_digitalocean_ip:8000',
    ],
],
Now it works.
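For reference, the same disk can be driven by environment variables so the credentials stay out of version control; the S3_* variable names below are illustrative assumptions, not required names:

```php
's3' => [
    'driver'   => 's3',
    'key'      => env('S3_KEY'),
    'secret'   => env('S3_SECRET'),
    'bucket'   => env('S3_BUCKET'),
    // Point the SDK at the self-hosted server instead of s3.amazonaws.com
    'endpoint' => env('S3_ENDPOINT', 'http://my_digitalocean_ip:8000'),
],
```

Depending on the SDK version, you may also need a 'region' entry, and for IP-based endpoints the AWS SDK often requires path-style addressing ('use_path_style_endpoint' => true).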
I also chose to go with another storage system called MinIO. I find it more intuitive, and it has a browser-based GUI.
I uploaded my Laravel website to shared hosting. Everything works except that I am not able to retrieve images stored in the storage/public folder. I ran
php artisan storage:link
but it says the command was not found. I also tried
rm storage
but I get the message "rm: cannot remove: Is a directory".
(Screenshot: php artisan storage:link output from the SSH command.)
Note: I'm not addressing the original question directly, because it doesn't sound like that can be done. This is an alternate solution.
Solution
In your config/filesystems.php file, you'll see the following code (or something similar).
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL').'/storage',
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
        'url' => env('AWS_URL'),
        'endpoint' => env('AWS_ENDPOINT'),
    ],
],
To link your public disk to a public directory without using OS symlinks, change disks['public']['root'] to the public path of your choosing. For example,
'disks' => [
    /* ... skipped code ... */
    'public' => [
        'driver' => 'local',
        'root' => __DIR__ . '/../public/images/',
        'url' => env('APP_URL').'/images',
        'visibility' => 'public',
    ],
    /* ... skipped code ... */
],
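With the public disk rooted in public/images like this, files are web-accessible as soon as they are stored, with no symlink involved; a quick sketch (the file name is illustrative):

```php
// Writes directly to public/images/avatar.png ...
Storage::disk('public')->put('avatar.png', $contents);

// ... and the URL follows the disk's 'url' setting (APP_URL . '/images'):
$url = Storage::disk('public')->url('avatar.png');
```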
Another Solution
You can totally use the S3 storage as well. That will connect you with an AWS bucket that you can use.
Personal Anecdote
Shared hosting can be tricky with Laravel; I would transition away as soon as you can. I use Laravel Forge to deploy servers and DigitalOcean or AWS for hosting.
I've deployed my Laravel project on my shared host and tried some of its features, one of which is uploading an image. The image is successfully uploaded into my storage folder, but when I try to view it, my console gives me a 404 on the uploaded image URL, like this:
http://member.gunabraham.com/storage/site/images/Site_Logo_1561217248.png
But on my local server, the uploaded image shows correctly at this URL:
http://localhost:8000/storage/site/images/Site_Logo_1561217645.png
This is my config/filesystems.php :
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL').'/storage',
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
        'url' => env('AWS_URL'),
    ],
],
And my HTML file:
<a href="{{url('/home')}}" class="brand-logo amber-text center"><img src="{{asset('storage/site/images/'.$siteinfo->site_logo)}}" alt="" width="70" style="margin-top:15px;margin-bottom:15px;"></a>
So what should I do in my Laravel project deployed on shared hosting? Thanks for your attention.
I solved it by changing my image source in the HTML file from:
storage/site/images/
to:
storage/app/public/site/images/
Is this the right choice? Or is there another solution to this problem?
You need to link the storage folder.
If the /public/storage folder already exists, first go to the public folder and delete the existing storage folder, then run this command:
php artisan storage:link
If you don't have a command panel on your shared hosting, create a route to run the command:
Route::get('/command', function () {
    Artisan::call('storage:link');
    return Artisan::output(); // show the command's result in the browser
});
Your files are being uploaded to the storage path and not the public path. You have two main options here:
1. Change your filesystem to the public path.
In your /config/filesystems.php
'public' => [
    'driver' => 'local',
    'root' => public_path('app/public/'),
    'url' => env('APP_URL').'/storage',
    'visibility' => 'public',
],
2. Create a symlink and serve the storage files via the public folder. To do this, you can run php artisan storage:link in your server console. If you don't have console access to the server, you can do this using a cron job.
Option 1 is better if you are using shared hosting with no console access.
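If cron is the only access you have, a one-shot cron entry can run the link command; the PHP binary and project path below are assumptions for a typical shared host:

```
# Runs daily at 03:00; delete the entry from the crontab once the symlink exists.
0 3 * * * /usr/bin/php /home/youruser/yourapp/artisan storage:link
```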
I have a question about elFinder: how do I enable 'Mount Network Disk' in elFinder to use Google Drive in Laravel 5.2? Could you tell me which steps to follow?
The steps I followed were:
Installation with composer.
composer require barryvdh/laravel-elfinder
Add the ServiceProvider to the providers array in config/app.php:
Barryvdh\Elfinder\ElfinderServiceProvider::class
Copy the assets to the public folder, using the following artisan command:
php artisan elfinder:publish
Run command.
composer update
I used the default configuration and created the folder 'files' in the public folder.
With the command
php artisan vendor:publish --provider='Barryvdh\Elfinder\ElfinderServiceProvider' --tag=config
and
php artisan vendor:publish --provider='Barryvdh\Elfinder\ElfinderServiceProvider' --tag=views
In the file elfinder.php, at the path
'vendor/barryvdh/laravel-elfinder/config/elfinder.php',
in the route section, use:
'route' => [
    'prefix' => 'elfinder',
    'middleware' => null, // Set to null to disable middleware filter
],
In the file filesystems.php, at the path
'/config/filesystems.php',
in the disks section, under the public disk, add the line
'root' => base_path('app/public'),
after the line
'driver' => 'local',
Example:
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => base_path('app/public'), // replaces the default storage_path('app/public')
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => 'your-key',
        'secret' => 'your-secret',
        'region' => 'your-region',
        'bucket' => 'your-bucket',
    ],
],
This is how I solved the error that occurs when you use the copy option, open a folder, etc.:
Error: Unable to connect to backend. HTTP error 0
But I do not know what is needed to use Mount Network Disk, nor how to share a folder (for example, with read access).
This is the link to the project:
https://github.com/AlonsoCauich/driveOne
Help, guys: I tried to test the new Laravel admin and got the following error:
Missing storage symlink. We could not find a storage symlink. This could cause problems with loading media files from the browser.
I tried to use:
php artisan storage:link
and set in the Nginx config:
disable_symlinks off;
But it still is not working.
First delete the existing storage link in public, then re-run (from the app path):
php artisan storage:link
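For reference, storage:link only creates a symlink from public/storage to storage/app/public; here is a self-contained sketch of the equivalent shell commands, using a temp directory as a stand-in for the app root:

```shell
# Create a fake app layout in a temp dir, then link public/storage
APP=$(mktemp -d)
mkdir -p "$APP/storage/app/public" "$APP/public"
ln -s "$APP/storage/app/public" "$APP/public/storage"  # what storage:link does
```

On a real server, replace the temp paths with your project root; the link must be removed first if a stale one exists.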
I fixed the image issue by editing config/filesystems.php.
Change the 'public' disk's root and url values to match your project; I think that will fix it.
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('../../public_html/images'),
        'url' => env('APP_URL').'/images',
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],
Having struggled with this symbolic link, here is the procedure once the installation is done on the production server:
cd public
rm storage
Then go to the Voyager admin panel and click on Fix It. Do not use the command php artisan storage:link.
Everything works now.
I have some storage operations being done in my controllers, this is the code that works on my local machine:
Storage::copy('public/filename.pdf', 'public/sub_directory/filename_'.$var.'.pdf');
Storage::delete('public/filename.pdf');
With this code I can successfully copy a file in laravel_root/storage/app/public/, copying and renaming it into laravel_root/storage/app/public/subdirectory/; the file in public/ is deleted after the copy operation. This works on my local machine.
When I pushed the code up to the staging server, the above paths did not work and I got ERROR: No such file or directory.
I got this working by changing the paths from what worked on the local machine to :
Storage::copy('filename.pdf', 'subdirectory/filename_'.$var.'.pdf');
Storage::delete('job_card.pdf');
I had to remove the public/ prefix from the operations.
My question is - Why does this differ in local machine and when pushed to server?
I am running macOS on my local machine and Ubuntu 16 on the staging server. I did not change any of the config files.
I would say this has to do with your filesystems config.
Look in config/filesystems.php for the root of the disks.
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => 'your-key',
        'secret' => 'your-secret',
        'region' => 'your-region',
        'bucket' => 'your-bucket',
    ],
],
The local disk's root is storage_path('app') while the public disk's root is storage_path('app/public'), so paths on the public disk are already relative to app/public; if the default disk differs between the two machines, the public/ prefix will work on one and fail on the other.
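A likely cause: paths passed to Storage are relative to the default disk's root, and the default disk (FILESYSTEM_DRIVER in .env, or FILESYSTEM_DISK in newer Laravel versions) can differ between machines. A sketch of how the same file is addressed on each disk, using the roots from the config above:

```php
// On the 'local' disk (root = storage/app), the public/ prefix is required:
Storage::disk('local')->copy('public/filename.pdf', 'public/sub_directory/filename_1.pdf');

// On the 'public' disk (root = storage/app/public), it must be omitted:
Storage::disk('public')->copy('filename.pdf', 'sub_directory/filename_1.pdf');
```

Pinning the disk explicitly with Storage::disk(...) makes the code behave the same on both machines regardless of the default disk setting.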