I am trying to upload a file to a public folder. It was working until recently, but now it shows the error below:
Disk [public] does not have a configured driver.
I checked for the configured driver in config/filesystems.php, but it is already set there. I can't figure out where the issue might be.
Upload code:
public function upload(ProductImageRequest $request, Product $product)
{
    $image = $request->file('file');
    $dbPath = $image->storePublicly('uploads/catalog/'.$product->id, 'public');

    if ($product->images === null || $product->images->count() === 0) {
        $imageModel = $product->images()->create([
            'path' => $dbPath,
            'is_main_image' => 1,
        ]);
    } else {
        $imageModel = $product->images()->create(['path' => $dbPath]);
    }

    return response()->json(['image' => $imageModel]);
}
Code in config/filesystems.php
'disks' => [

    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],

    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL').'/storage',
        'visibility' => 'public',
    ],
I use this code to move the picture and store its name; you may want to give it a shot:
// build the icon name and path, then move the uploaded file
$iconName = time().'.'.request()->icon->getClientOriginalExtension();
$icon_path = '/category/icon/'.$iconName;
request()->icon->move(public_path('/category/icon/'), $iconName);
$category->icon = $icon_path;
I usually move the image and then store its path in the DB. This is what my code does; you can adapt it as needed.
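If it helps, here is a rough sketch of how that approach could slot into the upload() method from the question. The uploads/catalog folder under public/ and the time()-based file name are assumptions on my part, not something from your project:
public function upload(ProductImageRequest $request, Product $product)
{
    $image = $request->file('file');

    // Move the upload straight into public/ (no Storage disk involved)
    // and keep the relative path for the database. Folder and naming
    // scheme below are assumptions; adjust to your setup.
    $imageName = time().'.'.$image->getClientOriginalExtension();
    $dbPath = 'uploads/catalog/'.$product->id.'/'.$imageName;
    $image->move(public_path('uploads/catalog/'.$product->id), $imageName);

    if ($product->images === null || $product->images->count() === 0) {
        $imageModel = $product->images()->create([
            'path' => $dbPath,
            'is_main_image' => 1,
        ]);
    } else {
        $imageModel = $product->images()->create(['path' => $dbPath]);
    }

    return response()->json(['image' => $imageModel]);
}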
Related
I have managed to configure my sftp in the 'config/filesystems.php' file:
'remote-sftp' => [
    'driver' => 'sftp',
    'host' => env('SFTP_HOST'),
    'username' => env('SFTP_USERNAME'),
    'password' => env('SFTP_PASSWORD'),
    'root' => '/var/www/html/files/current',
    'visibility' => 'public',
    'permPublic' => 0644,
    'timeout' => 30,
],
An SFTP account (usert) was created on the remote server, and I can successfully upload the file to the server, but the files are being stored in the wrong folder. My code to upload the file:
$fileName = $id_number . '_tps_' . $datetime . '.' . $request->file('file')->extension();
Storage::disk('remote-sftp')->put($fileName, fopen($request->file('file'), 'r+'), 'public');
I expect the file to be stored in '/var/www/html/files/current', as specified in 'config/filesystems.php', but it is actually stored in '/home/usert'. How do I get it to save to '/var/www/html/files/current'?
Example error from the response (screenshot): File not found at path
Can anyone help me?
I want to read an image file from the FTP server and return it in the response so the JS on the client can display it together with the FTP server link.
Controller example:
$explode = explode('#', $lampiran->lampiran_gambar);
foreach ($explode as $row) {
    if ($row == null) {
        $row1[] = 'null';
    } else {
        $row1[] = Storage::disk('ftp')->get('/lampiranSurat' . $row);
    }
}
if ($pegawai_pejabat->jenis_jabatan_id == 1) {
    return response()->json([
        'meta' => [
            'code' => 200,
            'status' => 'success',
            'message' => 'Data Ditemukan',
        ],
        'data_verifikasi' => $verifikasi,
        'lampiran_gambar' => $row1,
        'pegawai_verif' => $pegawai_verif,
    ]);
}
Example config/filesystems.php:
'default' => env('FILESYSTEM_DRIVER', 'ftp'),

'ftp' => [
    'driver' => 'ftp',
    'host' => env('FTP_HOST'),
    'username' => env('FTP_USERNAME'),
    'password' => env('FTP_PASSWORD'),
    'root' => '/web',
],
The .env file:
FTP_HOST=exampleftpserver.com
FTP_USERNAME=userftp
FTP_PASSWORD=password123
So why is the FTP path not being read by Storage?
The image file names are already in the database and the files themselves are already on the FTP server.
You're missing a / between your directory name and the filename:
Replace
$row1[] = Storage::disk('ftp')->get('/lampiranSurat' . $row);
With
$row1[] = Storage::disk('ftp')->get('/lampiranSurat/' . $row);
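If you want to guard against this kind of slash mistake in general, you could normalize the separator when building the path. A small sketch, assuming $row holds the bare file name:
$path = '/lampiranSurat/'.ltrim($row, '/');   // guarantees exactly one separator
$row1[] = Storage::disk('ftp')->get($path);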
I am getting the following error:
Missing required client configuration options: region: (string) A "region" configuration value is required for the "s3" service (e.g., "us-west-2").
Here is my setup (XXX to hide my creds):
.env file:
DO_SPACES_KEY=XXXXX
DO_SPACES_SECRET=XXXX
DO_SPACES_ENDPOINT=https://nyc3.digitaloceanspaces.com
DO_SPACES_REGION=NYC3
DO_SPACES_BUCKET=XXXX
filesystems.php file (under the disks):
'do_spaces' => [
    'driver' => 's3',
    'key' => env('XXXXX'),
    'secret' => env('XXXXX'),
    'endpoint' => env('https://nyc3.digitaloceanspaces.com'),
    'region' => env('NYC3'),
    'bucket' => env('XXXXXX'),
],
Also in the filesystems.php file:
'cloud' => env('FILESYSTEM_CLOUD', 'do_spaces'),
in the view file:
function addDocument(Request $req, $projId)
{
    $image = $req->file('uploadFile');
    $file_name = pathinfo($image->getClientOriginalName(), PATHINFO_FILENAME);
    $input['imagename'] = $file_name.'_'.time().'.'.$image->getClientOriginalExtension();
    //$destinationPath = public_path('/images');
    //$image->move($destinationPath, $input['imagename']);
    $destinationPath = $image->store('/', 'do_spaces');
    Storage::setVisibility($destinationPath, 'public');
    //$data = array('fid'=>$folderId,'fileLoc'=>$input['imagename'],'projId'=>$projId);
    //\DB::table('documents')->insert($data);
    return back();
}
As you can see in the view, I try to store the image in the Space and then save its path (which points to the file on the Space) in the DB.
I cannot get this error to go away; do you see any issues?
Thanks!
env() expects the name of the environment variable, not its value, so 'region' => env('NYC3') should be 'region' => env('DO_SPACES_REGION'). The key, secret, endpoint and bucket entries have the same problem.
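For completeness, here is a corrected sketch of the disk entry based on the .env keys you posted (untested against your Space):
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],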
I'm building a RESTful API using Laravel 5 and MongoDB.
I'm saving avatar image for users.
It's working fine but I'm trying to create a Folder for every User. For example: "app/players/images/USERID"
I've tried to do something like this in different ways but I always get Driver [] is not supported.
\Storage::disk('players'.$user->id)->put($image_name, \File::get($image));
UploadImage:
public function uploadImage(Request $request)
{
    $token = $request->header('Authorization');
    $jwtAuth = new \JwtAuth();
    $user = $jwtAuth->checkToken($token, true);
    $image = $request->file('file0');

    $validate = \Validator::make($request->all(), [
        'file0' => 'required|image|mimes:jpg,jpeg,png'
    ]);

    if (!$image || $validate->fails()) {
        $data = array(
            'code' => 400,
            'status' => 'error',
            'message' => 'Image uploading error-'
        );
    } else {
        $image_name = time().$image->getClientOriginalName();
        \Storage::disk('players')->put($image_name, \File::get($image));
        $user_update = User::where('_id', $user->id)->update(['imagen' => $image_name]);
        $data = array(
            'code' => 200,
            'status' => 'success',
            'user' => $user->id,
            'imagen' => $image_name
        );
    }

    return response()->json($data, $data['code']);
}
filesystems.php:
'players' => [
    'driver' => 'local',
    'root' => storage_path('app/players/images/'),
    'url' => env('APP_URL').'/storage',
    'visibility' => 'public',
],
I expect the user avatar image saves on User ID folder.
The disk() call tells Laravel which filesystem to use. Let's assume you have a user with ID 1: with your code it will look for a disk named players1, which is not configured, hence the Driver [] is not supported error.
What is usually done is to put these files into per-user folder structures on a single disk, so instead you could do the following. This will put your image file in the folder 1:
\Storage::disk('players')->put($user->id . '/' . $image_name, \File::get($image));
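As a follow-up, the same result can also be achieved with the storeAs() helper on the uploaded file, which takes the directory, the file name and the disk. A sketch, assuming the rest of your uploadImage() method stays as posted:
$image_name = time().$image->getClientOriginalName();
// saves under storage/app/players/images/{user id}/{image name} on the players disk
$image->storeAs((string) $user->id, $image_name, 'players');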
I had a similar problem; check whether these lines do what you want to achieve.
\Storage::disk('players')->put("{$user->id}/{$image_name}", \File::get($image));
I relied on the laravel guide: File Storage - File Uploads
I hope it helps you. Kind regards.
I have a task of pulling down assets stored in an AWS S3 bucket and saving them into a local Laravel project. The files are also encrypted.
I need to write a script to do this.
Any ideas on how to do this?
Assuming you have the following disks:
'disks' => [

    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],

    's3' => [
        'driver' => 's3',
        'key' => env('S3_KEY'),
        'secret' => env('S3_SECRET'),
        'region' => env('S3_REGION'),
        'bucket' => env('S3_BUCKET'),
        'http' => [
            'connect_timeout' => 30,
        ],
    ],
],
Then you can copy a file using:
if (Storage::disk('s3')->exists('path/yourfile.txt')) {
    Storage::disk('local')->writeStream('path/yourfile.txt', Storage::disk('s3')->readStream('path/yourfile.txt'));
}
To move the file:
if (Storage::disk('s3')->exists('path/yourfile.txt')) {
    Storage::disk('local')->writeStream('path/yourfile.txt', Storage::disk('s3')->readStream('path/yourfile.txt'));
    Storage::disk('s3')->delete('path/yourfile.txt');
}
If you have set a default disk, then you can skip mentioning it specifically and call Storage::something() directly, as shown below.
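For instance, if the default disk is local, these two calls do the same thing; a minimal illustration:
Storage::disk('local')->exists('path/yourfile.txt');   // explicit disk
Storage::exists('path/yourfile.txt');                  // uses the default disk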
Moving all files from s3 to the local disk:
Considering you have two different disks that are not on the same server, you need to do a little bit extra compared to when both disks are on the same server:
$s3Files = Storage::disk('s3')->allFiles();

foreach ($s3Files as $file) {
    // copy
    Storage::disk('local')->writeStream($file, Storage::disk('s3')->readStream($file));

    // move
    Storage::disk('local')->writeStream($file, Storage::disk('s3')->readStream($file));
    Storage::disk('s3')->delete($file);
}
Or you can move the delete() call to after the entire loop and delete all the files together, like:
Storage::disk('s3')->delete(Storage::disk('s3')->allFiles());
which is essentially the same, just with one function call.
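Since the question mentions needing a script, one way to package the copy loop is an Artisan command. A minimal sketch: the s3:pull command name and class are made up, and decrypting the files is left out because it depends on how they were encrypted:
<?php

// app/Console/Commands/PullS3Assets.php (hypothetical command)
namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class PullS3Assets extends Command
{
    protected $signature = 's3:pull';
    protected $description = 'Copy every file from the s3 disk to the local disk';

    public function handle()
    {
        foreach (Storage::disk('s3')->allFiles() as $file) {
            // Stream the file down so large assets are not loaded fully into memory
            Storage::disk('local')->writeStream($file, Storage::disk('s3')->readStream($file));
            $this->info("Copied {$file}");
        }
    }
}
You could then run it with php artisan s3:pull, or schedule it from app/Console/Kernel.php.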