I am working on a project in which I want to get the URL of a file I uploaded to the storage/app/public directory in Laravel. I followed this StackOverflow question, but when I try to run the following command in Vagrant:
php artisan storage:link
It gives the following error:
[Symfony\Component\Console\Exception\CommandNotFoundException]
There are no commands defined in the "storage" namespace.
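If the command is missing because your Laravel version predates 5.3 (storage:link was introduced in 5.3), you can create the same symlink by hand. A sketch, where /var/www/myproject is a placeholder for your actual project root:

```shell
# storage:link simply creates a symlink from public/storage to
# storage/app/public; on Laravel < 5.3 create it manually.
PROJECT=/var/www/myproject   # placeholder: adjust to your project root
ln -s "$PROJECT/storage/app/public" "$PROJECT/public/storage"
```

After this, files saved to storage/app/public are reachable under the /storage URL prefix, which is all the artisan command does.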
You can follow this code; I think it will be helpful for you. It uploads the file to a directory and saves the file name to the database using Laravel:
<?php
public static function upload_file(Request $userdata)
{
    // Temporary path and original client-side name of the upload
    $imageTempName = $userdata->file('image_media')->getPathname();
    $imageName = $userdata->file('image_media')->getClientOriginalName();

    // Move the upload into public/uploads/images/
    $path = base_path() . '/public/uploads/images/';
    $userdata->file('image_media')->move($path, $imageName);

    // Persist the stored file name via the model
    $data = array('media' => $imageName);
    $insert = with(new UserModel)->uploadFile($data);
}
?>
I am trying to convert an Excel file to PDF (Base64).
This is the code that converts the Excel to PDF:
$spreadsheet = $this->objPHPExcel = \PhpOffice\PhpSpreadsheet\IOFactory::load("MyExcelFile.xlsx");
$class = \PhpOffice\PhpSpreadsheet\Writer\Pdf\Mpdf::class;
\PhpOffice\PhpSpreadsheet\IOFactory::registerWriter('Pdf', $class);
$this->objPHPExcel = \PhpOffice\PhpSpreadsheet\IOFactory::createWriter($spreadsheet, 'Pdf');
$this->objPHPExcel->writeAllSheets();
// Save the file (this is where the error occurs)
$this->objPHPExcel->save(storage_path() . '/app/temp_files/' . $newFileName);
Everything works locally, but whenever I try to run the same code on my Laravel Forge server, I get the error below:
unlink(/tmp/imagick-3.4.0.tgz): Operation not permitted
If I trace the error, it is in this specific line:
$this->objPHPExcel->save(storage_path() . '/app/temp_files/' . $newFileName);
As mentioned, this code runs fine locally, and the temp file $newFileName is created inside my /temp_files folder.
What am I doing wrong?
OK, so the solution to this was rather tricky. I found out that it had nothing to do with PhpSpreadsheet, but rather with Mpdf.
The problem was that the file imagick-3.4.0.tgz had its permissions set to read-only, meaning that unlink() could not remove it. This goes all the way back to when I first installed the imagick library.
The solution was to go to the /tmp folder and delete the imagick-3.4.0.tgz file manually. This file should actually be deleted during the imagick installation.
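For anyone hitting the same thing, the manual cleanup amounts to the following (the exact path comes from the error message; sudo may be needed if the file is owned by another user):

```shell
# Check permissions/ownership of the stale archive the imagick install left behind
ls -l /tmp/imagick-3.4.0.tgz
# Remove it so unlink() no longer trips over it
sudo rm /tmp/imagick-3.4.0.tgz
```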
I am trying to upload a file to public/attachments/foo.jpg in Laravel using storeAs(). It works correctly on Ubuntu but not on Windows.
if ($isValidated) {
    $newFileName = '';
    foreach ($files as $upload) {
        // Replace whitespace in the original name with underscores
        $fileName = preg_replace('/\s+/', '_', pathinfo($upload->getClientOriginalName())['filename']);
        $newFileName = $fileName . '_' . $upload->uploadTime . '.' . $upload->getClientOriginalExtension();
        $upload->storeAs('public/attachments', $newFileName);
    }
}
This block of code successfully uploads a file to /public/attachments/foo.jpg.
But when I try this on Windows, I get an error saying fopen ... failed to open stream: Invalid argument.
I have attached a screenshot of the error.
NOTE:
I have added the symlink like so: php artisan storage:link
Using Laravel 5.4
The problem is in the name of the file: it contains a colon (:), which is not allowed in Windows file names.
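A minimal sketch of the fix, assuming the colon comes from a time string appended to the name (the question doesn't show where uploadTime comes from, so the example value here is hypothetical):

```php
<?php
// Replace characters that Windows forbids in file names
// (\ / : * ? " < > |) with a dash before storing.
$uploadTime  = '2017-06-01 12:30:45';  // example value with colons
$safeTime    = preg_replace('/[\\\\\/:\*\?"<>\|\s]+/', '-', $uploadTime);
$newFileName = $fileName . '_' . $safeTime . '.' . $upload->getClientOriginalExtension();
```

With the colons replaced, the same storeAs() call works on both Linux and Windows.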
This test is on localhost (Windows/PHP/Apache/MySQL).
For example, with a path relative to the root of the public folder:
$path = '/upload/images/accounts/12005/thumbnail_profile_eBCeBawXWP.jpg';
I want to use it like this:
$img->save($path);
I get this error:
"Can't write image data to path
(/upload/images/accounts/12005/thumbnail_profile_eBCeBawXWP.jpg)"
I tried to fix it:
$path = public_path($path);
$img->save($path);
So I get this error:
"Can't write image data to path
(D:\Server\data\htdocs\laravel\jordankala.com\public_html\ /upload/images/accounts/12005/thumbnail_profile_pH657T62fl.jpg)
Probably on a real (Linux) server I won't have this problem and the code works.
How can I handle this error? (I want it to work both on my Windows localhost and on a real Linux server.)
You can use this to save the file to your path:
$file = $request->file('image_field_name');
$destinationPath = 'upload/images/accounts/12005/';
// Move the file to the given destination path
$uploadedFile = $file->move($destinationPath, $file->getClientOriginalName());
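Alternatively, if you want to keep the $img->save() call from the question (an Intervention Image-style call), the error suggests the leading slash is breaking the path join on Windows. A sketch of building the absolute path portably, assuming the target directory already exists and is writable:

```php
<?php
// Drop the leading slash so the public path and the relative path
// join cleanly; DIRECTORY_SEPARATOR keeps it portable on Windows.
$relative = 'upload/images/accounts/12005/thumbnail_profile_eBCeBawXWP.jpg';
$absolute = rtrim(public_path(), '/\\') . DIRECTORY_SEPARATOR . $relative;
$img->save($absolute);
```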
I am trying to copy a folder that is already on S3 and save it under a different name on S3, in Laravel 5.4. What I have found so far is that I can copy an image, not a folder. I have tried to copy the folder like this:
$disk->copy("admin/form/$old_form", "admin/form/$new_form");
But it does not work like that; it gives me an error. Do I need to loop and fetch each folder item separately? Like:
$images = $disk->allFiles("admin/form/$id");
Or is there a workaround available in Laravel or the S3 API itself?
Please help, it's driving me crazy.
Thanks in advance.
I'm in the middle of doing this same thing. Based on what I've read so far, copying a directory itself using Laravel doesn't seem to be possible; the suggestions I've seen involve looping through and copying each file. I'm not at all satisfied with the speed, though, since I'm doing this on lots of images several times a day.
Note that I'm only using the FilesystemManager directly like this so I can more easily access the methods in PhpStorm; $s3 = \Storage::disk('s3'); would accomplish the same thing as my first two lines. I'll update this answer if I find anything that works faster.
$filesystem = new FilesystemManager(app());
$s3 = $filesystem->disk('s3');

$images = $s3->allFiles('old-folder');

// copy() throws an exception if the target file already exists,
// so delete the entire destination folder first to simplify things.
$s3->deleteDirectory('new-folder');

foreach ($images as $image) {
    $new_loc = str_replace('old-folder', 'new-folder', $image);
    $s3->copy($image, $new_loc);
}
Another option:

$files = Storage::disk('s3')->allFiles("old/location");

foreach ($files as $file) {
    $copied_file = str_replace("old/location", "new/location", $file);
    if (!Storage::disk('s3')->exists($copied_file)) {
        Storage::disk('s3')->copy($file, $copied_file);
    }
}
This can all be done from the CLI:
Install and configure s3cmd:
sudo apt install s3cmd
s3cmd --configure
Then you can:
s3cmd sync --delete-removed path/to/folder s3://bucket-name/path/to/folder/
To make the files and folder public:
s3cmd setacl s3://bucket-name/path/to/folder/ --acl-public --recursive
Further reading: https://s3tools.org/s3cmd-sync
I've found that a faster way to do this is by utilizing the AWS command line tools, specifically the aws s3 sync command.
Once it is installed on your system, you can invoke it from within your Laravel project using shell_exec(). Example:
$source = 's3://abc';
$destination = 's3://xyz';
shell_exec('aws s3 sync ' . $source . ' ' . $destination);
If you set your AWS_KEY and AWS_SECRET in your .env file, the aws command will refer to these values when invoked from within Laravel.
I have the AWS SDK for PHP installed in /usr/share/php/. When I execute the sample files from that directory, they work fine.
My web root directory is /var/www/. When I execute sample files from this directory, they do not work.
Here is a sample file */var/www/test_sdk.php*:
<?php
require_once 'sdk.class.php';

$s3 = new AmazonS3();
$bucket = 'test_bucket' . time();
$response = $s3->create_bucket($bucket, AmazonS3::REGION_US_W1, AmazonS3::ACL_PUBLIC);

if ((int) $response->isOK()) {
    echo 'Created Bucket';
} else {
    echo (string) $response->body->Message;
}
?>
I think the problem is in my require_once statement. I have tried:
require_once '/usr/share/php/AWSSDKforPHP/sdk.class.php';
It didn't work.
Any help is greatly appreciated!
Is /usr/share/php/ in your PHP include path? Have you tried simply installing via PEAR?
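A quick way to check is to print the effective include path; note that the CLI and the Apache SAPI can be configured differently, so it's worth checking from a web-served script too. A sketch:

```php
<?php
// Print the include path as seen by the SAPI running this script.
echo get_include_path(), PHP_EOL;
```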
/usr/share/php/ is already in my PHP include path. This was the only change I needed to make to get it working:
require_once 'AWSSDKforPHP/sdk.class.php';