I am working on a mobile application that uploads a file to my Linux server. The API is built with Laravel and lives on the same Linux server, inside the /var/www/html directory.
I want the image uploaded from the mobile application to be stored inside the /home/uploads folder. For that I have set FILESYSTEM_DRIVER=public and FILESYSTEM_PATH=/home/uploads in my Laravel project.
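For reference, FILESYSTEM_PATH is not a stock Laravel setting, so a custom disk root in config/filesystems.php presumably reads it. A minimal sketch of such a disk (the 'uploads' disk name and the env lookup here are assumptions, not taken from the project):

// config/filesystems.php (sketch)
'disks' => [
    'uploads' => [
        'driver' => 'local',
        'root'   => env('FILESYSTEM_PATH', '/home/uploads'),
    ],
],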
Now I have made a function inside my controller to upload the file, shown below:
public function uploadFile(Request $request)
{
    if ($request->hasFile('file')) {
        Storage::putFileAs('Documents/', $request->file, 'Some name.jpg');
    }
}
When I run the above code directly from my Laravel project, the file gets uploaded to /home/uploads/Documents/Some name.jpg, but when I hit the same function using Postman, I get a success message yet see no file inside the folder. What can be the cause of this issue?
Below is a screenshot of how I am sending the file through the API using Postman.
Help appreciated.
Maybe it is not a Laravel problem but a Linux system misconfiguration; try setting the www folder permissions recursively:
chmod -R 777 /var/www
Indeed, if Laravel does not return an error, it is possible that the problem is at a lower level.
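Another way to narrow it down is to have the controller report explicitly what it received and where it wrote, so a silent failure shows up in the Postman response. A minimal sketch reusing the names from the question (not the original code):

// assumes: use Illuminate\Support\Facades\Storage; at the top of the controller
public function uploadFile(Request $request)
{
    if (! $request->hasFile('file')) {
        // Postman must send the file under the form-data key "file"
        return response()->json(['error' => 'no file received'], 422);
    }

    // putFileAs() returns the stored path relative to the disk root, or false on failure
    $path = Storage::putFileAs('Documents', $request->file('file'), 'Some name.jpg');

    return response()->json(['stored_at' => $path]);
}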
I have created an application and deployed it on a server. It also has functionality for generating QR codes and saving them through the storage symlink.
Here is my server layout: all the Laravel source code is in a local directory, and the other files come from the public directory.
How can I create the symlink in the server folder? What's the workaround?
Here's an example of doing it by calling a PHP script:
symlink('/home/local/storage/app/public', 'storage');
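A slightly fuller version of the same idea, as a one-off script dropped into the public folder; the paths below are placeholders, adjust them to your own layout:

<?php
// link.php (sketch) - creates the public-facing symlink on a server without shell access
$target = '/home/local/storage/app/public'; // where the uploaded files actually live
$link   = __DIR__ . '/storage';             // the symlink the public URLs point at

if (is_link($link)) {
    unlink($link); // remove a stale link copied over from another machine
}

echo symlink($target, $link) ? 'Symlink created' : 'Failed to create symlink';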
I have Apache/2.4.6 running on a RedHat server. I am running a Slim application at the root of the web server in /var/www/html/.
I want to upload an image and save it to a specified folder. The app runs perfectly fine in my local Apache environment.
Here is my simple controller function:
public function test($request, $response, $args)
{
    $files = $request->getUploadedFiles();
    if (empty($files['file'])) {
        return $response->withJson(['message' => 'Could not find file'])->withStatus(400);
    }
    $file = $files['file'];
    $fileName = $file->getClientFilename();
    // app and images folders both have 777 permissions
    $file->moveTo("/var/www/html/app/images/{$fileName}");
    return $response->withStatus(200)->write('Test Method');
}
I can echo the file name and see that it is there; it is only when it moves the file that I get an unwritable path error. I don't understand why.
This was an issue caused by SELinux (Security-Enhanced Linux), which comes activated on RedHat. Apparently SELinux does not allow uploads to any destination by default, regardless of folder permissions, and you have to allow this manually. I was finally able to upload using the following command:
chcon -R unconfined_u:object_r:httpd_sys_rw_content_t:s0 /var/www/app/images
I don't know much about SELinux, so unfortunately I can't add more detail to this other than that this solution worked for me.
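Independent of SELinux, wrapping the move in a try/catch makes this kind of failure easier to diagnose, since Slim's moveTo() throws a RuntimeException when the move fails. A sketch reusing the paths and variables from the question:

try {
    $file->moveTo("/var/www/html/app/images/{$fileName}");
} catch (\RuntimeException $e) {
    // surface the reason (e.g. an unwritable path) in the API response
    return $response->withJson(['message' => $e->getMessage()])->withStatus(500);
}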
I am working on a Laravel project. I am uploading images via the admin panel, which are stored in the storage directory. When I try to access the images, it works fine on localhost, i.e.
the image is accessible at the following URL on localhost:
http://localhost:8000/storage/default3.png
But when I try to access the same image on the live server:
http://13.57.71.20/serio/storage/default3.png
it doesn't work.
NOTE: serio is the folder where the project is uploaded on the server.
But it does work if I try the following way:
http://13.57.71.20/serio/storage/app/public/default3.png
I also tried to re-link the storage directory by running the following command:
php artisan storage:link
but nothing helped.
Change the following line in the .env file:
APP_URL=http://13.57.71.20/serio
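With APP_URL pointing at the subdirectory, the URLs Laravel generates for files on the public disk pick it up, assuming the default configuration where the disk's 'url' is env('APP_URL').'/storage'. For example (a sketch):

use Illuminate\Support\Facades\Storage;

// with APP_URL=http://13.57.71.20/serio this yields
// http://13.57.71.20/serio/storage/default3.png
echo Storage::url('default3.png');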
The problem, it seems to me, is with the production URL. You are running it in production at:
http://13.57.71.20/serio/
It seems the directory alias serio is not properly configured for Apache/Nginx. Try running the app under http://13.57.71.20/, or add an alias for serio to the default host configuration, mapped to your Laravel app's public folder.
I have developed a Laravel project in Laravel 5.4 on XAMPP (localhost).
On localhost I used php artisan storage:link to link storage/app/public/images with public/storage in order to access images. It works 100% correctly on localhost, but when I uploaded the project to the hosting server I lost access to those images. I don't know how to fix it; please, someone help me make it work.
First delete the symlink created from localhost (if you have uploaded it to the server).
Create a blank PHP file (like link.php) and paste the code below into it:
<?php
symlink('/home/**domain**/storage/app/public', '/home/**domain**/public_html/tests/storage');
In the above code, don't forget to replace domain with the name of the root folder of your domain.
Then upload the file to your public folder (like public_html).
Then open your_domain_name/link.php in your browser.
Hope it will work
I created an application using CodeIgniter and it works fine in my local environment, which is XAMPP.
After uploading the application folders and files to a Linux server, the application is not working at all.
I deployed the complete files under /var/www/html/ and when I access it from the browser it gives me the CI error "404 Page Not Found".
After uploading, I updated the .htaccess file.
I think this is about Linux file permissions. You might want to check the file permissions.
Try changing the permissions so the directory is readable, writable and executable by you ONLY, for example:
sudo chmod 700 /var/www/html