I am stuck on this part of my Laravel application, where I need to protect files from being accessed directly via a URL hit in the browser.
I have a public folder in which a doc folder is present, where all the documents are uploaded. I need a solution where I can access the docs from my application, but no third-party visitor can view them (images, PDFs, etc.).
I have tried many solutions, but none of them work.
I just want two things:
1. Protect my docs from direct access.
2. A way of implementing it in Laravel (via .htaccess).
I know this should be possible through .htaccess, but how?
Kindly help, please :)
Add a .htaccess file to your upload folder with this content:
Deny from all
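Note that Deny from all is Apache 2.2 syntax; Apache 2.4 uses Require all denied instead. A minimal sketch of the file, assuming your docs live in public/doc (the path is an assumption):

    # public/doc/.htaccess
    # Apache 2.4:
    Require all denied

    # Apache 2.2:
    # Deny from all

Keep in mind this blocks every request to the folder, including ones from your own pages, so you will still need a PHP route that serves the files to authorised users.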
There are three approaches I can think of just now:
1. You intercept all image and video requests with Laravel, then, using the router, serve up the content the user was after, provided they are authorised. THIS WILL BE SLOW! (See the sketch after this list.)
2. You rely on obscurity and put all that client's images, videos, etc. in a folder that has a long, unguessable random URL. You can then link to the content in your code using the 'static' folder name. The customer's content will always be in that folder and accessible whether they are logged in or not. The advantage compared to (1) is that your framework does not have to boot for every image or video.
3. Have all the content hidden away, possibly in the storage folder. When the user logs in, create a temporary symbolic link between their public folder and their folder in storage. Keep a note of the link in the session. Use that link in all galleries etc. rather than the static path used in (2) above. Once they log out the link will no longer be valid; you can delete the symbolic link on logout or have a job tidy them up periodically.
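A minimal sketch of approach (1), assuming the docs are moved to storage/app/docs and Laravel's stock auth middleware is in use (both assumptions):

    // routes/web.php -- assumes the files were moved out of public/ into storage/app/docs
    use Illuminate\Support\Facades\Route;
    use Illuminate\Support\Facades\Storage;

    Route::get('/docs/{filename}', function (string $filename) {
        // basename() strips any ../ so the route cannot be used for path traversal
        $path = 'docs/' . basename($filename);

        abort_unless(Storage::exists($path), 404);

        // Streams the file back with the correct Content-Type header
        return Storage::response($path);
    })->middleware('auth');

The middleware('auth') call is what turns away anonymous visitors; everything else is just locating and streaming the file.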
Example:
If we hit the link of any document, we can easily download it. For example, if backup.sql is inside the backup folder of the website, we can download it by hitting the URL www.example.com/backup/backup.sql.
I don't know which type of document the client will store there, but it can obviously be confidential, which is why it should not be shareable to all.
What I am working on:
I am creating a document management tool where we can upload a document, download the document, and assign users who can download that document. While building it, I realized that anyone can brute-force that folder just by hitting URLs with random names: backup.sql, database.sql, and so on. I am using the URL myself to make the document downloadable; should I go with file_get_contents() instead?
I want to know if there is a way to download the file in a secure way, for example so that only a user who is logged in to my website can download the file.
Via .htaccess or something else, I want to block the files directory from outside access, so that only logged-in users can download the files and nobody can brute-force it. I know I can block it via .htaccess, but I want the downloads to keep working too, just only for the users of my website.
Maybe you should use "X-Accel-Redirect" for Nginx, and "X-Sendfile" for Apache (via the mod_xsendfile module).
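A minimal sketch of the X-Sendfile variant, assuming Apache with mod_xsendfile enabled, documents stored outside the webroot in /var/docs, and a simple session check (all assumptions):

    <?php
    // download.php?file=report.pdf -- /var/docs is outside the webroot
    session_start();

    if (empty($_SESSION['user_id'])) {
        http_response_code(403);
        exit('Forbidden');
    }

    // basename() prevents ../ path traversal
    $file = '/var/docs/' . basename($_GET['file'] ?? '');

    if (!is_file($file)) {
        http_response_code(404);
        exit('Not found');
    }

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    // mod_xsendfile intercepts this header and streams the file itself,
    // so PHP never has to hold the file in memory:
    header('X-Sendfile: ' . $file);
    // On Nginx you would map an internal location and send instead:
    // header('X-Accel-Redirect: /protected/' . basename($file));
    exit;

The advantage over readfile() is that the web server does the actual streaming, which is faster and frees the PHP worker immediately.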
I have a folder full of images on my server that my mobile app accesses:
www.mysite.com/images/image001.jpg
Whoever has this link can now access the files, and can also figure out that the images are in a certain order and thus guess the pattern, etc.
The image links are obtained via PHP inside the app, which uses a token to verify that the user is legitimate and that the request is indeed coming from a mobile device that has downloaded the app.
What I want to do is secure the folder from external access, preventing people from browsing it and seeing everything, and limit access to the PHP file only.
I have used the .htaccess trick with Deny from all so that it shows a forbidden message whenever someone visits from the web; however, now none of my JSON requests work either.
What can I do to accomplish this?
You will have to serve the images with a PHP script that also checks that access is permitted.
Once you've done this, you can simply store the images outside the web root, which makes them inaccessible from the web except through the PHP script that serves them.
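A minimal sketch, assuming the images live in /var/app/images outside the web root and that a hypothetical is_valid_token() stands in for your existing token check (all names here are assumptions):

    <?php
    // image.php?token=...&name=image001.jpg
    function is_valid_token(?string $token): bool {
        // Assumption: $expected comes from wherever you store issued tokens;
        // hash_equals() avoids timing attacks on the comparison
        $expected = 'replace-with-your-token-lookup';
        return $token !== null && hash_equals($expected, $token);
    }

    if (!is_valid_token($_GET['token'] ?? null)) {
        http_response_code(403);
        exit;
    }

    // basename() prevents ../ path traversal
    $file = '/var/app/images/' . basename($_GET['name'] ?? '');

    if (!is_file($file)) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: ' . mime_content_type($file));
    header('Content-Length: ' . filesize($file));
    readfile($file);

Your JSON endpoints keep working because only the image folder moves out of the webroot; the PHP files stay where they are.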
The best option is to make the picture links randomized and unguessable,
so a picture link would look like this:
www.mysite.com/images/8Md9FhD1hANdIBUz4WVCzKR227fykTByq6SKHas5FyYJDr2EjAlIn1bS0f5gPJih.jpg
YouTube uses this method for "private" videos.
Users/bots can't access the files randomly, and you can't guess the next picture's name.
Display the link once the user is authenticated. The worst thing that can happen is that this user shares the link (they can download and share it no matter what you do).
When you save the picture on your server, just randomize the name, as in the sketch below.
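A minimal sketch of randomizing the name at upload time with PHP's random_bytes() (the form field name and target folder are assumptions):

    <?php
    // upload.php -- assumes a file input named "picture"
    $ext = strtolower(pathinfo($_FILES['picture']['name'], PATHINFO_EXTENSION));

    // 32 random bytes -> 64 hex characters, effectively unguessable
    $name = bin2hex(random_bytes(32)) . '.' . $ext;

    move_uploaded_file($_FILES['picture']['tmp_name'], __DIR__ . '/images/' . $name);

    // Store $name in the database and only hand it out to authenticated users

In production you would also whitelist the extension before moving the file, but the key point is that the final URL is impossible to enumerate.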
I am creating a website similar to Dropbox. My logic behind the project is that I am going to create one table that includes username, password, and a unique ID.
Then I will create a folder named after that unique code, and I will store that particular person's files, like video, mp3, and txt, in that folder. Now my question is: how do I restrict other users from entering that folder (because anyone can access the folder by directly entering the URL)?
Also suggest any other logic that is more efficient. I am working on a mini project.
I believe that Google Drive (and probably Dropbox too) uses HTTPS behind the scenes. In that case you simply need to make sure that your PHP/ASP files only let the logged-on user access his/her files. It all depends on how you are building your cloud platform. You could also use scp or ssh; in that case your server automatically directs the client's command to his/her own files.
You would need to create a controller that handles access to files. Do not link directly to the files; for example, if you pass your arguments as /myFolder/myImg.jpg, the controller would take the logged-in user's unique_id and the path as arguments and then build the real path itself:
2323-2332-a51df/myFolder/myImg.jpg
The idea is that the unique ID serves as the base path, and your controller handles all file access. This way you don't have to chmod 777 anything. Your controller will have access only to the folders you require, and everything remains within your PHP settings. No need to worry about somebody trying to access any system folders.
After that, you just need to load the file contents and return them with the appropriate MIME type.
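A minimal sketch of that controller in plain PHP, assuming the files live under /var/app/userfiles (outside the webroot) and the unique ID sits in the session (both assumptions):

    <?php
    // file.php?path=myFolder/myImg.jpg
    session_start();

    if (empty($_SESSION['unique_id'])) {
        http_response_code(403);
        exit;
    }

    $storageRoot = '/var/app/userfiles';
    $base = $storageRoot . '/' . $_SESSION['unique_id'];
    // e.g. /var/app/userfiles/2323-2332-a51df/myFolder/myImg.jpg
    $requested = $base . '/' . ($_GET['path'] ?? '');

    // realpath() resolves any ../ so we can verify the file stays inside the user's folder
    $real     = realpath($requested);
    $realBase = realpath($base);

    if ($real === false || $realBase === false || strpos($real, $realBase) !== 0) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: ' . mime_content_type($real));
    readfile($real);

The strpos() check against the resolved base path is what keeps one user out of another user's folder.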
In my Symfony 2.3 project, I have a frontend website and a backend. The backend is secured by the security.yml file, and only a ROLE_ADMIN user can access the backend.
What I want now is that only admin users can download the PDF files stored in assets.
Is there a way to do this?
Right now, all visitors can access my PDF files by URL.
Do I have to move these PDFs to another folder? Or use a .htaccess maybe?
You need to store these files in a location that is not directly accessible through your webserver (i.e. not in the web folder or one of its subfolders).
Then create a controller/action that checks for the permission to download (i.e. a certain user role) before serving the file.
Read the documentation chapter Serving Files for a quick overview of how you can serve files in Symfony2.
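A minimal sketch of such an action for Symfony 2.3, using BinaryFileResponse (the bundle name and storage path are assumptions):

    <?php
    // src/Acme/AdminBundle/Controller/DownloadController.php -- hypothetical bundle
    namespace Acme\AdminBundle\Controller;

    use Symfony\Bundle\FrameworkBundle\Controller\Controller;
    use Symfony\Component\HttpFoundation\BinaryFileResponse;
    use Symfony\Component\Security\Core\Exception\AccessDeniedException;

    class DownloadController extends Controller
    {
        public function pdfAction($filename)
        {
            // In Symfony 2.3 the role check goes through the security.context service
            if (!$this->get('security.context')->isGranted('ROLE_ADMIN')) {
                throw new AccessDeniedException();
            }

            // Assumption: the PDFs were moved out of web/ into app/Resources/pdf
            $path = $this->container->getParameter('kernel.root_dir')
                  . '/Resources/pdf/' . basename($filename);

            return new BinaryFileResponse($path);
        }
    }

BinaryFileResponse takes care of the Content-Type, Content-Length, and range-request headers for you.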
Add a role, something like ROLE_ACCESS_PRIVATE_ASSETS, and assign it to the admin user. Then check for that permission in the code like any other.
Edit: This is assuming, of course, that a controller stands in the way of these files.
I'm working in PHP and want to make directories for each user where I would store their uploaded images. Only the user should be able to access his/her directory and the respective images.
I couldn't find much documentation on how to do this, namely creating the directories, the best place to put them on my server, how to link them to a user, and how to make them private. Please share your guidance on where to start. What is the standard practice?
You should consider putting files with restricted access outside the public webroot folder and serving them via PHP, which lets you check the user's credentials first.
(see Fastest Way to Serve a File Using PHP)
That way you might not need one directory per user.
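A minimal sketch of that idea, where ownership is tracked in a database row instead of per-user directories (the table layout and connection details are assumptions):

    <?php
    // serve.php?id=123 -- assumes a table `files` with columns id, owner_id, path
    session_start();

    if (empty($_SESSION['user_id'])) {
        http_response_code(403);
        exit;
    }

    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'secret');
    $stmt = $pdo->prepare('SELECT path FROM files WHERE id = ? AND owner_id = ?');
    $stmt->execute([$_GET['id'] ?? 0, $_SESSION['user_id']]);
    $path = $stmt->fetchColumn();

    // /var/app/uploads sits outside the webroot, so this script is the only way in
    if ($path === false || !is_file('/var/app/uploads/' . $path)) {
        http_response_code(404);
        exit;
    }

    $file = '/var/app/uploads/' . $path;
    header('Content-Type: ' . mime_content_type($file));
    header('Content-Length: ' . filesize($file));
    readfile($file);

Because the query ties the file to the logged-in owner, a user can never fetch someone else's file, no matter what ID they guess.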