I'm working on a project using PHP (Laravel Framework). This project allows logged-in users to store files and folders that only they can access.
I've set up user authentication such that a unique user token is required to access the user's data.
When a user stores/creates a folder, the folder gets stored on the server under a path prepended with a random 32-character string, such as /WijiIHSIUhoij99U8EJIOJI09DJOIJob/username/folder. That path is stored in a MySQL database for the user to fetch through an API, so long as they have the correct access token for their account. I did this because it seemed more secure; correct me if I'm wrong.
This is half the job (please correct me if what I've done so far is wrong or bad practice).
What I need now is to make sure nobody can access the root folder / and get a list of user folders.
But I fear if I restrict access to the root folder, users will no longer be able to access their folders.
I don't know where to even start looking to find a solution, so anything, even if it's just to get me started will help.
The random string is redundant, as your token already does the randomization for you. If users know that their stuff is stored in /RANDOMSTRING/folder, someone will build a bot to try them all anyhow, but the idea is close to what you want, I think.
Instead, try creating a script that catches all 404s and checks whether they match RANDOM_STRING/user/folder. If so, you can have the random string in the database map to a universal folder, something like /user_files/username/their_folder. This way you aren't exposing your folder structure, and you can add a middleware to count how many times they've unsuccessfully tried a URL, then ban their client if it's more than 3, for instance.
In this case, the script delivers the files (or the folder) as a download, or as a screen without the actual folder structure visible. The files can be stored anywhere, and the database correlates the URL entered with an actual file or folder somewhere. (Really, we're trying to mitigate someone gaining access through brute force here.)
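A minimal sketch of that idea in Laravel, assuming a hypothetical user_folders table that maps the public random path to the real location, and using the cache for the failed-attempt counter (all names here are made up for illustration):

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::middleware('auth')->get('/{random}/{username}/{folder}', function ($random, $username, $folder) {
    $key = 'failed-lookups:' . request()->ip();

    // refuse clients that have already guessed wrong 3 times
    abort_if(Cache::get($key, 0) >= 3, 429, 'Too many failed attempts.');

    // the database maps the public random path to the real folder under /user_files
    $record = DB::table('user_folders')
        ->where('public_path', "/{$random}/{$username}/{$folder}")
        ->where('user_id', auth()->id())
        ->first();

    if ($record === null) {
        Cache::add($key, 0, now()->addHour()); // start a counter with a one-hour lifetime
        Cache::increment($key);
        abort(404);
    }

    // stream the content back without ever exposing the real folder structure
    return Storage::download($record->real_path);
});

The counter here is keyed per IP; you could just as well key it per access token.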
From there, I would look into using a different service to store the files. This helps with two things that you can read up on. First, you can choose something like AWS cold storage, which will cost you less, and second, you can take the storage off the computational server, so that if there is a security breach you know the attacker only had access to either the user files or the code files. This will help you in the long run (although if they gain access to your code files, they will likely be able to locate your storage location and password... this is really to protect your code files).
Beyond that, start looking into read and write permissions and user groups. PHP, the OS, the users, and the admins should all have their own permissions on the storage server to make sure they can't do certain things with the files. Moreover, you don't want someone uploading a malicious file and having it execute.
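On the upload side, a small sketch of keeping uploaded files outside the web root and non-executable (paths and variable names are illustrative):

// hypothetical upload handler: the storage root lives outside the web root
$storageRoot = '/srv/user_files/' . $userId;

if (!is_dir($storageRoot)) {
    mkdir($storageRoot, 0750, true);  // group-readable at most, nothing for "other"
}

$dest = $storageRoot . '/' . bin2hex(random_bytes(16));
move_uploaded_file($_FILES['upload']['tmp_name'], $dest);
chmod($dest, 0640);                   // readable by the web user, never executable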
This is just a primer, but it'll lead you to a LOT of information, I'm sure.
You have to make a distinction between public and private files.
Public files
Public files can be accessed, you guessed it, by everyone. They are images, logos, documentation... that kind of file. If you have the URL of one of these files, you can download it.
Private files
Things are a little more complicated for private files. These files should not, in any case, be downloadable without authentication. In other words, only the user who owns these files (and often the admins too) should be able to download them. For instance, these files could be an invoice for your order, a paid media item...
For the directory structure, you are free to choose; it's up to you. The only important thing is to prevent these files from being publicly accessible, so you MUST NOT put them anywhere in the public folder (or the linked storage/app/public).
By default, Laravel expects these files to be in storage/app. For instance, it could be storage/app/users/1451 for the user with id 1451. It's not important to randomize these paths, since these files are not accessible directly (they are NOT in the public path).
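For example, storing an uploaded invoice under storage/app might look like this (the users/{id} layout and the invoices() relation are just assumptions for the sketch):

use Illuminate\Http\Request;

class InvoiceUploadController extends Controller
{
    public function store(Request $request)
    {
        // the default "local" disk points at storage/app, which is not publicly served
        $path = $request->file('invoice')->store('users/' . $request->user()->id);

        // $path is something like "users/1451/aGeneratedName.pdf"; keep it for later retrieval
        $request->user()->invoices()->create(['file_path' => $path]);

        return back();
    }
}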
So you may ask: how can a user download them? The answer is to create an authenticated route and check the user's authorization before sending the file back in the response.
You can do something like:
Route::get('invoices/{invoice}/download', [InvoiceController::class, 'download']);
Then, you apply a policy to this route, where you check that the authenticated user is able to download the file (like you would do for any other authenticated routes).
If they can, well, perfect, just return the file in the response: return response()->download($path_to_the_file);. If they can't, they will get a 403 Forbidden.
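A rough sketch of the controller side, assuming a hypothetical Invoice model with a file_path column and an InvoicePolicy that defines a download ability:

use App\Models\Invoice;

class InvoiceController extends Controller
{
    public function download(Invoice $invoice)
    {
        // throws a 403 unless the InvoicePolicy "download" ability passes for the current user
        $this->authorize('download', $invoice);

        // the file lives under storage/app, outside the public directory
        return response()->download(storage_path('app/' . $invoice->file_path));
    }
}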
Related
I have a framework that I've written. I created a package for said framework that allows me to track my employees' hours and gives them a place to dump files for the accountant (invoices, etc.).
The problem is that these file dumps are accessible through the browser. I could use a .htaccess file to prevent the files from being served up at all, but the problem is that I would like the accountant and the employee to be able to download their files.
A solution might be to read the files with PHP, create a temporary copy, have the user or accountant download that copy, then delete the copy...but this poses two problems.
1) It's going to be time and resource intensive...especially for large files.
2) For whatever short amount of time, there will be a copy of the file which is accessible to whoever knows the URL.
The other solution would be to put the files outside of the public folder, but the problem is that I would like this package to be portable, and a lot of my servers are shared.
What method could I use to be able to serve the files only when authenticated, and avoid the flaws I described above?
I have a config.php file for my website that I store outside of the publicly accessible web directory. It contains the password for a Gmail account I use to send mail for the site, as well as my database connection credentials. I don't like the idea of saving the password as a plaintext variable and was looking for some way to have this data stored more securely. Beyond blocking read access to the directory from users other than me, what can I do to secure this information?
You will end up saving it in plain text most of the time. Say, for example, you want to encrypt it: then the key to decrypt it will have to be saved in plain text, and so on. So you're better off making sure your server is secure.
It depends on what you are trying to protect against.
Keeping your file outside the htdocs (webroot) directory is generally a good idea, so that the file can't be requested explicitly from outside (i.e. by pointing the browser at it).
If you want to protect the $var from within your code (i.e. third party malicious code) you can always unset the variables after they are consumed, although I don't know if that makes much difference.
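It won't stop a determined attacker, but dropping the credentials from scope once they've been used is cheap. Something like (paths and array keys are made up):

// config.php lives outside the web root and simply returns an array
$config = require '/home/me/private/config.php';

$db = new PDO($config['db_dsn'], $config['db_user'], $config['db_pass']);

// the credentials have been consumed; drop them from the running script
unset($config);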
If you want to protect the file from someone that might "hack" into your server, there isn't much you can do. You can always set the file permissions so that only www-data (your apache user) can read it but if someone gains root access to your machine, you are pretty much screwed.
Anyway, if your server is safe (no remote root access, or only through ssh with public/private keys; you don't access your server from public PCs, etc.), you don't use third-party code without inspecting it first, and you store the password file outside the webroot directory, I think you're as safe as you can be.
Background: I have a website where people can store transactions. As part of this transaction, they could attached a receipt if they wanted.
Question: Is there any security risk if a user is allowed to upload any type of file extension to my website?
Info:
The user will be the only person to ever re-download the same file
There will be no opportunity for the user to "run" the file
They will only be able to download it back to themselves.
No other user will ever have access to another users files
There will be a size restriction on the file (say 2 MB)
More info: I was originally going to restrict the files to "pdf/doc/docx" - but then realised some people might want to store a jpg, or a .xls etc - and realised the list of files they "might" want to store is quite large...
edit: The file will be stored outside public_html - and served via a "readfile()" function that accepts a filename (not a path) - so is there anything that can 'upset' readfile()?
Yes, it is definitely a security risk unless you take precautions. Let's say, to re-download the file, the user has to go to example.com/uploads/{filename}. The user could upload a malicious PHP file, and then 're-download' it by going to example.com/uploads/malicious.php. This would, of course, cause the PHP script to execute on your server, giving them enough power to completely wreck everything.
To prevent this, create a page that receives the filename as a parameter and then serves the file to the user with the correct content type.
Something like, example.com/files?filename=malicious.php
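A hedged sketch of such a handler (paths and names are made up); one safe variant is to force a download rather than guessing a content type, so the browser never renders or executes what was uploaded:

// download.php?file=receipt.pdf (a hypothetical endpoint)
session_start();

if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

// basename() strips any ../ tricks; files live outside the web root, one folder per user
$name = basename($_GET['file'] ?? '');
$path = '/var/uploads/' . (int) $_SESSION['user_id'] . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

// force a download so the browser never tries to render or execute the content
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);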
"There will be no opportunity for the user to "run" the file"
As long as you are 100% sure that that will hold true, it is secure. However, make sure the file will not be able to be executed by the webserver. For example, if the user uploads a .php file, make sure the server does not execute it.
Computers don't run programs magically by themselves, so basically you just need to ensure that the user has no ability to trick your server into running the file. This means making sure the proper handlers are disabled if the files are under the web root, or passing them through a proxy script if they are not (basically echo file_get_contents('/path/to/upload') with some other logic)
Another option would be to store the file like name.upload but this would require keeping a list of original names that map to the storage names.
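A rough sketch of that mapping approach (the table and column names are hypothetical):

// hypothetical upload handler: keep the original name in the database only,
// and store the file on disk under a generated name with a harmless extension
$original = $_FILES['receipt']['name'];
$stored   = bin2hex(random_bytes(16)) . '.upload';
$dest     = '/var/uploads/' . $stored;           // outside public_html

if (move_uploaded_file($_FILES['receipt']['tmp_name'], $dest)) {
    $stmt = $pdo->prepare(
        'INSERT INTO uploads (user_id, stored_name, original_name) VALUES (?, ?, ?)'
    );
    $stmt->execute([$userId, $stored, $original]);
}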
I have a folder (/files) with tons of files in it that users can download. I want users to be able to download their own files only, and not be able to see other people's files.
For example:
User A can only view and download:
- file1.doc
- file2.jpg
User B can only view and download:
- file3.txt
- file4.jpeg
User C can only view and download:
- file1.doc
- file2.jpg
- file3.txt
My idea was to put all files in the same folder so all users know where to go. My question is: can I use .htaccess, or should I build a PHP script for this? What about security (which one is more secure)?
Thanks
Is it an open directory, to start with? What you could do is create a subfolder for each user, put their files in there and then assign appropriate permissions in .htaccess for said folders. However, this would require some security integration with your OS (i.e., users would have to have accounts on your machine, not just your web application)... A quick and dirty -- and insecure -- alternative would be to prepend all uploaded filenames with the username (e.g., 'file1.jpg' uploaded by 'foobar' could be named 'foobar.file1.jpg', for example), then it's just a case of your PHP script returning only those files with the respective username and perhaps stripping that part out when displaying (or again, you could use folders, as long as your script can create a new folder per user, when one doesn't exist). Another option, which is slightly more secure is to create a hash of the file and usernames in a database, rename all uploaded files with this hash and then query the database appropriately.
The best solution would definitely be OS-managed accounts, as I first mentioned, but it entails more overhead.
Build a PHP script where you use readfile to send the file to the browser. This way you can restrict access for individual files, and use the authentication system you already have.
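For instance, assuming a user_files table that records which user owns which file (names are illustrative):

// files.php?id=42: serve a file only if it belongs to the logged-in user
session_start();

$stmt = $pdo->prepare('SELECT filename FROM user_files WHERE id = ? AND user_id = ?');
$stmt->execute([(int) ($_GET['id'] ?? 0), $_SESSION['user_id']]);
$filename = $stmt->fetchColumn();

if ($filename === false) {
    http_response_code(404);   // either it doesn't exist or it isn't theirs
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
readfile('/srv/files/' . basename($filename));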
You can certainly use either .htaccess or PHP for this. Neither is more secure, as far as I know, than the other - though done wrong, both can permit access where none is intended!
PHP might be marginally better, since you have more flexibility (in terms of integrating it with other PHP authentication, say) and you can put the folder outside the usual web root, which is good practice anyway.
I have a site that allows people to upload files to their account, and they're displayed in a list. Files for all users are stored on different servers, and they move around based on how popular they are (its a file hosting site).
I want to add the ability for users to group files into folders. I could go the conventional route and create physical folders on the hard drive for each user on the server, and traverse them as expected. The downside to that is the user's files will be bound to a single server. If that server starts running out of space (or many files get popular at the same time), it will get very tricky to mitigate.
What I thought about doing is keeping the stateless nature of files, allowing them to be stored on any of the file servers, and simply storing the folder ID (in addition to the user id who owns the file) with each file in the database. So when a user decides to move a file, it doesn't get physically moved anywhere, you just change the folder ID in the database.
Is that a good idea? I use php and mysql.
Yes, it is.
I don't see any downside, except maybe more queries to the database, but with proper indexing of the parent folder id, this will probably be faster than accessing the filesystem directly.
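A small sketch of what that looks like in practice (the files table and its columns are assumptions):

// listing a folder: with an index on (user_id, folder_id) this is a cheap lookup
$stmt = $pdo->prepare(
    'SELECT id, name, storage_host FROM files WHERE user_id = ? AND folder_id = ?'
);
$stmt->execute([$userId, $folderId]);
$files = $stmt->fetchAll(PDO::FETCH_ASSOC);

// "moving" a file to another folder is just an UPDATE; nothing moves on disk
$move = $pdo->prepare('UPDATE files SET folder_id = ? WHERE id = ? AND user_id = ?');
$move->execute([$newFolderId, $fileId, $userId]);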
Forget about folders and let the users tag their files, multiple tags per file. Then let them view files tagged X. This isn't much different to implement than virtual folders but is much more flexible for the users.