I have a folder (/files) with tons of files in it that users can download. I want users to be able to download only their own files and not be able to see other people's files.
For example:
User A can only view and download:
- file1.doc
- file2.jpg
User B can only view and download:
- file3.txt
- file4.jpeg
User C can only view and download:
- file1.doc
- file2.jpg
- file3.txt
My idea was to put all the files in the same folder so all users know where to go. My question is: can I use .htaccess, or should I build a PHP script for this? And what about security: which one is more secure?
Thanks
Is it an open directory, to start with? What you could do is create a subfolder for each user, put their files in there, and then assign appropriate permissions in .htaccess for those folders. However, this would require some security integration with your OS (i.e., users would have to have accounts on your machine, not just in your web application).
A quick and dirty -- and insecure -- alternative would be to prepend all uploaded filenames with the username (e.g., 'file1.jpg' uploaded by 'foobar' could be stored as 'foobar.file1.jpg'). Then it's just a case of your PHP script returning only those files with the respective username, and perhaps stripping that part out when displaying. (Or again, you could use folders, as long as your script can create a new folder per user when one doesn't exist.)
Another option, which is slightly more secure, is to create a hash of the file and usernames in a database, rename all uploaded files with this hash, and then query the database appropriately.
The best solution would definitely be OS-managed accounts, as I first mentioned, but it entails more overhead.
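To illustrate the hash idea, here is a minimal sketch; the $pdo connection and the user_files table are placeholders, not a prescribed schema:

<?php
// Minimal sketch of the hash-rename approach. Assumes an existing PDO
// connection ($pdo) and a hypothetical user_files(user_id, hash, original_name) table.
function store_upload(PDO $pdo, int $userId, array $file, string $storageDir): void
{
    // A random name: nothing about the user or the original filename is guessable.
    $hash = bin2hex(random_bytes(32));

    if (!move_uploaded_file($file['tmp_name'], $storageDir . '/' . $hash)) {
        throw new RuntimeException('Upload failed');
    }

    $stmt = $pdo->prepare(
        'INSERT INTO user_files (user_id, hash, original_name) VALUES (?, ?, ?)'
    );
    $stmt->execute([$userId, $hash, basename($file['name'])]);
}

// Listing a user's files is then a plain query scoped to their id:
// SELECT hash, original_name FROM user_files WHERE user_id = ?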
Build a PHP script where you use readfile to send the file to the browser. This way you can restrict access for individual files, and use the authentication system you already have.
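A minimal sketch of such a script; current_user_id(), the user_files table, and the storage path are placeholders for your own auth and storage scheme:

<?php
// Sketch of a readfile() download gateway. Assumes a PDO connection ($pdo)
// and that the files live in a directory outside the web root.
session_start();

$userId = current_user_id();             // placeholder for your existing auth
$name   = basename($_GET['file'] ?? ''); // basename() strips any ../ traversal

$stmt = $pdo->prepare('SELECT 1 FROM user_files WHERE user_id = ? AND filename = ?');
$stmt->execute([$userId, $name]);

if (!$stmt->fetch()) {
    http_response_code(403);             // not this user's file
    exit;
}

$path = '/srv/private_files/' . $name;
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);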
You can certainly use either .htaccess or PHP for this. As far as I know, neither is more secure than the other, though done wrong, both can permit access where none is intended!
PHP might be marginally better, since you have more flexibility (in terms of integrating it with your other PHP authentication, say) and you can put the folder outside the usual web root, which is good practice anyway.
Hello and thanks to everyone for reading my question.
I've been working on a PHP web program for a little while and was wondering what measures I should take to protect the source before putting it on a live server. The source isn't being distributed; it's being accessed through a website (users log into the website to use it).
First, I'd like to protect the source PHP files from being found and downloaded. I'm not using any framework, just PHP, and all the files are in the home directory along with index.php. I read around and it seems that robots.txt isn't really effective for hiding. I came across some posts of people recommending .htaccess, but I always thought that was for password-protecting files within a directory, so I'm not sure whether .htaccess can be made suitable for a web app.
Second, I'd like to protect the source files in case someone gets access to them (either by finding and downloading them, or as a sysadmin who has ready access to the server). I thought of source encryption with something like ionCube. My host also has GnuPG (which I'm not familiar with; any thoughts on it compared to ionCube?).
I'm not familiar with source protection, so any ideas would be nice, and of course thank you muchly :)
Just make sure your web server is set up to handle .php files correctly, and that all files have the correct .php extension (not .php.inc or similar).
As long as your server executes the PHP, no one can download its source code (ignoring any security holes in your code, which is a different topic).
There was a time when it was common to name included files along the lines of mystuff.php.inc. This is a bad idea: say your site is at example.com and you store your database configuration in config.php.inc. If someone guesses this URL, they can request http://example.com/config.php.inc and get your database login in plain text.
It is a good idea to store configuration and other libraries up one directory as bisko answered - so you have a directory structure like..
/var/example.com:
include/
config.php
helper_blah.php
webroot/
index.php
view.php
This way, even if your web server config gets screwed up and starts serving .php files as plain text, it'll be bad, but at least you won't be announcing your database details to the world.
As for encrypting the files, I don't think this is a good idea. The files must be unencrypted so that Apache (or whatever server you're using) can access them, and if Apache can access them, your sysadmin can too.
I don't think encryption is the solution to an untrustworthy sysadmin.
Well, for your first point, that's web server security, which you should seek help with on Server Fault. Basically you would use a secure/locked directory for this, or access the files in a virtual directory via a web service.
For your second point, you would use an obfuscator, which will protect your source, but remember that once someone has the file, you can only do so much to protect it. If they are really interested, they'll get what they want.
The first step you should take is to move all unnecessary files out of the website root, put them somewhere else, and leave only the files that are actually called from the web.
For example if you have this setup:
/var/htdocs/mysexydomain.com/root/config.php
/var/htdocs/mysexydomain.com/root/db.class.php
/var/htdocs/mysexydomain.com/root/index.php
/var/htdocs/mysexydomain.com/root/samplepage1.php
Take all the files one level above so you get
/var/htdocs/mysexydomain.com/includes/config.php
/var/htdocs/mysexydomain.com/includes/db.class.php #see the includes dir? :)
/var/htdocs/mysexydomain.com/root/index.php
/var/htdocs/mysexydomain.com/root/samplepage1.php
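The scripts left in root/ can still reach those files with a relative require, for example:

<?php
// root/index.php: __DIR__ is .../mysexydomain.com/root, so one level up
// reaches the includes/ directory that is no longer web-accessible.
require_once __DIR__ . '/../includes/config.php';
require_once __DIR__ . '/../includes/db.class.php';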
I'm working on a project using PHP (Laravel Framework). This project allows logged-in users to store files and folders that only they can access.
I've set up user authentication such that a unique user token is required to access the user's data.
When a user stores/creates a folder, it gets stored on the server under a path prepended with a random 32-character string, such as /WijiIHSIUhoij99U8EJIOJI09DJOIJob/username/folder. That path is stored in a MySQL database for the user to fetch with an API, so long as they have the correct access token for their account. I did this because it seemed more secure; correct me if I'm wrong.
This is half the job (please correct me if what I've done so far is wrong or bad practice).
What I need now is to make sure nobody can access the root folder / and get a list of user folders.
But I fear if I restrict access to the root folder, users will no longer be able to access their folders.
I don't know where to even start looking to find a solution, so anything, even if it's just to get me started will help.
The random string is redundant, as your token does the randomization for you. If the users know that their stuff is stored in /RANDOMSTRING/folder, someone will build a bot to try them all anyhow, but the idea is close to what you want, I think.
Instead, try creating a script that catches all 404s and checks whether they match RANDOM_STRING/user/folder. If so, you can have the random string in the database map to a universal folder, something like /user_files/username/their_folder. This way you aren't exposing your folder structure, and you can add a middleware to count how many times a client has unsuccessfully tried a URL and ban them if it's more than 3, for instance.
In this case, the script delivers the files (or the folder) as a download, or as a screen without the actual folder structure visible. The files can be stored anywhere, and the database correlates the URL entered with an actual file or folder somewhere. (Really, we're trying to mitigate someone getting access through brute force here.)
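As a rough sketch of that catch-all idea in Laravel (the shared_links table and its columns are hypothetical):

<?php
// routes/web.php: a fallback route runs when no other route matched (a 404).
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::fallback(function () {
    // Look the requested path up as an opaque token.
    $link = DB::table('shared_links')
        ->where('token', request()->path())
        ->first();

    abort_if($link === null, 404);

    // Map the token to the real location, e.g. user_files/username/their_folder,
    // without ever exposing that structure in the URL.
    return Storage::download($link->path);
});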
From there, I would look into using a different service to store the files. This helps with two things that you can read up on. First, you can choose something like AWS cold storage, which will cost you less; second, you can take the storage off the computational server, so that if there is a security breach, you know the attacker had access only to the user files or only to the code files. This will help you in the long run (although if they gain access to your code files, they will likely be able to locate your storage location and password... this is really to protect your code files).
Beyond that, start looking into read and write permissions and user groups. PHP, the OS, the users, and the admins should all have their own permissions on the storage server to make sure they can't do certain things with the files. Moreover, you don't want someone uploading a malicious file and having it execute.
This is just a primer, but it'll lead you to a LOT of information, I'm sure.
You have to make a distinction between the public and private files.
Public files
Public files can be accessed, you guessed it, by everyone. They are images, logos, documentation... that kind of file. If you have the URL of one of these files, you can download it.
Private files
Things are a little more complicated for private files. These files should not, in any case, be downloadable without authentication. Said otherwise, only the user who owns these files (and often the admins too) should be able to download them. For instance, these files could be an invoice for an order, a paid media...
For the directory structure, you are free; it's up to you. The only important thing is to prevent these files from being publicly accessible, so you MUST NOT put them anywhere in the public folder (or the linked storage/app/public).
By default, Laravel expects these files to be in storage/app. For instance, it could be storage/app/users/1451 for the user with id 1451. It's not important to randomize these paths, since these files are not accessible directly (they are NOT in the public path).
So you may ask: how can a user download them? The answer is to create an authenticated route and check for user authorization before sending the file back in the response.
You can do something like:
Route::get('invoices/{invoice}/download', [InvoiceController::class, 'download']);
Then you apply a policy to this route, where you check that the authenticated user is able to download the file (like you would do for any other authenticated route).
If they can, well, perfect: just return the file in the response with return response()->download($path_to_the_file);. If they can't, they will get a 403 Forbidden.
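The controller side could look roughly like this (the Invoice model, its policy, and the path column are assumptions about your app):

<?php
namespace App\Http\Controllers;

use App\Models\Invoice;

class InvoiceController extends Controller
{
    public function download(Invoice $invoice)
    {
        // Runs InvoicePolicy::download(); Laravel aborts with a 403 if it fails.
        $this->authorize('download', $invoice);

        // e.g. storage/app/users/1451/invoice.pdf, outside the public path
        return response()->download(storage_path('app/' . $invoice->path));
    }
}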
I have a simple site which allows users to upload files (among other things obviously). I am teaching myself php/html as I go along.
Currently the site has the following traits:
--When users register a folder is created in their name.
--All files the user uploads are placed in that folder (with a time stamp added to the name to avoid any issues with duplicates).
--When a file is uploaded information about it is stored in an SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to uploading only files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There are a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script might work, but you have to be very careful not to use include, require, or similar constructs that might be tricked into executing code. Instead, open the file, read it, and serve it through file operation primitives.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
The definition of "malicious files" is quite broad. Malicious for who? You could run an antivirus on the uplaod, for instance, if you are worried about your side being used to distribute malwares (you should). If you want to make sure that people can't harm the server, you have at the very least make sure they can only upload a bunch of filetypes. Checking extensions and mimetype is a beginning, but don't trust that (you can embed code in png and it's valid if it's included via include()).
Then there is the problem of XSS, if users can upload HTML contents or stuff that gets interpreted as such. Make sure to serve a content-disposition header and a non-html content type.
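A sketch tying the traversal check and those headers together (/srv/uploads is a stand-in for your storage directory):

<?php
// Serve an upload safely: resolve the requested name and confirm it is
// still inside the upload directory before reading it.
$base = realpath('/srv/uploads');
$path = realpath($base . '/' . ($_GET['file'] ?? ''));

// realpath() resolves any ../ sequences, so a path that escaped $base is rejected.
if ($path === false || strncmp($path, $base . '/', strlen($base) + 1) !== 0) {
    http_response_code(404);
    exit;
}

// A neutral content type plus an attachment disposition keeps uploaded
// HTML from being rendered (and scripted) in the browser.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
readfile($path);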
That's a start, but as you said there is much more.
Your biggest threat is going to be if a person manages to upload a file with a .php extension (or some other extension that results in server side scripting/processing). Any code in the file runs on your server with whatever permissions the web server has (varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off to store the downloads in a non web-accessible directory, and serve the files via a script that forces a download and doesn't attempt to execute anything regardless of the extension (see http://php.net/header).
This also makes it much easier to facilitate only allowing downloads if a person is logged in, whereas before, you would need some .htaccess magic to achieve this.
You should not upload files into web-served directories if you do not want them to be openly available.
I suggest you use X-Sendfile, a header that instructs the server to send a file to the user. Your PHP script that fetches so-and-so file would do whatever authentication you have in place (I assume you have something already) and then return the header. As long as the web server can access the file, it will then serve it.
See this question: Using X-Sendfile with Apache/PHP
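In PHP the approach looks roughly like this (it requires mod_xsendfile to be installed and enabled; user_may_download() is a placeholder for your authentication):

<?php
// Sketch of an X-Sendfile gateway. PHP only decides *whether* to serve the
// file; Apache strips the header and streams the file itself.
if (!user_may_download($_GET['file'] ?? '')) {   // placeholder auth check
    http_response_code(403);
    exit;
}

$path = '/srv/private_files/' . basename($_GET['file']);
header('X-Sendfile: ' . $path);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');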
I am new to web development and I'm learning PHP in order to sell a few binary files (shared Linux host). The site is not yet live.
My php scripts (50% borrowed code, 50% self-written, 95% fully understood) login to MySQL to READ the items for sale, and WRITE sale transaction data into another table. Functions.php, located in a subfolder of the webroot, contains the login name and password for MySQL.
Q1. This doesn't seem secure to me. How should the login/password info be stored so the scripts can access it? If functions.php were stored outside the webroot, could the .php files located in the webroot #include (PHP "require_once") it? (I did try this once and my scripts broke in a way that seemed permissions-related -- if I knew it should work, I'd keep plugging away at it.)
Q2. I am unsure where to store the binaries that purchasers can download. Is it correct that savvy users can somehow find / download them (without paying) if I just store them in a subfolder of the webroot? Is it possible to use a .htaccess file to block access to the "binaries" folder within the webroot? Can black-hats get at / modify a .htaccess file?
Q3. Would it be a better idea to store the binaries (max=4Mb) in a MySQL table and copy them from there to a temp file in webroot before each download, then delete?
Q4. Can anyone recommend a set of scripts that manages this sort of thing that I could review / modify rather than reinventing the wheel?
Thanks
Q1 - Your MySQL password and other application-specific settings should be stored in a separate file outside of your webroot. You can either put the file outside the webroot directly or restrict access to it via .htaccess. You can include the file or read from it as long as you know the path.
Q2 - The binaries should also be stored outside of the webroot. The ideal way to serve them is to make them downloadable via a PHP file. This way you can do authentication before the file is served, and you can make the links temporary so that users can't share them with other people.
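One way to sketch those temporary links (the downloads table and the example.com URL are hypothetical):

<?php
// Create a one-day download link by storing a random token with an expiry.
function create_download_link(PDO $pdo, int $productId): string
{
    $token = bin2hex(random_bytes(16));
    $stmt = $pdo->prepare(
        'INSERT INTO downloads (token, product_id, expires_at)
         VALUES (?, ?, DATE_ADD(NOW(), INTERVAL 1 DAY))'
    );
    $stmt->execute([$token, $productId]);
    return 'https://example.com/download.php?t=' . $token;
}

// download.php then looks the token up and serves the binary only while
// expires_at is still in the future.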
Q3 - If you use the above method, you don't need to store the binaries as BLOBs in MySQL.
Q4 - I haven't really come across anything that does this as a library/autonomous script. Serving the files with the correct headers shouldn't be too difficult, though.
Not sure if best practice, but this is how I'd approach it:
Q1: I store MySQL login information, along with local paths and other settings, in a config.inc.php file outside of the web root. I can then include that at the start of each script. I also use a database.inc.php which connects to MySQL and selects the database (plus a few database functions). In theory it isn't insecure inside the web root, as calling it directly will only execute the PHP, not display its contents. Storing an XML config or similar is different, however!
Q2: If downloadable binaries are stored within the web root, then they can be downloaded by anyone who discovers the right URL. Instead, they should be stored outside the web root, and a PHP "gateway" script should serve the contents of those files when the request meets the right conditions. You may want to store a token with each purchase in your database, and only permit valid tokens to download the files.
Q3: I believe it's better to use the file system to store files, rather than a database. It won't improve security over my answer to Q2 if that's what you mean.
Q4: You could try existing shopping cart software. Magento supports downloadable products.
Hope that helps