Actually, this is about a cloud storage web app.
A web app where users have their own private folders, and I want them to be able to create their own subfolders too. What would be the best way to tell the uploader that the file being uploaded has to go into the specific folder the user is currently in?
So what I thought is: whenever a folder is created and opened, a session variable storing the folder name is set and read by the uploader, which then uploads into that folder.
For example:
$_SESSION['folder-name'] = 'x';
$upload_dir = "/{$_SESSION['username']}/{$_SESSION['folder-name']}/";
Every user has a specific private directory where they can upload files, so there is no chance that the session can be manipulated to upload into the wrong user's directory.
Since I currently see this as the best way, is there any alternative the Stack Overflow community can suggest? If yes, please describe it.
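To make the idea concrete, here is a minimal sketch of that session-driven uploader; the /uploads base path and the basename() sanitization are my additions, not part of the original plan:

session_start();

// basename() strips any path components, so even a tampered session
// value can't escape the user's own directory.
$user   = basename($_SESSION['username']);
$folder = basename($_SESSION['folder-name']);
$upload_dir = "/uploads/{$user}/{$folder}/";

if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $name = basename($_FILES['file']['name']);
    move_uploaded_file($_FILES['file']['tmp_name'], $upload_dir . $name);
}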
Since your username is unique, as you suppose, this seems legitimate.
But let's consider this scenario:
A user gets deleted and another one registers with the same username...
EDIT
It depends on the filesystem and many other factors. E.g., is this going to be URL-accessible?
A legitimate alternative could be:
Since the username, as you set it up, seems to be unique, you could first md5 it (it's quick) and then digest it to hex if you want it shorter and URL-accessible. That will give you a good number of usernames that can be converted to folders.
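A minimal sketch of that scheme (the /uploads base path and directory creation are my assumptions):

// md5() returns a 32-character hex string by default, which is URL-safe
// and maps each unique username to exactly one folder name.
$folder = md5($username);
$upload_dir = "/uploads/" . $folder . "/";
if (!is_dir($upload_dir)) {
    mkdir($upload_dir, 0750, true);
}
// Keep a username -> folder row in the DB, since the hash can't be
// reversed to find the user again (the drawback noted next).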
The bad thing about this is that you can't find the user by the folder name, if you had that in mind.
That's it.
Related
My question seems to be similar to others here on SO; I have tried a few solutions, but they don't seem to work in my case...
I have developed a site in which you have to fill out a form, and it then returns a PDF file that you can download or print. This file is saved so you can retrieve it later.
public_html
|_index.php
|_<files>
| |_file_001.pdf
| |_file_002.pdf
|_<assets> ....etc
That is how my files and folders look on the server. Anyone can easily guess other files (.com/files/file_00X.pdf, where X can be changed to any other number) and get access to them. After finishing the form, the user is given a URL, .com/files/file_001.pdf, that they can click to download the file...
A year ago I did something similar, a script to generate PDFs, but in that case the user needed an email address and a code that was sent via email in order to generate the PDF, and the PDFs were generated on demand, not saved like in this case...
Is there a way to protect these files as they are right now?
Or do I have to make them a little bit harder to guess?
Something like:
.com/files/HASH(MD5)(MICROTIME)/file_(MICROTIME)_001.pdf
and save the file and folder names in the DB for easy access via the admin panel; the user would have to get the full URL via email...
Any ideas would be greatly appreciated.
For full security I would move the PDFs out of the public folder and have a script in charge of delivering the content. If the form is filled out correctly, you can generate a temporary hash and store that hash and the PDF path in the database. That way the user will have access to the file as a link through the retriever script, but you control how long that link stays available.
Imagine the temporary link being http://yourdomain/get_pdf/THIS_IS_THE_HASH
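A rough sketch of that flow, assuming a pdf_links table and a PDO connection in $pdo (both names are mine, not from the question):

// 1) After a valid form submission: create a temporary hash for the file.
$hash = bin2hex(random_bytes(16));
$stmt = $pdo->prepare('INSERT INTO pdf_links (hash, pdf_path, expires_at) VALUES (?, ?, ?)');
$stmt->execute([$hash, $pdfPath, date('Y-m-d H:i:s', time() + 3600)]); // valid for one hour
echo "http://yourdomain/get_pdf/{$hash}";

// 2) In the retriever script: serve the file only while the hash is valid.
$stmt = $pdo->prepare('SELECT pdf_path FROM pdf_links WHERE hash = ? AND expires_at > NOW()');
$stmt->execute([$hash]);
if ($row = $stmt->fetch()) {
    header('Content-Type: application/pdf');
    readfile($row['pdf_path']);
} else {
    http_response_code(404);
}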
Move the PDFs to some non-public folder (one that your web server has access to but the public does not). Or you can use .htaccess to restrict access to the PDFs in their current location.
Write a PHP script that returns the correct PDF based on some passed-in HTTP variable.
You can secure/restrict this any way that you want to.
For example, one answer suggested using a temporary hash.
Other options for restricting access:
Store in the user's session that they submitted the form and have a download pending; that way no one could direct-link.
Check the referrer header. If it is a direct request then do not serve the file.
Here is a code example using the last option:
$hash_or_other_identifier = $_REQUEST["SomeVariable"];

if (empty($_SERVER["HTTP_REFERER"])) {
    // Direct request (no referrer header): don't serve the file
    http_response_code(403);
    exit;
} else {
    // Look up the file path using the identifier
    $pdfFile = somelogic($hash_or_other_identifier);
    // Serve the correct PDF
    header('Content-Type: application/pdf');
    die(file_get_contents($pdfFile));
}
I don't even think that keeping the file name secret is a very big deal if all you are worried about is people typing it into the URL bar, because you can simply check whether it is a direct link or not. If you are also worried about bots, or clever people who will create a link that points to your file so it looks like there is a referrer, then you will need to add stricter checks. For example, you can verify that the referrer is your own site. Of course, headers can be spoofed, so it all depends on how bulletproof it needs to be.
The url would be something like: http://yourdomain/pdf?SomeVariable=12345
However, you don't have to use an HTTP variable. You can also build the identifier into the URL path with the same result, e.g.: http://yourdomain/pdf/12345
General guidelines:
Keep the file outside of any directory that's accessible via HTTP.
Use a database or any other storage to link up file location with an identifier (an auto incremented number, guid, hash, whatever you deem fit). The location of the file could be in the server's file system or on a shared network location etc.
Instead of hashes, it's also practical to encrypt the ID generated by the database, base64-encode it, and hand that back; that makes it nearly impossible to guess the valid string one needs to send back in order to refer to a file (see the sketch after this list).
Use a PHP script that delivers the file if user authentication passes (in case you need authenticated users to be able to retrieve the file)
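As a sketch of the encrypt-then-base64 idea above (helper names and the simplified key/IV handling are illustrative only, not a hardened implementation):

// Hypothetical helpers: turn a numeric DB id into an opaque, URL-safe token.
function idToToken(int $id, string $key): string {
    $iv = substr(hash('sha256', $key), 0, 16);   // static IV for brevity; use a random IV in production
    $cipher = openssl_encrypt((string)$id, 'aes-128-cbc', $key, OPENSSL_RAW_DATA, $iv);
    return rtrim(strtr(base64_encode($cipher), '+/', '-_'), '=');  // URL-safe base64
}

function tokenToId(string $token, string $key): ?int {
    $iv = substr(hash('sha256', $key), 0, 16);
    $cipher = base64_decode(strtr($token, '-_', '+/'));
    $plain = openssl_decrypt($cipher, 'aes-128-cbc', $key, OPENSSL_RAW_DATA, $iv);
    return $plain === false ? null : (int)$plain;
}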
I am developing an application that allows users to upload images. The script creates multiple sizes of the image and stores them in folders with a session ID e.g. /uploads/skj28cnkjck783wo/thumbnails before renaming the folder to a unique name once the user hits submit/next/go.
I need to give users the ability to remove individual files, but I'm wondering what the most secure way of doing this would be. If I base the delete on the session ID, surely users would be able to change it and remove files that don't belong to them.
I was thinking I could store the user information and the object reference in a table and do a lookup to ensure that the user has access to remove that file, or store the files in a folder named with the user ID (which is unique, based on the DB entry), but I'm wondering what the experts think. I have done some Google searching, but every approach I found has flaws.
I am running IIS7 as the web server.
Thanks
Try storing "userName (or userID)" -> "userFile" relations in a separate table. This is the most secure way, because the OS (Windows in your case) can see only one user: the owner of the web service process (IIS, Apache, etc.).
It is you who manipulates the files, not the user himself.
The user gives the file name (or perhaps some file id) in his GET or POST request.
Disallow relative paths by rejecting all inputs that contain slashes. That's easy if you don't allow subfolders.
If you allow subfolders, reject anything that contains ../ and the like.
You can keep files by changing their names (as Flickr does), storing the display filename (to show to the user) and the owner in a database. If owner and user do not match, reject the request (see the sketch below).
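Putting those checks together, a minimal delete endpoint might look like this (the table, column, and path names are assumptions):

session_start();
$filename = $_POST['file'] ?? '';

// Reject anything with path components: no slashes, no parent references.
if (strpbrk($filename, '/\\') !== false || strpos($filename, '..') !== false) {
    http_response_code(400);
    exit('Invalid file name');
}

// Verify ownership in the userID -> file table before deleting.
$stmt = $pdo->prepare('SELECT 1 FROM user_files WHERE user_id = ? AND stored_name = ?');
$stmt->execute([$_SESSION['user_id'], $filename]);
if ($stmt->fetch()) {
    unlink('/var/uploads/' . $_SESSION['user_id'] . '/' . $filename);
} else {
    http_response_code(403);  // owner and user do not match: reject
}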
I've been trying to create a website that allows users to upload Word documents. Those documents are then stored in a public directory of the website. The thing is, I don't want everyone to be able to access the uploaded documents. I would like to check whether they are logged in first, and whether they are "authorized users"; say, if they have an account level of 50 or higher, then they are allowed to open the directory.
Is there any way I can do this through .htaccess? Or is there a better solution?
I don't know if this is a dumb question, but please do help me. I would deeply appreciate any help I can get right now.
Note:
Sorry for not mentioning this earlier, but I actually want to use Google Docs for viewing these documents in order to embed them in my website.
It sounds like you're taking the approach of putting the public documents directory somewhere underneath your web root directory. For a number of reasons (security, portability, maintainability), this is not the best approach to take.
Here's a quick-and-dirty approach (I'm assuming that you're already handling user authentication using a database or some other means to store credentials):
Place the documents directory somewhere outside your web root directory.
Create a function (or class) to read the list of files in the documents directory (look at scandir(): http://www.php.net/manual/en/function.scandir.php).
Create a page that shows the results of reading the documents directory. Each file should be a link to a page, with a URL parameter indicating the file. In this page, check the user's credentials before showing them the file list.
In the page that the file list page points to, check to make sure the requested file exists in the documents directory (don't forget to check again that the user has the necessary credentials!), and then read that file and push it to the user. See readfile() (http://php.net/manual/en/function.readfile.php), taking special note in the example of setting the various header fields. A rough sketch follows this list.
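A rough sketch of that last step (the directory path and the credential check are placeholders):

session_start();
if (empty($_SESSION['logged_in'])) {       // your real credential check goes here
    http_response_code(403);
    exit('Not authorized');
}

$docDir = '/var/www/private_docs';         // outside the web root
$name = basename($_GET['file'] ?? '');     // basename() strips any path components
$path = $docDir . '/' . $name;

if (!is_file($path)) {
    http_response_code(404);
    exit('File not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);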
You'd probably want to use a database (MySQL?) and PHP sessions to check that:
the user has logged in successfully (credentials in database)
the user has 'level 50' or higher: if ($level >= 50)
use sessions and session variables to create persistent authentication as users move between pages.
You should not need to use .htaccess files for this; a minimal sketch of the session gate follows.
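A minimal sketch of that session gate (the session keys are assumptions):

session_start();
if (!isset($_SESSION['user_id']) || ($_SESSION['level'] ?? 0) < 50) {
    header('Location: /login.php');   // not logged in, or below level 50
    exit;
}
// ...serve the documents page...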
Okay, we have a subscription site up on our dedicated server. We feed content to paying members who access the site via our login page. Subscriptions are handled by a third-party biller who writes new member info to a database on our server. Member authentication is done using a MySQL database and not .htaccess/.htpasswd. The reason for this was that much research showed that the .htaccess/.htpasswd approach was insecure (transmission of user info via plain text) and that it offers no way for a user to log out. Thus the database authentication via MySQL. It all works great.
Except we have a problem in that the folders containing members-only content need to be secured against anyone typing in the complete file path and file name to access the download content directly, thus bypassing our website.
So we went to the host and had a custom .htaccess file written. We had to do this in the interest of time, and they claimed to know about this sort of thing so we hired them to write the .htaccess file.
First iteration: It redirected every user login back to the index.php page instead of allowing access to the members area. Direct file access was blocked, however.
Second iteration: Member access to the member's area was restored and once again the content was vulnerable to direct download.
Third iteration: Successful access to member's area. Content access blocked to direct browser access. HOWEVER, ALL of the .jpg files that used to display with each of the download files in the member's area are now broken links. All of the thumbnails in the associated download file photo galleries are now broken links, preventing the viewing of the larger images they represent.
CONCLUSION: The host is backing out of the deal saying that what we want can not be done. To recap, what we want is:
Allow our registered members access to our member's area using our login page.
Prevent direct access to our content via browsers.
Allow all of the .jpg images to display with the download files and in the thumbnail galleries.
They claim this can't be done; my suspicion is that they do not know how to do it. Certainly there are many subscription sites on the internet that use .htaccess files to secure their content.
ADDITIONAL INFO: We have an SSL certificate for this domain. Could that cause a problem? Also, shouldn't the .htaccess file that protects our member's area content be in the member's area folder rather than in the root (as it is now)? Wouldn't that make the coding of the .htaccess file less complex?
I'm having a hard time believing that what we are asking to be done is not do-able.
Please advise. Any and all help will be greatly appreciated.
Skip the .htaccess route. Store the file names for the 'member content' in MySQL, then use PHP to link to them for 'members only'. PHP would expose only identifying information, never the actual file names, e.g. a MySQL index #, storage date, or member ID; all of these can be used to generate (and retrieve) a unique filename that you never expose.
I've done this before in Java, using servlets in the 'src=' part of the img tag. I expect that PHP offers something comparable.
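A sketch of the PHP equivalent (the script name, table, and column names are assumptions):

// image.php, used as <img src="image.php?id=42">
session_start();
if (!isset($_SESSION['member_id'])) {
    http_response_code(403);
    exit;
}
$stmt = $pdo->prepare('SELECT file_path FROM member_content WHERE id = ?');
$stmt->execute([(int)($_GET['id'] ?? 0)]);
if ($row = $stmt->fetch()) {
    header('Content-Type: image/jpeg');
    readfile($row['file_path']);   // the real server-side path never leaves the server
} else {
    http_response_code(404);
}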
I want the files on my web server to be secure: only authenticated users should be able to access them. I thought of storing the files in the database as a "LONGBLOB", but it supports only up to 2 MB of data, and the file size may exceed 50 MB. Is there any other, better way to secure the files? Please help me. Thanks in advance.
Don't store them in a database. Put them in your web directory and secure them using .htaccess.
If you want to authenticate via other means, then store the files in a directory that isn't web-accessible but is readable by the user PHP runs as.
Discussion
If you opt to keep high value downloadable content files directly on the filesystem, the best thing to do is to keep them outside of the webroot.
Then, your application will have to solve the problem of creating URLs (URL-encoding when necessary) for content (PDFs, Word docs, songs, etc.).
Generally, this can be achieved by using a query to retrieve the file path, then using the file path to send content to the user (with header() etc ..) when he or she clicks on an anchor (all of this without the user ever seeing the true, server side file path).
If you do not want user A sharing URLs for high-value downloadable content with user B, then your application must somehow make the links exclusive to user A. What can be done? Where should you start?
Obviously, you want to make sure user A is logged in during a session before he or she can download a file. What is not so obvious is how to prevent a logged in user B from using a URL sent from user A (to user B) to download A's digital content.
Using $_SESSION to store the logged in user's ID (numerical, or string) and making that part of the eventual query (assuming content is tied to user purchases or something) will prevent a logged in user B from downloading things they have not purchased, but you will still incur the resource hit for processing the SQL empty set for items they have not purchased. This sounds like a good step two.
What about step one? Is there something that can prevent the need to do a query to begin with?
Well, let us see. In HTML forms, one might use an XSRF token in a hidden field to verify that a submitted form actually originated from the web server that receives the POST/GET request. One token is used for the entire form.
Given a page of user specific things to download (anchors), one could embed a single token (the same token, but different per page request) into each anchor's href attribute in the form of a query string parameter and store a copy of this token in $_SESSION.
Now, when a logged in user B attempts to use a logged in user A's shared URL, the whole thing fails because user A and user B have different sessions (or, no session at all), and thus different tokens. In other words, "My link is the same as yours, but different." Anchors would be tied to the session, not just to the page, user, or content.
With that system in place, PHP can determine whether a request for content is valid without getting the database involved (by comparing the submitted token to the one in $_SESSION). What is more, a time limit can be established in $_SESSION to limit the duration/lifetime of a valid XSRF token. Just use the time() function and basic math. Sixty minutes might be an ideal token lifetime for an anchor in this situation. Have the user log in again if the token for a clicked anchor has expired. A minimal sketch follows.
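A minimal sketch of that token scheme (the $_SESSION key names are mine):

session_start();

// When rendering the downloads page: mint (or refresh) the session token.
if (!isset($_SESSION['dl_token']) || $_SESSION['dl_token_expires'] < time()) {
    $_SESSION['dl_token'] = bin2hex(random_bytes(32));
    $_SESSION['dl_token_expires'] = time() + 3600;   // sixty-minute lifetime
}
$href = '/download.php?item=42&token=' . $_SESSION['dl_token'];

// In download.php, before any database work:
$ok = isset($_SESSION['dl_token'])
    && hash_equals($_SESSION['dl_token'], $_GET['token'] ?? '')
    && $_SESSION['dl_token_expires'] >= time();
if (!$ok) {
    http_response_code(403);
    exit('Please log in again.');
}
// ...only now run the user-ID-scoped query for the requested item...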
Summary
If you use files on a filesystem and store the paths in the database, make sure you do the following (at a minimum), too.
Apply proper file permissions to your content directory (outside of webroot).
Use random names for uploaded files.
Check for duplicate file names before saving a file from an upload.
Only logged in users should be able to download high value content.
Have an effective $_SESSION system that deters session fixation.
Make URLs for high value downloadable content unique per page by using hashed XSRF tokens.
XSRF tokens cover more scenarios when they have a finite lifetime.
Make SQL queries for user content based on the logged in user's ID, not the product exclusively.
Filter and validate all user input.
Use prepared statements with SQL queries.
A few options come to mind.
If you are using Apache, you can use .htaccess to password-protect directories (first Googled link: http://www.javascriptkit.com/howto/htaccess3.shtml).
or
Store the files above the web root.
Create a PHP script that will allow authorised users to access them.
If you want to do it via FTP and you are running cPanel, you may be able to create new FTP accounts; check yourdomain.com/cpanel to determine whether you have it installed.
Storing files in the DB is very bad practice. It is much better to store only information about the file: name, extension, etc. Save the files on the server as $id.$ext; that makes for a good architecture. When a user downloads a file, serve it under the name stored in the DB. A sketch follows.
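A sketch of that naming scheme (table and column names are assumptions):

// On upload: record the original name, store the file as "$id.$ext".
$original = $_FILES['doc']['name'];
$ext = pathinfo($original, PATHINFO_EXTENSION);
$stmt = $pdo->prepare('INSERT INTO files (original_name, extension) VALUES (?, ?)');
$stmt->execute([$original, $ext]);
$id = $pdo->lastInsertId();
move_uploaded_file($_FILES['doc']['tmp_name'], "/var/uploads/{$id}.{$ext}");

// On download: look up the original name by id and send it back, e.g.:
// header('Content-Disposition: attachment; filename="' . $originalName . '"');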
The best way is to store the file reference in the database, while the file itself is stored in the server filesystem. The complexity of this is making sure there is referential integrity between the database file reference and the actual file in the server filesystem. Some databases, such as SQL Server 2008, have a feature (FILESTREAM) that maintains the integrity of file references to the actual files.
Other than that, securing the file itself on the server depends on the OS, where permissions can be configured on the specific folder where the file resides.
If the files are purely static, you could use read-only or WORM media to store the data files, or indeed run the complete web server from a "LiveCD". It's certainly not suited to everyone's needs, but in limited cases where the integrity of the data is paramount, it works.
Downloadable files can be stored in .htaccess-protected folder(s). A script like the one linked below can be used to generate dynamic links for downloadable files.
For example, Secure Download Links: http://codecanyon.net/item/secure-download-links/309295