Okay, we have a subscription site up on our dedicated server. We feed content to paying members who access the site via our login page. Subscriptions are handled by a third-party biller who writes new member info to a database on our server. Member authentication is done with a MySQL database rather than .htaccess/.htpasswd. The reason is that our research showed the .htaccess/.htpasswd approach to be insecure (user credentials are transmitted in plain text) and that it offers no way for a user to log out. Hence the MySQL-based authentication. It all works great.
Except we have a problem: the folders that contain the members-only content need to be secured so that nobody can reach the downloads by typing in the complete file path and file name, bypassing our website entirely.
So, in the interest of time, we went to the host, who claimed to know about this sort of thing, and hired them to write a custom .htaccess file.
First iteration: It redirected every user login back to the index.php page instead of allowing access to the members area. Direct file access was blocked, however.
Second iteration: Member access to the members area was restored, but the content was once again vulnerable to direct download.
Third iteration: Members could reach the members area, and direct browser access to the content was blocked. HOWEVER, all of the .jpg files that used to display with each download file in the members area are now broken links, and so are all of the thumbnails in the associated photo galleries, which prevents viewing the larger images they represent.
CONCLUSION: The host is backing out of the deal, saying that what we want cannot be done. To recap, what we want is to:
Allow our registered members access to the members area through our login page.
Prevent direct browser access to our content.
Allow all of the .jpg images to display with the download files and in the thumbnail galleries.
They claim this can't be done; my suspicion is that they simply don't know how to do it. Certainly there are many subscription sites on the internet that use .htaccess files to secure their content.
ADDITIONAL INFO: We have an SSL certificate for this domain. Could that cause a problem? Also, shouldn't the .htaccess file that protects the members-area content live in the members-area folder rather than in the root (where it is now), and wouldn't that make the .htaccess coding less complex?
I'm having a hard time believing that what we are asking to be done is not do-able.
Please advise. Any and all help will be greatly appreciated.
Skip the .htaccess route. Store the file names for the members-only content in MySQL, then use PHP to link to them for members only. PHP would expose only identifying information, never the actual file names: e.g. a MySQL index number, storage date, or member ID, any of which can be used to generate (and retrieve) a unique filename that you never expose.
I've done this before in Java using servlets in the 'src=' part of the img tag. I expect that PHP offers something comparable.
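Here is a rough PHP sketch of that idea. The table, column, directory, and session-variable names below are all placeholders; adjust them to your own schema and login check.

    <?php
    // image.php - minimal sketch; used as <img src="image.php?id=123">
    // so the real file name is never exposed to the browser.
    session_start();

    if (empty($_SESSION['member_id'])) {            // assumes your login page sets this
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    $id  = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    $pdo = new PDO('mysql:host=localhost;dbname=members', 'dbuser', 'dbpass');

    $stmt = $pdo->prepare('SELECT stored_name, mime_type FROM member_files WHERE id = ?');
    $stmt->execute(array($id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row === false) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }

    $path = '/home/site/private_files/' . basename($row['stored_name']);   // outside the web root
    header('Content-Type: ' . $row['mime_type']);
    header('Content-Length: ' . filesize($path));
    readfile($path);

The same pattern works for the download files themselves, not just the .jpgs; only the Content-Type changes.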
Related
I have a public website with one page that is password protected. On this page there are links to several PDF files, plus some text. The page and the files are currently protected with .htaccess and .htpasswd; when users try to access them, a pop-up asks for a login name and password.
It works, but I don't like how it looks. I'd like a login page on the website with the same look as the other pages, like most websites have.
I have looked around for a while and found this:
Easy way to password-protect php page
After messing around with it for a bit, I got it to work and successfully password-protected one page. The problem, however, is the PDF files on that page. I have no idea what to do with those.
So for my question, I'd like the following:
1) A nice-looking page where people can log in.
2) One password-protected webpage behind this login.
3) Multiple password-protected PDF files accessible through this webpage (they can just open in the browser).
There is only 1 name with 1 password.
Any suggestions on how this can be achieved?
The problem is that on your 'secure' page you can't provide a static link to the PDF file, or anyone who knows the link has the PDF. That is security through obscurity, which is considered bad practice.
Two possibilities immediately come to mind to protect your PDFs:
Don't link to the PDF itself, but to a script that 'transfers' it. The PDFs live in a directory that is not web-accessible (outside the document root) but is readable by the script, which reads the PDF and writes it to the client. (A sketch of this follows below.)
Link to the PDF itself, but configure the web server to check for a valid session from your script. Users who follow the direct link without being logged in receive an error. Bonus: point the error page at the logon page. Using mod_auth_form could be the easiest way here.
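A bare-bones sketch of option 1, assuming your login form sets $_SESSION['authenticated'] and the PDFs live in a directory outside the document root (both names are just examples):

    <?php
    // getpdf.php - pass-through script for option 1; names below are placeholders.
    session_start();

    if (empty($_SESSION['authenticated'])) {
        header('Location: login.php');              // not logged in: go to the login page
        exit;
    }

    // Whitelist the files so a crafted request cannot read arbitrary paths.
    $allowed = array('newsletter.pdf', 'minutes.pdf');
    $file    = isset($_GET['file']) ? basename($_GET['file']) : '';

    if (!in_array($file, $allowed, true)) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }

    $path = '/home/site/private_pdfs/' . $file;
    header('Content-Type: application/pdf');
    header('Content-Disposition: inline; filename="' . $file . '"');   // "inline" opens it in the browser
    header('Content-Length: ' . filesize($path));
    readfile($path);

With only one username and password, option 1 is probably the least configuration; option 2 needs mod_auth_form (and its companion session modules) available on the server, which not every host enables.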
I know that PDF files can be password protected, would you consider this as an option?
You would do this when you are creating the PDF file.
I've been trying to create a website that lets users upload Word documents. Those documents are then stored in a public directory of the website. The thing is, I don't want everyone to be able to access the uploaded documents; I want to check first that they are logged in and are "authorized users", say with an account level of 50 or higher, before they are allowed to open the directory.
Is there any way I can do this through .htaccess, or is there a better solution?
I don't know if this is a dumb question, but do help me please. I would deeply appreciate any help I can get right now.
Note:
Sorry for not mentioning this earlier, but I actually want to use Google Docs to view these documents so I can embed them in my website.
It sounds like you're taking the approach of putting the public documents directory somewhere underneath your web root directory. For a number of reasons (security, portability, maintainability), this is not the best approach to take.
Here's a quick-and-dirty approach, sketched in code after the list (I'm assuming you're already handling user authentication with a database or some other credential store):
Place the documents directory somewhere outside your web root directory.
Create a function (or class) to read the list of files in the documents directory (see scandir(): http://www.php.net/manual/en/function.scandir.php).
Create a page that shows the result of reading the documents directory. Each file should be a link to a download page, with a URL parameter indicating the file. On this page, check the user's credentials before showing the file list.
In the page that the file list points to, check that the requested file actually exists in the documents directory (and don't forget to re-check the user's credentials!), then read the file and push it to the user. See readfile() (http://php.net/manual/en/function.readfile.php), paying special attention to the header fields set in the example.
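A compressed sketch of steps 2 and 3, where is_logged_in() stands in for whatever credential check you already have and the directory name is made up:

    <?php
    // list_docs.php - steps 2 and 3: list the files in a directory outside the web root.
    session_start();
    if (!is_logged_in()) {
        die('Please log in first.');
    }

    $docDir = '/home/site/private_docs';            // outside the web root
    foreach (scandir($docDir) as $file) {
        if ($file === '.' || $file === '..') {
            continue;
        }
        echo '<a href="get_doc.php?file=' . urlencode($file) . '">'
           . htmlspecialchars($file) . "</a><br>\n";
    }

The matching get_doc.php (step 4) re-checks the login, runs basename() on the requested name, confirms the file exists inside $docDir, and then streams it with readfile() after setting the Content-Type and Content-Disposition headers.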
You'd probably want to use a database (MySQL?) and PHP sessions to check that:
the user has logged in successfully (credentials in the database)
the user has 'level 50' or higher: if ($level >= 50)
sessions and session variables provide persistent authentication as users move between pages.
You should not need .htaccess files for this; a minimal session gate is sketched below.
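Something like this at the top of every protected page is usually enough, assuming your login script stores the user's ID and level in the session (the variable names are placeholders):

    <?php
    // auth_check.php - include this at the top of every protected page.
    // Assumes login.php sets $_SESSION['user_id'] and $_SESSION['level'] after checking MySQL.
    session_start();

    if (empty($_SESSION['user_id']) || $_SESSION['level'] < 50) {
        header('Location: login.php');              // not logged in, or level too low
        exit;
    }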
OK, here's my problem: content is disappearing from my site. It's not the most secure site out there; it has a number of issues. Right now, every time I upload a page that can delete content from my site using simple links wired to GET requests, I find the corresponding content deleted en masse.
For example, I have a feature on my site for uploading images. Once a user uploads an image, the admin (the owner) can use another page to delete all (owned) images from the site. The delete functionality is implemented so that when the admin clicks the link under a thumbnail of an uploaded image, a GET request is sent that deletes the image's information from the site's database and removes the image from the server file system.
The other day I uploaded that functionality, and the next morning I found all my images deleted. The pages are protected by user authentication when viewed in a browser. To my surprise, however, I could wget that page without any problem.
So I was wondering: is some evil web bot deleting my content by following those links? Is that possible? What do you advise for further securing my website?
It is absolutely possible. Even non-evil web bots could be doing it. The Googlebot doesn't know that the link it follows has any destructive side effect.
The easiest way to address this is to set up a proper robots.txt file telling bots to stay away from specific pages. Start here: http://www.robotstxt.org/
RFC 2616 (HTTP protocol), section 9.1.1: Safe Methods:
The convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval. These methods ought to be considered "safe". This allows user agents to represent other methods, such as POST, PUT and DELETE, in a special way, so that the user is made aware of the fact that a possibly unsafe action is being requested.
Basically, if your application allows deletion via GET requests, it's doing it wrong. Bots follow public links, and they have no reason to expect that doing so will delete things; neither do browsers. Even if the links are protected, it could still be browser prefetching or acceleration of some kind.
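One way to fix it, sketched with placeholder names ($imageId stands for whatever ID your gallery loop provides), is to turn each delete link into a small POST form carrying a per-session token:

    <?php
    // gallery.php - sketch: render deletes as POST forms instead of GET links.
    session_start();
    if (empty($_SESSION['delete_token'])) {
        $_SESSION['delete_token'] = bin2hex(random_bytes(16));  // PHP 7+; use another CSPRNG on older versions
    }
    ?>
    <form method="post" action="delete_image.php">
        <input type="hidden" name="image_id" value="<?php echo (int) $imageId; ?>">
        <input type="hidden" name="token" value="<?php echo $_SESSION['delete_token']; ?>">
        <input type="submit" value="Delete">
    </form>

delete_image.php should then refuse anything that is not a POST, or whose token does not match $_SESSION['delete_token'] (hash_equals() is the right comparison), and only after that delete the database row and unlink() the file. A crawler or prefetcher following links will never trigger it.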
Edit: It might also be Bing. Nowadays Internet Explorer reports everywhere you go back to Microsoft to feed its shitty search engine.
Typically, a search bot will scan a page for links and peek down those links to see what pages are behind them. So yes, if a bot has access to that page, and the page contains links that delete items, and the bot opens those links to see what's behind them, the delete code simply gets triggered.
There are a couple of ways to block bots from scanning pages. Look into robots.txt. Also, you might want to look into the mechanism / safety of your admin authentication system... ;-)
You can use a robots.txt file to block access for some web bots.
And for those that don't look for the robots.txt file, you can also use JavaScript; there shouldn't be many web bots interpreting it.
I want the files on my web server to be secure; only authenticated users should be able to access them. I thought of storing the files in the database as LONGBLOBs, but that supports only up to 2 MB of data, and the file size may exceed 50 MB. Is there any better way to secure the files? Please help me. Thanks in advance.
Don't store them in a database. Put them in your web directory and secure them using .htaccess.
If you want to authenticate via other means, then store the files in a directory that isn't web-accessible but is readable by the user php runs as.
Discussion
If you opt to keep high value downloadable content files directly on the filesystem, the best thing to do is to keep them outside of the webroot.
Then your application will have to solve the problem of creating URLs (URL-encoding where necessary) for the content (PDFs, Word docs, songs, and so on).
Generally, this can be achieved by using a query to retrieve the file path and then using that path to send the content to the user (with header() and so on) when he or she clicks an anchor, all without the user ever seeing the true server-side file path.
If you do not want user A sharing URLs for high value downloadable content with user B, then your application must somehow tie the links exclusively to user A. What can be done? Where should I start?
Obviously, you want to make sure user A is logged in with a session before he or she can download a file. What is not so obvious is how to prevent a logged in user B from using a URL sent from user A (to user B) to download A's digital content.
Using $_SESSION to store the logged in user's ID (numeric or string) and making that ID part of the eventual query (assuming content is tied to user purchases or something similar) will prevent a logged in user B from downloading things they have not purchased, but you will still incur the resource hit of processing the SQL empty set for items they have not purchased. This sounds like a good step two.
What about step one? Is there something that can prevent the need to do a query to begin with?
Well, let us see. In HTML forms, one might use an XSRF token in a hidden field to verify that a submitted form actually originated from the web server that receives the POST/GET request. One token is used for the entire form.
Given a page of user-specific things to download (anchors), one could embed a single token (the same token, but different per page request) into each anchor's href attribute as a query string parameter and store a copy of that token in $_SESSION.
Now, when a logged in user B attempts to use a URL shared by a logged in user A, the whole thing fails because user A and user B have different sessions (or no session at all), and thus different tokens. In other words, "My link is the same as yours, but different." Anchors become tied to the session, not just to the page, user, or content.
With that system in place, PHP can determine whether a request for content is valid without getting the database involved (by comparing the submitted token to the one in $_SESSION). What is more, a time limit can be kept in $_SESSION to limit the lifetime of a valid XSRF token: just use the time() function and basic math. Sixty minutes might be an ideal token lifetime for an anchor in this situation. Have the user log in again if the token for a clicked anchor has expired.
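A small sketch of the idea; the file ID, script names, and the one-hour window are all just examples:

    <?php
    // downloads.php - embed a per-session token, with an expiry, in every download anchor.
    session_start();

    $lifetime = 3600;                               // sixty minutes
    if (empty($_SESSION['dl_token']) || (time() - $_SESSION['dl_token_time']) > $lifetime) {
        $_SESSION['dl_token']      = bin2hex(random_bytes(16));
        $_SESSION['dl_token_time'] = time();
    }

    $token = $_SESSION['dl_token'];
    echo '<a href="fetch.php?item=42&t=' . $token . '">Download your purchase</a>';

fetch.php compares $_GET['t'] against $_SESSION['dl_token'] (and checks the timestamp) before it ever touches the database; user B pasting user A's link arrives with a different session and a different token, so the request is rejected cheaply.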
Summary
If you use files on a filesystem and store the paths in the database, make sure you do the following (at minimum), too.
Apply proper file permissions to your content directory (outside of webroot).
Use random names for uploaded files.
Check for duplicate file names before saving a file from an upload.
Only logged in users should be able to download high value content.
Have an effective $_SESSION system that deters session fixation.
Make URLs for high value downloadable content unique per page by using hashed XSRF tokens.
XSRF tokens cover more scenarios when they have a finite lifetime.
Make SQL queries for user content based on the logged in user's ID, not just the product.
Filter and validate all user input.
Use prepared statements with SQL queries (a sketch follows this list).
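For the last two points, a sketch of fetch.php's query; the table and column names are invented for the example:

    <?php
    // fetch.php (excerpt) - the query is scoped to the logged in user's ID, so a
    // valid-looking item ID returns nothing unless this user actually purchased it.
    $pdo  = new PDO('mysql:host=localhost;dbname=shop', 'dbuser', 'dbpass');
    $stmt = $pdo->prepare('SELECT stored_path FROM purchases WHERE buyer_id = ? AND item_id = ?');
    $stmt->execute(array((int) $_SESSION['user_id'], (int) $_GET['item']));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row === false) {
        header('HTTP/1.1 404 Not Found');           // not purchased, or does not exist
        exit;
    }
    // ...send $row['stored_path'] to the user with header() and readfile()...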
A few options come to mind.
If you are using Apache, you can use .htaccess to password-protect directories (first Googled link: http://www.javascriptkit.com/howto/htaccess3.shtml).
or
Store the files above the web root.
Create a PHP script that allows authorised users to access them.
If you want to do it via FTP and you are running cPanel, you may be able to create new FTP accounts; check yourdomain.com/cpanel to see whether you have it installed.
Storing files in the database is very bad practice; it is much better to store only information about the file (name, extension) and save the files on the server as $id.$ext. That makes for a good architecture, and when a user downloads a file, they receive it with the name stored in the database. Sorry for my English.
The best way is to store a file reference in the database while the file itself lives on the server filesystem. The complexity is in making sure there is referential integrity between the database file reference and the actual file on the filesystem. Some databases, such as SQL Server 2008, have a feature that maintains the integrity of file references to the files themselves.
Other than that, securing the file on the server depends on the OS, where permissions can be configured on the specific folder where the file resides.
If the files are purely static you could use read-only or WORM media to store the data files or indeed run the complete web server from a "LiveCD". It's certainly not suited to everyone's needs but for limited cases where the integrity of the data is paramount it works.
Downloadable files can be stored in .htaccess-protected folders, and a script like the one linked below can be used to generate dynamic links for the downloadable files.
For example, Secure Download Links: http://codecanyon.net/item/secure-download-links/309295
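If you would rather roll your own than buy a script, the usual trick is a signed, expiring link: hash the file ID and an expiry time with a server-side secret, then verify the hash in the download script. A sketch, where the secret, IDs, and script names are all placeholders:

    <?php
    // makelink.php - build a download URL that expires after one hour.
    $secret  = 'a-long-random-server-side-secret';
    $fileId  = 42;
    $expires = time() + 3600;
    $sig     = hash_hmac('sha256', $fileId . '|' . $expires, $secret);

    echo '<a href="download.php?id=' . $fileId . '&e=' . $expires . '&s=' . $sig . '">Download</a>';

    // download.php recomputes hash_hmac('sha256', $_GET['id'].'|'.$_GET['e'], $secret),
    // compares it to $_GET['s'] with hash_equals(), rejects the request if they differ
    // or if time() > $_GET['e'], and only then streams the file from the protected folder.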
I am currently working on a project built around a membership system. The idea behind the website is that you can download electronic (PDF) versions of a magazine if you are a paying member of the website.
The problem is that after the PDF has been downloaded, it can be sent to anyone and opened by anyone. Is there a way to allow only the member who paid for the PDF to access it? Is there a third-party service that could host the PDFs and let the user read them at a unique URL containing a random string unique to that download?
Being able to stop the PDFs from being openly distributed and viewed by non-members is paramount.
Does anyone have any ideas? Basically I am looking for a DRM-like system for PHP (I am assuming that is impossible).
Render their name, credit card number and valid thru date on every single page.
Password-protect the PDF with a passkey that is unique to the user who downloaded it (such as their password for your site). There's nothing to stop the registered user giving that passkey away along with the copy of the file, though.
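If you want to try that, one way is to shell out to a command-line tool when the member requests the file. A sketch assuming the qpdf tool is installed on the server; the paths and variables are made up, and $memberPassword would come from your own membership system:

    <?php
    // protect_pdf.php - wrap the member's copy of the PDF with a user-specific open password.
    $source    = '/home/site/private_pdfs/issue-12.pdf';
    $output    = sys_get_temp_dir() . '/issue-12-' . uniqid() . '.pdf';
    $userPass  = $memberPassword;                   // e.g. the member's site password
    $ownerPass = 'owner-secret';

    $cmd = sprintf('qpdf --encrypt %s %s 128 -- %s %s',
        escapeshellarg($userPass),
        escapeshellarg($ownerPass),
        escapeshellarg($source),
        escapeshellarg($output));
    exec($cmd, $ignored, $status);

    if ($status === 0) {
        header('Content-Type: application/pdf');
        header('Content-Length: ' . filesize($output));
        readfile($output);
        unlink($output);
    }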
You can hide the PDF's URL from the user by using a download.php script together with an ID; the script only delivers files to users with the appropriate rights. This way you can prevent users from sending the download link to somebody else: users without the proper credentials won't be able to start the download.
But: as soon as the PDF has left your server (even if a second user cannot download it), the first user can do whatever he or she likes with it. Even if you encrypt your PDFs, hide them in password-protected ZIP archives, lock them in a chest and bury them six feet underground... the first user must have the information needed to read the PDF and can give it to anybody else.
If users can read the text on their computer screen then it's already on their machine. In that case your only ally against unsolicited copying is ignorance.
In other words, it's sensible to make the copying of text as difficult as possible.
For example, don't offer the PDF directly but display it through a Flash-based reader. Then the only way for users to copy it is to take a screenshot of each page, which is about the best "copy protection" you can get without heavy-handed encryption combined with a physical security token.
And of course, you can include sensitive information on every page, such as names, passwords, etc., to make the theoretical copying process even more arduous.
How about this: rather than giving them a copy that they can do whatever they like with, why not give them access to a Flash-based "pageflip" system with your PDF on it? You control access to the page via login, and you also control the content.
ANY code or content that you give to the user can be stolen. It just depends on the energy and knowledge of your users how many will steal it. Often, the tougher the protection, the more likely they are to share it with others; hence the plethora of Adobe CS5 downloads on warez sites.
(Yes, for those of you who will inevitably point it out, I can steal Flash too, but it's a lot of work!)