I created a login page that is processed in PHP; once logged in, you can type in an itemID and the matching image is pulled up.
Structure of the directory is the following: Webroot > (cgi-bin - css - error - images - secure - index.html)
(The ones in parentheses are all in the same directory.) cgi-bin holds include files, and css holds the CSS files used for the website. error holds the pages that .htaccess redirects to in case of a 401, 402, 404, 500 error, etc. The secure folder has files that verify the user is logged in before serving content, and index.html is the login page.
I'm having a hard time finding information on how to protect my sensitive information from the outside (hotlinking, direct URL, etc), while still allowing my program to use it.
The program is written, and works perfectly, but I can type a direct URL to an image or cgi script and view its content. I tried using .htaccess "deny from all", but this denies access from my internal program also.
How can I block external access to the files, but still allow my PHP scripts/forms to retrieve the data?
Thanks for your time and help.
Two ideas:
Your program can pass the username and password, like so:
http://username:password@example.com/some_resource
Or, if you know the IP addresses of the machines that you want to allow, you can deny from all and allow from specific IPs, as in the sketch below.
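For the IP idea, a minimal .htaccess sketch (Apache 2.2 syntax to match the other examples in this thread; 192.0.2.10 is just a placeholder for the address you actually want to allow):
#.htaccess
Order Deny,Allow
Deny from all
Allow from 192.0.2.10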
I have an interesting dilemma... I am hosting several .mp4 files on a WAMP server. I am well aware of the method of storing files outside of the document root and using a PHP script to authenticate a user before retrieving the file contents. However, these .mp4 files are required to be inside the document root. Is there any way that I can authenticate a user who is trying to directly access one of these files? I have tried a .htaccess rewrite that takes a requested URL ending in ".mp4" and redirects to a PHP script passing the requested file as a parameter, but of course this just loops.
This is the rewrite rule in the .htaccess file...
RewriteEngine on
RewriteRule ^(.*)\.mp4$ /media/auth.php?file=$1.mp4
The idea was: a user tries to access http://www.example.com/media/myVideo.mp4, the request gets routed to a PHP script (auth.php) that picks up the requested file from the URL using $_GET['file'], authenticates the user using a $_SESSION variable, and then uses a header('Location: ' . $file) to send the authenticated user to the actual file. Again, I realized quickly after implementing this that it would just loop...
Any help would be greatly appreciated. Thanks!
Here's why they cannot be outside the doc root...
@IdontDownVote, it's complicated, but I'll attempt a short version... I have this MOSTLY working great with files outside the doc root, with one serious issue. When I access the files using the PHP script, I am able to view the video in the Chrome browser, play, pause, rewind, the whole deal. The only (big) problem is that when I use this method, I am not able to Cast the video. When I access a .mp4 file directly, it gives me the option to Download or Cast, but using the PHP script, only the Download option is available. Believe me, I have tried everything, including discussing with two Google developers. I posted on Stack Overflow about this issue here with no joy...
Why can't I cast an MP4 file served by PHP from outside of the document root?
I am not sure why this is getting downvoted; this is causing me a lot of headaches and I am just looking for advice...
First off, have you tested it when serving the file through PHP from outside of the document root?
When I access a .mp4 file directly, it gives me the option to Download or Cast, but using the PHP script
If you can, then you can simply block access to all but one PHP file and use that to dish it out (you could allow access to *.php too).
#.htaccess
Order Deny,Allow
Deny from all
<Files index.php>
Allow from all
</Files>
That will only allow index.php to be accessed in whatever folder (and sub-folders) the .htaccess is put in. This might be better if you have an existing login system in PHP that you want to make use of for your users.
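As a rough illustration of how that one allowed PHP file could dish the video out, here is a minimal sketch; the videos sub-folder and the $_SESSION['logged_in'] flag are assumptions standing in for whatever your existing login code actually sets:
<?php
// index.php - the only file the .htaccess above leaves reachable
session_start();

// Assumed flag set by your existing login code
if (empty($_SESSION['logged_in'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not authorised');
}

// basename() stops "../" tricks from escaping the folder
$file = basename(isset($_GET['file']) ? $_GET['file'] : '');
$path = __DIR__ . '/videos/' . $file;   // assumed storage sub-folder

if ($file === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($path));
readfile($path);
Note that this plain readfile() sketch does not answer HTTP Range requests, which may well be part of the casting problem described in the question above.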
If you can't, then the only way I can think of is to try using Basic Auth and .htaccess, like this:
#.htaccess
Order Deny,Allow
Deny from all
AuthType Basic
AuthName "Admin Only"
AuthUserFile /pathto/.htpasswd
require valid-user
Allow from all
Then in a file named .htpasswd put something like this:
admin:$apr1$o48wfurr$5WaWCjD85kBu/ydGKsQeq/
You can use something like this to hash the password.
http://www.htaccesstools.com/htpasswd-generator/
Then when you give Chrome the URL, you'll have to include the user:pass@ part to bypass the login.
http://user:pass@somedomain/path/to/video.mp4
If you want to tie this into an existing login, you'll have to create some system to sync the passwords in the .htpasswd file with those of your users. That really shouldn't be too hard. I made one to sync our site users' accounts to an sFTP server; that was a real pain because of security, and we have 3 sFTP servers (legacy, normal and dev). Anyway, you basically just have to keep a list of their user/password pairs and update the file when a user is created or deleted or changes their password.
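A rough sketch of such a sync, assuming Apache 2.4 or newer (which accepts bcrypt hashes in the password file; older versions expect the APR1 format shown above). The file path and function name are made up for illustration:
<?php
// update_htpasswd.php - keep one user's line in the .htpasswd file in sync (illustrative only)
function set_htpasswd_entry($file, $user, $plainPassword)
{
    // bcrypt ($2y$...) hashes are accepted by Apache 2.4+ basic auth
    $hash = password_hash($plainPassword, PASSWORD_BCRYPT);

    $lines = is_file($file)
        ? file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        : array();

    // Drop any existing line for this user, then append the fresh one
    $lines = array_filter($lines, function ($line) use ($user) {
        return strpos($line, $user . ':') !== 0;
    });
    $lines[] = $user . ':' . $hash;

    file_put_contents($file, implode("\n", $lines) . "\n", LOCK_EX);
}

// Call it whenever a user is created or changes their password
set_htpasswd_entry('/pathto/.htpasswd', 'admin', 'new-secret');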
Maybe it will work (I honestly don't know ... lol).
Worth a shot.
I'm having trouble putting data in safe locations. What I want to do is allow my localhost to access the files to create my pages but prohibit all other access.
I started out trying to write a .htaccess file to prevent access to subfolders, but read here that this was a poor way to do things, and I was getting into a tangle anyway. So, following advice, I tried moving the files out of the public_html directory:
The structure is:
bits_folder
    images
        testimage.jpg
    files
        testfile.php
public_html
    application
        callingfile.php
With this layout, I get a 404 error if I try to access anything in bits_folder from the browser, as desired. callingfile.php, however, does not seem able to display the test image, although it can include the PHP test file.
callingfile.php:
require("../../bits_folder/files/testfile.php"); //works and displays file echo
<img src="../../bits_folder/images/testimage.jpg"> //gives broken image
Both files (testimage and testfile) are in the folders where they should be.
I am assuming that the reason for this behaviour is that the img is an HTTP request made after the page is served, and will thus be denied, but I am no server expert. Is this the case? Can it be overcome? Should I be doing this another way?
Place only scripts and data that PHP itself reads outside public_html. Images and other resources referenced as src or otherwise linked in HTML/JavaScript cause the browser to request them, and the web server will refuse to serve them from outside the public directory.
Your browser will get access denied for www.example.com/../../bits_folder/images/testimage.jpg
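One common workaround, sketched very roughly here, is to point the img tag at a small PHP script inside public_html that reads the image for the browser; the name serve_image.php and the image/jpeg assumption are illustrative only:
<?php
// public_html/application/serve_image.php?img=testimage.jpg
$name = basename(isset($_GET['img']) ? $_GET['img'] : '');   // basename() strips any "../"
$path = __DIR__ . '/../../bits_folder/images/' . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: image/jpeg');   // assumes .jpg images only
readfile($path);
The page would then use <img src="serve_image.php?img=testimage.jpg"> instead of linking into bits_folder directly.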
I'd like to protect some files with session authentication. Some files can be viewed by users, some not.
I've implemented a solution with mod_rewrite and readfile(). My problem is that this function uses a lot of RAM, and the server goes down when many users download files.
I tried this:
1) Pass the file through the PHP handler and use the auto-prepend function. It doesn't work because once the prepended PHP file has finished, the handler processes the file itself, and in my case the handler choked on invalid ASCII chars. I couldn't manage to stop the handler from processing the file while still outputting it.
2) Put the session, IP and the folder name in a temporary file, which I tried to check in my nginx.conf to exclude from rewriting. I failed because I was not able to extract just the folder name into an nginx variable.
How can I solve this problem? Does anyone have a suggestion?
Thanks
If I understand the question correctly, you are trying to create a system that only allows authorised users to view certain files, and other users to view other files.
If my understanding is correct, then I would personally store the files above the root or in a secure location, and then have an access script (such as fetch_file.php) with a unique identifier in the URL (e.g. fetch_file.php?uid=1234).
If the user is authorised to access the file with the unique id of 1234, provide the file from the location details stored in the database; otherwise, deny the request.
This way, the user cannot access the file without the correct permissions, as it is stored securely above the root, which should not be accessible from the internet.
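A minimal sketch of such an access script. The table and column names, storage directory and session flags are assumptions; the file is streamed in small chunks, which is one common way of working around the readfile() memory concern mentioned in the question:
<?php
// fetch_file.php?uid=1234 - serve a file stored above the document root (illustrative sketch)
session_start();

if (empty($_SESSION['user_id'])) {                 // assumed to be set at login
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$uid = (int) (isset($_GET['uid']) ? $_GET['uid'] : 0);

// Assumed table: files(id, filename, is_private)
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT filename, is_private FROM files WHERE id = ?');
$stmt->execute(array($uid));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// Assumed flag: internal users carry an 'is_internal' marker in the session
if (!$row || ($row['is_private'] && empty($_SESSION['is_internal']))) {
    header('HTTP/1.1 404 Not Found');              // don't reveal whether the file exists
    exit;
}

$path = '/var/secure_files/' . basename($row['filename']);   // assumed storage directory

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));

// Drop any output buffer and stream in chunks to keep memory use low
while (ob_get_level()) {
    ob_end_clean();
}
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();
}
fclose($fh);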
I wrote this VERY simple PHP login system:
<?php
session_start();
$error = '';
if (isset($_POST['username']) && isset($_POST['password']))
{
    if ($_POST['username'] == 'user' && $_POST['password'] == 'pass')
    {
        $_SESSION['client'] = 'ok';
        header("Location: /kit/kit/index.php");
        exit();
    }
    else
    {
        $error = 'Usuario o contraseña incorrectos.'; // "Incorrect username or password."
    }
}
?>
Don't worry about the vulnerability issues; it's not protecting anything valuable.
In every .php page I add:
<?php
session_start();
if (!isset($_SESSION['client']) || $_SESSION['client'] != 'ok')
{
    header("Location: /kit/index.php");
    die();
}
?>
This protects the .php pages just fine.
The problem is that this doesn't protect the files.
I mean, if you go directly to:
something/other/file.zip
it will download whether you have logged in or not.
I hope the question is clear enough, if not, please ask!
To stop a user from seeing the directory, all you need to do is create an index page in that folder. Ex: index.htm, index.html, default.htm, default.html.
To stop a user from entering the folder (e.g. stop anyone from viewing http://www.yoursite.com/myFolder/), you may need to access some features of your web host. Some hosts allow you to password protect files or folders. You can also create an .htaccess file to protect a folder yourself:
An htaccess file is a simple ASCII file, such as you would create through a text editor like NotePad or SimpleText. Many people seem to have some confusion over the naming convention for the file, so let me get that out of the way.
.htaccess is the full file name. It is not file.htaccess or somepage.htaccess; it is simply named .htaccess
Create the file
In order to create the file, open up a text editor and save an empty page as .htaccess (or type in one character, as some editors will not let you save an empty page). Chances are that your editor will append its default file extension to the name (ex: for Notepad it would call the file .htaccess.txt). You need to remove the .txt (or other) file extension in order to get yourself htaccessing--yes, I know that isn't a word, but it sounds keen, don't it? You can do this by right clicking on the file and renaming it by removing anything that doesn't say .htaccess. You can also rename it via telnet or your ftp program, and you should be familiar enough with one of those so as not to need explaining.
htaccess files must be uploaded in ASCII mode, not BINARY, so that the server can use them. Also be aware that if a browser can read the htaccess file, your security can be seriously compromised. (For example, if you have password protected directories and a browser can read the htaccess file, then it can get the location of the authentication file and then reverse engineer the list to get full access to any portion that you previously had protected. There are different ways to prevent this, one being to place all your authentication files above the root directory so that they are not www accessible, and the other is through an htaccess series of commands that prevents it from being read by a browser, as shown below.)
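The self-protection that last sentence alludes to is typically a Files block like the one below; many default Apache configurations already ship an equivalent rule, so treat this as a sketch rather than something you necessarily need to add:
<Files ~ "^\.ht">
Order Allow,Deny
Deny from all
</Files>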
JUST IN CASE: to stop users from downloading your files,
store all things that are downloadable outside your document root, which means above the public_html folder.
EDIT: updated the section below to show a graphical representation of the folder structure.
How do you access them then?
work
    downloadableFiles
        downloadables
            - memberOnlyFile.zip
            - welcomePackage.zip
            - membershipVideo.mov
        photos
            - photo1.jpeg
            - photo2.jpeg
    public_html
        - index.htm
        About
            - about.html
            - about.gif
        LogIn
            - login.htm
            - loginScreen.htm
            - loginFancyButton.gif
Now, anything in the public_html folder can be seen by the world through your website.
Anything outside your public_html folder will not be visible directly to the world through your website by typing the file name into the address bar of a browser. So that's a good thing, as we are going to save all the files that we don't want to give direct access to outside of the public_html folder.
Now, say you want a certain user, maybe a logged-in user, to be able to download a file; you can still make the file downloadable by having a link to that file.
If we are at the login page, to link to the loginScreen webpage you just write down the hyperlink like so:
<a href="loginScreen.htm">login screen</a>
since that page is in the same folder. Now, if you want to allow a user to download a file from the downloadableFiles folder, which is outside the public_html folder, you just reference it with a relative path, as follows.
How would we get to that folder if we are in the LogIn folder viewing the loginScreen.htm page? We go one folder back, so we end up in the public_html folder; then we go another folder back, so we are in the work folder.
So it would look like this so far:
../../ which means two folders back.
Then, to reach memberOnlyFile.zip, we go into the downloadableFiles folder, then into the downloadables folder, and from there we can link to memberOnlyFile.zip, which is the file we were looking for.
So the full link now becomes:
<a href="../../downloadableFiles/downloadables/memberOnlyFile.zip">download file</a>
This way the user cannot access the file by simply typing its name into the address bar, but can download it if you reference it yourself as above.
Hope this helps
PK
Store all files you don't want downloaded outside the DocumentRoot.
You need .htaccess to deny access to the folder.
Just have a PHP download script, like this one, that will fetch the file from outside the public_html folder.
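A minimal sketch of such a download script, reusing the $_SESSION['client'] flag from the login code above; the protected_files folder name and the f parameter are assumptions:
<?php
// download.php?f=report.zip - serve a file stored outside the document root
session_start();

if (!isset($_SESSION['client']) || $_SESSION['client'] != 'ok') {
    header("Location: /kit/index.php");
    die();
}

$name = basename(isset($_GET['f']) ? $_GET['f'] : '');   // basename() blocks "../" tricks
$path = __DIR__ . '/../protected_files/' . $name;        // assumed folder above public_html

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);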
"Static" files are served by the webserver, not PHP, so authentication is handled differently. There are two easy ways around this:
Handle all authentication in the webserver, e.g. with HTTP basic/digest authentication. Apache 2.2 has a helpful introduction.
Serve the files with PHP, e.g. with foo.php/path/to/file if you have "pathinfo" enabled (according to the PHP docs you set AcceptPathInfo On in the server config somewhere), or foo.php?path=path/to/file, which is pretty terrible, but oh well.
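A rough sketch of the pathinfo variant; foo.php is just a placeholder name, the base directory is an assumption, and the authentication step is left as a comment:
<?php
// foo.php/path/to/file - needs AcceptPathInfo On in the server configuration
$base      = realpath('/var/protected');                       // assumed storage directory
$requested = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';

// Resolve the path and make sure it still lives under $base
$path = realpath($base . $requested);
if ($path === false || strpos($path, $base . DIRECTORY_SEPARATOR) !== 0) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// ... authenticate the user here (session check, HTTP auth, etc.), then:
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);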
There is a more enterprisey solution:
Write an authentication module for your download server which understands authentication cookies from the other site. Many big sites do this (adcdownload.apple.com comes to mind), partly so they can stick the downloads on a CDN but still have some sort of access control.
There is a lazy workaround:
Stick everything in an "unguessable" directory name (e.g. some random base64 chars). Make sure you can't list the parent directory (the easiest way is to create an empty "index.html" file).
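For the lazy workaround, the directory name only has to be generated once; a throwaway snippet like this (PHP 7+ for random_bytes) would do:
<?php
// Print an unguessable, URL-safe directory name; create the folder and an empty index.html inside it
echo rtrim(strtr(base64_encode(random_bytes(18)), '+/', '-_'), '=');   // e.g. 24 random characters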
I have a directory of files that logged-in users can upload to and access. Some of the files are public, and others are private - for internal access only. The filenames and access settings are saved in a database.
Can anybody give me some resources or show me an example of how I can use session data (and .htaccess?) to allow access to the private files only for authorized users?
I'm thinking it might be easier to keep public documents in a separate, unprotected directory, though I'd kind of like to keep everything together.
I'm not concerned about top-level security or encryption, as the files aren't terribly sensitive, but i want to keep them from being indexed on search engines, etc.
thanks!
I suppose I wouldn't use a .htaccess file (or any kind of HTTP authentication) for that: .htaccess / .htpasswd are great when you want to allow/deny access to a whole directory, not to specific files.
Instead, I would:
Deny any access to the files -- i.e. use a .htaccess file, containing Deny from All
That way, no-one has access to the file
Which means everyone will have to use another way to get to the files, than a direct URL.
Develop a PHP script that would:
receive a file identifier (a file name, for instance; or some identifier that can correspond to the file)
authenticate the user (with some login/password fields), against the data stored in the database
if the user is valid and has access to the file (this matters if different users don't have access to the same set of files), read the content of the file from your PHP script, and send it to the user.
The advantage is that your PHP script has access to the DB -- which means it can allow users to log-in, log-out, it can use sessions, ...
About the "send the file from PHP" part, here are a couple of questions that might bring some light:
Sending correct file size with PHP download script
Resumable downloads when using PHP to send the file?
forcing a file download with php
I'd create a custom index script in PHP -- something that would show the files dynamically. Use that to keep only the right files listed. Afterwards, to further protect the files, fetch the file contents dynamically -- Pascal MARTIN's links show you how to use PHP to control the file streaming, and you can use that to block access to hidden files for users who aren't supposed to get to them.
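A very small sketch of that kind of dynamic index, with the table layout assumed and the actual file serving delegated to a protected download script such as the fetch_file.php idea above:
<?php
// list_files.php - list only the documents this visitor is allowed to see
session_start();
$isInternal = !empty($_SESSION['is_internal']);    // assumed session flag for internal users

// Assumed table: documents(id, filename, is_public)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$sql = $isInternal
    ? 'SELECT id, filename FROM documents'
    : 'SELECT id, filename FROM documents WHERE is_public = 1';

foreach ($pdo->query($sql) as $row) {
    // fetch_file.php is whatever protected download script you use (see the sketches above)
    printf('<a href="fetch_file.php?uid=%d">%s</a><br>',
        $row['id'], htmlspecialchars($row['filename']));
}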