Password protect automatically generated .html files - php

I'm using the code in Tom's response here. However I have a script that automatically generates .html files into my public_html folder. These files are then loaded by my .php file, which looks something like this:
<?php
require('./access.php');
include('./secret_information.html');
?>
However the "secret_information.html" file is viewable by anyone without the password. I am running an Apache web server. As I understand, all html code / images to be used on a website need to be in the public_html folder. So how can I hide this information? Do I need to setup my automated scripts to generate .php files rather than .html or is there another solution?

include can access any file, as long as it is accessible by the web server.
So you can put secret_information.html anywhere in the file system, preferably outside of the document root or public_html.
If you must keep the file inside your publicly accessible web root for some reason, you can use Apache's Authentication and Authorization facilities.
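For example, a minimal sketch assuming the generated file is moved one level above public_html into a folder named private (both the folder name and the location are assumptions):

<?php
// The generator writes into /home/youruser/private/ instead of public_html (assumed path)
require __DIR__ . '/access.php';                                 // password check stays in public_html
include dirname(__DIR__) . '/private/secret_information.html';  // one level above public_html
?>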

Related

File location issue

I'm having trouble putting data in safe locations. What I want to do is allow my localhost to access the files to create my pages but prohibit all other access.
I started out trying to write a .htaccess file to prevent access to subfolders but read here that this was a poor way to do things and was getting into a tangle anyway so, following advice, I tried moving the files out of the public_html directory:
The structure is:
bits_folder
    images
        testimage.jpg
    files
        testfile.php
public_html
    application
        callingfile.php
With this layout, I get error 404 if I try to access anything in bits_folder from the browser, as desired. callingfile.php however does not seem able to access the testimage, but can include the php testfile.
callingfile.php:
require("../../bits_folder/files/testfile.php"); //works and displays file echo
<img src="../../bits_folder/images/testimage.jpg"> <!-- gives broken image -->
both the files (testimage and testfile) are in the folders where they should be.
I am assuming that the reason for this behaviour is that the img is an HTTP request made after the page is served, and will thus be denied, but I am no server expert. Is this the case? Can this be overcome? Should I be doing this another way?
Only place scripts and files that PHP itself reads (via include/require/readfile) outside public_html. Images and other things that appear as src or are otherwise linked in HTML/JavaScript cause the browser to request them over HTTP, and the web server will refuse to serve them from outside the public directory.
Your browser will get access denied for www.example.com/../../bits_folder/images/testimage.jpg
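If you do need the browser to display an image that lives outside public_html, the usual workaround is a small passthrough script inside public_html that reads the file and sends it. A rough sketch, assuming it sits next to callingfile.php and only testimage.jpg should ever be served:

<?php
// image.php - streams a whitelisted image stored outside the document root
$allowed = array('testimage.jpg');                         // only these names will be served
$name = isset($_GET['name']) ? basename($_GET['name']) : '';
if (!in_array($name, $allowed, true)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: image/jpeg');
readfile(__DIR__ . '/../../bits_folder/images/' . $name);

The HTML then references <img src="image.php?name=testimage.jpg"> instead of linking into bits_folder directly.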

How can I get the contents of a file above webroot (located at home dir)

I want to store the password of my database in a file outside the webroot, as advised in many answers here.
I want to read the file from the dir "~/".
How to do that?
I've tried $file_content = file_get_contents("~/pass", true); but when I echo $file_content it prints nothing.
If you're uploading via FTP, use something like FileZilla and you should see your document root.
For me, it's /home/myusername/public_html.
If you had a file in myfolder above the web root, its path would be /home/myusername/myfolder/file.
Then call file_get_contents() on that absolute path; note that PHP does not expand ~, which is why your original attempt printed nothing.
Although I would recommend putting the file within the web root and granting / disallowing access via permissions in a .htaccess file.
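Putting that together, a minimal sketch using the example path above (myusername, myfolder and the file name are placeholders):

<?php
// PHP does not expand "~", so spell out the absolute path to the file above the web root
$file_content = file_get_contents('/home/myusername/myfolder/pass');
if ($file_content === false) {
    die('Could not read the password file');
}
// echo only while testing; in real code just use $file_content for the DB connection
echo $file_content;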
The classical way to do that would be a config.php in a web app subdirectory, setting database access variables like $password. As far as security is concerned, "config" is admittedly an obvious name.
Also, any PHP scripts that deliver files should be programmed restrictively.
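A minimal sketch of that classical approach, with illustrative names (keep the file outside the document root if you can):

<?php
// config.php (or something less obviously named) - database access variables
$db_host     = 'localhost';
$db_user     = 'dbuser';
$db_password = 'change-me';

// elsewhere, e.g. in a script under public_html:
// require '/home/myusername/config.php';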

Making theme files .html

They have some php code within (if, endif, variables), but essentially they are html files.
Do you think it is a good idea to use the .html extension and prevent direct access to them through .htaccess, so the php code is not visible to anyone?
Is it safe?
Can you test the script?
As far as I know, PHP is just server-side, so after the server does its thing it doesn't show up on the user's computer (when they view the source code).
JavaScript and HTML are displayed, but not the PHP.
I have used the .htaccess file to block some other files in the folder, such as the DB credentials, which I had named .inc (for include). If you block the HTML with the .htaccess, no one is going to be able to see the webpage.
I hope I understood it right! (And clarified it as well)
Just have an .htaccess with deny from all in the views folder... ;p
Though if you have files like images or CSS that you want loaded from that theme folder, then by all means rename the views to .php and put this on the first line:
if (!defined("RUN")){die('No direct access');}
Obviously define('RUN',true); in your config.
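Put together, a small sketch of that pattern (the file and folder names are assumptions):

<?php
// index.php - the entry point that is requested directly
define('RUN', true);
$title = 'Home';
require __DIR__ . '/views/theme.php';

<?php
// views/theme.php - mostly HTML, refuses direct requests
if (!defined('RUN')) { die('No direct access'); }
?>
<h1><?php if (!empty($title)): echo $title; endif; ?></h1>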

PHP remote access to .htaccess protected files

I'm having a bit of trouble trying to access the content of .txt files on a remote server that are in an .htaccess protected directory.
What I am trying to do is the following:
Connect to the FTP server via PHP and use ftp_nlist to retrieve a list of all the .txt files in a directory. Up to here, everything works fine.
For each .txt file found, I want to retrieve the contents. There are a number of ways to do this normally which all work fine when there is no .htaccess file protecting the .txt files.
BUT! As soon as I protect the online directory with the .htaccess file, every single method I have tried fails to get the contents of the .txt files. The .htaccess file that is protecting the folder that contains the .txt files has the following (and nothing else):
<Files *.txt>
Order Deny,Allow
Deny from All
</Files>
Obviously, the online PHP website itself can access the contents of the .txt files without any problems, and the .htaccess file itself is doing its job perfectly (denying direct access to any of the files), but when I'm trying to access the .txt files remotely from my WAMP server, I just can't find a way to bypass the .htaccess protection.
Basically, I want to imitate remotely, from my WAMP server, what my website already does itself locally by using $contents = file($filepath). Surely there must be a way... Can anyone point me in the right direction? Should I be using a different method of protecting the .txt files, or should I be using a specific PHP function to access the contents?
Your question isn't clear.
If you protect a folder or a file with .htaccess you will still be able to download that file with FTP. .htaccess affects only Apache (HTTP requests).
If you want to be able to download those files anyway over HTTP, then you just write a script that outputs their contents:
downloader.php:
//> Check if the admin is logged, and check if $_GET['filename'] is allowed
readfile($_GET['filename']);
Then you can request your file with:
http://yoursite/downloader.php?filename=file.txt
Of course, be sure to protect access to this downloader.php.
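For example, one way to lock it down (the session flag, directory and extension check are assumptions for the sketch):

<?php
// downloader.php - only serves .txt files from one protected directory to an admin
session_start();
if (empty($_SESSION['is_admin'])) { die('Forbidden'); }         // assumed login flag
$name = isset($_GET['filename']) ? basename($_GET['filename']) : '';   // strip path components
$path = '/home/youruser/protected_txt/' . $name;                // assumed directory
if (substr($name, -4) !== '.txt' || !is_file($path)) { die('Not found'); }
header('Content-Type: text/plain');
readfile($path);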

Forbid access to files in a simple PHP login system

I wrote this VERY simple PHP login system:
<?php
session_start();
$error = '';
if (isset($_POST['username']) && isset($_POST['password']))
{
if ($_POST['username'] == 'user' && $_POST['password'] == 'pass')
{
$_SESSION['client'] = 'ok';
Header ("location: /kit/kit/index.php");
}
else
{
$error = 'Usuario o contraseña incorrectos.'; // "Incorrect username or password."
}
}
?>
Don't worry about the vulnerability issues, it's not protecting anything valuable.
In every .php page I add:
<?php
session_start();
if (!isset($_SESSION['client']) || $_SESSION['client'] != 'ok')
{
Header ("location: /kit/index.php");
die();
}
?>
This protects the .php sessions just fine.
The problem is that this doesn´t protect the files.
I mean, if you go directly to:
something/other/file.zip
it will download it whether you have logged in or not.
I hope the question is clear enough, if not, please ask!
To stop a user from seeing the directory, all you need to do is create an index page in that folder. Ex: index.htm, index.html, default.htm, default.html.
To stop a user from entering the folder (e.g. stop anyone from viewing http://www.yoursite.com/myFolder/), you may need to access some features of your web host. Some hosts allow you to password protect files or folders. You can also create an .htaccess file/folder.
An htaccess file is a simple ASCII file, such as you would create through a text editor like NotePad or SimpleText. Many people seem to have some confusion over the naming convention for the file, so let me get that out of the way.
.htaccess is the file's full name. It is not file.htaccess or somepage.htaccess, it is simply named .htaccess
Create the file
In order to create the file, open up a text editor and save an empty page as .htaccess (or type in one character, as some editors will not let you save an empty page). Chances are that your editor will append its default file extension to the name (ex: for Notepad it would call the file .htaccess.txt). You need to remove the .txt (or other) file extension in order to get yourself htaccessing--yes, I know that isn't a word, but it sounds keen, don't it? You can do this by right clicking on the file and renaming it by removing anything that doesn't say .htaccess. You can also rename it via telnet or your ftp program, and you should be familiar enough with one of those so as not to need explaining.
htaccess files must be uploaded as ASCII mode, not BINARY. This makes the file usable by the server, but prevents it from being read by a browser, which can seriously compromise your security. (For example, if you have password protected directories, if a browser can read the htaccess file, then they can get the location of the authentication file and then reverse engineer the list to get full access to any portion that you previously had protected. There are different ways to prevent this, one being to place all your authentication files above the root directory so that they are not www accessible, and the other is through an htaccess series of commands that prevents itself from being accessed by a browser, more on that later)
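For the password-protection route mentioned above, a typical .htaccess looks something like this; the AuthUserFile path is an assumption, and the .htpasswd file itself belongs above the document root:

AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user

The matching .htpasswd is created with the htpasswd utility, e.g. htpasswd -c /home/youruser/.htpasswd someuser.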
JUST IN CASE: to stop users from downloading your files,
store all things that are downloadable outside your document root, which means above the public_html folder.
EDIT: updated the section below to show graphical representation of folder structure
how do you access them then?
work
    downloadableFiles
        downloadables
            - memberOnlyFile.zip
            - welcomePackage.zip
            - membershipVideo.mov
        photos
            - photo1.jpeg
            - photo2.jpeg
    public_html
        - index.htm
        About
            - about.html
            - about.gif
        LogIn
            - login.htm
            - loginScreen.htm
            - loginFancyButton.gif
Now anything in the public_html folder the world can see through your website.
Anything outside your public_html folder will not be visible directly through your website by typing the file name into the address bar of a browser, so that's a good thing: we are going to save all the files we don't want to give access to outside of the public_html folder.
Now say you want a certain user, maybe a logged-in user, to be able to download a file: you can still make the file downloadable by having a link to that file.
If we are at the login page, to access the loginScreen webpage you just write the hyperlink like so:
<a href="loginScreen.htm">login screen</a>
since that page is in the same folder. Now if you want to allow a user to download a file from the downloadableFiles folder, which is outside the public_html folder, you reference it like so:
How would we get to that folder if we are in the LogIn folder viewing the loginScreen.htm page? You go one folder back, so we end up in the public_html folder; then we go another folder back, so we are in the work folder.
So it would look like this so far:
../../ which means two folders back.
Then to reach memberOnlyFile.zip we go into the downloadableFiles folder, then into the downloadables folder, and there we can link to the file memberOnlyFile.zip, which is the file we were looking for.
So the full link now becomes:
<a href="../../downloadableFiles/downloadables/memberOnlyFile.zip">download file</a>
This way the user cannot access the file by simply typing it into the address bar, but can download it if you reference it yourself like the above.
Hope this helps
PK
Store all files you don't want downloaded outside the DocumentRoot.
You need .htaccess to deny access to the folder.
Just have a PHP download script, like this one, that will fetch the file from outside the public_html folder.
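In case that link goes stale, the idea is roughly this (the folder layout follows the work/downloadableFiles example above; the session check mirrors the login code from the question):

<?php
// download.php - lives in public_html and streams a file stored above it
session_start();
if (!isset($_SESSION['client']) || $_SESSION['client'] != 'ok') { die('Not logged in'); }
$file = isset($_GET['file']) ? basename($_GET['file']) : '';   // e.g. memberOnlyFile.zip
$path = dirname(__DIR__) . '/downloadableFiles/downloadables/' . $file;
if ($file === '' || !is_file($path)) { die('File not found'); }
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);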
"Static" files are served by the webserver, not PHP, so authentication is handled differently. There are two easy ways around this:
Handle all authentication in the webserver, e.g. with HTTP basic/digest authentication. Apache 2.2 has a helpful introduction.
Serve the files with PHP, e.g. with foo.php/path/to/file if you have "pathinfo" enabled (according to the PHP docs you set AcceptPathInfo=ON in the server config somewhere) or foo.php?path=path/to/file, which is pretty terrible, but oh well.
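A rough sketch of the second option; the storage directory is an assumption, and the auth check is whatever your site already uses:

<?php
// foo.php - serves /foo.php/some/file.zip via PATH_INFO
session_start();
if (empty($_SESSION['client'])) { die('Not logged in'); }      // your existing check
$base = realpath('/home/youruser/protected');                  // assumed storage dir
$info = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';
$path = realpath($base . $info);
if ($base === false || $path === false || strpos($path, $base . '/') !== 0) {
    die('Not found');                                          // blocks ../ traversal too
}
header('Content-Type: application/octet-stream');
readfile($path);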
There is a more enterprisey solution:
Write an authentication module for your download server which understands authentication cookies from the other site. Many big sites do this (adcdownload.apple.com comes to mind), partly so they can stick the downloads on a CDN but still have some sort of access control.
There is a lazy workaround:
Stick everything in an "unguessable" directory name (e.g. some random base64 chars). Make sure you can't list the parent directory (the easiest way is to create an empty "index.html" file).
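If you take the lazy route, let PHP generate the name from a proper random source instead of inventing one (random_bytes needs PHP 7+; the location is an assumption):

<?php
// one-off script: create an unguessable directory with an empty index.html
$dir = rtrim(strtr(base64_encode(random_bytes(18)), '+/', '-_'), '=');   // URL-safe base64
mkdir(__DIR__ . '/' . $dir, 0755);
touch(__DIR__ . '/' . $dir . '/index.html');   // blank index so nothing gets listed
echo "Directory created: $dir\n";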
