I have another question:
When I type localhost/folder/file.txt into the browser, it opens and shows the content of file.txt.
But I want to make this file readable only by PHP, not by the browser.
I tried everything with chmod but it doesn't work. Is it possible to do that?
Thanks
Write the file outside the web root; then the web server won't make it available to clients. (There is no requirement for a file to be under the document root for PHP to read it; a short sketch of this approach follows the options below.)
Other options include:
Using your webserver's auth/authz systems to secure the file (not recommended for this problem, as it is more likely that a configuration error will break the security than it is that the file will be placed in the wrong place)
Using a database instead
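For illustration, a minimal sketch of the first approach (the /var/secrets path and filename are assumptions):
<?php
// The file lives outside the document root, so the web server never
// serves it, but PHP can still read it by absolute path.
$content = file_get_contents('/var/secrets/file.txt'); // assumed location
if ($content === false) {
    exit('Could not read the file');
}
echo htmlspecialchars($content);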
You could refuse access to the .txt extension.
.htaccess
# prevent viewing of a specific file
<Files file.txt>
order allow,deny
deny from all
</Files>
# multiple file types
<FilesMatch ".(htaccess|htpasswd|ini|phps|fla|psd|log|sh|txt)$">
Order Allow,Deny
Deny from all
</FilesMatch>
You can put the text file into a MySQL database as a BLOB or TEXT column. That way it becomes impossible to read by browser; it can only be retrieved by a query (through PHP).
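A rough sketch of that approach using PDO (the DSN, credentials, and files table are assumptions for illustration):
<?php
// Assumed schema: files(name VARCHAR(255), content TEXT)
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Store the file's contents in the database once...
$stmt = $pdo->prepare('INSERT INTO files (name, content) VALUES (?, ?)');
$stmt->execute(['test.txt', file_get_contents('test.txt')]);

// ...then it can only be read back through a query, i.e. through PHP.
$stmt = $pdo->prepare('SELECT content FROM files WHERE name = ?');
$stmt->execute(['test.txt']);
echo $stmt->fetchColumn();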
Simplest solution:
$s=file_get_contents('test.txt');
If the file has some code to execute, you can eval it.
eval(file_get_contents('test.txt'));
Have you tried chmodding it to 660?
I just tried it on my web server, and the file is not available to the browser.
I want to be able to prevent people from accessing a file on a server, such as a document, if they were to link to it directly via the URL. This is for security purposes, so that documents on the site can't just be stumbled upon and downloaded...
What is the best approach for this?
I've tried using .htaccess to deny access to .doc and .txt files, for example, but you can still download the files; it just prevents you from accessing the directory...which isn't what I want.
<Files ~ "\.(doc|txt)$">
order allow,deny
deny from all
</Files>
Put it in a directory outside the public space and provide it via a custom PHP page that requires a login, or whatever you prefer:
echo file_get_contents('/var/www/example.com/file.txt');
That should work, I guess.
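A minimal sketch of such a gatekeeper page, assuming a session flag named logged_in and the path above:
<?php
// download.php - only logged-in users get the file's contents
session_start();
if (empty($_SESSION['logged_in'])) { // assumed login flag
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}
// The file sits outside the public space, so this is the only way in
echo file_get_contents('/var/www/example.com/file.txt');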
Try putting this in your .htaccess
<FilesMatch "\.(doc|txt)">
Order deny,allow
Deny from all
</FilesMatch>
The best thing to do is to not put it in the web server's document root in the first place.
Alternatively, you can use your .htaccess to redirect all requests for files in that folder to a special PHP page that allows logged-in users to download the file but denies everyone else.
It's also a good idea to put the target file itself in a folder above public_html.
I'm adding some database usage to a public-facing site, and I wanted input on the most secure way to store MySQL connection information. I've come up with a few options:
First, I could store the config in another directory and just set the PHP include path to look for that dir.
Second, I know there are some files that Apache won't serve to browsers; I could use one of those file types.
Third, I could store encrypted files on the server and decrypt them with PHP.
Any help would be much appreciated.
Storing the config outside of Apache's document root is a must.
You can also configure Apache to refuse to serve files using .htaccess.
In the config folder, add a .htaccess with the following:
order allow,deny
deny from all
If you don't want to use .htaccess, as #johua k mentions, you can instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your Apache config.
This will deny serving any files in that folder to anyone. That's fine, since PHP doesn't care about .htaccess when reading files directly, so you can still just
include('config/db.php')
If you configure your PHP scripts properly, they should never appear in plain text, so a file like
define('mysql_password', 'pass')
would never display that text in the browser.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files, and from the web, files handled by PHP should never return their source.
You can explicitly tell Apache never to serve the files, so they would only be include()-able or require()-able.
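As a sketch, with the paths assumed for illustration, the layout could look like this:
<?php
// /home/www/config/db.php - outside the document root, never served
define('MYSQL_HOST', 'localhost');
define('MYSQL_USER', 'user');
define('MYSQL_PASSWORD', 'pass');

<?php
// /home/www/public/index.php - publicly served script
require '/home/www/config/db.php';
$link = mysqli_connect(MYSQL_HOST, MYSQL_USER, MYSQL_PASSWORD);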
My university has multiple servers which have the same data mirrored across them, so I can access for instance
foo.uni.edu/file.php
bar.uni.edu/file.php
The thing is, not all of the servers have PHP installed, so anyone could download the source of my PHP files by connecting through a server that doesn't have PHP installed.
Is there a way, possibly with .htaccess, to avoid this? As in, only allow opening PHP files if PHP is installed on the serving server?
If it's possible to store files outside of the document root, you could work around the problem by storing all sensitive data outside the docroot. You would then have your publicly accessible scripts use include to access those files.
So, if you upload to /username/public_html, and public_html is your document root (e.g., foo.uni.edu/file.php is /username/public_html/file.php), then you would upload to /username/file.php instead and place another script in /username/public_html which merely contains something like include('../file.php');
This is good practice in any case, in case a configuration error on the server ever stops PHP from being parsed.
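For instance, the stub in public_html might be nothing more than this (paths assumed as above):
<?php
// /username/public_html/file.php - thin wrapper; if a server without
// PHP leaks this source, only this include line is revealed, not the
// sensitive code in /username/file.php.
include __DIR__ . '/../file.php';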
You could also try using IfModule and FilesMatch to deny access to PHP files if mod_php isn't enabled:
<IfModule !mod_php.c>
<FilesMatch "\.php$">
Order Deny,Allow
Deny from All
</FilesMatch>
</IfModule>
If this doesn't work, try !mod_php5.c instead.
I have a processing file for my website's payments. It works just fine, but what I would like to do is log all the requests to this page so that if anything throws an error, the raw data is saved and I can process the transaction manually. The processing file uses fopen to write to the log file in another directory.
What I have right now is a separate folder in my root directory with permissions 755, and a log file inside with permissions 777. The processing file that writes to the log (a PHP script, if that matters) is also set to 777.
This works right now, but the log file is publicly available. I know I can be doing this better and that the permissions aren't correct. How can I do this better?
Put the log file outside the document root. The PHP script that writes to it will still be able to get to it (via the full path) but Apache won't be able to serve it.
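As a minimal sketch of the writing side, assuming a log path of /var/log/myapp/payments.log (any writable path outside the docroot will do):
<?php
// Append the raw request to a log file Apache cannot serve
$logFile = '/var/log/myapp/payments.log'; // assumed path outside the docroot
$entry   = date('c') . ' ' . file_get_contents('php://input') . "\n";
$fh = fopen($logFile, 'a');
if ($fh !== false) {
    fwrite($fh, $entry);
    fclose($fh);
}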
I came across this whilst searching for the answer myself. I don't believe there is a simple "permissions fix" that does what you want; perhaps the safest way is to put the log files outside the public_html directory.
However, this can be a nuisance sometimes - especially if you want to, for example, catch PayPal IPN dump text in a log file without having it publicly accessible.
In such cases, you can use .htaccess directives to allow writes from your script but deny reads from the public.
For example, this works for me (Apache .htaccess in the root public_html folder):
<FilesMatch "mycustom\.log">
Order allow,deny
Deny from all
</FilesMatch>
and if you have multiple logs you want to protect, pipe-separate the names, like this:
<FilesMatch "mycustom\.log|ipn_errors\.log">
Order allow,deny
Deny from all
</FilesMatch>
It is worth noting that the Order/Deny directives above are deprecated as of Apache 2.4; you may wish to use the current equivalent, Require all denied, instead: https://httpd.apache.org/docs/2.4/howto/access.html
Hope that helps you!
I'm working on a site that allows users to purchase digital content and have implemented a method that attempts to serve secure downloads.
I'm using CodeIgniter to force downloads like this:
$file = file_get_contents($path);
force_download("my_file_name.zip", $file);
Of course, I make sure the user has access to the file using a database before serving the download, but I was wondering if there was a way to make these files more secure.
I'm using some 8-10 character keys to build the file paths, so the URLs to the files aren't exactly easy to figure out... something like http://mysite.com/as67Hgr/asdo0980/uth89.zip in lieu of http://mysite.com/downloads/my_file.zip.
Also, I'm using .htaccess to deny directory browsing, like so: Options All -Indexes.
Other than that... I have no idea what steps to take. I've seen articles suggesting username and password methods using .htaccess, but I don't understand how to bypass the password prompt that would occur using that method.
I was hoping there might be a method where I could send a username and password combination using headers and cUrl (or something similar), but I wouldn't know where to start.
Any help would be hugely appreciated. Thanks in advance!
Make it so the web server does not serve the files under any circumstances; otherwise all the checking is pretty moot. The best way to do that is to put them somewhere outside the webroot, i.e.:
/
    webroot/            <- root web directory, maybe named www or similar
        index.php       <- your app, served normally
        …other serve-able files…
    files/              <- not part of the serve-able webroot dir
        secret_file     <- web server has no access here
Then, if the only way to access them is through your script, it's as secure as you make your script.
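Here is a sketch of what that script might look like; the access check userMayDownload() is a hypothetical stand-in for your own database check:
<?php
// serve.php - streams secret_file from outside the webroot
if (!userMayDownload($_GET['id'])) { // hypothetical access check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$path = '/files/secret_file'; // outside the webroot, per the tree above
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="my_file_name.zip"');
header('Content-Length: ' . filesize($path));
readfile($path);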
Why not just Deny from All in the .htaccess? Or place the files above the webroot? That would be enough. But your current setup is pretty safe already. Why do you think you need any help?
Your .htaccess should look like this if you want the files to be downloadable only from localhost. It also removes some handlers that could try to process any of the files, just in case, so that only you have access to them. It's also a good idea to store an index.php file in there that checks for the existence of another file: if it exists, set the header; if not, exit.
.htaccess file:
<Files *>
Order Deny,Allow
Deny from all
Allow from localhost
</Files>
RemoveHandler .php .php3 .phtml .cgi .fcgi .pl .fpl .shtml