I would like to keep all config options for a webapp in one file (paths, passwords, options which are read by PHP, Sass during compilation, maybe Grunt, ...).
I like the JSON format since it's very clear and almost anything can parse JSON. But by default, .json files can be downloaded.
Can I safely prevent that by giving the file a .json.php extension?
What are the drawbacks? Better Approaches?
To prevent the file being downloaded, generally the way to go is to store it in a directory that is not served by the web server. I don't know what setup you're in, but assuming an Apache setup, if for example your .php files are served from a directory /home/user/htdocs, you could create a directory /home/user/config, ensure that it is readable by the webserver, and store the .json files there.
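As a sketch, this is how PHP could then load that file under the directory layout assumed above (the file and key names are placeholders):
<?php
// Assumed layout: PHP served from /home/user/htdocs,
// config kept in /home/user/config (not web-accessible)
$config = json_decode(file_get_contents('/home/user/config/app.json'), true);
if ($config === null) {
    die('Could not parse config file');
}
// e.g. $config['db_password'], assuming the JSON defines such a key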
Another approach, again assuming Apache, would be to create an .htaccess file containing the following (inspired by this answer):
RedirectMatch 404 \.json$
This would not only prevent downloading any and all .json files in the directory, but hide their very existence.
It might just be possible to do it the way you suggested, by storing the file with a .json.php extension, although this would not be a recommended approach. For this to work, the file has to be valid PHP but it must obviously be valid JSON as well and we are hampered somewhat by the fact that JSON does not allow comments. Something like the following would stop the PHP interpreter soon after the start of the file, before spilling your secrets:
{
"<?php exit('Access denied'); ?>": null,
"password": "secret"
}
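Note that when PHP reads this file from disk (rather than through the web server), the <?php ... ?> part is just an ordinary JSON key, so it still parses cleanly. A small usage sketch, assuming the file is saved as config.json.php:
<?php
// Read the file from disk, not over HTTP, so the PHP tag is never executed
$config = json_decode(file_get_contents(__DIR__ . '/config.json.php'), true);
echo $config['password']; // "secret"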
I'm working on an upload script, and I want a user to be able to upload any file.
I had it all working on localhost; I added
php_flag engine off
AddType text/plain php html shtml php5 php4 php3 cgi asp aspx xml
to my .htaccess in the upload folder, and it showed the source of PHP, HTML and all other files. Exactly as I wanted.
Now I tried to upload it to a real webserver, and unfortunately my host does not allow such .htaccess files.
I tried opening the files with file_get_contents() and fopen() and giving them a text/plain header, but nothing works. It first executes the scripts and shows the output in my textarea.
Do you guys have any suggestions on how I can fix this without .htaccess?
Thanks!
Don't upload files into the webroot and let people access them directly. As you say, .php scripts (and probably a lot more) get executed that way. It's a classic vector for arbitrary code execution attacks.
Store uploaded files outside the webroot where they're not publicly accessible and create a script that allows users to download the files, for example using readfile or Apache mod_xsendfile, after having done the necessary permission checks.
Also see Security threats with uploads.
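A minimal sketch of such a script using the mod_xsendfile variant mentioned above (the storage path and the userMayDownload() permission check are assumptions for the example):
<?php
// download.php?file=report.pdf -- a sketch, not a complete implementation
$name = basename($_GET['file']);             // strip any path components
$path = '/home/user/uploads/' . $name;       // assumed directory outside the webroot

if (!is_file($path) || !userMayDownload($name)) {   // hypothetical permission check
    header('HTTP/1.0 403 Forbidden');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
// With mod_xsendfile enabled, Apache streams the file itself;
// with plain PHP, call readfile($path) here instead.
header('X-Sendfile: ' . $path);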
I'm using PHP and MySQL. I've created a members area where people can upload important images (basically for financial records). I was hoping to get some advice as to the best way to store these files. What kind of folder structure would be best? E.g. domain.com/Files/UserName/RandomGeneratedName/Files.
Also, any advice for chmod, .htaccess, .htpasswd and any kind of password protection with PHP.
Thanks in advance.
I would recommend storing them outside the web tree. That way you have to explicitly enable access to them rather than disable it, so if there is ever a bug in your .htaccess/config/code, access to the files is disabled, not enabled.
Second, get rid of the random directory: it doesn't add much to the security, but it complicates the implementation unnecessarily.
You can use PHP to check the member's credentials, send the appropriate headers (MIME type, ETag, etc.), and serve the file via fpassthru() or something similar.
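A minimal sketch of that serving step, assuming the file lives outside the webroot and that a checkMemberCredentials() function exists in your application (both are assumptions for the example):
<?php
session_start();
if (!checkMemberCredentials()) {     // hypothetical check against your members table
    header('HTTP/1.0 403 Forbidden');
    exit;
}

$path = '/home/user/memberfiles/statement.png';   // assumed location outside the webroot

header('Content-Type: image/png');                // appropriate MIME type for the file
header('ETag: "' . md5_file($path) . '"');        // simple ETag, as mentioned above
$fp = fopen($path, 'rb');
fpassthru($fp);                                   // stream the file to the client
fclose($fp);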
Best way:
protect the directory with .htaccess
add an index.html to the directory "just in case"
use random file names to store them in your directory
use php/mysql to check if user has access rights to your files
example:
You have a file in domain.com/protecteddir/sdjasdu83299sdnYUsb.dat
You can use php/mysql to send the user to a virtual directory to download the file. You can send the correct file header + file name via PHP, so even if the file is called sdjasdu83299sdnYUsb.dat, the user would download it as "myfinancial.doc".
The user will never know where the real file is located, nor its real name.
Your .htaccess file should contain:
<Files *>
Order Allow,Deny
Deny from All
</Files>
You could use .htaccess for URL rewriting and put an index.php file (one that sends visitors to the home page) in the folder that saves your files, so when anyone wants to enter this folder:
First: he doesn't know the real address.
Second: because there is an index.php that sends him to the home page, he cannot enter.
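That index.php can be as simple as a redirect, e.g.:
<?php
// index.php inside the upload folder: bounce any directory request to the home page
header('Location: /');
exit;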
I was reading about PDO and I came across the parse_ini_file() function. A number of developers suggested using this function to parse in DB settings rather than hard-coding them, for security reasons.
My question to you is, does it make sense to do a file read for every load of your PHP application for this extra "security" ?
I wonder how expensive this file read is.
(PHP 5.3; the suggestion appears in the comments at http://www.php.net/manual/en/class.pdo.php)
I don't really see how it's any more secure.
For example, if your DB settings are stored in defines within a "config.php" file outside of the main web root, they're just as secure as if they were stored in a .ini file, and there would be no per-page parsing overhead (other than having to include the config file as per normal).
Hard-coding settings in PHP files is bad because those same PHP files will be sent around, copied, put into repositories, etc. The passwords should be treated with more privacy. Also, it's annoying to have the source files overwrite your local copies.
Note that I'm referring specifically to embedding in regular PHP files in your project's codebase. If you place your config settings in a PHP file that sits external to all of that, then none of the above applies.
If you are worried about the overhead of parsing one config file, then you shouldn't be using PHP at all... However, you could limit file reads by parsing it only when a cached (e.g., memcache) copy cannot be found.
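As a sketch of that caching idea, assuming the APCu extension is available (any cache such as memcache would work the same way; the ini path is a placeholder):
<?php
function loadConfig()
{
    $config = apcu_fetch('app_config');          // requires the APCu extension
    if ($config === false) {
        // Cache miss: parse the ini file (assumed path outside the webroot)
        $config = parse_ini_file('/home/user/config/app.ini', true);
        apcu_store('app_config', $config, 300);  // keep it for 5 minutes
    }
    return $config;
}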
It makes sense if you have more than just DB access stored in the ini file. It can act as a config for your app so you don't have to open 10 files to change 3 hardcoded variables/constants/whatever. If you don't like reading a file each time your app is requested, then use a PHP file to store all your config options (keeping them all in one place is really good), and as suggested, keep the ini/PHP config file out of your web root.
Probably not. If it's a .ini file then a browser can just visit it and download it. At least a .php file has the decency of showing a blank screen.
I'm trying to develop a file-uploading module on our new site that allows you to upload any file to our servers. The uploaded file goes into /files, which contains the following .htaccess to prevent users from executing e.g. a .php file:
<Files *.*>
ForceType application/octet-stream
</Files>
This triggers the browser's download window (at least in FF and Safari), but is it safe to assume the file won't be run on the server using this method? If not, how would you implement such a solution?
I think the safest thing is to restrict 100% of web access to the directory, and have a script like download.php through which you pass a file id that then fetches the appropriate file and outputs it to the browser. However, I am pretty sure that what you have will work and is safe.
is it safe to assume the file won't be run on the server using this method?
Kind of, but it depends on what other directives are present in your config; maybe there are other rules set up to allow PHP files to run. If the only way you're enabling PHP is by keying the PHP handler on file type, that should stop PHP executing.
However, stopping PHP executing is just one of your worries. If people upload files that contain active content, such as HTML or Flash — even if the filetype says it's an innocent image — they can gain control of other users' sessions on your site through cross-site scripting (XSS). See Stop people uploading malicious PHP files via forms for some discussion of this.
A ‘download.php’ interface that uses Content-Disposition to always trigger the download box, coupled with storing the files under non-user-supplied filenames like ‘1234.dat’, is much safer.
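A minimal sketch of that kind of download.php, assuming files are stored under opaque names like 1234.dat and the original name is looked up elsewhere (lookupOriginalName() here is a hypothetical database helper):
<?php
// download.php?id=1234 -- stored name is opaque, real name comes from the database
$id   = (int) $_GET['id'];
$path = '/home/user/uploads/' . $id . '.dat';   // non-user-supplied storage name
$name = lookupOriginalName($id);                // hypothetical DB lookup

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path);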
I think you actually want this:
<Directory /path/to/files>
SetHandler default-handler
</Directory>
What you have might work in practice, because the server is configured by default not to execute anything unless specifically told to do so, but it doesn't really guarantee that nothing will be executed. ForceType just sets the content type for static files (I'm not sure, but I doubt that it affects executable scripts).
Seconding Paolo's answer, move your files directory out of the accessible path. You can then write the download.php script using PEAR's HTTP_Download module to serve the files.
I agree with Paolo; his way is more secure. There is always the issue of someone exploiting your PHP files to execute an uploaded one. Bad example:
include_once("/modules/".$_GET["module"].".php");
where someone passes in module=../Files/exploit
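A common way to avoid that pitfall is to check the parameter against a whitelist before including anything, for example (the module names are hypothetical):
<?php
// Only include modules that are explicitly known; never trust raw user input
$allowed = array('home', 'profile', 'settings');   // hypothetical module names
$module  = $_GET['module'];

if (in_array($module, $allowed, true)) {
    include_once "/modules/" . $module . ".php";
} else {
    header('HTTP/1.0 404 Not Found');
    exit;
}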
For maximum security, you should have the folder containing the uploaded files mounted from a separate partition with the noexec flag.
Assuming you have only the URL to a file (hosted on the same server as the app) that has been rewritten via mod_rewrite rules.
How would you read the contents of that file with PHP without having direct access to the .htaccess file or the rewrite rules used to build the original URL?
I'm trying to extract all the script tags that have the "src" attribute set, retrieve the contents of the target file, merge all of them into one big javascript file, minify it and then serve that one instead.
The problem is that reading all of the files via file_get_contents() seems to slow the page down. So I was looking at alternatives: could I somehow read the files directly from the file system without generating other requests in the background? To do this, I would have to find out the path to the files, and some are accessed via URLs that have been rewritten.
You can't include it as if it were the original PHP; you can only get the result of the PHP's execution itself.
If you've got fopen wrappers on, this is as easy as using require, include or file_get_contents on the rewritten URL. Otherwise you have fsockopen and curl as options to create the HTTP request for the result.
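A minimal sketch of the fopen-wrapper approach with a cURL fallback (the URL is just a placeholder):
<?php
$url = 'http://example.com/scripts/combined.js';   // placeholder rewritten URL

// Fetch the *output* of the rewritten URL over HTTP; this does not
// reveal which physical file the rewrite rules map it to.
$js = file_get_contents($url);                     // needs allow_url_fopen
if ($js === false) {
    // Fall back to cURL if fopen wrappers are disabled
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $js = curl_exec($ch);
    curl_close($ch);
}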
As you cannot tell how the request would be handled, the only possible solution is to send an HTTP request to that server. But that would only get you the output of that file/script.
PHP sits behind Apache and accesses files at the file-system level using fopen()-like or include()-like functions. The rewrite module won't affect this access, because these functions use OS file-access routines, not Apache.
There's no way to do this other than implementing in your PHP script the same URL-rewriting rules as you have in .htaccess, because Apache rewriting and PHP file access know nothing about each other and live on completely different layers of the web application.
AFTER EDIT: The only way is to implement your rewrite rules in the PHP script and use PHP file-system access after parsing the URLs in PHP (not the rewrite module).
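For example, if the .htaccess contained a rule like RewriteRule ^scripts/(.*)\.js$ /assets/js/$1.js, the same mapping could be mirrored in PHP (the rule and paths here are assumptions for the example):
<?php
// Mirror the (assumed) rewrite rule in PHP to find the real file path
function urlToPath($url)
{
    $path = parse_url($url, PHP_URL_PATH);
    // Same pattern as the hypothetical RewriteRule above
    if (preg_match('#^/scripts/(.*)\.js$#', $path, $m)) {
        return $_SERVER['DOCUMENT_ROOT'] . '/assets/js/' . $m[1] . '.js';
    }
    return null;   // URL not covered by the known rules
}

$file = urlToPath('http://example.com/scripts/app.js');
if ($file !== null && is_file($file)) {
    $contents = file_get_contents($file);   // direct file-system read, no HTTP round trip
}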