I have concerns similar to those addressed here. I'm using Composer to install Amazon AWS components to set up an SES (email) service.
According to the Amazon documentation, I need to include autoload.php in order to use the classes that I installed. This means that the autoload.php must be in my web directory (/var/www/html).
I didn't fully understand the answer provided to the SO question I previously mentioned, but it essentially says that the vendor directory should NOT be in the web directory. But if I do that, how will I require the autoload.php file, which is inside the /vendor directory?
Overall I am very confused about how I should be properly setting this up. Any help would be appreciated.
Edit: This article also suggests putting the /vendor/ folder in the web directory. Is this the standard? What security risks should I be looking out for? Because there are no index.html files in any of the folders, the directory listings of all the installed files can be seen and accessed freely. Surely this can't be a good thing?
The "web directory" is the directory directly served via HTTP to anyone asking with the right URL. So if anyone thinks there is a folder "/foo" hosted on your domain, and you didn't take precautions, and there is in fact that folder, and it does not contain a file that would be served as the directory index, anyone asking probably would get the directory listing of that folder, listing all files.
Now the difference between such a web hosted folder and the require statement in PHP is that PHP does not use a URL pointing at a publicly accessible HTTP hosted folder, but uses a filesystem path pointing to a file.
And most beginners mix this up: because PHP at the starter level is all about having a bunch of scripts spread around the web directory, emitting HTML that links to other scripts, they get the idea that links in HTML and file paths in PHP are the same thing and have to be. This is wrong. They don't have to be the same; they merely happen to be the same when no better approach has been chosen.
So here's how a modern web application is constructed. If you deploy the whole project, the main directory on the server might be called /var/www/projectX. Inside this container are some files like /var/www/projectX/composer.json, and because of Composer there will also be a directory /var/www/projectX/vendor. Additionally, somewhere there will be one PHP script that gets accessed (I'll postpone HOW it's accessed for now), and that script will live at either A) /var/www/projectX/script.php or B) /var/www/projectX/public/script.php. Both scripts want to use Composer-provided classes and need to include the autoloading.
Because of the file locations, the script in location A needs to run require 'vendor/autoload.php'; and the script in location B needs require '../vendor/autoload.php';. This is simply a matter of using the correct relative path from the script to the autoload file. You could even use an absolute path in both cases: require '/var/www/projectX/vendor/autoload.php'; will also work. The main point is: it does not matter HOW you require that autoload.php file, as long as it gets executed by the script. The path does not affect anything.
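For example, a minimal sketch of case B (assuming the hypothetical /var/www/projectX layout above, and that a Composer package such as aws/aws-sdk-php has been installed):

<?php
// /var/www/projectX/public/script.php -- a sketch of case B.
// __DIR__ is this file's directory on disk, so the relative path works
// no matter which working directory PHP was started from.
require __DIR__ . '/../vendor/autoload.php';

// Any Composer-installed class is now autoloadable, e.g.:
var_dump(class_exists('Aws\Ses\SesClient')); // true if aws/aws-sdk-php is installed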
Now the HTTP hosting and accessing the scripts. The webserver has at least one directory configured that is being exposed to the outside world as the main directory of the domain. This is called DOCUMENT_ROOT, and it can be ANYWHERE. Now it depends on the configuration of your server which directory is preselected, and if you can change that setting (either by administrating your server on the command line, or by clicking some settings in a GUI).
If your server has the directory /var/www/projectX set as the document root, the whole world can access the script in case A as http://example.com/script.php, the script in case B as http://example.com/public/script.php, and also the vendor folder as http://example.com/vendor/.... This is not great, but it can be avoided by placing .htaccess files inside those folders or otherwise restricting access.
The better solution is to tell the server to only serve the directory /var/www/projectX/public as document root. This will prevent HTTP access to script A and the vendor folder, and access to script B is done via http://example.com/script.php.
In both cases, both scripts successfully include the autoloading of Composer because the restrictions of HTTP access do not apply to filesystem access.
Bad website hosting lets you use only the first scenario: the only directory accessible to you is the document root itself, with no way to change that.
More sophisticated website hosting uses a fixed subdirectory like public, html, or webroot as the document root, allowing you to hide sensitive files from ever being served via HTTP.
The best website hosting allows you to select which subdirectory should be hosted as document root.
In any case, the path pointing from a script to Composer's autoload.php is not affected at all.
Related
I have an external PHP file placed in a plugin which must work in all WordPress configurations, including those where the plugin folder has been moved outside the WordPress root.
The external PHP file needs to bootstrap wp-load.php and thus needs to know the location of that file. The location is passed through the query string (as the relative path from the plugin folder to the WordPress root), which is obviously unsafe. If a hacker has somehow gained the ability to upload PHP files (a fake wp-load.php) to, for example, wp-content, she will be able to run a malicious wp-load.php through the external script.
The external PHP file is "called" by way of a RewriteRule in an .htaccess (which I have control of). On Apache I can block direct access, but not on Nginx.
As the purpose is to load Wordpress, note that using Wordpress functions is out of the question.
I am thinking that perhaps some secret or hash can be passed to the script from the .htaccess.
To validate that the root looks real, the .htaccess in the root could be examined.
With control over the .htaccess, you could put any comment into it. To decide whether to accept a proposed root folder, the script can look for a .htaccess in that folder, read it, and see if it contains the magic comment.
To exploit this, the hacker would need to be able to store a file named ".htaccess" as well as "wp-load.php".
The solution could be improved by inserting a hash of something in the magic comment (but a hash of what?).
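A minimal sketch of that validation, assuming a marker comment the plugin wrote into the root .htaccess itself (the marker text, names, and error handling below are illustrative assumptions):

<?php
// Accept a proposed WordPress root only if its .htaccess contains the
// magic comment that this plugin placed there earlier.
function looks_like_real_root($proposedRoot)
{
    $htaccess = $proposedRoot . '/.htaccess';
    if (!is_file($htaccess)) {
        return false;
    }
    return strpos(file_get_contents($htaccess), '# my-plugin-root-marker') !== false;
}

// The relative location arrives via the query string, as described above.
$root = realpath(__DIR__ . '/' . ($_GET['root'] ?? ''));
if ($root === false || !looks_like_real_root($root)) {
    http_response_code(403);
    exit('Invalid root.');
}
require $root . '/wp-load.php';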
To avoid running a PHP file located at, e.g., "/var/www/example.com/wp-content/uploads/wp-load.php", you could check whether any of the parent folders contains "wp-load.php" and exit if they do.
This protection unfortunately cannot stand by itself, as it will not protect installations where the wp-content folder has been moved out of the root. But it will shield typical installations against running a malicious "wp-load" that has been uploaded into a subfolder.
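A sketch of that ancestor check (the function and variable names are illustrative):

<?php
// Refuse a proposed root that is nested inside another directory which
// itself contains wp-load.php, e.g. wp-content/uploads inside a real site.
function nested_inside_other_root($proposedRoot)
{
    $dir = dirname($proposedRoot);
    while ($dir !== dirname($dir)) { // dirname('/') === '/', so this terminates
        if (is_file($dir . '/wp-load.php')) {
            return true; // a wp-load.php above the proposed root: suspicious
        }
        $dir = dirname($dir);
    }
    return false;
}

// ...then, before requiring wp-load.php:
// if (nested_inside_other_root($root)) { exit('Refusing nested root.'); }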
The plugin could try to create a symlinked folder in the plugins folder, linking to the root. Or it could create a PHP file there which defines the path to the root. This is only necessary when the system is detected as Nginx.
The uploads folder will be write-protected on many systems, so instructions need to be provided for creating such a file manually. In this case that is not a deal breaker, as users on Nginx already need to insert the rewrite rules manually.
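The generated file could be as small as this (the file name, path, and mechanism are assumptions for illustration):

<?php
// wp-root-path.php -- written into the plugin folder at install time,
// or created manually on Nginx setups. The path below is an example.
return '/var/www/example.com';

The external script would then pick it up with $root = require __DIR__ . '/wp-root-path.php'; since require returns the included file's return value.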
I am developing a website for myself and I wonder how I can prevent direct access to include files like header.php and footer.php. Those files should only be incorporated into pages like index.php, where they are pulled in using <?php include(''); ?>. Should I do it through PHP? How about editing the .htaccess file, or are there other methods?
Place the file(s) in a directory outside the web root.
The web server will never serve these files to users.
PHP et al. can still access the files via include/require, etc.
This has been the gold-standard approach for several decades.
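For example (the paths are illustrative), with the includes one level above the web root:

<?php
// /var/www/html/index.php -- served over HTTP.
// header.php and footer.php sit outside the web root, so no URL maps
// to them, but PHP can still read them from disk.
include __DIR__ . '/../includes/header.php';
echo "Page content here";
include __DIR__ . '/../includes/footer.php';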
I offered 3 suggestions, and since you didn't provide much to go on, I will give you one elaboration.
As Dagon alludes to, when you use include() you're reading via the file system and not via an HTTP request. You can check for an HTTP verb ($_REQUEST, $_GET, $_POST) and refuse to show content, or fake a 404.
<?php
// If this file is hit through an HTTP request, the request
// superglobals will be populated; respond as if it doesn't exist.
if (isset($_REQUEST) || isset($_GET) || isset($_POST)) {
    header("HTTP/1.0 404 Not Found");
    die();
}
// Do the needed
?>
I will let you figure out the gotcha on your own here.
It would be perfect if your server is Linux, because then you can follow Dagon's suggestion of placing the files to include outside of the web root.
The web root, of course, is the base folder containing the files the outside world is meant to access. On many systems, this is the public_html folder.
On a system with WHM/cPanel installed, you might have a user account whose home (where anything can be stored) is located at /home/user on the system. It can be browsed with the file manager utility included with cPanel when logged in. In that /home/user folder you may find configuration files and folders starting with a period, as well as the public_ftp and public_html folders.
In the /home/user folder, you can place the PHP files you don't want the world to access directly. Then in public_html (inside /home/user) you can place the index.php file that includes the "protected" files. That way, in index.php you can use this statement:
include "../file-to-include.php";
Just make sure the administrator has set the owner of the /home/user folder to the same username you log in with, or you may get access-denied messages when trying to access the file. On a decent cPanel setup, this last step will already have been done for you.
I'm trying to find the most effective (safest) way to utilise my own class library, which I want to put outside of the public_html site root on a Linux (shared hosting) server.
Several classes inside public_html on my site need to reference the classes outside the root, and vice versa.
Typically, in one of my site classes in a directory under the root, I have this line:
require_once("/home/my_isp_username/myCommonClasses/utilitybelt.php");
which should reference a class file (utilitybelt.php) one level above the web root.
I'm getting a blank page and no errors returned; not even warnings, just a blank page.
Note: shared hosting, so no access to Apache/PHP config files.
Anyone know the correct way to do this?
Probably your web server does not have permission to access files in that directory. On shared hosting I doubt anyone will let you access it, so maybe try moving the directory inside your web root.
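A blank page with no warnings usually means a fatal error while display_errors is off. A temporary debugging sketch (it needs no config file access, so it works on shared hosting):

<?php
// Show errors for this request only -- remove again after debugging.
ini_set('display_errors', '1');
error_reporting(E_ALL);

require_once("/home/my_isp_username/myCommonClasses/utilitybelt.php");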
I'm building a website based on PHP and I want to ask where to put files that are retrieved with a require statement, so that they cannot be accessed by users with their browser (for example, a PHP file that connects to my database).
EDIT: Actually, I think the better way is to put them outside the public root, because the Apache tutorial says .htaccess has a slowdown impact. It can be done by adding a ../,
for example require("../myFile.php"); (at least this works on my server).
Best regards to all
That depends on the web server configuration. Usually (or at least in all cases I witnessed), you have an account root directory which cannot be accessed by users with their browser, and inside it a folder containing all the public material (often called htdocs, httpdocs, public_html, or something of the kind); that public folder is the document root. Often, you can place your PHP include files next to the public folder and then require them using require("../include_file.php");
However, it depends on the configuration whether PHP can include files from outside your public folder. If not, a .htaccess file is your best option.
If you place those files outside the document root of your webserver, users cannot access these files with a browser.
If you use Apache, you can also place these files in a directory to which you deny access with a .htaccess file.
And as a last remark, if your files do not generate output, there is no way users can check the contents of the files.
If you mean the source code: it is not visible to users. If you want to hide folder contents, use the .htaccess directive Options -Indexes to hide the file listing. If you can access PHP source in the browser, your server configuration is wrong and it is not parsing PHP files.
You normally place them in a directory that is not accessible through the webserver (outside the document or web root), sometimes called a "private" directory.
You can then include/require the file from that path, as PHP still has access to the files.
See also:
placing php script outside website root
disable access to included files - For a method if you're not able to place the files in a private directory.
Just make them secure with .htaccess!
Here's a very clear tutorial for protecting files with a password. If you don't need direct access to the files per browser, or only your scripts need access, just block them completely by changing the code between
<Files xy>
    change this bit here
</Files>
to
Order allow,deny
Deny from all
Then you won't need your .htpasswd file anymore either!
You need to put these files outside of public-facing folders on your web server. Most (all?) web hosts should have the capability to change the document root of the website.
For example, let's say that all of your files are served from the following directory on your host: /home/username/www/example.com/
This means that anything that resides inside that directory is visible to the internet. If you went to http://example.com/myfile.png it would serve the file at /home/username/www/example.com/myfile.png.
What you want to do is create a new directory called, for example, public which will serve your files, and point the document root there. After you've done that, the request for http://example.com/myfile.png will be served from /home/username/www/example.com/public/myfile.png (note the public directory here). Now, anything else that resides within the example.com directory won't be visible on your website. You can create a new directory called, for example, private where your sensitive include files will be stored.
So say you have two files: index.php, which serves your website, and sensitive.php which contains passwords and things of that nature. You would set those up like this:
/home/username/www/example.com/public/index.php
/home/username/www/example.com/private/sensitive.php
The index.php file is visible to the internet, but sensitive.php is not. To include sensitive.php, you just include the full file path:
require_once("/home/username/www/example/com/private/sensitive.php");
You can also set your application root (the root of your website's files, though not the root of the publicly accessible files) as a define, possibly in a config file somewhere, and use that, e.g.:
require_once(APP_ROOT . "sensitive.php");
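A sketch of that approach (APP_ROOT and the paths are illustrative, not a PHP convention):

<?php
// In a config file that is loaded early (example path):
define('APP_ROOT', '/home/username/www/example.com/private/');

// Later, anywhere in the application:
require_once(APP_ROOT . "sensitive.php");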
If you can't change the document root, then what some frameworks do is use a define to note that the file shouldn't be executed directly. You create a define in any file you want as an entry point to your application, usually just index.php, like so:
if (!defined('SENSITIVE')) {
    define('SENSITIVE', 'SENSITIVE');
}
Then, in any sensitive file, you check that it's been set, and exit if it hasn't, since that means the file is being executed directly, and not by your application:
if (!defined('SENSITIVE')) {
    die("This file cannot be accessed directly.");
}
Also, make sure that your include files, when publicly accessible (and really, even if not), have a proper extension, such as .php, so that the web server knows to execute them as PHP files, rather than serving them as plain text. Some people use .inc to denote include files, but if the server doesn't recognize them as being handled by PHP, your code will be publicly visible to anyone who cares to look. That's not good! To prevent this, always name your files with a .php extension. If you want to use the .inc style to show your include files, consider using .inc.php instead.
I have a script that allows only authorised users to upload files to a certain folder.
However, I do not know how to prevent people from downloading the files freely without logging in.
I need the solution in php.
I have googled around but found nothing straightforward as yet.
Currently in my document root I have a folder called admin with a subfolder called uploads inside it, so only the admin role can upload. Both editor and admin can download. What should I do in this case?
Please advise.
Put the files somewhere outside the public webroot directory, or configure your server to not serve the files. As long as your server will happily serve everything with a valid URL, there's nothing you can do with PHP to prevent that.
If your files are in the /public_html/ folder, take them out of that folder and place them in e.g. /secret_files/, so your directory structure looks something like this:
public_html/
index.html
admin/
admin_index.php
secret_files/
my_secret_file.txt
The webserver is only configured to serve files in the /public_html/ directory, so nobody will have access to directories outside ("above") it.
To still enable somebody to download those files, do as cletus suggests and use readfile() to "manually serve" the files via a PHP script. PHP still has access to these other parts of the file system, so you can use it as a gatekeeper.
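A minimal gatekeeper sketch (the session key, role names, and paths are assumptions based on the setup described above):

<?php
// download.php -- streams a protected file to logged-in editors/admins.
session_start();

// Assumes the login code stored the user's role in the session.
$role = $_SESSION['role'] ?? null;
if (!in_array($role, array('admin', 'editor'), true)) {
    http_response_code(403);
    exit('Login required.');
}

// basename() strips any "../" so the request can't escape the folder.
$name = basename($_GET['file'] ?? '');
$path = '/home/username/secret_files/' . $name;
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path); // PHP serves the bytes; the webserver never exposes the folder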
Don't store the files in a directory under the document root.
Instead, move them somewhere else; a PHP script can then programmatically determine whether someone may download them and use readfile() or something similar to stream them to the user.
You could also configure the Web server to not serve files from this directory but then you need PHP to serve them anyway. It's cleaner simply not to put them under the document root.