We currently have a multisite setup with more than 20 subdomains for customers.
The same codebase is used for all sites, let's just say it is located in the folder /srv/code.
Apache's root directory points to it as well.
Now we have to create a folder under /var/www/subdomain for each subdomain and put a configuration file called config.ini there.
How do I tell the PHP script that when it looks for config.ini, it should find it under /var/www/subdomain?
And is this actually a good approach?
Is this a shared host? If yes, you should first check if a script inside /srv/code is allowed to access a file in /var/www/subdomain.
You can get the host name the current script was called under from $_SERVER['SERVER_NAME'], extract the subdomain from it, and then read config.ini from the matching directory.
This is probably easier than doing it in Apache.
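A minimal sketch of that idea, assuming subdomains like customer1.example.com and config files at /var/www/&lt;subdomain&gt;/config.ini (the host names and base directory here are illustrative; adjust to your setup):

```php
<?php
// Map the host the script was called under to its config.ini path.
// $_SERVER['SERVER_NAME'] would be e.g. "customer1.example.com".
function configPathForHost(string $host, string $baseDir = '/var/www'): string
{
    // Take the first label of the host name as the subdomain.
    $subdomain = explode('.', $host)[0];

    // Whitelist the characters to avoid path traversal via a forged Host header.
    if (!preg_match('/^[a-z0-9-]+$/i', $subdomain)) {
        throw new RuntimeException("Unexpected host: $host");
    }

    return $baseDir . '/' . $subdomain . '/config.ini';
}

// Usage in the real app:
// $config = parse_ini_file(configPathForHost($_SERVER['SERVER_NAME']));
```

The whitelist check matters: without it, a crafted Host header could point the script at an arbitrary file.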
Related
I have an external PHP script placed in a plugin which must work in all WordPress configurations, including those where the plugin folder has been moved outside the WordPress root.
The external script needs to bootstrap wp-load.php and thus needs to know the location of that file. The location is passed through the query string (as a relative path from the plugin folder to the WordPress root), which is obviously unsafe. If a hacker has somehow gained the ability to upload PHP files (a fake wp-load.php) to, for example, wp-content, she will be able to run a malicious wp-load.php through the external script.
The external PHP is "called" by way of a RewriteRule in an .htaccess (which I have control of). On Apache I can block direct access, but not on Nginx.
As the purpose is to load WordPress, note that using WordPress functions is out of the question.
I am thinking that perhaps some secret or hash can be passed to the script from the .htaccess.
To validate that the root looks real, the .htaccess in the root could be examined.
Since you control the .htaccess, you can put an arbitrary "magic" comment into it. To decide whether to accept a proposed root folder, the script can look for a .htaccess in that folder, read it, and check whether it contains the magic comment.
To exploit this, the hacker would need to be able to store a file named ".htaccess" as well as "wp-load.php".
The solution could be improved by embedding a hash of something in the magic comment (but a hash of what?).
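A sketch of the magic-comment check; the marker value below is hypothetical, and in practice you would generate your own random string when writing the .htaccess:

```php
<?php
// Accept a proposed WordPress root only if its .htaccess contains our
// secret marker comment (which we control, since we write the .htaccess).
const MARKER = '# plugin-root-marker-5f3a9c';

function looksLikeRealRoot(string $proposedRoot): bool
{
    $htaccess = rtrim($proposedRoot, '/') . '/.htaccess';
    if (!is_file($htaccess)) {
        return false;
    }
    return strpos((string) file_get_contents($htaccess), MARKER) !== false;
}
```

As noted above, an attacker would then need to plant both a .htaccess containing the marker and a wp-load.php to defeat this.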
To avoid running a PHP file located in e.g. "/var/www/example.com/wp-content/uploads/wp-load.php", you could check whether any of the parent folders contains "wp-load.php" and exit if one does.
This protection unfortunately cannot stand on its own, as it will not protect installations where the wp-content folder has been moved out of the root. But it will shield typical installations against running a malicious "wp-load.php" that has been uploaded to a subfolder.
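The parent-folder check could be sketched like this (a real WordPress root is never nested inside another one, so finding wp-load.php in a parent means the candidate is suspect):

```php
<?php
// Reject a candidate root if any of its parent directories contains
// wp-load.php -- e.g. a fake root planted inside wp-content/uploads.
function nestedInsideWordPress(string $path): bool
{
    $dir = dirname(realpath($path) ?: $path);
    while (true) {
        if (is_file($dir . '/wp-load.php')) {
            return true;   // a WordPress root sits above us: refuse
        }
        $parent = dirname($dir);
        if ($parent === $dir) {
            return false;  // reached the filesystem root, nothing found
        }
        $dir = $parent;
    }
}
```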
The plugin could try to create a symlinked folder in the plugins folder, linking to the root. Or it could create a PHP file there which defines the path to the root. This is only necessary when the system is detected as Nginx.
The uploads folder will be write-protected on many systems, so instructions need to be provided for creating such a file manually. In this case that is not a deal breaker, as users on Nginx already need to insert the rewrite rules manually.
I just made a mistake in a PHP application I'm developing with WAMP Server.
My WAMP /www folder is on my D:\ drive, where I also keep my personal data. My app, due to a bug in generating a dynamic path, deleted all my music, my photos and other personal files I had.
I mean... WHAT? How was this possible? I will need a recovery tool to get that data back.
How can I keep PHP from touching anything outside its folder in www so this does not happen again? It's a disaster.
Limit the files that can be accessed by PHP to the specified directory-tree, including the file itself.
http://php.net/manual/en/ini.core.php#ini.open-basedir
Use open_basedir to restrict file operations to within specific directories, like this (in the website's VirtualHost file)...
php_admin_value open_basedir "C:/WampDeveloper/Temp/;C:/WampDeveloper/Websites/www.example.com/webroot/"
Though if you are deleting via the command line or a .bat file (i.e., you are not using PHP file functions directly), the only way to fix this is to set Apache to run under a custom account that only has permissions on WAMP's folder.
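Independent of open_basedir, you can also guard deletions in your own code. A sketch (the base directory is whatever tree your app is allowed to touch; this complements open_basedir rather than replacing it):

```php
<?php
// Refuse to delete anything that does not resolve to a path inside $base.
// realpath() resolves "..", symlinks etc., so a dynamically built path
// that escapes the tree is caught before unlink() runs.
function safeUnlink(string $file, string $base): bool
{
    $realBase = realpath($base);
    $realFile = realpath($file);
    if ($realBase === false || $realFile === false) {
        return false; // base or file does not exist
    }
    if (strpos($realFile, $realBase . DIRECTORY_SEPARATOR) !== 0) {
        return false; // outside the allowed tree -- do not delete
    }
    return unlink($realFile);
}
```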
I have concerns similar to those addressed here. I'm using Composer to install the Amazon AWS components to set up an SES (email) service.
According to the Amazon documentation, I need to include autoload.php in order to use the classes that I installed. This means that the autoload.php must be in my web directory (/var/www/html).
I didn't fully understand the answer provided to the SO question I previously mentioned, but it essentially says that the vendor directory should NOT be in the web directory. But if I do that, how will I require the autoload.php file, which is in the /vendor directory?
Overall I am very confused about how I should be properly setting this up. Any help would be appreciated.
Edit: This article also suggests putting the /vendor/ folder in the web directory. Is this the standard? What security risks should I be looking out for? Because there are no index.html files or anything in any of the folders, the directories of all the files that were installed can be seen and accessed freely. Surely this can't be a good thing?
The "web directory" is the directory served directly via HTTP to anyone asking with the right URL. So if anyone guesses that there is a folder "/foo" hosted on your domain, and you didn't take precautions, and that folder does in fact exist and does not contain a file that would be served as the directory index, then anyone asking would probably get a directory listing of that folder, showing all its files.
Now the difference between such a web hosted folder and the require statement in PHP is that PHP does not use a URL pointing at a publicly accessible HTTP hosted folder, but uses a filesystem path pointing to a file.
And most beginners mix this up: because PHP at the beginner level is all about having a bunch of scripts spread around the web directory, emitting HTML containing links to other scripts, they get the idea that the links in HTML and the file paths in PHP are the same thing and have to be. This is wrong. They don't have to be the same; they merely happen to be the same when no better approach has been selected.
So here's how a modern web application is constructed. If you deploy the whole project, the main directory on the server might be called /var/www/projectX. Inside this container are some files like /var/www/projectX/composer.json. Because of this there will also be a directory /var/www/projectX/vendor. Additionally, somewhere there will be one PHP script that's being accessed (I'll get to HOW it's accessed in a moment), and that location should either be A) /var/www/projectX/script.php or B) /var/www/projectX/public/script.php. Those two scripts want to use Composer-provided classes and need to include the autoloading.
Because of the file location, the script in location A needs to run require 'vendor/autoload.php';, and the script in location B needs require '../vendor/autoload.php';. This is simply a matter of using the correct relative path from the script to the autoload file. You could even use an absolute path in both cases: require '/var/www/projectX/vendor/autoload.php'; will also work. The main point here is: It does not matter HOW you require that autoload.php file as long as it gets executed by the script. The path does not affect anything.
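A common way to make that path robust is to anchor it on __DIR__ (the directory of the current file) instead of the working directory. A small sketch covering both layouts; the helper name is mine, not a standard API:

```php
<?php
// Build the filesystem path to Composer's autoloader from the location
// of the calling script, independent of the current working directory.
// In a real script you would pass __DIR__ as $scriptDir.
function autoloadPath(string $scriptDir, int $levelsUp = 0): string
{
    $root = $levelsUp > 0 ? dirname($scriptDir, $levelsUp) : $scriptDir;
    return $root . '/vendor/autoload.php';
}

// Layout A (script.php in the project root):
//   require autoloadPath(__DIR__);
// Layout B (public/script.php, one level below the root):
//   require autoloadPath(__DIR__, 1);
```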
Now the HTTP hosting and accessing the scripts. The webserver has at least one directory configured that is being exposed to the outside world as the main directory of the domain. This is called DOCUMENT_ROOT, and it can be ANYWHERE. Now it depends on the configuration of your server which directory is preselected, and if you can change that setting (either by administrating your server on the command line, or by clicking some settings in a GUI).
If your server has the directory /var/www/projectX set as the document root, all the world can access the script in case A as http://example.com/script.php, as well as the script in case B as http://example.com/public/script.php, and also the vendor folder as http://example.com/vendor/.... This is not great, but could be avoided by placing .htaccess files inside or otherwise restricting access.
The better solution is to tell the server to only serve the directory /var/www/projectX/public as document root. This will prevent HTTP access to script A and the vendor folder, and access to script B is done via http://example.com/script.php.
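For Apache, that setup might look like this in the VirtualHost (domain and paths taken from the example above; a sketch, not a complete vhost):

```apache
<VirtualHost *:80>
    ServerName example.com
    # Only public/ is exposed over HTTP; vendor/ and composer.json stay
    # reachable for PHP via the filesystem, but not via any URL.
    DocumentRoot /var/www/projectX/public
    <Directory /var/www/projectX/public>
        Require all granted
    </Directory>
</VirtualHost>
```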
In both cases, both scripts successfully include the autoloading of Composer because the restrictions of HTTP access do not apply to filesystem access.
Bad website hosting only allows the first scenario, with the document root itself being the only directory accessible to you and no way to change it.
More sophisticated website hosting uses a fixed subdirectory like public, html or webroot as the document root, allowing you to hide sensitive files from ever being served via HTTP.
The best website hosting allows you to select which subdirectory should be hosted as document root.
In any case, the path pointing from a script to Composers autoload.php is not affected at all.
I have a program whose log files are located at
C:\Users\Administrator\AppData\Roaming\program-folder\Logs
and now I want to read such a log file (plain text) with PHP.
How can I specify that path in order to open the file?
PHP run by Apache often cannot access files outside of the site root, due to filesystem permissions or an open_basedir restriction. If you can't move the log file, you could use a symlink to make it accessible.
Based on the path you've given I'm assuming you're using Windows. Here's a guide to symbolic links in Windows.
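Once the file is reachable, reading it is ordinary file I/O. A sketch that returns the last lines of a log file (the file name under the Logs folder above is unknown, so the path here is a placeholder; note that PHP on Windows accepts forward slashes too):

```php
<?php
// Return the last $count lines of a text log file, e.g.
// tailLog('C:/Users/Administrator/AppData/Roaming/program-folder/Logs/app.log');
function tailLog(string $path, int $count = 20): array
{
    $lines = file($path, FILE_IGNORE_NEW_LINES);
    if ($lines === false) {
        throw new RuntimeException("Cannot read log file: $path");
    }
    return array_slice($lines, -$count);
}
```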
Sorry if this is a trivial question.
I am a kind of new to PHP and I'm creating a project from scratch. I need to store my application logs (generated using log4php) as files, and I don't want them to be public.
They are now stored in a subfolder under my PHP application folder, (/myAppFolder/logs) so they are served by Apache.
Where shall I store them, or what shall I do to keep them away from being served as content by Apache?
You can either keep them in a directory above the web root, or, if you're on a shared host / can't place files above the root for whatever reason, you can keep them in a directory that denies all HTTP access.
So you could have a folder called "secret_files" with a .htaccess file sitting inside:
.htaccess:
deny from all
Which will prevent HTTP access to files/subfolders in that folder.
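Note that deny from all is the Apache 2.2 syntax; on Apache 2.4 the equivalent .htaccess content is:

```apache
# secret_files/.htaccess -- Apache 2.4 access control
Require all denied
```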
Somewhere not under the public root!?
This is more of a server-config question as it depends on your server, but in Apache you could use the CustomLog and ErrorLog directives to set the location. So if you have
/www/myapp
Create
/www/log
and put them there instead. You need control over the server config to do this, so look up your web host's docs to find out how.
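For Apache's own logs, those directives would look something like this in the VirtualHost (paths from the example above; your log4php file paths are configured separately, in the log4php configuration):

```apache
# Keep Apache's logs outside the web root at /www/myapp
ErrorLog  /www/log/myapp-error.log
CustomLog /www/log/myapp-access.log combined
```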