Dynamically changing apache proxypass config (scripting httpd.conf?) - php

Background on the scenario:
Our CI system deploys multiple VMs, each with unique (though temporary) IPs. I have an Apache proxy set up in front to provide clean URLs to these VMs so that the various users don't have to remember IPs that can sometimes switch multiple times a day.
The problem is... I don't want to keep having to manually edit my httpd.conf file to change the IPs that the URLs map to.
I know I could use PHP to edit the httpd.conf file, and probably restart Apache. This seems like an anti-pattern to me. What is the better way to do this? Can Apache read the ProxyPass settings from a separate file?
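One way to avoid editing httpd.conf for every IP change is a mod_rewrite RewriteMap backed by a plain text file: Apache re-reads a txt: map whenever its modification time changes, so a script or CI hook can regenerate the file without a restart. A minimal sketch, assuming mod_rewrite and mod_proxy are enabled (the hostnames, file paths and the /vm/ URL scheme below are invented for illustration):

# /etc/apache2/vm-map.txt, regenerated by the CI system:
#   build-a   10.0.3.17
#   build-b   10.0.3.42

<VirtualHost *:80>
    ServerName ci.example.com
    RewriteEngine On
    RewriteMap vmhosts txt:/etc/apache2/vm-map.txt
    ProxyPreserveHost On
    # /vm/build-a/anything  ->  http://10.0.3.17/anything
    RewriteRule ^/vm/([^/]+)/(.*)$ http://${vmhosts:$1}/$2 [P,L]
</VirtualHost>

The CI job then only rewrites vm-map.txt; Apache picks up the change on the next request.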

Related

How to prevent access to a directory (apache and nginx) in php

I have a script with a subdirectory containing files that should not be accessible from outside (so, users cannot directly access it even if they know the remote URL).
In order to do so, I thought about programmatically creating a .htaccess file in the directory with the line
Deny from all
This would work, but I need this to work both on Apache and nginx, and I am not very familiar with the latter. I wasn't able to find any good information on this, though as far as I understand I should NOT use .htaccess with nginx.
So my question is whether there is an alternative that would work on nginx. I did find out how to configure nginx to deny access to a specific folder, but I need the script to be able to do this by creating a local file; I can't assume I have access to anything else on the server.
By the way, any other solution that does not involve .htaccess would be fine as well, as long as it prevents direct access to the files through a URL while keeping them accessible to PHP.
In case it matters, the script is part of a WordPress plugin.
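For the Apache half, a minimal sketch of the approach described above: write a .htaccess into the protected subdirectory, covering both the 2.2 and 2.4 authorization syntax (the $dir path is a placeholder):

<?php
// Sketch: deny direct web access to one subdirectory by dropping a
// .htaccess into it. "Require all denied" is the Apache 2.4 form of
// the 2.2 "Deny from all"; the <IfModule> blocks make one file work
// on both versions.
$dir = __DIR__ . '/protected';
$rules = "<IfModule mod_authz_core.c>\n"
       . "    Require all denied\n"
       . "</IfModule>\n"
       . "<IfModule !mod_authz_core.c>\n"
       . "    Order allow,deny\n"
       . "    Deny from all\n"
       . "</IfModule>\n";
file_put_contents($dir . '/.htaccess', $rules);

On nginx this file is ignored, so the equivalent deny rule would still have to live in the server configuration, which the question notes is not accessible here.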

User Permission in PHP

I'm working on an online platform that lets users write PHP code in a textarea
and see the result of that code in an iframe.
Why? Because I want to release an API for another platform so that users
can try it out without problems.
With this platform users can create/edit/delete files.
Every user has a personal folder that contains his own files; the folder's name is the same as the user's username.
My problem is that I don't want a user to be able to edit other users' files, only his own.
How can I do this?
If a user writes code that refers to another user's folder,
//for example
fopen('../path_to_different_dir/somefile.php', 'w');
this user could overwrite or delete files belonging to the other user.
How can I avoid this?
I want ONLY the functions written by me to be able to change files; the functions created by the user should not have permission to change any file.
That way I could control all of this, but I don't know how to do something like that.
If I understand you - you are building an "online" PHP compiler.
Well you have bigger concerns than just relative path.
The best way to approach that will be to use a PHP sandbox class that will enable you to avoid unsafe code.
Here is a nice project (the best I know of): A full-scale PHP 5.3.2+ sandbox class that utilizes PHP-Parser to prevent sandboxed code from running unsafe code.
When doing it like that you can whitelist the functions you want and blacklist others - this way you can leave out exec, shell_exec ...
For relative path issues you can always create a virtual host for the user folder when the folder is created and box it in with open_basedir:
<VirtualHost 178.154.120.143:80>
    <Directory /httpdocs/user/userfolder>
        php_admin_value open_basedir "/httpdocs/user/userfolder"
    </Directory>
</VirtualHost>
The problem here is that you will need to flush the changes which will normally require a server restart. But there are solutions for that too:
configure virtualhost without restarting apache web server
I hope it helps.

How can I load files from httpdocs on the https side of the website

On my webserver I have two directories which are publicly accessible. One of them is httpdocs/ and the other is httpsdocs/. You can probably guess what they're for.
The problem arose when I wanted to create a CMS for myself. I want it to load the pages over HTTPS for security reasons, but the problem is that my files are inside the httpdocs directory. I could move just the files for the system to httpsdocs, but my next thought was: what if I want a login on the normal website so people can log in and view content that is hidden from unregistered users?
Now my question is: is there a way to combine those two directories so all the files can live inside httpdocs and I can still access them with the security HTTPS provides?
Things that you may need to know:
The webserver runs PHP 5.2.9.
I only have FTP access to the webserver and to these directories.
The server runs Apache, so .htaccess is possible.
There are a ton of avenues, but most are blocked if you can't alter the httpd.conf or certain restrictions are in place.
Your best bet would probably be to simply abandon http:, stick to https:, and create a redirecting .htaccess in the http: folder. That way everything runs from a single directory,
unless there is some drawback that would prevent your users from using HTTPS instead of HTTP.
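A minimal sketch of that redirect, assuming mod_alias is available (replace www.example.com with the real hostname):

# httpdocs/.htaccess: send every plain-HTTP request to the HTTPS site,
# preserving the rest of the URL path.
Redirect permanent / https://www.example.com/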
Getting your ISP to make the https folder a symlink to the http folder would be the easiest solution here.
It sounds like you are likely using a hosting service that keeps these things separated. If that is the case then the answer is no, you can't combine these two directories into one. You can ask your provider whether they can make that arrangement, but it would require changes to the Apache configuration that you can't make yourself.
That being said, barring the ability to have that configuration modified, the only other answer you are going to find is to move everything under the httpsdocs directory, unless you can get another account set up, or they offer the ability to set up subdomains with HTTPS connections; that would be an alternative solution.

PHP Manage/Add Virtualhost

Is there a free PHP script that allows you to easily add a virtualhost?
I need to do this because I'll be opening a free forum host and want to allow users to use their own domain in addition to a free subdomain.
Restarting the server whenever you add a new domain to it could be problematic. If there was ever a problem writing the file, you lose all the sites, and if there is data cached (I'm thinking mainly about APC), you also lose the usefulness of the cache.
Having the default virtual host handle the new domains, and then having the PHP code act according to $_SERVER['HTTP_HOST'], would be a more robust solution.
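A minimal sketch of that dispatch, assuming the extra domains all point at the default vhost and that a hypothetical custom_domains table maps each domain to a forum (all names here are invented):

<?php
// Front controller for the catch-all vhost: look up the requested
// hostname and hand the request to the matching forum installation.
$host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

$pdo  = new PDO('mysql:host=localhost;dbname=forumhost', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT forum_id FROM custom_domains WHERE domain = ?');
$stmt->execute(array($host));
$forumId = $stmt->fetchColumn();

if ($forumId === false) {
    header('HTTP/1.1 404 Not Found');
    exit('No forum is configured for this domain.');
}

// Hand the request over to the forum identified by the domain.
define('ACTIVE_FORUM_ID', (int) $forumId);
require dirname(__FILE__) . '/forum/index.php';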
An easy way is to wildcard all domains to a single directory, then use mod_rewrite there to map specific domains to specific directories, or, in the case of, say, a WordPress MU install, use the domain to map to a user account.
You can edit the .htaccess file on the fly without restarting Apache.
Couldn't PHP create a vhost file which is included in the main vhost configuration to handle this?
Using cron to determine whether there have been updates to the included vhost file, you could reload Apache.
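A rough sketch of that idea (the directory and schedule are hypothetical):

# In the main Apache configuration: pull in every generated per-domain vhost.
Include /etc/apache2/generated-vhosts/*.conf

# PHP writes one small <VirtualHost> file per new domain into that directory;
# a cron job then reloads Apache gracefully, for example every 10 minutes:
#   */10 * * * * /usr/sbin/apachectl graceful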

Dynamic apache setting using PHP

I'm starting a new project where users can register and the system will automatically install a CMS for each user's registered domain, so I need to solve dynamic configuration of the server (Apache).
Registration information, and info about the associations between domains and the actual paths to the CMS installations on the server, will be stored in a MySQL database.
Is there an easy way to configure Apache to send all unknown domains to a specific PHP script, which will look into the database and provide the actual path to the relevant CMS? Apache would then use this info to handle the request correctly.
I think the "easier" solution might be to use PHP to write the domains/paths/config to a file and force Apache to use this file to handle requests. However, I expect the number of domains may grow large, and deleted domains will not be rare, so the file could soon fill up with unwanted rules and become hard to maintain; an Apache restart would also be needed to pick up the changed file, and so on. Hence the question about a dynamic solution, which might be much easier to manage (for me and for the admin system itself).
Yes - use a wildcard vhost in Apache, and mod_rewrite to direct all URLs to your front controller (or use the 404 document handler).
C.
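A minimal sketch of that wildcard vhost (paths and the ServerName are placeholders); the front controller then inspects $_SERVER['HTTP_HOST'], looks the domain up in MySQL and includes the matching CMS:

<VirtualHost *:80>
    # Listed first (and with ServerAlias *) so every Host header that no
    # other vhost claims ends up here.
    ServerName catchall.example.com
    ServerAlias *
    DocumentRoot /var/www/frontcontroller
    <Directory /var/www/frontcontroller>
        RewriteEngine On
        # Send everything that is not an existing file to the front controller.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule . /index.php [L]
    </Directory>
</VirtualHost>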
