Alternative to storing PHP config values in php.ini

We have a web-app for which we have modified a number of the default php.ini values: short_open_tag = Off, expose_php = Off, memory_limit = 128M, etc. Our current deployment strategy when we need to scale and bring another app server online involves cloning our app onto a 'new' server that has the latest version of PHP, along with the distribution-specific (in our case, Debian) php.ini file.
We are currently storing our customized php.ini file in our repo and deploying it when we clone, but we recently ran into a problem with deprecated config values when a newly cloned app server fired up with PHP 5.4+. This left us with a broken config file and got me thinking about how best to handle this. We'd like to use the latest default php.ini, which would contain any new directives and have deprecated ones removed, and then be able to 'locally' override the settings we need.
Solutions we've considered include .htaccess files and ini_set(), but there are three drawbacks: some settings can only be adjusted in php.ini; .htaccess is not read by our CLI scripts; and for every Apache request we would have to process the .htaccess file or make calls to ini_set(), adding unneeded overhead. We've also looked at freezing the version of PHP we use so that there are no updates and no changes to php.ini once deployed, but I'm not sure that strategy works best, given we would miss out on minor updates that could be security-related.
Have we missed an option as it relates to portably deploying PHP engine settings?

Per the direction provided by @PeeHaa above, we've decided to lock our application to a specific PHP version (via Composer), taken that PHP version's default php.ini files for both Apache2 and the CLI, and added our settings back in. These files have been pushed to our repo and are copied into place on deployment and whenever they change in git.
FYI, in our Debian environment, we've followed the strategy outlined here in terms of installing a specific version of PHP other than latest.
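For illustration, the Composer side of this looks roughly like the fragment below. Note this is a sketch: the version constraint is an example, and Composer's `php` platform requirement is only enforced when Composer itself runs (install/update); the actual runtime version is pinned separately through the package manager, as described in the linked strategy.

```json
{
    "require": {
        "php": "5.4.*"
    }
}
```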

Related

Overriding PHP settings in execution

Working with WordPress, I often come across the same issue where hosting companies set the initial PHP settings for things like upload_max_filesize and max_execution_time to very low values. This becomes an issue when trying to import large demo content or migrate data from another server. I'm trying to find a way around it using only a PHP script (no FTP access, can't recompile PHP etc...). How can this be accomplished?
Here are a few things to consider:
Using ini_set() only works for certain directives, which makes it useless in my case.
Creating a php.ini or php5.ini file in the root of WordPress only works if PHP is set to scan this directory for additional ini files, which is rarely the case.
Modifying the .htaccess file is an excellent way to achieve this, however, if PHP is run in "CGI mode" and not as an Apache module, this won't work.
Is there a solution that can work regardless of the host's settings? Is there a way to force PHP to scan an additional directory for php.ini files?
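For reference, one per-directory mechanism that exists alongside the options listed above: since PHP 5.3, the CGI/FastCGI SAPIs scan each directory for a `.user.ini` file (the name is controlled by the `user_ini.filename` directive, and it is re-read every `user_ini.cache_ttl` seconds, 300 by default). Only PHP_INI_PERDIR and PHP_INI_USER directives can be set there. The directive names below are real; the values are examples, not recommendations.

```ini
; .user.ini placed in the WordPress root -- read by CGI/FastCGI SAPIs
; per directory (PHP 5.3+). Has no effect under mod_php, where
; .htaccess php_value lines serve the same purpose instead.
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300
```

Whether this works still depends on the host's SAPI, so it complements rather than replaces the .htaccess approach.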

Capistrano Symlinks Being Cached?

I've been setting up PHP deployments with Capistrano on CentOS 6 and have run into an interesting issue. The way capistrano works, it sets up folders like this:
/var/www/myapp.com/
    current (symlink to the latest release in releases/)
    shared/
    releases/
        20130826172737/
        20130826172114/
When I look at the "current" symlink, it points to the most recent release. At first, when opening my web app, everything worked fine. After deploying a new release, the current folder correctly points to the new release, but the web application tries to load files from the old release (which has been deleted in a Capistrano cleanup process). Also, the virtual host is configured to point at /var/www/myapp.com/current/Public.
Are symlinks cached in any way?
The specific PHP code that fails (which initializes my framework) is this:
require_once dirname(dirname(__FILE__)) . '/App/App.php';
App\App::run();
That is in index.php currently located at /var/www/app.com/current/Public/index.php.
My Apache error logs show:
PHP Fatal error: require_once(): Failed opening required '/var/www/myapp.com/releases/20130826172237/App/App.php' (include_path='.:/usr/share/pear:/usr/share/php') in /var/www/myapp.com/releases/20130826172237/Public/index.php
And the current symlink shows:
current -> /var/www/zverse/releases/20130826172641
Obviously 20130826172641 != 20130826172237; the latter was the previous release.
Any ideas or areas I can look at?
I can't verify this, but it seems that there is some unpredictable behaviour with Apache following / caching the old location of symlinks:
Is there a way to mimic symlink behavior with an apache configuration?
Case Against Using Symlinks For Code Promotion
The only thing that would absolutely clear up this issue was to cycle Apache, which we would prefer not to do on every deployment. -- Mike Brittain
He suggests moving the whole directory, instead of updating the symlink.
Have you checked the realpath_cache_size and realpath_cache_ttl directives? By default, PHP 5.1+ caches the real paths of symlinked files for 120 seconds, which causes exactly this problem with Capistrano deployments.
The main problems are twofold. First, caching: even if you clear your application cache, the old PHP files will continue to be served for up to two minutes, repopulating it with old data. Second, the interaction between PHP and static files: static files are served directly by Apache, so they update immediately, while the PHP code keeps running from the previous release for up to two minutes after deploying, still expecting the old versions of any changed static files. That's especially a problem if you use a cache-breaking procedure that renames those files; in that case the PHP code won't be able to find the files it's expecting at all.
Anyway, there are two solutions. The first is to set realpath_cache_size to 0 in php.ini. (Note: setting realpath_cache_ttl to 0 does not disable the cache.) Or, if you want to keep the cache enabled, you should be able to use the clearstatcache function to clear the realpath cache immediately after flipping your symlink, via a Capistrano hook. Be aware, though, that if you're using mod_php, the PHP CLI and Apache runtimes are separate, so you would need to call that function from a PHP script served by Apache, similar to what's done for clearing the APC cache. I haven't tested that, though, as I didn't notice a significant performance impact from simply disabling the cache.
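A minimal sketch of that second option. The script name and the idea of requesting it over HTTP from a deploy task are assumptions; what's real is that passing true to clearstatcache() also clears the realpath cache:

```php
<?php
// clear_cache.php -- hypothetical deploy hook. Request it over HTTP
// (e.g. from a Capistrano task with curl) right after the symlink flip,
// so it executes inside the Apache/mod_php runtime rather than the CLI.
clearstatcache(true); // `true` clears the realpath cache as well as the stat cache
echo "realpath cache cleared\n";
```

The php.ini alternative is simply `realpath_cache_size = 0`, which trades the cache's (usually small) performance benefit for immediate symlink visibility.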

How to set up a secure PHP environment on a wide base of PHP configurations

I'm just finishing a CMS that I want to release as open source. The program uses some ini_set() directives to set up a secure environment, like session.use_only_cookies, etc. The problem is that some hosts don't allow ini_set() and only allow configuration through the php.ini file. Is there a way to set up a secure PHP environment across a wide base of PHP configurations? How do other PHP programs handle this problem (e.g. WordPress, Drupal, etc.)?
Generally speaking:
In your software:
try not to depend on too many system-wide settings
have something that works with the default settings (see the php.ini file(s) provided with a default PHP installation -- both from php.net and from the major Linux distributions)
Write a short, clear list of requirements
Code some kind of installation script that will:
check automatically for those requirements,
and explain how to fix those that are not properly configured
You could output a warning if you encounter a setting you deem insecure.
Many web apps do it that way: warning, for example, if the install script hasn't been deleted, or if the administrator still has the default password.
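An installation-time check along those lines could look like this minimal sketch. The function name, directive list, and required values are assumptions for illustration; the point is to compare ini_get() against what the app needs and warn, instead of silently relying on ini_set():

```php
<?php
// Hypothetical install-time requirements check: report php.ini
// directives whose current values don't match what the app expects.
function check_requirements(array $required)
{
    $problems = array();
    foreach ($required as $directive => $expected) {
        $actual = ini_get($directive);
        if ((string) $actual !== (string) $expected) {
            $problems[$directive] = $actual;
        }
    }
    return $problems; // empty array means every requirement is met
}

// Example directives and values -- adjust per application.
$required = array(
    'session.use_only_cookies' => '1',
    'display_errors'           => '',
);

foreach (check_requirements($required) as $directive => $actual) {
    echo "Warning: $directive is currently '$actual'; please fix it in php.ini\n";
}
```

One caveat: ini_get() returns boolean directives as they were written in php.ini ("1"/"" vs "On"/"Off"), so a production-grade check would normalize those before comparing.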

Env variables for plugins in PHP on Windows

I am running PHP on Windows. PHP extensions on Windows are just DLLs in an extensions folder, and there is little I can do to configure them.
For example, the LDAP extension (which wraps OpenLDAP) has settings that I can't change at runtime. Luckily, OpenLDAP allows me to change some of these settings through environment variables. I tried setting them at runtime by adding both:
// this apparently works on Linux
putenv('VARIABLE=value');
// tried this one as well
$_ENV['VARIABLE'] = 'value';
But that didn't work. I had to add the variable to Windows' environment variables (which did work), but that's too much of a pain and will break the code when I move it. Is there a better way to do this, or will I just have to deal with it?
Thanks
Try to set the environment variables before the DLL extension is loaded. That is, don't load the DLL via php.ini; call putenv() first and dl() afterwards. Usually a DLL should share environment variables with the main process, but you never know.
Alternatively set any required options from within .htaccess using SetEnv. This is at least portable for Apache webservers. Come to think of it, you should also try apache_setenv() if you are running mod_php and not the FastCGI version.
By the way, there have long been PHP bugs around putenv() (e.g. http://bugs.php.net/50690); that might be the case here.
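A sketch of that load-order idea. LDAPCONF is a real OpenLDAP environment variable, but the path and DLL name here are examples; also note that dl() is only available in the CLI/CGI SAPIs on modern PHP, and only works if the extension is not already loaded via php.ini:

```php
<?php
// Hypothetical: export the variable first, then load the extension at
// runtime, so OpenLDAP sees the variable when it initializes.
putenv('LDAPCONF=C:\\openldap\\ldap.conf'); // example path

if (strncasecmp(PHP_OS, 'WIN', 3) === 0   // Windows only, per the question
    && !extension_loaded('ldap')
    && function_exists('dl')) {
    dl('php_ldap.dll');
}
```

If dl() isn't available in your SAPI, the SetEnv / apache_setenv() routes above remain the fallback.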

Solving the shared-server security problem for Python

So my group is trying to set up a shared-server environment for various and sundry web services. For PHP scripts, I think we've settled on setting disable_functions and disable_classes site-wide in php.ini, and using php_admin_value in each app's httpd.conf to force open_basedir; for Ruby scripts, Passenger's user switching.
We still need to find something for python though. Passenger does support python, but not for per-application security for specific sub-directories (it's all or nothing at the domain level).
Any suggestions?
(And if any of the previous doesn't make sense - well, I'm the guy who's supposed to set up the python support, not the guy who set up the php or ruby support, so there's still some "and then some magic happens" steps in there from my perspective).
Well, there is a system called virtualenv which allows you to run Python in a sort of isolated environment, and to configure/load/shut down these environments on the fly. I don't know much about it, but you should take a serious look at it; here is the description from its web page (just Google it and you'll find it):
The basic problem being addressed is one of dependencies and versions, and indirectly permissions. Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.4/site-packages (or whatever your platform's standard location is), it's easy to end up in a situation where you unintentionally upgrade an application that shouldn't be upgraded.
Or more generally, what if you want to install an application and leave it be? If an application works, any change in its libraries or the versions of those libraries can break the application.
Also, what if you can't install packages into the global site-packages directory? For instance, on a shared host.
In all these cases, virtualenv can help you. It creates an environment that has its own installation directories, that doesn't share libraries with other virtualenv environments (and optionally doesn't use the globally installed libraries either).
