Can PHP be restricted to work in a certain folder only?

I'm almost sure that PHP is always able to go anywhere on the server and do anything with any files, but I'm wondering if there's a way to restrict it to work only in one folder, and what the requirements would be.
Say I've got 50 WordPress installations in 50 folders. If a virus from an untrusted plugin infects just 1 installation, it instantly spreads to the other 49, too (because PHP can scan all the directories on the server).
Is there any way to prevent that? If a virus breaks into 1 WordPress installation, I want it to stay only there.
My hosting provider said it's not possible without buying another server. What is your opinion?

With PHP-FPM you can chroot the PHP workers (for absolute separation) and give every PHP application its own user and PHP configuration (timeouts, memory limits, etc.). You don't have to use chroot to have unique users: with simple file permissions you can make each webroot unreadable to anyone but the dedicated user for that webroot. This is also not specific to Apache; it works with any other webserver that supports FastCGI.
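A minimal pool definition along those lines might look like this (pool name, user, socket path and limits are placeholders; the pool file location depends on your distribution):
; /etc/php/fpm/pool.d/site1.conf (path varies by distribution)
[site1]
user = site1                          ; dedicated system user for this webroot
group = site1
listen = /run/php-fpm/site1.sock      ; one socket per pool
listen.owner = www-data               ; the webserver must be able to connect
listen.group = www-data
; optional: jail the workers to the site's directory
chroot = /var/www/site1
; per-application limits
php_admin_value[memory_limit] = 128M
php_admin_value[max_execution_time] = 30
The webserver then talks to each site's own socket, so a compromised site only runs as that site's user (and, with chroot, only sees its own directory tree).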
A somewhat easier way to set this up could be relying on PHP's open_basedir (there is some dispute about how secure open_basedir really is, since PHP's developers regularly fix bugs related to this feature).
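For example, with PHP running as CGI/FastCGI you can scope it per directory straight in php.ini; the path below is a placeholder:
; php.ini - these sections apply only when PHP runs as CGI/FastCGI (not CLI or mod_php)
[PATH=/var/www/site1]
open_basedir = /var/www/site1/:/tmp/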

You can install suEXEC and run PHP in FastCGI mode. With this configuration you can run each PHP instance under a different user.
I didn't try this tutorial myself but it looks good to me: How to set up PHP FastCGI with suEXEC on Debian
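Roughly, the per-site pieces end up looking like this with mod_fcgid (the user, paths and wrapper location below are placeholders, not taken from the tutorial):
# inside the site's <VirtualHost>
SuexecUserGroup site1 site1
<Directory /var/www/site1>
    Options +ExecCGI
    AddHandler fcgid-script .php
    FcgidWrapper /var/www/site1/cgi-bin/php-wrapper .php
</Directory>
# /var/www/site1/cgi-bin/php-wrapper (a tiny shell script owned by site1):
#!/bin/sh
exec /usr/bin/php-cgi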

Related

Perl/Python Scripts Fail to Access Internet/Network through Web GUI

Another way I could ask this question is:
How do I set pages served by Apache to have higher privileges? This would be similar to me setting an Application Pool in IIS to use different credentials.
I have multiple Perl and Python scripts I am publishing through a web front end. The front end is intended to run any script I have in a database. With most of the scripts I have no issues... but anything that seems to utilize the network returns nothing. No error messages or failures reported. Running the command from the CLI as root works; running the same command from the web GUI as www-data fails.
I am lumping Python and Perl together in this question because the issue is the same leading me to believe it isn't a code issue, it is a permissions issue. Also why I am not including code, initially.
These are running on linux using Apache and PHP5. Python 2.7 and Perl5 I believe. Here are examples of apps I have that are failing:
Python - Connecting out to VirusTotal API
Perl - Connecting to Domains and Creating a Graph with GraphViz
Perl - Performing a Wake On LAN function on a local network segment.
So after I posted this I looked into handlers like I use for IIS. That led me down the path of suEXEC, and through everything I tried I couldn't get Apache to load it. I even made sure that I set the setuid and setgid bits.
When I was researching that I ran across .htaccess files and how they can enable CGI scripts. I didn't want to put in .htaccess files so I just made sure the apache.conf was configured to allow CGI. That also did not help.
So finally, while I was studying .htaccess, I came across ScriptAlias. I believe this is what solved my issue. I modified the ScriptAlias section in an Apache configuration file to point to the directory containing the script. After some fussing with absolute paths and permissions for the script to read/write a file, I got everything to work, except it isn't going through the proxy set by the http_proxy environment variable. That is a separate issue though, so I think I am good to go on this one. I will attempt the same solution on my Perl LAMP setup.
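For anyone hitting the same wall, the relevant Apache configuration is roughly the following; the URL prefix and directory are examples rather than the original setup:
# map /scripts/ URLs onto the directory holding the Perl/Python CGI scripts
ScriptAlias /scripts/ "/srv/webtools/scripts/"
<Directory "/srv/webtools/scripts/">
    AllowOverride None
    Options None
    Require all granted
</Directory>
The scripts themselves still need to be executable by the Apache user and start with the right shebang line.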

Debugging PHP applications in folders other than 'xampp/htdocs' with PhpStorm

I know this may be a long and general question but I am struggling with it for the past two days and have achieved nothing.
I am a C# .NET developer and I use the Visual Studio IDE for my development, which does all the back-end work for me when creating projects, setting up virtual hosts, publishing the project, etc.
Now for some reasons I have to do a project in PHP, and I chose PhpStorm as my IDE. I installed XAMPP, the Apache server is working ok, and I set its PHP executable as the PHP interpreter in PhpStorm.
I don't want all my projects to be in xampp/htdocs, so I chose another location (d:\projects\phpStorm\<name of the project>) as my workspace when I first created the project.
I installed Xdebug using the steps the tutorial gave me:
Download php_xdebug-2.4.0rc4-5.6-vc11.dll
Move the downloaded file to C:\xampp\php\ext
Edit C:\xampp\php\php.ini and add the line
zend_extension = C:\xampp\php\ext\php_xdebug-2.4.0rc4-5.6-vc11.dll
Restart the webserver
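(For reference, the steps above only load the extension; a typical Xdebug 2.x block in php.ini that also enables step debugging might look like the following, where the remote settings are assumptions based on Xdebug 2.x defaults rather than part of the tutorial:)
; C:\xampp\php\php.ini
zend_extension = C:\xampp\php\ext\php_xdebug-2.4.0rc4-5.6-vc11.dll
xdebug.remote_enable = 1
xdebug.remote_host = 127.0.0.1
xdebug.remote_port = 9000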
And I can confirm that it is installed using phpinfo() in a PHP file located in xampp/htdocs.
My problem is with debugging. When I click Run->Run on an open PHP file in PhpStorm, it uses a free port and opens the file at a URL like this: localhost:port_number/<name of the project>, and everything is ok.
Now I followed this tutorial to configure Xdebug. In step two, when I go to Run->Web Server debug validation, fill 'Path to create validation script' with d:\projects\phpStorm\<name of the project> and 'Url to validation script' with localhost:port_number/<name of the project> (as mentioned above), and click Validate, I get this information (with an error on the last line):
Server Name: PhpStorm 10.0.3
Loaded php.ini: C:\xampp\php\php.ini
No debug extension is loaded
Follow this links to configure Xdebug or Zend Debugger. If you have
already configured debug extension in php.ini file check possible
reasons why it was not loaded:
You forgot to reload web server after changes in php.ini file.
You are configuring debug extension in the wrong php.ini (see the
loaded php.ini files below).
There are errors on attempt to load debug extension, e.g. version
incompatibility.
I want to know what I should do.
I really really want to set my projects in a location other than xampp/htdocs to organize them properly, just like I do in Visual Studio. So please don't suggest solutions involving me changing my working directory.
Thanks in advance.
P.S.: In JetBrains' tutorial, I see that the Xdebug IDE key is PHPSTORM, whereas mine is my PC's username. Is that ok?
I really really want to set my projects in a location other than xampp/htdocs to organize them properly, just like I do in Visual Studio. So please don't suggest solutions involving me changing my working directory.
Well, you're really going to hate this then, but unfortunately Apache (which is what PHP runs on here) only serves PHP from its configured document root, which in XAMPP's default setup is htdocs (public_html or www on some other servers). Sooooo...
That doesn't really mean this is hopeless, but you might have to jump through some silly hoops to get it to work this way.
Option 1
You may have some luck creating a symbolic link inside htdocs that points to your projects folder, from the command line (note that ln -s takes the target first, then the link name):
ln -s d:/projects/phpStorm c:/xampp/htdocs/phpStorm
If you're on Windows, this probably won't work as written (a native mklink alternative is sketched below). It also likely won't work between drives, and it may not work depending on your Apache configuration in XAMPP.
PROS: If it works, will do exactly what you want.
CONS: Probably won't work, if it does, will require nightmarish levels of config fiddling
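If you do want to try this on Windows, the closer native equivalent is a directory symlink created from an elevated command prompt (the project folder name is just an example):
:: creates C:\xampp\htdocs\myproject pointing at the real project folder
mklink /D C:\xampp\htdocs\myproject D:\projects\phpStorm\myproject
Apache may additionally need Options FollowSymLinks for that directory, and crossing drives is still hit or miss.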
Option 2
Use a remote development server, and sync over FTP with your IDE. I'm not super familiar with PhpStorm, but I can pretty easily do this in Netbeans or Eclipse. This is a good option when you need a local archive of a project retained. You might be able to set up an FTP server on your machine and accomplish this, however you are going to wind up with two copies of your project; one in your projects folder and the other in htdocs.
PROS: Your projects stay organized where you want them without much issue.
CONS: File duplication, they will have to be in htdocs anyhow for Apache to run php
Option 3
Accept that the technology is not designed to work this way and just put everything in htdocs where it belongs. Resisting the way technology works because you are used to a different workflow is how design flaws and really bad bugs happen. Use it the way it was meant to work and don't be scared of learning new things.
PROS: No conflicts with the XAMPP stack whatsoever
CONS: You specifically stated you don't want to do it this way, but this is really the best way
Option 4 (Don't do this)
Install PHP as a globally accessible command line utility across your entire system, and consequently get all kinds of crazy viruses and errors that you may not be able to fix ever.
PROS: Minor alleviation of aggravation with foreign workflows
CONS: All of the things. The worst things.
Option 5 (probably not going to work)
Try using VirtualHosts. There's a bunch of caveats with this though. First, doing this between different drives is nearly impossible to configure correctly due to security policies in your operating system that are difficult to overrule. Second, if you're on Windows (I assume you are if you are using XAMPP), you need to do all of the following (a rough sketch of the config follows the list):
-In Apache's main config file (httpd.conf in XAMPP), you need to enable (Include) the vhosts file.
-In the vhosts file, you need to create a new vhost.
-In the Windows hosts file, you also need to create a host entry, because for whatever reason Windows likes to arbitrarily add redundant steps. On every other OS, this step is not necessary. Also, you need to run your text editor as administrator to even do this at all.
-Restart apache when it's all set up
-Pray your machine will let you do this between drives (C: -> D:), or not take a million years to enable.
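If you want to attempt it anyway, the pieces look roughly like this (the hostname and paths are made up for the example):
# C:\xampp\apache\conf\extra\httpd-vhosts.conf
<VirtualHost *:80>
    ServerName myproject.local
    DocumentRoot "D:/projects/phpStorm/myproject"
    <Directory "D:/projects/phpStorm/myproject">
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
# C:\Windows\System32\drivers\etc\hosts (edit as administrator)
127.0.0.1    myproject.local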

Does PHP configuration via the Windows registry work for CLI?

I am trying to set the include_path specifically for a given script in a given configuration.
Per Directory Values would be ideally suited for the task, so I am trying this:
[HKEY_LOCAL_MACHINE\SOFTWARE\PHP\Per Directory Values\c\phpDevScripts]
"include_path"="c:\\path\\to\\dev\\lib"
It doesn’t seem to work for CLI scripts but the docs say nothing about this.
Quoted from the "User Contributed Notes":
Being able to put PHP directives in httpd.conf and have them work on a per-directory or per-virtual-host basis is just great. Now there's
another aspect which might be worth being aware of:
A php.ini directive put into your apache conf file applies to php when
it runs as an apache module (i.e. in a web page), but NOT when it runs
as CLI (command-line interface).
Such a feature might be unwanted by an unhappy few, but I guess
most will find it useful. As far as I'm concerned, I'm really happy
that I can use open_basedir in my httpd.conf file, and it restricts
the access of web users and sub-admins of my domain, but it does NOT
restrict my own command-line php scripts..
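If the registry route turns out not to apply to CLI, a workaround for a single script is to pass the directive straight to the CLI binary (the script name here is a placeholder; the paths are the ones from the question):
:: -d overrides an ini directive for this invocation only; myscript.php is a placeholder name
php -d include_path="c:\path\to\dev\lib" c:\phpDevScripts\myscript.php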

Developing a PHP/mySQL app on IIs

I am strictly a LAMP dev but an ad agency I work with is courting a government agency whose RFP requires that their site be delivered via a Windows server.
What advice do folks have on this? Are there specific pitfalls? It seems like I have heard that file uploads and folder permissions are very different on Windows servers.
Any advice would be greatly appreciated.
IME, IIS can behave very oddly at times.
The permissions model is primarily ACL based - so it's certainly possible to design a system which mimics the way Unix works - but (just as with Unix) get the permissions model right, and don't tinker with permissions / ownership in your code.
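As a concrete (assumed) example of granting rights through ACLs up front rather than from code, giving the IIS worker group modify rights on an uploads folder might look like this; the path is made up:
:: grant the IIS worker group modify rights, inherited by subfolders and files
icacls "C:\inetpub\wwwroot\myapp\uploads" /grant "IIS_IUSRS:(OI)(CI)M"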
And of course you'll get yourself tied in knots if you try to move up directory hierarchies and cross over 'drives'.
Add to that a complete absence of the services you might invoke via popen(), and the POSIX tools.
Yes, people keep telling me its a nice place to visit but I wouldn't want to live there.
OTOH, a self-contained set of PHP files will run quite happily there.
PHP on a Windows server definitely tries your patience. Problems that I've run into include making sure that IIS is configured to use the correct php.ini file and, as you said, writing to files on the server as well as folder permissions.
That being said, if you can get it working correctly, it's not a bad production environment.
I would suggest getting your dev environment as similar as possible to what production will look like. That way you run into as few problems as possible when you deploy.
I can see some pitfalls for using PHP on IIS:
-Since IIS is multithreaded, unlike Apache on Linux which is multiprocess, some PHP scripts and extensions might not be thread-safe. Because of this, PHP should be installed and run as a CGI extension. CGI is slower than IIS's ISAPI and worse when compared to Apache's mod_php.
-Another pitfall I can think of is URL rewriting. IIS versions below v7 do not support URL rewriting (see the web.config sketch below for IIS 7+).
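On IIS 7 and above with the URL Rewrite module installed, the front-controller rewrite that many PHP apps expect can be expressed in web.config roughly like this (the rule name and target script are illustrative, not from the original answer):
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="php-front-controller" stopProcessing="true">
          <match url="^(.*)$" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Rewrite" url="index.php" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>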
Configuration of PHP with IIS is really a pain. But when you do configure it, make sure you use the same configuration, exact mirror images, everywhere you are developing, because a lot can go wrong with just one glitch.

Solving the shared-server security problem for Python

So my group is trying to set up a shared-server environment for various and sundry web services. For PHP scripts, I think we've settled on setting disable_functions and disable_classes site-wide in php.ini, and using php_admin_value to force open_basedir in each app's httpd.conf; for Ruby scripts we'll use Passenger's user switching.
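For context, that combination usually ends up looking something like this (the paths and function list are only examples):
; site-wide php.ini
disable_functions = exec,passthru,shell_exec,system,proc_open,popen
# per-app httpd.conf / vhost fragment
<Directory /srv/www/app1>
    php_admin_value open_basedir "/srv/www/app1/:/tmp/"
</Directory>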
We still need to find something for Python though. Passenger does support Python, but it doesn't provide per-application security for specific sub-directories (it's all or nothing at the domain level).
Any suggestions?
(And if any of the previous doesn't make sense - well, I'm the guy who's supposed to set up the python support, not the guy who set up the php or ruby support, so there's still some "and then some magic happens" steps in there from my perspective).
Well, there is a system called virtualenv which allows you to run Python in a sort of safe environment, and configure/load/shutdown these environments on the fly. I don't know much about it, but you should take a serious look into it; here is the description from its web page (just Google it and you'll find it):
The basic problem being addressed is one of dependencies and versions, and indirectly permissions. Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.4/site-packages (or whatever your platform's standard location is), it's easy to end up in a situation where you unintentionally upgrade an application that shouldn't be upgraded.
Or more generally, what if you want to install an application and leave it be? If an application works, any change in its libraries or the versions of those libraries can break the application.
Also, what if you can't install packages into the global site-packages directory? For instance, on a shared host.
In all these cases, virtualenv can help you. It creates an environment that has its own installation directories, that doesn't share libraries with other virtualenv environments (and optionally doesn't use the globally installed libraries either).
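A minimal usage sketch, with made-up paths and assuming virtualenv is already installed:
# create an isolated environment for one app (its own site-packages)
virtualenv /srv/apps/app1/env
# install that app's dependencies into the environment only
/srv/apps/app1/env/bin/pip install -r /srv/apps/app1/requirements.txt
# run the app with the environment's own interpreter
/srv/apps/app1/env/bin/python /srv/apps/app1/app.py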
