I am loading some content via AJAX. The content has a require_once in it that seems to cause it to hang for at least one second before it loads, which is really distracting on the page. I haven't had these delays before; the only thing I've changed recently is setting a php_include_path (C:/wamp/www/project) in an .htaccess file. When I take it out, the content loads immediately (though it also doesn't find the file).
It's only that one directory in the include path, though, so it seems strange that it would slow things down that much (right?). Is that abnormal? Where could I look for what's causing the delay?
I have a symlink that mirrors my files and folders to Dropbox (so the Dropbox folder is technically where they live), but I copied the files directly into C:/wamp/www and the slowness persisted.
My suggestion would be to define the PHP include path in your PHP rather than in your .htaccess, to see if that speeds it up.
For example, define it as a constant:
define('PHP_INCLUDE_PATH', 'C:/wamp/www');
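Note that defining the constant by itself won't change the include path; you still have to apply it, e.g. with set_include_path(). A minimal sketch using the constant above:
set_include_path(PHP_INCLUDE_PATH);
// or prepend it to the existing path instead of replacing it:
set_include_path(PHP_INCLUDE_PATH . PATH_SEPARATOR . get_include_path());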
I had a little, slightly annoying problem that was an easy fix, but I didn't understand why it happened.
I have several folders in ../wamp/www/ and in one of them I had a file I had created named index.php. When I clicked on the other directories I could browse through the normal listing where I saw all the files, but when I opened the folder containing index.php it was executed immediately and I never saw what was in the folder. Instead I had to manually change the URL to open other files in that directory.
I deleted index.php and half expected it to run some other .php file in that directory, but it didn't.
I'm aware that there is an index.php file in the www directory (with comments in French, though), which is clearly part of the WAMP system I downloaded, but having seen tutorials where people use index.php I'd have expected someone to mention this potential problem.
This is not a problem!
This is how web servers work.
To be specific, there is a setting in the Apache config that tells Apache the names of files it should run automatically when no actual file is specified in the URL.
Specifically, this line in httpd.conf:
DirectoryIndex index.php index.php3 index.html index.htm
It is normal for a web server to run one of these files when only the directory is specified in a URL.
And before you consider changing it: don't. Almost everything you will come across EXPECTS this to be the case.
The default file layout of MediaWiki seems quite insecure; good apps expose a single directory (not the entire hierarchy) to HTTP requests. I've tried to follow the instructions found at: https://www.mediawiki.org/wiki/Manual:Security#Alternate_file_layout
I created a /web directory under the base installation directory /wiki, and in it created two new files. /wiki/web/index.php looks like this:
<?php
chdir('../');
require_once 'index.php';
A similar /wiki/web/load.php was created. I've also added a soft link to the resources directory, so /wiki/web/resources is actually /wiki/resources.
The first request on the wiki (after restarting the server and setting the document root to /wiki/web instead of /wiki) does respond and shows the page well. However, subsequent responses produce a blank page with code 200 (success). Nothing comes up in the PHP error log either. As this happens after subsequent restarts I suspect some caching mechanism isn't working properly.
Any ideas on how to find the cause of the problem?
That chdir('../'); doesn't sound right. You should use an absolute path, as demonstrated in the manual you linked; in your case that would apparently be chdir('/wiki/');
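Putting it together, the wrapper would look something like this (a sketch, assuming the base installation really lives at /wiki):
<?php
// /wiki/web/index.php
chdir('/wiki');            // absolute path to the base installation directory
require_once 'index.php';  // hands off to the real MediaWiki entry point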
However, before going through such adventures, I hope you protected the actually sensitive paths like configuration files, uploads and cache... (See the manuals for the setup of those.)
I am a PHP newbie and I have a PHP security question. Is it possible for somebody to get the source code of a PHP script running on a server with the default configuration? If so, what is the best way to protect against it? I am asking because I happened to download a PHP file when I requested a page from a site, which is what triggered my concerns. I think maybe the Apache configuration was wrong and it served that file to me like a plain file, but I am not sure. Also, what is the best place to keep "sensitive" data such as database or SMTP configuration?
Thank you,
Alex
For the most sensitive information, I'd suggest putting it outside of your web root folder and including it through "require" or "include". This way, even if some configuration gets botched on the server, the visitor will only get served the line "include('secret_stuff.php');" and not the actual script.
Exactly what David Dorward said, but I would advise you to take a look at the following patches, which modify Apache so that it will not send source code if there is a misconfiguration.
http://mirror.facebook.net/facebook/patches/ap_source_defense.patch
Patch like so:
cd apache-1.3.x
patch -p1 -i ap_source_defense.patch
More Patches from Facebook Development Team: http://mirror.facebook.net/facebook/patches/
The best way to protect your source files is to place them outside the public root directory; Apache will not serve files from any folder above public_html.
for example:
C:/server/apache/
C:/server/apache/htdocs/
C:/server/apache/htdocs/includes/
People can specifically view the files by going to
http://hostname.tld/includes/
but having the directory structure of:
C:/server/apache/
C:/server/apache/includes/
C:/server/apache/htdocs/
and then within
C:/server/apache/htdocs/index.php
you have
<?php
require_once('../includes/config.php');
?>
This should protect all major files, bar the view file (index.php).
If the server is not configured to handle PHP files, then it will treat them like any other unknown file (and serve them as either text/plain or application/octet-stream).
PHP support is, as far as I know, always provided as an extension or external program (for CGI, FastCGI, etc.) and never as a built-in part of an HTTP server.
If the server is properly configured to run PHP code, then people without direct access to the server cannot view the PHP source code. You don't have to do anything else.
It is only because that server was not configured to run PHP, and instead served it as text, that you could see the source.
If you have this line in your Apache httpd.conf file,
AddType application/x-httpd-php .php
Apache should pass .php files to PHP for processing, rather than just showing their contents...
Also, you need to make sure the PHP service is started.
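For reference, a minimal mod_php setup in httpd.conf looks something like this (a sketch; the module path is hypothetical and varies by PHP version and installation):
LoadModule php5_module "C:/path/to/php/php5apache2_2.dll"
AddType application/x-httpd-php .php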
What you describe as "default configuration" is a webserver without php installed (or with php disabled). In these cases, it is certainly possible to download the php script.
Make sure php is installed (as it will be on ~100% of production php servers) and/or block access to your configuration file with an .htaccess file like this:
<FilesMatch "^config\.php$">
Order allow,deny
Deny from all
</FilesMatch>
If you want to be extra tricky (and have this work even on servers where .htaccess files are ignored), prefix the configuration file with .ht, like .ht.config.php. Most Apache (and some other webserver) configurations will refuse to serve files beginning with .ht. However, in general, the only way to be sure no webserver serves your file is to move it to a directory outside the server's document directory. On most hosts, though, neither you nor your PHP script will be able to access such directories.
Your second problem is misconfiguration. There's not much you can do about that, although there might(?) be ways to construct a RewriteRule to prevent accidental accessibility.
The best prevention, however, is to keep all scripts outside of the DOCUMENT_ROOT. Just leave a single index.php there, and include all dependencies from it. This is also the best strategy to avoid leaking configuration data (and don't use .ini files for sensitive data; always use .php scripts).
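For example, a configuration file written as a .php script produces no output even if the server somehow executes it directly (a sketch; the file name and values are made up for illustration):
<?php
// config.php, stored outside of DOCUMENT_ROOT; returns values instead of polluting globals
return array(
    'db_host' => 'localhost',
    'db_user' => 'app',
    'db_pass' => 'secret',
);
The single public index.php can then pick it up with $config = require '/path/outside/docroot/config.php';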
Another worry is shared hosting servers, however. All concurrent users on a server can read your scripts (if not through PHP, then via Perl/bash CGIs). There is nothing you can do about that, unless you switch to a professional host which supports running everything through suexec, thus allowing individual permissions.
Well, "default configuration" is a vague term, but as long as the web server is configured to parse .php files through PHP, you should be fine from that angle. If your scripts themselves happen to parse other PHP files (for eg. a template system) then you need to make sure there aren't any loopholes in that too, since the PHP scripts have full access to your files on the server.
Assuming these are taken care of, you don't need to keep the "sensitive" data in any special place -- just put them in your .php files, but make sure all your scripts end in .php (for eg. config.inc.php) so that they are always parsed via PHP and not sent as plain text.
I have an index.html in my WampServer www directory. On this page there is a link for a user to upload a file. When I hit the link, I select files to upload, but instead of running uploadmanager.php (which I have tested in my Eclipse debug environment and it works), it displays some part of the code on the web page without doing anything. This is not what I expect. Can someone please tell me what is wrong? Thank you.
Sounds like you are using PHP short open tags (<? instead of <?php) without having enabled them in your php.ini. Change your php.ini or use the standard open tags to solve this.
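In other words, a script starting with <? is served as plain text when the short form is disabled, while <?php is always parsed. To enable the short form, set this directive in php.ini and restart Apache (a sketch; short_open_tag is a stock php.ini setting):
short_open_tag = On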
Are you sure you enabled PHP in WAMP?
Try creating a new uploadmanager.php file directly in wamp/www (or whatever subdirectory) and paste the code from your tested uploadmanager script into the new file. Then try to run it in WAMP.
I think it is a permissions problem. I copied an index.php file into a c:/wamp/www subdirectory and it only displayed the code. Once I created a new index.php file and pasted the contents of the old file into it, it worked perfectly.
Are you posting to the uploadmanager.php page? Are you getting an error or just seeing the code? Can you post the code from the index.html page that handled the form and the part of the php code you're seeing for us to look at?
Every now and then I have Apache serving .php files as downloads instead of processing them on the server, but only for random requests.
Some reasons why this might happen are:
PHP misconfiguration
PHP-files in a directory without execute rights
wrong content type sent
timeout from script execution
In my situation the last bullet is the most dangerous, but luckily it seems to show up only immediately after modifying some of the .php files. I haven't tracked the problem down any deeper yet, but it seems to be related to filesystem-level operations (as disk I/O is a bottleneck) and presents itself only in the testing environment.
This may be a really stupid question... I started worrying last night that there might be some way to view PHP files on a server via a browser or some other means on a client machine.
My worry is that I have an include file that contains the database username and password. If there were a way to put the address of this file into a browser or some other system and see the code itself, then it would be an issue for obvious reasons.
Is this a legitimate concern?
If so how do people go about preventing this?
Not if your server is configured right. I think discussion of how that is done belongs on Server Fault.
To add on to the other answers:
If you use a file extension like .inc there's indeed a higher risk. Can you open the file directly in your browser?
The most important advice is missing:
Only the files that should be accessed by a browser should be in a publicly accessible location. All the other code (and configuration) should be in a completely separate directory.
For example
root
- webroot
- includes
- config
Only 'webroot' is exposed by your webserver (Apache). Webroot can, for example, contain a single index.php, along with all your assets (JavaScript, CSS, images).
Any code index.php needs to load comes from 'includes', and all the configuration from 'config'. There's no way a user could ever directly access anything from those two directories, provided this is set up correctly.
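For example, the single public index.php might bootstrap everything from the private directories (a sketch; the file names are hypothetical):
<?php
// webroot/index.php: the only script the webserver exposes
require __DIR__ . '/../config/settings.php';     // configuration from 'config'
require __DIR__ . '/../includes/functions.php';  // code from 'includes'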
This depends on the file extension you have given the include file.
If the extension is one that is known and executed by the web server, it will be protected. If you browse to the file, the server will try to execute the code rather than just returning it as plain text.
If the extension is not known by the web server it will serve it as plain data, so anyone (who can guess the file name) can browse to the file and see the source code.
A directory traversal vulnerability can be used to obtain files from the remote machine. Alternatively, an attacker can use MySQL-based SQL injection to read files using load_file(). You can also test your system with w3af's urlfuzzer, which will look for "backup files" such as index.php.zip. Also make sure that all files have .php extensions; a .inc file can be viewed by the public. I would also disable Apache directory listing.
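For example, with an injectable query and the MySQL FILE privilege, an attacker could read a script like this (a sketch; the path is hypothetical):
SELECT LOAD_FILE('/var/www/html/config.php');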
Normally there should be no way to view the PHP files remotely... it would be absolutely pointless. This completely depends on what web server you are using and how it's set up, though.
Having looked around, I can see that it is possible to protect a directory via .htaccess by adding these lines:
Order allow,deny
Deny from all
This apparently protects the directory so that only local, non-web access is possible.
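For what it's worth, on Apache 2.4 and later the same deny-all rule is written with the newer authorization syntax (assuming mod_authz_core is loaded):
Require all denied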
This allows me to keep my includes in a subdirectory of the main site directory, which is good for organisation, and it can be used on projects where I do not have access to folders outside the web root.
Does anyone else use this method?
Just for good measure I've set the directory permissions to execute only.
And the include files use the .php extension, as suggested by others.