Can PHP require any PHP file on my PC?
I set the Apache www root folder to d:\phpnow\htdocs, and I used to think that PHP could only require PHP files under this folder, such as require('laji/hello/a.php');
Today I found out that PHP can load any PHP file on my PC; it only needs the full path.
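For example, something like this gets loaded even though it is far outside the document root (the path is made up, just to illustrate):
<?php
// a file completely outside d:\phpnow\htdocs (hypothetical path)
require('D:/some/other/folder/secret.php');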
How can I prevent this? It does not seem safe for a web server.
Can PHP require any PHP file on my PC?
Any file that the user the PHP program runs as has permission to access (that is to say, filesystem permissions).
How can I prevent this?
Limit the permissions on the file system, or chroot the server so it runs in a sandboxed environment. (I have no idea whether chrooting is possible on Windows.)
It does not seem safe for a web server.
It is perfectly safe unless either:
You allow untrusted users to install their own PHP programs on your PC (but see also What do you recommend for setting up a shared server with php)
You allow file paths on your filesystem to be selected via unfiltered user input (see the sketch below)
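A minimal sketch of that second case, using a made-up page parameter and made-up page files, plus the usual whitelist fix:
<?php
// DANGEROUS: the visitor controls the path, so ?page=D:/anything/on/disk.php
// (or ../ tricks) would pull in any file the Apache user can read:
//     require($_GET['page']);

// Safer: only accept values you expect (a whitelist), never the raw input.
$pages = ['home' => 'pages/home.php', 'about' => 'pages/about.php'];
$key   = $_GET['page'] ?? 'home';
require($pages[$key] ?? $pages['home']);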
PHP can include any file on the server within its jailed limits, if any. In this case your computer is the server. It's not a security issue, since a remote server has no way of accessing your file system.
You can deny access to a directory using a .htaccess file, since you run PHP with Apache.
If you want to block direct access to the whole includes folder, you can put a .htaccess file in that folder (the file has only an extension and no filename; you can type the code in Notepad and save it as ".htaccess", with the quotes, so that Notepad keeps the exact name) containing:
deny from all
If you want to disable directory listing, here is a tutorial:
Directory listing in htaccess. Allow, Deny, Disable, Enable Directory Listing in .htaccess
and you may refer to this Stack Overflow question: .htaccess deny access to folder
Just Google for denying folder access using .htaccess and you will find lots of material.
Related
I'm having problems with my Apache2 web server. I run LAMP on a VPS (Debian 9, 64-bit).
I have two VirtualHosts, Alpha and Beta.
Each VirtualHost has a different DocumentRoot: Alpha has /var/www/A, and Beta has /var/www/B.
The problem is that I don't want Beta to be able to include /var/www/A/index.php in its files, and the same goes for Alpha: I don't want it to be able to include /var/www/B/index.php (or any other document) in its files.
How can I do this? I have already tried lots of methods using .htaccess, but nothing worked, for example:
Order Allow,Deny
Deny from All
Allow from mydomain.com
Thank you! I hope for an answer... it's really important :)
If you are talking about PHP's include, it is not possible to achieve this with .htaccess, since you can include any file in the whole file system that can be read by the Apache user.
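To illustrate: as long as both VirtualHosts run as the same Apache user (the default), code under Beta can simply do the following; .htaccess never comes into play, because this is a filesystem read, not an HTTP request (the file names below are only examples):
<?php
// running under Beta (/var/www/B), yet reading Alpha's files directly
require '/var/www/A/index.php';
echo file_get_contents('/var/www/A/config.php'); // hypothetical file name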
A solution would be to run Apache with different user access depending on the document root, so that you can only include (read) the files inside the document root defined for the virtual host. I think it is possible using an Apache module or some other Unix program (I don't remember which). It is the same solution used by web hosting providers when they give you a folder inside the file system and you can only read the files inside that folder: they give you a user name (a Unix user) which has read access only to a specific folder, Apache runs with the rights of this user, and so on for PHP.
I just made a mistake in a PHP application I'm developing with WAMP Server.
My WAMP / www folder is on my D:\ drive, where I also keep my personal data. My app, due to a failure in generating a dynamic path, deleted all my music, my photos and other personal files I had.
I mean... WHAT? How was that possible? I will need a recovery tool to get that data back.
How can I keep PHP from touching anything outside its folder in www so this does not happen again? It's a disaster.
Limit the files that can be accessed by PHP to the specified directory-tree, including the file itself.
http://php.net/manual/en/ini.core.php#ini.open-basedir
Use open_basedir to restrict file operations to within specific directories, like this (in the website's VirtualHost file)...
php_admin_value open_basedir "C:/WampDeveloper/Temp/;C:/WampDeveloper/Websites/www.example.com/webroot/"
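With that in place, PHP file functions that point outside the listed directories are refused. A rough sketch of what the buggy delete would have turned into (the music path is only an example):
<?php
// the dynamic path went wrong and points outside open_basedir
$path = 'D:/Music/holiday.mp3';   // example path outside the allowed directories
var_dump(@unlink($path));         // bool(false), plus an open_basedir warning (silenced here by @)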
Though if you are deleting via the command line or a .bat file (i.e., you are not using PHP file functions directly), the only way to fix this is to set Apache to run under a custom account that only has permissions on WAMP's folder.
I have a situation where I want to protect a file from public access, but enable read and write from PHP. The file contains sensitive information like passwords.
The problem is that
I cannot put the file outside the web root (server security restriction on access from php)
I would like to avoid a MySQL database.
Also, I would like to avoid .htaccess files.
So if I make a folder, say private, in the web root, and do
chmod 700 private
Then, if the file to protect is private/data, I do
chmod 700 private/data
Will this be a safe setup, so that I can read and write to the file from PHP, but it is not accessible to the public?
Is this a safe setup?
PHP runs as the same user as the web server, so if PHP can read it, so can your web server (and vice versa).
If you don't want to use .htaccess there is another trick: save the file as a .php file. Even if someone accesses the file from the web they can't see the source; they might just get a white page, or maybe an error, depending on what exactly is in the file.
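One common way of doing that (a sketch; the file and key names are made up) is to keep the values inside PHP and return them, so a direct request executes the file and prints nothing, while your own code loads it with $config = require __DIR__ . '/secret.php';
<?php
// secret.php - requesting this file over HTTP executes it and outputs nothing,
// so the values below are never shown to the visitor.
return [
    'db_user'     => 'app',
    'db_password' => 'change-me',
];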
If you're running suPHP or FastCGI PHP, you can use a setup similar to what you've described to limit access to files. Otherwise, PHP will use the same user as the web server, and any file PHP can access is also accessible via URL.
If you want to keep the restrictions you stipulated (which are rather strange), and as (I guess) you do not wish to, or cannot, use Apache config directives, consider adding the PHP user to some group and giving only that group rights to the file, i.e. Apache cannot read it (as long as it is not running as root/wheel).
Or make it a valid .php file (so only PHP would be the invoker when the file is requested) which returns nothing or redirects when invoked over the web. Or just encrypt it.
Sorry if this is a trivial question.
I am kind of new to PHP and I'm creating a project from scratch. I need to store my application logs (generated using log4php) as files, and I don't want them to be public.
They are currently stored in a subfolder under my PHP application folder (/myAppFolder/logs), so they are served by Apache.
Where shall I store them, or what shall I do to keep them away from being served as content by Apache?
You can either keep them in a directory above the web root, or, if you're on a shared host or can't place files above the root for whatever reason, keep them in a directory that denies all HTTP access.
So you could have a folder called "secret_files" with a .htaccess file sitting inside:
.htaccess:
deny from all
This will prevent HTTP access to files and subfolders in that folder.
Somewhere not under the public root!?
This is more of a server config question, as it depends on your server, but in Apache you could use the custom log directives to set the location. So if you have
/www/myapp
Create
/www/log
and put them there instead. You need control over the config to do this, so look up your web host's docs to find out how.
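Once the folder exists, point your logging at it. A minimal sketch using plain error_log (the file name is just an example; log4php's file appender can be aimed at the same directory):
<?php
// /www/log sits next to /www/myapp but is not under the DocumentRoot,
// so Apache never serves these files.
$logFile = '/www/log/app.log';                                  // example file name
error_log(date('c') . " application started\n", 3, $logFile);   // type 3 appends to the given file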
I think this should be something easy, but after searching all over the web I couldn't find an answer, so I decided to ask here.
I have a file uploader on my website that works with PHP. The folder where files are uploaded has chmod 777. I also have a PHP script to list the files in that folder. What I need is to allow PHP to upload and browse files in that folder, but not to allow people to do it directly. The only solution I can imagine is to chown that folder to a user different from the default, so I could later chmod it in FileZilla and allow only the owner to access it; that way people would see the files through the output of the PHP script, but not by navigating to that folder.
I'm using Debian with Apache2. I'd like to know what I could do.
To make it short, my aim is to allow PHP to upload, read, write and execute files in that folder, but not clients, unless they use my PHP script.
Thanks in advance
Put all the files you're talking about in their own directory. Add a .htaccess file to that directory. The contents of the .htaccess should be deny from all.
This will prevent any user from manually accessing the files, as access will be blocked. Your PHP script can still browse the contents of the folder and serve the files up as attachments with the correct content type.
For more info on how to serve a file for download in PHP, read this: https://serverfault.com/questions/316814/php-serve-a-file-for-download-without-providing-the-direct-link
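A minimal sketch of such a download script, assuming the protected folder is called uploads and that the requested name is checked against what you actually allow (not shown here):
<?php
// download.php - streams a protected file as an attachment.
// basename() strips any directory parts, so ../ tricks in ?file= do not escape the folder.
$name = basename($_GET['file'] ?? '');
$path = __DIR__ . '/uploads/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);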
All services, including web servers, run in a security context, which is an account in the OS; for example, Apache starts as the apache user in the apache group. It is enough to change the owner of the directory to this user and group and set the mode accordingly. Never chmod a directory to 777 unless there is a good reason for it. Using this trick, only the web server process can read, write and execute in that directory.
Likewise, if you want browser clients not to see (read) the contents of that directory, you should deny directory listing on it. I think it is disabled by default.