Invalid URL Redirection - PHP

I have different folders in the site I am making.
If a user tries to enter one of those folders directly, how do I keep them from seeing what's inside?
Or can I redirect them to another page saying that they don't have permission to access the folder / that the URL is invalid?
I read something about .htaccess but I don't know how that works.
I am currently using a workaround: adding an index.php to every folder with a message saying they don't have permission to access it.
But it's kind of a pain, and I believe there's an easier method.

It depends on the contents of the folders:
If it contains PHP or configuration files that are never to be opened directly (or anything else that never needs to be requested directly by the browser), you should not put them in the web-root;
If it contains assets that are included in HTML but you do not want the visitor to browse the directory, you should configure your web-server so that directory browsing is disabled;
If only certain logged-in users should be able to open certain files, you should handle that in the file itself, not on the directory level (a minimal sketch of such a check follows this list).
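For the logged-in-users case, a minimal sketch of such an in-file check (assuming the login state lives in a session variable like $_SESSION['user_id']; the file and variable names are only illustrations):
<?php
// protected-file.php - hypothetical file that only logged-in users may open
session_start();

// Refuse the request if nobody is logged in.
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('You do not have permission to access this file.');
}

// ...otherwise output the protected content here...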
If you cannot move your directory out of the web-root but nothing in it needs to be accessible by the browser, you can put an .htaccess file in that directory with just the following contents:
deny from all

This is something you should accomplish at the server level.
Basically, restricting access to these folders on UNIX (for example by removing read permission with chmod) will take care of this for you.


Setting file permissions on a web server

Need a bit of clarification on this.
I have a folder in my web server that will contain sensitive information that no one should be able to read. My script currently does this:
makes the folder with 0777 permission and places an image in that folder
I have a second script that does this:
pulls that image from that specific folder, and shows it to the user
However, right now if the user knew the exact name of the parent folder, they can just type it in their browser and see all the images contained in that folder, like: www.testsite/test/images
What file permission can I use instead of 0777 that will allow these two scripts to write to and read from the folder, WITHOUT allowing anyone to view the contents of the folder by typing its path into their browser?
If I understand your problem correctly, you're worried about a user typing in /test/images/ into the URL bar, and seeing the directory listing containing your secret file.
Setting a chmod of 000 would mean that neither of your scripts (nor you) would be able to access the folder.
In my opinion, you'd be far better off using .htaccess with deny from all. This will make it so that you cannot 'open' any file in that folder, though you can still include them in PHP.
Alternatively, you may opt to create an index.php in your /images/ folder and set up an automatic redirect with header('Location: /'). This way a user wouldn't be able to see the directory listing.
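A minimal sketch of such an index.php (assuming you simply want to bounce the visitor to the home page; the target URL is just an example):
<?php
// index.php placed inside /images/ - anyone browsing to the folder gets redirected
header('Location: /');  // send the visitor to the home page instead
exit;                   // stop here so nothing else is output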
Hope this helps! :)

How can I prevent access to my PHP include files like header.php, footer.php and the likes?

I am developing a website for myself and I just wonder how I can prevent direct access to include files like header.php and footer.php. Those files should only be incorporated into pages like index.php or other pages where they will be called using <?php include(''); ?>. Should I do it through PHP? How about editing the .htaccess file, or are there any other methods?
Place the file(s) in a directory outside the web root.
The web server will never serve these files to users.
PHP et al. can still access the files via include/require, etc.
This has been the gold-standard approach for decades.
I offered 3 suggestions, and since you didn't provide much to go on, I will give you one elaboration.
As Dragon alludes to, when you use include() you're reading via the file system and not via an HTTP request. You can check for HTTP request data ($_REQUEST, $_GET, $_POST) and refuse to show content, or fake a 404:
<?php
// If the file is being requested over HTTP directly, pretend it does not exist.
if (isset($_REQUEST) || isset($_GET) || isset($_POST)) {
    header("HTTP/1.0 404 Not Found");
    die();
}
// Do the needed
?>
I will let you figure out the gotcha on your own here.
It would be perfect if your server is Linux, because then you can follow Dagon's suggestion of placing the files to include outside of the web root.
The web root of course is the base folder that contains files the outside world is meant to access. On many systems, this is the public_html folder.
On a system with WHM/cPanel installed, you might have a special user account whose root (where anything can be stored) is located at /home/user on the system. This can be viewed with the file manager utility included with cPanel when logged in. In that /home/user folder, you may find configuration files and folders starting with a period, as well as the public_ftp and public_html folders.
In the /home/user folder, you can place the PHP files you don't want the world to directly access. Then, in public_html (accessible within /home/user), you can place the index.php file that has the include statement for the "protected" files. That way, in index.php you can use this statement:
include "../file-to-include.php";
Just make sure that the administrator has set the owner of the /home/user folder to the same username you log in with, or you may get access-denied messages when trying to access the file. On a decent cPanel setup, this last step will already have been done for you.

Folder to save files that are retrieved with require in the document tree

I'm building a website based on PHP and I want to ask where to put files that are retrieved with a require statement, so that they cannot be accessed by users with their browser.
(For example, a PHP file that connects to my database.)
EDIT: Actually, I think the better way is to put them outside the public root, because the Apache tutorial says .htaccess has a slowdown impact. It can be done by adding a ../
For example require("../myFile.php"); (at least this works on my server).
Best regards to all
That depends on the web server configuration. Usually (or at least in all cases I have witnessed), you have an account root directory which cannot be accessed by users with their browser, and inside it a folder containing all public material (often called htdocs, httpdocs, public_html or something of the kind). Often, you can place your PHP include files in that account root, and then require them using require("../include_file.php");
However, it depends on the configuration whether PHP can include files outside your public folder. If not, a .htaccess file is your best option.
If you place those files outside the document root of your webserver users cannot access these files with a browser.
If you use apache you can also place these files in a directory to which you do not allow access with a .htaccess file.
And as a last remark, if your files do not generate output, users cannot see their contents anyway (provided the server actually parses them as PHP), because only a script's output is sent to the browser.
If you mean the source code, it is not visible to users. If you want to hide a folder's contents, use the .htaccess directive Options -Indexes to hide the file listing. If you can see PHP source in the browser, your server configuration is wrong and it is not parsing PHP files.
You normally place them into a directory that is not accessible over the webserver (outside the document or web root). Sometimes called a "private" directory.
You then include/require the file from that path as PHP has still access to the files.
See also:
placing php script outside website root
disable access to included files - for a method to use if you're not able to place the files in a private directory.
Just make them secure with .htaccess!
Here's a very clear tutorial for protecting files with a password. If you don't need direct access to the files via the browser, or only your scripts need access, just block them completely by changing the directives between <Files xy> and </Files> so that the block reads:
<Files xy>
Order allow,deny
Deny from all
</Files>
Then you won't need your .htpasswd file anymore either!
You need to put these files outside of public-facing folders on your web server. Most (all?) web hosts should have the capability to change the document root of the website.
For example, let's say that all of your files are served from the following directory on your host: /home/username/www/example.com/
This means that anything that resides inside that directory is visible to the internet. If you went to http://example.com/myfile.png it would serve the file at /home/username/www/example.com/myfile.png.
What you want to do is create a new directory called, for example, public which will serve your files, and point the document root there. After you've done that, the request for http://example.com/myfile.png will be served from /home/username/www/example.com/public/myfile.png (note the public directory here). Now, anything else that resides within the example.com directory won't be visible on your website. You can create a new directory called, for example, private where your sensitive include files will be stored.
So say you have two files: index.php, which serves your website, and sensitive.php which contains passwords and things of that nature. You would set those up like this:
/home/username/www/example.com/public/index.php
/home/username/www/example.com/private/sensitive.php
The index.php file is visible to the internet, but sensitive.php is not. To include sensitive.php, you just include the full file path:
require_once("/home/username/www/example/com/private/sensitive.php");
You can also set your application root (the root of your websites files, though not the root of the publicly accessible files) as a define, possibly in a config file somewhere, and use that, e.g.:
require_once(APP_ROOT . "sensitive.php");
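A minimal sketch of that approach, assuming a config.php kept in the private directory (the file name, constant name and paths are only examples):
config.php (in the private directory):
<?php
define('APP_ROOT', '/home/username/www/example.com/private/');

index.php (in the public directory):
<?php
require_once '/home/username/www/example.com/private/config.php';
require_once APP_ROOT . 'sensitive.php';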
If you can't change the document root, then what some frameworks do is use a define to note that the file shouldn't be executed directly. You create a define in any file you want as an entry point to your application, usually just index.php, like so:
if (!defined('SENSITIVE')) {
    define('SENSITIVE', 'SENSITIVE');
}
Then, in any sensitive file, you check that it's been set, and exit if it hasn't, since that means the file is being executed directly, and not by your application:
if (!defined('SENSITIVE')) {
    die("This file cannot be accessed directly.");
}
Also, make sure that your include files, when publicly accessible (and really, even if not), have a proper extension, such as .php, so that the web server knows to execute them as PHP files, rather than serving them as plain text. Some people use .inc to denote include files, but if the server doesn't recognize them as being handled by PHP, your code will be publicly visible to anyone who cares to look. That's not good! To prevent this, always name your files with a .php extension. If you want to use the .inc style to show your include files, consider using .inc.php instead.

How do I prevent public downloads of files using php?

I have a script that allows only authorised users to upload files to a certain folder.
However, I do not know how to prevent people from downloading them freely without logging in.
I need the solution in PHP.
I have googled around but found nothing straightforward so far.
Currently, in my document root I have a folder called admin with a subfolder called uploads inside it, so only the admin role can upload. Both the editor and admin roles can download. What should I do in this case?
Please advise.
Put the files somewhere outside the public webroot directory, or configure your server to not serve the files. As long as your server will happily serve everything with a valid URL, there's nothing you can do with PHP to prevent that.
If your files are in the /public_html/ folder, take them out of that folder and place them in e.g. /secret_files/, so your directory structure looks something like this:
public_html/
    index.html
    admin/
        admin_index.php
secret_files/
    my_secret_file.txt
The webserver is only configured to serve files from the /public_html/ directory, so nobody will have access to directories outside it (such as /secret_files/).
To still enable somebody to download those files, do as cletus suggests and use readfile to "manually serve" the files via a PHP script. PHP will still have access to these other parts of the file system, so you can use it as a gatekeeper.
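A rough sketch of such a gatekeeper script (the secret_files path, the session flag and the file parameter are assumptions for illustration, not a fixed recipe):
<?php
// download.php - serves a file from outside the web root, but only to logged-in users
session_start();

if (empty($_SESSION['user_id'])) {       // assumed login flag set by your login script
    header('HTTP/1.1 403 Forbidden');
    exit('Please log in first.');
}

// basename() strips any path components so "../" tricks cannot escape the folder
$file = '/home/username/secret_files/' . basename($_GET['file'] ?? '');

if (!is_file($file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
readfile($file);  // PHP reads the file from disk and streams it to the browser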
Don't store the files in a directory under the document root.
Instead move them somewhere else and then a PHP script can programmatically determine if someone can download them and then use readfile() or something similar to stream them to the user.
You could also configure the Web server to not serve files from this directory but then you need PHP to serve them anyway. It's cleaner simply not to put them under the document root.
Answering question on how to password protect with PHP:
This should solve your problem.

Folder security?

I have a folder named upload which is filled with folders of users uploaded files.
Is there any way I can stop people from directly downloading my users' files by simply typing the folder names and file name into the address bar?
Example: user Jim's folder is stored at HOST/uploads/jim
user Jim's important file "myimportantfile.txt" is stored at HOST/uploads/jim/myimportantfile.txt
Now, if just anyone types into the address bar: www.HOST.com/uploads/jim/myimportantfile.txt , they will be able to view Jim's important file.
How can I stop this from happening?
Can I write certain attributes when making the directories?
You don't want to have those files in a web-accessible folder. Move them out of the webroot.
Once you do this, you can have a file like download.php to which you pass a file ID and it can then validate it is in fact Jim asking for his files and only then fetch the file and output it to the browser as an attachment. This is the safest/best way for security.
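A rough sketch of what that download.php could look like (the uploads path outside the web root, the session variable, and passing a file name instead of a numeric ID are all assumptions made for the example):
<?php
// download.php - only serves files out of the logged-in user's own folder
session_start();

if (empty($_SESSION['username'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('You must be logged in to download files.');
}

// Uploads live outside the web root, one folder per user, e.g. /home/site/uploads/jim/
$folder = '/home/site/uploads/' . $_SESSION['username'] . '/';
$file   = $folder . basename($_GET['file'] ?? '');  // basename() blocks "../" tricks

if (!is_file($file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// Send the file back as an attachment
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
Jim would then fetch his file via something like download.php?file=myimportantfile.txt instead of a direct /uploads/... URL.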
I believe directory permissions of +w-r+x will allow writes to the directory but not reads. In geeky Unix terms this is chmod 733 dirname. The directory ownership would have to be set properly using chown and chgrp. This applies to a Unix environment.
You could use an .htaccess file to require a username and password to be entered making each folder a protected folder.
But I think the best way to do it would be to move the uploads folder outside of the webroot so that it's not directly accessible, and then create a script (PHP, ASP, etc) that serves up the requested file after authenticating the user.
The simplest solution is to just add an index.htm file to the folder.
Any visitors will then see this page rather than the index of files.
The page can be blank or, even better, redirect to the domain's home page.
Sure, you can use basic file/directory permissions in Linux. You can also set the entire tree to be denied by Apache.
What platform / webserver software are you running?
Okay, Linux:
If the owner of the directory is 'joe', and the group is 'apache', then:
chmod 750 joe
This would give the directory 'joe' permissions which allow the owner (joe) full access, the group (apache) read access and the ability to enter the directory, and nothing for anyone else.
Is this an FTP drop-box?
What are the ownerships/groups like now?
