.htaccess permissions to stop outside world executing php scripts - php

I am setting up a new website and currently, if I go to mydomain/php/someScript.php, it will execute the PHP script. How can I let the files that include these scripts still include them, but not let anyone else execute them from the browser? Currently I have this in my .htaccess file:
deny from all
but when I visit the site, an AJAX POST request is made to a script in this folder and it gets back a 403 error.
Any ideas on how to achieve this are welcome.
====EDIT====
For clarity: some files in the php directory are requested by AJAX, and I've now been made aware that those files can't have the desired permissions. However, I would still like to put these permissions on the other files in this directory.
Thanks

The best solution is to put them outside of the web root directory if at all possible. That way you can include them but the web server can't serve them, and no configuration is required at all in this case.
EDIT: I noticed you want to allow access to some of the scripts by AJAX. There is no way of doing this, as there's no way of telling the difference between an AJAX request and any other type of HTTP request with any reliability.
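For the non-AJAX includes, the layout might look something like this (paths are illustrative, assuming the document root is /var/www/example.com/public):
<?php
// /var/www/example.com/public/index.php
// The includes directory sits one level above the document root, so Apache
// never serves it directly, but PHP on the server can still pull the script in.
require dirname(__DIR__) . '/includes/someScript.php';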

You can still include those files from PHP, e.g. using include or require.
Calling a script via AJAX is no different from calling it by entering the URL in the browser - i.e. you cannot block direct access but allow AJAX access.

Related

How to verify logins on direct navigation to resource pages on an AJAX site

I have an SPA that uses AJAX calls to assemble content from multiple PHP files. I can add the following to the main application's config file to redirect users who are not logged in back to the login page, as long as they tried to go through the portal to look at the content.
// Verify login before serving this resource
// (assumes session_start() has already been called by the including script)
if (empty($_SESSION["loggedIn"])) {
    echo 'resource denied <script>window.location.href = "https://' . $_SERVER['SERVER_NAME'] . '/login.php";</script>';
    exit();
}
The problem is that there are tons of views, models, controllers, and third-party widgets that can still be accessed directly if someone simply scans the site for common file architectures.
Is there a way to use something like an .htaccess or php.ini file to automatically add this login check to all of the PHP files in a directory, so that I don't have to paste it into each and every page?
Barring that, is there a way to set my chmod permissions to only allow indirect access to those files, so that PHP scripts running on the server can use them but they cannot be visited directly? Thanks.
[EDIT]
Moving files outside of my public folder did not work because it broke the AJAX.
I also tried auto_prepend_file in an .htaccess file, but this resulted in a 500 error. I am using a VPS that apparently won't let me set AllowOverride All in my Apache pre_virtualhost_global.conf; otherwise, I think that would have been the right way to do this.
Setting the permissions (chmod) of my resource folders to 0750 appears to allow the AJAX calls to execute without allowing direct access to the files. If anyone knows of any other security caveats to be aware of when doing this, let me know. Thanks.
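For reference, if auto_prepend_file does become usable (either via php_value in .htaccess where AllowOverride permits it, or via php.ini), a minimal guard file might look like this; the file name and the login-page path are assumptions, not from the original setup:
<?php
// auth_guard.php - prepended to every PHP request, e.g. with
//   php_value auto_prepend_file "/path/to/auth_guard.php"   (in .htaccess)
// or
//   auto_prepend_file = /path/to/auth_guard.php              (in php.ini)
if (basename($_SERVER['SCRIPT_NAME']) === 'login.php') {
    return; // don't guard the login page itself, or it would redirect forever
}
if (session_status() === PHP_SESSION_NONE) {
    session_start();
}
if (empty($_SESSION['loggedIn'])) {
    echo 'resource denied <script>window.location.href = "https://' . $_SERVER['SERVER_NAME'] . '/login.php";</script>';
    exit();
}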

Any other way than .htaccess to redirect requests

I'm trying to get the back end of an app that was built by someone else, and then taken down, up and running again. I have uploaded the unmodified back-end source code to a dev server (not the original server it was on).
In the app, the URL to access a certain API is as such:
[hostname]/controllers/api/user/profile
and when looking at the back-end PHP code, under the api folder there is a user.php file, and in that file there is a function called "profile"; there is one for every API endpoint.
Now, the only way I know of to do this is to have an .htaccess file that rewrites a request for /controllers/api/user/profile to /controllers/api/user.php?action=profile, and have a big switch statement in user.php that calls the function corresponding to the "action" parameter.
But the weird thing is that there is no .htaccess file in the api folder. The only .htaccess file is in the absolute root of the folder containing all the server code, and that just says deny from all.
Is there any other way to set up a server so that requests to .../folder/functionName actually call a function within a PHP file, other than using .htaccess?
It can be done through a PHP redirect. It is described here in detail: How to make a redirect in PHP?
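Another setup worth checking for, sketched here as an assumption rather than a description of the original app, is a single front controller that receives every unmatched request (for example via Apache's FallbackResource directive or an nginx try_files rule set in the server config rather than .htaccess) and dispatches to the matching file and function:
<?php
// index.php - a rough front-controller sketch; expects URLs like /controllers/api/user/profile
$path  = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
$parts = explode('/', $path);   // e.g. ['controllers', 'api', 'user', 'profile']
if (count($parts) === 4 && $parts[0] === 'controllers' && $parts[1] === 'api') {
    $file   = __DIR__ . '/controllers/api/' . basename($parts[2]) . '.php'; // user.php
    $action = $parts[3];                                                    // profile
    if (is_file($file)) {
        require $file;           // defines profile() among others
        if (function_exists($action)) {
            $action();           // call profile()
            exit;
        }
    }
}
http_response_code(404);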

AJAX post with absolute url to backend file

I am working on an AJAX POST form which needs to send data to a PHP file. This file is located ONE level above the domain root.
If my domain root is /root_general/root_domain/
The PHP backend file is in /root_general/.
I am trying to achieve this by using dirname($_SERVER['DOCUMENT_ROOT']) as the URL, but AJAX won't load the file; it tells me that the file wasn't found on this server. I am using Apache2 on Ubuntu and working with all permissions enabled.
How can I do it another way? I need to put the file outside the root because it is supposed to be used by many different domains, and I think it wouldn't be clean to paste the same file inside every single domain root.
Edit: some code
When calling the file it's this way:
http[act].open('post',url,true);
You can't use AJAX to access files on the server's filesystem; you can only use it to access URLs. So what you need to do is point a URL at the file you want to access. You can give it its own domain, copy it a few times, or have symlinks point to it.
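One other option, offered here only as a sketch (file names are illustrative), is a thin wrapper inside each domain root whose only job is to require the shared backend from one level above the document root:
<?php
// /root_general/root_domain/api_proxy.php - the URL the AJAX POST actually targets
require dirname($_SERVER['DOCUMENT_ROOT']) . '/shared_backend.php';
The JavaScript then posts to the wrapper's URL instead, e.g. http[act].open('post', '/api_proxy.php', true); the shared logic still lives in one place, and only the one-line wrapper is duplicated per domain.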

How would I redirect a non-php file's URL, without editing the .htaccess file?

So basically I need a way to take an exe file on my server and write a PHP function to toggle its availability. I do not want to just rename, move, or delete the file, but rather redirect to a page explaining that the file is temporarily unavailable.
I know how to set a permanent redirect in .htaccess, but this doesn't solve my problem since I'd like to make this automated via a PHP script....
Any suggestions? Thanks in advance.
You won't be able to do this with PHP alone unless your web server is set up to process requests for .exe files as .php, or you add a rule in .htaccess. If the file exists, and .htaccess/server settings allow direct access to files, there's nothing PHP can do to stop access to it.
Honestly, the simplest way would probably be to write a PHP script that edits your .htaccess file; Apache re-reads .htaccess on every request, so no reload is needed. You could just run that script whenever you needed to enable/disable access.
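A minimal sketch of that idea, assuming the exe sits in a downloads/ subdirectory and that Apache honours .htaccess there; the file names and the redirect target are illustrative:
<?php
// toggle_download.php - rewrites the directory's .htaccess to either redirect
// requests for installer.exe to an "unavailable" page or serve the file normally.
function setExeAvailability(bool $available): void {
    $htaccess = __DIR__ . '/downloads/.htaccess';
    $rule = $available
        ? ''                                                          // no rule: serve as usual
        : "Redirect 302 /downloads/installer.exe /unavailable.php\n"; // send visitors elsewhere
    file_put_contents($htaccess, $rule);
}

setExeAvailability(false);  // take the file offline
// setExeAvailability(true); // bring it back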

prevent direct access to a php include

I have a PHP script, PayPal eStores/dl_paycart, but it has the PayPal eStores "settings.php" Security Bypass Vulnerability.
I would like to know if I can prevent direct access to a php include file.
Would this help?
defined( '_paycart' ) or die( 'Access to this directory is not permitted' );
Thank you
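For reference, the constant guard in that snippet only does anything useful if the legitimate entry point defines the constant before pulling the file in, roughly like this (file names are illustrative):
<?php
// index.php - the legitimate entry point defines the marker, then includes the file
define('_paycart', 1);
require __DIR__ . '/settings.php';

<?php
// settings.php - refuses to run when requested directly, since the marker is missing
defined('_paycart') or die('Access to this directory is not permitted');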
I would STRONGLY recommend finding a new script. Any sort of blocking is just sticking a finger in the dam; it isn't a permanent solution and eventually it's going to break.
If you really want to use it, check out .htaccess files, particularly "Order Allow,Deny" and "Deny from All".
The problem is that if someone is able to use "include" and read the code contents, variables, and the like, that means that they are already operating on the same server and, to be a bit crude, you're boned if they try to screw with you.
On the other hand, if you're looking to prevent outside access to the file from a remote server, then an include call from that server can only retrieve the output which would be displayed to any external visitor (and if the question is, "Can I prevent external sites from even loading this file remotely?", the answer is "through server configurations in httpd.conf and .htaccess files").
The long and the short, however, is that this is not something which can really be fixed with PHP, this is a server security issue.
The fact that the script has a .php extension offers some protection - any HTTP or HTTPS request for that file will go through the web server, which will execute the PHP before serving the response.
I would recommend moving the script to a directory under your public web directory and putting a .htaccess file in that directory that either blocks all requests or requires a password for access. Then include the script when needed from scripts in your public directory. See Apache's .htaccess Tutorial.
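A minimal sketch of that layout, assuming the protected scripts live in a protected/ subdirectory whose .htaccess contains "Deny from all" (or "Require all denied" on Apache 2.4):
<?php
// public_script.php - HTTP requests to protected/ get a 403,
// but PHP running on the server can still include files from it.
require __DIR__ . '/protected/settings.php';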
Probably the most secure way is something like this
// Whitelist of script paths (as seen in $_SERVER['PHP_SELF']) that are allowed to run this file
$allowed_files = array("/paths/", "/that/", "/are/", "/allowed/");
if (!in_array($_SERVER['PHP_SELF'], $allowed_files)) {
    die("Not Allowed");
}
Fill the array with the files that you would like to have access. (You might have to echo $_SERVER['PHP_SELF'] in each page and copy and paste the value in.) This checks that the script being executed is one of the allowed pages; if it isn't, the script dies.
I believe $_SERVER values might be able to be changed, but probably won't be. The file can still be read using fopen or file_get_contents, and if someone reads it, they will know what to change.
But I would forewarn that it is not 'completely secure', because there isn't really a way to make something 'completely' secure.
