I'm trying to use one PHP script on my server that calls other scripts.
The main script, called call.php, is in the public_html folder, so I can send an HTTP request to it using my_website.com/call.php?action=some_script_name&arg1=value&arg2=some_other_value.
I already have a method to form the new request (and execute it), if the action script is in public_html. For example, if some_script.php was located at /public_html/scripts/some_script.php, my HTTP request would be my_website.com/scripts/some_script.php?arg1=value&arg2=some_other_value.
I have that done already, and it works correctly. However, I want to send requests to scripts that are NOT in public_html (or any subdirectory of that). For example, if I have a script under /lib/otherscript.php, I want to call that as well.
I tried a request such as ../lib/otherscript.php?args_here, but that did not work.
Is this possible, and if so, how can I accomplish this?
Edit:
The actual file structure of the (shared) server looks like this (for this example):
/
    public_html/
        call.php
        scripts/
            some_script.php
    lib/
        otherscript.php
You can't access something outside public_html via HTTP; that's the whole point of the public_html directory. You have a few options:
Create a publicly accessible wrapper inside public_html that calls the functionality living outside it (a sketch follows below). The best way to do this is either a class or a function that takes the arguments from your URL as parameters.
Use the command line interpreter.
Either way, you may need to do some user authentication if the functionality is sensitive. If it's harmless, you can put it in public_html. If it's not, you need authentication/authorization checks.
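For the first option, a minimal sketch of such a wrapper might look like the following. The file name run_lib_script.php, the whitelist, and the /home/username/lib path are assumptions about your layout, not part of your setup:

<?php
// public_html/run_lib_script.php - publicly reachable, wraps a script that is not.

// Whitelist which outside scripts may be invoked, so the client
// cannot request arbitrary files on the server.
$allowed = ['otherscript' => '/home/username/lib/otherscript.php'];

$action = isset($_GET['action']) ? $_GET['action'] : '';
if (!isset($allowed[$action])) {
    http_response_code(404);
    exit('Unknown action');
}

// The wrapped script can read $_GET/$_POST itself, or expose a
// function/class that you call here with the query-string values.
require $allowed[$action];

The arg1 and arg2 values from your original URL are still available to the included script through $_GET, because an include runs in the same request.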
If you are using cPanel, you can include PHP files outside the public_html folder using an absolute path, e.g.: <?php require_once('/home/username/lib/otherscript.php'); ?>. You can then post parameters to any script within the public_html folder in which you have included otherscript.php.
Note: You have to use your cPanel user name in the absolute path.
Related
I need some help.
I was reading the security recommendations of my hosting service, and they say that ideally I should put only the index file and assets like CSS, JS, and images inside my root folder, and that all other files should be placed outside it, that is, one level above.
I tried doing this in my tests, and I had some problems. The structure of the hosting folders is:
/
/htdocs
Inside /htdocs I put the index.php file, and accessing it through the URL exemple.com/index.php works normally.
But putting other test files outside htdocs is where the problem starts. For example, if I have a file called contact.php
and I try to access it through the URL exemple.com/contact.php, I get a 404 error.
So the question I have to ask is:
Is it possible to access files by URL when they are outside of htdocs, or is it better to put all the files that will be accessed by URL inside htdocs and leave only configuration files (classes, functions, database connection, etc.) outside this folder?
And if it is possible to access the files by URL, how would I rewrite those URLs in .htaccess?
and that all other files should be placed outside it
Yes, this is good practice. However, you're misunderstanding the implementation.
You cannot directly access files outside the document root, but you can access them indirectly. I.e., the web server can't see them, but your code can.
Ideally, your site would use the front controller pattern. Here, your index.php file would serve every page of your app by intercepting every request and then routing it to the correct endpoint. I.e., you would never directly request /contact.php; you'd instead request /contact, which would get funneled to /index.php, which would load the required resources from outside the doc root.
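As a rough illustration only (the routes, the pages/ directory, and the rewrite rules are assumptions, not your actual project), the front controller could look something like this, with .htaccess sending every request that isn't an existing file to index.php (RewriteEngine On / RewriteCond %{REQUEST_FILENAME} !-f / RewriteRule ^ index.php [L]):

<?php
// htdocs/index.php - minimal front controller sketch.

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// Map public URLs to PHP files that live OUTSIDE the document root.
$routes = [
    '/'        => __DIR__ . '/../pages/home.php',
    '/contact' => __DIR__ . '/../pages/contact.php',
];

if (isset($routes[$path])) {
    require $routes[$path]; // the web server never serves these files directly
} else {
    http_response_code(404);
    echo 'Page not found';
}

With something like this in place, exemple.com/contact works even though contact.php itself has no public URL.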
All of the PHP files in the application are directly accessible through the URL.
Adding this code at the start of my PHP files works for a few of them, the ones that are requested with the POST method:
if ($_SERVER['REQUEST_METHOD'] == 'GET' && realpath(__FILE__) == realpath($_SERVER['SCRIPT_FILENAME'])) {
    die(header('location:/webapp/postings'));
}
But I also have some PHP files that are requested through the GET method, and the above code doesn't work for them, which is why I came up with the following code:
if (!isset($_SERVER['HTTP_REFERER'])) {
    die(header('location:/webapp/postings'));
}
I know that HTTP_REFERER can't be trusted. Any other options?
Can someone please tell me a generic way of preventing direct URL access without altering the code across all the PHP files?
Note: my application is running on an IIS 7.5 web server.
Don't do this:
public_html/
    includes/
        dont_access_me_bro.php
        ...
    index.php
    ...
Do this instead:
includes/
    dont_access_me_bro.php
    ...
public_html/
    index.php
    ...
Explanation
Keeping your source files outside of the document root guarantees that users will be unable to access them directly by changing the URI on their HTTP request. This will not protect against LFI exploits.
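For instance, with the second layout above, the publicly served index.php can still pull the protected file in through the filesystem. This is only a sketch, and the function it calls is invented for illustration:

<?php
// public_html/index.php
// __DIR__ is the directory of this file, so the path below resolves to
// ../includes/dont_access_me_bro.php regardless of the current working directory.
require __DIR__ . '/../includes/dont_access_me_bro.php';

// Use whatever the include defines; do_hidden_stuff() is a hypothetical example.
echo do_hidden_stuff();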
To find out where your document root is, this handy PHP script can help:
var_dump($_SERVER['DOCUMENT_ROOT']);
If this prints out string(25) "C:\htdocs\www\example.com", you don't want to store your files in C:\htdocs\www\example.com or any subdirectory of C:\htdocs\www\example.com.
If you place user-provided files inside your document root, you're creating the risk that someone will access them directly from their browser, and if Apache/nginx/etc. screws something up, their uploaded file may be executed as code.
So you would not want your files to be inside C:\htdocs\www\example.com\uploaded, you would want something like C:\uploads\example.com\.
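If those uploads do need to be downloadable, a small hand-off script inside the document root can stream them from the outside location. This is only a sketch; the download.php name and the C:\uploads\example.com path mirror the example above and are assumptions:

<?php
// public_html/download.php?file=report.pdf  (hypothetical entry point)
$dir  = 'C:\\uploads\\example.com\\';
$name = basename(isset($_GET['file']) ? $_GET['file'] : ''); // strips any ../ path tricks
$path = $dir . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found');
}

header('Content-Type: application/octet-stream');                    // always served as a download,
header('Content-Disposition: attachment; filename="' . $name . '"'); // never executed as code
readfile($path);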
This is covered in-depth in this article on secure file uploads in PHP.
I just finished setting up a simple mail transfer on my site using PHPMailer.
I have 3 questions about it:
I have read that you need to store your credentials in a separate file, and that there are 2 options, .ini or .php. Which one would be better, and how exactly should this file look?
Regarding the directory of the credentials file, I read that it should be located outside the web root (is just one level above fine?). In that case, how do I call it from inside the web root?
On the same matter, should Mail.php itself be located in the site directory, or should I take it out as well?
It's generally safest to put values like these in .php files, because they will render to nothing if requested directly, unlike a .ini file, which will usually be served as plain text.
Yes, one level above is fine - it means that the file does not have a public URL of its own. From a script running inside the web root, you'd just load it with require '../settings.php';
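For example, the credentials file could simply return an array; settings.php and the key names below are placeholders, not something PHPMailer requires:

<?php
// ../settings.php - lives one level above the web root, so it has no public URL.
return [
    'smtp_host'     => 'smtp.example.com',
    'smtp_username' => 'mailer@example.com',
    'smtp_password' => 'change-me',
];

And inside the web root you'd load it with a path anchored to the current file, which is a little more robust than a bare relative path:

<?php
// public_html/send.php (hypothetical)
$settings = require __DIR__ . '/../settings.php';
// $settings['smtp_host'] etc. can now be passed to PHPMailer.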
You don't say what Mail.php is, but generally any other PHP scripts can stay put. Things like class definitions are safe because they have no effect when run directly (or at least should have no effect, if you've written them safely!). That said, it's common to put your composer vendor folder outside the web root since you don't necessarily have control over what ends up in there.
This is what the map of my website looks like:
root:
- index.php
- \actions\ (various php files inside and a .htaccess)
- \includes\ (various php files inside and a .htaccess)
- .htaccess
I know that if I use "deny from all" in actions and includes directories, the files in them will be secured from direct access.
Now, my actions folder has many PHP files that are called by index.php (forms).
I have "deny from all" in the .htaccess inside \actions, but then access to those files is forbidden.
So, how can I protect the files from direct url access but have the exception that can be called from index.php?
The easiest way is to place a constant in index.php and check in the other PHP files whether this constant exists. If not, let the script die.
index.php:
define('APP', true);
various.php:
if (!defined('APP')) die();
If you want to block access using .htaccess, then most likely the best way of adding the exception is to add it to the same .htaccess file. If you want to prevent a PHP script from working (but not from being "visible"), then simply look for a certain condition (like a session variable or a constant) at the beginning of your scripts. Unless these scripts are invoked "the right way", this requirement will not be met, so it's safe to just die() then.
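A sketch of the session-variable variant mentioned above (the entered_via_index key is made up): it works even when a form posts straight to a file in \actions, provided the visitor has already loaded index.php and that file is not itself blocked by "deny from all":

<?php
// index.php - mark that the visitor entered through the front page.
session_start();
$_SESSION['entered_via_index'] = true;

<?php
// actions/some_action.php - refuse to run unless the flag is present.
session_start();
if (empty($_SESSION['entered_via_index'])) {
    die('Direct access not allowed');
}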
As a way to secure my files from outside access, I am considering placing all the included files outside the public_html folder or the httpdocs folder.
However, this comment is saying that nothing should be kept outside of the public folder that handles user input data.
What is the best and most ideal practice for this? My thinking would be to have a .htaccess route EVERYTHING to an index.php, and the index.php includes all the necessary files such as database connections and whatever else, and also includes the .php file which would have the HTML and PHP inside it for the main body content of the page.
Can anyone tell me if there is anything wrong with that, and why?
However, this comment is saying that nothing should be kept outside of the public folder that handles user input data.
The comment uses the word directly. Includes handle the data indirectly.
My thinking would be to have a .htaccess
Configuration is better handled in the main server configuration file if possible; .htaccess is marginally less efficient (and scatters configuration across your webroot).
route EVERYTHING to an index.php, and the index.php includes all the necessary files such as database connections and whatever else, and also
The front controller pattern is a perfectly reasonable approach.
includes the .php file which would have the HTML and PHP inside it for the main body content of the page.
Simply including that can start to create a bit of a mess. I suggest investigating the MVC pattern.
The comment you are referring to says that nothing that handles input or output directly should be outside the document root.
On the other hand, it's perfectly fine to place library code outside the root. If you use index.php as a single entry point to your application, pretty much the only things that should be web-accessible in addition to that script would be your assets (css, js, images, etc).