PHP: Is php_sapi_name() safe (can the user manipulate it)?

Can the user manipulate the value which is returned by php_sapi_name()?
I have a script which looks like this:
if (php_sapi_name() !== "cli") {
    die("NoAccess");
}
// Do some admin stuff
This script should only (!) be called through the command line. Is the code above safe? Or can somebody call the script over HTTP and execute what lies beyond the if condition?

php_sapi_name()'s return value is safe to rely on. It's not generated from user data.
You shouldn't have this script accessible to your web server at all if you don't want it to be called from your web server. If safety matters, the file belongs outside the document root where no request can reach it.
You also mentioned .htaccess. Don't use that; put the directive in a proper config file elsewhere. .htaccess has to be loaded and parsed on every request, which is inefficient.
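For completeness, a minimal sketch of such a guard using the PHP_SAPI constant, which holds the same value php_sapi_name() returns; keeping the file outside the document root remains the real fix:
<?php
// Minimal sketch of a CLI-only guard. PHP_SAPI is set by PHP itself,
// not derived from request data, so a web client cannot influence it.
if (PHP_SAPI !== 'cli') {
    header('HTTP/1.0 403 Forbidden');
    exit('NoAccess');
}

// Do some admin stuff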

Related

PHP detect if connection is browser or script

I am trying to prevent users from connecting to certain pages with a script. Is there any method I can use to detect whether a connection to a specific web page comes from a client browser versus an automated script?
I know certain headers can be spoofed, but is there another mechanism I can use, say, checking whether session_start() or setcookie() succeeds? Do those return true/false depending on whether they were able to run?
Something like:
$sessionID = session_id();
$isSet = setcookie('cookieName', $sessionID, [ .... ]);
if ($isSet == false) {
    // ... do something to kill the session
    // ... or do something to redirect
}
Is this even possible? And even if it is, I know this probably isn't reliable, but what would be a better or more reliable method?
And to clarify: detect whether it's a script and, if so, kill it before even serving the rest of the HTML page.
If you are trying to prevent pages from being called entirely, you can reliably do this with a combination of an .htaccess file and a PHP "check" file. This will check whether the requested file was called from your scripts or from an outside source. Make a directory and put your "hidden" script files in it, along with the following two files:
.htaccess
php_value auto_prepend_file check.php
check.php
<?php
if (empty($_SERVER["HTTP_X_REQUESTED_WITH"])) {
    header('Location: /page_404.php'); // Or whatever you want it to do.
    exit;
}
All the .htaccess directive does is make check.php run before every script call, so technically you COULD just include check.php at the top of every file, but I find this a more complete, elegant solution.
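For what it's worth, a server-to-server caller can pass this check simply by sending the header itself, which is also why the check only stops naive bots. A sketch with cURL (the URL is hypothetical):
<?php
// Sketch: a legitimate server-side caller sets X-Requested-With
// explicitly, just like a browser's XMLHttpRequest would. Any client
// can do the same, so treat the check as advisory, not security.
$ch = curl_init('https://example.com/hidden/util.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['X-Requested-With: XMLHttpRequest']);
$response = curl_exec($ch);
curl_close($ch);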
You can check with php_sapi_name() whether you are running on the CLI.
This example will only allow the script to run from the CLI.
// PHP_SAPI holds the same value php_sapi_name() returns
if (PHP_SAPI !== 'cli') {
    die('CLI only');
}
You can reverse the condition to make it run only for a web server.
if (PHP_SAPI === 'cli') {
    die('Web server only');
}
You can do it with $_SERVER['HTTP_REFERER'], but bear in mind that header can be faked.
<?php
if (isset($_SERVER['HTTP_REFERER']) && strtolower(parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST)) === 'example.com') {
    // your code
} else {
    die('Bots are not allowed!');
}
You can use the User-Agent header
(you can see how to get it here: How to get user agent in PHP).
This tells you the user's web browser, which, I assume, will be different for 'scripts'.
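For illustration, a minimal sketch of such a User-Agent check; the bot list is made up, and since the header is trivially spoofed this is a speed bump, not security:
<?php
// Sketch: inspect the User-Agent and reject a few known script
// signatures. The list here is illustrative only.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach (array('curl', 'wget', 'python-requests') as $bot) {
    if (stripos($ua, $bot) !== false) {
        header('HTTP/1.0 403 Forbidden');
        exit('Bots are not allowed!');
    }
}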

Is using php_sapi_name() to detect CLI reliable? [duplicate]


Preventing bots from running PHP scripts

We have a directory that is open to the web where we place utility scripts, some of them used for submitting email, others for calling generic functions on our web service.
In my PHP error log, I am constantly getting notices and warnings that the data used in a script has an issue, like "undefined index" or "trying to get property of non-object".
Several of these scripts, I know, are not being used anymore, yet there are still entries in the log file from someone attempting to run them.
What can I do to prevent this from happening in my legitimate scripts? They need to be available to the web because they're called via AJAX from multiple pages.
Update ---
I figured out that the reason bots were even able to run them was that the directory wasn't protected against directory listings, meaning the bots had read the listing and run the scripts from there without really knowing what they did.
I added the option to prevent directory listings to my .htaccess and I am going to monitor things to see if it helps.
On another note, to all those suggesting blocking via IP or password-protecting the scripts:
After checking some log files, checking the IP will not work because the scripts are called both from the server, in PHP scripts, AND via AJAX from the client. And protecting them with a password means I'd have to modify every place that calls the scripts to pass that password.
Hopefully my changes will help tremendously, but they may not stop bots that already know the scripts are there.
You could/should protect those scripts with IP restrictions or logins. Both can be done with .htaccess files. This is probably enough for simple utility scripts, but you should not rely on it for a complex, security-sensitive application.
Sample .htaccess file:
# BAN USER BY IP
<Limit GET POST>
order allow,deny
allow from all
deny from 1.2.3.4
</Limit>
# login
AuthName "Test"
AuthType Basic
AuthUserFile test/.htpasswd
require valid-user
Sample .htpasswd file
test:Qh8a4zM4Z/i1c
There are even generators for these files.
Some sample that Google found: http://www.toshop.com/htaccess-generator.cfm
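If editing Apache config or .htaccess isn't an option, a rough PHP-side equivalent of the IP restriction might look like this (the addresses are illustrative):
<?php
// Sketch: allow only listed client addresses. The IPs here are
// placeholders; fill in the machines that legitimately call the script.
$allowed = array('127.0.0.1', '203.0.113.10');
$client  = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
if (!in_array($client, $allowed, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Forbidden');
}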
Don't call PHP scripts directly, and don't make scripts that are directly callable. This is the end goal; probably not something you can implement right now.
If you take an Object Oriented approach all your PHP files will contain just classes. This means that when you run a file nothing happens.
Only 1 file will be an actual script and that is your entry point.
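A minimal sketch of that single-entry-point idea; the file layout and the Mailer class are hypothetical:
<?php
// index.php -- sketch of a single entry point. Every other file holds
// only class definitions, so requesting one of them directly does nothing.
require __DIR__ . '/src/Mailer.php'; // hypothetical class file

$action = isset($_GET['action']) ? $_GET['action'] : '';
switch ($action) {
    case 'send':
        $mailer = new Mailer(); // hypothetical class
        $mailer->send();
        break;
    default:
        header('HTTP/1.0 400 Bad Request');
        echo 'Unknown action';
}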
You're getting these undefined index messages probably because you're not validating your input (or there is a bug).
It's common to see a script like:
if ($_GET["action"] === "edit") {
// edit
} else if ($_GET["action"] === "delete") {
// delete
}
You expect to call the script like action.php?action=edit, but what if it is called as just action.php? You will get an undefined index "action" notice.
Add input validation like:
if (isset($_GET["action"]) === false) {
throw new Exception("Invalid input");
}
If a file is no longer used, delete it. If you don't want a file accessible from the web move it out of the webroot.
I run scripts via a cronjob and have them protected by a password I pass through the GET, like this:
$password = $_GET['password'];
if ($password == "somethingcool") {
    // the rest of your code here.
}
Then I call my script like this: script.php?password=somethingcool. If the password is incorrect, the script isn't executed.
There's a downside to this though: if it's called from a public page, make sure you use JavaScript variables to set the password, or a bot will simply follow the link in the source code.
PS: Make sure you filter $_GET['password'], this current example is not safe to use.
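Following up on that PS, a slightly safer sketch would compare with hash_equals() (timing-safe) instead of a loose ==, and ideally load the secret from a config file outside the webroot:
<?php
// Sketch: timing-safe password check. The secret is hard-coded here
// only for brevity; in practice keep it outside the document root.
$given = isset($_GET['password']) ? (string) $_GET['password'] : '';
if (!hash_equals('somethingcool', $given)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
// the rest of your code here.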
I added the option to prevent directory listings to my .htaccess.
This brought the execution of the scripts by bots down to almost zero. I can live with the number I'm receiving now.

securing PHP page with GET var

I have a page that I want to execute via cron. It just does some pretty simple archiving stuff, nothing super high-security, no DB access etc.
Is it a secure practice to simply require a GET var to be present in order to execute the page? So myarchiver.php would be something like:
<?php
$mysecret_word = "abc123";
if ($_GET['secret'] == $mysecret_word) {
    // execute my stuff here
}
Then you'd just call myarchiver.php?secret=abc123 in the crontab and the process would run, while any wrong answer or attempt to execute the page with no secret would simply present a blank page (with no extra server load).
I realize this is not "secure" against man-in-the-middle attacks or a compromised site, but I believe in general it's plenty secure to keep this script from being fired by random script kiddies and other idiots who may somehow know about its existence. The thing I'm guarding against is random malicious users who may know about this script bombarding it with requests in order to DoS the site or tie up resources.
EDIT TO ADD: the server is not accessible via SSH and the cron is being executed on a remote machine, so it must be done via an HTTP request.
Thanks for input.
If this script is never going to be run from the browser, you should place the file outside of your web root directory where browsers cannot reach it and just have your cron run the script at the alternate location. It would definitely be the most secure way to do it.
If you're on a shared hosting environment, you may need browser access for manual running. I would just use SSH to manually run the file from its location since it only takes me a couple seconds to login to SSH and get to the directory. Personally, I just prefer not to have excess pages laying around on my website.
First off, why not just check the IP address of the server making the request?
If it has to be done via an HTTP request and simply checking the IP address isn't an option, you can have your cron run a script similar to "runcron.php". That script would in turn make a CURL or WGET request to the actual cron script you want to run.
That would allow you to pass a dynamic hash instead of a static key. That would prevent someone from just repeating the HTTP request if they happen to sniff the traffic. For the hash you could use anything dynamic like the date combined with a salt.
Example:
if (md5('secretword' . date('H')) === $_GET['hash']) {
    // do cron
}
That would at least rotate your key once an hour.
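A sketch of what the caller side might look like, assuming a hypothetical cron.php URL; note both machines need roughly synchronized clocks, since the hash changes on the hour:
<?php
// runcron.php -- sketch of the caller: compute the same rotating hash
// and request the real cron URL with it. URL and secret are placeholders.
$hash = md5('secretword' . date('H'));
$ch = curl_init('https://example.com/cron.php?hash=' . $hash);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);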
Also, crontab won't let you pass GET variables. You'll have to do this -
/usr/bin/php /home/blah.php hello
Then in the script -
$argv = $_SERVER['argv'];
echo $argv[1]; // prints "hello" ($argv[0] is the script path)
Someone correct me if I'm mistaken.
This is a technique that Facebook uses on their logout.php file, so that if someone sends a link to logout.php it won't log them out. I would recommend doing this.
$mysecret_word = "abc123";
if (isset($_GET['asd2edxc32cwqcxad']) && $_GET['asd2edxc32cwqcxad'] === $mysecret_word) {
    // execute my stuff here
} else {
    error_log('oops');
    header('HTTP/1.0 404 Not Found');
    die();
}

PHP: invoking remote server from my php server?

How can I invoke a PHP script on a remote server from my server-side code?
I'm currently using:
header('Location: http://www.url.com/script.php?arg1=blabla');
in my code, but it doesn't work.
thanks
If by invoking you just mean "calling" it, so you only need it to run, then you can use cURL.
If by invoking you mean you want it to act the same as include, then you can't through HTTP (the server does, of course, not return code; it runs it). You might be able to obtain the file through other means (FTP?) and then include it, but that seems like a bit of a hack.
If by invoking you mean you want to redirect the user to the page, then this should work:
header('Location: http://www.site.nl/');
exit;
(your script continues to run after a header() call, so you may need that exit). In what way doesn't your code work for you? (I'm guessing because you want one of the other options.)
If you only want to invoke the script, you can simply use $result = file_get_contents('http://www.example.com/');.
Your version using header() will, as said above, redirect the user.
Use cURL; it gives you much wider manipulation options.
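A minimal cURL sketch for this case, using the URL from the question:
<?php
// Sketch: invoke the remote script server-to-server and read its output.
$ch = curl_init('http://www.url.com/script.php?arg1=blabla');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever
$result = curl_exec($ch);
if ($result === false) {
    echo 'Request failed: ' . curl_error($ch);
}
curl_close($ch);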
