I have a basic file. The user should only be able to view the file after a successful form submission.
I know I have a variety of options, including placing it in a directory, modifying my .htaccess, and then using a parameter passed through a routing script to let the user view the file, as pointed out by numerous answers to somewhat similar questions (which would probably be the best option).
However, I'm just curious about something: "how secure" would the following be?
Both files reside in a directory called xyz
Directory public_html/xyz
Form.php
<form method="post" action="displayInfo.php">
displayInfo.php
Now what I would like to know is: if I set the following code at the start of displayInfo.php, would it stop unauthorized access (i.e. prevent the user from viewing the file if he/she did not successfully submit the form)?
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    die("first submit the form");
}
No, it wouldn't. I could defeat your security precaution with a simple cURL command:
curl -X POST https://example.com/displayInfo.php
The check for a POST request will pass, because it is indeed a POST request. However, it carries absolutely none of the data you wanted.
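A more robust pattern is to hand the form a one-time token stored in the session and require it on submission. A minimal sketch, assuming PHP 7+ for random_bytes() and keeping the file names from the question:

<?php
// Form.php - store a one-time token in the session and embed it in the form
session_start();
$_SESSION['form_token'] = bin2hex(random_bytes(16));
?>
<form method="post" action="displayInfo.php">
    <input type="hidden" name="token" value="<?php echo $_SESSION['form_token']; ?>">
    <input type="submit" value="Submit">
</form>

<?php
// displayInfo.php - reject any request that doesn't carry the session's token
session_start();
if ($_SERVER['REQUEST_METHOD'] !== 'POST'
    || !isset($_POST['token'], $_SESSION['form_token'])
    || !hash_equals($_SESSION['form_token'], $_POST['token'])) {
    die("first submit the form");
}
unset($_SESSION['form_token']); // the token is single-use

A forged cURL request now fails, because the attacker never received a token tied to a session.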
Related
I have a Linux system with doc root /var/www/html, in which I have an index.html file with a form asking for the user's name; a PHP script called from index.html then puts those details into a MySQL database.
When I open index.html in a browser, it presents me with the form to enter the user details, and after clicking submit, the PHP script is called (the browser URL changes to /localhost/insert.php) and it inserts the data into the database, which is fine.
The issue is that the backend PHP script is directly reachable at /localhost/insert.php, so if I (or someone) bypass index.html and open /localhost/insert.php directly, it runs anyway, putting vague data into the MySQL database.
Are there any fixes to avoid running the backend (server-side PHP) script directly from the browser? It should ONLY be allowed to run when called from index.html.
It is better to check the request method than to check whether the $_POST variable is set, as there are cases where a form is submitted correctly but $_POST is empty.
You can do this by the following:
if ($_SERVER['REQUEST_METHOD'] != 'POST') {
    header('Location: index.html');
    die();
}
Then you can validate that the required fields have been entered, and finally sanitise the data. If you are inserting the data into a database, be sure to use prepared statements (or at the very least escape your inputs with a real-escape-string function). Also make sure you prevent XSS by passing output through htmlspecialchars.
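For example, a minimal prepared-statement insert with PDO (the database name, credentials, and column are hypothetical):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');

// The placeholder keeps user input out of the SQL string entirely
$stmt = $pdo->prepare('INSERT INTO users (name) VALUES (:name)');
$stmt->execute([':name' => $_POST['name']]);

// When echoing the value back out, escape it to prevent XSS:
echo htmlspecialchars($_POST['name'], ENT_QUOTES, 'UTF-8');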
You need to guard the processing in insert.php by placing a check and executing it only if the request is coming from a POST:
<?php
if (!isset($_POST['formValue'])) {
    exit;
}
?>
formValue is the name of a field POSTed from your index.html form.
If you are handling things properly, using the POST method, then things should work out fine. It doesn't matter if the user tries to access your PHP script directly; it all depends on your request method. Say, for example, your form tag goes like this:
<form action="index.html" method="post">
And your submit button goes like this (note the name attribute; $_POST is keyed on name, not id):
<input type="submit" name="submit" id="submit">
Then in your index.html PHP script there should be something like this:
if (isset($_POST['submit'])) {
    // redirect the data to another PHP script; clean it there to prevent SQL injection!
} else {
    echo "invalid request";
}
I have a page with this code:
<?php
if ($_SERVER['REQUEST_METHOD'] !== 'GET') {
    header('HTTP/1.0 405 Method Not Allowed');
    exit();
}
If I access it with a form using the GET method, it works and doesn't exit, which is OK.
But when I try to access this file directly, I would expect the file to perform the exit -
unless a GET request is sent automatically whenever one accesses a file, even when not through a form?
unless a GET request is sent automatically whenever one accesses a file, even when not through a form?
It is.
When you follow a link or type a URL into the address bar, you are getting a resource (you aren't asking for just metadata (HEAD), or sending any kind of data (PUT, POST), or deleting something (DELETE)) so you use GET.
Unless you specifically issue a POST, PUT, HEAD, etc. request then it is GET. If you click a link in a page or use the URL bar in the browser it is GET.
We have a directory that is open to the web where we place utility scripts, some of them used for submitting Email, others for calling generic functions on our web service.
In my PHP error log, I am constantly getting notices and warnings that the data used in a script has an issue, like "undefined index" or "trying to get property of non-object".
Several of these scripts I know are no longer used, yet there are still entries in the log file from someone attempting to run them.
What can I do to prevent this from happening in my legitimate scripts? They need to be available to the web because they are called via AJAX from multiple pages.
Update ---
I figured out that the reason bots were even able to run them was that the directory didn't have protection from directory listings, meaning the bots read the listing and ran the scripts from there without really knowing what they did.
I added the option to prevent directory listings to my .htaccess and I am going to monitor things to see if it helps.
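For reference, the Apache directive that turns off directory listings in an .htaccess file is a single line:

Options -Indexes

With this in place, a request for the bare directory returns 403 Forbidden instead of a file listing.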
On another note, to all those suggesting blocking via IP or password-protecting the scripts:
After checking some log files, I found that filtering by IP will not work, because the scripts are called both from the server, in PHP scripts, AND via AJAX from the client. And protecting them with a password would mean modifying every place that calls the scripts to pass that password.
Hopefully my changes will help tremendously, but they may not stop bots that already know the scripts are there.
You could/should protect those scripts with IP restrictions or logins. Both can be done with .htaccess files. This is probably enough for simple utility scripts, but you should not use something like this for a complex, security-critical application.
Sample .htaccess file:
# BAN USER BY IP
<Limit GET POST>
order allow,deny
allow from all
deny from 1.2.3.4
</Limit>
# login
AuthName "Test"
AuthType Basic
AuthUserFile test/.htpasswd
require valid-user
Sample .htpasswd file
test:Qh8a4zM4Z/i1c
There are even generators for these files. A sample that Google found: http://www.toshop.com/htaccess-generator.cfm
Don't call PHP scripts directly, and don't make scripts that are directly callable. This is the end goal, though probably not something you can implement right now.
If you take an object-oriented approach, all your PHP files will contain just classes, which means that when you request a file directly, nothing happens.
Only one file will be an actual script, and that is your entry point.
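A minimal sketch of that layout (the file and class names are hypothetical): index.php is the only entry point, and every other file contains nothing but a class definition, so requesting it directly produces no output.

<?php
// index.php - the single entry point
require 'Mailer.php';

$mailer = new Mailer();
$mailer->send('user@example.com', 'Hello');

<?php
// Mailer.php - opening this file directly in the browser does nothing
class Mailer
{
    public function send($to, $message)
    {
        // actual mail() call, logging, etc.
    }
}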
You're getting these undefined index messages probably because you're not validating your input (or there is a bug).
It's common to see a script like:
if ($_GET["action"] === "edit") {
// edit
} else if ($_GET["action"] === "delete") {
// delete
}
You expect to call the script like action.php?action=edit, but what if someone calls it as just action.php? You will get an undefined index "action".
Add input validation like:
if (isset($_GET["action"]) === false) {
throw new Exception("Invalid input");
}
If a file is no longer used, delete it. If you don't want a file accessible from the web move it out of the webroot.
I run scripts via a cronjob and have them protected by a password I pass through the GET, like this:
$password = $_GET['password'];
if ($password == "somethingcool") {
    // the rest of your code here.
}
Then I call my script like this: script.php?password=somethingcool. If the password is incorrect, the script isn't executed.
There's a downside to this though: if it's called from a public page, make sure you use JavaScript variables to set the password, or a bot will simply follow the link in the source code.
PS: Make sure you filter $_GET['password']; the current example is not safe to use as-is.
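One way to make the same idea a little safer is a constant-time comparison with hash_equals() (PHP 5.6+) instead of ==, which also sidesteps PHP's loose-comparison quirks:

$password = isset($_GET['password']) ? (string) $_GET['password'] : '';
if (hash_equals('somethingcool', $password)) {
    // the rest of your code here.
} else {
    http_response_code(403);
    exit;
}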
I added the option to prevent directory listings to my .htaccess.
This brought the execution of the scripts by bots down to almost zero. I can live with the number I'm receiving now.
I am making an AJAX "like" function, but I have the problem that a bad user can change the value to any other current user. So, how can I prevent this?
$live = 'user1';
$fol = 'user2';
function ajax(like) {
    var data = 'like=' + like + '&CURRENTUSER=<?php echo $live; ?>&TOFOLLOW=<?php echo $fol; ?>';
    $.ajax({
        type: 'POST',
        url: 'ajax.php',
        data: data,
        success: function(e) {
            $('#success').html(e);
        }
    });
}
Also, I want to move this AJAX function into an ajax.js file, but I have a problem getting the values of $live and $fol, because echo $live doesn't work in a .js file.
So, is there any way to do this the way the AJAX functions of Facebook and Twitter work?
This solution works for the Apache web server. To have PHP interpret the JS file, add this line to your .htaccess file:
AddType application/x-httpd-php .js
And put your script inside ajax.js. Another way is to use a rewrite rule:
RewriteEngine On
RewriteRule ^ajax\.js$ ajax.js.php [L]
And put your scripts inside the ajax.js.php file. Of course, all of this is only needed if you want the URL to look like a JS file.
At the top of your ajax.js or ajax.js.php file, before any kind of output, put this:
header('Content-Type: application/javascript');
I have the problem that a bad user can change the value to any other current user. So, how can I prevent this?
You cannot prevent that on the client side at all.
HTTP is a stateless protocol, so each and every request that reaches your server is to be mistrusted, period.
You have to check server-side whether the requesting client is authorized to perform whatever action it wants to trigger, e.g. by checking the user id passed as the "current" user against the session where you stored your login information. (And once you have the id of the current user stored in the session, there is no need to send it from the client at all.)
This is one of the most basic security principles of any web application: don't trust any incoming request until you have verified that the client has the appropriate authorization. So asking how to "hide" the data sent from the client is the wrong question; that would be "security by obscurity", and it does not work.
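In practice that means ajax.php reads the acting user from the session instead of from the request. A minimal sketch (the session key name is an assumption):

<?php
// ajax.php - never trust CURRENTUSER coming from the client
session_start();
if (!isset($_SESSION['user_id'])) {
    http_response_code(401);
    exit('not logged in');
}
$currentUser = $_SESSION['user_id'];   // taken from the server-side session
$toFollow    = isset($_POST['TOFOLLOW']) ? $_POST['TOFOLLOW'] : null;
// ... verify that $currentUser may follow $toFollow, then perform the action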
There can be many solutions to such a problem:
1. Put the user to follow in the session before the page loads, so you don't even need to send that data in the AJAX request; you just confirm the action, and all data is taken from the session. Hence hackers can't modify the users. (This is how I solved the problem in my project.)
2. Build a pair of functions like encode() and decode(). Encode the data before you put it in the page, then at the PHP end use decode() to extract the info; if invalid data comes out, someone has tampered with it, and you don't execute the action. You have to write such encode() and decode() functions yourself (a sketch follows this list):
$live = encode('user1');
At the PHP end:
$real_live = decode($live);
3. When the PHP execution for the AJAX request starts, call a function like check_auth($user1, $user2); so even if someone sends bad data, your security rules can filter it out.
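For option 2, a minimal sketch of such an encode()/decode() pair, implemented here as an HMAC signature with a server-side secret (the secret and the value:signature format are assumptions):

define('SECRET', 'change-me');   // known only to the server

function encode($value) {
    return $value . ':' . hash_hmac('sha256', $value, SECRET);
}

function decode($encoded) {
    $parts = explode(':', $encoded, 2);
    if (count($parts) !== 2
        || !hash_equals(hash_hmac('sha256', $parts[0], SECRET), $parts[1])) {
        return false;            // missing or tampered signature
    }
    return $parts[0];
}

Note this only detects tampering; it does not replace the session check from the answer above.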
Hopefully you can use one of them.
My problem is simple: I need to upload a file directly to the correct server (the one that currently has a low workload).
Therefore I do:
<?php
$server = file_get_contents('http://my-api.com/upload-server.php'); // returns url
?>
Then I print my form like:
<form method="post" action="<?php echo $server; ?>"...
Now I would like to shift this step to when the upload starts, like so:
<form method="post" action="http://my-api.com/upload-gateway.php"...
This URL should redirect to the "real" server,
so that the upload page doesn't slow down loading and I have static HTML code that I can cache, embed, etc.
The problem is that this works perfectly fine with GET requests but not with POST requests.
The request seems to get transformed into a GET request when redirecting via the Location header; all POST data is lost.
Is this impossible, or am I doing it wrong? And yes, I considered a remote dynamic JavaScript that prints the HTML code with the correct server in the first place; I would rather not do that.
Any ideas? Maybe alternative upload techniques?
Edit:
This is the exact HTML code I use:
<form method="post" enctype="multipart/form-data" action="http://storage.ivana.2x.to/rest.php?action=target">
<input type="hidden" name="testfield" value="test">
File to upload: <input type="file" name="upfile"><br>
Notes about the file: <input type="text" name="note"><br>
<br>
<input type="submit" value="Press"> to upload the file!
</form>
This is the redirect code I use:
if($_GET["action"] == "target") {
header("Location: http://storage.ivana.2x.to/rest.php?action=test");
}
This is the output code I use to see the results:
if($_GET["action"] == "test") {
echo "<pre>";
var_dump($_POST);
var_dump($_GET);
var_dump($_FILES);
}
The result when uploading a small file looks like:
array(0) {
}
array(1) {
["action"]=>
string(4) "test"
}
array(0) {
}
If you really want to load balance through code while still caching the page that holds the upload form, first select a default upload server (URL); then, onSubmit, call the server to find the best upload target and adjust the form's action attribute accordingly.
With this method, users who do not activate JS still get what they want, users with JS enabled get the better upload target, and you can still cache. Additionally, the timing of the cache request could potentially be more opportunistic, since the URL request will occur very shortly before the actual upload.
The only hitch is the call to get the URL, which you can more easily performance-tune (I imagine) than the process you describe above. Uploading a file twice through a header directive and/or a cURL call doesn't seem like a good tradeoff for caching a single HTML file, IMO. But I don't know what you're up against, either.
If you don't want to heavily administer the server environment and introduce load balancing, this is the option I would suggest.
Note, also, I am not a server administrator by trade.
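A minimal sketch of that onSubmit approach, using jQuery like the snippets above (the upload-server.php endpoint comes from the question; the form markup and default server are assumptions):

<form id="upload" method="post" enctype="multipart/form-data"
      action="http://default.example.com/upload.php">
    <input type="file" name="upfile">
    <input type="submit" value="Upload">
</form>
<script>
$('#upload').on('submit', function () {
    var form = this;
    $.ajax({
        url: 'http://my-api.com/upload-server.php',
        async: false,              // must finish before the submit proceeds
        success: function (url) {
            form.action = url;     // point the form at the best server
        }
    });
    return true;                   // on failure, the default action still works
});
</script>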
You could try returning a 302 (Moved Temporarily); I'm not 100% sure that would let your browser POST the data to the changed URL, though, but it's worth checking out.
[Edit]
According to this Wikipedia article, the POST data would be converted to GET, which probably won't work for a file upload.
This is not a good solution. The file will be uploaded in its entirety before the PHP script is even started, which means that if you succeed with what you're trying to do, the client will have to upload the file twice!
I recommend that you figure out which server to send the request to when you're creating the form, so the action attribute in the form tag points directly to the less loaded machine.
Or even better: use a server-side load balancer and make your HTTP infrastructure transparent to the browser. This is essentially a sysadmin problem.
I might be wrong, but your form's action points to ?action=target, and in your rest.php you send a Location header to ?action=test; well, of course you won't find your $_POST or your $_FILES! A header() redirect does not forward those variables.
If you want to send your POST data and your file to a different location, you'll need to use the cURL library, but it won't be easy :)
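For completeness, a rough sketch of forwarding an upload from rest.php with cURL and CURLFile (PHP 5.5+; the target URL is an assumption, and the file still travels over the wire twice):

$ch = curl_init('http://other-server.example.com/rest.php?action=test');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'note'   => $_POST['note'],
    'upfile' => new CURLFile($_FILES['upfile']['tmp_name'],
                             $_FILES['upfile']['type'],
                             $_FILES['upfile']['name']),
));
$response = curl_exec($ch);
curl_close($ch);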
Good Luck