Preventing mass form submits by $_SESSION - php

I'm getting bombed by a .NET program written in C# - bombed as in the user is submitting my form's $_POST fields in mass quantities... It's specifically my contact form.
I'm not sure exactly how the mass $_POST is generated in the .NET program - it could even be a C++ program, I have no idea. However, I had an idea to counter it.
My first idea relies on $_SESSION, but... would the $_POST-bombing program(s) the user created handle/accept a $_SESSION? I really don't want to find out the hard way, but maybe someone with experience with the WebClient class in C# would know whether it handles sessions, or whatever it is the user is using. I was considering keeping a counter in $_SESSION['submitted'], something like:
if (!isset($_SESSION['submitted'])) {
    $_SESSION['submitted'] = 0;
}
if ($_SESSION['submitted'] > 5) {
    // display a captcha or block them from the site
} else {
    $_SESSION['submitted']++;
}
If the user's program doesn't handle a $_SESSION, is there any possible way I can block the site for them, so they can't attack my contact form?

Bypassing the session lockout is trivial for a malicious user. Just delete the session cookie after each POST and they get a brand new clean session with the limit reset.
The only secure way to block a user like this is to start throttling their IP address. Limit it to a certain number of connection attempts per minute and they won't be able to submit more than that many requests per minute. Now, if they can hop between hosts then you've got a bigger problem, and should probably look at moving your form elsewhere so all they get is a 404 (until they notice it's moved).
The downside is that if they're behind a common proxy, or something like AOL which proxies EVERYTHING, you'd be blocking other legitimate users as well.
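A minimal sketch of what such a throttle could look like, assuming the APCu extension is available (the key prefix and limits are made up for illustration; swap in Memcached, Redis or a flat file as needed):

<?php
// Hypothetical per-IP throttle: allow at most $limit submits per $window seconds.
function ip_is_throttled($limit = 5, $window = 60)
{
    $key   = 'contact_throttle_' . $_SERVER['REMOTE_ADDR'];
    $count = apcu_fetch($key, $found);

    if (!$found) {
        apcu_store($key, 1, $window); // first hit in this window, expires automatically
        return false;
    }

    apcu_inc($key);
    return $count >= $limit;
}

if (ip_is_throttled()) {
    http_response_code(429);
    exit('Too many submissions - please wait a minute and try again.');
}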

Well, you can put up some security barriers like a login form or some other form of authentication (a browser check or the like).
Aside from that, the only thing I can think of is to block the incoming IP, which then again might not be a good idea.

Someone can correct me if I'm wrong, but a session stores a cookie (or uses some other mechanism) containing just the session id (a long hash string); the session variables themselves are stored on the server. So even if these programs handled sessions, they would have no control over the server-stored session variables.
If I were you, I'd set CAPTCHA to always be on...

What about just implementing the CAPTCHA?

Related

What is the best way to secure an application with session vars?

I have an application with an index page, where the user can enter a login and a password. They are submitted as POST variables, and if they are both correct, my script creates a session var as follows:
$_SESSION['auth'] = $login;
and the index page echoes a menu. If they are not, the login window is displayed again, as if the user had come to the index page for the first time (with an error message under the required fields).
For now, I check whether the session variable is set:
<?php
include 'functions.php'; //this file contains session_start()
if (!isset($_SESSION['auth'])) {
header("Location: index.php");
exit();
}
?>
<html>
Html stuff with jquery...
</html>
Do I have to put all my HTML inside brackets within an else statement, or is my page secure enough as it is? Even if the user is redirected, the HTML part is still received by his browser, isn't it?
I have several PHP pages on my server, and at the beginning of each one I perform the same test as above. I also perform this check when the user comes to the index page (if he has already signed in, I don't want him to have to log in again).
Assuming my script is safe (all the information provided by the user is checked on the server side), what is the best way to work with secure sessions?
It's great that you are thinking about security, but allow me to explain a little about how sessions and session cookies work.
First of all, cookies can only hold a limited amount of data (I forget exactly how much, but it's not limitless). So how do servers let you store all that session data? The session cookie actually only stores the session id. On the server there is a physical file that contains the session data. When a client connects it sends the session id, which in turn tells the server which file to use. Therefore, the session is only as secure as the server's filesystem and the algorithm used to create the cookie's id. No actual session data (besides that id) ever leaves the server.
That said, there are two settings that might help keep sessions safe.
These are httponly and secure; you can read more about them here:
http://php.net/manual/en/session.configuration.php#ini.session.cookie-httponly
The short of it: secure means the session cookie (the session id) is only transmitted over HTTPS, which only works if you have HTTPS set up on your server and website. httponly tells the browser that the cookie should only be sent over HTTP (or HTTPS), not exposed to client-side scripting. However, as you have no control over what browser the client uses, or whether their machine has been compromised in a way that tells the browser otherwise (although I don't know of any exploits that do that), you really are just making a suggestion to the client machine.
Now, as far as the security of the actual data goes: any input from anywhere, even your own database, should always be treated as potentially unsafe. This doesn't mean you have to check it everywhere; mainly it means you escape HTML when outputting to the page, and you use proper means to prevent SQL injection in the database. As a general rule these should be done at the point of entry (i.e. where the data actually enters the page output or the query).
For example, when outputting content to a page that should not contain HTML, simply run it through htmlentities() (or htmlspecialchars()) at output time. When using data in SQL, use prepared statements there. You see, you don't need to check the $_POST data every time you touch it - just check the data before using it for something that could be exploited, such as saving it in the database.
Lets take a small example function ( assume its in a user class )
public function getUser( $id ) {
    // deliberately old-school and unsafe: $id goes straight into the SQL string
    $sql = "Select * from user where id = $id";
    // execute $sql ...
}
We would never do this with PDO, but let's assume this is old-school code. You filter the data at the login form, so when you write this function you assume $id is always a clean id, because you filtered it at the form. Then later you need a user from the session; you have an id there too, so you use it. Is that id from the session clean? Who knows. Maybe it's an id from a file, or some other obscure place - when we write the user class we have no idea where an id might come from. So what we do nowadays is validate it, or use a prepared statement, at the point we use the data. Then we don't care where the data came from, because it is always cleaned by the prepared statement just before we use it in the database. That way it is always, no question asked, clean, and we can see it right there where we run the query. We don't need to worry if we add a new login form later - maybe a pop-up login form somewhere? Doesn't matter, your SQL will always be safe. I'm not saying not to validate at the login form; it never hurts to be safe, and you might want to validate other things there anyway (email format, uniqueness of a username, etc.). But the critical point is covered.
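To make that concrete, here is a rough sketch of the same lookup done with a PDO prepared statement; the connection details and table/column names are assumptions:

<?php
// Assumed connection - adjust the DSN and credentials to your environment.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

function getUser(PDO $pdo, $id)
{
    // The value is bound at the point of use, so it no longer matters whether
    // $id came from a form, the session, a file or anywhere else.
    $stmt = $pdo->prepare('SELECT * FROM user WHERE id = ?');
    $stmt->execute(array($id));
    return $stmt->fetch(PDO::FETCH_ASSOC);
}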
Make sense?
To address your comment about not using cookies (I should have explained this at the beginning): by its very nature the web is stateless, which means every page load is essentially a new connection. The only way to maintain a login (or any data across reloads) is to pass some data between requests. There are generally only a few ways to do this:
$_POST, $_GET, $_FILES, $_COOKIE. Notice how they are formatted - that's a hint: they are called superglobals.
http://php.net/manual/en/language.variables.superglobals.php
By the way we have Netscape to thank for both Cookies and Javascript, both of which IE "borrowed", to me it's sad I don't see Netscape Navigator anymore. I remember that from the AOL 3.0 days, that was before you could embed images in email. You old timers know what I am talking about... Churr.Beep.bong.Cherrrk (various analog modem noises )... streaming video what's that, it's like 1.5 days to download a song. & Pr#y 2 teh GodZ of de inTerWeBz MomZ do4't g3t a [ call .... L33t BaBy >:-)~ ^v^

How to protect from malicious use of jQuery post handler?

I use jquery POST calls to fetch data to display in various sections of my websites.
Usually, they POST to a single 'ajax_handler.php' page which reads the requested parameters and returns the relevant data. Typical parameters might be 'get_order_details',123
How could I stop users posting to the script to try and retrieve data they should not be able to see? I know I can verify that the data belongs to the currently logged-in user, for instance, but how could I stop users 'guessing' that there might be a handler for 'get_some_info'?
Since users can even run javascript straight from the URL, this seems like a major security problem to me, as the client would have access to SESSION and COOKIE data (which I would otherwise use for security).
I guess I could start by naming each of my handler identifiers with a random string, but I'd prefer not to compromise the legibility of my code.
Naming your handlers with a random string is security through obscurity and while it might slow someone down, it won't stop them.
The best thing to do is to store a session or database checksum each time a page is accessed. Then also send that same checksum along with the POST content. When the form is submitted, through AJAX or otherwise, you compare the checksums. If they don't match then you know the user wasn't on the appropriate page and is trying to access the handler through some other method.
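A rough sketch of that checksum idea (in effect a per-session token); the field and session key names are made up, and it assumes a PHP version with random_bytes() and hash_equals():

<?php
session_start();

// When rendering the page, create a token once and embed it in the markup,
// e.g. in a hidden input named "token" that your jQuery call sends along.
if (empty($_SESSION['ajax_token'])) {
    $_SESSION['ajax_token'] = bin2hex(random_bytes(32));
}

// In ajax_handler.php, refuse the request unless the submitted token matches.
if (!isset($_POST['token'])
    || !hash_equals($_SESSION['ajax_token'], $_POST['token'])) {
    http_response_code(403);
    exit('Invalid request');
}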
For each user, you can store within your database which data he should be able to view and which he shouldn't. Each time you get a request, e.g. get_order_details, you should call a function that does your security checking, making sure both that the user is logged in and that he has access to the 'get_order_details' method or any other method he is trying to access.
What you're trying to do is fundamentally antithetical to how the Internet works. You can't and shouldn't attempt to limit the way users make requests to your services. This is an extremely outdated and backwards way of thinking. Instead of trying to limit the ways in which users can use your service, be thankful that they're using your service in the first place.
All you can do is make sure that the user is authenticated and has access to the record they're requesting. If you're using a system which has no authentication, and you want to prevent users from "guessing" the ID of the next record, don't use sequential IDs. Use randomly generated strings as your identifier. Make them sufficiently long that it will be difficult for users to stumble upon other records.
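For example, such an identifier could be generated like this, assuming random_bytes() is available (PHP 7+; older setups could use openssl_random_pseudo_bytes() instead):

<?php
// A long, non-sequential record identifier instead of an auto-increment ID.
$recordId = bin2hex(random_bytes(16)); // 32 hex characters, hard to guess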

How do I protect against ajax-spam in PHP?

Good day,
I would like to know how to protect my website from ajax spam. I'm looking to limit ajax actions per user - let's say 8 ajax actions per minute.
An example of an action would be: a button to add/remove a blog post "as my favorite".
Unless I'm wrong, I believe the best way would be to use $_SESSION variables, and to prevent someone/a bot from clearing cookies to get around my protection. I'm only allowing ajax functions for logged-on users.
Using the database would make my protection useless, because it's the unwanted database writes I'm trying to avoid.
I should mention that I use PHP as the server language and jQuery to make my ajax calls.
Thank you
Edit:
The sentence
... to protect my website ...
is confusing, but this is not about cross-domain ajax.
Edit 2011-04-20:
I added a bounty of 50 to it.
Since you're only allowing AJAX actions to logged in users, this is really simple to solve.
Keep a record of request timestamps for each account. You can do this in the database, leverage Memcached, or alternatively use a flat file.
Each time the user makes a request through your AJAX interface, add the current timestamp to that record, and:
Check whether the last eight timestamps all fall within the past minute; if they do, reject the request.
From there you can add additional magic, like tempbanning accounts that flagrantly violate the speed limit, or comparing the IPs of violators against blacklists of known spammers, et cetera.
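A loose sketch of those steps using a flat file per account (one of the storage options above); $_SESSION['user_id'] and the temp-file location are assumptions for illustration:

<?php
// Keep the last request timestamps per account and reject a 9th within a minute.
function too_many_requests($userId, $limit = 8, $window = 60)
{
    $file  = sys_get_temp_dir() . '/ajax_rate_' . (int) $userId;
    $times = is_file($file)
        ? array_filter(explode("\n", file_get_contents($file)))
        : array();

    // Drop timestamps older than the window, then record this request.
    $now  = time();
    $keep = array();
    foreach ($times as $t) {
        if (($now - (int) $t) < $window) {
            $keep[] = (int) $t;
        }
    }
    $keep[] = $now;
    file_put_contents($file, implode("\n", $keep), LOCK_EX);

    return count($keep) > $limit;
}

if (too_many_requests($_SESSION['user_id'])) {
    http_response_code(429);
    exit('Too many actions - try again in a minute.');
}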
Are you talking about specific ajax-spam to your site, or ajax-spam in general?
If the latter, you can use hashes to prevent auto-submitted forms, i.e. write a small one-way helper - call it make_hash() rather than hash(), since PHP already has a built-in hash() with a different signature - that takes a string and returns its sha1 checksum.
So that's how you use it:
// the page is a blog post #357
$id = 357;
$type = 'post';
$hash = make_hash($_SERVER['REMOTE_ADDR'] . $type . $id);
Put that hash in a hidden field that is not inside the comment form, or even in a hidden div somewhere at the bottom of the page, and name it "control_hash" or something. Attach its value to the ajax request on form submit. When the form is received by the script, build a new hash from the $_REQUEST data (excluding the existing $control_hash) and check whether they match.
If the form was submitted by a bot, it won't carry a valid $control_hash, so it won't pass.
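The receiving side would then look roughly like this (make_hash() is the small helper described above; the 'type' and 'id' field names are assumptions):

<?php
// The one-way helper mentioned above - here simply sha1() of the string.
function make_hash($data)
{
    return sha1($data);
}

// Rebuild the hash from the submitted fields and compare with the control hash.
$expected = make_hash($_SERVER['REMOTE_ADDR'] . $_REQUEST['type'] . $_REQUEST['id']);

if (!isset($_REQUEST['control_hash']) || $_REQUEST['control_hash'] !== $expected) {
    http_response_code(403);
    exit('Missing or invalid control hash.');
}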
Yes, your idea in principle is good. Some things to consider though:
If you track the limits globally then you may run into the issue of a bot DoSing your server and preventing legitimate users from being able to use your "Favourite" button.
If you track the requests based on their IP then someone could use a bot network (multiple IPs) to get around your blocking. Depending on your site and what your preference is, perhaps limit based on a subnet of the IP.
Install and use Memcache to store and track the requests, particularly if you are going to be tracking based on the IP. This should be faster than using session variables (someone might correct me on this).
If you have access to the source code of the website, you can rewrite some of the javascript code that actually performs the AJAX request. I.e. your pages can have a hidden counter field that is incremented every time a user clicks the button, plus a hidden time field on the page so you can rate the frequency of clicks.
The idea is that you don't even have to send anything to the server at all - just check it on the client side inside the script. Of course, that will not help against bots addressing the server directly.
It really depends on the impact of such spam. If you just want to avoid writing to your database, all these checks could end up taking more resources than actually writing to the database.
Does the end justify the means?
You also have to judge what's the probability of such a spam. Most bots are not very smart and will miserably fail when there's some logging involved.
Just my 2 cents, the other answers are perfectly valid to avoid spam.
Buy more powerful hosting to be able to serve the requests - don't limit them.
8 requests per minute is ridiculous.
Anyway, if the requests are 'legal', you should find ways to serve them, not ways to limit them. And if they are not 'legal', then deny them without any 'time' limitations.
You can use a session field holding the times of the last ajax requests. Since you want to allow 8 requests, make it an array of size 8 and check the time differences. If the rate goes over the limit (important), it might not always be a bot - give the user a chance with a captcha or something similar (a math problem, maybe?).
Once the captcha is validated, allow the next few posts, etc.
But do make sure that you are checking for that particular session and user.
Kerin's answer is good, I just wanted to emphasize the captcha.
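A loose sketch of that session-array-plus-captcha idea (the session keys are placeholders, and the captcha itself is only stubbed out):

<?php
session_start();

// Remember the timestamps of the last 8 ajax actions for this session.
$now   = time();
$times = isset($_SESSION['ajax_times']) ? $_SESSION['ajax_times'] : array();
$times[] = $now;
$times = array_slice($times, -8);
$_SESSION['ajax_times'] = $times;

// If 8 actions fit inside one minute, don't block outright - require a captcha first.
if (count($times) === 8 && ($now - $times[0]) < 60) {
    $_SESSION['needs_captcha'] = true; // render and verify the captcha elsewhere
}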
Yes, you need to call a check like this in every function the views can interact with; it should also live in a global library so you can use it anywhere.
if (is_logged_in())
{
    // do your code here
}
where is_logged_in() is defined as follows (inside your library class - $this->ci->session->userdata() is a CodeIgniter-style session call):
function is_logged_in($activated = TRUE)
{
    // STATUS_ACTIVATED / STATUS_NOT_ACTIVATED are constants you define yourself
    return $this->ci->session->userdata('status') === ($activated ? STATUS_ACTIVATED : STATUS_NOT_ACTIVATED);
}
You should set the status session value when the user logs in successfully.

Is it safer to transfer previous page location via GET method rather than HTTP_REFERER?

I want to store the location of the page the user came from (on my site). I want to do that for this example: say someone sent a comment without being logged in. "process_comment.php" will process it and then send a header('Location: ' . $_GET['prev_page']); Of course I'm going to filter $_GET before sending it.
Should I use a session instead?
Thanks!
It is actually exactly the same. Both methods mean the information is supplied by the client in the HTTP request, which can easily be forged. So you can't really trust one method more than the other.
That being said, as long as you don't rely on that information for something really important, you can assume the referer can be trusted, because it's a little bit more complex to forge than a querystring parameter - at least for the average user.
The best solution, if you need to trust that information for something important, would be to store it on the server, as a session variable for instance. Each page would store its URL, after checking what the previous value was.
If you use $_SESSION, there will be trouble if the user has multiple windows/tabs open and does different things at once. There is nothing more annoying than only being able to have one window of a site open.
You could store the value in a SESSION variable and identify it by a short key. That key goes into the GET string. That way you keep your URLs clean, and you don't risk hitting the URL length limits some servers and browsers impose on GET requests.
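A small sketch of that approach; the parameter name, key length and fallback URL are arbitrary:

<?php
session_start();

// On the page that links to the comment form: store the current URL under a short key.
$key = substr(md5(uniqid('', true)), 0, 8);
$_SESSION['return_to'][$key] = $_SERVER['REQUEST_URI'];
// ...and put ?r=<that key> in the link or a hidden field.

// In process_comment.php: resolve the key back to a URL, with a safe fallback.
$r    = isset($_GET['r']) ? $_GET['r'] : '';
$back = isset($_SESSION['return_to'][$r]) ? $_SESSION['return_to'][$r] : '/';
header('Location: ' . $back);
exit;

Because each tab gets its own key, this also sidesteps the multiple-windows problem mentioned above.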
Well, the HTTP_REFERER can be stripped out by some clients... I seem to remember some Norton Internet Security products did that, and probably others do too. So it is going to be more reliable for you to set the previous page in a session and use that for redirecting.
If you can use it, a session is the safer option. Redirecting the user based on GET data or even headers allows crafty people to abuse any flaws in your code to do nasty things.
The header itself may also be removed by some firewall software.
I don't think there is a problem with using GET in this case. You can't always depend on being able to retrieve the referrer from the browser.

How to re-initialize a session in PHP?

I am attempting to integrate an existing payment platform into my webshop. After a successful transaction, the payment platform sends a request to a URL in my application with the transaction ID included in the query parameters.
However, I need to do some post-processing, like sending an order confirmation, etc. In order to do this, I need access to the user's session, since a lot of order-related information is stored there. To do this, I include the session_id in the initial request XML and do the following after the transaction is complete:
$sessionId = 'foo'; // the sessionId is succesfully retrieved from the XML response
session_id($sessionId);
session_start();
The above code works fine, but $_SESSION is still empty. Am I overlooking something, or is this simply not possible?
EDIT:
Thanks for all the answers. The problem has not been solved yet. As said, the strange thing is that I can successfully start a new session using the session_id that belongs to the user who placed the order. Any other ideas?
Not really what you asked for, but don't you need to persist the order into the database before you send the customer to the payment service? It's better to rely on persisted data in your post-processing of the order when you receive the confirmation of the payment.
Relying on sessions is not reliable, since you have no idea how long this confirmation will take (usually it's instant, but in rare cases there will be a delay).
Also, if your webserver restarts during this time span, you will lose relevant data.
A third issue: if you have a load-balancing solution with individual session management (very common), you have no guarantee that the payment server and your client will reach the same webserver (since stickiness is usually source-IP based).
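A rough sketch of that "persist first, look up later" approach; $pdo, the pending_orders table and the variables holding the order data are assumptions:

<?php
// Before redirecting the customer to the payment platform: persist everything the
// callback will need, keyed by your own order reference ($orderRef, $email, $amount
// and $cart are assumed to exist in your checkout code).
$stmt = $pdo->prepare(
    'INSERT INTO pending_orders (order_ref, customer_email, amount, payload)
     VALUES (?, ?, ?, ?)'
);
$stmt->execute(array($orderRef, $email, $amount, json_encode($cart)));

// Later, in the callback the platform calls: look the order up again by the
// reference echoed back in the query string - no session required.
$stmt = $pdo->prepare('SELECT * FROM pending_orders WHERE order_ref = ?');
$stmt->execute(array(isset($_GET['order_ref']) ? $_GET['order_ref'] : ''));
$order = $stmt->fetch(PDO::FETCH_ASSOC);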
I will venture to guess that since domains are different from where the session is set to where you are trying to read it, php is playing it safe and not retrieving session data set by a different domain. It does so in an effort to preserve security in case somebody were to guess session ID and hijack the data.
A workaround for this, assuming the exchange happens on the same physical disk, is to temporarily write the order data to a serialized (and possibly encrypted, depending on whether or not the full credit card number is being tracked, which is a whole other story) file that is promptly removed once read by the receiving end.
In essence all that does is duplicate the functionality you are trying to get out of sessions, without the annoying security side-effects.
Many thanks for all the replies.
Smazurov's answer got me thinking and made me look over my PHP configuration once more.
PHP's default behaviour is not to encrypt the session-related data, which should make it possible to read out the session data after restarting an old session from another client. However, I use Suhosin to patch and prevent some security issues. Suhosin's default behaviour is to encrypt session data based on the User Agent, making it a lot harder to read out other people's sessions.
This was also the cause of my problems; disabling this behaviour has solved the issue.
Make sure you're closing the current session before you attempt to start the new one. So you should be doing:
$id = 'abc123';
session_write_close();
session_id($id);
session_start();
Dirty, but has worked for me:
Tell the payment gateway to use
http://yourdomain.com/callbackurl.php?PHPSESSID=SESSIONIDHERE
PHP uses that method of passing a session around itself if you set certain config vars (session.use_trans_sid), and it seems to work even if PHP has been told not to do that. It's certainly always worked for me.
Edit:
Your problem may be that you have session.auto_start set to true - so the session is starting automatically using whatever ID it generates, before your code runs.
How about doing it in another PHP page, and including it via an iframe / redirecting the user to that second page?
I'm not sure of the exact length of time between your transaction and your check, but it certainly seems that your session has expired. By default, sessions usually expire after around 24 minutes (session.gc_maxlifetime is 1440 seconds). This frees up stale session data and helps prevent potential session hijacking.
I'm not sure whether you have a custom session handler or whether it's stored in the database, but guessing from your posts and comments on this page I would assume it is stored in the default server-side session files.
Now, the solution to your problem would be to bite the bullet and store the necessary data in the database and access it via the session id, even if it means creating another table to sit alongside your orders table.
If, however, you are doing the action immediately, then the other explanation is that either the user logged out or performed an action which destroyed their session (removing the server-side session file).
You will see these session files in your server's /tmp folder; have a look for yours, it should be named 'sess_' followed by the session id.
