DDoS-vulnerable forms - PHP

I recently heard that a website can be DDoS'ed by attackers submitting forms with, say, 2 MB of data (in characters) many times very rapidly. Is this statement true? If so, is there anything I can do in PHP to prevent this, e.g. lower the form data limit?

DDoS'ing by submitting large amounts of data would require some vulnerability in data processing; otherwise it is not really plausible, since the attacker's upload speed becomes the limiting factor. It would be much easier to attack using Slowloris or ping floods.
On the data side, an outdated server can be DoS'ed through a parameter-parsing weakness known as the hash collision attack (not limited to PHP). It was discussed at the 28th Chaos Communication Congress, and the talk can be watched online.

There may be more than one thing going on in the background that could lead to this DoS. (I am assuming the OP means DoS rather than DDoS.) The possible reasons: the code spends too much time on a single request, there is too much logging (i.e. the server writes to the log every time it gets a bad request), or there is a plain old bottleneck on the server side. 2 MB is a moderate size for a POST request; if going below that doesn't affect your functionality, go ahead and reduce it.
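If you do decide to lower the limit, a minimal sketch (assuming PHP 7+; the 512 KB threshold is just an example) might reject oversized bodies before any processing, with post_max_size in php.ini acting as the hard backstop:

<?php
// Hypothetical early guard against oversized POST bodies.
// post_max_size in php.ini remains the hard limit; this check just
// rejects anything above our own, smaller threshold up front.
$maxBytes = 512 * 1024; // example limit: 512 KB

$contentLength = (int) ($_SERVER['CONTENT_LENGTH'] ?? 0);
if ($contentLength > $maxBytes) {
    http_response_code(413); // 413 Payload Too Large
    exit('Request body too large.');
}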
See my question here, which is along the same lines: https://serverfault.com/questions/413718/tomcat-denial-of-service-due-to-large-packets


Detect if there was an SQL injection attack by parsing access logs or HTTP params (PHP websites)

Like most programmers, I try to write my applications as securely as possible, but we know that does not guarantee 100% security. Therefore I think it is also appropriate to have methods to monitor whether we may be under attack. Hence this question.
(My websites are made with PHP and MySQL)
In the case of SQL injection I think this can be done in two ways, but if there are other ways I would also like to know about them.
Parsing access/error logs. Does anyone have, or know of, a script that adequately analyzes (Apache) access logs to detect possible attacks and automatically notifies the administrator with full details?
Analyzing HTTP params in real time. This would be a script that analyzes in real time the content passed via GET/POST and notifies the website administrator (e.g. by email).
For example, I do not know much about SQLi attacks, but I believe it is common for strings such as 'SELECT', 'UNION', ... (others?) to appear in query strings and params.
That way we can analyze the attack, see whether it succeeds, and take action accordingly.
Thanks for your attention!
Edit: simple bash script
I have made a simple system that analyzes the Apache access_log files and reports the results by email. It is detailed in this question:
Linux bash to iterate over apache access_log files and send mail
In addition, here is another one using AWK, the only resource I have found on the subject:
https://www.unix.com/shell-programming-and-scripting/248420-sql-injection-detection.html
(But I have not been able to make it run in my case.)
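For reference, here is a minimal PHP take on the same idea; the log path, pattern list, and admin address are all placeholders:

<?php
// Hypothetical sketch: scan an Apache access log for common SQLi
// keywords in requests and mail anything suspicious to the admin.
$logFile = '/var/log/apache2/access_log'; // placeholder path
$pattern = '/(union\s+select|information_schema|sleep\(|benchmark\()/i';

$hits = [];
foreach (new SplFileObject($logFile) as $line) {
    $line = (string) $line;
    if (preg_match($pattern, urldecode($line))) {
        $hits[] = trim($line);
    }
}

if ($hits) {
    mail('admin@example.com', 'Possible SQLi attempts', implode("\n", $hits));
}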
Oh boy.
Alright, where to start?
For starters, remember that bad hackers are usually financially motivated. You know your website has been injected if you wake up one morning to a red error message from Chrome or Firefox, and you open it anyway to find that your website is now among the more popular places to find free cruises and viagra online.
Sites that score well with SEO are more likely to be hacked than sites that do not. More users means more exposure. Password protected sites don't get hacked as often, but the password protection itself does not necessarily mean any added security. If you're vulnerable, you're vulnerable, and you need to be on top of it.
First and foremost, remember to filter your variables. Never trust anything that comes in from a browser. IT'S ALL SUSPECT. That means filtering everything in the superglobals: $_GET, $_POST, $_REQUEST, etc. I wouldn't even trust sessions, honestly. Filter it all. More on this can be found here: http://php.net/manual/en/function.filter-var.php
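A minimal sketch of what that filtering can look like with filter_input() (the field names here are just examples):

<?php
// Validate incoming superglobal values instead of trusting them.
$email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
$age   = filter_input(INPUT_GET, 'age', FILTER_VALIDATE_INT,
    ['options' => ['min_range' => 0, 'max_range' => 150]]);

if ($email === false || $email === null) {
    // false = failed validation, null = field not present
    exit('Invalid email.');
}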
Something else to think about is file uploading. Bad guys love uploading files and taking over your server. The most common method is exploit files disguised as images. You're going to want to resample every image that comes in. GD works, but I like Imagick better personally; it has more options. More on that here: http://php.net/manual/en/book.imagick.php You're also going to want to make sure that your site can't upload images, or any other type of file, from pages that you don't explicitly designate as form or upload pages. You would be shocked how often I see sites that can upload from the index; it's insane.
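A hedged sketch of the resampling idea with Imagick (the paths and field names are examples):

<?php
// Re-encode every uploaded image so any payload hidden in the
// original bytes is discarded along the way.
try {
    $img = new Imagick($_FILES['photo']['tmp_name']);
    $img->stripImage();           // drop EXIF/comments where junk can hide
    $img->setImageFormat('jpeg'); // force a known output format
    $img->writeImage('/uploads/' . uniqid('img_', true) . '.jpg');
} catch (ImagickException $e) {
    exit('Not a valid image.');
}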
Another method you can deploy is to use your php.ini to set a global include (auto_prepend_file) and inspect any file in the $_FILES array that comes in. Read the first million bytes of the file and scan them for PHP reserved words and Unix shell scripting. If you find any, kill the upload and exit or die, whatever you like to do there.
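A rough sketch of that scan (the signature list is illustrative, not exhaustive):

<?php
// Read the first million bytes of each upload and reject anything
// that looks like PHP or shell code.
foreach ($_FILES as $file) {
    $head = file_get_contents($file['tmp_name'], false, null, 0, 1000000);
    if ($head !== false && preg_match('/<\?php|<\?=|#!\s*\/bin\/(sh|bash)|eval\s*\(/i', $head)) {
        exit('Upload rejected.');
    }
}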
Apache has a setting for forensic logs. Forensic logs capture all GET and POST data, but the issue with them, and the reason they are not enabled by default, is that your logs get big, and quickly. You can read up on it here: https://httpd.apache.org/docs/2.4/mod/mod_log_forensic.html
Lastly, you're going to want to evaluate your site for injection vulnerabilities and cross-site scripting. Cross-site scripting isn't the issue it once was, given the way browsers are constructed these days. All those little details that make life harder for us as developers actually make us more secure.
But you do want to check for SQL vulnerabilities, especially if you're writing code from scratch. There are a couple reasonably solid plugins for Chrome that make pen testing a little easier.
Hackbar: https://chrome.google.com/webstore/detail/hackbar/ejljggkpbkchhfcplgpaegmbfhenekdc?utm_source=chrome-ntp-icon
HackTab:
https://chrome.google.com/webstore/detail/hack-tab-web-security-tes/nipgnhajbnocidffkedmkbclbihbalag?utm_source=chrome-ntp-icon
For Firefox, there's Scrippy:
https://addons.mozilla.org/en-US/firefox/addon/scrippy/?src=search
Hope that helps.
Good luck.
Therefore I think it is also appropriate to have methods to monitor whether we may be under attack.
The biggest waste of time ever.
ANY site gets "attacked" 100% of the time. There are freely available scripts that allow any stupid schoolboy to scan the whole internet, probing sites at random. You'll grow bored the very next day of scouring the logs from your detection system.
In your place I would invest in protection instead, and against other vectors than the ones you're thinking of. For example, all the recent break-ins I witnessed were performed by stealing FTP passwords stored on the webmaster's PC. And I can assure you that there are many more attack vectors than a blunt SQL injection, which is the simplest thing to protect against, with only two simple rules to follow (sketched in code after the list):
Any variable data literal (i.e. a string or a number) should be replaced with a parameter, with the actual value sent to the query separately, through the bind/execute process.
Any other query part that has to be added through a variable should be explicitly filtered through a hardcoded list of allowed values.
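A minimal PDO sketch of those two rules (the table and column names are examples):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Rule 1: every data literal goes in as a bound parameter.
$stmt = $pdo->prepare('SELECT * FROM users WHERE email = ?');
$stmt->execute([$_POST['email'] ?? '']);

// Rule 2: any other query part comes from a hardcoded whitelist.
$allowed = ['name', 'created_at'];
$orderBy = in_array($_GET['sort'] ?? '', $allowed, true) ? $_GET['sort'] : 'name';
$rows = $pdo->query("SELECT * FROM users ORDER BY `$orderBy`")->fetchAll();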

PHP: Abusing POST and GET

Concerning security: should I check, on every page, how many variables are sent and how large they are, and block GET if the page doesn't need it, for example?
I mean, maybe someone sends very, very large text many times as GET variables to overload my server.
Is that possible? What can I do about it?
With a GET request you can't send a huge amount of data (Apache's request-line limit defaults to 8190 bytes; check browser limitations here). And if you don't use the $_GET parameters anywhere, there will be almost no impact on the server. What matters here is requests per second, and a normal user will not generate lots of requests.
If you are looking for security holes, start with restrictions on executing uploaded files (like PHP code in image.jpg) and other insecure file access, XSS attacks, weak password generation, and so on.
There was a big problem with how POST/GET values were handled in most languages, including PHP, that could result in DoS attacks via specially crafted requests. It was first discussed in this talk (slides are available here).
You can also read about it here and here. The main idea is that POST/GET are arrays, and arrays are stored using hash tables. An attacker can cause a DoS by deliberately creating collisions (keys with the same hash value), which results in a lot of computation.
But this isn't something that should be handled at the application level, so you as a PHP coder do not have to worry about it much. The problem described above is an issue with how PHP handles hash tables, but you can also mitigate it by limiting the size of POST/GET requests in your PHP configuration.
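The relevant php.ini directives look roughly like this (the directive names are real; the values are only examples to tune):

; php.ini - example limits against oversized or pathological input
max_input_vars = 1000   ; caps how many GET/POST/COOKIE vars PHP will parse
post_max_size  = 2M     ; rejects request bodies larger than this
max_input_time = 60     ; seconds PHP may spend parsing request input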
If you are worried about DDoS, that also should not be handled by your application code but externally, e.g. by a firewall.
My answer somewhat links your question to your comment:
No, I'm worried about hackers
Security-wise, I think the first thing you should check and optimize is the site structure. The problem you mention is very specific and may help to a certain degree, but it probably won't be an attacker's primary vector.
You can always limit the size of GET requests (the default is somewhere around 8 KB for most servers) in the server configs. You may also create a custom 414 page explaining the reason for the shorter request length.
All in all, if it's security you're aiming for, I'd start elsewhere (the broader picture) and then slowly work my way in until I hit the core.

How to see the bandwidth usage of my Flash application?

I'm developing an online sudoku game, with ActionScript 3.
I made the game and asked people to test it. It works, but the website goes down constantly. I'm using 000webhost, and I suspect a bandwidth usage precaution is kicking in.
My application updates the current puzzle by parsing a JSON string every 2 seconds, and of course whenever a player enters a number it sends a GET request to update the MySQL database. Do you think this causes a lot of data traffic?
How can I see the bandwidth usage value?
And how should I decrease the data traffic between Flash and MySQL (or PHP, really)?
Thanks!
There isn't enough information for a straight answer, and even if there were, it'd probably take more time to figure out; but here are some directions you could look into.
Bandwidth may or may not be the issue. Many things could be happening: you may very well be putting too much strain on the HTTP server, running out of worker threads, having your MySQL tables locked up most of the time, etc.
What you're doing does sound like it puts a lot of strain on the server. Monitoring this client-side would be ineffective; you need to look at server-side values, but you generally don't have access to those unless you have at least a VPS.
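As a side note on trimming the 2-second polls themselves, one common approach (purely illustrative; the table and column names are invented) is to let the client send the version it already has, so the server can answer with a tiny "no change" payload:

<?php
// poll.php - respond with the full puzzle state only when it changed.
$clientVersion = (int) ($_GET['version'] ?? 0);

$pdo = new PDO('mysql:host=localhost;dbname=sudoku', 'user', 'pass');
$row = $pdo->query('SELECT version, state FROM puzzles WHERE id = 1')->fetch();

header('Content-Type: application/json');
if ($row && (int) $row['version'] > $clientVersion) {
    echo json_encode(['version' => (int) $row['version'], 'state' => $row['state']]);
} else {
    echo json_encode(['version' => $clientVersion]); // nothing new to send
}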
Transmitting data as JSON is easier to implement and debug, but a more efficient way to send data (binary instead of strings) is AMF: http://en.wikipedia.org/wiki/Action_Message_Format
One PHP implementation of the server-side part is AMFPHP: http://www.silexlabs.org/amfphp/
Alternatively, your example is a good use case for remote shared objects in Flash Media Server (or similar products). A remote shared object does exactly what you're doing with MySQL: it creates a common memory space where you can store whatever data, and it keeps that data synchronized with all the clients. Automagically. :)
You can start from here: http://livedocs.adobe.com/flashmediaserver/3.0/hpdocs/help.html?content=00000100.html

A form that is saved in real time. Practical?

I am trying to build a very user-friendly interface for my site. The standard right now is to use client-side as well as server-side validation for forms, right? I was wondering if I could just forgo client-side validation and rely on server-side only. Validation would be triggered on blur and would use Ajax.
To go one step further, I was also planning to save a particular field in the database once it has been validated as correct. Something like a real-time form update.
You see, I am totally new to programming, so I don't know whether this approach can work in practice. I mean, will there be speed or connection problems? Will it take a toll on the server in case of high traffic? Will the site slow down over HTTPS?
Are there any sites out there that have implemented this?
Also, the way I see it, I would need a separate PHP script for every field! Is there a shorter way?
What you want to do is very doable. In fact, this is the out-of-the-box functionality you would get if you were using JSF with a rich component framework like ICEfaces or PrimeFaces.
Like all web technology, being able to do it in one language means you can do it in others. I have written forms like you describe in PHP manually. It's a substantial amount of work, and when you're first getting started it will definitely be easiest with one script per field backing the form. As you get better, you will discover how you can include the field name in the request and get it down to one Ajax script per form, as sketched below. You can of course reduce the burden even further.
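A rough sketch of that single-script approach (assuming PHP 7.4+; the field names and rules are examples):

<?php
// validate.php - one Ajax endpoint for every field on the form.
$field = $_POST['field'] ?? '';
$value = $_POST['value'] ?? '';

$rules = [
    'email'    => fn($v) => filter_var($v, FILTER_VALIDATE_EMAIL) !== false,
    'username' => fn($v) => preg_match('/^[a-z0-9_]{3,20}$/i', $v) === 1,
];

header('Content-Type: application/json');
if (!isset($rules[$field])) {
    echo json_encode(['ok' => false, 'error' => 'unknown field']);
} else {
    echo json_encode(['ok' => (bool) $rules[$field]($value)]);
}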
PHP frameworks may be able to make this process less onerous, but I haven't used them and would recommend you avoid them initially until you get your bearings. The magic that a system like Cake or Rails provides is very helpful but you have to understand the tradeoffs and the underlying technology or it will be very hard to build robust systems atop their abstractions.
Calculating the server toll is not intuitive. On the one hand, handling large submissions is more work than handling smaller ones; on the other, you may be replacing one big request with several tiny ones for a net gain. It's going to depend on the kind of work you have to do with each form field. For example, autocompletion is much more expensive than checking whether a username is already taken, which is more expensive than (say) verifying that some string is actually a number, or some other obvious validation.
Since you don't want to repeat yourself, it's very tempting to put all your validation on one side or the other, but there are tradeoffs either way, and it is true that server-side validation is going to be slower than client-side. But the speed of client-side validation is no substitute, because relying on it alone introduces security problems. So my general approach is to do validation on the server side, and if I have time, I add it on the client side as well to improve responsiveness. (In fact, I start with validation in the database as much as possible, then in the server-side code, then client-side, because this way even if my app blows up I don't have invalid data sticking around to worry about.)
It used to be that you could expect your site to run about 1/3 as fast under SSL. I don't have up-to-date numbers, but it will always be more expensive than unencrypted; it's just plain more work. SSL setup is also not a great deal of fun. Most sites I've worked on either put the whole thing under SSL, or broke out a shopping-cart section that was encrypted and left the rest alone. I would not spend undue energy trying to optimize this: if you need encryption, use it and get on with your day.
At your stage of the game I would not lose too much sleep over performance. Since you're totally new, focus on the learning process, try to implement the features you think will be gratifying, and aim for improvement. It's easy to obsess about performance, but you're not going to have the kind of traffic that will squash you for a long time, unless half the planet wants to buy your product while your site is extremely heavy and your host extremely weak. When the traffic comes, profile your code, find where you are doing too much work, and fix that; you will get much further than if you try to design a performant system up front. You just don't have enough data yet to do that. Most servers these days are well equipped to handle fairly heavy load; you're probably not going to sustain hundreds of visitors per second in the near future, and it will take a lot more than that to bring down a $20 VPS running a fairly simple PHP site. Consider that one visitor per second works out to about 86,000 hits a day, so you'd need over 8 million hits a day to reach 100/second. You're not going to need a whole second to render a page unless you've done something stupid. Which we all do, a few times, when we're learning. :)
Good luck on your journey!

Should I use sleep() or just deny them?

I'm implementing a delay system so that any IP I deem abusive will automatically get an incremental delay via sleep().
My question is: will this result in added CPU usage and thus kill my site anyway if the attacker just keeps opening new instances while being delayed? Or does sleep() use minimal CPU/memory and place little burden on a small script? I don't wish to flat-out deny them, as I'd rather they not notice the limit in an obvious way, but I'm willing to hear why I should.
[Please no discussion of why I'm deeming an IP abusive on a small site; here's why: I recently built a script that cURLs a page and returns information to the user, and I noticed a few IPs spamming my stupid little script. cURLing too often sometimes renders my results unobtainable from the server I'm polling, and legitimate users get screwed out of their results.]
sleep() does not use any CPU or memory beyond what is already used by the process accepting the call.
The problem you will face with implementing sleep() is that you will eventually run out of file descriptors while the attacker sits around waiting for your sleeps to time out, and then your site will appear to be down to anyone else who tries to connect.
This is a classic DDoS scenario: the attackers do not actually try to break into your machine (they may also try that, but that is a different story); instead, they try to harm your site by using up every resource you have, be it bandwidth, file descriptors, threads for processing, etc. When one of your resources is exhausted, your site appears to be down even though your server is not.
The only real defense here is to either not accept the calls at all, or to have a dynamic firewall configuration that filters them out, or a router/firewall box that does the same off your server.
I think the issue with this is that you could potentially have a LARGE number of sleeping threads lying around the system. If you detect abuse, immediately send back an error and be done with it.
My worry with your method is repeat abusers whose timeout grows to several hours. Their threads would stick around for a long time even though they aren't using the CPU. There are other resources to keep in mind besides just CPU.
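A minimal sketch of the deny-immediately alternative (the IP list is a stand-in for whatever store tracks your abusers):

<?php
// Deny flagged IPs up front instead of parking a worker in sleep().
$abusive = ['203.0.113.7']; // placeholder; in practice a DB or cache lookup

if (in_array($_SERVER['REMOTE_ADDR'], $abusive, true)) {
    http_response_code(429); // Too Many Requests
    header('Retry-After: 60');
    exit;
}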
sleep() is a function that "blocks" execution for a specific amount of time. It isn't the equivalent of:
while ($x < 1000000);
which would cause 100% CPU usage. It simply puts the process into a "blocked" state in the operating system and then puts it back into the "ready" state once the timer is up.
Keep in mind, though, that PHP has a default 30-second execution timeout. I'm not sure whether sleep() counts toward it (I would doubt it, since it's a system call rather than script execution).
Your host may not like you having so many "Blocked" processes, so be careful of that.
EDIT: According to Does sleep time count for execution time limit?, it appears that sleep() does not count toward "max_execution_time" under Linux, as I expected. Apparently it does under Windows.
If you are doing what I also tried, I think you're going to be in the clear.
My authentication script built out something similar to Atwood's hellbanning idea. Session IDs were kept in RAM and rotated on every page call. If conditions weren't met, I flagged that particular session with a demerit. After three demerits, I began adding sleep() calls to its executions. The limit was variable, but I settled on 3 seconds as a happy number.
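A rough sketch of that demerit flow (request_looks_legitimate() is a placeholder for the actual checks):

<?php
session_start();

$_SESSION['demerits'] = $_SESSION['demerits'] ?? 0;

if (!request_looks_legitimate()) { // placeholder for your own conditions
    $_SESSION['demerits']++;
}

if ($_SESSION['demerits'] >= 3) {
    sleep(3); // the "happy number" above; note each such request idles a worker
}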
With authentication, the attacker relies on making a certain number of attempts per second for the attack to be worth their while. If that is their focal point, introducing sleep makes the system look slower than it really is, which in my opinion makes it a less desirable target.
If you slow them down instead of flat-out telling them no, you stand a slightly better chance of looking less attractive.
That said, it is security through a "type" of obfuscation, so you can't rely on it too terribly much. It's just another factor in my overall recipe :)
