Laravel rate limiting API authenticated with API Token - php

I am building a RESTful API for users of my Laravel application to retrieve their data.
The current plan is that they can generate an API Token within the application to then authenticate their API requests. I do not know from where they will be making the requests.
The main reason I want to implement rate limiting is to reduce the impact of accidental/intentional DDoS, and it is also needed to enforce limits that are part of each user's current subscription package. Because of the latter, different users may have different rates.
Laravel already provides a rate limiter built in, including access to dynamic user limits specified in the User table.
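For reference, the built-in mechanism I mean looks roughly like this (a sketch assuming a hypothetical rate_limit column on the users table, as the docs describe for dynamic rate limits):

    // routes/api.php - sketch only; 'rate_limit' is assumed to be a column on the users table
    Route::middleware(['auth:api', 'throttle:rate_limit,1'])->group(function () {
        Route::get('/data', 'Api\DataController@index');
    });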
I'm wondering, though, how the session is handled. From what I can see, the Laravel TokenGuard class does not store the user between requests. Therefore the user is retrieved from the database on every request, even just to look up the rate limit. This seems to defeat the point of the rate limiter if we are still making database queries each time.
What is the appropriate way to handle this?
If I write my own authentication middleware, and store the user in the session, would that work? Do requests sent from another server (not a browser) even handle sessions?
Thanks.

Every time anyone accesses your site, you are spinning up an entire Laravel instance, which already puts load on your server. A DDoS doesn't depend on stressing your DB alone. If someone is determined to DDoS you, you are going to notice! All you can do is mitigate the problem, so don't worry too much that each request has an associated DB call.
You could have a local session, but in the long run this is a bad design decision, since it introduces state to your server, which will make scaling in the future much harder. (See https://12factor.net/ for more on that.) This is why Laravel uses the user stored in the DB instead.
Unless you are doing something pretty special, it's generally safe to assume that Laravel is using an adequate solution. They do frameworks so that you can worry about business logic!
Finally, there are many websites out there. Chances are that by the time you're big enough to attract the attention of people trying to DDoS you (remember, it takes resources and therefore money), you'll probably be using a much more sophisticated system.

If a request with some kind of token reaches your application, you should not need any kind of session. As you've assumed, a session is usually handled through cookies, but raw HTTP calls (like those made with cURL) usually don't use them.
Don't overestimate the cost of getting the current user from the database - if your application performs any further actions, those additional actions will make the difference! Getting one entity from the database is fairly cheap compared to everything else, and to check the proper permissions and rate limits it is obviously necessary.
Beyond that, it sounds like you're looking for something like Laravel Passport (see https://laravel.com/docs/5.7/passport). Additional tools like the Throttle package (see https://github.com/GrahamCampbell/Laravel-Throttle) will help you enable rate limiting for your routes.

Related

Symfony2 storing data in session versus database calls every page load

I have a site built on Symfony 2 that is basically made up of various applications. Once an application is selected, I store that application's ID in a session variable. Then for every page load for that application, the database is queried for the details of that application.
Wouldn't it be more efficient to just store the application details in the session variable instead of just the application ID?
What are the downsides of storing the application details in that way, and are there any security risks I need to worry about?
Thanks a lot.
I do not recommend storing the app ID in the session at all. You deprive yourself of the use of shared HTTP caches with that approach (see http://symfony.com/doc/current/book/http_cache.html#public-vs-private-responses), as all your responses become private because they depend on a session value.
If you move the app ID to the URL, a header, etc., you gain a lot of room for optimization.
Using the DB to get the app info is quite good practice, as you can enable Doctrine's result cache for these queries so that they barely impact the app's performance at all: http://doctrine-orm.readthedocs.org/en/latest/reference/caching.html#result-cache
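A rough sketch using the older Doctrine API from the linked docs (the entity and cache key names are made up, and a result cache driver is assumed to be configured):

    // Cache the application-details query so repeated page loads don't hit the database.
    $application = $em->createQuery(
            'SELECT a FROM AppBundle\Entity\Application a WHERE a.id = :id'
        )
        ->setParameter('id', $appId)
        ->useResultCache(true, 3600, 'app_details_' . $appId) // keep the result for one hour
        ->getOneOrNullResult();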
Using the session to store the app_id is bad practice, but using the session to store all the app info is even worse: the number of session_ids is significant, and you would store a lot of redundant information.

PHP Web Service optimisations and testing methods

I'm working on a web service in PHP which accesses an MSSQL database and have a few questions about handling large amounts of requests.
I don't actually know what constitutes 'high traffic', and I don't know if my service will ever experience 'high traffic', but would optimisations in this area come down largely to server processing speed and database access speed?
Currently when a request is sent to the server I do the following:
Open database connection
Process Request
Return data
Is there any way I can 'cache' this database connection across multiple requests? As long as each request is processed simultaneously, the connection would remain valid.
Can I store a user session ID and limit the number of requests per hour from a particular session?
How can I create 'dummy' clients to send requests to the web server? I guess I could just spam requests in a for loop or something? Are there better methods?
Thanks for any advice
You never know when high traffic will occur. High traffic might result from your search engine ranking, a blog writing a post about your web service, or any other unforeseen random event. You had better prepare yourself to scale up. By scaling up, I don't primarily mean adding more processing power, but firstly optimizing your code. Common performance problems are:
unoptimized SQL queries (do you really need all the data you actually fetch?)
too many SQL queries (try to never execute queries in a loop)
unoptimized databases (check your indexing)
transaction safety (are your transactions fast? keep in mind that all incoming requests need to be synchronized when calling database transactions. If you have many requests, this can easily lead to a slow service.)
unnecessary database calls (if your access is read only, try to cache the information)
unnecessary data in your frontend (does the user really need all the data you provide? does your service provide more data than your frontend uses?)
Of course you can cache. You should indeed cache read-only data that does not change on every request. There is a useful blog post on PHP caching techniques. You might also want to consider the caching package of your framework of choice, or use a standalone PHP caching library.
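For instance, a read-mostly query result can be kept in a local cache such as APCu (a sketch; the table and key names are made up):

    // Cache a read-only query result for five minutes instead of hitting the DB on every request.
    function getProductList(PDO $pdo): array
    {
        $cached = apcu_fetch('product_list', $hit);
        if ($hit) {
            return $cached;
        }

        $rows = $pdo->query('SELECT id, name FROM products')->fetchAll(PDO::FETCH_ASSOC);
        apcu_store('product_list', $rows, 300);
        return $rows;
    }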
You can limit service usage, but I would not recommend doing so by session ID, IP address, etc. It is very easy to renew these, and then your protection fails. If you have authenticated users, you can limit requests on a per-account basis, like Google does (using an API key per user for all their publicly available services).
To do HTTP load and performance testing, you might want to consider a tool like Siege, which does exactly what you expect.
I hope to have answered all your questions.

php session management in socket services

I'm considering building a security service in PHP that would hold user credential information, the most important of which would be the tokens of logged-in users. This service would be accessed by some kind of an API (REST, SOAP, whatever) by another API (an external user connects through a website API, which checks credentials against another API - the one we're considering now).
One possibility is to store the tokens (and other information) in an RDBMS. But this solution doesn't seem clean to me (tokens would remain in the database even after they have expired, I would have to implement a mechanism for clearing expired sessions, etc.). I was thinking about using native PHP session management ($_SESSION). Is that possible? Does anyone have experience with doing such things?
I thought of following problems:
When a PHP-based website is deployed on a web server, users access the URL via a browser and their native sessions are created using browser cookies. If there were one website API connecting to the security API, would there be only one session object the whole time? Is that configurable?
How precisely are sessions created, and how can I affect the mechanism (e.g. not base it on cookies)?
My advice would be to use a database.
Let me start out by explaining the general concept of sessions. Sessions can be seen as server-side cookies. The location of the $_SESSION variable storage is determined by PHP's session.save_path configuration. Usually this is /tmp on a Linux/Unix system. Sessions have a session parameter of the client associated with them. When session_start() or something like it is issued, the server retrieves the file/session based on the session parameter provided by the client. As these are just stored files, it is possible for the server to read the sessions of other clients.
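As a minimal illustration of what that means in practice (the path is just a common default, not a requirement):

    // session.save_path decides where the session files live (often /tmp or /var/lib/php/sessions).
    ini_set('session.save_path', '/tmp');

    session_start();               // loads the file matching the client's session cookie (PHPSESSID)
    $_SESSION['user_id'] = 42;     // written back to that file at the end of the request

    // On the next request carrying the same cookie, session_start() reloads the same file,
    // so $_SESSION['user_id'] is available again.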
That brings me to the second problem you describe. If I understand correctly, you want to have some API request information about a session of some user. Based on the first paragraph, you hopefully understand that the purpose of sessions isn't to act as some sort of global storage. Of course it is possible: you could have the foreign APIs include the session parameter, or you could read the session files manually, but to me these seem like dirty fixes. It just isn't what sessions are built for.
The only other thing that attracts you to sessions is their automatic timeout. However, you could easily implement this simple logic yourself when using a database. What you should do is record the time of the user's last activity in your database. When an API requests the data of a user, you simply check whether the current time minus the last-active time is below a certain threshold. If it is not, the session has expired and, at the same time, you can drop it from the table. This is more or less the same general method sessions use internally, and it requires no regular cron jobs to remove sessions (although they could still be useful to clean up the database).
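A rough sketch of that check, assuming a hypothetical tokens table with token and last_active (UNIX timestamp) columns:

    function isTokenValid(PDO $pdo, string $token, int $ttl = 1800): bool
    {
        $stmt = $pdo->prepare('SELECT last_active FROM tokens WHERE token = :token');
        $stmt->execute(['token' => $token]);
        $lastActive = $stmt->fetchColumn();

        if ($lastActive === false || time() - (int) $lastActive > $ttl) {
            // Unknown or expired: drop the row so the table does not fill up.
            $pdo->prepare('DELETE FROM tokens WHERE token = :token')->execute(['token' => $token]);
            return false;
        }

        // Still valid: refresh the activity timestamp.
        $pdo->prepare('UPDATE tokens SET last_active = :now WHERE token = :token')
            ->execute(['now' => time(), 'token' => $token]);
        return true;
    }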
So don't be afraid to use a database to store data; after all, they are built (and optimized) to do that exact thing.

Smart PHP Session Handling/ Security

I've decided the best way to handle authentication for my apps is to write my own session handler from the ground up. Just like in Aliens, it's the only way to be sure a thing is done the way you want it to be.
That being said, I've hit a bit of a roadblock when it comes to my fleshing out of the initial design. I was originally going to go with PHP's session handler in a hybrid fashion, but I'm worried about concurrency issues with my database. Here's what I was planning:
The first thing I'm doing is checking IPs (or possibly even sessions) to honeypot unauthorized attempts. I've written up some conditionals that sleep naughtiness. The big problem here is obviously WHERE to store my blacklist for optimal read speed.
A session_id is generated, hashed, and stored in $_SESSION[myid]. A separate piece of the same token gets stored in a second $_SESSION[mytoken]. The corresponding data is then stored in TABLE X, which is a location I'm not settled on (and which is the root of this question).
Each subsequent request then verifies that [myid] & [mytoken] are what we expect them to be, then reissues new credentials for the next request.
Depending on the status of the session, more obvious ACL functions could then be performed.
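Roughly, the issue/verify/reissue cycle I have in mind looks like this (all names are placeholders, and the TABLE X lookup is omitted since its location is exactly what I'm unsure about):

    session_start();

    function issueCredentials(): string
    {
        $token = bin2hex(random_bytes(32));
        $_SESSION['myid']    = hash('sha256', session_id());
        $_SESSION['mytoken'] = hash('sha256', $token);
        return $token; // the raw token goes back to the client for its next request
    }

    function verifyCredentials(string $clientToken): bool
    {
        return isset($_SESSION['myid'], $_SESSION['mytoken'])
            && hash_equals($_SESSION['myid'], hash('sha256', session_id()))
            && hash_equals($_SESSION['mytoken'], hash('sha256', $clientToken));
    }

    // Per request: verify what the client sent, then rotate the credentials.
    if (verifyCredentials($_POST['token'] ?? '')) {
        $newToken = issueCredentials();
    }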
So that is a high level overview of my paranoid session handler. Here are the questions I'm really stuck on:
I. What's the optimal way of storing an IP ACL? Should I be reading from/writing to hosts.deny? Are there any performance concerns with my methodology?
II. Does my MitM prevention method seem ok, or am I being overly paranoid with comparing multiple indexes? What's the best way to store this information so I don't run into brick walls at 80-100 users?
III. Am I hammering on my servers unnecessarily with constant session regeneration + writebacks? Is there a better way?
I'm writing this for a small application initially, but I'd prefer to keep it a reusable component I could share with the world, so I want to make sure I make it as accessible and safe as possible.
Thanks in advance!
Writing to hosts.deny
While this is an alright idea if you want to completely IP-ban a user from your server, it will only work with a single server. Unless you have some kind of safe propagation across multiple servers (oh man, it sounds horrible already), you're going to be stuck on a single server forever.
You'll have to consider these points about using hosts.deny too:
Security: Opening up access to as important a file as hosts.deny to the web server user
Pain in the A: Managing multiple writes from different processes (denyhosts for example)
Pain in the A: Safely amending the file if, at a later date, you'd like to re-grant access to an IP that was previously banned
I'd suggest you simply ban the IP address at the application level in your application. You could even store the banned IP addresses in a central database so they can be shared by multiple subsystems while still being enforced at the application level.
I. The optimal way of storing an IP ACL would be pushing banned IPs to an SQL database, which does not suffer from the concurrency problems of writing to files. Then an external script, run on a regular schedule or via a trigger, can generate iptables rules. You do not need to re-read your database on every access; you only write when you detect misbehaviour.
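A minimal sketch of the write-on-misbehaviour side (the table name is made up; the external script generating iptables rules would read from the same table):

    // Record a misbehaving IP once it is detected; nothing is read from this table on normal requests.
    function banIp(PDO $pdo, string $ip): void
    {
        $pdo->prepare('INSERT INTO banned_ips (ip, banned_at) VALUES (:ip, :now)')
            ->execute(['ip' => $ip, 'now' => time()]);
    }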
II. Fixing a session to an IP is not a good thing on the public Internet if you serve clients behind transparent proxies or on mobile devices - their IP changes. Let users choose in their preferences whether they want this feature (it depends on your audience, and whether they know what an IP even means...). My solution is to generate a unique token per (page) request, re-used in that page's AJAX requests (so as not to run into a resource problem - random numbers, session data store, ...). The tokens I generate are stored in the session and remembered for several minutes. This lets the user open several tabs, go back and submit in an earlier opened tab. I do not bind to the IP.
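A sketch of that per-page token idea (the helper names are made up):

    session_start();

    // Issue a token for the rendered page; its AJAX calls send it back.
    function issuePageToken(int $ttl = 300): string
    {
        $token = bin2hex(random_bytes(16));
        $_SESSION['page_tokens'][$token] = time() + $ttl;
        return $token;
    }

    // Validate a token from an AJAX request, dropping expired ones along the way.
    function isPageTokenValid(string $token): bool
    {
        foreach ($_SESSION['page_tokens'] ?? [] as $t => $expires) {
            if ($expires < time()) {
                unset($_SESSION['page_tokens'][$t]);
            }
        }
        return isset($_SESSION['page_tokens'][$token]);
    }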
III. It depends... there is not enough data from you to answer. The above may perfectly suit your needs for a ~500-user base coming to your site once, for 5 minutes a day. Or it may even fit 1000 unique concurrent users per hour on a chat site/game - it depends on what your application is doing, and how well you cache the data that can be cached.
Design well, test, benchmark. Check whether session handling is really your resource problem, and not something else. Good algorithms should not throw you into resource problems. That includes DoS defence, which should not be in-application code. Applications may hint to DoS prevention mechanisms what to do, and leave the defence to specialized tools (see answer I).
Anyway, if you run into resource problems in the future, the best way out is new hardware. It may sound rude or even incompetent to some, but compare the price of a new server in 6 months, practically 30% better, against the price of your own work: pay $600 for a new server and gain an additional 130% of horsepower, or pay yourself $100 a month for a 5% improvement (okay, improve by 40%, but whether the week of work is worth $25 may seriously vary).
If you design from scratch, read https://www.owasp.org/index.php/Session_Management first, then search for session hijacking, session fixation and similar strings on Google.

REST API for a PHP Web application

I am working on an API for my web application written in CodeIgniter. This is my first time writing an API.
What is the best way of imposing an API limit on the API?
Thanks for your time
Log the user's credentials (if they have to provide them) or their IP address, the request (optional), and a timestamp in a database.
Now, for every request, you delete records where the timestamp is more than an hour ago, check how many requests for that user are still in the table, and if that is more than your limit, deny the request.
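A sketch of that approach (the table and column names are made up):

    // Hypothetical api_requests table: api_key, requested_at (UNIX timestamp).
    function allowRequest(PDO $pdo, string $apiKey, int $limit = 1000): bool
    {
        // Drop log entries older than an hour.
        $pdo->prepare('DELETE FROM api_requests WHERE requested_at < :cutoff')
            ->execute(['cutoff' => time() - 3600]);

        // Count what this key has used in the current window.
        $stmt = $pdo->prepare('SELECT COUNT(*) FROM api_requests WHERE api_key = :key');
        $stmt->execute(['key' => $apiKey]);
        if ((int) $stmt->fetchColumn() >= $limit) {
            return false; // over the limit, deny the request
        }

        // Log this request and let it through.
        $pdo->prepare('INSERT INTO api_requests (api_key, requested_at) VALUES (:key, :now)')
            ->execute(['key' => $apiKey, 'now' => time()]);
        return true;
    }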
It's a simple solution; keep in mind, though, that there might be more performant solutions out there.
Pretty straightforward. If that doesn't answer your question, please provide more details.
I don't see how this is codeigniter related, for example.
You can use my REST_Controller to do basically all of this for you:
http://net.tutsplus.com/tutorials/php/working-with-restful-services-in-codeigniter-2/
I recently added some key logging and request limiting features, so this can all be done through config.
One thing you can do is consider using an external service to impose API limits and provide API management functionality in general.
For example, my company, WebServius ( http://www.webservius.com ) provides a layer that sits in front of your API and can provide per-user throttling (e.g. requests per API key per hour), API-wide throttling (e.g. total requests per hour), adaptive throttling (where throttling limits decrease as API response time increases), etc, with other features coming soon (e.g. IP-address-based throttling). It also provides a page for user registration / issuing API keys, and many other useful features.
Of course, you may also want to look at our competitors, such as Mashery or Apigee.
