Limiting client usage with credits - php

My web app is built with Angular, with PHP as the API server. I want to offer a service that is limited to a predefined amount of time. For example, a user gets 4 hours to run a function consume() that pulls data from my server. This of course can't be handled client-side alone, because it would be bypassed.
What I'm currently thinking of is some kind of credit system: for example, the user gets 240 credits, and consume() pulls data from the API every minute at a cost of 1 credit. Is there a better way of doing it?
I couldn't find any hints about it on Google because I don't even know what keywords to search for.
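The credit scheme described above could look something like this on the server side, which is what makes it tamper-proof. This is a minimal sketch; the table/column names and the fetchDataForUser() helper are illustrative, not an existing API:

```php
<?php
// Hypothetical /api/consume endpoint: each call deducts one credit
// server-side, so the limit cannot be bypassed by the client.
// Assumes a `users` table with an integer `credits` column.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

function consume(PDO $pdo, int $userId): array
{
    // Deduct atomically; the WHERE clause prevents going below zero
    // even under concurrent requests.
    $stmt = $pdo->prepare(
        'UPDATE users SET credits = credits - 1 WHERE id = ? AND credits > 0'
    );
    $stmt->execute([$userId]);

    if ($stmt->rowCount() === 0) {
        http_response_code(402); // out of credits
        return ['error' => 'No credits left'];
    }

    return ['data' => fetchDataForUser($userId)]; // your existing data pull
}
```

The single atomic UPDATE is the important part: checking the balance with a SELECT first and then deducting would allow two simultaneous requests to both pass the check.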

Related

Is it possible to have such a process represented in RabbitMQ?

There is a long-running process (Excel report creation) in my web app that needs to be executed in the background.
Some details about the app and environment:
The app consists of many instances, where each client has a separate one (with customized business logic), while everything is hosted on our server. The functionality that produces the Excel file is the same across instances.
I'm planning to have one RabbitMQ server installed. One part of the app (the publisher) will take all report options from the user and put them into a message. A background job (the consumer) will consume it, produce the report, and send it via email.
However, there is a flaw in this design: say users from one instance queue lots of complicated reports (worth ~10 minutes of work each) and a user from another instance queues an easy one (1-2 minutes); the latter will have to wait until the others finish.
There could be separate queues for each app instance, but in that case I would need one consumer per instance. Given that there are 100+ instances at the moment, that doesn't look like a viable approach.
I was thinking of a script that checks all available queues (and consumers) and creates a new consumer for any queue that doesn't have one. There are no limitations on language for the consumer or such a script.
Does that sound like a feasible approach? If not, please give a suggestion.
Thanks
If I understood the topic correctly, everything lies on one server: RabbitMQ, the web application, the different per-client instances, and the message consumers. In that case I would rather use different topics per message (https://www.rabbitmq.com/tutorials/tutorial-five-python.html) and introduce consumer priorities (https://www.rabbitmq.com/consumer-priority.html). Based on the options chosen when publishing the message, I would build a combination of topic and message priority: the publisher knows the number of reports already sent per client and the selected options, and decides whether the priority is high, low, or normal.
The logic to pull messages based on that data lives in the consumer, so the consumer will not pick up heavy topics when, for example, 3 are already in progress.
Based on the total number of messages in the queue (it's not 100% accurate) and the previous topics and priorities, you can implement a kind of leaky-bucket strategy to keep control of resources, e.g. a maximum of 100 reports generated simultaneously.
You could also consider ZeroMQ (http://zeromq.org) for your case; it may be more suitable than RabbitMQ because it is simpler and brokerless.
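A rough sketch of the topic-plus-consumer-priority combination, assuming php-amqplib (installed via Composer as php-amqplib/php-amqplib); the exchange, queue, and routing-key names are illustrative:

```php
<?php
// Publisher routes by estimated report weight; consumer declares x-priority
// so that, when several consumers share a queue, this one is preferred.
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;
use PhpAmqpLib\Wire\AMQPTable;

$conn = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $conn->channel();
$channel->exchange_declare('reports', 'topic', false, true, false);
$channel->queue_declare('reports.heavy', false, true, false, false);
$channel->queue_bind('reports.heavy', 'reports', 'report.heavy.#');

// Publisher: pick the routing key from the weight heuristic
// (light reports would go to a differently bound queue).
$msg = new AMQPMessage(json_encode(['client' => 42, 'options' => []]));
$channel->basic_publish($msg, 'reports', 'report.heavy.client42');

// Consumer: prefetch 1 so it never hoards more than one unacked heavy report.
$channel->basic_qos(null, 1, null);
$channel->basic_consume('reports.heavy', '', false, false, false, false,
    function (AMQPMessage $m) { /* build Excel, email it */ $m->ack(); },
    null, new AMQPTable(['x-priority' => 10]));

while ($channel->is_consuming()) {
    $channel->wait();
}
```

With basic_qos(prefetch=1), a single pool of consumers shared across all 100+ instances drains the queues fairly, which avoids the one-consumer-per-instance problem.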

PHP batch processing

I am building a social media application where a user posts something and the posted content is then propagated to all the members within his/her 4 circles, meaning the query runs in a loop; it's like a family tree. The logic works perfectly fine. But now that the number of members in each circle keeps growing, the execution is exceeding the max execution time, which is currently set to 90 seconds and which is fairly generous.
We don't want to increase the time limit, as that is not a permanent solution. So we have thought about implementing this using a batch-processing concept: a user posts something on the web, the id and text are handed over to a batch script, the response is returned immediately to the web user, and the batch script continues to work behind the scenes.
Any ideas or thoughts on how this can be implemented? Thanks in advance.
I suppose you're using a relational DB for this (like MySQL). In my experience, while it can work, it is surely not the best tool to model social interactions, because it is not scalable/efficient for that.
You should probably explore a NoSQL database with graph support for this kind of problem, like OrientDB (http://orientdb.com/orientdb/).
Not exactly what you suggested, but I think it will save you many headaches in the end.
What you are looking for is called a message queuing service.
Basically, in your app you send a message saying, for example, to dispatch the content to the members of the current user's circles. This message is then consumed asynchronously by a consumer (another running PHP instance of your app that is able to handle those messages).
Have a look at RabbitMQ or Beanstalkd, with PHP.
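The handoff might look like this with php-amqplib (assumed installed via Composer; Beanstalkd via pda/pheanstalk follows the same pattern). The queue name and payload shape are illustrative:

```php
<?php
// Web request side: enqueue the job and respond to the user immediately.
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$conn = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $conn->channel();
$channel->queue_declare('propagate_post', false, true, false, false);
$channel->basic_publish(
    new AMQPMessage(json_encode(['post_id' => 123, 'text' => 'Hello'])),
    '', 'propagate_post'
);
echo json_encode(['status' => 'posted']); // the user sees this right away

// Worker side (a separate long-running CLI script, where max_execution_time
// does not apply): do the circle fan-out at its own pace.
$channel->basic_consume('propagate_post', '', false, false, false, false,
    function (AMQPMessage $m) {
        $job = json_decode($m->body, true);
        // ... walk the 4 circles and insert the feed rows here ...
        $m->ack(); // only after the fan-out succeeded
    });
while ($channel->is_consuming()) {
    $channel->wait();
}
```

Acking only after the work finishes means a crashed worker leaves the message in the queue for redelivery, so no post is silently lost.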

Can PHP handle multiple simultaneous requests with code written in a purely procedural way

I built a site last week for a simple registration process. The process is as follows:
Entering basic details - STEP 1
OTP generated, sent to the user, and entered by the user - STEP 2
A membership number is displayed to him
I have used plain core PHP with procedural coding.
The client says he is expecting 1.2 million people on launch day, and on that basis I assume there may be 200k to 300k requests simultaneously. One doubt I have is: can PHP code written in a purely procedural style handle this?
So I'm stuck on how to proceed with PHP. Should I change the procedural code to object-oriented, or should I use a framework like CodeIgniter?
See this post:
Differences in procedural and object-oriented implementations of mysql in php?
For your purposes, there is no difference in procedural vs OO. Under-the-hood, they do the same thing. The thread of execution is the same. There is no "asynchronous vs synchronous" or "serial vs parallel" thing going on.
Codeigniter would only have a performance benefit if you use it instead of another framework. You could choose to use it for other reasons, but they aren't related to your stated problem. Your existing PHP should be fine.
Steps 1 and 2 are separate PHP requests. There is no process on the server which persists throughout the user's website visit. You might have a lot of visitors viewing the website at the same time, but it is only their server interactions which matter. Suppose you have 2 lakh (200,000) visitors on the website at peak times, and each visit lasts 15 minutes. That translates into about 200 Step 1 requests and 200 Step 2 requests per second.
For PHP scripts doing a small number of DB requests and generating a bit of HTML, you might have from 5 to 50 processes running simultaneously on the server. That is not a huge load.
Your bottleneck is likely to be in Step1, where you will be emailing 200 passwords per second. You need to use a method that gets the email out of PHP quickly and into the queue for your outgoing mail server. If each mail request were to block your PHP request for 5 seconds, you would need 1000 simultaneous PHP processes. Do some testing, sending out 1000 emails from one PHP script, and see if it takes more than 5 seconds.
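The timing test suggested above can be a few lines; the addresses and subject are placeholders, and the result depends entirely on how your MTA accepts mail:

```php
<?php
// Rough harness: send N test emails and measure how long the
// PHP process is blocked per message.
$n = 1000;
$start = microtime(true);
for ($i = 0; $i < $n; $i++) {
    mail('test@example.com', "Load test $i", 'OTP test body',
         'From: noreply@example.com');
}
$elapsed = microtime(true) - $start;
printf("%d mails in %.2fs (%.1f ms each)\n", $n, $elapsed, 1000 * $elapsed / $n);
```

If the per-mail time is more than a few milliseconds, hand the message to a local queueing MTA (or a job queue) instead of sending synchronously in the request.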
If you really need large-scale processing for a 1-day launch, consider cloud hosting where you can use as many servers as you need for a short period of time, at low cost. For example, see:
http://www.rackspace.com/cloud/cloud_bursting

Developing an application that creates financial chart from web services

I'm trying to create an application that will create a chart that will get the data by calling a financial web service. Users will interact with the chart by clicking on it. There could be millions of views on the application, so millions of requests on the web service. What's the best way to do this? Will I need to call the web service each time, a million times?
The way I see it, you would be better off reading the data from the web service at a fixed interval (every X seconds/minutes/hours depending on whether your data is real-time or not) and displaying the chart with that information. That way your number of requests to the web service will not vary, and you can change one part of the equation without affecting the other.
Edit after first comment: my answer stands. You would be better off getting the data at a fixed interval and informing users that the data is supplied with a delay (the usual delay with financial activity data is 15 minutes). That way you know in advance the number of requests you will make, and you greatly speed up the service to your users.
A word of warning: Yahoo Finance changed its API licensing in 2012 or 2011 and it is now forbidden to use data from the API for public applications without a commercial license.
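A minimal sketch of the fixed-interval idea: serve a cached copy of the web service response and refresh it at most once per interval. The URL and cache path are placeholders:

```php
<?php
// Millions of chart viewers read this endpoint; the upstream web service
// is hit at most once per $ttl seconds regardless of traffic.
$cacheFile = '/tmp/quotes.json';
$ttl = 900; // 15 minutes, matching the usual delayed-data window

if (!is_file($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
    $fresh = file_get_contents('https://example.com/financial-api/quotes');
    if ($fresh !== false) {
        file_put_contents($cacheFile, $fresh, LOCK_EX); // keep stale copy on failure
    }
}

header('Content-Type: application/json');
readfile($cacheFile);
```

In production you would refresh from a cron job rather than in-request (so one slow upstream call never blocks a visitor), but the request count math is the same.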

REST API for a PHP Web application

I am working on an API for my web application, written in CodeIgniter. This is my first time writing an API.
What is the best way of imposing a rate limit on the API?
Thanks for your time
Log the user's credentials (if he has to provide them) or his IP address, the request (optional), and a timestamp in a database.
Now, for every request, you delete records where the timestamp is more than an hour old, check how many requests for that user are still in the table, and if that is more than your limit, deny the request.
It's a simple solution; keep in mind, though, that there might be more performant solutions out there.
Pretty straightforward. If that doesn't answer your question, please provide more details.
I don't see how this is CodeIgniter-related, for example.
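The approach described above in code, assuming an `api_requests` table with `api_key VARCHAR` and `requested_at DATETIME` columns (names and the limit are illustrative):

```php
<?php
// Sliding one-hour window: purge old rows, count what's left, record the hit.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

function allowRequest(PDO $pdo, string $apiKey, int $limit): bool
{
    // Drop entries older than an hour, then count the remainder for this key.
    $pdo->prepare('DELETE FROM api_requests WHERE requested_at < NOW() - INTERVAL 1 HOUR')
        ->execute();

    $count = $pdo->prepare('SELECT COUNT(*) FROM api_requests WHERE api_key = ?');
    $count->execute([$apiKey]);
    if ((int) $count->fetchColumn() >= $limit) {
        return false; // caller should respond with HTTP 429
    }

    $pdo->prepare('INSERT INTO api_requests (api_key, requested_at) VALUES (?, NOW())')
        ->execute([$apiKey]);
    return true;
}
```

At higher traffic you would move the DELETE to a periodic job and index `(api_key, requested_at)`, but the window logic stays the same.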
You can use my REST_Controller to do basically all of this for you:
http://net.tutsplus.com/tutorials/php/working-with-restful-services-in-codeigniter-2/
I recently added some key-logging and request-limiting features, so this can all be done through config.
One thing you can do is consider using an external service to impose API limits and provide API management functionality in general.
For example, my company, WebServius ( http://www.webservius.com ) provides a layer that sits in front of your API and can provide per-user throttling (e.g. requests per API key per hour), API-wide throttling (e.g. total requests per hour), adaptive throttling (where throttling limits decrease as API response time increases), etc, with other features coming soon (e.g. IP-address-based throttling). It also provides a page for user registration / issuing API keys, and many other useful features.
Of course, you may also want to look at our competitors, such as Mashery or Apigee.
