My website has a script that calls an external API when a user visits a particular page.
An API request is made when the page is accessed, and a response is returned in XML format.
I am using the usual cURL requests.
Right now, due to new implementations on the API side, if the API gets too many requests it will throw an exception and deny the request.
I want to limit the total calls to the API from my website to only 8 times per second.
How can I achieve this? Someone suggested queuing the requests, but I've never done anything like this before and I'm having a hard time finding a solution.
Sorry if my English has errors. Any help is appreciated.
For example: if 100 users access the web page all at the same time, I need to queue those API requests in batches of 8 per second until all are done.
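One way to make that concrete: have page visits insert a row into a queue table, and let a separate worker drain it at 8 per second. A minimal sketch, assuming a hypothetical api_queue table and placeholder PDO credentials:

<?php
// worker.php - drains the queue at 8 jobs per second.
// The api_queue table (id, url) and the DSN are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $jobs = $pdo->query('SELECT id, url FROM api_queue ORDER BY id LIMIT 8')
                ->fetchAll(PDO::FETCH_ASSOC);
    foreach ($jobs as $job) {
        // call the external API for $job['url'] here
        $pdo->prepare('DELETE FROM api_queue WHERE id = ?')->execute([$job['id']]);
    }
    sleep(1); // the next batch of 8 goes out in the next second
}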
I suggest you use an API endpoint that generates a token, match the token on every request, and expire or delete the token after some time. That may resolve your multiple-request issue.
$currentCount = 0;
$currentSeconds = null;

// Note: plain PHP variables reset on every request, so on a live site these
// counters would have to live in shared storage (APCu, Memcached, a database).
function callAPI()
{
    global $currentCount, $currentSeconds; // otherwise the function sees its own empty locals

    // Allow the call if we are under the limit or a new second has begun.
    if ($currentCount < 8 || date("s") != $currentSeconds) {
        // New second: reset the counter.
        if (date("s") != $currentSeconds) {
            $currentCount = 0;
        }
        $currentSeconds = date("s");
        // call your API here
        $currentCount++;
    }
}
For each API call:
- Record the current time (just the seconds) in a variable.
- Make your API call.
- Increment a call counter.
- Check whether the current seconds still equal the stored value and whether your call counter is under 8. If both hold, you may make another call.
You can delay the API request by microseconds; here is sample code:
usleep(125000); // wait 1/8 second between calls (1 sec = 1,000,000 microseconds)
function _callAPI() {
    // Your code here
}
When a user visits your site, the request fires after this short delay; in this way you can space the requests out.
You can also maintain a log of when each API request is fired and, based on the previous request's time, delay the next one.
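Here is a minimal sketch of that log-based delay using the APCu extension (a database timestamp column works the same way); the key name is a placeholder, and the check-then-store is not atomic, so this is a sketch rather than a production rate limiter:

function throttledCall()
{
    $minGap = 125000; // minimum gap between calls, in microseconds (1/8 second)

    $last = apcu_fetch('last_api_call');
    if ($last !== false) {
        $elapsed = (int) ((microtime(true) - $last) * 1000000);
        if ($elapsed < $minGap) {
            usleep($minGap - $elapsed); // wait out the rest of the gap
        }
    }
    apcu_store('last_api_call', microtime(true));

    // call your API here
}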
8 calls per second is a low value, so you can save each call attempt in a database and, on every call, count the number of calls over the last 5 seconds.
For large values, counters in a NoSQL database such as Cassandra or Aerospike are usually used.
That is, for each request you get the current time and increment a counter named "counter" + second until you reach your desired limit.
Aerospike is best for this if the load is really high (1000+ calls per second); it gives very low latency.
Cassandra is simpler to use and requires less memory.
Memcached uses even less memory.
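As a rough illustration, here is a minimal sketch of that per-second counter using PHP's Memcached extension (the key prefix, server address, and limit are assumptions):

$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

function allowRequest(Memcached $memcached, int $limit = 8): bool
{
    $key = 'counter' . time(); // one key per second, e.g. "counter1700000000"

    // Create the counter for this second if it does not exist yet,
    // and let it expire after 2 seconds so stale keys vanish.
    $memcached->add($key, 0, 2);
    $count = $memcached->increment($key);

    return $count !== false && $count <= $limit;
}

if (allowRequest($memcached)) {
    // call your API here
} else {
    // over the limit: delay, queue, or reject this request
}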
Related
I am creating a project management system and need to send push notifications when an activity takes place.
Question: if I use jQuery to refresh and fetch notifications from a MySQL database, say every 30 seconds, will there be a huge impact on the server? What are the minimum requirements?
So basically, I'm looking at 10 notifications/day for 20 employees.
Assuming you're talking about an AJAX request to the server in order to update DOM elements, most basic web servers would very much be able to handle a few requests every 30 seconds or so. More important is how well-optimized the server-side code that finds & returns the notifications is. Assuming you'll have a few clients requesting every 30 seconds, I would suggest making sure the code only takes a few seconds to process the request and respond with the updated data.
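Assuming a PHP back end as elsewhere on this page, here is a minimal sketch of such an endpoint; the notifications table, its columns, and the DSN are placeholders:

<?php
// notifications.php - polled by the client every 30 seconds.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$userId  = (int) ($_GET['user_id'] ?? 0);
$sinceId = (int) ($_GET['since_id'] ?? 0);

// Return only notifications newer than the client's last-seen id,
// so each poll is a cheap indexed range scan.
$stmt = $pdo->prepare(
    'SELECT id, message, created_at
       FROM notifications
      WHERE user_id = ? AND id > ?
      ORDER BY id
      LIMIT 50'
);
$stmt->execute([$userId, $sinceId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));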
I'm using an API that allows only a limited number of requests per time period, and my system will have lots of users. Can I add a delay to requests after 20 requests in a minute, in PHP or Symfony?
I made a PHP application that requires an API response from a server and checks it against a MySQL database.
Every time it takes a long time to execute: the minimum time to get a response is 3 seconds, and it can go up to 90 seconds and beyond.
I have attached screenshots of two files: one shows a continuous request with a delayed response, and the other shows the waiting time.
I want to know the reason for the waiting time and blocked time in the response, because when I check the PHP code execution, it completes in less than one second.
Should I improve the PHP code, or is the API server at fault?
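One way to see where the time goes is cURL's own timing breakdown. A minimal sketch (the URL is a placeholder): if the first-byte time dominates, the remote API server is slow; if the totals stay small while the page is slow, look at the PHP side.

<?php
$ch = curl_init('https://api.example.com/endpoint');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);

// Each value is seconds elapsed since the request started.
printf("DNS lookup: %.3f s\n", curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME));
printf("Connect:    %.3f s\n", curl_getinfo($ch, CURLINFO_CONNECT_TIME));
printf("First byte: %.3f s\n", curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME));
printf("Total:      %.3f s\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));
curl_close($ch);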
I have a question regarding speed and latency when showing real-time data.
Let's assume I want to show real-time data to users by firing an AJAX request every second that fetches data from a MySQL table with a simple SELECT query.
For that, these two options currently come to mind:
MySql / Amazon Aurora
File system
Among these options, which would be better? Or is there another solution?
As I checked practically, if we open one page in a browser, the AJAX requests respond in less than 500 ms using a PHP, MySQL, Nginx stack.
But if we open more pages, the same AJAX requests take more than 1 second, and they should stay under 500 ms for every visitor.
So as visitors increase, the AJAX responses get very poor.
I also checked with Node.js + MySQL, with the same result.
Is it good to create JSON files for the records and fetch the data from a file? Or is there another solution?
Indeed, you have to use a database to store the actual data, but you can easily add a memory cache (it could be an in-process dictionary or a separate component) to track updates.
Then your typical AJAX request will look something like:
Memcache, do we have anything new for user 123?
Last update was 10 minutes ago.
Aha, so nothing new; let's return null.
When you write data:
Put the data into the database.
Update the lastupdated time for clients in memcache.
The actual key might be different, e.g. a chat room id. The idea is to read the database only when updates have actually happened.
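A minimal sketch of that read path in PHP, assuming the Memcached extension; the key format and the updates table layout are placeholders:

function getUpdates(Memcached $cache, PDO $pdo, int $userId, int $lastSeen): ?array
{
    $lastUpdated = $cache->get('lastupdated_' . $userId);

    // Nothing changed since the client last asked: skip the database entirely.
    if ($lastUpdated !== false && $lastUpdated <= $lastSeen) {
        return null;
    }

    $stmt = $pdo->prepare('SELECT * FROM updates WHERE user_id = ? AND updated_at > ?');
    $stmt->execute([$userId, $lastSeen]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// On the write path: store the data, then bump the marker so the
// next poll actually hits the database, e.g.
// $cache->set('lastupdated_' . $userId, time());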
Level 2:
You will burn your web server and also the client's internet connection with a high number of calls. You can do something like this instead (long polling):
$start = time();

// Hold the request open for up to 30 seconds (long polling).
while (time() - $start < 30) {
    if (hasUpdates()) {
        return $updates; // hasUpdates() and $updates are placeholders from your app
    }
    usleep(100000); // check again in 100 ms
}
Then the client will call the server once per 30 seconds, yet will get a response immediately when the server notices new data.
Hello, I have some problems with my PHP AJAX script.
I'm using PHP/MySQL.
I have a field in my accounts table that saves the time of the last request from each user; I will use it to kick idle users out of the chat. I will write a PHP function that deletes all rows whose time field is older than the limit, but where should I call this function? Is it okay to fire it every time a new request hits my index.php? I think that would put a huge load on the server, wouldn't it? Do you have a better solution?
Thanks.
There are two viable solutions:
either create a small PHP script that makes this deletion in an infinite loop (and of course sleeps for a specified amount of time before doing it again), and then start it via PHP CLI,
or create one that makes the deletion only once, then exits, and call it from cron (if you're using a UNIXish server) or Task Scheduler (on Windows).
The second one is simpler, but its drawback is that you can't make the interval between the deletions shorter than 60 seconds.
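A minimal sketch of the cron variant (the table, column, and idle limit are assumptions); a crontab entry such as * * * * * php /path/to/cleanup.php would run it every minute:

<?php
// cleanup.php - delete rows idle longer than the limit, then exit.
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
$pdo->exec('DELETE FROM accounts WHERE last_request < NOW() - INTERVAL 5 MINUTE');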
A solution could be to fire the deletion function just once every few requests. Using rand() you could give it a 1 in 100 (for example) chance of running, so that roughly one page request in 100 cleans up the expired data.
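A minimal sketch of that probabilistic cleanup (table and column names are assumptions):

<?php
// Roughly 1 request in 100 triggers the cleanup.
if (rand(1, 100) === 1) {
    $pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
    $pdo->exec('DELETE FROM accounts WHERE last_request < NOW() - INTERVAL 5 MINUTE');
}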