OVER_QUERY_LIMIT w/Billing Account Enabled - php

I am using the Geocoding API and am receiving OVER_QUERY_LIMIT, even though I have enabled my billing account, which should give me over 100k queries. I am doing about 2,500 or fewer. It seems to happen when I am processing many items in a PHP loop, but not for every item. For example:
OK
OK
OVER_QUERY_LIMIT
OK
OK
So it doesn't appear I am actually over the limit, but that is the status returned in the XML for the transaction. If I run the same request directly as a URL, it works with no issue.
Ideas?

Pace your application or submit the requests in smaller groups. Solutions include using a cron job to distribute the requests throughout the day, or adding a small delay between requests.
From: https://developers.google.com/maps/documentation/business/articles/usage_limits
If you exceed the usage limits you will get an OVER_QUERY_LIMIT status
code as a response.
This means that the web service will stop providing normal responses
and switch to returning only status code OVER_QUERY_LIMIT until more
usage is allowed again. This can happen:
Within a few seconds, if the error was received because your
application sent too many requests per second.
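In practice this means waiting briefly and retrying whenever the status comes back as OVER_QUERY_LIMIT. Below is a minimal PHP sketch of that idea; the URL construction and XML parsing are illustrative and should be adapted to your existing loop:

<?php
// Minimal sketch: retry a geocoding request with a growing delay when
// OVER_QUERY_LIMIT is returned. URL and parsing are illustrative.
function geocode($address, $maxRetries = 3)
{
    $url = 'https://maps.googleapis.com/maps/api/geocode/xml?address='
         . urlencode($address);

    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        $xml = simplexml_load_file($url);
        if ($xml !== false && (string) $xml->status !== 'OVER_QUERY_LIMIT') {
            return $xml; // OK, ZERO_RESULTS, etc.
        }
        // Back off before retrying; per-second bursts usually clear quickly.
        usleep(500000 * ($attempt + 1)); // 0.5s, then 1s, then 1.5s
    }
    return false; // still throttled after all retries
}

Calling a helper like this from inside the loop, plus a small usleep() between iterations, is usually enough to stay under the per-second cap.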

Related

My Telegram bots stopped working

I have two bots on my server, but four days ago both of them stopped working.
I checked the script on the other server and I'm pretty sure there is nothing wrong with it. I also talked to my hosting provider, and it seems there is nothing wrong there either.
What's going wrong?
Update: I'm using webhooks.
When sending messages inside a particular chat, avoid sending more than one message per second. We may allow short bursts that go over this limit, but eventually you'll begin receiving 429 errors.
If you're sending bulk notifications to multiple users, the API will not allow more than 30 messages per second or so. Consider spreading out notifications over large intervals of 8-12 hours for best results.
Also note that your bot will not be able to send more than 20 messages per minute to the same group.
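If your bot does bulk sends from PHP, the simplest fix is to pace the loop. A rough sketch follows; sendMessage() here stands in for whatever Bot API call your script already makes:

<?php
// Illustrative pacing sketch: keep a bulk broadcast under ~30 messages
// per second by sleeping ~35ms between sends. sendMessage() is a
// placeholder for your existing Bot API call (e.g. cURL to api.telegram.org).
function broadcast(array $chatIds, $text)
{
    foreach ($chatIds as $chatId) {
        sendMessage($chatId, $text); // your existing bot code
        usleep(35000); // ~35 ms gap keeps the rate below 30 msgs/sec
    }
}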

Query free API, IP blocking

I am using an API which is free.
I am using a PHP script which uses fopen to download JSON from the API.
When I make too many requests (e.g. 2 requests every minute), the API blocks my PHP server's IP.
Is there a way to solve this and make more requests (I don't want to launch a DDoS attack)?
Is there a better solution than using many PHP servers with different IPs?
This is quite an abstract question, as we don't know the actual API you are talking about.
But usually, if an API implements a rate limit, it includes headers like these in its response:
X-Rate-Limit-Limit: the rate limit ceiling for that given request
X-Rate-Limit-Remaining: the number of requests left for the 15 minute window
X-Rate-Limit-Reset: the remaining window before the rate limit resets in UTC epoch seconds
Please check the docs (this example is from Twitter: https://dev.twitter.com/rest/public/rate-limiting).
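If the API does expose such headers, your PHP client can read them and slow down before being blocked. A sketch using cURL (the header names below are Twitter's; substitute whatever your API actually sends):

<?php
// Sketch: inspect rate-limit headers on an API response with cURL.
// Header names vary by API; these are Twitter's. The endpoint is hypothetical.
$ch = curl_init('https://api.example.com/endpoint');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true); // include headers in the output
$response = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $headerSize);
if (preg_match('/X-Rate-Limit-Remaining:\s*(\d+)/i', $headers, $m)
    && (int) $m[1] === 0) {
    // Out of quota: sleep until X-Rate-Limit-Reset before retrying.
}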

Can not reach Amazon SES sending limit because of throttling errors

We send messages with Amazon SES using the SES API. Our send rate limit is now 90 messages per second, but we are getting throttling exceptions even when we don't reach this limit, just as we try to approach it.
Right now we can steadily send at a rate of 30 messages per second. The question is how to send faster.
Let me dive into some more details and clarify the question.
A single API send request may take from 0.3 to 3 seconds to complete. This is why, if we sent messages sequentially, we would barely reach a speed of 1 message per second.
Luckily we can send messages in parallel and this is what we are doing.
For each thread we check that it does not send more than its allowed number of messages per second. For example, if we have 40 threads, we don't allow each thread to send more than 2 messages per second. Yes, this is not optimal.
We record every message sent and the time when the API request completed (when we got the response from the API). This lets us gather some statistics.
When we restrict the sending rate to less than the allowed limit (like 60 instead of 90), everything works fine.
When we try to send at the maximum limit, we start getting throttling errors; for example, once we reach a speed of 80 requests per second, the exceptions begin.
This leads me to the question:
Q: How can we send messages at the highest allowed speed?
Let's start with another question: how does SES count messages when checking the send rate?
Let me guess: when we submit a new request, they look at the number of requests submitted during the last second from the current moment, and if this number is less than our limit, the request is accepted.
But wait. If we have 40 threads and each thread cannot send more than 2 messages per second, then we should never reach the limit. Yet we do get the exceptions.
Research
There is a great post on the Amazon SES blog about handling the limits. We are trying to adopt this approach but have not succeeded yet.
We are using PHP in our application, together with the PHP SES SDK.
I guess this is quite a common task, but for some reason I haven't been lucky enough to find a complete solution.
Any help or ideas would be greatly appreciated. Thank you.
The key takeaway is:
A variety of factors can affect your send rate, e.g. message size,
network performance or Amazon SES availability.
Based on what you've said, it seems like you're using a bit of fuzzy logic to estimate how many messages you're sending. This won't be perfect, so if your AWS limit is 90/s then setting your code lower, e.g. to 60/s, makes sense (once again, this all depends on how accurate your estimates are).
You should also consider the other approaches you've mentioned, such as exponential backoff, as described in the link you provided.
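As a rough illustration, exponential backoff around the send call might look like the following in PHP. How a throttling error actually surfaces depends on your SDK version, so sesSend() and isThrottled() are placeholders for your own code:

<?php
// Sketch of exponential backoff around an SES send. sesSend() and
// isThrottled() are placeholders; detect throttling however your SDK
// reports it (e.g. a "Throttling" error code in the exception).
function sendWithBackoff($message, $maxRetries = 5)
{
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        try {
            return sesSend($message); // your existing SES API call
        } catch (Exception $e) {
            if (!isThrottled($e) || $attempt === $maxRetries) {
                throw $e; // not a throttle, or out of retries
            }
            // 100ms, 200ms, 400ms, ... plus a little jitter
            usleep((100000 << $attempt) + mt_rand(0, 50000));
        }
    }
}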
Another thing you can consider is taking advantage of a queue, like SQS. This way you can pick tasks off the list as quickly as possible, and if you're a little too quick you can always back off and then jump back onto the queue as soon as possible.

Ajax calls vs. server side calls

I am building a Twitter feed widget for WordPress, and one of the issues I have to deal with is Twitter's rate limits (150 tweets per hour per account). I have noticed that when I'm fetching the tweets using server-side calls (e.g. file_get_contents()) the limit is reached very quickly, especially on a shared host. I've tried fetching the tweets using client-side calls with jQuery's getJSON function, and the rate limit took a lot longer to reach.
What is the reason for this difference between client-side and
server-side calls when it comes to Twitter rate limits?
Which method would be preferable for this case?
Update
I should note that the tweets are being cached to avoid hitting the rate limits, but that does not help when the calls are made from a shared host.
When you use server-side calls, all the calls come from the same IP, so all the users share the same 150 tweets/hour quota.
When you use client-side calls, the calls come from a different IP for each client. Each client gets 150 tweets per hour, so all the clients combined can fetch a much larger volume.
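Either way, caching on the server keeps the shared-host IP from burning through the quota. A minimal file-cache sketch around the fetch (the path and TTL are illustrative):

<?php
// Sketch: cache the feed on disk so the shared-host IP makes at most one
// real API call per TTL window, no matter how many visitors load the widget.
function fetchTweets($url, $cacheFile = '/tmp/tweets.json', $ttl = 300)
{
    if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
        return file_get_contents($cacheFile); // serve the cached copy
    }
    $json = file_get_contents($url); // one real API call per TTL window
    if ($json !== false) {
        file_put_contents($cacheFile, $json); // refresh the cache
    }
    return $json;
}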

Google safebrowsing api limits

Does anyone know how many URLs I can check, and what delay between requests I need, with the Safe Browsing API? I use it with PHP, but after checking 2k URLs, I got:
Sorry but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.
It is supposed to be 10,000 requests per day with both the Safe Browsing Lookup API (https://developers.google.com/safe-browsing/lookup_guide#UsageRestrictions) and the Safe Browsing API v2 (https://developers.google.com/safe-browsing/developers_guide_v2#Overview), but you can ask for more; it's free, they say.
I understand that they allow you to make 10k requests per day. In each request you can query up to 500 URLs, so in total they let you look up 5M URLs daily; not bad.
I currently use the Google Safe Browsing API, and the following are the limitations of the API:
A single API key can make requests for up to 10,000 clients per 24-hour period.
You can query up to 500 URLs in a single POST request.
I previously sent one URL per request and ended up exceeding the quota defined by the API. Now I set a maximum of 500 URLs per request. That keeps me from exceeding the API limit, and it is super fast too.
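Batching is straightforward with array_chunk(). A sketch follows; the endpoint and body format shown follow the v3.0 Lookup API docs (URL count on the first line, then one URL per line), but verify against the current documentation before relying on them:

<?php
// Sketch: batch Safe Browsing lookups in chunks of 500 URLs per POST.
// Body format per the v3.0 Lookup API docs: first line is the URL count,
// then one URL per line. Verify against current docs before relying on it.
function checkUrls(array $urls, $apiKey)
{
    $endpoint = 'https://sb-ssl.google.com/safebrowsing/api/lookup'
              . '?client=myapp&apikey=' . $apiKey . '&appver=1.0&pver=3.0';
    $results = array();
    foreach (array_chunk($urls, 500) as $batch) {
        $body = count($batch) . "\n" . implode("\n", $batch);
        $ch = curl_init($endpoint);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $results[] = curl_exec($ch);
        curl_close($ch);
    }
    return $results;
}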
