Does anyone know how many URLs I can check, and what delay between requests I need to use, with the Safe Browsing API? I use it with PHP, but after checking 2k URLs, I got:
Sorry but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.
The limit is supposed to be 10,000 requests per day with both the Safe Browsing Lookup API
https://developers.google.com/safe-browsing/lookup_guide#UsageRestrictions
and the Safe Browsing API v2
https://developers.google.com/safe-browsing/developers_guide_v2#Overview
but they say you can ask for more, and it's free.
I understand that they allow 10k requests per day. In each request you can query up to 500 URLs, so in total they let you look up 5M URLs daily. Not bad.
I currently use the Google Safe Browsing API, and the following are its limitations:
A single API key can make requests for up to 10,000 clients per 24-hour period.
You can query up to 500 URLs in a single POST request.
I previously sent one URL per request and ended up exceeding the quota defined by the API. Now I put up to 500 URLs in each request; that keeps me under the API's limit, and it is much faster too.
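Here's a minimal sketch of that batching, assuming the Lookup API's documented POST body format (URL count on the first line, then one URL per line) and the v3.0 endpoint; the client name, API key, and $allUrls list are placeholders:

<?php
// Assumed placeholders: your API key and a prepared list of URLs to check.
$apiKey   = 'YOUR_API_KEY';
$endpoint = 'https://sb-ssl.google.com/safebrowsing/api/lookup'
          . '?client=myapp&key=' . $apiKey . '&appver=1.0&pver=3.0';

function lookup_batch(array $urls, $endpoint) {
    // Body format per the Lookup API docs: count, then one URL per line.
    $body = count($urls) . "\n" . implode("\n", $urls);
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $body,
        CURLOPT_RETURNTRANSFER => true,
    ));
    $response = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    // Per the docs: 200 = at least one URL matched, 204 = all clean.
    return array($status, $response);
}

// Split the full list into chunks of 500 and pause briefly between batches.
foreach (array_chunk($allUrls, 500) as $batch) {
    list($status, $response) = lookup_batch($batch, $endpoint);
    // ... process $response line by line, one verdict per URL ...
    sleep(1); // small delay between requests to stay well under quota
}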
I am using a free API.
I am using a PHP script that uses fopen to download JSON from the API.
When I make too many requests (e.g. 2 requests every minute), the API blocks my PHP server's IP.
Is there a way to solve this and make more requests (I don't want to mount a DDoS attack)?
Is there a better solution than using many PHP servers with different IPs?
This is quite an abstract question, since we don't know which API you are talking about.
But usually, if an API implements a rate limit, it exposes headers like these in its response:
X-Rate-Limit-Limit: the rate limit ceiling for that given request
X-Rate-Limit-Remaining: the number of requests left for the 15 minute window
X-Rate-Limit-Reset: the remaining window before the rate limit resets in UTC epoch seconds
Please check your API's docs (this example is from Twitter: https://dev.twitter.com/rest/public/rate-limiting).
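For illustration, a minimal sketch of honoring such headers in PHP with cURL, assuming the API uses exactly the X-Rate-Limit-* names above (real APIs vary, so inspect an actual response first):

<?php
// Fetch a URL and, if the rate-limit window is exhausted, sleep until it resets.
function fetch_with_backoff($url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true, // include headers in the output
    ));
    $raw        = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    $headers = substr($raw, 0, $headerSize);
    $body    = substr($raw, $headerSize);

    // If no requests remain in the window, wait for the advertised reset time.
    if (preg_match('/X-Rate-Limit-Remaining:\s*(\d+)/i', $headers, $m)
        && (int)$m[1] === 0
        && preg_match('/X-Rate-Limit-Reset:\s*(\d+)/i', $headers, $m2)) {
        sleep(max(0, (int)$m2[1] - time()));
    }
    return $body;
}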
I am working with the SoundCloud API in my application, and for this I created some API calls. It was working fine yesterday, but now it's showing
error: string(47) "The requested URL responded with HTTP code 429."
I checked the SoundCloud documentation and found that HTTP code 429 means Too Many Requests.
My concern is: how can I find the count of all my requests and the remaining requests?
Effective July 1, all requests that result in access to a playable stream are subject to a limit of 15,000 requests per any 24-hour time window. Ref
NOTE
There is no way to count how many requests are remaining or have been used.
Solution
Check how many API requests you make on each page, and reduce them as much as you can.
You can create multiple API keys and use them randomly.
You can cache your query results (see the sketch below).
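For the caching suggestion, a minimal file-based sketch (the cache path, TTL, and fopen-style fetch are placeholders; APC or memcached would work the same way):

<?php
// Return a cached API response if it is fresher than $ttl seconds;
// otherwise make one real request and store the result.
function cached_api_get($url, $ttl = 300) {
    $cacheFile = sys_get_temp_dir() . '/api_' . md5($url) . '.json';
    if (is_file($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
        return file_get_contents($cacheFile); // fresh enough, skip the API
    }
    $response = file_get_contents($url); // one real request
    if ($response !== false) {
        file_put_contents($cacheFile, $response);
    }
    return $response;
}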
I am currently trying to build a syncing operation between my database and Gmail's contacts.
My first initial sync, downloading/uploading over 1,000 contacts per user, might throw some errors in Gmail's face.
Are there any workarounds? What are the limitations on having many contacts?
My understanding is that the limit is per IP, not per user... is this correct?
I hope that someone can share some info on this. I have searched the web but haven't found the best of resources... Thoughts?
I actually received a response from Google.
The quota is currently per user and is quite high, though there is a limit on the number of queries per second, per hour, and per half-day that you can send. Unfortunately, we don't publicly communicate these values, but I can assure you that normal applications (ones not spamming Google's servers) shouldn't have any issues.
Also, when syncing, please make sure to use the updated-min query parameter to only request contacts that have been updated since the provided time, and use batch requests when sending requests to the API, as a batch performs multiple operations while consuming only one request of the user's quota.
Hopefully this helps someone else if in need.
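For anyone wanting to apply the updated-min advice, a minimal sketch, assuming the Contacts API v3 feed URL and an already-obtained OAuth access token ($accessToken is a placeholder):

<?php
// Only fetch contacts changed since the last sync, instead of the full list.
$lastSync = gmdate('Y-m-d\TH:i:s', strtotime('-1 day')); // RFC 3339 timestamp
$url = 'https://www.google.com/m8/feeds/contacts/default/full'
     . '?updated-min=' . urlencode($lastSync)
     . '&max-results=500';

$ch = curl_init($url);
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $accessToken, // assumed: token obtained elsewhere
        'GData-Version: 3.0',
    ),
));
$feed = curl_exec($ch); // Atom feed of only the contacts changed since $lastSync
curl_close($ch);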
Yes, there is a limitation on accessing Google APIs (at least the Google Maps API) on a per-IP basis. The only workaround I was able to find is to use proxy servers (or Tor).
I'm just starting to mess around with a very, very basic call to the Twitter API (http://api.twitter.com/1/statuses/user_timeline.json) to pull my tweets to my website through cURL. However, using a page that nobody knows exists yet (thus eliminating the possibility of inadvertent traffic), I'm getting a Rate Limit Exceeded response before I've had the chance to even test it. It says it resets at 5 past the hour, so I check again, and for a minute it works, but then it's back to telling me my rate limit is exceeded. A few questions for anyone who knows about the Twitter API and/or cURL:
First, is the rate limit applied to my server (instead of the user)? I would assume so, but that could make things tough. Even one API call per visitor could, on a site with marginal traffic, easily surpass the rate limit in an hour. Is there a way to associate the call with the visitor, not the server? It seems like probably not, but I'm not entirely sure how the whole API works, and cURL does seem to be advocated in a number of places. I'm aware that if I use JSON and AJAX I can make the request from the user's side, but just for the sake of argument, what about cURL?
Second, any idea how I could be surpassing my rate limit without even refreshing the page? I pay for hosting at another location, so I might be sharing server space with another site, but my site definitely has a unique IP, so that should be OK, right? So how is it that I'm surpassing the rate limit without even running the code (or by running it once)?
Here's what I've got for code, if it helps:
// Fetch my recent tweets (unauthenticated, so this counts against my IP's limit)
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=screenname");
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true, // return the response instead of printing it
    CURLOPT_TIMEOUT        => 5,    // give up after 5 seconds
));
$temp = curl_exec($ch);
curl_close($ch);
$results = json_decode($temp, true); // decode the JSON into an associative array
Also, I've now set it up so that if Twitter returns a rate-limit error, it records the error in a text file along with the time the limit will reset. Looking at that file, the only time it updates (it appends rather than rewrites) is when I've loaded the page, which is maybe once or twice an hour, so it's not as if something else is using this page and calling this URL.
Any help?
Authenticated requests should count against the user's 350/hour limit. Non-authenticated requests get counted against your IP address's 150/hour limit.
If you're running into the limits during development, Twitter has generally been quite willing to whitelist dev server IPs.
http://dev.twitter.com/pages/rate-limiting
Some applications find that the default limit proves insufficient. Under such circumstances, we offer whitelisting. It is possible to whitelist both user accounts and IP addresses. Each whitelisted entity, whether an account or IP address, is allowed 20,000 requests per hour. If you are developing an application and would like to be considered for whitelisting you fill out the whitelisting request form. Due to the volume of requests, we cannot respond to them all. If your request has been approved, you'll receive an email.
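If you want to see where you stand, here is a minimal sketch against the v1 rate_limit_status endpoint that was current at the time (unauthenticated calls report your IP's quota, and this call itself reportedly didn't count against it; field names are from the v1 docs):

<?php
// Ask Twitter how many requests remain in the current window for this IP.
$ch = curl_init('http://api.twitter.com/1/account/rate_limit_status.json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$status = json_decode(curl_exec($ch), true);
curl_close($ch);

echo $status['remaining_hits'] . ' of ' . $status['hourly_limit']
   . ' requests left; resets at ' . $status['reset_time'] . "\n";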
I am working on an API for my web application written in CodeIgniter. This is my first time writing an API.
What is the best way of imposing a rate limit on the API?
Thanks for your time.
Log the user's credentials (if he has to provide them) or his IP address, the request (optional), and a timestamp in a database.
Then, for every request, delete the records whose timestamp is more than an hour old, count how many requests from that user are still in the table, and if that exceeds your limit, deny the request.
It's a simple solution; keep in mind, though, that there might be more performant approaches out there.
Pretty straightforward. If that doesn't answer your question, please provide more details; I don't see how this is CodeIgniter-related, for example.
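For what it's worth, a minimal sketch of that approach using PDO; the api_requests table and its column names are placeholders:

<?php
// Assumed schema: api_requests(identifier VARCHAR, requested_at INT).
function allow_request(PDO $db, $identifier, $limit = 1000) {
    $hourAgo = time() - 3600;

    // Drop records that fell out of the one-hour window.
    $db->prepare('DELETE FROM api_requests WHERE requested_at < ?')
       ->execute(array($hourAgo));

    // Count the caller's requests still inside the window.
    $stmt = $db->prepare('SELECT COUNT(*) FROM api_requests WHERE identifier = ?');
    $stmt->execute(array($identifier));
    if ((int)$stmt->fetchColumn() >= $limit) {
        return false; // over the limit: deny (e.g. respond with HTTP 429)
    }

    // Record this request and allow it.
    $db->prepare('INSERT INTO api_requests (identifier, requested_at) VALUES (?, ?)')
       ->execute(array($identifier, time()));
    return true;
}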
You can use my REST_Controller to do basically all of this for you:
http://net.tutsplus.com/tutorials/php/working-with-restful-services-in-codeigniter-2/
I recently added key-logging and request-limiting features, so this can all be done through config.
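Roughly, the config-driven limiting looks like this; the option names and array keys below are from memory, so check the library's config/rest.php and docs for the exact names:

<?php
// application/config/rest.php -- assumed option names, verify against the library.
$config['rest_enable_limits'] = TRUE;     // turn on request limiting
$config['rest_limits_table']  = 'limits'; // table where usage is tracked

// In a controller extending REST_Controller: cap a method per key per hour.
class Api extends REST_Controller
{
    protected $methods = array(
        'users_get' => array('limit' => 500), // 500 requests per hour
    );

    public function users_get()
    {
        $this->response(array('users' => array()), 200);
    }
}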
One thing you can do is consider using an external service to impose API limits and provide API management functionality in general.
For example, my company, WebServius (http://www.webservius.com), provides a layer that sits in front of your API and can provide per-user throttling (e.g. requests per API key per hour), API-wide throttling (e.g. total requests per hour), adaptive throttling (where throttling limits decrease as API response time increases), etc., with other features coming soon (e.g. IP-address-based throttling). It also provides a page for user registration / issuing API keys, and many other useful features.
Of course, you may also want to look at our competitors, such as Mashery or Apigee.