How can I prevent my IP address from getting banned by eBay? - php

I'm using the eBay API to import products. After some days, I can no longer get records from eBay. To confirm the problem, I made the same request from a different IP, and it worked. This suggests eBay is no longer allowing requests from the earlier IP. So what can the solution be?

You might be blacklisted. If you consume too many resources or perform too many queries in a given time frame (on purpose or by accident), you're almost guaranteed to be blocked. In that case, you can contact eBay's support and try to straighten out the situation.
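If the block stems from request volume, throttling the importer is the usual fix. A minimal sketch in PHP, where getEbayProducts() is a hypothetical stand-in for your existing API call:

// Hypothetical stand-in for your existing eBay API request.
function getEbayProducts($page) {
    // ... your current request code goes here ...
    return array();
}

$totalPages = 50; // however many pages your import covers
for ($page = 1; $page <= $totalPages; $page++) {
    $products = getEbayProducts($page);
    // ... store $products ...
    sleep(2); // pause between requests to keep the rate well under any limit
}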

Related

How can I keep track of a user specifically? For example, via their permanent private IP

Today I decided I want to work more on my framework and add rate limiting. I have done such a thing before and had it working: it got the client's public (router-masked) IP, stored it in a database, and counted every request; once a limit was reached, it blocked the client for some time and then reset the counter.
The Problem
Well, the problem is that I was getting the user's IP with this code snippet:
$client['IP'] = $_SERVER['REMOTE_ADDR'];
This returns the client's public IP, which, behind NAT, is the address the router presents for everyone on that network. That's a problem: if one person on the network is spamming and is the bad guy, the good guys on the same network get rate limited too, because after a certain number of strikes the limiter blocks that IP for a while.
So, in PHP, is there some way to keep track of a pinpointed user? I'd like something guaranteed to pinpoint the user for at least a week, given that ISPs tend to reassign your IP every 14 days or so.
Thanks!
Not using IP alone you can't, no.
Even if you could get the private IP (which you can't)...
it probably won't be unique (except within that private network),
it is very unlikely to be permanent (largely because of DHCP),
the same device can switch networks (e.g. different Wi-Fi networks, or 4G), giving it a different public and private IP each time, and
many people use more than one device anyway.
One obvious and more reliable way to track users is via your authentication scheme. If users must authenticate before sending requests to your endpoints, you can track the number of requests made by a specific user and put limits on that. (If you want to be subtle, you can impose different limits on different endpoints depending on their sensitivity, their popularity, or the load they put on your infrastructure. You could also limit a user per IP, if they connect from more than one IP and you think it makes sense.)
If you allow anonymous access to some or all of the data, you can still track users somewhat using API keys, issued as a GUID to each application or user that registers with the API, though keys are not quite as secret as credentials. You can then apply rate limits per key and/or per user. That's how big API providers such as Google do it; or at least it's one technique they use to help with rate limiting (among other things).
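As a rough sketch of that counting approach (shown per API key, but the same mechanism works per user or per IP), assuming a MySQL table api_usage(api_key, window_start, request_count) that you would create yourself:

// Returns true if $apiKey has exceeded $limit requests in the current window.
// The api_usage table and its layout are assumptions for this sketch.
function isRateLimited(PDO $db, $apiKey, $limit = 100, $window = 3600) {
    $now  = time();
    $stmt = $db->prepare("SELECT window_start, request_count FROM api_usage WHERE api_key = ?");
    $stmt->execute(array($apiKey));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if (!$row || $now - $row['window_start'] >= $window) {
        // New key or expired window: start counting afresh (REPLACE is MySQL-specific).
        $db->prepare("REPLACE INTO api_usage (api_key, window_start, request_count) VALUES (?, ?, 1)")
           ->execute(array($apiKey, $now));
        return false;
    }
    if ($row['request_count'] >= $limit) {
        return true; // over the limit; reject or delay the request
    }
    $db->prepare("UPDATE api_usage SET request_count = request_count + 1 WHERE api_key = ?")
       ->execute(array($apiKey));
    return false;
}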

Google Map " but your computer or network may be sending automated queries" error

I have a PHP project that used to work like a charm, but now the Google Maps API is blocking it.
I use code like the following to get geocoding data from Google Maps (address-to-coordinates conversion), with a low number of requests per day:
$request_url = "http://maps.google.com/maps/geo?output=xml&key=" . KEY . "&q=" . urlencode($address);
$xml = @simplexml_load_file($request_url);
where KEY is my Google API key, but I started to get this error:
We're sorry...
... but your computer or network may be sending automated queries. To protect our users, we can't process your request right now.
How can I fix that? The Google API guide is useless...
Most Google APIs have rate-limiting protections. You should read the terms of service and see what usage is allowed.
Besides that, what you could technically do (and I'm almost certain you are running that code in a loop for multiple addresses) is set a random delay (more than 20s) between the calls to the Google API, limiting the load and your own evilness... You could also think about using curl to make requests with multiple user agents... it's technically possible, although frowned upon :)
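For illustration, a paced loop over addresses might look like the following; geocode() just wraps the request from the question, and KEY is assumed to be your API key constant:

// Wraps the geocoding request from the question above.
function geocode($address) {
    $request_url = "http://maps.google.com/maps/geo?output=xml&key=" . KEY
                 . "&q=" . urlencode($address);
    return @simplexml_load_file($request_url);
}

$addresses = array("10 Downing St, London", "1600 Amphitheatre Pkwy, Mountain View");
foreach ($addresses as $address) {
    $xml = geocode($address);
    // ... process $xml ...
    sleep(rand(20, 40)); // randomized pause, over the 20s suggested above
}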

How many queries does Google allow from a specified IP or website?

I have made a PHP script that counts the number of backlinks Google reports for a given website. But my dilemma is: how many queries can I perform, since Google puts a limit on that? Kindly answer my question and give some solution as soon as possible.
This depends on the speed of the requests: you can send thousands of queries from a single IP, but you need to do it like a human (slowly), not like a bot that sends 20 or so queries every second.
I have done a similar script for myself, and the only solutions are:
use the Google Search API, which will cost you something;
or
scrape Google like you did, but use proxies (which will also cost you something; I don't recommend free proxies because they are slow and can alter the content);
or
use a CAPTCHA-solving service to automatically clear Google's CAPTCHA.
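If you take the proxy route, curl can be pointed at a different proxy on each request. A sketch, with placeholder proxy addresses:

// Rotate through a list of (paid) proxies, one request per proxy.
$proxies = array("1.2.3.4:8080", "5.6.7.8:3128"); // placeholders for your own proxies
$query   = "link:example.com";

foreach ($proxies as $proxy) {
    $ch = curl_init("http://www.google.com/search?q=" . urlencode($query));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);            // route this request via the proxy
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0"); // vary per request if you like
    $html = curl_exec($ch);
    curl_close($ch);
    // ... parse $html for the result count, then pause ...
    sleep(rand(5, 15));
}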

Query Limit for Google Contacts API

I am currently trying to build a syncing operation between my database and Gmail contacts.
My initial sync, downloading/uploading over 1,000 contacts per user, might throw some errors in Gmail's face.
Are there any workarounds? What are the limitations on having many contacts?
My understanding is that the limit is per IP, not per user... is this correct?
I hope that someone can share some info on this; I have searched the web but haven't found good resources... Thoughts?!
I actually received a response from Google.
The query is currently per user and is quite high, though there is a limit in the number of queries per second, per hour and per half a day you can send. Unfortunately, we don't publicly communicate on these values, but I can assure you that normal applications (not spamming Google's servers) shouldn't have any issues.
Also, when syncing, please make sure to use the updated-min query parameter to only request contacts that have been updated since the provided time, and use batch-request when sending requests to the API, as it will perform multiple operations while consuming only one request on the user's quota.
Hopefully this helps someone else if in need.
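For reference, the updated-min parameter goes straight on the contacts feed URL. A hedged sketch; the Authorization header value is an assumption you'd fill in from your own auth flow:

// Request only contacts changed since the last sync (RFC 3339 timestamp).
$lastSync   = "2011-06-01T00:00:00Z"; // stored from your previous sync run
$authHeader = "..."; // assumed: your OAuth/ClientLogin Authorization header value
$url = "https://www.google.com/m8/feeds/contacts/default/full"
     . "?updated-min=" . urlencode($lastSync)
     . "&max-results=500";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    "Authorization: " . $authHeader,
    "GData-Version: 3.0",
));
$feed = curl_exec($ch);
curl_close($ch);
// ... parse $feed, and send your own writes back as batch requests ...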
Yes, there is a limitation on accessing the Google API (at least the Google Maps API) on a per-IP basis. The only workaround I was able to find is to use proxy servers (or Tor).

cURL call to Twitter API hitting "Rate Limit" without making more than 5 requests

I'm just starting to mess around with a very, very basic call to the Twitter API (http://api.twitter.com/1/statuses/user_timeline.json) to pull my tweets to my website through cURL. However, using a page that nobody knows exists yet (thus eliminating the possibility of inadvertent traffic), I'm getting a Rate Limit Exceeded error before I've even had the chance to test it. It says the limit resets at five past the hour, so I check again, and for a minute it works, but then it's back to telling me my rate limit is exceeded. A few questions for anyone who knows about the Twitter API and/or cURL:
First, is the rate limit applied to my server (instead of the user)? I would assume so, but that could make it tough, of course. Even one API call per visitor could, on a site with marginal traffic, easily surpass the rate limit in an hour. Is there a way to associate the call with the visitor, not the server? Seems like probably not, but I'm not entirely sure how the whole API works, and cURL does seem to be advocated in a number of places. I'm aware that if I use JSON and AJAX the data in on the client side, the request comes from the user, but just for the sake of argument, what about cURL?
Second, any idea how I could be surpassing my rate limit without even refreshing the page? I pay for hosting at another location, so I might be sharing server space with another site, but my site definitely has a unique IP, so that should be OK, right? So how am I surpassing the rate limit without even running the code (or by running it once)?
Here's what I've got for code, if it helps:
// Pull the user's timeline from the Twitter v1 REST API via cURL.
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=screenname");
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true, // return the response as a string instead of printing it
    CURLOPT_TIMEOUT        => 5,    // give up after five seconds
));
$temp = curl_exec($ch);
curl_close($ch);
$results = json_decode($temp, true); // decode the JSON response into an associative array
Also, I've now got it so that if Twitter returns a rate-limit error, it records the error in a text file along with the time the limit will reset. Looking at that file, the only time it updates (I have it append rather than rewrite) is when I've loaded the page myself (maybe once or twice in an hour), so it's not as if something else is using this page and calling this URL.
Any help?
Authenticated requests should count against the user's 350/hour limit. Non-authenticated requests get counted against your IP address's 150/hour limit.
If you're running into the limits during development, Twitter has generally been quite willing to whitelist dev server IPs.
http://dev.twitter.com/pages/rate-limiting
Some applications find that the default limit proves insufficient. Under such circumstances, we offer whitelisting. It is possible to whitelist both user accounts and IP addresses. Each whitelisted entity, whether an account or IP address, is allowed 20,000 requests per hour. If you are developing an application and would like to be considered for whitelisting you fill out the whitelisting request form. Due to the volume of requests, we cannot respond to them all. If your request has been approved, you'll receive an email.
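One way to see which bucket you're hitting and when it resets is to read the X-RateLimit-* headers that API v1 sends back with each response; a small sketch:

// Fetch the timeline with response headers included, then print the rate-limit headers.
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=screenname");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true); // include headers in the returned string
$response = curl_exec($ch);
curl_close($ch);

list($headers, $body) = explode("\r\n\r\n", $response, 2);
foreach (explode("\r\n", $headers) as $line) {
    // e.g. "X-RateLimit-Remaining: 142" and "X-RateLimit-Reset: <unix timestamp>"
    if (stripos($line, "X-RateLimit") === 0) {
        echo $line, "\n";
    }
}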