Twitter API Rate Limit for Multiple Users - php

I am writing a PHP-based application using the Twitter API. Up until now I've been using the REST API via a GET request on a PHP page. However, as my app scales, I can easily see it going over the 150 requests-per-hour limit. Here's why:
I have categories of topics, each of which periodically polls the Twitter API for tweets around that topic. For example, I have: mysite.com/cars, mysite.com/trucks, etc. A user can go to either page. While the user is on the page, live updates are pulled from Twitter by making an AJAX call to a PHP page I've set up. The PHP page determines which category the user is coming from (cars, trucks), polls Twitter for search results, then returns the JSON to the category page. This sounds convoluted, but there are a number of unrelated reasons I need the intermediate PHP page.
The problem is that since the PHP page is making the requests, it will eat up the rate limit very quickly (imagine if there were 20 categories instead of just cars and trucks). I can't make a single call with multiple parameters because it would combine multiple categories of tweets and I'd have no way to separate them. I can cache results, but if I did, the more categories I add, the longer each would have to go between API calls.
So how can I approach this problem? I looked at the streaming API, but it's only for OAuth'd users and I don't want my users to have to log in to anything. Can I consume the stream on the PHP page and then serve from it each time a category page polls the PHP page? Thanks for any help!

a) You don't have to use your website users' OAuth credentials with the streaming API, just your own:
get a set of credentials at dev.twitter.com and hardcode them. Your users won't know there is any OAuth going on behind the scenes.
b) Don't use anonymous requests (150 per IP per hour); use OAuth'd requests (350 per token per hour). You don't have to ask your users to sign in. Just register a few private Twitter accounts of your own (one is sufficient to start). If you don't want to build Twitter login functionality, you can generate access credentials for your own Twitter account under your application at dev.twitter.com.
c) As @Cheeso mentioned: cache! Don't let every page load trigger a Twitter request.
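As a rough combined illustration of (b) and (c), here is a minimal sketch of the intermediate PHP page. It assumes the abraham/twitteroauth library is installed via Composer (an assumption; any OAuth-capable client would do), and the keys, cache path, and 60-second TTL are all placeholders:
<?php
// Sketch only: hardcoded app credentials from dev.twitter.com (placeholders).
require 'vendor/autoload.php';
use Abraham\TwitterOAuth\TwitterOAuth;
// Whitelist the category so arbitrary input never reaches the API.
$category  = preg_replace('/[^a-z]/', '', $_GET['category']); // e.g. "cars"
$cacheFile = "/tmp/tweets_{$category}.json";
// (c) Serve from cache while it is fresh; the 60-second TTL is arbitrary.
if (file_exists($cacheFile) && time() - filemtime($cacheFile) < 60) {
    header('Content-Type: application/json');
    readfile($cacheFile);
    exit;
}
// (b) Authenticated request: counted per token, not per server IP.
$connection = new TwitterOAuth('CONSUMER_KEY', 'CONSUMER_SECRET',
                               'ACCESS_TOKEN', 'ACCESS_TOKEN_SECRET');
$results = $connection->get('search/tweets', array('q' => $category));
$json = json_encode($results);
file_put_contents($cacheFile, $json);
header('Content-Type: application/json');
echo $json;
With this shape, adding more categories only adds more cache files; the API is hit at most once per category per TTL, regardless of how many visitors poll the page.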

Related

How to display Facebook public page feed in PHP website, without reaching Graph API quota limit?

I am creating a custom Facebook Feed plugin for a custom CMS, to display the 10 latest posts of our client's public Facebook page in a fancy way, with attachments. But we have issues with a very limited quota: usage often goes beyond 100% and the plugin crashes.
So at my company we created a Facebook app with all the necessary authorisations to make Graph API requests, and it works well. But each page load on the frontend (where the Facebook feed is present) was using around 10% of the quota. So I implemented some caching and stored the attachments locally, and I got down to about 2% of the quota per page load.
But that still means that 50 visits at the same time = 100% = over the limit!
So I'm kind of stuck with it, and don't know what the best practices are in this field.
The official Facebook docs say that the Graph API quota depends on the number of app users, but we don't want (or need) user login, since we just want to use the Graph API to display posts from public pages on our clients' websites.
Solved.
It appears that you can add ?limit=[X] to the /[page-id]/feed API call, even though the /feed documentation page does not state it clearly.
So my calls were fetching all the posts since the beginning of the page, with all their attachments.
Adding ?limit=10 or lower to the /[page-id]/feed query solved my problem.
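For reference, a minimal sketch of such a limited call without the PHP SDK, using plain file_get_contents (the page ID and token form are placeholders; available fields depend on your Graph API version):
<?php
// Sketch: fetch only the 10 latest posts instead of the page's full history.
$url = 'https://graph.facebook.com/PAGE_ID/feed'
     . '?limit=10&access_token=' . urlencode('APP_ID|APP_SECRET');
$feed = json_decode(file_get_contents($url), true);
foreach ($feed['data'] as $post) {
    // Render (and ideally cache) each post rather than echoing it raw.
    echo htmlspecialchars(isset($post['message']) ? $post['message'] : ''), "\n";
}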

API to fetch Twitter feeds via cron

We have developed a product that is used for employee engagement. It provides a feature that shows tweets posted by members of your office if they have authorised the site.
The fetching of tweets is done by a periodic cron job that runs at a regular interval of about 15 minutes. This cron job finds all the users who have authorised the site's app and requests their tweets from Twitter. One request is sent to Twitter per user.
Currently the system uses the REST API (http://api.twitter.com/1/statuses/user_timeline.xml?user_id='xxxxxx'), which limits the number of requests to 150 per hour.
We cannot make authenticated requests, as that would require the user to authorise the call every time, which is not possible when the requests are made by cron. So, with just 150 requests and the cron running four times an hour, it is possible to fetch data for only 35-40 users, which does not meet our requirements.
We have also explored the Site Streams API, but it requires a persistent connection to be established with Twitter, which would be difficult to maintain from a cron job. Another concern with the Site Streams API is that it is in beta and the website must be whitelisted.
Kindly assist us in selecting the best possible alternative that would help us meet the above-mentioned objective.
There are two solutions.
Create an account that follows the users who have authorised your app. Then simply retrieve that account's timeline.
or
Place the users in a "List". Then make a call to lists/statuses.
I am not 100% certain why you cannot stay authenticated via cron; however, as you are using PHP, can I suggest that you look at https://github.com/jmathai/twitter-async/blob/master/EpiTwitter.php, which handles the OAuth authentication, makes the request you want, and closes the connection.
If you can't do this with cron, use serviceUptime.com to call a PHP script, although the maximum time the script can run is 35 seconds, so don't try to pull all the Lance Armstrong tweets back at once. HTH.
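For the list approach (solution two), a minimal cron script might look like the following; the list slug, owner, and database details are placeholders, and the endpoint form mirrors the v1 API used in the question:
<?php
// Cron sketch: one request fetches recent tweets from *all* users on a list.
$url = 'http://api.twitter.com/1/lists/statuses.json'
     . '?slug=office-members&owner_screen_name=our_company&per_page=200';
$tweets = json_decode(file_get_contents($url), true);
$db   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $db->prepare(
    'REPLACE INTO tweets (id, screen_name, text, created_at) VALUES (?, ?, ?, ?)'
);
foreach ((array) $tweets as $t) {
    $stmt->execute(array(
        $t['id_str'],
        $t['user']['screen_name'],
        $t['text'],
        date('Y-m-d H:i:s', strtotime($t['created_at'])),
    ));
}
One cron run then costs a single API request, however many users are on the list.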

FQL Queries/API Calls make my page impossibly slow (PHP SDK)

Total page load times have been ranging from 4-8 seconds, and for some reason the page doesn't even begin to load until it's made the API call. So even though the major API calls are at the bottom of the page (and script), nothing else on the page will load beforehand.
What's the best way to go about this? The most problematic API call is
$result = $facebook->api('/fql?q=select+uid,+name,+is_app_user+from+user+where+uid+in+(select+uid2+from+friend+where+uid1=me())+and+is_app_user=1');
Which finds the list of the user's friends who use my app. In addition, I am making multiple API calls to get the names of users who are not that person's friends.
If it is impossible to speed up the API call, how can I at least get the rest of my page to load while FQL runs?
The Facebook API is very slow; you may want to make API calls through the JavaScript SDK and fall back to PHP only when absolutely necessary. AJAX is your best friend here. Making calls client-side only also has the benefit that potentially sensitive user data never touches your server, which is a good selling point for your site. People are very cautious these days.
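If some of the work must stay in PHP, one way to keep the page responsive is to move the slow query into a small endpoint that the already-rendered page fetches with AJAX; a rough sketch (friends.php is a hypothetical file name, and the app credentials are placeholders):
<?php
// friends.php (hypothetical): runs the slow FQL query out-of-band so the
// main page can render first and pull this JSON in via AJAX.
require_once 'facebook.php';
$facebook = new Facebook(array(
    'appId'  => 'APP_ID',     // placeholder
    'secret' => 'APP_SECRET', // placeholder
));
$result = $facebook->api('/fql?q=select+uid,+name,+is_app_user+from+user'
    . '+where+uid+in+(select+uid2+from+friend+where+uid1=me())'
    . '+and+is_app_user=1');
header('Content-Type: application/json');
echo json_encode($result);
The main page then renders immediately and requests this JSON once the DOM is ready, for example with jQuery's $.getJSON('friends.php', ...).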

Facebook OAuth Error: Application request limit reached

I'm trying to get facebook's example page working (again) which you can find here. I'm getting the following error:
Fatal error: Uncaught OAuthException: (#4) Application request limit reached thrown in C:\wamp\www\base_facebook.php on line 988
I've googled this and the problem seems to be easily fixed by using the steps outlined here. However, when I go to facebook.com/insights, my application isn't listed (I am logged in).
The weirder part is that when I go to my app via Developers > My apps, I can go to the page of my app and click "Insights". This brings me to the Insights page for my app... but the diagnostic section is nowhere to be found. Can anyone help?
The outlined way of finding out why this happens is:
Log into https://developers.facebook.com/apps/
The last app you've edited should already be loaded on the right side; if not, find your app on the left side and click the name.
Scroll down until you see the Insights section and click See All.
From the menu on the left side, select API > Activity & Errors.
The Facebook "Graph API Rate Limiting" docs says that an error with code #4 is an app level rate limit, which is different than user level rate limits. Although it doesn't give any exact numbers, it describes their app level rate-limit as:
This rate limiting is applied globally at the app level. Ads api calls are excluded.
Rate limiting happens real time on sliding window for past one hour.
Stats is collected for number of calls and queries made, cpu time spent, memory used for each app.
There is a limit for each resource multiplied by monthly active users of a given app.
When the app uses more than its allowed resources the error is thrown.
Error, Code: 4, Message: Application request limit reached
The docs also give recommendations for avoiding the rate limits. For app-level limits, they are:
Recommendations:
Verify the error code (4) to confirm the throttling type.
Do not make burst of calls, spread out the calls throughout the day.
Do smart fetching of data (important data, non duplicated data, etc).
Real-time insights: make sure API calls are structured in a way that lets you read insights for as many Page posts as possible with the minimum number of requests.
Don't fetch users feed twice (in the case that two App users have a specific friend in common)
Don't fetch all user's friends feed in a row if the number of friends is more than 250. Separate the fetches over different days. As an option, fetch first the app user's news feed (me/home) in order to detect which friends are more important to the App user. Then, fetch those friends feeds first.
Consider to limit/filter the requests by using the following parameters: "since", "until", "limit"
For page related calls use realtime updates to subscribe to changes in data.
Field expansion allows you to "join" multiple graph queries into a single call.
Use ETags to check whether the data you are querying has changed since the last check.
For page management developers who do not have a massive user base: have the admins of the page accept the app, to increase the number of users.
Finally, the docs give the following informational tips:
Batching calls will not reduce the number of api calls.
Making parallel calls will not reduce the number of api calls.
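As an illustration of the "since"/"until"/"limit" and field-expansion recommendations above, a single time-bounded, field-expanded request like this (the page ID and token are placeholders) can replace several separate calls:
<?php
// Sketch: one field-expanded, time-bounded call instead of several calls.
$url = 'https://graph.facebook.com/PAGE_ID/posts'
     . '?fields=' . urlencode('message,created_time,comments.limit(5){message}')
     . '&since=' . strtotime('-7 days')
     . '&until=' . time()
     . '&limit=25'
     . '&access_token=TOKEN';
$posts = json_decode(file_get_contents($url), true);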
Even if a GET request to one of the FB Graph API endpoints does not require an access_token, that does not mean you should omit it from the request parameters. If you follow the FB documentation and leave out the access_token, then on Facebook's side the requests are attributed to your server machine, so the limit (whatever the exact amount is) can be reached very easily. If, however, you put a user access token into the request (&access_token=XXXXXX), the requests are attributed to that specific user, so the limit is hardly ever reached. You can test this with a simple script that makes 1000 requests with and without a user access_token.
NOTE: an FB app access token will not be sufficient, as you will face the same problem; the requests will be attributed to the app access_token, which is effectively the same as making requests without an access_token.
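In PHP terms the difference is a single query parameter; a trivial sketch (PAGE_ID and USER_TOKEN are placeholders):
<?php
// Without a token: the request counts against your server's/app's bucket.
$anon = json_decode(file_get_contents(
    'https://graph.facebook.com/PAGE_ID'), true);
// With a user token: the request counts against that specific user instead.
$authed = json_decode(file_get_contents(
    'https://graph.facebook.com/PAGE_ID?access_token=USER_TOKEN'), true);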

Multiple Twitter users' feeds on one site without reaching the rate limit

I have a large number of Twitter users I wish to syndicate onto a website using PHP, caching the tweets in MySQL. However, I seem to be stumped by the rate-limit problem whenever I access the API. Every request I make for every user seems to count as a request, which stands to reason.
I notice other sites* doing this exact thing successfully. How are they getting around this? Are they simply whitelisted, or is there a technique I'm missing?
*http://www.twackle.com/NFL/Aaron-Rodgers_1/tweets
The streaming API is what you are looking for, and more specifically, the filter method. Filter, at its least-privileged level, will allow you to follow 5,000 users in realtime, without them having to authorize your app, and you can track up to 400 keywords using this method as well.
Now, if you want historical tweets as well, you will have to pull those from the REST API (the streaming API's count parameter doesn't really help here). Since you can only retrieve the last 3200 tweets for a user via the REST API, you can backfill pretty much all available tweet history with 16 calls to statuses/user_timeline, passing a count parameter of 200 and paging accordingly:
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=2
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=3
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=4
With your 350 calls per hour per single Twitter account, you could backfill approximately 22 full user timelines per hour.
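A backfill loop along those lines might look like this (the screen name comes from the example URLs above; error handling and pacing between calls are omitted):
<?php
// Sketch: backfill up to 3200 historical tweets (16 pages x 200) per user.
$screenName = 'barackobama';
$all = array();
for ($page = 1; $page <= 16; $page++) {
    $url = 'http://api.twitter.com/1/statuses/user_timeline.json'
         . '?screen_name=' . urlencode($screenName)
         . '&count=200&page=' . $page;
    $batch = json_decode(file_get_contents($url), true);
    if (empty($batch)) {
        break; // the user has fewer than 3200 tweets
    }
    $all = array_merge($all, $batch);
}
// 16 calls per user; at 350 authenticated calls/hour that is ~22 users/hour.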
On the implementation side, you'd probably be interested in Phirehose, a streaming API client interface for PHP.
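A minimal Phirehose consumer, following the pattern in that project's README (credentials and user IDs are placeholders), might look like:
<?php
// Sketch: a Phirehose filter-method consumer following a set of user IDs.
require_once 'Phirehose.php';
class TweetCollector extends Phirehose
{
    // Phirehose calls this for every status received on the stream.
    public function enqueueStatus($status)
    {
        $data = json_decode($status, true);
        if (isset($data['text'])) {
            // In a real app, insert into MySQL here instead of printing.
            echo $data['user']['screen_name'], ': ', $data['text'], "\n";
        }
    }
}
$consumer = new TweetCollector('username', 'password', Phirehose::METHOD_FILTER);
$consumer->setFollow(array(12345, 67890)); // numeric Twitter user IDs
$consumer->consume(); // blocks forever, reconnecting as needed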
Try authenticating first, before fetching the tweets; that should increase the rate limit.
A simple method of combining multiple user timelines is to create a Twitter list and use GET /:user/lists/:id/status. That single API request will return the most recent tweets from all users on the list.
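A sketch of that single list call, with a short file cache in front of it (the path segments and five-minute TTL are placeholders):
<?php
// Sketch: GET /:user/lists/:id/status returns recent tweets from every
// member of the list; the small cache keeps page views off the API.
$cache = '/tmp/list_tweets.json';
if (!file_exists($cache) || time() - filemtime($cache) > 300) {
    $url = 'http://api.twitter.com/1/youruser/lists/yourlist/statuses.json';
    file_put_contents($cache, file_get_contents($url));
}
$tweets = json_decode(file_get_contents($cache), true);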
