API to fetch Twitter feeds via cron - PHP

We have developed a product that is used for employee engagement. It provides a feature that shows tweets posted by members of your office if they have authorised the site.
The fetching of tweets is done by a periodic cron that runs at a regular interval of about 15 minutes. This cron finds all the users who have authorised the site's app
and requests their tweets from Twitter. For every user, one request is sent to Twitter.
Currently the system is using the REST API (http://api.twitter.com/1/statuses/user_timeline.xml?user_id='xxxxxx'), which limits the number of requests to 150 per hour.
We cannot make authenticated requests, as that requires the user to authorise the call every time, which is not possible when the requests are made by cron. So, with just 150 requests
and the cron running four times an hour, it is possible to fetch only 35-40 users' data, which cannot meet our requirements.
We have also explored the option of the Site Streaming API, but it requires a persistent connection to be established with Twitter, which would be difficult while using cron. Another concern with the Site Streaming API is that it is in beta and the website has to be whitelisted.
Kindly assist us in selecting the best possible alternative that would help us meet the above-mentioned objective.

There are two solutions.
Create an account that follows the users who have authorised your app. Then simply retrieve that account's home timeline.
or
Place the users in a "List". Then make a call to lists/statuses, as sketched below.
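For instance, the list approach could be polled from the cron roughly like this (a minimal sketch against the v1 API the question already uses; the owner screen name, list slug, and per_page value are hypothetical placeholders):

<?php
// One call to lists/statuses returns recent tweets from every member
// of the list, instead of one call per user.
$url = 'http://api.twitter.com/1/lists/statuses.json'
     . '?owner_screen_name=youraccount&slug=office-members&per_page=100';

$tweets = json_decode(file_get_contents($url), true);

foreach ((array) $tweets as $tweet) {
    // Group tweets by author so each employee's feed can be shown separately.
    $author = $tweet['user']['screen_name'];
    echo $author . ': ' . $tweet['text'] . "\n"; // in practice: store in your DB
}

A single request covers the whole list, so the 150-requests-per-hour limit stops scaling with the number of users.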

I am not 100% certain why you cannot stay logged in via cron. However, as you are using PHP, can I suggest that you look at https://github.com/jmathai/twitter-async/blob/master/EpiTwitter.php, which handles the oAuth authentication for you, does what you want, and closes the request.
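A rough sketch of what that cron script might look like with EpiTwitter (the keys, tokens, and user records are hypothetical; the get() call follows the twitter-async README of the time):

<?php
// Cron script: fetch each authorised user's timeline with their stored
// oAuth tokens, using the twitter-async (EpiTwitter) library.
require_once 'EpiCurl.php';
require_once 'EpiOAuth.php';
require_once 'EpiTwitter.php';

$consumerKey    = 'YOUR_CONSUMER_KEY';    // hypothetical app credentials
$consumerSecret = 'YOUR_CONSUMER_SECRET';

// Tokens captured once, when each user authorised the app, then stored.
$users = array(
    array('id' => 123, 'token' => 'STORED_TOKEN', 'secret' => 'STORED_SECRET'),
);

foreach ($users as $user) {
    $twitterObj = new EpiTwitter($consumerKey, $consumerSecret,
                                 $user['token'], $user['secret']);
    $timeline = $twitterObj->get('/statuses/user_timeline.json',
                                 array('user_id' => $user['id'], 'count' => 50));
    foreach ($timeline as $tweet) {
        echo $tweet->text . "\n"; // in practice: save to your database
    }
}

Because the tokens are stored, no user has to re-authorise anything when the cron fires.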
If you can't do this with cron, use serviceUptime.com to call a PHP script, although the maximum time the script can run is 35 seconds, so don't try to pull all the Lance Armstrong tweets back at once. HTH.

Related

Display online users as soon as they log in

I could make an AJAX request that displays the online users to each user. The request is done every 5-10 seconds or so. The problem is, isn't that overloading my server too much? Is there a way to do it so that the list of online users is updated for everyone else instantly?
If you are looking for a system that can be used to indicate "Presence" events (Join/Leave/Is Typing/Geo/Lat/Long/etc) in real-time, you should check out PubNub:
http://www.pubnub.com/
Of specific interest to you would be the publish, subscribe, presence, and state APIs.
Both PHP and JS client SDKs are available:
https://github.com/pubnub/php
http://www.pubnub.com/docs/javascript/javascript-sdk.html
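As a rough illustration, publishing a presence event from PHP might look like this (a sketch only; the keys and channel name are hypothetical, and the call shape follows the PubNub PHP SDK of that era):

<?php
require_once 'Pubnub.php'; // from the pubnub/php SDK

// Hypothetical keys from your PubNub account.
$pubnub = new Pubnub('your_publish_key', 'your_subscribe_key');

// Announce that a user has come online; subscribed browsers receive this
// push instantly, instead of polling your server every 5-10 seconds.
$pubnub->publish(array(
    'channel' => 'online_users', // hypothetical channel name
    'message' => array('user' => 'alice', 'event' => 'join'),
));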
geremy

Twitter API Rate Limit for Multiple Users

I am writing a PHP-based application using the Twitter API. Up until now I've been using the REST API via a GET request on a PHP page. However, as my app scales, I can easily see it going over the 150 requests-per-hour limit. Here's why:
I have categories of topics, each of which periodically polls the Twitter API for tweets around a topic. For example, I have: mysite.com/cars, mysite.com/trucks, etc. A user can go to either page. While he is on the page, live refreshing updates are pulled from Twitter by making an AJAX call to a PHP page I've set up. The PHP page determines which category the user is coming from (cars, trucks), polls Twitter for search results, then returns the JSON to the category page. This sounds confusing, but there are a number of unrelated reasons I need to have the intermediate PHP page.
The problem is that since the PHP page is making the requests, it will eat up the rate limit very quickly (imagine if there were 20 categories instead of just cars and trucks). I can't make a single call with multiple parameters because it would combine multiple categories of tweets and I'd have no way to separate them. I can cache results, but if I did, the more categories I add, the longer each would have to go between API calls.
So how can I approach this problem? I looked at the streaming API, but it's only for oAuth'd users and I don't want my users to have to log in to anything. Can I use the stream on the PHP page and then just make continuous requests each time the category page polls the PHP page? Thanks for any help!
a) You don't have to use your website users' oAuth credentials in the streaming API - just your own:
get them at dev.twitter.com and hardcode them. Your users won't know there is any oAuth going on backstage.
b) Don't use anonymous requests (150 per IP per hour); use oAuth requests (350 per token per hour). You don't have to ask your users to sign in - just sign in a few (one is sufficient to start) of your own private Twitter accounts. If you don't want to build Twitter login functionality, you can get credentials for your Twitter account for your Twitter application at dev.twitter.com.
c) As @Cheeso mentioned - cache! Don't let every page load make a Twitter request.
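A minimal file-based cache along the lines of (c) might look like this (a sketch assuming the search endpoint of that era; the TTL and cache path are arbitrary choices):

<?php
// Serve cached Twitter search results per category; only hit the API
// when the cached copy is older than the TTL.
function getCategoryTweets($category, $ttl = 60)
{
    $cacheFile = sys_get_temp_dir() . '/tweets_' . md5($category) . '.json';

    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile); // fresh enough: no API call
    }

    $json = file_get_contents('http://search.twitter.com/search.json?q=' . urlencode($category));
    if ($json !== false) {
        file_put_contents($cacheFile, $json);
        return $json;
    }

    // The API call failed: fall back to a stale cache if one exists.
    return file_exists($cacheFile) ? file_get_contents($cacheFile) : '[]';
}

// The AJAX endpoint simply echoes the cached JSON.
header('Content-Type: application/json');
echo getCategoryTweets($_GET['category']);

With a 60-second TTL, each category costs at most 60 API calls per hour no matter how many visitors are polling it.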

PHP Twitter home timeline

I'm making a PHP server/page which is supposed to capture a user's Twitter feed and then provide it in JSON format to another application and/or mobile device.
Twitter already provides its data in JSON format by appending .json to the timeline URL. But this is limited to 150 requests/hour, which can be a problem on a shared hosting server.
I've been trying to use the twitteroauth PHP library with API keys. Before I can start communicating with the API, I always need to sign in with a Twitter account. Using the API this way is limited to 350 requests/hour.
Is there a way to use the library to capture the timeline without needing to log in?
Or what would be a better way to achieve my goal of creating a PHP page that provides the timeline on request?
If I understand the question correctly, the problem is that you make too many requests to the Twitter API endpoint that doesn't require login. In that case, if you don't want to use the API that requires login, I guess you could implement some caching. Let your server run a cron every minute that checks the Twitter API for new tweets, and store the tweets in a database or a text file.
Then, when a user requests your page for JSON, you read from your cache instead of going straight to the Twitter API every time. That way you will save a lot of traffic between your server and Twitter, and you will still be very close to real time when it comes to up-to-date tweets, since with 150 requests/hour you could update your cache every 30 seconds or so.
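The cron half of that setup could be as small as this (a sketch; the cache path and screen name are hypothetical):

<?php
// cron_fetch.php -- run from cron, e.g. every minute:
//   * * * * * php /path/to/cron_fetch.php
// Fetches the timeline once and caches it to disk.
$cacheFile = '/path/to/cache/timeline.json'; // hypothetical path

$json = file_get_contents('http://api.twitter.com/1/statuses/user_timeline.json'
                        . '?screen_name=youraccount&count=50');
if ($json !== false) {
    file_put_contents($cacheFile, $json);
}

The public page then just sends a Content-Type: application/json header and echoes file_get_contents($cacheFile), so client requests never touch Twitter directly.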

Multiple twitter users' feed on one site without reaching the rate limit

I have a large number of Twitter users I wish to syndicate onto a website using PHP, caching the tweets in MySQL. However, I seem to be stumped by the rate-limit problem whenever I access the API. Every request I make for every user seems to count as a request, which stands to reason.
I notice other sites* doing this exact thing successfully. How are they getting around this? Are they simply whitelisted, or is there a technique I'm missing?
*http://www.twackle.com/NFL/Aaron-Rodgers_1/tweets
The streaming API is what you are looking for, and more specifically, the filter method. Filter, at its least-privileged level, will allow you to follow 5,000 users in realtime, without them having to authorize your app, and you can track up to 400 keywords using this method as well.
Now, if you want historical tweets as well, you will have to pull those from the REST API (the streaming API's count parameter doesn't really help here). Since you can only retrieve the last 3200 tweets for a user via the REST API, you can pretty much backfill all available tweet history with 16 calls to statuses/user_timeline, by passing a count parameter value of 200 and paging accordingly:
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=2
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=3
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=4
With your 350 calls per hour per single Twitter account, you could backfill approximately 22 full user timelines per hour.
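A minimal backfill loop along those lines (a sketch against the v1 endpoint shown above, with no rate-limit handling or error recovery):

<?php
// Backfill up to 3200 historical tweets for one user: 200 tweets per
// page, across at most 16 pages.
$screenName = 'barackobama'; // the example account from above
$allTweets = array();

for ($page = 1; $page <= 16; $page++) {
    $json = file_get_contents('http://api.twitter.com/1/statuses/user_timeline.json'
                            . '?screen_name=' . urlencode($screenName)
                            . '&count=200&page=' . $page);
    if ($json === false) {
        break; // request failed; stop paging
    }

    $tweets = json_decode($json, true);
    if (empty($tweets)) {
        break; // no more history available
    }

    $allTweets = array_merge($allTweets, $tweets);
}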
On the implementation side, you'd probably be interested in Phirehose, a streaming API client interface for PHP.
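With Phirehose, a bare-bones filter consumer could look roughly like this (a sketch following the library's README of the time; the credentials and user IDs are hypothetical):

<?php
require_once 'Phirehose.php'; // from the Phirehose library

// Minimal filter-method consumer: follow a set of user IDs in realtime.
class FollowConsumer extends Phirehose
{
    // Phirehose calls this once per received status (a raw JSON string).
    public function enqueueStatus($status)
    {
        $data = json_decode($status, true);
        if (isset($data['text'])) {
            echo $data['user']['screen_name'] . ': ' . $data['text'] . "\n";
            // In practice: insert into your MySQL cache instead of printing.
        }
    }
}

$consumer = new FollowConsumer('username', 'password', Phirehose::METHOD_FILTER);
$consumer->setFollow(array(12345, 67890)); // numeric Twitter user IDs
$consumer->consume(); // blocks forever: run as a daemon, not a cron job

Note that consume() blocks indefinitely, so this belongs in a long-running process rather than a cron job.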
Try to authenticate first, before getting the tweets; that should increase the rate limit.
A simple method of combining multiple user_timelines is to create a Twitter list and use GET /:user/lists/:id/status. That single API request will return the most recent tweets from all users on the list.
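For example (the list owner and slug here are hypothetical):
http://api.twitter.com/1/lists/statuses.json?owner_screen_name=youraccount&slug=nfl-players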

Facebook API Responses Very Slow (oAuth)

I am doing some benchmark testing on my web app and notice that the responses from Facebook's API are a lot slower than Twitter's.
** For the record, I am using the twitter-async library for Twitter API integration and Facebook's own library here
With the Twitter library I can save an oAuth token & secret, and I then use these to create an instance and make calls. Simple. For Facebook, unless I ask for the offline_access permission, I must store an oAuth code and recreate an oAuth access token each time the user logs into my app.
Given the above I can:
Retrieve a Twitter users timeline in 0.02 seconds.
Get a FB oAuth Access Code in 1.16 seconds, then I can get the users details in 2.31 seconds, totalling 3.47 seconds to get the users details.
These statistics come from using the functions Facebook provides in their PHP API library. I also tried implementing my own cURL functions to get this information via a request, and the results are not much better.
Is this the same kind of response times others are getting using the Facebook API?
Besides requesting offline permission and storing the permanent access token, how else can I speed up these requests, is the problem on my end or Facebooks?
Thanks,
Chris
I also have the experience that the Facebook API is quite slow. I believe the Facebook PHP API does not do much more than wrap around cURL in the case of API calls, so it makes sense that this didn't improve the speeds.
I work on a canvas page, which means that for existing users I get an access token and fb_UID as he/she comes in. At first, I did a /me graph call and sometimes a /me/friends call. The first takes about 0.6 seconds, the second usually a bit more. So in that case I can (to some extent) confirm your findings.
That's why I've now switched to storing the important stuff locally and updating it only when needed (via the real-time update API). Basically, I don't need any API calls during 'normal' operation.
I realize you are probably integrating FB on your own page, perhaps use a bit more info than just name, fb_UID & friends, and that this solution doesn't totally answer your question. But perhaps it can still function as a small piece of the puzzle ;)
I am looking forward to other perspectives on this as well!
My application calls multiple URLs from Facebook. It does take some time :/
This is why I decided to write a function which stores the results in $_SESSION, along with a timestamp to check whether the data is too old, so I can reuse them later.
This doesn't solve the actual problem; it just saves you having to keep fetching the data.
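Something like this, for instance (a sketch; the cache key scheme and TTL are arbitrary, and the $facebook->api() call is the classic one from Facebook's PHP SDK):

<?php
session_start();

// Return a cached Facebook API result from the session, refetching only
// when the stored copy is older than $ttl seconds.
function cachedGraphCall($facebook, $path, $ttl = 300)
{
    $key = 'fb_cache_' . md5($path);

    if (isset($_SESSION[$key]) && (time() - $_SESSION[$key]['time']) < $ttl) {
        return $_SESSION[$key]['data']; // still fresh: no API round-trip
    }

    $data = $facebook->api($path); // e.g. '/me' or '/me/friends'
    $_SESSION[$key] = array('data' => $data, 'time' => time());

    return $data;
}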
What I like to do for the end-user experience is forward them to a page with a loading .gif, then have JavaScript request the page that actually fetches the data. That way, the user stays on a loading page with a nice gif to stare at until the next page is ready.
