Facebook Graph API Comment Count Suddenly Stopped Working - PHP

Our site uses our own comment system (simple PHP/MySQL) and also the Facebook comments plugin. I would like to be able to add the comment counts of each together to display a single total count of comments from both. Seems simple enough.
Months ago, I got this working. Then it suddenly stopped working. This morning, I found a new way to do it. Got it working on one page, and by the time I had added the code to all the pages on which we have comments, it was no longer working.
I am pulling my hair out trying to get this working, having virtually zero understanding of JSON. The FB API explorer gives me an error about auth tokens, but doing what I see recommended has no effect (i.e. creating a new Facebook app and including the block of auth code they provide).
This is what was working fine at first this morning:
// The article URL whose Facebook comment count we want
$fbcounturl = 'http://www.catalystathletics.com/articles/article.php?articleID=1902';
// Ask the Graph API for the share data; the id parameter should be URL-encoded
$fbjsonurl = "https://graph.facebook.com/v2.1/?fields=share{comment_count}&id=" . urlencode($fbcounturl);
$fbdata = file_get_contents($fbjsonurl);
$fbarray = json_decode($fbdata, true);
$fbcomcount = $fbarray['share']['comment_count'];
print($fbcomcount);
Then I could simply add $fbcomcount to the $comCount from our db.
If I just browse to the URL, I get the JSON info fine:
{
  "share": {
    "comment_count": 3
  },
  "id": "http://www.catalystathletics.com/articles/article.php?articleID=1902"
}
But the $fbcomcount is empty.
Here is an example of a page that would use this -
http://www.catalystathletics.com/article/1902/Jumping-Forward-in-the-Snatch-or-Clean-Error-Correction/#comments
Any help would be GREATLY appreciated.

Ran into the same issue recently; the Facebook comment count simply stopped working. I eventually tracked down the error in the returned JSON response, telling me Error #4: Application request limit reached.
{"error":{"message":"(#4) Application request limit
reached","type":"OAuthException","is_transient":true,"code":4,"fbtrace_id":"EUNAVRNgnFu"}}`
Here is a good, detailed response on Facebook Open Graph API limits I found elsewhere:
The Facebook API limit isn't really documented, but apparently it's something like: 600 calls per 600 seconds, per token & per IP. As the site is restricted, quoting the relevant part:
After some testing and discussion with the Facebook platform team, there is no official limit I'm aware of or can find in the documentation. However, I've found 600 calls per 600 seconds, per token & per IP to be about where they stop you. I've also seen some application based rate limiting but don't have any numbers.
As a general rule, one call per second should not get rate limited. On the surface this seems very restrictive but remember you can batch certain calls and use the subscription API to get changes.
As you can access the Graph API on the client side via the JavaScript SDK, I think that if you issue your requests from the client, you won't hit any application limit, as it's the user (each one with a unique ID) who's fetching data, not your application server (a single ID).
This may mean a huge refactor if everything you do goes through a server. But it seems like the best solution if you have that many requests (and it'll take some load off your server).
Otherwise, you can try batched requests, but I guess you're already going this way if you have big traffic.
If nothing of this works, according to the Facebook Platform Policy you should contact them.
If you exceed, or plan to exceed, any of the following thresholds please contact us as you may be subject to additional terms: (>5M MAU) or (>100M API calls per day) or (>50M impressions per day).

Related

How to display Facebook public page feed in PHP website, without reaching Graph API quota limit?

I am creating a custom Facebook Feed plugin for a custom CMS, to display the 10 latest posts of our client's public Facebook page in a fancy way, with attachments. But we have issues with the very limited quota: it often goes beyond 100% and the plugin crashes.
So at my company we created a Facebook app with all the necessary authorisations to make Graph API requests, and it works well. But each page load on the frontend (where the Facebook feed is present) was using about 10% of the quota. So I implemented some caching and stored the attachments locally, and I was able to get down to 2% of the quota for each page load.
But it still means that 50 visits at the same time = 100% = over the limit!
So I'm kind of stuck with it, and don't know what the best practices are in this field.
The official Facebook docs say that the Graph API quota depends on the number of app users, but we don't want (or need) user login, since we just want to use the Graph API to display posts from public pages on our clients' websites.
Solved.
It appears that you can append ?limit=[X] to the /[page-id]/feed API call, even though the /feed documentation page does not state it clearly.
So my calls were fetching like... all the posts since the beginning, with all the attachments.
Adding ?limit=10 or lower to the /[page-id]/feed query solved my problem.
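In case it helps anyone, a sketch of what the limited call might look like (the page ID and access token are placeholders, not values from my setup):
// Fetch only the 10 most recent posts instead of the entire history
$pageId  = 'YOUR_PAGE_ID';   // placeholder
$token   = 'YOUR_APP_TOKEN'; // placeholder
$feedUrl = "https://graph.facebook.com/{$pageId}/feed?limit=10&access_token={$token}";
$feed    = json_decode(file_get_contents($feedUrl), true);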

Facebook OAuth Error: Application request limit reached

I'm trying to get Facebook's example page working (again), which you can find here. I'm getting the following error:
Fatal error: Uncaught OAuthException: (#4) Application request limit reached thrown in C:\wamp\www\base_facebook.php on line 988
I've googled this and the problem seems to be easily fixed by using the steps outlined here. However, when I go to facebook.com/insights, my application isn't listed (I am logged in).
The weirder part is that when I go to my app via Developers > My apps, I can go to the page of my app and click "Insights". This brings me to the Insights page for my app... but the diagnostic section is nowhere to be found. Can anyone help?
The outlined way of finding out why this happens is:
Log into https://developers.facebook.com/apps/
The last app you've edited should already be loaded on the right side; if not, find your app on the left side and click the name.
Scroll down until you see the Insights section and click See All.
From the menu on the left side, select API > Activity & Errors.
The Facebook "Graph API Rate Limiting" docs says that an error with code #4 is an app level rate limit, which is different than user level rate limits. Although it doesn't give any exact numbers, it describes their app level rate-limit as:
This rate limiting is applied globally at the app level. Ads API calls are excluded.
Rate limiting happens in real time on a sliding window for the past one hour.
Stats are collected for the number of calls and queries made, CPU time spent, and memory used for each app.
There is a limit for each resource, multiplied by the monthly active users of a given app.
When the app uses more than its allowed resources, the error is thrown.
Error, Code: 4, Message: Application request limit reached
The docs also give recommendations for avoiding the rate limits. For app level limits, they are:
Recommendations:
Verify the error code (4) to confirm the throttling type.
Do not make bursts of calls; spread out the calls throughout the day.
Do smart fetching of data (important data, non-duplicated data, etc.).
For real-time insights, make sure API calls are structured in a way that you can read insights for as many Page posts as possible, with the minimum number of requests.
Don't fetch a user's feed twice (in the case that two app users have a specific friend in common).
Don't fetch all of a user's friends' feeds in a row if the number of friends is more than 250. Separate the fetches over different days. As an option, fetch the app user's news feed (me/home) first in order to detect which friends are more important to the app user. Then fetch those friends' feeds first.
Consider limiting/filtering the requests by using the following parameters: "since", "until", "limit".
For page-related calls, use real-time updates to subscribe to changes in data.
Field expansion allows you to "join" multiple graph queries into a single call.
Use ETags to check whether the data you are querying has changed since the last check (see the sketch after this list).
For page-management developers who do not have a massive user base, have the admins of the page accept the app to increase the number of users.
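On the ETag point, here is a rough sketch of a conditional request in PHP with cURL. The token, stored ETag, and cached data are placeholders for whatever your own storage provides; this is an assumption about how you'd wire it up, not Facebook's required usage:
// Placeholders: load these from your own cache/config
$token      = 'USER_ACCESS_TOKEN';
$storedEtag = '';   // ETag saved from the previous response
$cachedData = null; // body saved from the previous response

// Send the previously stored ETag; the server returns 304 if nothing changed
$ch = curl_init('https://graph.facebook.com/v2.1/me/feed?access_token=' . urlencode($token));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('If-None-Match: ' . $storedEtag));
$body = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
if ($code === 304) {
    $data = $cachedData;              // unchanged: reuse the cached copy
} else {
    $data = json_decode($body, true); // changed: update your cache and ETag
}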
Finally, the docs give the following informational tips:
Batching calls will not reduce the number of api calls.
Making parallel calls will not reduce the number of api calls.
If you make a GET request to one of the FB Graph API endpoints that does not require an access_token, that does not mean you should not include one in the request parameters. If you do as the FB documentation says and omit the access_token, then on FB's side the request is registered against your server machine, so the limit (whatever amount it is exactly) can be reached very easily. If, however, you put the user access token into the request (&access_token=XXXXXX), then requests are registered against that specific user, so the limit will hardly ever be reached. You can test it with a simple script that makes 1000 requests with and without a user access_token.
NOTE: an FB app access token will not be sufficient, as you will face the same problem: requests will be registered against the app access_token, which is just like making requests without an access_token.
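A quick illustration of the difference (the user token variable is assumed to come from your own login flow, and the page name is just an example):
// Same public Graph call, with and without a user access token
$userAccessToken = 'USER_ACCESS_TOKEN'; // placeholder from your login flow
$base = 'https://graph.facebook.com/v2.1/cocacola?fields=name';

// Without a token: counted against your server/app as a whole
$anon = json_decode(file_get_contents($base), true);

// With a user token: counted against that individual user instead
$authed = json_decode(file_get_contents($base . '&access_token=' . urlencode($userAccessToken)), true);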

Multiple twitter users' feed on one site without reaching the rate limit

I have a large number of Twitter users I wish to syndicate onto a website using PHP, caching the tweets in MySQL. However, I seem to be stumped by the rate-limit problem whenever I access the API. Every request I make for every user seems to count as a request, which stands to reason.
I notice other sites* doing this exact thing successfully. How are they getting around this, are they simply whitelisted, or is there a technique I'm missing?
*http://www.twackle.com/NFL/Aaron-Rodgers_1/tweets
The streaming API is what you are looking for, and more specifically, the filter method. Filter, at its least-privileged level, will allow you to follow 5,000 users in realtime, without them having to authorize your app, and you can track up to 400 keywords using this method as well.
Now, if you want historical tweets as well, you will have to pull those from the REST API (the streaming API's count parameter doesn't really help here), but since you can only retrieve the last 3200 tweets for a user via the REST API, you can pretty much backfill all available tweet history with 16 calls to statuses/user_timeline by passing in a count parameter value of 200 and paging accordingly.
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=2
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=3
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=barackobama&count=200&page=4
With your 350 calls per hour per single Twitter account, you could backfill approximately 22 full user timelines per hour.
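A rough backfill loop along those lines, using the same v1 endpoint as the URLs above, with error handling kept minimal as a sketch:
// Page through up to 3200 historical tweets: 16 pages of 200
$screenName = 'barackobama';
$allTweets = array();
for ($page = 1; $page <= 16; $page++) {
    $url = "http://api.twitter.com/1/statuses/user_timeline.json"
         . "?screen_name={$screenName}&count=200&page={$page}";
    $json = @file_get_contents($url);
    if ($json === false) break;        // error or rate limit hit
    $tweets = json_decode($json, true);
    if (empty($tweets)) break;         // no more history available
    $allTweets = array_merge($allTweets, $tweets);
}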
On the implementation side, you'd probably be interested in Phirehose, a streaming API client interface for PHP.
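A bare-bones consumer based on Phirehose's documented usage (credentials and user IDs are placeholders; what you do with each status is up to you):
// Subclass Phirehose and handle each status as it arrives
require_once 'Phirehose.php';
class FeedConsumer extends Phirehose {
    public function enqueueStatus($status) {
        $tweet = json_decode($status, true);
        // TODO: insert $tweet into your MySQL cache here
    }
}
$consumer = new FeedConsumer('twitter_user', 'password', Phirehose::METHOD_FILTER);
$consumer->setFollow(array(12345, 67890)); // numeric user IDs to follow
$consumer->consume();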
Try to authenticate first, before getting the tweets; that should increase the rate limit.
A simple method of combining multiple user_timelines is to create a Twitter list and use GET /:user/lists/:id/statuses. That single API request will return the most recent tweets from all users on the list.
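As a small illustration (the list owner and slug here are hypothetical):
// One call returns recent tweets from every user on the list
$json = file_get_contents('http://api.twitter.com/1/someuser/lists/my-feeds/statuses.json');
$tweets = json_decode($json, true);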

Facebook API Responses Very Slow (oAuth)

I am doing some benchmark testing on my web app and notice that the responses from Facebook's API are a lot slower than Twitter's.
** For the record, I am using the twitter-async library for Twitter API integration and Facebook's own library here.
With the Twitter library I can save an OAuth token & secret, and I then use these to create an instance and make calls. Simple. For Facebook, unless I ask for the offline_access permission, I must store an OAuth code and recreate an OAuth access token each time the user logs into my app.
Given the above I can:
Retrieve a Twitter user's timeline in 0.02 seconds.
Get an FB OAuth access code in 1.16 seconds, then get the user's details in 2.31 seconds, totalling 3.47 seconds to get the user's details.
These statistics are from using functions Facebook has provided in their PHP API library. I also tried implementing my own CURL functions to get this information via a request and the results are not much better.
Is this the same kind of response times others are getting using the Facebook API?
Besides requesting offline permission and storing the permanent access token, how else can I speed up these requests, is the problem on my end or Facebooks?
Thanks,
Chris
I also have the experience that the Facebook API is quite slow. I believe the Facebook PHP library does little more than wrap cURL for API calls, so it makes sense that your own cURL functions didn't improve the speeds.
I work on a canvas page, which means that for existing users, I get an access token and fb_UID as they come in. At first, I did a /me graph call and sometimes a /me/friends. The first takes about 0.6 secs, the second usually a bit more. So in that case I can (to some extent) confirm your findings.
That's why I've now switched to storing important stuff locally and updating it only when needed (real time update API). Basically, I don't need any API calls during 'normal' operation.
I realize you are probably integrating FB on your own page, and perhaps use a bit more info than just name, fb-UID & friends, and that this solution is not totally answering your question. But perhaps it can still function as a small piece of the puzzle ;)
I am looking forward to other perspectives on this as well!
My application calls multiple URLs from Facebook. It does take some time :/
This is why I decided to write a function which stores the results in $_SESSION so I can use it again later, along with a timestamp to see if the data is too old.
This doesn't solve the actual problem, it just saves you having to keep fetching it.
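Something like this minimal sketch of the idea (the fetcher function is a hypothetical stand-in for whatever API call you make):
// Cache an API result in the session for $ttl seconds
session_start();
function cached_api_call($key, $ttl = 600) {
    if (isset($_SESSION[$key]) && (time() - $_SESSION[$key]['time']) < $ttl) {
        return $_SESSION[$key]['data'];   // still fresh, skip the API call
    }
    $data = fetch_facebook_data($key);    // hypothetical fetcher
    $_SESSION[$key] = array('data' => $data, 'time' => time());
    return $data;
}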
What I like to do for the end-user experience is forward them to a page with a loading .gif, then have JavaScript request the page that actually fetches the data. That way, the user remains on a loading page with a nice gif to stare at until the next page is ready.

Sporadic 400 responses from twitter API call

I am getting seemingly random 400 errors from a Twitter API call on my site. It doesn't seem to be rate limiting, as it doesn't appear to be time-based: while testing I did not experience it at all, whilst my designer (who had not been hitting refresh over and over) could not see the tweet at all, then suddenly could.
this is the call
$file = @file_get_contents('http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=mildfuzz&count=1');
Can Twitter be down in specific areas? It seems odd to get different results based on location.
It turned out the best way was to make sure I was always checking my rate limit. No idea why, but this seems to alert Twitter to the fact that I have some limit left.
I am also now keeping a copy of each tweet in the database.
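If it's useful, the check I mean looks roughly like this against the old v1 endpoint (the response field name is as I remember it, so treat this as an assumption):
// Ask Twitter how many calls remain before fetching the timeline
$status = json_decode(@file_get_contents('http://api.twitter.com/1/account/rate_limit_status.json'), true);
if ($status && $status['remaining_hits'] > 0) {
    $file = @file_get_contents('http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=mildfuzz&count=1');
}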
