Facebook Friends Count? - php

What is the best way to count a Facebook user's friends...
I'm currently using (PHP):
$data = $facebook->api('/me/friends');
$friends_count = count($data['data']);
and it's very slow (about 2 seconds).

Querying the Facebook API sends a request to Facebook. Because it is an ordinary HTTP request, this probably takes most of the time, and there is usually no way around it. If you need the values more often, you should cache them somewhere:
if (file_exists($cacheFile)) {
    // Read the cached API response and decode it back into an array
    $data = json_decode(file_get_contents($cacheFile), true);
} else {
    $data = $facebook->api('/me/friends');
    // The API returns an array, so encode it before writing it to disk
    file_put_contents($cacheFile, json_encode($data));
}
$friends_count = count($data['data']);
Remember to update the cache file from time to time.
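For example, a minimal sketch of time-based expiry (the one-hour lifetime is just an assumption; pick whatever staleness your app tolerates):
$maxAge = 3600; // assumed cache lifetime of one hour
if (file_exists($cacheFile) && filemtime($cacheFile) > time() - $maxAge) {
    // The cache file exists and is fresh enough: reuse it
    $data = json_decode(file_get_contents($cacheFile), true);
} else {
    // The cache is missing or stale: call the API and rewrite the file
    $data = $facebook->api('/me/friends');
    file_put_contents($cacheFile, json_encode($data));
}
$friends_count = count($data['data']);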

If you are not processing the data on the server side, then instead of doing it in PHP you can fetch it with the JavaScript Graph API. It can fetch the data via AJAX, which won't affect your page load time.

Related

How to cache facebook graph api call

I've created a function to get the likes for my Facebook page using the Graph API. However, the rate limit keeps getting reached because the function is called on every request.
How would I cache this so it doesn't make the call every time?
The code I'm currently using is:
function fb_like_count() {
    $id = '389320241533001';
    $access_token = 'access token goes here';
    $json_url = 'https://graph.facebook.com/v3.2/' . $id . '?fields=fan_count&access_token=' . $access_token;
    $json = file_get_contents($json_url);
    $json_output = json_decode($json);
    if ($json_output->fan_count) {
        return like_count_format($json_output->fan_count);
    } else {
        return 0;
    }
}
There are many cache mechanisms in PHP that you can use, depending on your project size.
I would suggest checking out Memcached or Redis. These are in-memory caches that are very fast and would help you gain better performance.
You can read more about how to implement them in the Memcached and Redis documentation.
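As a rough illustration, a minimal sketch using the PHP Memcached extension, reusing the fb_like_count() function above; the server address, key name, and 15-minute TTL are assumptions:
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211); // assumed local Memcached server

$cacheKey = 'fb_fan_count'; // hypothetical key name
$fan_count = $memcached->get($cacheKey);
if ($fan_count === false && $memcached->getResultCode() === Memcached::RES_NOTFOUND) {
    // Cache miss: call the Graph API once, then keep the value for 15 minutes
    $fan_count = fb_like_count();
    $memcached->set($cacheKey, $fan_count, 60 * 15);
}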
The second and easier way is to use file caching. It works like this: you send a request to the Facebook API, and when the response is returned you save it to a file. Before making the next request, you first check whether the file already has fresh content; if it does, you return that directly to your application, otherwise you send the request to the Facebook API again.
A simple integration looks like this:
if (file_exists($facebook_cache_file) && (filemtime($facebook_cache_file) > (time() - 60 * 15))) {
    // Cache file is less than 15 minutes old (adjust the lifetime as you like)
    $response = file_get_contents($facebook_cache_file); // this holds the API data
} else {
    // Our cache is out of date, so load the data from the remote server
    // and save it over our cache for next time.
    $response = getFacebookData(); // get data from Facebook
    file_put_contents($facebook_cache_file, $response, LOCK_EX);
}
Anyway, I would suggest using a PHP library for file caching.
Below are some that might be interesting to look at:
https://github.com/PHPSocialNetwork/phpfastcache
https://symfony.com/doc/current/components/cache.html
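For instance, a minimal sketch with current versions of phpfastcache; the cache key and five-minute lifetime are assumptions:
require 'vendor/autoload.php';
use Phpfastcache\CacheManager;

$cache = CacheManager::getInstance('files');
$item = $cache->getItem('fb_fan_count'); // hypothetical cache key
if (!$item->isHit()) {
    // Cache miss: fetch from the API and keep the result for 5 minutes
    $item->set(fb_like_count())->expiresAfter(300);
    $cache->save($item);
}
$fan_count = $item->get();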

How to manage over 100k accesses to a PHP file on my VPS?

I have an app with over 500k members, and every user's device gets data from the server every hour.
All of my data is stored in a PHP file, and devices fetch it as JSON.
This is my simple PHP file:
<?php
$response = array();
header('Content-type: application/json');
$response["AppInf"] = array();
$product = array();
$product["apptitle"] = "string1";
$product["apps"] = "string2";
$product["apps2"] = "string3";
$product["apps4"] = "string4";
$product["idapp"] = "stringid";
array_push($response["AppInf"], $product);
$response["success"] = 1;
echo json_encode($response);
?>
But when more than 15k users access the server, my CPU load grows to 100%.
I have a good VPS with 64 GB RAM and a Xeon CPU.
Can anyone help me manage and fix this problem?
If your content is really static, as in your example, store it in a static file and use caching. If the content is the same for at least a group of users, then you only have to calculate the desired result once and store the data for later retrieval.
Consider using a reverse proxy like Varnish to move load from your web server to another server.
If possible, don't let all users fetch data at the same time: add some random offset to the time when the data is pulled, as sketched below.
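A minimal sketch of the static-file approach, assuming a script like the one above is run periodically (e.g. by a scheduled task) rather than on every request; the output path is an assumption:
<?php
// generate_feed.php: run periodically, NOT on every device request
$response = array();
$response["AppInf"] = array();

$product = array();
$product["apptitle"] = "string1";
$product["apps"] = "string2";
$product["idapp"] = "stringid";
array_push($response["AppInf"], $product);
$response["success"] = 1;

// Write the payload to a static file that the web server serves directly
file_put_contents('/var/www/html/appinf.json', json_encode($response), LOCK_EX);
?>
Devices then request appinf.json directly; serving a static file is far cheaper than executing PHP for each of 15k concurrent requests, and a random client-side delay of a few minutes would spread out the hourly spike.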

How to make my get Facebook like script loop every 3 seconds?

I am building a Facebook page like counter. I want to be able to get page like updates roughly every 2-3 seconds. The result will be sent to an SQL database and then to an Arduino. First, though, I need to get the likes from the desired page. I have the PHP script below that works great, but you have to reload the page every time to get an updated like count.
Would you be able to use a while loop with some sort of 2-3 second delay, or what would you recommend in this situation? P.S. I used Dan Bilzerian's FB page as an example because his likes just keep going up and up; it's ridiculous.
Thanks
function facebook_count($url) {
    // Query in FQL
    $fql = "SELECT like_count";
    $fql .= " FROM link_stat WHERE url = '$url'";
    $fqlURL = "https://api.facebook.com/method/fql.query?format=json&query=" . urlencode($fql);
    // Facebook response is in JSON
    $response = file_get_contents($fqlURL);
    return json_decode($response);
}
$fb = facebook_count('https://www.facebook.com/danbilzerianofficial');
// facebook like count
echo $fb[0]->like_count;
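A minimal polling sketch for a long-running CLI script, reusing the facebook_count() function above; save_to_db() is a hypothetical helper that writes to your SQL database, and polling this often may well hit API rate limits:
while (true) {
    $fb = facebook_count('https://www.facebook.com/danbilzerianofficial');
    if (isset($fb[0]->like_count)) {
        save_to_db($fb[0]->like_count); // hypothetical helper: store the count in SQL
    }
    sleep(3); // wait three seconds before polling again
}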

Google API request every 30 seconds

I'm using the Google Live Reporting APIs to retrieve active users and display the data inside a mobile application. From my application I'd like to make an HTTP request to a PHP script on my server, which is supposed to return the result.
However, I read in the Google docs that it's better not to request data from the APIs more often than every 30 seconds.
I'd prefer not to use a heavyweight approach such as a cron job that stores the value in my database. So I'd like to know if there's a way to cache the output of my PHP script and make it perform an API request only when the cache expires.
Is there any similar method to do that?
Another way could be to implement a very simple cache yourself:
$googleApiRequestUrlWithParameter; // this is the full URL of your request
$googleApiResponse = NULL;         // this is the response from the API

// Check whether a response is already present in our cache
if (isset($datacache[$googleApiRequestUrlWithParameter])) {
    $cacheResponse = $datacache[$googleApiRequestUrlWithParameter];
    // $cacheResponse[0] holds the timestamp of the cached data (30s or whatever you like)
    if (time() - $cacheResponse[0] < 30) {
        // The cached entry is still fresh
        $googleApiResponse = $cacheResponse[1];
    } else {
        // Otherwise remove it from your "cache"
        unset($datacache[$googleApiRequestUrlWithParameter]);
    }
}

// If we still do not have the response
if (!isset($googleApiResponse)) {
    // make the call to the Google API, put the response in $googleApiResponse, then:
    $datacache[$googleApiRequestUrlWithParameter] = array(time(), $googleApiResponse);
}
If your data is related to the user session, you can store $datacache in $_SESSION (http://www.php.net/manual/it/reserved.variables.session.php); otherwise define $datacache = array(); as a global variable.
There are a lot of ways of caching things in PHP; the simple/historic way to manage a cache in PHP is with APC: http://www.php.net/manual/book.apc.php
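For example, a minimal sketch with APCu, the maintained successor to APC; the key prefix and 30-second TTL mirror the snippet above, and fetchGoogleApi() is a hypothetical helper:
$cacheKey = 'google_api:' . $googleApiRequestUrlWithParameter;
$googleApiResponse = apcu_fetch($cacheKey, $hit);
if (!$hit) {
    // Cache miss: call the API, then keep the response for 30 seconds
    $googleApiResponse = fetchGoogleApi($googleApiRequestUrlWithParameter); // hypothetical helper
    apcu_store($cacheKey, $googleApiResponse, 30);
}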
Maybe I did not understand your question correctly.

Instagram max requests per hour (30) exceeded?

{
    "code": 420,
    "error_type": "OAuthRateLimitException",
    "error_message": "You have exceeded the maximum number of requests per hour. You have performed a total of 253 requests in the last hour. Our general maximum request limit is set at 30 requests per hour."
}
I just noticed that a client's website I am looking after has stopped showing its Instagram feed, so I loaded the feed URL straight into the browser and got the above error. I don't think there should have been 253 requests in an hour, but while Googling this problem I came across someone saying it happens when the API is logged into on every request. Sadly, I have "inherited" this code and haven't really worked with the Instagram API before, apart from fixing a different error on this same website.
The client's site is in WordPress, so I have wrapped the code that gets the images in a function:
function get_instagram($user_id = USERID, $count = 6, $width = 190, $height = 190) {
    $url = 'https://api.instagram.com/v1/users/' . $user_id . '/media/recent/?access_token=ACCESSTOKEN&count=' . $count;

    // Cache the results, as the Instagram API is slow
    $cache = './' . sha1($url) . '.json';
    if (file_exists($cache) && filemtime($cache) > time() - 60 * 60) {
        // If a cache file exists and it is newer than 1 hour, use it
        $jsonData = json_decode(file_get_contents($cache));
    } else {
        $jsonData = json_decode(file_get_contents($url));
        file_put_contents($cache, json_encode($jsonData));
    }

    $result = '<a style="background-image:url(/wp-content/themes/iwear/inc/img/instagram-background.jpg);" target="_BLANK" href="http://www.instagr.am" class="lc-box lcbox-4 instagram">' . PHP_EOL . '<ul>' . PHP_EOL;
    foreach ($jsonData->data as $key => $value) {
        $result .= "\t" . '<li><img src="' . $value->images->low_resolution->url . '" alt="' . $value->caption->text . '" data-width="' . $width . '" data-height="' . $height . '" /><div class="lc-box-inner"><div class="title"><h2>images</h2></div><div class="description">' . $value->caption->text . '</div></div></li>' . PHP_EOL;
    }
    $result .= '</ul></a>' . PHP_EOL;
    return $result;
}
But as I said, this code has stopped working. Is there any way I could optimize this so it actually works? I also notice there is mention of a cache in the (probably stolen) Instagram code, but it isn't actually caching, so that could also be part of the solution.
Thanks
Try registering a new client in Instagram and then change
$url = 'https://api.instagram.com/v1/users/' . $user_id . '/media/recent/?access_token=ACCESSTOKEN&count=' . $count;
to
$url = 'https://api.instagram.com/v1/users/' . $user_id . '/media/recent/?client_id=CLIENT_ID&count=' . $count;
where CLIENT_ID is the client ID of your newly created client.
