Facebook Pagination Limit with Graph API - php

This is my first post here.
I'm trying to query all places within my city limits (Winnipeg) using geo-location data and a radius of 15,000 meters.
This is working somewhat well with the query:
$request = new FacebookRequest(
    $session,
    'GET',
    '/search?type=place&limit=100&center=49.89542,-97.13853&distance=15000'
);
The pagination afterwards works once or twice but cuts me off after a variable 180-220 results. I noticed the pages returned all start from downtown (the center in the query string) and spread outward, but the results stop after only two city blocks.
Here is the loop I have in use for iterating through additional results:
$paging = $graphObject->asArray()['paging'];
$i = 0;
while (property_exists($paging, "next") && $i <= $times) {
    $next_url = $paging->next;
    $json = file_get_contents($next_url);
    $obj = json_decode($json);
    foreach ($obj->data as $page) {
        $page_ids[] = $page->id;
    }
    $paging = $obj->paging;
    $i++;
}
I have researched issues with API limits, but found nothing that comes even close to explaining a cap of only 200-ish results.
Is there possibly something I'm missing here?

Related

Need help decoding JSON from Riot API with PHP

As part of an assignment I am trying to pull some statistics from the Riot API (JSON data for League of Legends). So far I have managed to find a summoner ID (user ID) based on the summoner name, and I have filtered out the IDs of said summoner's previous (20) games. However, now I can't figure out how to get the right values from the JSON data. So this is where I'll show you my code, I guess:
$matchIDs is an array of 20 integers (game IDs)
for ($i = 1; $i <= 1; $i++) {
    $this_match_data = get_match($matchIDs[$i], $server, $api);
    $processed_data = json_decode($this_match_data, true);
    var_dump($processed_data);
}
As shown above, my for loop is set to a single iteration, as I'm just focusing on figuring out one match before continuing with all 20. The following is how I got the match IDs and the summoner IDs; I'll add that code here for comparison:
for ($i = 0; $i <= 19; $i++) {
    $temp = $data['matches'][$i]['matchId'];
    $matchIDs[$i] = json_decode($temp, true);
}
$data is the variable I get when I pull all the info from the JSON page, it's the same method I use to get $this_match_data in the first code block.
function match_list($summoner_id, $server, $api)
{
    $summoner_enc = rawurlencode($summoner);
    $summoner_lower = strtolower($summoner_enc);
    $curl = curl_init('https://' . $server . '.api.pvp.net/api/lol/' . $server . '/v2.2/matchlist/by-summoner/' . $summoner_id . '?api_key=' . $api);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($curl);
    curl_close($curl);
    return $result;
}
Now to the root of the problem: this is where I put the data I get from the site, so you can see what I am working with. Using the following code I can get the first value in that file, the match ID.
echo $processed_data['matchId'];
But I can't seem to lock down any other information from this .json file. I've tried keys like ['region'] instead of ['matchId'] with no luck, as well as numeric indexes like $processed_data[0], but nothing happens. This is just how I got the right info in the first examples, and I am really lost here.
OK, so I think I've figured it out myself. By adding the following to the code I can print out the JSON in a much more human-friendly way, and that should make the data much easier to handle.
echo "<pre>";
var_dump($processed_data);
echo "</pre>";
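For reference, once json_decode($json, true) has run, each level of nesting in the JSON is reached with one more array key (or numeric index for JSON arrays). A minimal sketch follows; the structure below is invented for illustration and is not the actual Riot match payload:

```php
<?php
// Hypothetical payload, loosely shaped like a match response.
$json = '{"matchId": 123, "region": "EUW", "participants": [{"championId": 45, "stats": {"kills": 7}}]}';

// true => nested JSON objects become nested associative arrays.
$processed_data = json_decode($json, true);

echo $processed_data['matchId'];                           // a top-level value
echo $processed_data['region'];                            // another top-level key
echo $processed_data['participants'][0]['championId'];     // numeric index into a JSON array
echo $processed_data['participants'][0]['stats']['kills']; // two levels of nesting
```

The var_dump() output inside <pre> tags shows exactly this key structure, which is how you work out the chain of keys to use.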

PHP/MySQL/JSON - Looping through all pages of JSON response

I am calling the Crunchbase API and it gives me a long response, so long that it spans multiple pages, which can be accessed by appending ?page=# to the API URL.
My question is: how do I write the script so that a single run goes through all of the available pages, without me having to change the page number each time I call it?
Simplified version of my code:
$url = "https://api.url.com/tags/?page=2";
$jsondata = file_get_contents($url);
$array = json_decode($jsondata,true);
var_dump($array);
foreach ($array as $key => $value) {
    mysql_query("INSERT INTO cbcompanies (`column1`) VALUES ('{$value['foo']}')", $con);
}
If you want to make multiple requests you have to use loops or explicitly get all the pages.
$numberOfPages = 100;
for ($i = 1; $i <= $numberOfPages; $i++) {
    $url = sprintf("https://api.url.com/tags/?page=%d", $i);
    // Rest of the code.
}
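If the total number of pages isn't known in advance, one variation is to keep fetching until a page comes back empty (this assumes the API simply returns an empty array once you request a page past the last one; check Crunchbase's actual pagination metadata to be sure). The fetcher is passed in as a callable so the loop itself stays self-contained:

```php
<?php
// Collect items from every page until an empty page signals the end.
// $fetch_page: callable taking a page number, returning that page's decoded array.
function fetch_all_pages(callable $fetch_page) {
    $items = [];
    $page = 1;
    do {
        $array = $fetch_page($page);
        foreach ($array as $value) {
            $items[] = $value;
        }
        $page++;
    } while (!empty($array)); // an empty page means we ran off the end
    return $items;
}

// Against the real endpoint, the callable would be something like:
// $fetch = function ($page) {
//     $url = sprintf("https://api.url.com/tags/?page=%d", $page);
//     return json_decode(file_get_contents($url), true);
// };
// $all = fetch_all_pages($fetch);
```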

iTunes Search API - sort results by "kind"

I'm coding an iTunes search app using the iTunes Search API. So far I have everything working properly, but here is the problem I am running into: if you search all media types by a particular keyword, the results come back in no particular order. You may have movies, podcasts, songs, etc. mixed in randomly. The API, as far as I know, only allows you to sort by most recent release date.
I would like to sort the results by the "kind" key to group them by media type. I don't care what particular order they are in, as long as they are grouped together by media type.
This is part of a typical return in JSON format with simple key-value pairs. I have it set to return the maximum results, so I may have up to 200 results spanning up to 10 different media types under the "kind" key:
{"wrapperType":"track",
"kind":"song",
"artistId":909253,
"collectionId":120954021,
"trackId":120954025,
"artistName":"Jack Johnson",
Here is part of my results array:
$iTunes = array();
for ($i = 0; $i < count($data['results']); $i++) {
    $kind = $data['results'][$i]['kind'];
    $artistId = $data['results'][$i]['artistId'];
    $collectionId = $data['results'][$i]['collectionId'];
    $trackId = $data['results'][$i]['trackId'];
    $artistName = $data['results'][$i]['artistName'];
    $trackName = $data['results'][$i]['trackName'];
    $collectionName = $data['results'][$i]['collectionName'];
    $artistViewUrl = $data['results'][$i]['artistViewUrl'];
    $trackViewUrl = $data['results'][$i]['trackViewUrl'];
    $collectionViewUrl = $data['results'][$i]['collectionViewUrl'];
    $image100 = $data['results'][$i]['artworkUrl100'];
    $collectionPrice = $data['results'][$i]['collectionPrice'];
    $previewUrl = $data['results'][$i]['previewUrl'];
I'm trying to wrap my head around grouping by values of keys. Do I create another array and iterate over the $kind = $data['results']?
It would seem that you would need to build an array with 'kind' as a dimension. So something like this:
$data_array = array();
for ($i = 0; $i < count($data['results']); $i++) {
    $kind = $data['results'][$i]['kind'];
    $data_array[$kind][] = $data['results'][$i];
}
You could then work with the array like this:
foreach ($data_array as $kind => $kind_array) {
    // echo out any category headers based on $kind
    foreach ($kind_array as $item) {
        // echo out each individual item within this kind group
    }
}

Facebook comments loop very very slow

I have this function in order to retrieve the count of Facebook comments to blog posts:
function comment_count($url) {
    $json = json_decode(file_get_contents('https://graph.facebook.com/?ids=' . $url));
    return ($json->$url->comments) ? $json->$url->comments : 0;
}
However, when I call it in a loop over a query's results in order to show five posts on a page, this function seriously affects the speed of the website (the page takes up to 6-7 seconds to load).
Is there a way to avoid this? Why is it so slow?
Thanks
Pass a comma-separated list of URLs in the ids parameter to get all the counts at once, or alternatively, cache the counts on the server side and use those values.
Example: https://graph.facebook.com/?ids=http://www.google.com,http://www.bing.com,http://www.yahoo.com
This is specified in Facebook's Graph API Reference under the section "selection"
An example implementation follows:
<?php
function comment_count($urls) {
    $json = json_decode(file_get_contents('https://graph.facebook.com/?ids=' . implode(',', array_map("rawurlencode", $urls))));
    $output = array();
    foreach ($json as $url => $data) {
        $output[$url] = isset($data->comments) ? $data->comments : 0;
    }
    return $output;
}
var_dump(comment_count(array('http://www.facebook.com/', 'http://www.google.com')));
I hope this helps!

Retrieving all Twitter followers with a single request

I tried this and even added a cursor but it would still retrieve only the first 100 followers.
<?php
$cursor = -1;
$account_from = 'username';
do {
    $json = file_get_contents('http://api.twitter.com/1/statuses/followers/' . $account_from . '.json?cursor=' . $cursor);
    $accounts = json_decode($json);
    foreach ($accounts->users as $account) {
        $a[] = $account->screen_name;
    }
    $cursor = $accounts->next_cursor;
} while ($cursor > 0);
foreach ($a as $f) {
    echo $f;
}
?>
Is there a better and simpler way of doing it? Where am I going wrong? Help, please?
API docs state this request is deprecated:
This method is deprecated as it will only return information about users who have Tweeted recently. It is not a functional way to retrieve all of a users followers. Instead of using this method use a combination of GET followers/ids and GET users/lookup.
Use this API instead: https://dev.twitter.com/docs/api/1/get/followers/ids
There should be a property in your response called next_cursor; as long as it's non-zero, repeat the request (passing that cursor) until you have all the results.
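A rough sketch of that cursor loop for followers/ids follows. The fetcher is a callable so the loop can be shown without auth details; against the real API you would need an authenticated request, and the returned IDs would then be fed to users/lookup in batches of up to 100 to resolve screen names:

```php
<?php
// Page through follower IDs until next_cursor comes back as 0.
// $fetch_page: callable taking a cursor, returning the decoded JSON object
// for that page (with ->ids and ->next_cursor, per the followers/ids docs).
function fetch_all_follower_ids(callable $fetch_page) {
    $cursor = -1; // -1 requests the first page
    $ids = [];
    do {
        $page = $fetch_page($cursor);
        foreach ($page->ids as $id) {
            $ids[] = $id;
        }
        $cursor = $page->next_cursor; // 0 once the last page has been read
    } while ($cursor != 0);
    return $ids;
}

// With the (old 1.0) endpoint the callable would look roughly like:
// $fetch = function ($cursor) use ($account_from) {
//     return json_decode(file_get_contents(
//         'https://api.twitter.com/1/followers/ids.json?screen_name=' . $account_from . '&cursor=' . $cursor));
// };
```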
