I am writing a program in PHP that retrieves the list of who a user is following on Instagram. The problem I have is that their API only returns 50 results per call, and the rest is paginated.
I know that there is a 'next page' as the returned JSON has a pagination->next_url.
Currently, the code I have gets the JSON and decodes it. Immediately afterwards, a call is made to get the next page using the URL from the first API call.
Have a look:
// Fetch one page of "follows" from the Instagram API and decode the JSON response.
function getFollows($url){
    //echo "A url: ".$url."</br>";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    $result = curl_exec($ch);
    curl_close($ch);
    return json_decode($result);
}

$client_id = "my client id";
$url = 'https://api.instagram.com/v1/users/'.$user_id.'/follows/?client_id='.$client_id.'&access_token='.$token;
$first_page = getFollows($url);
$second_page = getFollows($first_page->pagination->next_url);
What I would like to do instead is check the JSON for a next_url and call it, then check the JSON from that URL for a next_url, and repeat. All the collected JSON would then be merged into one list that I can iterate through to echo each person.
My question is: every time there is pagination, how can I get the next URL, merge the JSON, and repeat until there are no more pages to go through?
I could keep adding $third_page, $fourth_page, and so on, but that breaks down if the user has more than four pages of follows, and is wasted effort if they only have, say, 10 followers.
I have tried using an if statement to check whether there is pagination, together with array_merge(), but to no avail. Maybe I was doing it wrong.
Please can someone point me in the right direction.
Thanks,
-DH
You could use a ready-made library that already handles pagination: https://github.com/cosenary/Instagram-PHP-API
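If you would rather keep your own function, here is a minimal sketch of the loop approach, using getFollows() from the question and assuming the usual response shape (a data array plus pagination->next_url, with each entry carrying a username):

// Sketch only: keep following next_url until the API stops returning one.
function getAllFollows($url) {
    $follows = array();
    while ($url) {
        $page = getFollows($url);
        if (!isset($page->data)) {
            break; // unexpected or empty response, stop rather than loop forever
        }
        $follows = array_merge($follows, $page->data);
        // next_url is only present while there are more pages
        $url = isset($page->pagination->next_url) ? $page->pagination->next_url : null;
    }
    return $follows;
}

$everyone = getAllFollows($url);
foreach ($everyone as $person) {
    echo $person->username . '<br/>';  // username field assumed from the v1 API
}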
I'm looking for some help with this cURL problem I have.
I'm experiencing slow responses when using the cURL function below. I have a list of about 40 IDs that I pass to the function in a for loop; the URL itself does not change. Once I get the response for each ID, I store the data I need in a JSON array (to iterate through the JSON array I have another for loop nested inside the ID loop) and echo the result. I am 99% certain I've isolated the issue to cURL, as I can see the requests taking a long time in Chrome Developer Tools.
It's taking a really long time to complete, up to 10 minutes. When I test the requests via Postman or my browser the response is almost instant; I can get the responses faster by hand than I can with PHP/cURL.
So I'm thinking there has to be a better way to do this, but I just can't seem to work it out.
// Fetch the data for a single ID. $authorization is expected to hold the
// Authorization header string (it is not shown in this snippet).
public function cURLfunction($url, $id){
    //example $url = "https://mywebsite.com/getIdData/";
    //example $id = "100569841";
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url . "/" . $id);
    curl_setopt($curl, CURLOPT_HTTPHEADER, array('Content-Type: application/json', $authorization));
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($curl);
    curl_close($curl);
    return $result;
} //end cURLfunction
Any help is greatly appreciated
Kahlil
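One thing worth ruling out first is the per-request setup cost: a brand new cURL handle means a new connection and TLS handshake for every ID. Here is a sketch of reusing a single handle across the loop, with $url, $ids and $authorization assumed to match the names around the question:

// Sketch: one handle, reused for every ID, so the connection can be kept alive.
$curl = curl_init();
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, array('Content-Type: application/json', $authorization));

$results = array();
foreach ($ids as $id) {
    curl_setopt($curl, CURLOPT_URL, $url . "/" . $id);   // only the URL changes per request
    $response = curl_exec($curl);
    if ($response !== false) {
        $results[$id] = json_decode($response, true);
    }
}
curl_close($curl);

If it is still slow after that, the curl_multi_* functions let the 40 requests run in parallel instead of one after another, which usually helps far more than any single-request tweak.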
Sorry if this has been posted elsewhere but I could not find the correct answer to my question.
I am trying to get the share count for a page but I'm getting the error:
{"error":{"message":"(#4) Application request limit reached","type":"OAuthException","is_transient":true,"code":4,"fbtrace_id":"GfwY7+r9UJb"}}
I don't think I'm actually hitting the limit. You can see the page here: https://moving2canada.com/express-entry-draw/. I know the endpoint was being called three times per page load, but I've changed it to call just once and I still get the same result.
If I am hitting the limit somehow, can I add an access token to increase the limit?
Here is the code:
$query_url = "http://graph.facebook.com/?id=https://moving2canada.com/express-entry-draw/";
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $query_url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, array('Content-type: application/json'));
$response = curl_exec($curl);
curl_close($curl);
$social_stats = json_decode($response, true);
return $social_stats;
Hope I am being clear enough and thanks in advance
The shares field of the URL node was removed a couple of API versions back; in current versions you want to look at the engagement field instead.
Graph API Explorer example for your given URL: https://developers.facebook.com/tools/explorer/?method=GET&path=%3Fid%3Dhttps%3A%2F%2Fmoving2canada.com%2Fexpress-entry-draw%2F%26fields%3Dengagement&version=v2.12
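For reference, here is the same lookup with cURL, just requesting the engagement field; the $access_token variable is an assumption (an app token should do for this kind of read, and having one also answers your question about raising the request limit):

// Sketch: ask the URL node for its engagement field instead of shares.
$query_url = "https://graph.facebook.com/v2.12/?id="
    . urlencode("https://moving2canada.com/express-entry-draw/")
    . "&fields=engagement&access_token=" . urlencode($access_token);
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $query_url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($curl);
curl_close($curl);
$social_stats = json_decode($response, true);
// the share count should then be in $social_stats['engagement']['share_count']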
I am attempting to access the results of a search on Facebook via PHP cURL so I can write information about the search results into a database and check for incremental changes. The code is below; I can get a similar approach to work for pulling my page's name and like count.
<?php
// Fetch the raw HTML of the Facebook search page for the given ID.
function get_fb_data($get) {
    $target = "https://www.facebook.com/search/".$get."/likers";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $target);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $exec = curl_exec($ch);
    curl_close($ch); // close the handle before returning
    return $exec;
}
$fb = get_fb_data('506764276135128');
echo '<pre>';
print_r($fb);
echo '</pre>';
?>
The Facebook search does rely on being logged in to function, so I might have to pass user credentials along with an app key to access the results. However, I am hoping not, as it has become extremely difficult and confusing to navigate the App Developer section of the site and get approval for an app.
What I would like to pull off the page is the number of results, and the name/profile id for each result.
That would actually be scraping and is not allowed on Facebook. Use the Search API instead (see the section about "Searching").
It does not offer the same functionality as the Graph Search that is integrated into Facebook, but it is the only permitted way, and you can search for Pages, Users, Events, etc.
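Here is a sketch of what that looks like with the same cURL pattern already used in the question; the endpoint shape follows the Graph API Search docs, and $access_token is an assumption (it has to come from an app, there is no way around that for the API):

// Sketch: search the Graph API for Pages matching a query string.
function search_fb_pages($query, $access_token) {
    $url = "https://graph.facebook.com/search?type=page&q=" . urlencode($query)
        . "&access_token=" . urlencode($access_token);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $json = curl_exec($ch);
    curl_close($ch);
    return json_decode($json, true);   // ['data'] holds id/name pairs for each match
}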
I am trying to get the details of a venue in Foursquare using the venue ID, but there is something minor that isn't correct:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://api.foursquare.com/v2/venues/4c599c84f346c9287ff84cca?client_id=[MY_ID]&client_secret=[MY_SECRET]&v=20120609');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$contents = curl_exec ($ch);
$result = json_decode($contents);
var_dump($result);
curl_close ($ch);
If I change the URL to something like the Google News RSS feed, or one of my own XML feed pages, I get results, so the cURL portion is right.
If I paste the above URL into my browser (with my actual ID&Secret) then I get a json formatted result with the data that I expect (correct name of venue, etc). So I know the URL is correct.
If I copy the JSON from the above process, put it into a variable in my code, and then set $result to the decoded version of that variable, I see the results properly. So I know that the decode/output part is working.
Somewhere between retrieving the result and storing it in a variable for decoding something is going wrong. I have to assume it is something silly and simple, since all of the parts are there, but I can't figure it out.
Any help is appreciated.
The answer turned out to be that I had to add the following 2 lines:
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
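Disabling verification does work, but if you would rather keep the SSL checks on, pointing cURL at a CA certificate bundle fixes the same underlying problem. A sketch, where the cacert.pem path is an assumption (the bundle can be downloaded from the curl project):

// Alternative sketch: keep peer verification on and supply a CA bundle instead.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CAINFO, __DIR__ . '/cacert.pem');  // path is an assumption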
In order to decrease page loads I would like to change my current solution. Below is part of the code I use to show the user a book title from an ISBN. The script takes the ISBN posted from a form, checks that it is a valid ISBN, and then uses an open API: an HTTP request is made with the ISBN, the title is returned as JSON and displayed to the user.
The user usually checks three books and, with the current solution, has to post the form for each book to get the result. What is the best way to do this without page reloads?
if(isset($_POST['isbn']) && strip_tags($_POST['isbn']) != ''){
    $currISBN = new ISBNtest;
    $currISBN->set_isbn(strip_tags($_POST['isbn']));
    if ($currISBN->valid_isbn13() === TRUE) {
        // Fetch info about the book
        $url = "http://apisite.com/search?query=isbn:".$currISBN->get_isbn13()."&format=json";
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_REFERER, "http://www.site.com");
        $json_body = curl_exec($ch);
        curl_close($ch);
        $json = json_decode($json_body, true);
        $this_title = $json["xsearch"]["list"][0]["title"];
Sounds like a job for jQuery.ajax()!
http://api.jquery.com/jQuery.ajax/
If you have a little bit of programming under your belt, it shouldn't take too long to figure out how to implement it.
This looks like a prime example for using XMLHttpRequest in the background, plus some JavaScript that puts the data on the page. You would not need a page reload at all.
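On the PHP side, that means exposing the existing lookup as an endpoint that returns JSON instead of a full page, which the jQuery.ajax/XMLHttpRequest call can hit once per ISBN without reloading. A minimal sketch, reusing the code from the question (ISBNtest and the apisite.com URL are the asker's own; the lookup.php filename is just for illustration):

// lookup.php - AJAX-friendly sketch: same ISBN lookup, but the result is
// returned as JSON so the page itself never reloads.
header('Content-Type: application/json');

$isbn = isset($_POST['isbn']) ? strip_tags($_POST['isbn']) : '';
$currISBN = new ISBNtest;                 // existing class from the question
$currISBN->set_isbn($isbn);

if ($currISBN->valid_isbn13() !== TRUE) {
    echo json_encode(array('error' => 'invalid ISBN'));
    exit;
}

$url = "http://apisite.com/search?query=isbn:".$currISBN->get_isbn13()."&format=json";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, "http://www.site.com");
$json_body = curl_exec($ch);
curl_close($ch);

$json = json_decode($json_body, true);
echo json_encode(array('title' => $json["xsearch"]["list"][0]["title"]));

The front end can then call this endpoint three times (once per ISBN) and drop the returned titles into the page.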