Requesting more than the 100-result API limit - PHP

I'm trying to request more than 100 live channels from the Twitch API, but the limit per request is 100 results. I need to make a new request for every 100 results, advancing the offset in the URL each time. I've searched for help but haven't been able to find anything.
<?php
// file_get_contents_curl() is a custom cURL wrapper defined elsewhere;
// @ suppresses warnings if the request fails
$get2 = json_decode(@file_get_contents_curl('https://api.twitch.tv/kraken/streams/?offset=0&limit=5'), true);
foreach ($get2['streams'] as $test) {
    echo "<pre>";
    print_r($test['channel']['name'] . ' | ' . $test['viewers']);
    echo "</pre>";
}
?>
This echoes the results perfectly (limit=5 here just for testing; the same works with limit=100). I just need to loop that code however many times it takes to reach the total. So say there are 30,000 live streams: at 100 per request, that will be 300 queries.

Figured out what I had to do after staying up all night working on it.
This takes the request and loops it as many times as needed to get the full result set. Note that the offset has to advance by the page size (100), not by 1, or consecutive pages will overlap:
// $requests is the number of 100-result pages to fetch
for ($page = 0; $page < $requests; $page++) {
    $testz = json_decode(@file_get_contents_curl('https://api.twitch.tv/kraken/streams/?offset=' . ($page * 100) . '&limit=100'), true);
    foreach ($testz['streams'] as $streamz) {
        echo "<pre>";
        print_r($streamz['channel']['name']);
        echo "</pre>";
    }
}
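To avoid hard-coding $requests, the page count can be derived from the response itself. A minimal sketch, assuming the Kraken /streams response carries a _total field (as v3 did):
$first = json_decode(@file_get_contents_curl('https://api.twitch.tv/kraken/streams/?offset=0&limit=100'), true);
// Number of 100-result requests needed to cover every live stream
$requests = (int) ceil($first['_total'] / 100);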

Related

I want to show all results from an API

I'm working with the Google Places API, and I can show content from the API, but only one item at a time, like this:
<?php
$maps_url = 'https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=48.075971700000004,-0.7651981999999999&radius=5000&type=restaurant&keyword=cruise&key=AIzaSyDxXV4Ka8yiDq1-UKzlzX-MUC8csfdN8y4';
$maps_json = file_get_contents($maps_url);
$maps_array = json_decode($maps_json, true);
// Pulls out single results by index instead of looping
$name1 = $maps_array['results'][1]['name'];
$name2 = $maps_array['results'][2]['name'];
echo $name1;
echo "<br>";
echo $name2;
?>
but I want to show all the results at once with a loop.
Try:
foreach ($maps_array['results'] as $map) {
    echo $map['name'];
    echo "<br>";
}
Put the results array in a foreach and loop over it; that way you get all the data instead of pulling it out one item at a time. If you also want a running number, you can add a counter with ++ (not required).
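For example, a minimal variation with a running number (the counter is purely cosmetic):
$i = 1;
foreach ($maps_array['results'] as $map) {
    echo $i++ . '. ' . $map['name'];
    echo "<br>";
}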
Hope this helps. Good luck.

Facebook Pagination Limit with Graph API

This is my first post here.
I'm trying to query all places within my city limits (Winnipeg), using geolocation data and a radius of 15,000 meters.
This is working somewhat well with the query:
$request = new FacebookRequest(
    $session,
    'GET',
    '/search?type=place&limit=100&center=49.89542,-97.13853&distance=15000'
);
The pagination afterwards works once or twice, but then cuts me off after a variable 180-220 results. I noticed the pages returned are all from downtown (the center query string) and spread outward, but they stop after only about two city blocks.
Here is the loop I use for iterating through the additional results:
$paging = $graphObject->asArray()['paging'];
$i = 0;
// $times caps how many additional pages get fetched
while (property_exists($paging, "next") && $i <= $times) {
    $next_url = $paging->next;          // cursor URL for the next page
    $json = file_get_contents($next_url);
    $obj = json_decode($json);
    foreach ($obj->data as $page) {
        $page_ids[] = $page->id;        // collect the place IDs
    }
    $paging = $obj->paging;             // step to the following page
    $i++;
}
I have researched API limit issues, but found nothing that comes even close to explaining a cap of only about 200 results.
Is there possibly something I'm missing here?

Rate limit with the Twitter API

I'm working on a small and simple piece of code that basically does some tweet filtering. The problem is that I'm hitting the Twitter API request limit, and I'd like to know whether there is a workaround or whether what I want to do simply cannot be done.
First, I type in a Twitter username to retrieve the IDs of the people this user follows.
$user_id = $_GET["username"];   // actually a screen name, not a numeric ID
$url_post = "http://api.twitter.com/1/friends/ids.json?cursor=-1&screen_name=" . urlencode($user_id);
$following = file_get_contents($url_post, true);
$json = json_decode($following);
$ids = $json->ids;
The Twitter API responds with a list of IDs.
Here comes the problem: the next step is to make a request to find out the username, profile picture, and description for each one of those IDs.
$following = array();
foreach ($ids as $value) {
    // One lookup URL per ID -- this is what blows through the rate limit
    $following[] = 'http://api.twitter.com/1/users/lookup.json?user_id=' . $value;
}
foreach ($following as $url) {
    $data_names = file_get_contents($url, true);   // fetch the profile data
    $json_names = json_decode($data_names);
    foreach ($json_names as $tweet) {
        $name = $tweet->name;
        $description = $tweet->description;
        echo '<p>';
        echo $name . '<br>';
        echo $description;
        echo '</p>';
    }
}
If the user follows 50 people, it works. But if he follows, say, 600, that would be 600 requests (for username, description, and profile picture) to the Twitter API, which exceeds the limit.
Is there any way to work around this, or can it simply not be done?
Thank you!
You can and should request the users/lookup endpoint with 100 user IDs at a time, instead of doing one request per Twitter ID. See https://dev.twitter.com/docs/api/1.1/get/users/lookup
You have to replace your foreach loop (foreach ($following as $url)) with a recursive function.
At the end of the function, check the number of hits remaining before calling it again (cf. this link to see how to find out the time remaining until you get rate-limited).
If there are no hits left, sleep for 15 minutes before calling the function again; otherwise, make the call right away.
There is plenty of information on how to do this; use Google and search existing Stack Overflow questions.
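As a rough sketch of the batching idea (the recursion and the exact rate-limit check are left out; users/lookup accepts a comma-separated user_id list of up to 100 IDs):
// 600 follows becomes 6 lookup requests instead of 600
foreach (array_chunk($ids, 100) as $chunk) {
    $url = 'http://api.twitter.com/1/users/lookup.json?user_id=' . implode(',', $chunk);
    $users = json_decode(file_get_contents($url, true));
    foreach ($users as $user) {
        echo '<p>' . $user->name . '<br>' . $user->description . '</p>';
    }
    // A remaining-hits check could go here, sleeping until the
    // rate-limit window resets if no hits are left
}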

Tweets from the last moments with the Twitter Search API

Well, what I'm trying to do is quite obvious. I am retrieving tweets as shown below:
$options  = 'q=' . urlencode($hash_tag);   // plain assignment; $options was never initialised
$options .= '&page=15';
$options .= '&rpp=100';
$options .= '&result_type=recent';
$url = 'https://search.twitter.com/search.atom?' . $options;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$xml = curl_exec($ch);
curl_close($ch);
$affected = 0;
$twelement = new SimpleXMLElement($xml);
foreach ($twelement->entry as $entry) {
    $text = trim($entry->title);
    $author = trim($entry->author->name);
    $time = strtotime($entry->published);
    $id = $entry->id;
    echo '<hr>';
    echo "Author: " . $author;
    echo "<br/>";
    echo "Date: " . date('Y-m-d H:i:s', $time);
    echo "<br/>";
    echo "Tweet: " . $text;
    echo "<br/>";
}
and as you can check at this link: linkToTrial, I can receive tweets. But they are far too old for me! I want to receive tweets from the last moments, at least from the last 5 minutes. Here it says:
This sounds like something you can do on your end, as created_at is one of the fields returned in the result set. Just do your query, and only use the ones that are within the last 5 seconds.
but if you check my example, you will see that I'm not even receiving the latest tweets. What am I doing wrong?
Any answer will be appreciated. Thanks for your responses.
You're using a deprecated API (search.twitter.com) that will cease functioning on May 7, 2013 -- you'll want to move to the v1.1 Search API; see https://dev.twitter.com/docs/api/1.1/get/search/tweet for docs.
It looks like the specific reason you're getting older results with this query is that you're starting on page 15 -- the end of the result set. The most recent tweets are at the beginning of the result set -- page 1.
In API v1.1, the concept of paging no longer exists for the Search API. Instead, you navigate through the result set using since_id and max_id; details here: https://dev.twitter.com/docs/working-with-timelines
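As an illustration of the max_id walk (a sketch only: v1.1 requires OAuth, so $bearerToken is an assumed application-only token, error handling is skipped, and 64-bit PHP is assumed for the ID arithmetic):
$maxId = null;
for ($pages = 0; $pages < 3; $pages++) {
    // Newest tweets come first; max_id pages backwards through older ones
    $url = 'https://api.twitter.com/1.1/search/tweets.json?q=' . urlencode($hash_tag)
         . '&count=100&result_type=recent'
         . ($maxId !== null ? '&max_id=' . $maxId : '');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer ' . $bearerToken));
    $result = json_decode(curl_exec($ch));
    curl_close($ch);
    if (empty($result->statuses)) {
        break;
    }
    foreach ($result->statuses as $tweet) {
        echo $tweet->user->screen_name . ': ' . $tweet->text . '<br/>';
    }
    // Ask for everything strictly older than the oldest tweet seen so far
    $statuses = $result->statuses;
    $maxId = end($statuses)->id - 1;
}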

PHP Foreach Loop Speed

I'm trying to take a list of 40,000 items and run a JSON request for each one. The process is super slow, which I suspect is because each JSON request finishes before the next one starts.
$result = mysql_query("SELECT * FROM $tableName");
while ($row = mysql_fetch_array($result)) {
    checkRank($row['Domain']);
}
checkRank is running something to the effect of:
$json = file_get_contents($jsonurl,0,null,null);
I'm thinking I could run 10 checkRanks at a time to speed the process up. Any other ideas or suggestions?
UPDATE:
For example, this loop runs through my array in 27 seconds.
for ($i = 0; $i <= 100; $i++) {
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    echo "checking " . $i . "<br/>";
}
The loop below takes over 40 seconds with the same array.
for ($i = 0; $i <= 100; $i++) {
    checkRank($domains[$i]);
    echo "checking " . $i . "<br/>";
}
Not sure if this will help much because I don't work with PHP, but I did find this:
How can one use multi threading in PHP applications
Unless there is something else you haven't mentioned, the best way to do this would be to make one JSON request, pass all your items to it, and get the same number of results back. That way you minimize the server response time. I'm not sure you want to send all 40,000 items in one go, though; you might want to divide them into two parts, but you can test that later.
so your checkRank() would look something like this (a sketch, assuming the endpoint accepts a POSTed JSON array; file_get_contents() needs a stream context to send a request body):
function checkRank($jsonurl, $domainsArray) {
    // POST the whole batch in one request
    $context = stream_context_create(array('http' => array('method' => 'POST',
        'header' => "Content-Type: application/json\r\n", 'content' => json_encode($domainsArray))));
    return file_get_contents($jsonurl, false, $context);
}
http://www.phpied.com/simultaneuos-http-requests-in-php-with-curl/
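A minimal curl_multi sketch along the lines of that article (the batch size of 10, the fetchConcurrently() name, and the URL list are illustrative assumptions):
function fetchConcurrently(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
    // Drive all transfers until every handle has finished
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);
    $results = array();
    foreach ($handles as $ch) {
        $results[] = curl_multi_getcontent($ch);   // response body per handle
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
// e.g. process the 40,000 domains ten at a time:
// foreach (array_chunk($domainUrls, 10) as $batch) { fetchConcurrently($batch); }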
This seems like a nice way to speed up the processing. Thanks for all the input, guys.
... it's a shame; I read on Stack Overflow today that PHP could not support threading, as it would require fundamental changes to the language ...
https://github.com/krakjoe/pthreads
