Understanding Twitter API's "Cursor" parameter - php

I don't really get how to use the Cursor parameter in Twitter's API (for example, here).
Am I supposed to make a new API call for each 100 followers?
I'd love it if someone could provide a PHP example for getting a full list of followers assuming I have more than 100...
Thanks in advance!

You need to pass the cursor value back to the API to get the next "chunk" of followers. Then you take the cursor parameter from that chunk and pass it back to get the next chunk. It is like a "get the next page" mechanism.
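For illustration, here's a rough PHP sketch of that loop. It uses the old v1 followers/ids endpoint and the next_cursor field described in the answers below; newer API versions require OAuth, which is omitted here, and the screen name is a placeholder.
<?php
// Rough sketch of the cursor loop; authentication is omitted and the endpoint
// follows the older v1 API used elsewhere on this page.
$screenName = 'some_user'; // hypothetical screen name
$cursor = -1;              // -1 requests the first page
$followerIds = array();

do {
    $url = "https://api.twitter.com/1/followers/ids.json?screen_name={$screenName}&cursor={$cursor}";
    $data = json_decode(file_get_contents($url), true);
    $followerIds = array_merge($followerIds, $data['ids']);
    $cursor = $data['next_cursor']; // 0 means there are no more pages
} while ($cursor != 0);

echo count($followerIds) . " followers fetched\n";
?>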

Since this question was asked, the Twitter API has changed in many ways.
Cursor is used to paginate API responses with many results. For example, a single API call to obtain followers will retrieve a maximum of 5000 ids.
If you want to obtain all the followers of a user, you have to make a new API call, but this time you have to pass the "next_cursor" number that was in your first response.
If it's useful, the following Python code will retrieve the followers of a given user.
It will retrieve at most the number of pages indicated by a constant.
Be careful not to be banned (i.e. don't make more than 150 API calls/hour with anonymous calls).
import requests
import json
import sys

screen_name = sys.argv[1]
max_pages = 5
next_cursor = -1
followers_ids = []

for i in range(0, max_pages):
    url = 'https://api.twitter.com/1/followers/ids.json?screen_name=%s&cursor=%s' % (screen_name, next_cursor)
    content = requests.get(url).content
    data = json.loads(content)
    next_cursor = data['next_cursor']
    followers_ids.extend(data['ids'])
    if next_cursor == 0:  # 0 means there are no more pages
        break

print("%s has %s followers!" % (screen_name, len(followers_ids)))

Even though you asked this some time ago, I hope this will be the most accurate answer.
Twitter does not offer info about what next_cursor means, only about how cursors work.
It is only an easy way for Twitter to manage pagination.
«The function is intended to be used linearly, one cursor set at a time per user.» Source
«It's potentially less efficient for you but much more efficient for us.» Twitter Staff
BUT...
Someone asked before (in a now-broken link) «are cursors persistent? It seems the answer is "yes"».
This means you can save your last cursor before 0 and continue with it next time.

Check out http://code.google.com/p/twitter-boot/source/browse/trunk/twitter-bot.php
foreach ($this->twitter->getFollowers(null, 0) as $follower) // the 0 is the page
{
    if ($this->twitter->existsFriendship($this->user, $follower['screen_name'])) // if you already follow this user
        continue; // no need to follow now
    try
    {
        $this->twitter->createFriendship($follower['screen_name'], true); // if you don't follow them yet, follow now
        $this->logger->debug('Following new follower: '.$follower['screen_name']);
    }
    catch (Exception $e)
    {
        $this->logger->debug("Skipping: ".$follower['screen_name']." ".$e->getMessage());
    }
}

Related

Course catalog of Coursera using REST API

I am using the open API provided by Coursera.
The problem is that it only returns a list of 100 courses; I checked the website and there are 1294 courses listed, so why does a request return only 100 courses?
My code is:
<?php
$url = "https://api.coursera.org/api/courses.v1";
$result = file_get_contents($url);
print_r($result);
?>
What should I do to fetch the whole course catalog and store it in a MySQL database?
From their docs:
https://building.coursera.org/app-platform/catalog/
To paginate through a result set, use integer start and limit query
parameters.
curl "https://api.coursera.org/api/courses.v1?start=300&limit=10"
So just use start and limit for pagination.
The API will only return 100 records on each call. In order to get all records, you will need to call the method multiple times and concatenate/store all of the responses.
The response returns the following value, which can be used to modify the subsequent call.
paging":{"next":"101","total":1948},"linked":{}}
Which can then be used accordingly;
https://api.coursera.org/api/courses.v1?start=101
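Putting that together, here's a rough PHP sketch that follows paging.next until it runs out. The "elements" key for the course list is an assumption; inspect a real response to confirm the exact field name.
<?php
// Sketch: follow paging.next from each response until there is no next page.
// "elements" as the name of the course list is an assumption.
$base = "https://api.coursera.org/api/courses.v1";
$url = $base . "?limit=100";
$courses = array();

do {
    $page = json_decode(file_get_contents($url), true);
    $courses = array_merge($courses, isset($page['elements']) ? $page['elements'] : array());
    $next = isset($page['paging']['next']) ? $page['paging']['next'] : null;
    $url = $base . "?limit=100&start=" . $next;
} while ($next !== null);

echo count($courses) . " courses fetched\n";
// Each entry can then be inserted into MySQL, e.g. with one prepared
// mysqli/PDO statement executed per course.
?>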
From the documentation:
curl "https://api.coursera.org/api/courses.v1?start=300&limit=10"
You will need to loop through and increase the start until you do not get any more results.
Documentation

PHP - Twilio Recording Duration Value Issue

So I'm creating an application that allows users to record a message through Twilio, and I'm attempting to store the RecordingSID, the date it was created, and the duration of the recording in a MySQLi database right after the recording has been made. I've managed to get the RecordingSID by taking the last 34 characters off the RecordingURL using the substr() function, and I'm simply using today's date for the date-created field in my database table. However, seemingly regardless of how long the actual recording is, I keep getting a value of 8 when attempting to get the recording duration. Here's what I've got right now (with database inserts omitted since they work):
<?php
$recordingURL = $_REQUEST['RecordingUrl'];
$recordingSID = substr($recordingURL, -34);
date_default_timezone_set('EST');
$dateCreated = date("Y-m-d");
$duration = $_REQUEST['RecordingDuration'];
?>
Any help with this matter would be fantastic! Thanks!
Edit: I've also tried the following solution in place of the last line in my previous code snippet:
<?php
$recordings = $client->account->recordings->getIterator(0, 50, array('Sid' => $recordingSID,));
foreach ($recordings as $recording)
{
    $duration = $recording->duration;
}
?>
Based on that code sample you've pasted in, you could be doing a couple of things incorrectly. Correct me if I'm wrong, but I believe you are trying to request a Recording resource back from the Twilio API after you've submitted one with the Twilio js and their TwiML?
If so, Twilio actually has a nice little demo of exactly your use case.
You shouldn't see anything in $_REQUEST['RecordingDuration']; I'm not sure why you are even getting a value of 8 back. Basically, what you want to do is find the user's recordings by using the Twilio REST API.
Here is an example snippet:
<?php
// Get the PHP helper library from twilio.com/docs/php/install
require_once('/path/to/twilio-php/Services/Twilio.php'); // Loads the library
// Your Account Sid and Auth Token from twilio.com/user/account
$sid = "ACda6f132a3c49700934481addd5ce1659";
$token = "{{ auth_token }}";
$client = new Services_Twilio($sid, $token);
// Loop over the list of recordings and echo a property for each one
foreach ($client->account->recordings as $recording) {
    echo $recording->duration;
}
The API call will return a Recording resource.
Here are some more examples from their docs.
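If you already have the RecordingSid from the URL, you can also look that single recording up instead of iterating over the whole list. Here's a sketch against the same old Services_Twilio helper (check your library version for the exact method name):
<?php
require_once('/path/to/twilio-php/Services/Twilio.php');

$client = new Services_Twilio($sid, $token);

// Fetch one recording by its SID instead of looping over all recordings.
$recording = $client->account->recordings->get($recordingSID);
echo $recording->duration; // duration in seconds
?>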

Google Big Query + PHP -> How to fetch a large data set without running out of memory

I am trying to run a query in BigQuery/PHP (using google php SDK) that returns a large dataset (can be 100,000 - 10,000,000 rows).
$bigqueryService = new Google_BigqueryService($client);
$query = new Google_QueryRequest();
$query->setQuery(...);
$jobs = $bigqueryService->jobs;
$response = $jobs->query($project_id, $query);
// query is a synchronous function that returns a full dataset
The next step is to allow the user to download the result as a CSV file.
The code above will fail when the dataset becomes too large (memory limit).
What are my options for performing this operation with lower memory usage?
(I figured an option is to save the results to another table with BigQuery and then start doing partial fetch with LIMIT and OFFSET but I figured a better solution might be available..)
Thanks for the help
You can export your data directly from BigQuery:
https://developers.google.com/bigquery/exporting-data-from-bigquery
You can use PHP to make an API call that does the export (you don't need the bq tool).
You need to set the job's configuration.extract.destinationFormat; see the reference.
Just to elaborate on Pentium10's answer
You can export up to a 1 GB file in JSON format.
Then you can read the file line by line, which will minimize the memory used by your application, and use json_decode() on each line.
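As a rough sketch, assuming you exported as newline-delimited JSON and downloaded the file locally (the filename is a placeholder), you can stream it row by row straight into a CSV:
<?php
// Stream an exported newline-delimited JSON file one row at a time so only a
// single row is ever held in memory. The file path is a placeholder.
$in  = fopen('bq-export-000000000000.json', 'r');
$out = fopen('result.csv', 'w');

while (($line = fgets($in)) !== false) {
    $row = json_decode($line, true); // one JSON object per line
    if (is_array($row)) {
        fputcsv($out, $row);         // append the row to the CSV
    }
}

fclose($in);
fclose($out);
?>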
The suggestion to export is a good one, I just wanted to mention there is another way.
The query API you are calling (jobs.query()) does not return the full dataset; it just returns a page of data, which is the first 2 MB of the results. You can set the maxResults flag (described here) to limit this to a certain number of rows.
If you get back fewer rows than are in the table, you will get a pageToken field in the response. You can then fetch the remainder with the jobs.getQueryResults() API by providing the job ID (also in the query response) and the page token. This will continue to return new rows and a new page token until you get to the end of your table.
The example here shows code (in Java and in Python) to run a query and fetch the results page by page.
There is also an option in the API to convert directly to CSV by specifying alt='csv' in the URL query string, but I'm not sure how to do this in PHP.
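For a rough idea of what that paging loop can look like with the legacy PHP client from the question (this assumes the client returns associative arrays, the default in older releases; adjust to your SDK version):
<?php
// Sketch: ignore the rows in the initial query response and page through
// jobs.getQueryResults() using pageToken until it disappears.
$response = $jobs->query($project_id, $query);
$jobId = $response['jobReference']['jobId'];
$pageToken = null;

do {
    $params = array('maxResults' => 10000);
    if ($pageToken !== null) {
        $params['pageToken'] = $pageToken;
    }
    $page = $jobs->getQueryResults($project_id, $jobId, $params);

    foreach (isset($page['rows']) ? $page['rows'] : array() as $row) {
        // Each cell is in $row['f'][$i]['v']; write it out (e.g. to CSV)
        // instead of buffering it in memory.
    }

    $pageToken = isset($page['pageToken']) ? $page['pageToken'] : null;
} while ($pageToken !== null);
?>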
I am not sure whether you are still using PHP, but the answer is:
$options = [
    'maxResults' => 1000,
    'startIndex' => 0
];

$jobConfig = $bigQuery->query($query);
$queryResults = $bigQuery->runQuery($jobConfig, $options);

foreach ($queryResults as $row) {
    // Handle rows
}
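To get the CSV download from the question without buffering the whole result, you can write each row out while iterating. A sketch (each $row is assumed to be an associative array of column => value; nested or repeated fields would need flattening first):
<?php
// Sketch: stream rows straight into a CSV while iterating the paginated
// results, so the full dataset never sits in memory at once.
$out = fopen('result.csv', 'w');
$headerWritten = false;

foreach ($queryResults as $row) {
    if (!$headerWritten) {
        fputcsv($out, array_keys($row)); // column names as the header line
        $headerWritten = true;
    }
    fputcsv($out, array_values($row));
}

fclose($out);
?>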

Routeboxer server side

I'm trying to find a way to get latitude and longitude bounds from Google's RouteBoxer into PHP and then query MySQL by those limits. I'll then output the results as JSON or XML to use them with the Android Maps API v2. I found this http://luktek.com/Blog/2011-02-03-google-maps-routeboxer-in-php , but I think it only does boxes between two points on a map, not boxes around the route itself, which makes it not accurate enough. Using JavaScript is not an option, since I can't use it with the Google Maps API or get the results from my database. Is there any way to accomplish this using some server-side code (preferably PHP, but any other language that works with MySQL can be used as well) that can fetch the bounds, query MySQL by those, and output the data as JSON or XML so that it can be parsed by Android?
I finally found a solution that I'm satisfied with.
I'm not going to paste every step because it would take like a thousand lines, but here it is in a nutshell:
1. Parse the "overview_polyline" field from the Google Directions API JSON (https://developers.google.com/maps/documentation/directions/#JSON).
2. Decode the polyline to latitude and longitude points with this:
http://unitstep.net/blog/2008/08/02/decoding-google-maps-encoded-polylines-using-php/
3. Download https://github.com/bazo/route-boxer. I piled all the code from the GeoTools PHP files into one file, but that's probably not necessary if you know how to use it :)
4. And here is an example of getting those boxes using those scripts:
...
$from = "(Startup point for example: "Turku,Finland")";
$to = "(Destination point fro example: "Porvoo,Finland")";
$json_string = file_get_contents("http://maps.googleapis.com/maps/api/directions/json?origin=$from&destination=$to&sensor=false");
$parsed_json = json_decode($json_string, true);
$polyline = $parsed_json['routes'][0]['overview_polyline']['points'];
$routepoints = decodePolylineToArray($polyline);
$collection = new LatLngCollection($routepoints);
$boxer = new RouteBoxer();
//calculate boxes with 10km distance from the line between points
$boxes = $boxer->box($collection, $distance = 10);
foreach($boxes as $row){
$southWestLtd = $row->southWest->latitude;
$southWestLng = $row->southWest->longitude;
$northEastLtd = $row->northEast->latitude;
$northEastLng = $row->northEast->longitude;
$query = "SELECT * FROM markers WHERE Latitude > $southWestLtd AND Latitude < $northEastLtd AND Longitude > $southEastLng AND Longitude < $norhtEastLng";
}
Run that query and it'll give you only the markers (or whatever you are querying for) that are inside those boxes. If you need more detailed instructions, just leave a comment; I'm more than happy to help, since I spent many nights trying to find a reasonable solution to this.
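For completeness, here's a sketch of running those per-box queries with mysqli and returning the markers as JSON for the Android app. The connection details are placeholders, and the "id" column used for de-duplication is an assumption about the markers table.
<?php
// Sketch: run one query per box with a prepared statement and collect the
// markers into a single JSON response.
$db = new mysqli('localhost', 'user', 'password', 'mydb');
$stmt = $db->prepare(
    "SELECT * FROM markers
     WHERE Latitude > ? AND Latitude < ?
       AND Longitude > ? AND Longitude < ?"
);

$markers = array();
foreach ($boxes as $box) {
    $swLat = $box->southWest->latitude;
    $swLng = $box->southWest->longitude;
    $neLat = $box->northEast->latitude;
    $neLng = $box->northEast->longitude;

    $stmt->bind_param('dddd', $swLat, $neLat, $swLng, $neLng);
    $stmt->execute();
    $result = $stmt->get_result(); // requires mysqlnd; otherwise use bind_result()

    while ($row = $result->fetch_assoc()) {
        $markers[$row['id']] = $row; // key by id so overlapping boxes don't duplicate markers
    }
}

header('Content-Type: application/json');
echo json_encode(array_values($markers));
?>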
Lots of questions here, let me try to tackle some:
Getting lat and long bounds from routeboxer:
Once you have your 'boxes', you can loop through and get these
var northeast = boxes[i].getNorthEast();
var southwest = boxes[i].getSouthWest();
var lat_north = northeast.lat();
var long_east = northeast.lng();
var lat_south = southwest.lat();
var long_west = southwest.lng();
> that this only does boxes between two points on a map, not boxes around the route itself which makes it not accurate enough
I do not know about this person's implementation of RouteBoxer, but the original Google RouteBoxer for API v3 creates boxes around the entire route. That is what is documented, and I can say with confidence how it works (I've used it).
http://google-maps-utility-library-v3.googlecode.com/svn/trunk/routeboxer/docs/examples.html
> Using javascript is not an option, since I can't use it with google maps api or get the results from my database.
This point makes little sense to me. You DEFINITELY CAN use the Google Maps API with JavaScript; there is a JavaScript library (version 3 already) specifically FOR THIS PURPOSE. If this is a browser application I would strongly recommend sticking with that, at least initially.
As for getting results from the DB, you can use AJAX calls: basically, just call your PHP script via AJAX, have it return JSON data, then use JavaScript to parse/populate this data onto the screen of the user. Alternatively, it doesn't even have to be AJAX; you can use RouteBoxer, then upon some action submit this to your DB, then have the next page return with the results rendered. But AJAX really is the more elegant approach (this case is almost an 'ideal use-case' for it).
BTW: I do not see the logic of putting RouteBoxer server-side as this programmer has done. He points out rightly that very long, complex routes will take time, but my thoughts would be: whether the user is waiting for his PC to crunch the data, or waiting for a remote PC (the server) to crunch the data, all he knows and cares about is that he is waiting, NOT THE BACKEND METHODOLOGY! (The exception being when one has access to a series of powerful servers and can rewrite the server-side code to take advantage of parallel runs, etc.)

Is there a way to get all tweets from twitter for a specified user?

I wanted to write a function for grabbing all tweets for a specified user, but it returns only the 20 most recent.
I came up with something like that:
function getTweets($user) {
    $page = file_get_contents("http://twitter.com/{$user}");
    $from = strpos($page, "<ol id='timeline' class='statuses'>");
    $to = strpos($page, "</ol>");
    $length = $to - $from;
    $page = substr($page, $from, $length);
    echo $page;
}
getTweets('user_name');
Is there a way to get round that?
Twitter has an API that you should be querying to retrieve data such as tweets. It is far more efficient than crawling the HTML.
The statuses/user_timeline API service returns a list of tweets from any non-protected user. Here's an example of this service, configured to retrieve tweets for the user FRKT_ (that's me). You can customize the data it returns in many ways, such as by appending the count parameter to the URL to specify how many tweets you'd like to retrieve.
You should use a proper parser such as SimpleXML (for the XML response) rather than miscellaneous string functions like strpos(), as in your example, to parse the data returned from the API.
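For example, here's a minimal PHP sketch against the JSON variant of that endpoint (this uses the old unauthenticated v1 URL; current API versions require OAuth, and count is capped at 200 per request):
<?php
// Minimal sketch: fetch a user's recent tweets from the old v1 JSON endpoint.
// Newer API versions require OAuth, which is omitted here.
$user  = 'FRKT_';
$count = 200; // maximum allowed per request
$url = "https://api.twitter.com/1/statuses/user_timeline.json?screen_name={$user}&count={$count}";

$tweets = json_decode(file_get_contents($url), true);
foreach ($tweets as $tweet) {
    echo $tweet['created_at'] . ': ' . $tweet['text'] . "\n";
}
?>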
Twitter Libraries has PHP libraries listed. Whether you can grab all of a single user's tweets with them, I'm afraid I don't know, but the libs should be a good starting point.
They only return a maximum of 3200 tweets per user via GET statuses/user_timeline; for more info, look here:
https://dev.twitter.com/discussions/1157
