Facebook API: count attendees for event (limit problem) - PHP

Okay, normally I'm fine with the Facebook API, but I'm having a problem that keeps me wondering. (I think it's a bug, see ticket http://bugs.developers.facebook.net/show_bug.cgi?id=13694, but I wanted to throw it out here in case somebody has an idea.)
I'm using the Facebook PHP library to count all attendees for a specific event:
$attending = $facebook->api('/'.$fbparams['eventId'].'/attending');
This works without a problem and correctly returns an array with attendees...
Now here's the problem:
This event has about 18,000 attendees right now.
The API call returns a maximum of 992 attendees (and not the 18,000 it should).
I tried
$attending = $facebook->api('/'.$fbparams['eventId'].'/attending?limit=20000');
for testing but it doesn't change anything.
So my actual question is:
If I can't get it to work through the Graph API, what would be a good alternative? (Parsing the HTML of the event page, maybe?) Right now I'm updating the value by hand every few hours, which is tedious and unnecessary.

Actually there are two parameters, limit and offset. I think you will have to play with both and keep making calls until one call returns fewer attendees than the maximum limit.
Something like this, but in a loop or recursive approach (this is pseudo-code; a PHP sketch follows below):
offset = 0;
maxLimit = 992;
totalAttendees = count(result);
if (totalAttendees >= maxLimit) {
    // do your stuff with each attendee
    offset += totalAttendees;
    // make a new call with the updated offset
    // and check again
}
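For reference, here is a rough PHP sketch of that loop using the same PHP SDK as the question. It is only a sketch: the page size and the helper name fetchAllAttendees() are made up, and the per-call cap described in the question may still apply.
function fetchAllAttendees($facebook, $eventId) {
    $attendees = array();
    $limit = 500;  // requested page size per call
    $offset = 0;
    do {
        // Fetch one page of attendees using limit/offset pagination.
        $result = $facebook->api('/' . $eventId . '/attending?limit=' . $limit . '&offset=' . $offset);
        $page = isset($result['data']) ? $result['data'] : array();
        $attendees = array_merge($attendees, $page);
        $offset += count($page);
        // Stop when a page comes back smaller than the requested limit.
    } while (count($page) == $limit);
    return $attendees;
}
$attending = fetchAllAttendees($facebook, $fbparams['eventId']);
echo count($attending) . " attendees\n";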

I've searched a lot and this is how I fixed it:
Instead of paging through /attending, request the event's attending_count and invited_count fields directly from the Graph API (you can test the request in the Graph API Explorer). Here is the code I used; it runs inside a Drupal site, so it relies on Drupal helpers:
function events_get_facebook_data($event_id) {
    if (!$event_id) {
        return false;
    }
    // Site-specific helper that returns a valid Facebook access token.
    $token = klicango_friends_facebook_token();
    if ($token) {
        $parameters['access_token'] = $token;
        // Ask the Graph API for the counts directly instead of listing attendees.
        $parameters['fields'] = 'attending_count,invited_count';
        $graph_url = url('https://graph.facebook.com/v2.2/' . $event_id, array('absolute' => TRUE, 'query' => $parameters));
        $graph_result = drupal_http_request($graph_url, array(), 'GET');
        if (is_object($graph_result) && !empty($graph_result->data)) {
            $data = json_decode($graph_result->data);
            $going = $data->attending_count;
            $invited = $data->invited_count;
            return array('going' => $going, 'invited' => $invited);
        }
        return false;
    }
    return false;
}
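If you are not on Drupal, the same Graph API request can be made with plain cURL. A minimal sketch under that assumption (the access token handling is up to you; the function name is just for illustration):
function facebook_event_counts($event_id, $access_token) {
    // Ask the Graph API only for the two counters we need.
    $url = 'https://graph.facebook.com/v2.2/' . $event_id
         . '?fields=attending_count,invited_count'
         . '&access_token=' . urlencode($access_token);
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $body = curl_exec($ch);
    curl_close($ch);
    if ($body === FALSE) {
        return FALSE;
    }
    $data = json_decode($body);
    if (!isset($data->attending_count)) {
        return FALSE;
    }
    return array('going' => $data->attending_count, 'invited' => $data->invited_count);
}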

Try this FQL query (the event table exposes the counts directly; replace "event" with your event ID):
SELECT eid, attending_count, unsure_count, all_members_count FROM event WHERE eid = "event"
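If you are already using the PHP SDK from the question, one way to run that query is through the SDK's legacy REST bridge. A rough sketch, assuming your SDK version still exposes fql.query (variable names follow the question):
$fql = 'SELECT eid, attending_count, unsure_count, all_members_count '
     . 'FROM event WHERE eid = ' . $fbparams['eventId'];
$rows = $facebook->api(array(
    'method' => 'fql.query', // legacy REST method exposed by the old PHP SDK
    'query'  => $fql,
));
if (!empty($rows)) {
    echo 'Attending: ' . $rows[0]['attending_count'] . "\n";
}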

Related

Google Search Console API group results

I'm working on a project which uses the GSC API. Everything's working fine, but I'm stuck with one kind of query: I want to group the results by page so I can get all the queries (or keywords) for each page.
Here's what a request looks like:
public function getTopPages($sitename) {
    $this->client->setAccessToken($this->getStoredToken());
    $searchRequest = new \Google_Service_Webmasters_SearchAnalyticsQueryRequest;
    // 12 months back from today ('YYYY-MM-DD')
    $searchRequest->setStartDate($this->getDateMinusMonths("-12 months"));
    $searchRequest->setEndDate(date("Y-m-d", time()));
    $dimensions = ['page', 'query'];
    $searchRequest->setDimensions($dimensions);
    $searchRequest->setRowLimit(10);
    $searchRequest->setAggregationType('byPage');
    $siteQueries = $this->webmaster->searchanalytics->query($sitename, $searchRequest);
    if (count($siteQueries) > 0) {
        $siteQueries = $siteQueries->getRows();
        if ($siteQueries) {
            return $siteQueries;
        }
    }
}
I tried to change the aggregationType property, but it doesn't change the result.
Here's the result:
As you can see, the same URL (url-example-1) comes back for different keywords.
Is it possible to merge them directly in the request, summing all the data (clicks, impressions, CTR...)?
I know I could write a script to do it, but I believe it's possible to do it natively.
Thanks a lot.
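As far as I know, the Search Analytics API aggregates by whatever dimensions you request, so per-page totals and the per-query rows don't come back merged in a single call. A minimal sketch of merging the rows client-side, assuming the rows are the usual Google_Service_Webmasters_ApiDataRow objects returned by getTopPages() (groupRowsByPage is just an illustrative name):
function groupRowsByPage(array $rows) {
    $pages = [];
    foreach ($rows as $row) {
        // getKeys() follows the order of the requested dimensions: [page, query].
        $keys = $row->getKeys();
        $page = $keys[0];
        if (!isset($pages[$page])) {
            $pages[$page] = ['clicks' => 0, 'impressions' => 0, 'queries' => []];
        }
        $pages[$page]['clicks'] += $row->getClicks();
        $pages[$page]['impressions'] += $row->getImpressions();
        $pages[$page]['queries'][] = $keys[1];
    }
    // Recompute CTR from the summed totals rather than averaging per-row CTRs.
    foreach ($pages as &$totals) {
        $totals['ctr'] = $totals['impressions'] > 0 ? $totals['clicks'] / $totals['impressions'] : 0;
    }
    unset($totals);
    return $pages;
}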

Google Analytics API suddenly stopped returning data after working for many months

My Analytics API queries stopped returning rows the other day, even though I can still see the data in Analytics itself. What could cause this? Is it a quota issue? Are there any messages the API returns that could tell me what is going on?
It worked flawlessly prior to this. I'm not seeing any visible errors when I manually run the script. I'm using Composer to include the SDK, and it's up to date. I do appear to be able to initialize my Google_Service_AnalyticsReporting($client).
When I make the report request, I just don't get any rows or any indication as to why I'm no longer getting any rows back.
My code is based on the accepted answer here: How to get the next 10,000 data from google analytics api using php?
Apparently I am still getting data, but the while loop never fires now. I have views where the nextPageToken value is blank, and at least one where it is > 10000.
Any help on steps to take to troubleshoot this would be greatly appreciated. The issue started on 9/19.
$data = $analytics->reports->batchGet( $body );
$cnt = 0;
while ($data->reports[0]->nextPageToken > 0 && $cnt < 10) {
    // There are more rows for this report.
    $body->reportRequests[0]->setPageToken($data->reports[0]->nextPageToken);
    $pagedata = $analytics->reports->batchGet( $body );
    $this->processResults($mysqli, $pagedata, $auth);
}
Checking whether nextPageToken is a number fixed my issue:
if (is_numeric($data->reports[0]->nextPageToken)) {
    // nextPageToken is a number, so there are more pages to fetch
    $cnt = 0;
    while ($data->reports[0]->nextPageToken > 0 && $cnt < 5) {
        $body->reportRequests[0]->setPageToken($data->reports[0]->nextPageToken);
        $data = $analytics->reports->batchGet( $body );
        $this->processResults($mysqli, $data, $auth);
        $cnt++;
    }
} else {
    // nextPageToken is not a number (blank/absent), so this is the only page
    $data = $analytics->reports->batchGet( $body );
    $this->processResults($mysqli, $data, $auth);
}

ZohoCRM Module getRecords PHP

I'm using the ZohoCRM PHP SDK to attempt to pull all Account records from the CRM and manipulate them locally (do some reports). The basic code looks like this, which works fine:
$account_module = ZCRMModule::getInstance('Accounts');
$response = $account_module->getRecords();
$records = $response->getData();
foreach ($records as $record) {
    // do stuff
}
The problem is that the $records object only contains 200 records (out of about 3,000 total). I can't find anything in the (minimal, poorly organized) SDK documentation showing how to paginate or get bigger result sets, and the Zoho code samples on the dev site don't seem to be using the same SDK for some reason.
Does anyone know how I can paginate through these records?
The getRecords() method seems to accept two parameters; this is used in some of their examples. You should be able to use those params to set/control pagination:
$param_map = ["page" => "20", "per_page" => "200"];
$response = $account_module->getRecords($param_map);
@dakdad was right that you can pass the page and per-page values in the param map. You should also use $response->getInfo()->getMoreRecords() to determine whether you need to paginate. Something like this seems to work:
$account_module = ZCRMModule::getInstance('Accounts');
$page = 1;
$has_more = true;
while ($has_more) {
    $param_map = ["page" => $page, "per_page" => "200"];
    $response = $account_module->getRecords($param_map);
    $has_more = $response->getInfo()->getMoreRecords();
    $records = $response->getData();
    foreach ($records as $record) {
        // do stuff
    }
    $page++;
}

Pushing requests from a do-while loop using Guzzle

I have a problem with my results array. What I initially intended to have is something like this:
$promises = [
    '0' => $client->getAsync("www.api.com/opportunities?api=key&page=1&fields=['fields']"),
    '1' => $client->getAsync("www.api.com/opportunities?api=key&page=2&fields=['fields']"),
    '2' => $client->getAsync("www.api.com/opportunities?api=key&page=3&fields=['fields']")
];
An array of request promises. I want to use it because I need to retrieve a collection of data from the API. This is what the API's first page looks like:
In my requests I want to get pages 2, 3 and 4.
This is what page 2 looks like:
I made a do-while loop in my PHP script, but it seems to run an infinite loop.
This is how it should work: first I run the initial request, get totalRecordCount = 154 and subtract recordCount = 100 from it; if the difference is != 0, it runs again with the next page number and pushes the URL to the promises.
Here's my function code:
function runRequest(){
    $promises = [];
    $client = new \GuzzleHttp\Client();
    $pageCounter = 1;
    $globalCount = 0;
    do {
        //request first and then check if has difference
        $url = 'https://api.com/opportunities_dates?key='.$GLOBALS['API_KEY'].'&page='.$pageCounter.'&fields=["fields"]';
        $initialRequest = $client->getAsync($url);
        $initialRequest->then(function ($response) {
            return $response;
        });
        $initialResponse = $initialRequest->wait();
        $initialBody = json_decode($initialResponse->getBody());
        $totalRecordCount = $initialBody->totalRecordCount; //154
        $recordCount = $initialBody->recordCount; //100
        $difference = $totalRecordCount - $recordCount; //54
        $pageCounter++;
        $globalCount += $recordCount;
        array_push($promises, $url);
    } while ($totalRecordCount >= $globalCount);
    return $promises;
}
$a = runRequest();
print_r($a); // contains an array of endpoints like in the sample above
There is an endless loop because you keep looping while the total record count is greater than or equal to the running count. Page 3 and above return 0 records, so $globalCount stays at 154 while $totalRecordCount is also 154, and the >= comparison stays true forever. Replacing the >= with a > will fix the loop.
However, the code will still not work as you expect it to. For each page you prepare a request with getAsync() and immediately call wait(). The then() callback does nothing: it returns the response, which is what happens by default anyway. So in practice these are all synchronous requests.
Given that the page size is constant, you can calculate the pages you need from the information returned by the first request.
function runRequest(){
    $promises = [];
    $client = new \GuzzleHttp\Client();
    $url = 'https://api.com/opportunities_dates?key='.$GLOBALS['API_KEY'].'&fields=["fields"]';
    // Initial request to get total record count and page count
    $initialRequest = $client->getAsync($url.'&page=1');
    $initialResponse = $initialRequest->wait();
    $initialBody = json_decode($initialResponse->getBody());
    $promises[] = $initialRequest;
    $totalRecordCount = $initialBody->totalRecordCount; //154
    $pageSize = $initialBody->pageSize; //100
    $nrOfPages = ceil($totalRecordCount / $pageSize); //2
    for ($page = 2; $page <= $nrOfPages; $page++) {
        $promises[] = $client->getAsync($url.'&page='.$page);
    }
    return $promises;
}
$promises = runRequest();
$responses = \GuzzleHttp\Promise\unwrap($promises);
Note that the function now returns promises and not URLs as strings.
It doesn't matter that the first promise is already settled. The unwrap function will not cause another GET request for page 1, but return the existing response. For all other pages, the requests are done concurrently.
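From there, a minimal sketch of consuming the settled responses (recordCount is taken from the question's sample payload; any other field names would be assumptions about the API):
// Decode each page's JSON body into an array of page objects.
$allPages = [];
foreach ($responses as $response) {
    $allPages[] = json_decode((string) $response->getBody());
}
// Example: sum the record counts across the fetched pages.
$total = 0;
foreach ($allPages as $page) {
    $total += $page->recordCount;
}
echo $total . ' records fetched' . PHP_EOL;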

PHP RRD graph separation

I am trying to create RRD graphs with the help of PHP in order to keep track of the inoctets, outoctets and a counter of a server.
So far the script is operating as expected, but my problem comes when I try to produce two or more separate graphs. I am trying to produce hourly, weekly, etc. graphs. I thought creating a loop would solve my problem, since I have split the RRAs into hours and days. Unfortunately I end up with two graphs that update simultaneously as expected but plot the same thing. Has anyone encountered a similar problem? I have implemented the same program in Perl with RRD::Simple, where it is extremely easy and everything is adjusted almost automatically.
I have supplied below a working example of my code with the minimum possible data, because the full code is a bit long:
<?php
$file = "snmp-2";
$rrdFile = dirname(__FILE__) . "/snmp-2.rrd";
$in = "ifInOctets";
$out = "ifOutOctets";
$count = "sysUpTime";
$step = 5;
$rounds = 1;
$output = array("Hourly", "Daily");
while (1) {
    sleep(6);
    $options = array(
        "--start", "now -15s", // Now -10 seconds (default)
        "--step", "".$step."",
        "DS:".$in.":GAUGE:10:U:U",
        "DS:".$out.":GAUGE:10:U:U",
        "DS:".$count.":ABSOLUTE:10:0:4294967295",
        "RRA:MIN:0.5:12:60",
        "RRA:MAX:0.5:12:60",
        "RRA:LAST:0.5:12:60",
        "RRA:AVERAGE:0.5:12:60",
        "RRA:MIN:0.5:300:60",
        "RRA:MAX:0.5:300:60",
        "RRA:LAST:0.5:300:60",
        "RRA:AVERAGE:0.5:300:60",
    );
    if (!isset($create)) {
        $create = rrd_create(
            "".$rrdFile."",
            $options
        );
        if ($create === FALSE) {
            echo "Creation error: ".rrd_error()."\n";
        }
    }
    $t = time();
    $ifInOctets = rand(0, 4294967295);
    $ifOutOctets = rand(0, 4294967295);
    $sysUpTime = rand(0, 4294967295);
    $update = rrd_update(
        "".$rrdFile."",
        array(
            "".$t.":".$ifInOctets.":".$ifOutOctets.":".$sysUpTime.""
        )
    );
    if ($update === FALSE) {
        echo "Update error: ".rrd_error()."\n";
    }
    $start = $t - ($step * $rounds);
    foreach ($output as $test) {
        $final = array(
            "--start", "".$start." -15s",
            "--end", "".$t."",
            "--step", "".$step."",
            "--title=".$file." RRD::Graph",
            "--vertical-label=Byte(s)/sec",
            "--right-axis-label=latency(min.)",
            "--alt-y-grid", "--rigid",
            "--width", "800", "--height", "500",
            "--lower-limit=0",
            "--alt-autoscale-max",
            "--no-gridfit",
            "--slope-mode",
            "DEF:".$in."_def=".$file.".rrd:".$in.":AVERAGE",
            "DEF:".$out."_def=".$file.".rrd:".$out.":AVERAGE",
            "DEF:".$count."_def=".$file.".rrd:".$count.":AVERAGE",
            "CDEF:inbytes=".$in."_def,8,/",
            "CDEF:outbytes=".$out."_def,8,/",
            "CDEF:counter=".$count."_def,8,/",
            "COMMENT:\\n",
            "LINE2:".$in."_def#FF0000:".$in."",
            "COMMENT:\\n",
            "LINE2:".$out."_def#0000FF:".$out."",
            "COMMENT:\\n",
            "LINE2:".$count."_def#FFFF00:".$count."",
        );
        $outputPngFile = rrd_graph(
            "".$test.".png",
            $final
        );
        if ($outputPngFile === FALSE) {
            echo "<b>Graph error: </b>".rrd_error()."\n";
        }
    } /* End of foreach loop */
    $debug = rrd_lastupdate(
        "".$rrdFile.""
    );
    if ($debug === FALSE) {
        echo "<b>Graph result error: </b>".rrd_error()."\n";
    }
    var_dump($debug);
    $rounds++;
} /* End of while loop */
?>
A couple of issues.
Firstly, your definition of the RRD has a step of 5 seconds and RRAs with steps of 12x5s = 1 min and 300x5s = 25 min. They also have a length of only 60 rows, so they cover 1 hour and 25 hours respectively. You'll never get a weekly graph this way! You need to add more rows; also, the step seems rather short, and you might want a smaller-step RRA for hourly graphs and a larger-step one for weekly graphs.
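For instance, with a 5-second base step, a layout along these lines would keep about an hour of full-resolution data, a day at 1-minute resolution and a week at 1-hour resolution (the exact numbers are only an illustration):
// One possible RRA layout for a 5-second base step (illustrative numbers):
$rraOptions = array(
    "RRA:AVERAGE:0.5:1:720",   // 5 s steps,   720 rows  = 1 hour
    "RRA:AVERAGE:0.5:12:1440", // 1 min steps, 1440 rows = 1 day
    "RRA:AVERAGE:0.5:720:168", // 1 h steps,   168 rows  = 1 week
);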
Secondly, it is not clear how you're calling the graph function. You seem to be specifying:
"--start","".$start." -15s",
"--end", "".$t."",
"--step","".$step."",
...which would force it to use the 5 s interval (which is unavailable, so the 1 min one would always get used) and would make the graph cover only the time window from the start to the last update, not an 'hourly' or 'daily' window as you were asking for.
Note that the RRAs you have defined do not determine the time window of the graph you ask for. Also, just because you have more than one RRA defined doesn't mean you'll get more than one graph unless you call the graph function twice with different arguments.
If you want an hourly graph, use
"--start", "end - 1 hour",
"--end", $t,
Do not specify a step, as the most appropriate available one will be used anyway. For a daily graph, use
"--start", "end - 1 day",
"--end", $t,
Similarly, there is no need to specify a step.
Hopefully this makes it a little clearer. Most of the RRD graph options have sensible defaults, and RRDTool is pretty good at picking the correct RRA to use based on the graph size, time window, and DEF statements.
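To make that concrete, here is a rough sketch of the calling side, reusing the question's variable names and trimming the DEF/LINE arguments for brevity (just an illustration, not a drop-in replacement):
// Render one PNG per time window instead of reusing the same --start for both.
$windows = array(
    "Hourly" => "end - 1 hour",
    "Daily"  => "end - 1 day",
);
foreach ($windows as $name => $start) {
    $args = array(
        "--start", $start,
        "--end", "".$t."",
        "--title=".$file." ".$name,
        "--vertical-label=Byte(s)/sec",
        // No --step: rrd_graph picks the best-matching RRA itself.
        "DEF:".$in."_def=".$file.".rrd:".$in.":AVERAGE",
        "LINE2:".$in."_def#FF0000:".$in,
    );
    if (rrd_graph($name . ".png", $args) === FALSE) {
        echo "Graph error: " . rrd_error() . "\n";
    }
}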
