All the questions with examples are about two years old, and I can't find ANY working example for the current API client version (https://github.com/google/google-api-php-client). Every single one is missing something or throwing an exception...
Could anybody provide a working example? There is no documentation AT ALL, anywhere.
This is the closest to working that I have found:
<?php
require_once 'inc/google-api/autoload.php'; // or wherever autoload.php is located
$client = new Google_Client();
$client->setApplicationName("whatever");
$client->setDeveloperKey("some key");
$service = new Google_Service_Bigquery($client);
$postBody = "[
'datasetReference' => [
'datasetId' => $datasetId,
'projectId' => $projectId,
],
'friendlyName' => $name,
'description' => $description,
'access' => [
['role' => 'READER', 'specialGroup' => 'projectReaders'],
['role' => 'WRITER', 'specialGroup' => 'projectWriters'],
['role' => 'OWNER', 'specialGroup' => 'projectOwners'],
],
]";
$dataset = $service->datasets->insert($projectId, new Google_Dataset($postBody));
$postBody = "[
'tableReference' => [
'projectId' => 'test_project_id',
'datasetId' => 'test_data_set',
'tableId' => 'test_data_table'
]
]";
$table = $service->tables->insert($projectId, $datasetId, new Google_Table($postBody));
?>
But I am getting fatal errors about Google_Dataset and Google_Table not being defined...
Here is code that:
properly creates a Google_Client
runs a job async
displays the running job ID and status
You need to have:
service account created (something like ...@developer.gserviceaccount.com)
your key file (.p12)
service_token_file_location (writable path to store the JSON from the handshake, it will be valid for 1h)
code sample:
function getGoogleClient($data = null) {
global $service_token_file_location, $key_file_location, $service_account_name;
$client = new Google_Client();
$client->setApplicationName("Client_Library_Examples");
$old_service_token = null;
$service_token = @file_get_contents($service_token_file_location); // suppress the warning if the token file does not exist yet
$client->setAccessToken($service_token);
$key = file_get_contents($key_file_location);
$cred = new Google_Auth_AssertionCredentials(
$service_account_name, array(
'https://www.googleapis.com/auth/bigquery',
'https://www.googleapis.com/auth/devstorage.full_control'
), $key
);
$client->setAssertionCredentials($cred);
if ($client->getAuth()->isAccessTokenExpired()) {
$client->getAuth()->refreshTokenWithAssertion($cred);
$service_token = $client->getAccessToken();
file_put_contents($service_token_file_location, $service_token); // persist the fresh token so the next run can reuse it
}
return $client;
}
$client = getGoogleClient();
$bq = new Google_Service_Bigquery($client);
/**
* @see https://developers.google.com/bigquery/docs/reference/v2/jobs#resource
*/
$job = new Google_Service_Bigquery_Job();
$config = new Google_Service_Bigquery_JobConfiguration();
$config->setDryRun(false);
$queryConfig = new Google_Service_Bigquery_JobConfigurationQuery();
$config->setQuery($queryConfig);
$job->setConfiguration($config);
$destinationTable = new Google_Service_Bigquery_TableReference();
$destinationTable->setDatasetId(DATASET_ID);
$destinationTable->setProjectId(PROJECT_ID);
$destinationTable->setTableId('table1');
$queryConfig->setDestinationTable($destinationTable);
$sql = "select * from publicdata:samples.github_timeline limit 10";
$queryConfig->setQuery($sql);
try {
// print_r($job);
// exit;
$job = $bq->jobs->insert(PROJECT_ID, $job);
$status = new Google_Service_Bigquery_JobStatus();
$status = $job->getStatus();
// print_r($status);
if ($status->count() != 0) {
$err_res = $status->getErrorResult();
die($err_res->getMessage());
}
} catch (Google_Service_Exception $e) {
echo $e->getMessage();
exit;
}
//print_r($job);
$jr = $job->getJobReference();
//var_dump($jr);
$jobId = $jr['jobId'];
if ($status)
$state = $status['state'];
echo 'JOBID:' . $jobId . " ";
echo 'STATUS:' . $state;
You can grab the results with:
$res = $bq->jobs->getQueryResults(PROJECT_ID, $_GET['jobId'], array('timeoutMs' => 1000));
if (!$res->jobComplete) {
echo "Job not yet complete";
exit;
}
echo "<p>Total rows: " . $res->totalRows . "</p>\r\n";
//see the results made it as an object ok
//print_r($res);
$rows = $res->getRows();
$r = new Google_Service_Bigquery_TableRow();
$a = array();
foreach ($rows as $r) {
$r = $r->getF();
$temp = array();
foreach ($r as $v) {
$temp[] = $v->v;
}
$a[] = $temp;
}
print_r($a);
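If jobComplete comes back false, you can also poll the job until BigQuery reports it DONE and then request the results again. A minimal sketch, assuming the $bq service from above and the job's $jobId (the one-second sleep is an arbitrary choice, not from the original answer):
// Poll the job until its state is DONE, then getQueryResults() can be retried.
do {
    sleep(1); // arbitrary back-off between polls
    $polled = $bq->jobs->get(PROJECT_ID, $jobId);
    $state = $polled->getStatus()->getState();
} while ($state != 'DONE');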
Here you can see the classes that you can use for your other BigQuery calls. When you read the file, be aware that it is generated from other sources, hence it looks strange for PHP, and you need to learn how to read it in order to use its methods:
https://github.com/google/google-api-php-client/blob/master/src/Google/Service/Bigquery.php
like:
Google_Service_Bigquery_TableRow
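As for the fatal errors in the question: in the current client every model class carries the service prefix, so Google_Dataset and Google_Table are now Google_Service_Bigquery_Dataset and Google_Service_Bigquery_Table. A minimal sketch of the two inserts using those classes (reusing $service, $projectId and $datasetId from the question):
// Create the dataset with the service-prefixed model classes
$datasetRef = new Google_Service_Bigquery_DatasetReference();
$datasetRef->setProjectId($projectId);
$datasetRef->setDatasetId($datasetId);
$dataset = new Google_Service_Bigquery_Dataset();
$dataset->setDatasetReference($datasetRef);
$service->datasets->insert($projectId, $dataset);
// Then create a table inside it
$tableRef = new Google_Service_Bigquery_TableReference();
$tableRef->setProjectId($projectId);
$tableRef->setDatasetId($datasetId);
$tableRef->setTableId('test_data_table');
$table = new Google_Service_Bigquery_Table();
$table->setTableReference($tableRef);
$service->tables->insert($projectId, $datasetId, $table);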
I have this code and am trying to var_dump my $res, but I am not getting anything. I have installed Minishlink's web-push library in my root directory. This is my code, please check it:
send_push.php
require __DIR__ . '/vendor/autoload.php';
use Minishlink\WebPush\WebPush;
// CONNECT TO THE DATABASE
include_once("php_includes/conn.php");
session_start();
if (isset($_SESSION['user_id'])) {
$userid = $_SESSION['user_id'] ;
}
$sam= "SELECT endpoint,p256dh,auth FROM endpointurl WHERE userid = ? ";
$stmt=$conn->prepare($sam);
$stmt->bind_param('i',$userid);
$stmt->execute();
$stmt->bind_result($endpoint,$p256dh,$key);
$stmt->fetch();
$stmt->close();
$auth = array(
'VAPID' => array(
'subject' => 'https://*****.com',
'publicKey' => '*********',
'privateKey' => '*************', // in the real world, this would be in a secret file
),
);
//exit($subscriber['endpoint'].' : '.$subscriber['auth'].' : '.$subscriber['p256dh']);
$webPush = new WebPush($auth); // some problem here
echo $endpoint; // not even echoing this
$res = $webPush->sendNotification(
$endpoint,
'{"title":"hello","msg":"yes it works","icon":"/icons/cam.png","badge":"/icons/cam.png","url":"https://*****.com"}',
str_replace(['_', '-'], ['/', '+'],$p256dh),
str_replace(['_', '-'], ['/', '+'],$key),
true
);
var_dump($res) ;
It is not even echoing $endpoint after creating the new WebPush().
Hi, maybe you can try putting this foreach after the sendNotification() call.
foreach ($webPush->flush() as $report) {}
You can leave the foreach empty, or you can inspect each report inside it if you want to debug:
foreach ($webPush->flush() as $report) {
$endpoint = $report->getRequest()->getUri()->__toString();
if ($report->isSuccess()) {
$result = "[v] Message sent successfully for subscription {$endpoint}.";
} else {
$result = "[x] Message failed to sent for subscription {$endpoint}: {$report->getReason()}";
}
}
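Also note: in newer releases of minishlink/web-push the notification is built from a Subscription object instead of the positional endpoint/key arguments used above. A rough sketch under that assumption (the method was renamed across major versions, so check the one you have installed):
use Minishlink\WebPush\Subscription;
// Hypothetical sketch: rebuild the subscription from the values fetched earlier.
$subscription = Subscription::create(array(
    'endpoint' => $endpoint,
    'publicKey' => $p256dh, // as stored from the browser
    'authToken' => $key,
));
$payload = '{"title":"hello","msg":"yes it works"}'; // your JSON payload
$webPush->queueNotification($subscription, $payload); // called sendNotification() in older versions
foreach ($webPush->flush() as $report) {
    var_dump($report->isSuccess());
}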
I'm currently trying to get a list of all users inside the domain and the logic I'm using is the following:
$service = new Google_Service_Directory($client);
$optParams = array(
'customer' => 'my_customer',
'maxResults' => 500,
'orderBy' => 'email',
);
$results = $service->users->listUsers($optParams);
$users = $results->getUsers();
foreach($users as $user) {
$usersemails = $user->getPrimaryEmail();
echo $usersemails.'<br>';
}
The problem is I only get a maximum of 500 users. I figured out that I have to use the next page token, so I tried this:
$service = new Google_Service_Directory($client);
$optParams = array(
'customer' => 'my_customer',
'maxResults' => 500,
'orderBy' => 'email',
'pageToken' => NULL,
);
$results = $service->users->listUsers($optParams);
$pageToken = $results->getNextPageToken();
$users = $results->getUsers();
while($pageToken);
foreach($users as $user) {
$usersemails = $user->getPrimaryEmail();
echo $usersemails.'<br>';
}
But I'm getting the following message:
504 Gateway Time-out. The server didn't respond in time.
Is there a problem with the code I'm using or is this a problem with the server?
After checking many times I was able to find out that the problem was in how I had written the code: the stray semicolon in while($pageToken); turns it into an empty infinite loop, which is why the server timed out. I have modified my code and now it works fine. In case anybody else runs into the same problem, perhaps this might help. This is the final code:
$service = new Google_Service_Directory($client);
$pageToken = NULL;
$optParams = array(
'customer' => 'my_customer'
);
try {
do {
if ($pageToken){
$optParams['pageToken'] = $pageToken;
}
$results = $service->users->listUsers($optParams);
$pageToken = $results->getNextPageToken();
$users = $results->getUsers();
foreach($users as $user) {
$usersemails = $user->getPrimaryEmail();
echo $usersemails.'<br>';
}
} while($pageToken);
} catch (Exception $e) {
print 'An error occurred: ' . $e->getMessage();
}
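One small note (my observation, not part of the original answer): the working version drops maxResults, so the API's default page size applies. You can add it back to the options to reduce the number of round trips:
$optParams = array(
    'customer' => 'my_customer',
    'maxResults' => 500 // 500 is the documented maximum per page
);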
I'm using TwitterAPIExchange and Instagram-PHP-API to request data from both APIs on my server, concatenate the results into an array, and build that into JSON which I can consume on the client with a simple GET request in jQuery.
This all works fine, however it is quite slow. When I make a request for static JSON data as a test, it is far quicker, so it seems the PHP script on my server runs in full each time (starting with an empty array, using the wrappers the libraries provide to make the API requests, and so on). What I would like to do is cache/store this data and only make fresh API requests once or twice a day.
My PHP knowledge is not the best, so I'm sure there is a relatively simple way of doing this.
Here is the PHP script on my server:
<?php
header('Content-type:application/json;charset=utf-8');
require_once('TwitterAPIExchange.php');
require_once('Instagram.php');
unset($allFeeds);
$allFeeds = array();
$settings = array(
'oauth_access_token' => "XXXX",
'oauth_access_token_secret' => "XXXX",
'consumer_key' => "XXXX",
'consumer_secret' => "XXXX"
);
$url = 'https://api.twitter.com/1.1/statuses/user_timeline.json';
$getfield = '?screen_name=XXX';
$requestMethod = 'GET';
$twitter = new TwitterAPIExchange($settings);
$twitter_response = $twitter->setGetfield($getfield)->buildOauth($url,$requestMethod)->performRequest();
$jsonTweets = json_decode($twitter_response);
foreach ($jsonTweets as $tweet) {
$tweetObj = new stdClass();
$tweetObj->type = 'Twitter';
$tweetObj->created_at = strtotime($tweet->created_at);
$tweetObj->text = $tweet->text;
$allFeeds[] = $tweetObj;
}
use MetzWeb\Instagram\Instagram;
$instagram = new Instagram(array(
'apiKey' => 'XXXX',
'apiSecret' => 'XXXX',
'apiCallback' => 'XXXX'
));
$token = 'XXX';
$code = $token;
$instagram->setAccessToken($code);
$id_one = 'XXX';
$id_two = 'XXX';
$insta_response_one = $instagram->getUserMedia($id_one, 20);
$insta_response_two = $instagram->getUserMedia($id_two, 10);
foreach ($insta_response_one->data as $insta_one) {
$instaObjOne = new stdClass();
$instaObjOne->type = 'Instagram One';
$instaObjOne->created_at = (int)$insta_one->created_time;
if (isset($insta_one->caption->text)) {
$instaObjOne->text = $insta_one->caption->text;
}
$instaObjOne->img = $insta_one->images->standard_resolution->url;
$allFeeds[] = $instaObjOne;
}
foreach ($insta_response_two->data as $insta_two) {
$instaObjTwo = new stdClass();
$instaObjTwo->type = 'Instagram Two';
$instaObjTwo->created_at = (int)$insta_two->created_time;
if (isset($insta_two->caption->text)) {
$instaObjTwo->text = $insta_two->caption->text;
}
$instaObjTwo->img = $insta_two->images->standard_resolution->url;
$allFeeds[] = $instaObjTwo;
}
function dateSort($a, $b) {
return $b->created_at - $a->created_at;
}
usort($allFeeds, "dateSort");
$data = json_encode($allFeeds);
// cache $data here?
echo $data;
?>
I'm bringing it into my frontend like so:
$.ajax({
type: 'GET',
url: 'path/to/script.php',
dataType: 'json'
}).done(function(data) {
// do stuff with data here
});
Hope that makes sense, thanks
This is the simplest way to add caching to your code.
It uses file_put_contents('cache.txt', $data) to cache the request data, and at the top of the file it compares the current time with the time the cache file was last modified. If the file was modified less than 24 hours ago, it just outputs the contents of the cache file using echo file_get_contents('cache.txt'); and stops the script.
<?php
header('Content-type:application/json;charset=utf-8');
// Check if file was modified less than 24 hours ago
if (file_exists('cache.txt') && (time() - filemtime('cache.txt')) < 24 * 60 * 60) {
// Output contents of cache file and stop script
echo file_get_contents('cache.txt');
exit();
}
require_once('TwitterAPIExchange.php');
require_once('Instagram.php');
unset($allFeeds);
$allFeeds = array();
$settings = array(
'oauth_access_token' => "XXXX",
'oauth_access_token_secret' => "XXXX",
'consumer_key' => "XXXX",
'consumer_secret' => "XXXX"
);
$url = 'https://api.twitter.com/1.1/statuses/user_timeline.json';
$getfield = '?screen_name=XXX';
$requestMethod = 'GET';
$twitter = new TwitterAPIExchange($settings);
$twitter_response = $twitter->setGetfield($getfield)->buildOauth($url,$requestMethod)->performRequest();
$jsonTweets = json_decode($twitter_response);
foreach ($jsonTweets as $tweet) {
$tweetObj = new stdClass();
$tweetObj->type = 'Twitter';
$tweetObj->created_at = strtotime($tweet->created_at);
$tweetObj->text = $tweet->text;
$allFeeds[] = $tweetObj;
}
use MetzWeb\Instagram\Instagram;
$instagram = new Instagram(array(
'apiKey' => 'XXXX',
'apiSecret' => 'XXXX',
'apiCallback' => 'XXXX'
));
$token = 'XXX';
$code = $token;
$instagram->setAccessToken($code);
$id_one = 'XXX';
$id_two = 'XXX';
$insta_response_one = $instagram->getUserMedia($id_one, 20);
$insta_response_two = $instagram->getUserMedia($id_two, 10);
foreach ($insta_response_one->data as $insta_one) {
$instaObjOne = new stdClass();
$instaObjOne->type = 'Instagram One';
$instaObjOne->created_at = (int)$insta_one->created_time;
if (isset($insta_one->caption->text)) {
$instaObjOne->text = $insta_one->caption->text;
}
$instaObjOne->img = $insta_one->images->standard_resolution->url;
$allFeeds[] = $instaObjOne;
}
foreach ($insta_response_two->data as $insta_two) {
$instaObjTwo = new stdClass();
$instaObjTwo->type = 'Instagram Two';
$instaObjTwo->created_at = (int)$insta_two->created_time;
if (isset($insta_two->caption->text)) {
$instaObjTwo->text = $insta_two->caption->text;
}
$instaObjTwo->img = $insta_two->images->standard_resolution->url;
$allFeeds[] = $instaObjTwo;
}
function dateSort($a, $b) {
return $b->created_at - $a->created_at;
}
usort($allFeeds, "dateSort");
$data = json_encode($allFeeds);
// Cache $data here
file_put_contents('cache.txt', $data);
echo $data;
?>
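One optional refinement, a sketch rather than something the code above needs in order to work: if one request reads cache.txt at the exact moment another is rewriting it, it could see a partially written file. Writing to a temporary file first and then renaming it avoids that, since rename() to a path on the same filesystem replaces the file in one step:
// Write the cache atomically: readers see either the old or the new file, never a partial one.
$tmp = 'cache.txt.tmp';
file_put_contents($tmp, $data);
rename($tmp, 'cache.txt');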
Using PHP, I am able to run BigQuery queries manually, but I need them to run as an automatic cron job (without a Gmail login). How would I accomplish this? Thanks.
You need to create a service account in the developer console; then you will be able to use it from code. Here is what the code in our cron files does:
properly creates a Google_Client (using https://github.com/google/google-api-php-client)
runs a job async
displays the running job ID and status
The code itself, from the getGoogleClient() service-account helper through running the job and fetching the results, is identical to the sample in the answer earlier on this page (you need the same three things: the service account, the .p12 key file, and a writable service_token_file_location), so it is not repeated here.
Also check out the questions tagged with [php] and [google-bigquery]
https://stackoverflow.com/questions/tagged/google-bigquery+php
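Because the script authenticates with the service account's .p12 key, there is no interactive Gmail login involved, so it can run unattended. A hypothetical crontab entry to run it daily at 06:00 (paths are placeholders, not from the original answer):
0 6 * * * /usr/bin/php /path/to/bigquery_cron.php >> /var/log/bigquery_cron.log 2>&1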
Using the SoundCloud PHP wrapper, I can successfully update a song's title, privacy, genre, and tags, but I can't figure out what I'm doing wrong with regard to the streamable property. When I send a true value to track[streamable], it remains false.
Here's what I'm working with:
<?php
require_once 'Soundcloud.php';
require './globaldatabase.php';
$access_token = $_POST['access_token'];
$trackid = $_POST['trackid'];
$title = $_POST['title'];
$genre = $_POST['genre'];
$tag_list = $_POST['tag_list'];
$privacy = $_POST['privacy'];
$release = $_POST['release'];
$streamable = true;
if($privacy=='disabled'){
$streamable = false;
$privacy = 'private';
}
$client = new Services_Soundcloud($sc_clientid, $sc_clientsecret);
$client->setAccessToken($access_token);
try {
$track = json_decode($client->get('tracks/'.$trackid));
$client->put('tracks/' . $track->id, array(
'track[title]' => $title,
'track[genre]' => $genre,
'track[tag_list]' => $tag_list,
'track[sharing]' => $privacy,
'track[release]' => $release,
'track[streamable]' => $streamable
));
$return = $client->get('tracks/' . $track->id);
$return_array[] = json_decode($return);
echo json_encode($return_array);
} catch (Services_Soundcloud_Invalid_Http_Response_Code_Exception $e) {
exit($e->getMessage());
}
?>
Try setting the track attribute api_streamable to true.
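That is, alongside the other attributes in the PUT from the question, something like this sketch (api_streamable is the attribute name suggested here; I have not verified it against current SoundCloud docs):
$client->put('tracks/' . $track->id, array(
    'track[sharing]' => $privacy,
    'track[api_streamable]' => true, // instead of track[streamable]
));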