I am having issues with the YouTube Data API. I'm working on a PHP function to retrieve all video information in JSON format from a YouTube channel, and then put that into a file.
So far I have managed to call the API and retrieve info for the first 50 videos; I also know how to create the file.
I know that I have to use the nextPageToken parameter to loop the function until there is no nextPageToken, but I'm stuck. I've seen a lot of similar posts with this problem, but none of them really helped me.
To sum up, I need help to:
use nextPageToken to collect information after 50 videos;
collect this information on each iteration of the loop.
So far the code I found and used:
function youtube_search($API_key, $channelID, $max_results, $next_page_token=''){
    $myQuery = "https://www.googleapis.com/youtube/v3/search?key=".$API_key."&channelId=".$channelID."&part=snippet,id&order=date&maxResults=".$max_results;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $myQuery);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_VERBOSE, 0);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $response = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($response);
    if(!empty($data->nextPageToken)){
        return youtube_search($API_key, $channelID, $max_results, $searchResponse['nextPageToken']);
    }
}
youtube_search($API_key, $channelID, $max_results, $next_page_token='');
Thank you for helping (I'm a humble beginner -_-).
I asked a friend and he helped me with my issue. My original code didn't work because I wasn't passing the right arguments... among many other problems. The code below retrieves video info from a channel and writes it to a JSON file:
<?php
$API_key = 'API_key';
$channelID = 'channelID';
$max_results = 50;
$table = array();
$file_name = 'all-videos.json';

function youtube_search($API_key, $channelID, $max_results, $next_page_token, $videos, $file){
    $dataquery = "https://www.googleapis.com/youtube/v3/search?part=snippet&channelId=".$channelID."&maxResults=".$max_results."&order=date&pageToken=".$next_page_token."&key=".$API_key;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $dataquery);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_VERBOSE, 0);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $response = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($response);
    foreach ($data->items as $item){
        array_push($videos, $item);
    }
    $content = json_encode($videos);
    file_put_contents($file, $content);
    if(!empty($data->nextPageToken)){
        youtube_search($API_key, $channelID, $max_results, $data->nextPageToken, $videos, $file);
    }
}

youtube_search($API_key, $channelID, $max_results, $next_page_token='', $table, $file_name);
?>
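For illustration (not part of the original answer), here is a quick sketch of how the generated file could be read back, assuming the all-videos.json written above and that each stored search item keeps its snippet with a title and a publishedAt date:
// Read the accumulated results back and list the video titles with their publication dates.
$videos = json_decode(file_get_contents('all-videos.json'));
foreach ($videos as $item) {
    echo $item->snippet->publishedAt.' - '.$item->snippet->title."\n";
}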
(Disclaimer: this is not an actual answer to the OP's question, but only a code refactoring of his own answer above.)
Here is your code refactored to use a do...while loop instead of tail recursion:
// $videos is taken by reference so the items collected on each page accumulate
// across calls, as they do in the original recursive version.
function youtube_search_paginated(
    $API_key, $channelID, $max_results, $next_page_token, &$videos, $file)
{
    $dataquery = "https://www.googleapis.com/youtube/v3/search?part=snippet&channelId=".$channelID."&maxResults=".$max_results."&order=date&key=".$API_key;
    if (!empty($next_page_token))
        $dataquery .= "&pageToken=".$next_page_token;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $dataquery);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_VERBOSE, 0);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $response = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($response);
    foreach ($data->items as $item){
        array_push($videos, $item);
    }
    $content = json_encode($videos);
    file_put_contents($file, $content);
    // Return the token of the next page, or NULL once the last page has been reached.
    return isset($data->nextPageToken) ? $data->nextPageToken : NULL;
}

function youtube_search(
    $API_key, $channelID, $max_results, $videos, $file)
{
    $next_page_token = NULL;
    do {
        $next_page_token = youtube_search_paginated(
            $API_key, $channelID, $max_results, $next_page_token, $videos, $file);
    } while (!empty($next_page_token));
}

youtube_search($API_key, $channelID, $max_results, $table, $file_name);
Related
I wanted to make an inline bot, and when I do this:
function sendResponse($url, $data){
    $ch = curl_init();
    //curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: multipart/form-data'));
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('inline_query_id' => $data['inline_query_id'], 'results' => json_encode($data['results'])));
    $output = curl_exec($ch);
    return $output;
}
It won't work. The error (with or without the header) is: {"ok":false,"error_code":400,"description":"[Error]: Bad request: Field \"message_text\" must be of type String"}
But when I do it like this:
function sendResponse($url, $data){
    $ch = curl_init();
    //curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: multipart/form-data'));
    curl_setopt($ch, CURLOPT_URL, $url.'?inline_query_id='.rawurlencode($data['inline_query_id']).'&results='.rawurlencode(json_encode($data['results'])));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    //curl_setopt($ch, CURLOPT_POST, 1);
    //curl_setopt($ch, CURLOPT_POSTFIELDS, $q);
    $output = curl_exec($ch);
    return $output;
}
It works! The problem is that with the second method the request URI becomes too large, so I cannot use it!
Any way I can send this data is okay with me! Thanks!
And the code for building $data is here:
$result = connectWebsite(SITE_SEARCH_URL, urlencode($update['inline_query']['query']));
$result = json_decode($result);
$output = array();
$output['inline_query_id'] = $update['inline_query']['id'];
$i = 0;
foreach($result as $post){
    $data = array();
    $data['type'] = 'article';
    $data['id'] = strval($post->ID);
    $data['title'] = '('.$post->atypes.') '.$post->title;
    if(strlen($post->content) > 2100)
        $tmp = substr($post->content, 0, 2096).'...';
    $data['message_text'] = '<b>'.$post->title.'</b>'.ucwords($post->genre, ',').$tmp;
    $data['parse_mode'] = 'HTML';
    if(strlen($post->content) > 200)
        $tmp = substr($post->content, 0, 196).'...';
    //$data['description'] = ucwords($post->genre, ',').' | '.$tmp;
    $output['results'][$i] = $data;
    $i++;
    if($i == MAX_RESULTS)
        break;
}
sendResponse(API_URL.'answerInlineQuery', $output);
It might help someone, so I'll answer it myself.
The problem was the UTF-8 encoding.
I replaced substr with mb_substr.
Besides, on the first line I added this: mb_internal_encoding("UTF-8").
And... the problem was solved. Now I can send my inline query results (or any other command) without the URL length problem.
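For illustration, here is a minimal sketch of that change applied to the loop above (using mb_strlen for the length check is my own addition, not mentioned in the answer; the cut-off lengths are the ones from the original code):
// Set once at the top of the script so all mb_* calls default to UTF-8.
mb_internal_encoding("UTF-8");
// Truncate with mb_substr so a multi-byte character is never cut in half,
// which would otherwise produce invalid UTF-8 and break json_encode().
if(mb_strlen($post->content) > 2100)
    $tmp = mb_substr($post->content, 0, 2096).'...';
else
    $tmp = $post->content;
$data['message_text'] = '<b>'.$post->title.'</b>'.ucwords($post->genre, ',').$tmp;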
Thanks everyone for your help
When I decode using the commented-out "$jsonString" string, it works very well.
But after using cURL it is not working; json_decode returns null.
Please help me with this.
if (isset($_POST['dkno'])) {
    $dcktNo = $_POST['dkno'];
    $url = 'http://ExampleStatus.php?dkno=' . $dcktNo;
    $myvars = '';
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $myvars);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $jsonString = curl_exec($ch);
    // $jsonString = '[{"branchname":"BHUBNESHWAR","consignee":"ICICI BANK LTD","currentstatus":"Delivered by : BHUBNESHWAR On - 25/07/2015 01:00","dlyflag":"Y","PODuploaded":"Not Uploaded"}]';
    if ($jsonString != '') {
        $json = str_replace(array('[', ']'), '', $jsonString);
        echo $json;
        $obj = json_decode($json);
        if (is_null($obj)) {
            die("<br/>Invalid JSON, don't need to keep on working on it");
        } else {
            $podStatus = $obj->PODuploaded;
        }
    }
}
After the cURL call I used the following approach to get only the JSON data from the HTML page.
1) fetchData.php
$url = 'http://DocketStatusApp.aspx?dkno=' . $dcktNo;
$myvars = '';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $myvars);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$jsonString = curl_exec($ch);
// now get only value
$dom = new DOMDocument();
$dom->loadHTML($jsonString);
$thediv = $dom->getElementById('Label1');
echo $thediv->textContent;
2) JSONprocess.php
if (isset($_POST['dkno'])) {
    $dcktNo = $_POST['dkno'];
    ob_start(); // begin collecting output
    include_once 'fetchData.php';
    $result = ob_get_clean(); // Completed collecting output
    // Now it will show & take only JSON Data from Div Tag
    $json = str_replace(array('[', ']'), '', $result);
    $obj = json_decode($json);
    if (is_null($obj)) {
        die("<br/>Invalid JSON, don't need to keep on working on it");
    } else {
        $podStatus = $obj->PODuploaded;
    }
}
I wanted to send requests to different URLs at the same time. This is the code I used. Can you please tell me what the error is here?
<?php
$data = R::find('savedata','list_name = ? order by id desc',array($i));
foreach ( $data as $list):
    $mh = curl_multi_init(); //set up a cURL multiple execution handle
    $ch = curl_init("https://example.com/save_data.php?NUM=$list->id&MSG=$message");
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_multi_add_handle($mh, $ch);
endforeach;
curl_multi_exec($mh,);
curl_multi_close($mh)
?>
Thank you all. I have corrected this by myself. I am sharing the corrected code with you:
$mobile = R::find('add_numbers', 'list_name = ? order by id desc', array($i));
foreach ($mobile as $list):
    try{
        $url = "https://example.com/save_data.php?NUM=$list->id&MSG=****";
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // TRUE
        curl_setopt($ch, CURLOPT_HEADER, 0); // DO NOT RETURN HTTP HEADERS
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // RETURN THE CONTENTS OF THE CALL
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
        $results = curl_exec($ch);
        $i[] = $results;
    } catch (Exception $e) {
        // handle or log the failed request as needed
    }
endforeach;
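For reference, if the goal is still to fire the requests at the same time as in the original attempt, a minimal curl_multi sketch of that pattern (using the same placeholder URL, not tested against your data) would look roughly like this:
$mh = curl_multi_init();                      // one multi handle shared by all requests
$handles = array();
foreach ($mobile as $list):
    $ch = curl_init("https://example.com/save_data.php?NUM=$list->id&MSG=****");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_multi_add_handle($mh, $ch);          // register each easy handle with the multi handle
    $handles[] = $ch;
endforeach;
// drive all transfers until none are still running
do {
    $status = curl_multi_exec($mh, $running); // the second argument receives the number of active transfers
    if ($running) {
        curl_multi_select($mh);               // wait for activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);
// collect the responses and clean up
$results = array();
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);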
This is my cURL POST function:
public function curlPost($url, $data)
{
    $fields = '';
    foreach($data as $key => $value) {
        $fields .= $key . '=' . $value . '&';
    }
    $fields = rtrim($fields, '&'); // rtrim returns the trimmed string; it does not modify $fields in place
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, count($data));
    curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $result = curl_exec($ch);
    $info = curl_getinfo($ch);
    curl_close($ch);
}
$this->curlPost('remoteServer', array(data));
How do I read the POST data on the remote server?
The remote server is using PHP... but which variable in $_POST should I read?
For example: $_POST['fields'] or $_POST['result']?
Your code works, but I'd advise you to add two other things:
A. CURLOPT_FOLLOWLOCATION, because of HTTP 302 redirects:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
B. A return, in case you need to output the result:
return $result;
Example
function curlPost($url, $data) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
    $result = curl_exec($ch);
    $info = curl_getinfo($ch);
    curl_close($ch);
    return $result;
}

print(curlPost("http://yahoo.com", array()));
Another Example
print(curlPost("http://your_SITE", array("greeting"=>"Hello World")));
To read your POST data you can use:
print($_REQUEST['greeting']);
or
print($_POST['greeting']);
As with a normal POST request... all posted data can be found in $_POST... except files, of course :) Add, for example, &action=request1 to the URL:
if ($_GET['action'] == 'request1') {
    print_r($_POST);
}
EDIT: To see the POST vars, use the following in your POST handler file:
if ($_GET['action'] == 'request1') {
    ob_start();
    print_r($_POST);
    $contents = ob_get_contents();
    ob_end_clean();
    error_log($contents, 3, 'log.txt');
}
I've been using pieces of various Twitter feed scripts to grab tweets, but now I've hit a wall with rate limiting and caching tweets. Here's my code:
function tweets($twitter_handle, $tweet_limit, $tweet_links, $tweet_tags, $tweet_avatar, $tweet_profile) {
    /* Store Tweets in a JSON object */
    $tweet_feed = json_decode(file_get_contents('http://api.twitter.com/1/statuses/user_timeline.json?screen_name='.
        $twitter_handle.'&include_entities=true&include_rts=true&count='.$hard_max.''));
This works great until I hit the rate limit. Here's what I added to cache tweets:
function tweets($twitter_handle, $tweet_limit, $tweet_links, $tweet_tags, $tweet_avatar, $tweet_profile) {
    $url = 'http://api.twitter.com/1/statuses/user_timeline.json?screen_name='.$twitter_handle.'&include_entities=true&include_rts=true&count='.$hard_max.'';
    $cache = dirname(__FILE__) . '/cache/twitter';
    if(filemtime($cache) < (time() - 60))
    {
        mkdir(dirname(__FILE__) . '/cache', 0777);
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
        curl_setopt($ch, CURLOPT_TIMEOUT, 5);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
        curl_setopt($ch, CURLOPT_REFERER, $_SERVER['REQUEST_URI']);
        $data = curl_exec($ch);
        curl_close($ch);
        $cachefile = fopen($cache, 'wb');
        fwrite($cachefile, $data);
        fclose($cachefile);
    }
    else
    {
        $data = file_get_contents($cache);
    }
    $tweet_feed = json_decode($data);
This, however, only returns the username and a timestamp (which is wrong), when it should be returning the Twitter avatar, tweet content, correct timestamp, etc. Additionally, it also throws an error every few refreshes:
Warning: mkdir() [function.mkdir]: File exists in /home/content/36/8614836/html/wp-content/themes/NCmainSite/functions.php on line 110
Any help would be appreciated.
If you need more info, here's the rest of the function: http://snippi.com/s/9f066q0
Here, try this; I've fixed your issues. Plus, you had a rogue POST option in your cURL setup.
<?php
function tweets($twitter_handle, $tweet_limit, $tweet_links, $tweet_tags, $tweet_avatar, $tweet_profile) {
    $http_query = array('screen_name'=>$twitter_handle,
                        'include_entities'=>'true',
                        'include_rts'=>'true',
                        'count'=>(isset($hard_max))?$hard_max:'5');
    $url = 'http://api.twitter.com/1/statuses/user_timeline.json?'.http_build_query($http_query);
    $cache_folder = dirname(__FILE__) . '/cache';
    $cache_file = $cache_folder . '/twitter.json';
    //Check the folder exists
    if(!file_exists($cache_folder)){mkdir($cache_folder, 0777);}
    //Do if the cache file is not found or is older than 60 seconds (though 60 is not enough)
    if(!file_exists($cache_file) || filemtime($cache_file) < (time() - 60)){
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
        curl_setopt($ch, CURLOPT_TIMEOUT, 5);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
        curl_setopt($ch, CURLOPT_REFERER, $_SERVER['REQUEST_URI']);
        $data = curl_exec($ch);
        curl_close($ch);
        file_put_contents($cache_file, $data);
    }else{
        $data = file_get_contents($cache_file);
    }
    return json_decode($data);
}

$twitter = tweets('RemotiaSoftware', 'tweet_limit','tweet_links', 'tweet_tags', 'tweet_avatar', 'tweet_profile');
print_r($twitter);
?>
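As a short usage sketch (my addition, assuming the v1 user_timeline response shape the code above expects: an array of status objects, each carrying text, created_at and a nested user), the cached tweets could then be rendered like this:
// $twitter holds the decoded timeline returned by tweets() above.
if (is_array($twitter)) {
    foreach ($twitter as $tweet) {
        echo $tweet->user->screen_name . ' (' . $tweet->created_at . '): ' . $tweet->text . "\n";
    }
}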