I am working on a project that needs to get the latest video from many YouTube channels and store it in MySQL. Right now this runs in one file executed by a cron job every hour to check for the latest videos. The problem is that I now have more than 30 channels, and I use cURL to get the latest video from every channel. You can see the code:
foreach ($channels as $channel) {
    $channel_id = $channel['channel_id'];
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => "https://www.googleapis.com/youtube/v3/search?part=snippet&channelId=$channel_id&maxResults=50&order=date&key=$key",
        CURLOPT_RETURNTRANSFER => true
    ));
    $json = curl_exec($curl);
    curl_close($curl);
    $data_json = json_decode($json, true);
    $item = $data_json['items'][0];
    // complete code here to check last video
}
Now my questions are:
1 - Is this a good idea, or is there a better way?
2 - Are there any problems with making this many cURL requests to the YouTube API?
Thanks
YouTube API v2 supported batch operations; API v3 no longer does. You could try to run multiple requests in parallel using curl_multi_init().
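For example, here is a minimal sketch of parallel requests with curl_multi, reusing the URL pattern from the question (error handling omitted):

// Build one cURL handle per channel and register it with the multi handle.
$mh = curl_multi_init();
$handles = array();
foreach ($channels as $channel) {
    $ch = curl_init();
    curl_setopt_array($ch, array(
        CURLOPT_URL => "https://www.googleapis.com/youtube/v3/search?part=snippet"
            . "&channelId={$channel['channel_id']}&maxResults=1&order=date&key=$key",
        CURLOPT_RETURNTRANSFER => true,
    ));
    curl_multi_add_handle($mh, $ch);
    $handles[$channel['channel_id']] = $ch;
}

// Drive all transfers until every request has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Collect and decode each response.
foreach ($handles as $channel_id => $ch) {
    $data = json_decode(curl_multi_getcontent($ch), true);
    $item = isset($data['items'][0]) ? $data['items'][0] : null;
    // ... check last video for $channel_id here ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);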
If you only care about one item per request, don't request maxResults=50.
Do all of the videos have something else in common? For example, are they all uploaded by the authenticated user? If so, try searching for videos uploaded by that user (forMine=true), publishedAfter the time of the previous cron job. Then triage the results according to each video's channelId.
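As a sketch of that request (note the assumptions: forMine=true requires an OAuth-authorized request and type=video, and $lastRun is a hypothetical stored timestamp of the previous cron run):

// $lastRun is hypothetical: the Unix time of the previous cron job.
$publishedAfter = gmdate('Y-m-d\TH:i:s\Z', $lastRun);
$url = 'https://www.googleapis.com/youtube/v3/search?part=snippet'
     . '&forMine=true&type=video&order=date&maxResults=50'
     . '&publishedAfter=' . urlencode($publishedAfter);
// Send with an "Authorization: Bearer <token>" header instead of &key=...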
So I'm using the YouTube Data API to bring in videos related to a specific channel using their Search: list function.
The function works fine and I have used it for months with no issues on many different channels.
Today though, I found a channel that wouldn't return its videos even though they were clearly public on the channel.
The search query would send back an object that said "0 results returned".
Here is the function in question:
// The URL you wish to send the GET request to
$url = 'https://www.googleapis.com/youtube/v3/search?order=' . $videoType
     . '&part=snippet&channelId=' . $channelID
     . '&maxResults=' . $maxResults
     . '&key=' . $API_key
     . $publishHolder . $nextPageTokenHolder;
// Open connection
$ch = curl_init();
// Set the URL
curl_setopt($ch, CURLOPT_URL, $url);
// So that curl_exec returns the response contents rather than echoing them
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Execute the request and decode the JSON response
$videoList = json_decode(curl_exec($ch));
curl_close($ch);
Like I said, the problem is not the function or my code, as it works everywhere except for this one channel.
Does anyone know why there would be no results returned?
A side note: the channel is extremely small and only has 7 - 8 videos total, but I have also been successful with smaller channels in the past.
EDIT
To answer some questions in the comments
The channel id is UC8zmDqrUX0-b8blMHHV1XCA
I will have to ask the client (it's his YouTube channel) to check and see if something in his settings is preventing it from working.
Is there a no-embed setting on YouTube?
And no, the cURL script is inside a WordPress plugin I created. However, it's returning an object sent from YouTube that says "0 results returned", so even that part is working.
YouTube is simply "not finding anything".
OK, so based on this video: https://www.youtube.com/watch?v=7tjfeEAdY0I
I'm curious whether it's possible to create a PHP script that will automatically grab all videos on a certain YouTube channel (in this example, https://www.youtube.com/user/whitehouse/videos), check each video's transcript for a certain line or word, and then save the videos with the correct timestamps, to make it possible to merge all these videos into one.
I know there's an API available to get all YouTube video URLs of a specific user, and you can get the transcripts per video, but scanning through all of them will be quite resource-heavy. I'm curious if you guys have any ideas on how to create such a script.
It is possible to capture all information related to videos of a particular YouTube channel using the YouTube V3 API.
The documentation of YouTube API can be found here
The related API functionality can be found here
The procedure to retrieve the transcript of a video from the JSON response of the API call will not be resource intensive, as JSON parsing is a standard procedure supported by most languages, including PHP.
Using the PHP wrapper (the Google APIs Client Library for PHP), this can be achieved in a few lines of code.
Procedure:
Firstly, a channels.list request (listChannels) is sent to retrieve the content details of a particular YouTube channel. Note that the forUsername parameter takes a username; pass 'id' => $channelId instead if you have a channel ID. The related code snippet is:
$channelsResponse = $service->channels->listChannels('contentDetails', array('forUsername' => $channelId));
The second step is to iterate over each returned channel item, which is done conveniently with a foreach loop:
foreach ($channelsResponse['items'] as $channel) {}
The third step is to get the uploads playlist ID from the current item and send another request to the API with the 'snippet' part.
At this point you will have all the content-related data in JSON format, and you just need to parse out the desired information.
The entire code snippet is:
$channelsResponse = $service->channels->listChannels('contentDetails', array('forUsername' => $channelId));
$data = [];
foreach ($channelsResponse['items'] as $channel) {
    $uploadsListId = $channel['contentDetails']['relatedPlaylists']['uploads'];
    $playlistItemsResponse = $service->playlistItems->listPlaylistItems(
        'snippet', array(
            'playlistId' => $uploadsListId,
            'maxResults' => 50
        )
    );
    foreach ($playlistItemsResponse['items'] as $playlistItem) {
        $data[] = $playlistItem['snippet'];
    }
}
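One caveat: playlistItems.list returns at most 50 items per page, so a channel with more uploads needs paging via nextPageToken. A sketch, assuming the same $service client as above:

// Page through the uploads playlist until there is no nextPageToken left.
$pageToken = null;
do {
    $params = array('playlistId' => $uploadsListId, 'maxResults' => 50);
    if ($pageToken !== null) {
        $params['pageToken'] = $pageToken;
    }
    $playlistItemsResponse = $service->playlistItems->listPlaylistItems('snippet', $params);
    foreach ($playlistItemsResponse['items'] as $playlistItem) {
        $data[] = $playlistItem['snippet'];
    }
    $pageToken = isset($playlistItemsResponse['nextPageToken'])
        ? $playlistItemsResponse['nextPageToken'] : null;
} while ($pageToken !== null);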
I was trying to get the feeds of friends' fan pages with cURL and to post a like via my app.
The cURL GET works fine, but when it comes to liking the object (which definitely has likes enabled, because it's Facebook), all I get is a boolean (and no action).
I had seen on SO that you post a like with
($access_token, "https://graph.facebook.com/$post_id/likes", 'POST')
However, it is not working (it says either that the app has no permission to do that, or that I need a valid URL).
I have tried all possible solutions but nothing seems to work. The profile is mine, the access token too, and I gave publish_stream permissions to my own app.
After giving the Facebook PHP SDK library a try, I tried a direct cURL POST with the following code:
function likeit($access_token, $url) {
    // Open connection
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    // Send as a POST request, passing the access token as a POST field
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, 'access_token=' . $access_token);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
The execution is a puzzle. When it does not return an OAuth exception, it returns a boolean and stops there (there is no way for me to "like" the post I want to like). The last attempt (but I've tried them all, even the whole URL in some tests) is:
foreach ($feeds->data as $feed_data) {
    $message = $feed_data->message;
    $title = $feed_data->name;
    $link = $feed_data->link;
    $id = $feed_data->id;
    $myaccesstoken = "longlastingaccesstokenstring";
    $post = likeit($myaccesstoken, "https://graph.facebook.com/$id/likes");
}
Does anybody have a suggestion on how to do this? It seems trivial and yet there is no way for me to accomplish it. Thank you so much in advance!
I've found a solution, and am posting it here for future reference (hoping it is helpful for anybody with the same problem):
The only problem was that, after the new Facebook permissions, I needed to re-authorize my app.
After that, it is possible to post a like with the Facebook PHP SDK library, as normal:
$facebook->api("/$id/likes", 'post', array(
    'access_token' => $myaccesstoken
));
No need to use cURL directly. Hope this helps someone with the same question.
For a project I have to grab the insights of a Facebook page over a long period of time (e.g. 1-2 years).
I first tried to do a single request but it turned out that only requesting
/PAGE_ID/insights?since=xxx&until=xxx
doesn't return all the data I want (it somehow suppresses data, as if there were some limit on the size of the answer).
I then tried to split up the date range (e.g. 01.04.2011-01.04.2012 into 01.04.2011-01.08.2011, 01.08.2011-01.12.2011 and 01.12.2011-01.04.2012), which didn't quite work like I wanted it to either.
My next approach was to request only the insights values I need, like 'page_stories, page_impressions, ...'. The requests looked like this:
/PAGE_ID/insights/page_impressions/day?since=xxx&until=xxx
This actually worked, but not with AJAX. It sometimes seemed to drop some requests (especially if I changed the browser tab in Google Chrome), and I need to be sure that all requests return an answer. A synchronous solution would just take way too much time: one request needs at least 2 seconds, and with a date range of 2 years I may have about 300 single requests, so it takes way too long to complete.
Lastly, I stumbled over Facebook's ability to do batch requests, which is exactly what I need. It can pack up to 50 requests into one call, which significantly lowers the bandwidth. And here's where I'm stuck. The Facebook API gives some examples of how to use it, but none of them worked when I tested them in the Graph Explorer or via the Facebook PHP SDK. I tried to pack this request
PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600
into a batch request but failed.
It seems that the API is bugged. It always gives me this error when I use a question mark '?' in the 'relative_url' field:
{
    "error": {
        "message": "batch parameter must be a JSON array",
        "type": "GraphBatchException"
    }
}
Here is what I tried:
This gives the 'must be a JSON array' error:
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600"}]
These actually return data, but they ignore the parameters:
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000 until=1333695600"}]
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000,until=1333695600"}]
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":{"since":"1332486000","until":"1333695600"}}]
And this one tells me that this is an 'Unsupported post request':
?batch=[{"method":"POST","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000 until=1333695600"}]
Can someone help?
I finally found the solution to my problem. It's not mentioned in the Facebook documentation, but for this request
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600"}]
to work properly, we have to use a function like urlencode() to encode the JSON part. This way the queries work like a charm. A PHP example:
$insights = $facebook->api(
    '?batch=[' . urlencode('{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332572400&until=1333782000"}') . ']',
    'post',
    array('access_token' => $facebook->getAccessToken())
);
which results in this:
?batch=[%7B%22method%22%3A%22GET%22%2C%22relative_url%22%3A%22%2FPAGE_ID%2Finsights%2Fpage_fan_adds%2Fday%3Fsince%3D1300086000%26until%3D1307862000%22%7D]
This example uses an array of IDs to make a batch request, urlencoding the batch JSON:
$postIds = [
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
];

$queries = [];
foreach ($postIds as $postId) {
    $queries[] = [
        'method' => 'GET',
        'relative_url' => '/' . $postId . '/comments?summary=1&filter=stream&order=reverse_chronological',
    ];
}

$requests = $facebook->post('?batch=' . urlencode(json_encode($queries)))->getGraphNode();
Using the Twitter API (and OAuth), if I call for the user's followers (statuses/followers), I am returned only 99 results.
Is there a way I can return 99, then call again starting at follower 100, and loop through this style of calling until the total number of followers has been returned?
Or just return ALL followers?
You need to specify the cursor parameter, as described in the API documentation. E.g. specify cursor=-1 to request the first page, and then use the next_cursor value returned in the first response:
http://twitter.com/statuses/followers/barackobama.xml?cursor=-1
http://twitter.com/statuses/followers/barackobama.xml?cursor=1300794057949944903
<?php
$trends_url = "http://api.twitter.com/1/statuses/followers/fawadghafoor.json";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $trends_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$curlout = curl_exec($ch);
curl_close($ch);
$response = json_decode($curlout, true);
foreach ($response as $friends) {
    $thumb = $friends['profile_image_url'];
    $url = $friends['screen_name'];
    $name = $friends['name'];
?>
<a title="<?php echo $name; ?>" href="http://www.twitter.com/<?php echo $url; ?>"><img class="photo-img" src="<?php echo $thumb; ?>" border="0" alt="" width="40" /></a>
<?php } ?>
Be sure you're using the right call. followers/ids gives you 5000 at a time (but it's just a list of IDs). This call, too, uses a cursor to let you step through pages of users. You get a zero back when you have them all.
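A minimal sketch of cursoring through followers/ids, assuming the old v1 endpoint used elsewhere in this thread (authentication omitted; current API versions require OAuth):

// Walk followers/ids pages until next_cursor comes back as 0.
$cursor = -1;
$ids = array();
do {
    $json = file_get_contents(
        'http://api.twitter.com/1/followers/ids.json?screen_name=barackobama&cursor=' . $cursor
    );
    $page = json_decode($json, true);
    $ids = array_merge($ids, $page['ids']);
    $cursor = $page['next_cursor'];
} while ($cursor != 0);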
The Twitter API restricts calls to the followers/ids method to 15 requests per 15 minutes. If you make more than this, the API will return a 'Rate limit exceeded' error message.
For more information on Twitter API rate limits, visit https://dev.twitter.com/docs/rate-limiting/1.1 and https://dev.twitter.com/docs/rate-limiting/1.1/limits
Twitter only allows a certain number of API requests per hour, and I think per minute. You might not be able to retrieve more than 99 results at once.
Though I asked this quite a while ago, I came back to building something quite similar recently (+ new programming skills).
I noticed the Twitter API has a method to get all of a user's follower (or following) user IDs in one request. I found the best way was to array_chunk the IDs into batches of 100 (and only take the first 30 arrays, as I don't want to use up all the user's API requests that hour; they might want to actually tweet!). Then there is a method that allows you to get the user info for up to 100 users at a time (from the view of the currently authenticated user), so I just do a loop (sleeping a bit in between), and then you've got 30,000 Twitter followers!
I would recommend doing this asynchronously in a queue system, because doing it on the fly when the user requests a page on the site could be very slow, and you might be prone to an HTTP timeout. Also, cache them like hell!
Sorry I didn't post any code, but hopefully this thought process will help someone :)
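As a sketch of that thought process, assuming a hypothetical $twitter client whose get() method performs an authenticated API call and returns decoded JSON (the endpoints are followers/ids and users/lookup; the client itself is illustrative, not a real library API):

// Hypothetical $twitter client: get($endpoint, $params) returns decoded JSON.
// 1. Grab the follower IDs in one request (followers/ids).
$result = $twitter->get('followers/ids', array('screen_name' => 'someuser'));
$followerIds = $result['ids'];

// 2. Chunk into batches of 100, keeping only the first 30 batches
//    so we don't burn the user's whole rate limit.
$batches = array_slice(array_chunk($followerIds, 100), 0, 30);

// 3. Hydrate each batch via users/lookup (up to 100 users per call).
$followers = array();
foreach ($batches as $batch) {
    $users = $twitter->get('users/lookup', array('user_id' => implode(',', $batch)));
    $followers = array_merge($followers, $users);
    sleep(1); // pause briefly between calls
}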
$cursor = -1;
$account_from = 'twitter_account';
do {
    $json = file_get_contents('http://api.twitter.com/1/statuses/followers/' . $account_from . '.json?cursor=' . $cursor);
    $accounts = json_decode($json);
    foreach ($accounts->users as $account) {
        // Collect the fields to store, e.g. as bind parameters for a DB insert.
        $row = array(
            ':twitter_id'  => $account->id_str,
            ':account'     => $account->screen_name,
            ':description' => $account->description,
        );
    }
    $cursor = $accounts->next_cursor;
} while ($cursor > 0);