I am using the following code to retrieve a number of Tweets from the Twitter API:
$cache_file = "cache/$username-twitter.cache";
$last = filemtime($cache_file);
$now = time();
$interval = $interval * 60; // convert minutes to seconds (e.g. ten minutes)
// Check the cache file age
if ( !$last || (( $now - $last ) > $interval) ) {
// cache file doesn't exist, or is old, so refresh it
// Get the data from Twitter JSON API
//$json = #file_get_contents("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=" . $username . "&count=" . $count, "rb");
$twitterHandle = fopen("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count", "rb");
$json = stream_get_contents($twitterHandle);
fclose($twitterHandle);
if($json) {
// Decode JSON into array
$data = json_decode($json, true);
$data = serialize($data);
// Store the data in a cache
$cacheHandle = fopen($cache_file, 'w');
fwrite($cacheHandle, $data);
fclose($cacheHandle);
}
}
// read from the cache file with either new data or the old cache
$tweets = @unserialize(file_get_contents($cache_file));
return $tweets;
Of course $username and the other variables inside the fopen request are correct and produce a well-formed URL, as you can see from the error I get:
Warning: fopen(http://api.twitter.com/1/statuses/user_timeline.json?screen_name=Schodemeiss&count=5) [function.fopen]: failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request in /home/ellexus1/public_html/settings.php on line 187
That error is returned whenever I try to open my page.
Any ideas why this might be? Do I need to use OAuth to even just get my tweets!? Do I have to register my website as somewhere that might get posts?
I'm really not sure why this is happening. My host is JustHost.com, but I'm not sure if that makes any difference. All ideas are welcome!
Thanks.
Andrew
PS. This code lies inside a function where $username, $interval and $count are passed in correctly, hence the well-formed URL in the error message.
Chances are you are getting rate-limited:
400 Bad Request: The request was invalid. An accompanying error message will explain why. This is the status code that will be returned during rate limiting.
150 requests per hour for non authenticated calls (Based on IP-addressing)
350 requests per hour for authenticated calls (Based on the authenticated users calls)
You have to authenticate to avoid these errors popping up.
Also, please use cURL when dealing with Twitter. I've used file_get_contents and fopen to call the Twitter API and found them very unreliable; you would get hit with errors like that every now and then.
Replace the fopen call with:
$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$it = curl_exec($ch); //content stored in $it
curl_close($ch);
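As an extra hedged suggestion (not part of the original answer): check the HTTP status code on the cURL handle before writing to the cache, so a rate-limit error body never overwrites good cached data. A minimal sketch, assuming the same $username, $count and $cache_file variables from the question:

$ch = curl_init("http://api.twitter.com/1/statuses/user_timeline.json?screen_name=$username&count=$count");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$json = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // 200 on success, 400 when rate-limited
curl_close($ch);
if ($status === 200 && $json) {
    // Only refresh the cache when Twitter answered successfully
    file_put_contents($cache_file, serialize(json_decode($json, true)));
}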
This may help:
Error codes
https://developer.twitter.com/en/docs/basics/response-codes.html
Error code definitions are given in the above link.
Related
This is the code of my bot in PHP, but it doesn't answer. What should I do? Here's the PHP error log:
[24-Sep-2018 09:06:29 UTC] PHP Warning: file_get_contents(https://api.telegram.org/bot694EMvJayx1zD-J3FPyKPfRlGka0 /sendMessage?chat_id=110***01&text=hellhAkbarixyzhMohammad Hosein): failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request
in /home/xcqcctmm/public_html/BOT/getnewsir.php on line 22
The code is :
<?php
$token = '692******1zD-J3FPyKPfRlGka0';
// read incoming info and grab the chatID
$json = file_get_contents('php://input');
$telegram = urldecode ($json);
$results = json_decode($telegram);
$message = $results->message;
$text = $message->text;
$chat = $message->chat;
$user_id = $chat->id;
$username = $chat->username;
$first_name = $chat->first_name;
//send reply
$answer = "hello".$user_id . "h" . $username . "h" . $first_name;
$url = 'https://api.telegram.org/bot'.$token.'/sendMessage?chat_id='. $user_id .'&text='.$answer;
file_get_contents($url);
?>
it doesn't work.
Telegram's API returns 400 Bad Request, which could be caused by any of the following:
FIRSTNAME_INVALID: The first name is invalid
LASTNAME_INVALID: The last name is invalid
PHONE_NUMBER_INVALID: The phone number is invalid
PHONE_CODE_HASH_EMPTY: phone_code_hash is missing
PHONE_CODE_EMPTY: phone_code is missing
PHONE_CODE_EXPIRED: The confirmation code has expired
API_ID_INVALID: The api_id/api_hash combination is invalid
PHONE_NUMBER_OCCUPIED: The phone number is already in use
PHONE_NUMBER_UNOCCUPIED: The phone number is not yet being used
USERS_TOO_FEW: Not enough users (to create a chat, for example)
USERS_TOO_MUCH: The maximum number of users has been exceeded (to create a chat, for example)
TYPE_CONSTRUCTOR_INVALID: The type constructor is invalid
FILE_PART_INVALID: The file part number is invalid
FILE_PARTS_INVALID: The number of file parts is invalid
FILE_PART_X_MISSING: Part X (where X is a number) of the file is missing from storage
MD5_CHECKSUM_INVALID: The MD5 checksums do not match
PHOTO_INVALID_DIMENSIONS: The photo dimensions are invalid
FIELD_NAME_INVALID: The field with the name FIELD_NAME is invalid
FIELD_NAME_EMPTY: The field with the name FIELD_NAME is missing
MSG_WAIT_FAILED: A waiting call returned an error
Unfortunately, you have to debug which one is causing the actual error.
Source
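One way to do that debugging (a hedged sketch, not part of the original answer): make the same sendMessage call with cURL so the 400 response body, which contains Telegram's description field, is captured instead of just triggering a warning. The $token, $user_id and $answer variables are assumed from the question's code:

// Sketch: capture Telegram's error description instead of a bare warning
$url = 'https://api.telegram.org/bot' . $token . '/sendMessage?'
     . http_build_query(['chat_id' => $user_id, 'text' => $answer]); // http_build_query also URL-encodes the text
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);                               // JSON body, even on a 400 response
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
if ($status !== 200) {
    error_log("Telegram returned $status: $body");    // the "description" field explains the Bad Request
}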
I know I could measure the total loading time for an external URL with something like:
$start_request = time();
file_get_contents($url);
$end_request = time();
$time_taken = $end_request - $start_request;
But I don't need the total site loading time; I want to measure only the server response time, like the "Wait" part of the result displayed here:
http://www.bytecheck.com/results?resource=https://www.example.com
How can I do this with PHP?
You can't do this with plain PHP timing like that. With time() or microtime() you can only get the complete time that one or more commands took.
You need a tool that gives you access to the network-layer data. cURL can do this for you, but you have to enable PHP's cURL extension if it isn't already enabled.
PHP can then take the result and process it.
<?php
// Create a cURL handle
$ch = curl_init('http://www.example.com/');
// Execute
curl_exec($ch);
// Check if any error occurred
if (!curl_errno($ch)) {
$info = curl_getinfo($ch);
echo 'Took ', $info['total_time'], ' seconds to send a request to ', $info['url'], "\n";
}
// Close handle
curl_close($ch);
You have a lot of information in $info, such as:
"filetime"
"total_time"
"namelookup_time"
"connect_time"
"pretransfer_time"
"starttransfer_time"
"redirect_time"
The complete list can be found here.
The "Wait" time should be the starttransfer_time - pretransfer_time,
so in your case you need:
$wait = $info['starttransfer_time'] - $info['pretransfer_time'];
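For instance, plugging that into the snippet above (a sketch; example.com is only a placeholder URL):

<?php
// Sketch: measure the "Wait" (server think) time for a URL
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response body
curl_exec($ch);
if (!curl_errno($ch)) {
    $info = curl_getinfo($ch);
    // starttransfer_time = time until the first byte arrived; pretransfer_time = time until the request was sent
    $wait = $info['starttransfer_time'] - $info['pretransfer_time'];
    echo 'Server response (wait) time: ', round($wait * 1000), " ms\n";
}
curl_close($ch);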
Below is the code I am currently using. I pass an address to the function, and the Nominatim API should return JSON from which I can retrieve the latitude and longitude of the address.
function geocode($address){
// url encode the address
$address = urlencode($address);
$url = 'http://nominatim.openstreetmap.org/?format=json&addressdetails=1&q={$address}&format=json&limit=1';
// get the json response
$resp_json = file_get_contents($url);
// decode the json
$resp = json_decode($resp_json, true);
// get the important data
$lati = $resp['lat'];
$longi = $resp['lon'];
// put the data in the array
$data_arr = array();
array_push(
$data_arr,
$lati,
$longi
);
return $data_arr;
}
The problem with it is that I always end up with an Internal Server Error. I have checked the logs and this constantly gets repeated:
[[DATE] America/New_York] PHP Notice: Undefined index: title in [...php] on line [...]
[[DATE] America/New_York] PHP Notice: Undefined variable: area in [...php] on line [...]
What could be the issue here? Is it because of the _ in New_York? I have tried using str_replace to swap that with a + but that doesn't seem to work and the same error is still returned.
Also, the URL works fine since I have tested it out through JavaScript and manually (though {$address} was replaced with an actual address).
Would really appreciate any help with this, thank you!
Edit
This has now been fixed. The problem seems to be that Nominatim is not able to pick up certain values and so returns an error as a result.
The errors you have mentioned don't appear to relate to the code you posted, given that the variables title and area are not present in it. I can, however, provide some help with the geocode function you posted.
The main issue is that there are single quotes around the $url string. This means that $address is not injected into the string, and the request is for the lat/long of the literal string "$address". Using double quotes resolves this issue:
$url = "http://nominatim.openstreetmap.org/?format=json&addressdetails=1&q={$address}&format=json&limit=1";
Secondly, the response contains an array of arrays (if it were not for the limit parameter, more than one result might be expected). So when fetching the details out of the response, look in $resp[0] rather than just $resp.
// get the important data
$lati = $resp[0]['lat'];
$longi = $resp[0]['lon'];
In full, with some abbreviation of the array building at the end for simplicity:
function geocode($address){
// url encode the address
$address = urlencode($address);
$url = "http://nominatim.openstreetmap.org/?format=json&addressdetails=1&q={$address}&format=json&limit=1";
// get the json response
$resp_json = file_get_contents($url);
// decode the json
$resp = json_decode($resp_json, true);
return array($resp[0]['lat'], $resp[0]['lon']);
}
Once you are happy it works, I'd recommend adding some error handling for both the HTTP request and the decoding/returning of the response.
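A hedged sketch of what that error handling might look like (the false return values and the is_array check are my assumptions about how you might want failures reported, not part of the original answer):

function geocode($address) {
    $address = urlencode($address);
    $url = "http://nominatim.openstreetmap.org/?format=json&addressdetails=1&q={$address}&limit=1";
    // Suppress the warning and check the return value instead
    $resp_json = @file_get_contents($url);
    if ($resp_json === false) {
        return false; // HTTP request failed
    }
    $resp = json_decode($resp_json, true);
    if (!is_array($resp) || empty($resp[0]['lat']) || empty($resp[0]['lon'])) {
        return false; // no usable result for this address
    }
    return array($resp[0]['lat'], $resp[0]['lon']);
}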
I am trying to get a user's fan page posts using the following code, but it gives me this warning:
Warning: file_get_contents(https://graph.facebook.com/782176371798916/posts): failed to open stream: HTTP request failed! HTTP/1.0 400 Bad Request
$page_posts = file_get_contents('https://graph.facebook.com/'.$page_id.'/posts');
$pageposts = json_decode($page_posts);
foreach ($pageposts["data"] as $fppost) {
echo $fppost['message'];
}
So, what is the correct way to get a user's fan page posts?
The solution I found is to use the following code:
$pageposts = $facebook->api('/'.$page_id.'/posts', 'GET');
foreach ($pageposts["data"] as $fppost) {
echo $fppost['message'];
}
You didn't send the access_token parameter; just add it and it should work like a charm:
$page_id = 'smashmag'; // Page ID or username
$token = '553435274702353|OaJc7d2WCoDv83AaR4JchNA_Jgw'; // Valid access token, I used app token here but you might want to use a user token .. up to you
$page_posts = file_get_contents('https://graph.facebook.com/'.$page_id.'/posts?fields=message&access_token='.$token); // fields=message requests only the 'message' property (shaves a few milliseconds off the call); you can remove it if you want everything
$pageposts = json_decode($page_posts);
foreach ($pageposts->data as $fppost) {
if (property_exists($fppost, 'message')) { // Some posts don't have a message property (like photo set posts), so check first to stay error-free ;)
print $fppost->message.'</br>';
}
}
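A small hedged addition (not in the original answer): the Graph API reports problems such as a missing or invalid token in a JSON "error" object, so it can be worth surfacing that instead of a bare file_get_contents warning. A sketch, assuming the same $page_id and $token as above:

// Sketch: capture the Graph API's JSON error body instead of a bare warning
$context = stream_context_create(['http' => ['ignore_errors' => true]]); // keep the body even on a 400 response
$page_posts = file_get_contents('https://graph.facebook.com/'.$page_id.'/posts?fields=message&access_token='.$token, false, $context);
$pageposts = json_decode($page_posts);
if (isset($pageposts->error)) {
    die('Graph API error: ' . $pageposts->error->message);
}
foreach ($pageposts->data as $fppost) {
    if (property_exists($fppost, 'message')) {
        print $fppost->message . '</br>';
    }
}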
I have a problem. I am developing a website where people can share YouTube videos, and sometimes I get an error for some videos (not all of them) where YouTube returns this: failed to open stream: HTTP request failed! HTTP/1.0 403 Forbidden
I would like to know if there is, for example, a maximum number of requests per hour for fetching videos?
I am using this address to fetch info: http://gdata.youtube.com/feeds/api/videos?q={VIDEO_ID}
And here is my code:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, YT_API_URL . $post->get_youtubeVideo());
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//$feed holds a rss feed xml returned by youtube API
$feed = curl_exec($ch);
curl_close($ch);
//Using SimpleXML to parse youtube's feed
$xml = simplexml_load_string($feed);
$entry = $xml->entry[0];
//If no entry was found, then youtube didn't find any video with the specified id
if($entry) {
$media = $entry->children('media', true);
$group = $media->group;
$title = $group->title;
$desc = $group->description;
$thumb = $group->thumbnail[0];
list($thumb_url, $thumb_width, $thumb_height, $thumb_time) = $thumb->attributes();
$content_attributes = $group->content->attributes();
echo '<div class="youtube-video-inpost group" data-video-id="'.$post->get_youtubeVideo().'">
<div class="thumbnail"><div class="play-button"></div><img src="https://i1.ytimg.com/vi/'.$post->get_youtubeVideo().'/mqdefault.jpg" width="195" height="110" /></div>
<div class="infos">
<div class="title">'.$title.'</div>
<div class="description">'.$desc.'</div>
</div>
</div>';
}
?>
Can someone help me please ?
Thanks a lot !
Welcome to StackOverflow!
Yes, there is a "maximum number of requests". It's called a quota. Once this limit is exceeded, you won't be able to make any further requests that day. However, this is not what we're looking at here! You'd receive an error stating quotaLimitExceeded in its response body.
403 Forbidden tells us that the requested resource (the video) is not publicly visible!
It's probably private. You should handle this and tell your users that videos need to be public! (A rough sketch of such a check is given after the list below.)
Code improvements
I strongly suggest using Google's API PHP client if you're unable to do the requests properly without it! Your code has problems:
You aren't authenticated -> the quota limit is exceeded faster
You won't get detailed usage statistics
And probably more code smells.
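As a hedged illustration of the "handle this" advice above (not the original answer's code; the exact message shown to the user is my own): check the HTTP status code from the cURL handle before parsing the feed, and show a notice when the video isn't public. YT_API_URL and $post->get_youtubeVideo() are assumed from the question's code:

<?php
// Sketch: detect a 403 (non-public video) before parsing the feed
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, YT_API_URL . $post->get_youtubeVideo());
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$feed = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($status === 403) {
    // Video exists but is not publicly visible (e.g. private)
    echo '<div class="youtube-video-error">This video is not public and cannot be displayed.</div>';
} elseif ($status === 200 && ($xml = simplexml_load_string($feed)) && isset($xml->entry[0])) {
    // ... proceed with the existing rendering code from the question ...
}
?>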