I'm getting an error from this Twitter script that is causing the rest of the page not to load. I'm not sure why this is suddenly happening; it was functioning properly for quite some time.
The script looks like this, and it pulls the user's current status:
<?php
$response = new SimpleXMLElement('http://twitter.com/users/show/tuscaroratackle.xml',NULL,TRUE);
echo $response->status->text.'';
?>
Here's another post where I was trying to figure out the answer to a different bug, which pointed me to this Twitter error.
You can see it here in the footer, or in a screengrab of the output: http://cl.ly/33IZ.
The relevant error (which is displayed in the footer of the page you linked to) is:
Warning: SimpleXMLElement::__construct(http://twitter.com/users/show/tuscaroratackle.xml) [simplexmlelement.--construct]: failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request in /home5/tuscaror/public_html/footer.php on line 47
Warning: SimpleXMLElement::__construct() [simplexmlelement.--construct]: I/O warning : failed to load external entity "http://twitter.com/users/show/tuscaroratackle.xml" in /home5/tuscaror/public_html/footer.php on line 47
Fatal error: Uncaught exception 'Exception' with message 'String could not be parsed as XML' in /home5/tuscaror/public_html/footer.php:47 Stack trace: #0 /home5/tuscaror/public_html/footer.php(47): SimpleXMLElement->__construct('http://twitter....', 0, true) #1 /home5/tuscaror/public_html/index.php(119): include('/home5/tuscaror...') #2 {main} thrown in /home5/tuscaror/public_html/footer.php on line 47
The first warning tells you what happened: "HTTP request failed! HTTP/1.1 400 Bad Request".
So, for some reason, your server is failing when making the HTTP request to twitter to retrieve the document "http://twitter.com/users/show/tuscaroratackle.xml". The return code is 400 Bad Request.
I just tried that same request from my web browser, and it worked fine, so either twitter was temporarily "out to lunch" (which does happen from time to time), or there is something unique about your server's network configuration. My first guess would be that somewhere up-stream from your server, someone has installed an HTTP proxy which is (for some unknown reason) blocking your request.
Here's what twitter has to say about it:
400 Bad Request: The request was invalid. An accompanying error message
will explain why. This is the status code that will be returned during rate limiting.
Here is twitter's page on Rate Limiting. I suspect that this is your culprit. If you think otherwise, then you might try retrieving the document as a string and examining it before you try to parse it, so you can see what the message is.
This is quick and dirty, but it'll get the message so you can see what's going on:
$str = file_get_contents('http://twitter.com/users/show/tuscaroratackle.xml');
echo $str;
That may fail due to the 400 response code. If so, you'll need to use PHP cURL to get the un-parsed response body.
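If it comes to that, here's a rough cURL sketch that will return the raw body even on a 400, so you can read the error message Twitter sends back (the options are standard cURL flags, but the snippet itself is just an assumption about how you'd wire it up):
$ch = curl_init('http://twitter.com/users/show/tuscaroratackle.xml');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // hand back the body instead of printing it
curl_setopt($ch, CURLOPT_FAILONERROR, false);   // keep the body even on a 4xx/5xx response
$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
echo "HTTP $status\n" . $body; // inspect the error document Twitter returned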
Good luck!
Related
I am trying to gather a year's worth of data for a select ad account but I get the following exception:
FacebookAds\Exception\Exception
Failed sending HTTP request: Header overflow
The exception happens at the following line of code:
$adData = [];
foreach ($fbadaccount->getAds($adFields, $adParams) as $object) {
    $adData[] = $object->getData();
}
This code works perfectly fine for smaller time frames.
I understand it is attempting to fetch a lot of data, but I'm trying to find a solution.
Could this potentially be environment-related, e.g. nginx?
The "Header overflow" error occurs when the HTTP request header is too large.
Perhaps because of the cookies being sent.
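One thing worth trying, since the same call works for smaller time frames: split the year into smaller time_range windows and merge the results. This is only a hedged sketch; the date boundaries and the shape of the 'time_range' parameter are assumptions about the Marketing API, not something from the original post.
$adData = [];
$start = new DateTime('2017-01-01'); // assumed start of the year being pulled
$end   = new DateTime('2018-01-01');
while ($start < $end) {
    $windowEnd = (clone $start)->modify('+1 month');
    // request one month at a time instead of the full year
    $adParams['time_range'] = [
        'since' => $start->format('Y-m-d'),
        'until' => min($windowEnd, $end)->format('Y-m-d'),
    ];
    foreach ($fbadaccount->getAds($adFields, $adParams) as $object) {
        $adData[] = $object->getData();
    }
    $start = $windowEnd;
}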
I'm trying to develop an Instagram application, but I'm struggling to get the JSON when inserting the access token via a variable.
That's my code:
$personal = json_decode(file_get_contents('https://api.instagram.com/v1/users/self/?access_token={$accesstoken}'));
And the error that I receive is this:
PHP Warning: file_get_contents(https://api.instagram.com/v1/users/self/?access_token={$accesstoken}): failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request
I already checked that the variable $accesstoken arrives correctly from the previous form, because if I echo it on the page it shows up.
I tried preg_replace to see whether the problem was whitespace, but nothing changed.
I don't want to use cURL if it is not mandatory.
What's wrong with the code (and me)?
Thanks in advance!
Edit:
As I answered to @FirstOne: yes, I already tried putting the access token in manually, and that way it works.
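For what it's worth (this is an observation about PHP string handling, not something stated in the post): single-quoted strings don't interpolate variables, so the URL above is sent with the literal text {$accesstoken} in it, which would explain the 400. A minimal sketch using double quotes or concatenation instead:
// Assumes $accesstoken already holds a valid token.
$url = "https://api.instagram.com/v1/users/self/?access_token={$accesstoken}"; // double quotes interpolate the variable
// or: $url = 'https://api.instagram.com/v1/users/self/?access_token=' . $accesstoken;
$personal = json_decode(file_get_contents($url));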
I am trying to access multiple JSON files provided by Steam for the market price of an item for CSGO. The first file_get_contents works:
$inventory = file_get_contents('http://steamcommunity.com/profiles/' . $steamprofile['steamid'] . '/inventory/json/730/2');
but from the second one onwards they don't work:
$marketString = file_get_contents('http://steamcommunity.com/market/priceoverview/?currency=1&appid=730&market_hash_name=' . urlencode($json_a->{'rgDescriptions'}->$rgDescrId->{'market_hash_name'}));
However, I get the error on all items, for example:
Warning: file_get_contents(http://steamcommunity.com/market/priceoverview/?currency=1&appid=730&market_hash_name=Negev%20|%20Nuclear%20Waste%20(Minimal%20Wear)): failed to open stream: HTTP request failed! HTTP/1.0 429 Unknown in /home4/matt500b/public_html/themooliecommunity.com/CSGO/index.php on line 24
I can confirm that allow_url_fopen is on.
Pasting the following URL into a browser shows that it works:
http://steamcommunity.com/market/priceoverview/?currency=1&appid=730&market_hash_name=Negev%20|%20Nuclear%20Waste%20(Minimal%20Wear)
Please note that this worked about an hour ago but is now throwing an error. Any suggestions?
You got a response with status 429 Too Many Requests:
The user has sent too many requests in a given amount of time ("rate
limiting").
So the site can simply block requests that hit its API too frequently.
An HTTP 429 is a "too many requests" warning; it's not an error, just a note to tell you you've overdone it a little. You'll have to either wait a while or, if it's your own server, adjust its settings to allow more requests.
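A minimal sketch of one way to cope with it from PHP, assuming the limit is on request frequency (the 3-second delay and the retry cap are arbitrary choices, not documented Steam limits):
function fetch_with_backoff($url, $maxRetries = 5, $delaySeconds = 3) {
    for ($attempt = 0; $attempt < $maxRetries; $attempt++) {
        $body = @file_get_contents($url); // suppress the warning; check the return value instead
        if ($body !== false) {
            return $body;
        }
        sleep($delaySeconds); // back off before retrying after a failed (e.g. 429) request
    }
    return false;
}
$marketString = fetch_with_backoff('http://steamcommunity.com/market/priceoverview/?currency=1&appid=730&market_hash_name=' . urlencode($json_a->{'rgDescriptions'}->$rgDescrId->{'market_hash_name'}));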
I am trying to use PHP and cURL to log in to a website (namely Craigslist). When accessing the script, I get this warning message:
Received problem 2 in the chunky parser
Searches suggested that it is not a problem with cURL itself, and I am unable to find the source of the problem. What could the reason be?
Thank you.
Update: Googling for the error message, I also found this:
The chunky-parser error message occurs when curl expects a chunked HTTP response body and then doesn't get one. Your reply sends the Transfer-Encoding: chunked header, so curl expects to see a body chunked according to RFC2616 and it doesn't get one.
Obviously, a redirect shouldn't have a response body or even the Transfer-Encoding header to begin with. You could try overriding the header, but maybe CouchDB inserts it unconditional in which case we should fix that, if you find out you can't override the Transfer-Encoding header, can you file a bug report?
I have no idea what to make out of this in the context of fetching an arbitrary page, though.
Original post:
There's a CouchDB Bug report dealing with the same issue in conjunction with multi-byte data. Craigslist seems to run in ISO-8859-1, maybe the ad (or whatever you are fetching) has UTF-8 characters in it?
"Received problem 2 in the chunky parser" is an error message from libcurl. The specific "problem 2" refers to CHUNKE_ILLEGAL_HEX which is an internal error code identifying an illegal chunked-encoded stream.
Pretty much what Pekka's answer already said...
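One hedged workaround, not mentioned in the answers above: force HTTP/1.0, which has no chunked transfer encoding, so curl never has to parse a chunked body at all. The URL here is only a placeholder, not taken from the question.
$ch = curl_init('https://www.craigslist.org/'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0); // avoid chunked responses entirely
$page = curl_exec($ch);
if ($page === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);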
I have a PHP Twitter app which lets you mark tweets as favorites.
I'm doing something like this:
$fav = $twitter->createFavorite("xml", $get_id); // handles the API call (using cURL)
$fav_result = new SimpleXMLElement($fav);
On my localhost and on one online server all goes well: the tweet is marked as favorite, and the API call returns XML. On another online server, the tweet is also marked as favorite, but PHP gives an error: Fatal error: Uncaught exception 'Exception' with message 'String could not be parsed as XML'
On the second server, I seem to get an empty string as the return value. When I look at the HTTP status codes, when all is well I get a 200, but when things go wrong I get a status code of 0.
When I check the curl_error it says "Failed to open/read local data from file/application"
I think it has to do something with my server configuration. Does anyone have an idea what might be causing this?
I found the solution here: http://www.milk-hub.net/blog/2008/08/26/curl_error_26
Since no separate postvars are sent, you have to explicitly set CURLOPT_POSTFIELDS to an empty string:
curl_setopt($curl_handle, CURLOPT_POSTFIELDS, '');
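For context, here is a hedged sketch of what the surrounding request setup might look like; the endpoint URL and variable names are assumptions, not taken from the Twitter library used in the question:
$curl_handle = curl_init('http://api.twitter.com/1/favorites/create/' . $get_id . '.xml'); // assumed endpoint
curl_setopt($curl_handle, CURLOPT_POST, true);
curl_setopt($curl_handle, CURLOPT_POSTFIELDS, ''); // avoids curl error 26 (CURLE_READ_ERROR) when there is no POST body
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, true);
$fav = curl_exec($curl_handle);
curl_close($curl_handle);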