file_get_contents and cURL not working - PHP

Something strange is going on, and I would like to know why.
This URL: http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json works fine in the browser, but when I tried to retrieve its content with PHP:
echo file_get_contents('http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json');
it printed nothing (var_dump(...) gives string(0) ""), so I went a little further and used:
function get_page($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_URL, $url);
    $return = curl_exec($curl);
    curl_close($curl);
    return $return;
}
echo get_page('http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json');
That also printed nothing, so I tried Python (3.x):
import requests
print(requests.get('http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json').text)
And it WORKED. Why is this happening? What's going on?

It looks like they're blocking requests based on the user agent, or rather the lack of one, since neither PHP's cURL nor file_get_contents sets a User-Agent header by default.
You can fake one by setting it to something like Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:7.0.1) Gecko/20100101 Firefox/7.0.1:
<?php
function get_page($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:7.0.1) Gecko/20100101 Firefox/7.0.1');
    $return = curl_exec($curl);
    curl_close($curl);
    return $return;
}
echo get_page('http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json');
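If you'd rather stick with file_get_contents, the same idea applies there: PHP's http:// stream wrapper sends no User-Agent unless the user_agent ini setting is defined. A minimal sketch (the agent string is just an example):
<?php
// Set the user agent used by the http:// stream wrapper (file_get_contents, fopen, ...).
ini_set('user_agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:7.0.1) Gecko/20100101 Firefox/7.0.1');
echo file_get_contents('http://api.promasters.net.br/cotacao/v1/valores?moedas=USD&alt=json');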

I experienced the same behaviour.
Fetching the URL using command-line curl worked for me.
I then made a file_get_contents call to another script that dumps all request headers to a file using getallheaders():
<?php
file_put_contents('/tmp/request_headers.txt', var_export(getallheaders(),true));
Output of file:
array (
'Host' => 'localhost',
)
I then inspected the curl request headers:
$ curl -v URL
and tried adding them one at a time to the file_get_contents request. It turned out that a User-Agent header was needed:
<?php
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: examplebot\r\n"
    )
);
$context = stream_context_create($opts);
$response = file_get_contents($url, false, $context);
This gave me a useful response.

Related

file_get_contents does not work for getting json from API in PHP 7

I just upgraded PHP from 5.6 to 7.2.
Before 7.2, file_get_contents worked fine for getting JSON from the API, but after the upgrade to 7.2 it returns false:
file_get_contents($url) => false
The url is like this:
'https://username:password#project_domain/api/json/xxx/?param_a=' . $a . '&param_b='. $b
And I didn't even touch the default php.ini setting that is probably relevant to file_get_contents:
allow_url_fopen = On
I googled this, but there is no straight answer for my problem.
What is the reason for this?
How do I fix it?
Thanks!
Try sending explicit headers via a stream context:
$url = "https://www.f5buddy.com/";
$options = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en\r\n" .
                    "User-Agent: Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_0 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Version/4.0.5 Mobile/8A293 Safari/6531.22.7\r\n"
    )
);
$context = stream_context_create($options);
$file = file_get_contents($url, false, $context);
$file = htmlentities($file);
echo json_encode($file);
Finally got it with cURL. It only worked when I skipped the SSL verification. It is just HTTPS to my own project anyway, so it shouldn't be a security problem.
function getJsonFromAPI($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $result = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($result);
    return $data;
}
By the way, I found out that file_get_contents only works for HTTPS requests to external URLs, but not for an HTTPS connection to the project itself. Any fix for that is appreciated.
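One more thing that might be worth testing for the username:password part of the URL (an untested sketch with placeholder credentials and URL): send the credentials as an explicit Authorization header through a stream context instead of embedding them in the URL, and inspect error_get_last() when the call returns false:
$username = 'username'; // placeholder
$password = 'password'; // placeholder
$context = stream_context_create(array(
    'http' => array( // the 'http' options also apply to https:// URLs
        'method' => 'GET',
        'header' => 'Authorization: Basic ' . base64_encode("$username:$password") . "\r\n"
    )
));
$json = file_get_contents('https://project_domain/api/json/xxx/?param_a=1&param_b=2', false, $context);
if ($json === false) {
    var_dump(error_get_last()); // shows why the wrapper failed
}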

file_get_contents fails via php, works via browser

What I'm trying to achieve:
A GET request to an API endpoint, retrieving XML and subsequently parsing the results.
I am using a file_get_contents request to achieve this.
Issues:
`file_get_contents` fails with this error:
Warning: file_get_contents(https://api.twitter.com/1.1/statuses/mentions_timeline.json):
failed to open stream:
A connection attempt failed because the connected party did not properly
respond after a period of time, or established connection failed because
connected host has failed to respond.
Update 17/08
To consolidate my current understanding:
1. What FAILS:
1.a PHP (timeout)
1.b the command line (curl -G http://api.eve-central.com/api/quicklook?typeid=34)
1.c file_get_contents
1.d file_get_contents with stream_context_create
2. What WORKS:
2.a Pasting the URL in a Chrome tab
2.b Via Postman
What has been attempted:
- Check headers in Postman, and try to replicate them via PHP.
Response headers sent back by eve-central, as shown in Postman:
Access-Control-Allow-Origin → *
Connection → Keep-Alive
Content-Encoding → gzip
Content-Type → text/xml; charset=UTF-8
Date → Wed, 17 Aug 2016 10:40:24 GMT
Proxy-Connection → Keep-Alive
Server → nginx
Transfer-Encoding → chunked
Vary → Accept-Encoding
Via → HTTP/1.1 proxy10014
Corresponding Code:
$headers = array(
    'method' => 'GET',
    'header' => 'Connection: Keep-Alive',
    'header' => 'Content-Encoding: gzip',
    'header' => 'Content-Type: text/xml',
    'header' => 'Proxy-Connection: Keep-Alive',
    'header' => 'Server: nginx',
    'header' => 'Transfer-Encoding: chunked',
    'header' => 'Vary: Accept-Encoding',
    'header' => 'Via: HTTP/1.1 proxy10014'
);
curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_PORT, 8080); // attempt at changing the port in case it was blocked
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($curl, CURLOPT_POST, false);
curl_setopt($curl, CURLOPT_URL, $url);
$resp = curl_exec($curl);
if (curl_error($curl)) {
    echo 'error:' . curl_error($curl);
}
- Used Wireshark to capture the GET request to see if changing the port helped
- Ran curl via the command line
I'm out of ideas and options.
So the questions are:
If it works in a browser, and in Postman, why does it not work via PHP?
How can I modify my code so that it mimics what Postman does?
Previous Attempts
What I have tried:
Various cURL options from other threads, such as
function curl_get_contents($url) {
    $ch = curl_init();
    if (!$ch) {
        die("Couldn't initialize a cURL handle");
    } else {
        echo "Curl Handle initialized ";
    }
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $data = curl_exec($ch);
    // Check if any error occurred
    if (!curl_errno($ch)) {
        $info = curl_getinfo($ch);
        echo 'Took ', $info['total_time'], ' seconds to send a request to ', $info['url'], "";
        displayData($info);
    } else {
        echo "Failed Curl, reason: " . curl_error($ch) . " ";
    }
    curl_close($ch);
    return $data;
}
result: nothing, no data returned.
- Checked php.ini options:
- allow_url_fopen is On
- allow_url_include = on
- relevant ssl extensions are enabled
- Raised the timeout window
- both via php.ini
- also via explicit declaration within the php file.
- Tried with a different url
- same error, so it doesn't really depend on my particular endpoint
- for example, twitter/wikipedia/google all return the same error
- tried with:
- file_get_contents on a local xml file (https://msdn.microsoft.com/en-us/library/ms762271(v=vs.85).aspx) --> works
- file_get_contents on a remote xml file (http://www.xmlfiles.com/examples/note.xml) --> fails with the same error
- Overall, the following is true, so far:
- Curl fails, timeout
- file_get_Contents fails, timeout
- Open XML file url in a browser works
- Make a GET request via Postman, works
Obviously, in all cases where file_get_contents fails via PHP, I can easily access the file via any browser.
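One detail in the Postman capture above may matter: the Proxy-Connection and Via: HTTP/1.1 proxy10014 headers suggest the browser and Postman are going through an HTTP proxy that PHP and command-line curl are not using. If that is the case (an assumption; the proxy host and port below are placeholders), routing the cURL request through the same proxy can be tested like this:
$ch = curl_init('http://api.eve-central.com/api/quicklook?typeid=34');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Hypothetical proxy settings; take the real values from the browser/system proxy configuration.
curl_setopt($ch, CURLOPT_PROXY, 'proxy.example.local');
curl_setopt($ch, CURLOPT_PROXYPORT, 8080);
$xml = curl_exec($ch);
echo curl_error($ch) ?: substr($xml, 0, 200); // show the error, or the first bytes of the response
curl_close($ch);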
I tried to work around the issue.
Attempt 1:
Use nitrous.io, create a LAMP stack, and perform the deed via the platform.
Result: file_get_contents works; however, due to the large number of XML files to be retrieved, the operation times out.
Tentative solution:
- Download the XML files from the source
- Zip them
- Download the zipped file
- Parse said XML files locally
Later on, write a small PHP script that, when invoked, performs the bits above and sends the data to the local directory, which then unpacks it and performs additional work on it.
Another attempt would be to use Google Sheets, with a user function that pulls the data into the sheet, and just dump the Excel file / values into MySQL.
For my purposes, while an awfully ignorant solution, it does the trick.
Code used to avoid the timeout issue on a shared host:
function downloadUrlToFile2($url, $outFileName)
{
    //file_put_contents($xmlFileName, fopen($link, 'r'));
    //copy($link, $xmlFileName); // download xml file
    echo "Passing $url into $outFileName ";
    $fp = fopen($outFileName, "w");
    if (is_file($url)) {
        copy($url, $outFileName); // local path: just copy the file
    } else {
        $ch = curl_init();
        $options = array(
            CURLOPT_TIMEOUT => 28800, // set this to 8 hours so we don't time out on big files
            CURLOPT_URL     => $url
        );
        curl_setopt($ch, CURLOPT_FILE, $fp); // cURL writes the response straight into $fp
        curl_setopt_array($ch, $options);
        curl_exec($ch); // no fwrite() needed afterwards: CURLOPT_FILE already wrote the body
        curl_close($ch);
    }
    fclose($fp);
}
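For reference, a possible way to drive that function (the type IDs and target directory below are made up for illustration, and the directory must already exist):
// Hypothetical batch: fetch a few quicklook XML files into a local dump directory.
$typeIds = array(34, 35, 36);
foreach ($typeIds as $typeId) {
    downloadUrlToFile2(
        "http://api.eve-central.com/api/quicklook?typeid=$typeId",
        __DIR__ . "/xmldump/quicklook_$typeId.xml"
    );
}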
I have also added this at the top of the script:
ignore_user_abort(true);
set_time_limit(0);
ini_set('memory_limit', '2048M');
I see an issue with the HTTPS URL request; to fix it, you have to add the lines below to your cURL request:
function curl_get_contents($url) {
    $ch = curl_init();
    $header[0] = "Accept: text/xml,application/xml,application/xhtml+xml,";
    $header[0] .= "text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
    $header[] = "Cache-Control: max-age=0";
    $header[] = "Connection: keep-alive";
    $header[] = "Keep-Alive: 300";
    $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
    $header[] = "Accept-Language: en-us,en;q=0.5";
    $header[] = "Pragma: ";
    curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
    // I have added the two lines below
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
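Note that CURLOPT_SSL_VERIFYPEER => false turns certificate checking off entirely. If the real problem is only a missing CA bundle on the machine, a safer variant (a sketch; the cacert.pem path is an assumption and depends on your system) keeps verification on and points cURL at a bundle:
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Keep peer/host verification enabled and supply a CA bundle explicitly.
// A current cacert.pem can be downloaded from the cURL project site.
curl_setopt($ch, CURLOPT_CAINFO, '/path/to/cacert.pem');
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
$data = curl_exec($ch);
if ($data === false) {
    echo curl_error($ch); // certificate problems will show up here
}
curl_close($ch);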

cURL headers in command line show content-type as image/png, in PHP shows text/html?

I'm attempting to use cURL to download an external image file. When used from the command line, cURL correctly states the response headers with content-type=image/png. When I attempt to use cURL in PHP however, it returns content-type=text/html.
When attempting to save the file using cURL in PHP, with the CURLOPT_BINARYTRANSFER option set to 1 in conjunction with fopen/fwrite, the result is a corrupt file.
The only cURL flag I'm using on the command line is -A, to send a user agent with the request, which I've also done in PHP by calling curl_setopt($ch, CURLOPT_USERAGENT, ...).
The only thing I can think of that would cause this is perhaps some background request headers sent by command-line cURL which aren't accounted for when using the standard PHP functions?
For reference:
CLI
curl -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3" -I http://find.icaew.com/data/imgs/736c476534ddf7b249d806d9aa7b9ee8.png
PHP
private function curl($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1);
    $response = array(
        'html'          => curl_exec($ch),
        'http_code'     => curl_getinfo($ch, CURLINFO_HTTP_CODE),
        'contentLength' => curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD),
        'contentType'   => curl_getinfo($ch, CURLINFO_CONTENT_TYPE)
    );
    curl_close($ch);
    return $response;
}
public function parseImage() {
    $imageSrc = pq('img.firm-logo')->attr('src');
    if (!empty($imageSrc)) {
        $newFile = '/Users/firstlast/Desktop/Hashery/test01/imgdump/' . $this->currentListingId . '.png';
        $curl = $this->curl('http://find.icaew.com' . $imgSrc);
        if ($curl['http_code'] == 200) {
            if (file_exists($newFile)) unlink($newFile);
            $fp = fopen($newFile, 'x');
            fwrite($fp, $curl['html']);
            fclose($fp);
            return $this->currentListingId;
        } else {
            return 0;
        }
    } else {
        return 0;
    }
}
When I mention content-type=text/html, I mean that the call to $this->curl() results in the contentLength and contentType entries of the returned $response array having the values -1 and text/html respectively.
I can imagine this is quite an obscure question, so I've attempted to provide as much context as possible about what is going on and what I'm trying to achieve. Any help in understanding why this is the case, and what I can do to resolve it and achieve my goal, would be greatly appreciated.
If you know exactly what you are getting, then file_get_contents() is much simpler. A URL can be used as the filename with this function:
http://php.net/manual/en/function.file-get-contents.php
Also, it is helpful to go through the user comments on php.net, as they contain many examples and potential issues or tricks for using the function.
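A minimal sketch of that suggestion, reusing the user agent from the question (the output path is just an example):
$url = 'http://find.icaew.com/data/imgs/736c476534ddf7b249d806d9aa7b9ee8.png';
$context = stream_context_create(array(
    'http' => array(
        'header' => "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3\r\n"
    )
));
$png = file_get_contents($url, false, $context);
if ($png !== false) {
    file_put_contents('/tmp/test.png', $png); // example output path
}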

How to get router information using cURL and PHP

I am building a web application for my router; it will be my Bachelor's thesis.
The bad thing is that I can't display my router's information using my cURL function, because I get a bad-username-or-password error from the router. I couldn't find any problem with the code at all:
The cURL function:
function myCurl($url, $post = "")
{
    global $status;
    $header = 'Authorization: Basic YWRtaW46YWRtaW4=';
    $cookiepath_tmp = "c:/xampp/htdocs/wifi/cookie.txt";
    $resp = array();
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 5.1; rv:1.7.3) Gecko/20041001 Firefox/0.10.1");
    curl_setopt($ch, CURLOPT_URL, trim($url));
    curl_setopt($ch, CURLOPT_REFERER, trim($url));
    curl_setopt($ch, CURLOPT_COOKIEJAR, $cookiepath_tmp);
    curl_setopt($ch, CURLOPT_COOKIEFILE, $cookiepath_tmp);
    curl_setopt($ch, CURLOPT_COOKIESESSION, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
    curl_setopt($ch, CURLOPT_ENCODING, "");
    #curl_setopt($ch, CURLOPT_AUTOREFERER, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));
    curl_setopt($ch, CURLOPT_VERBOSE, 1);
    #curl_setopt($ch, CURLOPT_FAILONERROR, true);
    if ($post) {
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
    }
    $returned = curl_exec($ch);
    $resp['returned'] = $returned;
    $status = curl_getinfo($ch);
    $resp['status'] = $status;
    curl_close($ch);
    return $resp;
}
I am trying to display the information using PHP:
The PHP code:
<?php echo $success_msg;
$url = "http://192.168.0.1/session.cgi";
$post = "REPORT_METHOD=xml&ACTION=login_plaintext&USER=admin&PASSWD=admin&CAPTCHA=";
$data = myCurl($url, $post);
#$url = "http://192.168.0.1/st_log.php";
#$data = myCurl($url);
echo $data['returned'];
?>
The error is:
Username or Password is incorrect.
However, the username and password admin are correct.
I have added the following line to the myCurl function, but it still doesn't work:
$header = 'Authorization: Basic YWRtaW46YWRtaW4=';
YWRtaW46YWRtaW4= is the username:password pair (admin:admin) encoded in Base64.
LAST EDIT:
I set the CURLOPT_HEADER to true, and I got this text displayed:
HTTP/1.1 501 Not Implemented Server: Router Webserver Connection: close WWW-Authenticate: Basic realm="TP-LINK Wireless Lite N Router WR740N" Content-Type: text/html
Any solution for this?
I really appreciate your help! Thank you!
I don't know what your router is (vendor/model), but most of them use HTTP basic authentication. When the authentication is empty or wrong, you get an HTTP 401 Unauthorized error, which could correspond to your error string.
So you should try to insert an HTTP Authorization header into the cURL request:
Authorization: Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==
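Note that in the myCurl() function above, $header is assigned but never attached to the request. A sketch of two ways to actually send it (assuming the router really does want basic auth with admin:admin):
// Option 1: add the pre-encoded header to the existing CURLOPT_HTTPHEADER array.
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:', $header));

// Option 2: let cURL build the Authorization header itself.
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, 'admin:admin');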

file_get_contents from specific URL

I have an API key whose provider verifies the requesting URL/domain.
If I do
echo file_get_contents('http://myfilelocation.com/?apikey=1234');
RESULT : this api key is not authorized for this domain
However, if I put the requested URL within an iframe on the page:
RESULT : this api key is authorized
Obviously, the server I'm getting the JSON data from is working properly, because the iframe outputs the proper information. But how can I make sure PHP makes the request with the proper domain and URL settings?
Using file_get_contents, I always get back that the API key is not authorized, even though I'm running the PHP script from the authorized domain.
Try this PHP code:
<?php
$options = array(
    'http' => array(
        'method' => "GET",
        'header' => "Host: myfilelocation.com\r\n" . // don't forget to replace this with your domain
                    "Accept-language: en\r\n" .
                    "User-Agent: Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.102011-10-16 20:23:10\r\n"
    )
);
$context = stream_context_create($options);
$file = file_get_contents("http://myfilelocation.com/?apikey=1234", false, $context);
?>
file_get_contents doesn't send any referrer information, and the API may need it; this may help you:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://myfilelocation.com/?apikey=1234');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://autorized-domain.here');
$html = curl_exec($ch);
echo $html;
?>
