I have a web service that you POST data to, and it returns a txt.gz file. I'm trying to use cURL to post the information, but I'm not sure how to be ready for and handle the file that comes back for download.
I'm getting a successful response, but the file is 0 KB and obviously isn't downloading. Any help would be greatly appreciated!
Here is currently what I'm doing:
$url = 'http://www.mywonderfulurl.com';
$fields = array('id'=>'123');
$fields_string = 'id=123';
$useragent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1";
$ch = curl_init();
// set user agent
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
//set the url, number of POST vars, POST data
curl_setopt($ch,CURLOPT_URL,$url);
curl_setopt($ch,CURLOPT_POST,count($fields));
curl_setopt($ch,CURLOPT_POSTFIELDS,$fields_string);
curl_setopt($ch, CURLOPT_TIMEOUT, 250);
curl_setopt($ch, CURLOPT_FILE, 'download.txt.gz');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
if (file_put_contents('download.txt.gz', curl_exec($ch)) === false) {
// Handle unable to write to file error.
echo('you failed');
exit;
}
echo curl_getinfo( $ch, CURLINFO_HTTP_CODE );
//close connection
curl_close($ch);
You are never writing to the file. You need to write the result to the file:
fwrite($fp, $result);
You are probably better off using file_put_contents(); then you don't need to open and close the file.
http://php.net/manual/en/function.file-put-contents.php
if (file_put_contents('download.txt.gz', curl_exec($ch)) === false) {
// Handle unable to write to file error.
}
But you may want to capture $result first and check that the request actually succeeded before saving it to a file.
You should check the return from curl_exec()
$result = curl_exec($ch);
if ($result === false) {
echo 'Curl error: ' . curl_error($ch);
}
Writing false to a file would result in an empty file, so this could be what's happening.
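Putting these fixes together, a minimal corrected sketch might look like the following (using the question's placeholder URL and POST fields). Note that CURLOPT_FILE, as used in the question, expects an open file handle rather than a filename string, so it is dropped here in favour of CURLOPT_RETURNTRANSFER plus file_put_contents():

```php
<?php
// Sketch of the corrected flow; URL and fields are the question's placeholders.
$url = 'http://www.mywonderfulurl.com';
$fields_string = 'id=123';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it

$result = curl_exec($ch);
if ($result === false) {
    die('Curl error: ' . curl_error($ch)); // e.g. timeout or DNS failure
}
curl_close($ch);

// Only write the file once we know the request succeeded.
if (file_put_contents('download.txt.gz', $result) === false) {
    die('Unable to write download.txt.gz');
}
```

This way a failed request never produces an empty 0 KB file, because nothing is written until $result is known to be good.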
I'm currently developing a Nagios plugin with PHP and cURL.
My problem is that my script works well when I run it with PHP directly, like this:
#php /usr/local/nagios/plugins/script.php
I mean, it returns a 200 HTTP code.
But under Nagios it returns a 0 HTTP code. It's strange, because PHP itself works with Nagios (I can read variables...), so the problem is that Nagios can't use cURL.
Can someone give me a clue? Thanks.
Here you can see my code.
<?php
$widgeturl = "http://google.com";
$agent = "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12";
if (!function_exists("curl_init")) die("pushMeTo needs CURL module, please install CURL on your php.");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $widgeturl);
$page = curl_exec($ch); //or die("Curl exe failed");
$code=curl_getinfo($ch, CURLINFO_HTTP_CODE);
if ($code==200) {
fwrite(STDOUT, $page.'Working well : '.$code);
exit(0);
}
else {
fwrite(STDOUT, $page.'not working : '.$code);
exit(1);
}
curl_close($ch);
Solution:
It was because a proxy was configured at the OS level (CentOS), but Nagios was not using it, unlike PHP run from the shell. So I just had to add:
curl_setopt($ch, CURLOPT_PROXY, 'myproxy:8080');
curl_setopt($ch, CURLOPT_PROXYUSERPWD, "user:pass");
Hope it helps someone.
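One way to see why the script behaved differently under Nagios: an OS-level proxy is usually exported as the http_proxy environment variable in your login shell, and Nagios runs plugins with a minimal environment that does not inherit it. A small sketch (the fallback logic is my addition, not from the thread) that applies the variable when present:

```php
<?php
// Sketch: fall back to the http_proxy environment variable if it is set.
// Nagios typically runs plugins with a minimal environment, so a proxy
// configured in your login shell is not inherited automatically.
$ch = curl_init('http://google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$proxy = getenv('http_proxy'); // usually empty/false under service environments
if ($proxy !== false && $proxy !== '') {
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    // curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'user:pass'); // if required
}

$page = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
```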
Can you try making the cURL request like this (i.e. a header-only request)?
<?php
// config
$url = 'http://www.google.com/';
// make request & parse response
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_FILETIME, true);
curl_setopt($curl, CURLOPT_NOBODY, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$response_header = curl_exec($curl);
$response_info = curl_getinfo($curl);
curl_close($curl);
// debug
echo "<b>response_header</b>\r\n";
var_dump($response_header);
echo "<b>response_info</b>\r\n";
var_dump($response_info);
The above will dump the response header and the response info, so you can see exactly what cURL is getting back.
I have a problem with one website I wrote a few weeks ago.
My website communicates with another site, website_2, via an API hosted on website_2.
The cURL operation is triggered by a POST request to a PHP file.
If for some reason the operation takes a long time (I can't determine the reason) and the user hits refresh, the command sent to the API still completes, yet my server doesn't get any result, so it can't log or do anything with that result.
Is there a way to preserve the integrity of such a transaction?
Below is my code; I still get FAILED no matter what the result was on website_2.
function doCommit($url_)
{
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url_);
curl_setopt($ch, CURLOPT_USERAGENT, 'Opera/9.23 (Windows NT 5.1; U; en)');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_TIMEOUT,5);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT,5);
$commit = curl_exec($ch);
curl_close($ch);
if(!curl_exec($ch))
{
$ERROR="<Transaction>
<Result>Failed</Result>
<Reason>Operation Timed Out</Reason>
</Transaction>";
$oXML = new SimpleXMLElement($ERROR);
return $oXML;
}
else{
$oXML = new SimpleXMLElement($commit);
return $oXML;
}
// return $oXML->Reason;
}
You can use cURL's parameters to solve your problem by setting a request timeout:
CURLOPT_TIMEOUT - the maximum number of seconds the cURL transfer is allowed to take.
CURLOPT_CONNECTTIMEOUT - the maximum number of seconds to wait while trying to connect.
...and then you can report an error if curl_exec() fails:
if(curl_exec($curl) === false)
{
echo 'ERROR: ' . curl_error($curl);
}
Solved it with the following code (note: curl_errno() must be read before curl_close(), since the error state belongs to the handle):
function doCommit($url_)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url_);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Opera/9.23 (Windows NT 5.1; U; en)');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
    $commit = curl_exec($ch);
    $errno = curl_errno($ch); // read the error state before closing the handle
    curl_close($ch);
    if ($errno == 0) {
        $oXML = new SimpleXMLElement($commit);
        return $oXML;
    }
    else {
        $ERROR = "<Transaction>
<Result>Failed</Result>
<Reason>Operation Timed Out</Reason>
</Transaction>";
        $oXML = new SimpleXMLElement($ERROR);
        return $oXML;
    }
}
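Timeouts alone don't fully answer the original integrity question: if the user refreshes while the API call is in flight, the remote side may execute the command while the local server never sees the result. One common pattern (not from this thread; the session key, URL, and txn parameter name are all illustrative) is to tag each logical operation with a unique transaction ID so website_2 can deduplicate repeats and replay the stored result:

```php
<?php
// Sketch: generate a transaction ID once per logical operation and send it
// with the request. If the user refreshes, the same ID is re-sent and the
// remote API can return the stored result instead of re-executing.
session_start();
if (empty($_SESSION['txn_id'])) {
    $_SESSION['txn_id'] = uniqid('txn_', true); // unique per operation
}
$url = 'https://website2.example/api/commit?txn=' . urlencode($_SESSION['txn_id']);
// ... perform the cURL call as in doCommit($url) ...
// On success, clear the ID so the next operation gets a fresh one:
unset($_SESSION['txn_id']);
```

This only works if the remote API cooperates by storing results keyed on the transaction ID, so treat it as a design sketch rather than a drop-in fix.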
I am using the following code to get the XML data from icecat.biz:
set_time_limit (0);
$login = "Arpan";
$password = "arpan";
//$url="http://data.icecat.biz/export/freexml.int/EN/files.index.xml";
$url= "http://data.icecat.biz/export/level4/EN";
//$url="http://data.icecat.biz/export/freexml.int/DE/10.xml";
$user_agent = 'Mozilla/5.0 (Windows; U;
Windows NT 5.1; ru; rv:1.8.0.9) Gecko/20061206 Firefox/1.5.0.9';
$header = array(
"Accept: text/xml,application/xml,application/xhtml+xml,
text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5",
"Accept-Language: ru-ru,ru;q=0.7,en-us;q=0.5,en;q=0.3",
"Accept-Charset: windows-1251,utf-8;q=0.7,*;q=0.7",
"Keep-Alive: 300");
$local_path = "myxml.xml";
$file_handle = fopen($local_path, "w");
ob_start();
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_FILE, $file_handle);
curl_setopt($ch, CURLOPT_HEADER, 0);
//curl_setopt ( $ch , CURLOPT_HTTPHEADER, $header );
curl_setopt($ch, CURLOPT_USERPWD, $login . ":" . $password);
curl_setopt($ch, CURLOPT_TIMEOUT, 0); // 0 = never time out
//curl_setopt($c, CURLOPT_TIMEOUT, 2);
//curl_setopt($ch, CURLOPT_NOBODY, TRUE); // remove body
//curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
//$head = curl_exec($ch);
$result = curl_exec ($ch);
if(curl_errno($ch))
{
echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
ob_end_clean();
fclose ($file_handle);
$xmlStr = file_get_contents($local_path);
$xmlObj = simplexml_load_string($xmlStr);
print "<pre>";
//print_r($xmlObj->Product->ProductRelated->attributes()->ID);
print_r($xmlObj);
exit;
The page executes for an unlimited time, but the XML stops updating after 10 to 20 seconds, and the output XML is incomplete. I think that after a certain time the server stops responding or data stops being transferred.
Here is the error message:
**** The server xml (icecat) size is big
What is the problem and how do I fix it?
Sounds like you are not giving the request enough time to download properly.
Uncomment your //curl_setopt($c, CURLOPT_TIMEOUT, 2); line (note the stray $c; it should be $ch) and set the timeout to 600 for a test.
Beyond that, your request looks fine. You could also check whether the server is caching responses. The last thing I've seen recently is users with reverse proxies that cache their normal operations: some truncated responses got cached, and that's all they got back for a 24-hour period. That may not be related to your case, though.
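Since the icecat export is large, a sketch of the same request that streams straight to disk may also help (the credentials and URL are the question's own; the low-speed options are my addition). CURLOPT_FILE with an open handle avoids buffering the whole document in memory, and a generous timeout gives the transfer time to finish:

```php
<?php
// Sketch: stream a large authenticated download directly to a file.
$fh = fopen('myxml.xml', 'w');
$ch = curl_init('http://data.icecat.biz/export/level4/EN');
curl_setopt($ch, CURLOPT_FILE, $fh);          // write the body straight to the handle
curl_setopt($ch, CURLOPT_USERPWD, 'Arpan:arpan');
curl_setopt($ch, CURLOPT_TIMEOUT, 600);       // allow up to 10 minutes overall
curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1); // abort only if the transfer
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 60); // stalls below 1 B/s for 60 s
if (curl_exec($ch) === false) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
fclose($fh);
```

There is no need for the ob_start()/ob_end_clean() pair from the question once the body is written directly to the file handle.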
A similar question has been posted, but I could not find the solution there:
"Curl error: Could not resolve host: saved_report.xml; No data record of requested type"
<?php
$url="http://en.wikipedia.org/wiki/Pakistan";
$ch = curl_init(urlencode($url));
echo $ch;
// used to spoof that coming from a real browser so we don't get blocked by some sites
$useragent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1";
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);
curl_setopt($ch, CURLOPT_TIMEOUT, 8);
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 10);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$content = curl_exec($ch);
$info = curl_getinfo($ch);
if ($content === false || $info['http_code'] != 200) {
$content = "No cURL data returned for $url [". $info['http_code']. "]";
if (curl_error($ch))
$content .= "\n". curl_error($ch);
}
else {
// 'OK' status; format $output data if necessary here:
echo "...";
}
echo $content;
curl_close($ch);
?>
When I paste the same address into a browser I can access the webpage, but when I run this script I get the error message. Can anyone please help me?
Thanks
Remove the urlencode call.
Remove the urlencode($url); it should be:
$ch = curl_init($url);
Well, if you remove urlencode() when instantiating your $ch variable, it works just fine. urlencode() is definitely wrong here.
Good:
$ch = curl_init($url);
Bad:
$ch = curl_init(urlencode($url));
$ch = curl_init($url);
instead of
$ch = curl_init(urlencode($url));
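To see why this fails: urlencode() is meant for encoding individual query-string values, not whole URLs. Applied to the full URL it also escapes the "://" and "/" separators, so cURL can no longer parse a scheme or hostname out of the string. A small illustration (the example.com search URL is just an invented example of where encoding does belong):

```php
<?php
$url = "http://en.wikipedia.org/wiki/Pakistan";

// urlencode() escapes every reserved character, including the "://"
// and "/" that give the URL its structure:
echo urlencode($url), "\n";
// → http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FPakistan
// cURL cannot find a scheme or hostname in that string, so the
// request fails before it ever reaches the network.

// Encode only the parts that need it, such as query values:
echo "http://example.com/search?" . http_build_query(['title' => 'São Paulo']), "\n";
// → http://example.com/search?title=S%C3%A3o+Paulo
```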
I am fetching a page from some site, but it displays nothing and the URL address changes.
For example, I typed:
http://localhost/sushant/EXAMPLE_ROUGH/curl.php
On the cURL page my code is:
$fp = fopen("cookie.txt", "w");
fclose($fp);
$agent= 'Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.9) Gecko/2008052906 Firefox/3.0';
$ch = curl_init();
curl_setopt($ch, CURLOPT_USERAGENT, $agent);
// 2. set the options, including the url
curl_setopt($ch, CURLOPT_URL, "http://www.fnacspectacles.com/place-spectacle/manifestation/Grand-spectacle-LE-ROI-LION-ROI4.htm");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt ($ch, CURLOPT_COOKIEJAR, 'cookie.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt");
// 3. execute and fetch the resulting HTML output
if(curl_exec($ch) === false)
{
echo 'Curl error: ' . curl_error($ch);
}
else
echo $output = curl_exec($ch);
// 4. free up the curl handle
curl_close($ch);
But it changes the URL to something like this:
http://localhost/aide.do?sht=_aide_cookies_
Object not found.
How can I solve this problem? Please help.
It looks like you're both trying to save cookies to cookie.txt and to read them from there. What you would normally do is have cURL save the cookies to a file on the first URL you visit, then supply that file for subsequent requests.
I'm not sure about the PHP aspects, but from the cURL side it looks like you're trying to read a cookie file that doesn't exist yet.
Edit: oh, and if you're only doing one request, you shouldn't even need cookies.
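The save-then-reuse flow described above can be sketched like this (same cookie.txt and target URL as the question): the first request only collects cookies, and the second sends them back.

```php
<?php
// Sketch: fetch the page once to collect session cookies, then fetch it
// again with the saved cookies supplied.
$target = 'http://www.fnacspectacles.com/place-spectacle/manifestation/Grand-spectacle-LE-ROI-LION-ROI4.htm';
$jar = 'cookie.txt';

// First pass: CURLOPT_COOKIEJAR writes received cookies to $jar on close.
$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
curl_exec($ch);
curl_close($ch); // the jar file is written when the handle closes

// Second pass: CURLOPT_COOKIEFILE reads the saved cookies back in.
$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar); // keep the jar up to date
$html = curl_exec($ch);
curl_close($ch);
```

Note there is no need to fopen()/fclose() the cookie file yourself as in the question; cURL manages the jar file itself.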
It seems like there is JavaScript in the output, which is causing the redirect.
So for testing purposes, instead of using:
echo $output = curl_exec($ch);
Use:
$output = curl_exec($ch);
echo strip_tags($output);
Update:
The code below will put the contents into content.htm; everything you need for parsing should be in there and in the $output variable.
$output = curl_exec($ch); // execute once and keep the result
if ($output === false)
{
    echo 'Curl error: ' . curl_error($ch);
}
else {
    $fp2 = fopen("content.htm", "w");
    fwrite($fp2, $output);
    fclose($fp2);
}