I have a cURL script that calls Rotten Tomatoes. Every time I run it, even in a for loop from 1 to 10, it runs infinitely. The only way to stop it is by restarting the server; the page keeps calling the Rotten Tomatoes site until the server goes down. The same cURL code works for other APIs, so it should work for this one. Here it is; any ideas?
(The variable $temp_movie gets its value earlier and works properly.)
$ch = curl_init();
// urlencode keeps movie titles with spaces or punctuation from breaking the query string
$api_link = "http://api.rottentomatoes.com/api/public/v1.0/movies.json?apikey=****&q=".urlencode($temp_movie)."&page_limit=1";
echo $api_link."<br>";
curl_setopt($ch, CURLOPT_URL, $api_link);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 3); // integer seconds, not a string
$content = trim(curl_exec($ch));
curl_close($ch);
$rottentomatoes = json_decode($content, true);
I have no idea why, but as I said, the cURL code worked for other APIs, so I copied and pasted the same cURL code again and tried once more. For some reason this works. Is there a difference I'm just not seeing?
$ch = curl_init();
$api_link = "http://api.rottentomatoes.com/api/public/v1.0/movies.json?apikey=****&q=".urlencode($temp_movie)."&page_limit=1";
curl_setopt($ch, CURLOPT_URL, $api_link);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 3);
$content = trim(curl_exec($ch));
curl_close($ch);
$rottentomatoes = json_decode($content, true);
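Whatever the real difference between the two snippets turns out to be, a defensive version (a sketch of mine, not a diagnosis) can guarantee the loop never hangs the server, by bounding both the connect phase and the whole transfer and reporting failures instead of looping silently:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $api_link);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3); // give up if the TCP/TLS connect stalls
curl_setopt($ch, CURLOPT_TIMEOUT, 5);        // hard cap on the entire transfer
$content = curl_exec($ch);
if ($content === false) {
    echo 'cURL error: ' . curl_error($ch) . "<br>";
}
curl_close($ch);
$rottentomatoes = ($content !== false) ? json_decode(trim($content), true) : null;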
I'm using cURL to scrape two websites, both with the same PHP script (which is run every 30 minutes by a cron job). The request is very simple:
//website 1
$ch = curl_init();
$url = 'url';
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
//website 2
$ch2 = curl_init();
$url2 = 'url';
curl_setopt($ch2, CURLOPT_URL, $url2);
curl_setopt($ch2, CURLOPT_RETURNTRANSFER, true);
$result2 = curl_exec($ch2);
curl_close($ch2);
My question is: what is the best practice in cases like this to prevent running out of memory (it hasn't happened yet, but who knows) and to maximize execution speed?
Is there a way to clean memory after each curl request?
Thank you! :D
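One pattern that helps with both concerns (a sketch, not from the original thread; the URLs are placeholders as in the question): reuse a single cURL handle for both sites, and drop each response as soon as it has been processed. Reusing the handle lets cURL keep its connection machinery warm between requests, and unsetting large response strings frees the memory before the script moves on.

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30); // keep a cron-driven script from hanging

foreach (['url', 'url2'] as $url) { // placeholder URLs, as in the question
    curl_setopt($ch, CURLOPT_URL, $url);
    $result = curl_exec($ch);
    if ($result !== false) {
        process($result); // hypothetical: whatever scraping you do with the HTML
    }
    unset($result); // release the response string before the next request
}
curl_close($ch);

If the two sites are slow and independent of each other, the curl_multi_* functions can fetch them in parallel instead, which helps speed more than memory.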
This really has me perplexed, and other examples on Stack Overflow have not helped.
If I type into the browser:
https://api.bitfinex.com/v2/book/tXMRUSD/P0
I get a long array of data, which is correct.
With the following simple code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,"https://api.bitfinex.com/v2/book/tXMRUSD/P0");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HTTPGET, 1);
$result = curl_exec($ch);
curl_close($ch);
var_dump($result);
I get:
bool(false)
The only difference is that my browser fetches the data directly, while the PHP code runs on a local website hosted with IIS on Windows 10.
I've tried everything and just can't see where I'm going wrong. Any help would be most appreciated. (Ironically, all the authenticated and encrypted POST code works fine, and I'm stuck on the easy stuff!)
Try this code; it should work:
<?php
$url = "https://api.bitfinex.com/v2/book/tXMRUSD/P0";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Disabling certificate checks gets past a missing or outdated CA bundle,
// but it removes the protection TLS gives you against man-in-the-middle attacks.
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close($ch);
print_r($result);
?>
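Disabling verification does make the request go through, but a safer fix (a sketch, assuming the real cause is that PHP on Windows has no CA bundle configured) is to keep verification on and point cURL at a CA certificate file, such as the cacert.pem published by the curl project at https://curl.se/docs/caextract.html:

$url = "https://api.bitfinex.com/v2/book/tXMRUSD/P0";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// The path is an assumption: point it at wherever you saved cacert.pem.
curl_setopt($ch, CURLOPT_CAINFO, 'C:\php\extras\ssl\cacert.pem');
$result = curl_exec($ch);
if ($result === false) {
    echo curl_error($ch); // surfaces the TLS/transport failure instead of a silent false
}
curl_close($ch);
print_r($result);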
I am trying to send an SMS from my localhost with XAMPP installed.
The requested page is an .aspx page served over HTTPS.
I am getting the error "HTTP Error 400. The request is badly formed." or, in some cases, only a blank page.
Details are as follows:
$url = 'https://www.ismartsms.net/iBulkSMS/HttpWS/SMSDynamicAPI.aspx';
$postArgs = 'UserId='.$username.
'&Password='.$password.
'&MobileNo='.$destination.
'&Message='.$text.
'&PushDateTime='.$PushDateTime.
'&Lang='.$Lang;
function getSslPage($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_REFERER, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
$response = getSslPage($all);
echo "<pre>";
print_r($response); exit;
I tried every possible solution/combination found on the internet but could not resolve it. The API developers do not have an example PHP script.
I tried the httpful PHP library and the file_get_contents function but get an empty page. I also tried every combination with curl_setopt.
I need to call this URL without any POST data and see the response from it; instead I get a blank page.
Please note that when I execute the URL with all details in a browser, it works fine.
Can anybody help me in this regard?
Thank you,
Usman
First, run urlencode over your data as follows:
$postArgs = 'UserId='.urlencode($username).
    '&Password='.urlencode($password).
    '&MobileNo='.urlencode($destination).
    '&Message='.urlencode($text).
    '&PushDateTime='.urlencode($PushDateTime).
    '&Lang='.urlencode($Lang);
After that, there are two possible solutions. One is using GET:
curl_setopt($ch, CURLOPT_URL, $url . "?" . $postArgs);
The second option is using the POST method:
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postArgs);
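As a more idiomatic variant (my suggestion, not part of the original answer), http_build_query assembles and urlencodes the whole query string in one step, which removes the risk of a misplaced urlencode call:

$params = array(
    'UserId'       => $username,
    'Password'     => $password,
    'MobileNo'     => $destination,
    'Message'      => $text,
    'PushDateTime' => $PushDateTime,
    'Lang'         => $Lang,
);
$postArgs = http_build_query($params); // urlencodes every value for you

// GET variant:
curl_setopt($ch, CURLOPT_URL, $url . '?' . $postArgs);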
I built an application on top of the Foursquare API. It works perfectly on localhost, but as soon as I upload it to a public website it stops working. I have checked that PHP is working by placing simple echoes throughout my code and have had no issues. On localhost I am able to echo information from the JSON that Foursquare generates; on the public server it echoes nothing.
$urlgen = "https://api.foursquare.com/v2/venues/search?near={$city}&query={$query}&client_id={$client_id}&client_secret={$client_secret}&v=20141015";
$resultFour = fetchData($urlgen);
echo "$resultFour";
This works code returns JSON on localhost but not on the website.
Fetch Data:
function fetchData($url){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
Check whether your public server has cURL installed (I assume fetchData uses cURL to connect to the server). If you are sure it is installed, the following code works for me:
private function fetchUrl($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    // Note: this skips certificate verification; fine for debugging,
    // but configure a CA bundle for production use.
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $feedData = curl_exec($ch);
    curl_close($ch);
    return $feedData;
}
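To see why the call fails on the public server instead of guessing, here is a debugging sketch (mine, not part of the answer above) that checks for the extension and surfaces the transport error:

// Fail fast if the host lacks the cURL extension.
if (!extension_loaded('curl')) {
    die('The cURL extension is not installed on this server.');
}

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $urlgen); // $urlgen as built in the question above
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 20);
$result = curl_exec($ch);
if ($result === false) {
    // A missing CA bundle typically shows up here as "SSL certificate problem".
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);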
I am trying to open the homepages of websites and extract the title and description from their HTML markup using cURL with PHP. I am successful in doing this to an extent, but there are many websites I am unable to open. My code is here:
function curl_download($Url){
    if (!function_exists('curl_init')){
        die('Sorry, cURL is not installed!');
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $Url);
    curl_setopt($ch, CURLOPT_HEADER, 0); // don't prepend HTTP headers to the HTML we hand to DOMDocument
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}
// $url is any url
$source = curl_download($url);
$d = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely valid; keep parser warnings quiet
$d->loadHTML($source);
$title = $d->getElementsByTagName("title")->item(0)->textContent;
$domx = new DOMXPath($d);
$desc = $domx->query("//meta[@name='description']")->item(0);
$description = $desc->getAttribute('content');
This code works fine for most websites, but there are many it is not even able to open. What could be the reason?
When I tried getting the headers of those websites using the get_headers function it worked fine, but they cannot be opened using cURL. Two of these websites are blogger.com and live.com.
Replace:
$output = curl_exec($ch);
with
// Verification off and SSLv3 forced: debugging aids only, not production settings.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
$output = curl_exec($ch);
if (!$output) {
    echo curl_error($ch);
}
and see why cURL is failing.
It's a good idea to always check the result of function calls to see if they succeeded or not, and to report when they fail. While a function may work 99.999% of the time, you need to report the times it fails, and why, so the underlying cause can be identified and fixed, if possible.
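Another cause worth ruling out for sites like blogger.com and live.com (my own suggestion, not part of the answer above): some large sites reject requests that do not look like they come from a browser. Sending a browser-like User-Agent before calling curl_exec often changes the outcome:

curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'); // any realistic browser UA string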