I'm trying to get JSON data from 51 URLs using cURL in PHP. Recently I've been stuck on a connection timeout issue, and the data comes back as bool(false).
Here is the error message:
Curl Error no :28
Curl error message :Connection timed out after 60051 milliseconds
Count Value : 0
bool(false)
And here is my code:
foreach ($this->state_list as $short_code => $state_name_value) {
    // echo "Key=" . $short_code . ", Value=" . $state_name_value;
    $legislator_of_state_Url = "https://openstates.org/api/v1/legislators/?state=" . $short_code . "&chamber=upper";
    echo "URL : " . $legislator_of_state_Url . "<br/>";

    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $legislator_of_state_Url);
    curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_HTTPHEADER, array('Accept: application/json'));
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 60);

    $feed = curl_exec($curl);
    echo "Curl Info :" . print_r(curl_getinfo($curl), true) . '<br/>';
    echo "Curl Error no :" . curl_errno($curl) . '<br/>';
    echo "Curl error message :" . curl_error($curl) . '<br/>';
    curl_close($curl);

    // print_r(json_decode($feed, true));
    $jsonarray = json_decode($feed, true);
    echo "Count Value : " . count($jsonarray);
    if (count($jsonarray) == 0) {
        var_dump($feed);
    }
}
Every time I run this code, 2 random URLs out of the 51 return this error, so I don't get any JSON data for them.
What could be the solution?
If you want to increase the script's execution time limit, use:
set_time_limit(seconds);
But there can be other issues as well, such as the URL not responding at all. You can also manually check the 2 URLs that are giving the error.
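For example, a rough sketch based on the loop in the question (the 120-second script limit and the 10/30-second cURL timeouts are placeholder values to adjust):
set_time_limit(120); // give the whole script more time than the default max_execution_time

foreach ($this->state_list as $short_code => $state_name_value) {
    $url = "https://openstates.org/api/v1/legislators/?state=" . $short_code . "&chamber=upper";

    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10); // time allowed to establish the connection
    curl_setopt($curl, CURLOPT_TIMEOUT, 30);        // time allowed for the whole request

    $feed = curl_exec($curl);
    if ($feed === false) {
        // log and move on (or retry) instead of losing the whole run to one slow URL
        error_log("cURL error for $url: " . curl_error($curl));
    }
    curl_close($curl);
}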
The issue:
I'm working with PHP, cURL and a public API to fetch json strings.
These json strings are formatted like this (simplified, average size is around 50-60 kB):
{
"data": {},
"previous": "url",
"next": "url"
}
What I'm trying to do is fetch all the JSON strings, starting from the first one, by checking for the "next" attribute. So I have a while loop, and as long as there's a "next" attribute, I fetch the next URL.
The problem is that sometimes the loop randomly stops before the end, and after many tests I cannot figure out why.
I say randomly because sometimes the loop goes through to the end and no problem occurs. Sometimes it crashes after N loops.
And so far I couldn't extract any information to help me debug it.
I'm using PHP 7.3.0 and launching my code from the CLI.
What I tried so far:
Check the headers:
No headers are returned. Nothing at all.
Use curl_errno() and curl_error():
I tried the following code right after executing the request (curl_exec($ch)) but it never triggers.
if (curl_errno($ch)) {
    echo 'curl error ' . curl_error($ch) . PHP_EOL;
    echo 'response received from curl error :' . PHP_EOL;
    var_dump($response); // the json string I should get from the server.
}
Check if the response came back null:
if(is_null($response))
or if my json string has an error:
if(!json_last_error() == JSON_ERROR_NONE)
Though I think this check is useless, because the JSON will never be valid if the cURL response is null or empty. When this code triggers, the JSON error code is 3 (JSON_ERROR_CTRL_CHAR).
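Put together, the checks I run after each request look roughly like this (simplified):
if (curl_errno($ch)) {
    echo 'curl error ' . curl_error($ch) . PHP_EOL;
    var_dump($response);
}
$decoded = json_decode($response, true);
if (is_null($response) || json_last_error() !== JSON_ERROR_NONE) {
    // this is the branch that reports error code 3 (JSON_ERROR_CTRL_CHAR)
    var_dump($response);
}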
The problematic code:
function apiCall($url) {
    ...
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    return $response;
}
$inc = 0;
$url = 'https://api.example.com/' . $id;
$jsonString = apiCall($url);
if (!is_null($jsonString)) {
    file_put_contents('pathToDirectory/' . $id + $inc, $jsonString);
    $nextUrl = getNextUrl($jsonString);
    while ($nextUrl) {
        $jsonString = apiCall($url . '?page=' . $nextUrl);
        if (!is_null($jsonString)) {
            $inc++;
            file_put_contents('pathToDirectory/' . $id + $inc, $jsonString);
            $nextUrl = getNextUrl($jsonString);
        }
    }
}
What I'm expecting my code to do:
Not stop randomly, or at least give me a clear error code.
The problem is that the API could be returning an empty response, malformed JSON, or even a status code other than 200, and you would stop execution immediately.
Since you do not control the API responses, you know that it can fail randomly, and you do not have access to the API server logs (because you don't, do you?); you need to build some kind of resilience in your consumer.
Something very simple (you'd need to tune it up) could be
function apiCall( $url, $attempts = 3 ) {
    // ..., including setting "$headers"
    $ch = curl_init();
    curl_setopt( $ch, CURLOPT_URL, $url );
    curl_setopt( $ch, CURLOPT_HTTPHEADER, $headers );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );

    for ( $i = 0; $i < $attempts; $i ++ ) {
        $response  = curl_exec( $ch );
        $curl_info = curl_getinfo( $ch );

        if ( curl_errno( $ch ) ) {
            // log your error & try again
            continue;
        }
        // I'm only accepting 200 as a status code. Check with your API if there could be other possible "good" responses
        if ( $curl_info['http_code'] != 200 ) {
            // log your error & try again
            continue;
        }
        // everything seems fine, but the response is empty? not good.
        if ( empty( $response ) ) {
            // log your error & try again
            continue;
        }
        return $response;
    }

    return null;
}
This would allow you to do something like (pulled from your code):
$nextUrl = ''; // give $nextUrl an initial value before the first iteration
do {
    $jsonString = apiCall($url . '?page=' . $nextUrl);
    $nextUrl = false;
    if (!is_null($jsonString)) {
        $inc++;
        file_put_contents('pathToDirectory/' . $id + $inc, $jsonString);
        $nextUrl = getNextUrl($jsonString);
    }
} while ($nextUrl);
I'm not checking for the case where the return from the API is non-empty, has no connection error and a 200 status, and yet is still invalid JSON.
You may want to check for these things as well, depending on how brittle the API you are consuming is.
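If you want to cover the invalid-JSON case too, a minimal sketch of that extra check (using the standard json_decode() / json_last_error() functions, wrapped around the apiCall() result) could be:
$jsonString = apiCall($url . '?page=' . $nextUrl);
if (!is_null($jsonString)) {
    $decoded = json_decode($jsonString, true);
    if ($decoded === null && json_last_error() !== JSON_ERROR_NONE) {
        // log the raw body so you can see what the API actually sent, then retry or stop
        error_log('Invalid JSON received: ' . substr($jsonString, 0, 200));
        $jsonString = null;
    }
}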
I've been following this tutorial on how to find my membership id.
This part is what I'm stuck on. I have a simple PHP file using that code with the API key and correct settings. It's giving me a "Trying to get property of non-object" error. Here it is:
<?php
$apiKey = 'REMOVED FOR SECURITY';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://www.bungie.net/platform/destiny/1/Stats/GetMembershipIdByDisplayName/GAMERTAG REMOVED FOR SECURITY');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, array('X-API-Key: ' . $apiKey));
$json = json_decode(curl_exec($ch));
echo $json->Response;
?>
The link inside the code would usually have a gamertag, like 'MajorNelson' or something like that. It gives some error when you go to it, but that link doesn't matter. When hosting the PHP file using XAMPP, I get this error:
Notice: Trying to get property of non-object in E:\xampp\htdocs\bungieapi.php on line 9
Line 9 is the echo line before the ?>.
Check for errors:
// ...
$json = curl_exec($ch);
if ($json === false) {
    die('ERROR: ' . curl_error($ch));
}

$obj = json_decode($json);
if (isset($obj->Response)) {
    echo $obj->Response;
} else {
    echo 'ERROR: The "Response" property is not there';
}
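You could also look at the HTTP status code before decoding, e.g. (an optional extra check; CURLINFO_HTTP_CODE is standard cURL):
$json = curl_exec($ch);
$httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE); // 0 if the request never completed
if ($json === false || $httpCode !== 200) {
    die('ERROR: HTTP ' . $httpCode . ' - ' . curl_error($ch));
}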
I have a PHP file called testResponse.php which contains only:
<?php
sleep(5);
echo"go";
?>
Now, I'm calling this file from another page using file_get_contents, like this:
$start = microtime(true);
$opts = array('http' =>
    array(
        'method'  => 'GET',
        'timeout' => 1
    )
);
$context = stream_context_create($opts);
$loc = @file_get_contents("http://www.mywebsite.com/testResponse.php", false, $context);
$end = microtime(true);
echo $end - $start, "\n";
The output is more than 5 sec, which means that my timeout has been ignored...
I followed the advice of this post : stackoverflow.com/questions/3689371
But it seems that the hostname cannot be a path (like www.mywebsite.com/testResponse.php); it has to be just the hostname, like www.mywebsite.com.
So I'm stuck trying to achieve this goal:
Get the content of the page www.test.com/x.php with these constraints:
if test.com doesn't exist or the page x.php doesn't exist, return nothing quickly
if the page exists but takes more than 1 second to load, abort
otherwise, get the content of the file
Edit: By the way, it seems to work when I call this page (testResponse.php) from my local server. Well, it multiplies the timeout by 2. For instance, if I set the timeout to 1, something like "2.0054645" is echoed. But only locally...
The solution is to use PHP's cURL functions. The other question you linked to explains things properly, about the read timeouts vs. the connection timeouts, and so on, but neither of those are truly what you're looking for here. Even the connection timeout won't work, because the connection to testResponse.php is always successful; after that it's waiting, so what you need is an execution timeout. This is where cURL comes in handy.
So, testResponse.php doesn't need to be altered. In your main file, though, try the following code (this is tested and it works on my server):
$start = microtime(true);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.mywebsite.com/testResponse.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$output = curl_exec($ch);
$errno = curl_errno($ch);

if ($errno > 0) {
    if ($errno === 28) {
        echo "Connection timed out.";
    }
    else {
        echo "Error #" . $errno . ": " . curl_error($ch);
    }
}
else {
    echo $output;
}

$end = microtime(true);
echo "<br><br>" . ($end - $start);
curl_close($ch);
This sets the execution timeout of the cURL session, via the CURLOPT_TIMEOUT option you see on line 5. So, when the request times out, $errno will equal 28, the code for cURL's operation timeout error. The rest of the error codes are listed in the cURL documentation, so you can expand the script above to act accordingly.
Finally, because of the CURLOPT_RETURNTRANSFER option that's set, curl_exec($ch) will return the content of the retrieved page if the session succeeds. Otherwise, it will return false.
Hope this helps!
Edit: Removed the statement setting CURLOPT_HEADER. I also, for some reason, was under the impression that curl_exec($ch) set the value of $ch to the returned contents, forgetting that the contents are returned by curl_exec().
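If whole seconds are too coarse for the 1-second constraint, cURL also exposes millisecond variants of both timeout options (they may need a reasonably recent libcurl); for example:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 500); // give up if the connection isn't established within 0.5 s
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1000);       // abort the whole transfer after 1 s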
Hi there, I am having a strange problem with downloading a remote file. I am using the Flurry Data API to download some reports. The thing is, when we first call the Flurry API it gives us an XML/JSON response which contains the download path of the report, and it takes about 2 minutes for the report to get ready. That is where I'm having the problem.
I wrote a script which downloads the remote file if I just paste the report link directly into the download function, and it works like a charm. But I have to automate the downloading process, so I first call the API to get the report download link, then I use PHP's sleep() function to stop execution for about 3 minutes (I also tried 2). Then I call the same function I previously used to download the report successfully, and this time it doesn't work. Here is the file download method:
function get_file_and_save($file, $local_path, $newfilename) {
    $err_msg = '';
    $out = fopen($local_path . $newfilename . ".gz", 'wb');
    if ($out == FALSE) {
        print "File is not available<br>";
        exit;
    }
    $ch = curl_init();
    $headers = array('Content-type: application/x-gzip', 'Connection: Close');
    curl_setopt($ch, CURLOPT_FILE, $out);
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($ch, CURLOPT_URL, $file);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
    curl_exec($ch);
    echo "<br>Error Occured:" . curl_error($ch);
    curl_close($ch);
}
I have also tried setting CURLOPT_TIMEOUT, but that wasn't working either.
Here is the code for the request to Flurry. Note that download_path() works properly; it just gets the report link:
$query_url = 'http://api.flurry.com/rawData/Events?apiAccessCode=' . $ACCESS_CODE . '&apiKey=' . $API_KEY . '&startTime=' . $start_time . '&endTime=' . $end_time;
$response = download_path($query_url);
if ($response) {
    $response_obj = json_decode($response, true);
    if (isset($response_obj['report']['#reportUri'])) {
        $report_link = $response_obj['report']['#reportUri'];
    }
    if (isset($response_obj['report']['#reportId'])) {
        $report_id = $response_obj['report']['#reportId'];
    }
    if (isset($report_link) && !empty($report_link)) {
        echo "<br >Report Link: " . $report_link . "<br >";
        sleep(240);
        $config = array(
            'http' => array(
                'header' => 'Accept: application/json',
                'method' => 'GET'
            )
        );
        $stream = stream_context_create($config);
        $json = file_get_contents($report_link, false, $stream);
        echo "<pre>";
        print_r($http_response_header);
        echo "</pre>";
        if ($http_response_header[3] == "Content-Type: application/octet-stream") {
            get_file_and_save($report_link, "data-files/gz/", $current_time);
        }
    }
    else {
        echo "There was some error in downloading report";
    }
} else {
    $error = true;
    echo "There was some error in generating report";
}
Is there some problem with sleep(), or what? I'm stuck; it's been my second night and I'm unable to achieve this.
Check if your PHP script is timing out and being killed off. Both the webserver and PHP have maximum execution limits to prevent runaway scripts, and if your sleep surpasses such a limit, the script will never continue beyond it (see the links and the short sketch below).
http://www.php.net/manual/en/info.configuration.php#ini.max-execution-time
http://php-fpm.org/wiki/Configuration_File - request_terminate_timeout
http://nginx.org/en/docs/http/ngx_http_fastcgi_module.html#fastcgi_read_timeout
http://httpd.apache.org/docs/2.0/mod/core.html#timeout
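A quick way to inspect and, if needed, raise the PHP-side limit from inside the script (the 600 is just an example value; the webserver-side limits in the links above live in their own config files):
echo ini_get('max_execution_time') . PHP_EOL; // current PHP limit in seconds (0 = unlimited, the CLI default)
set_time_limit(600);                          // raise the PHP limit for the rest of this request
// The webserver/FastCGI timeouts (request_terminate_timeout, fastcgi_read_timeout, Apache's Timeout)
// are wall-clock limits configured outside PHP and are not affected by set_time_limit().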
I want to run a GET cURL request in PHP A to get data from PHP B.
This is an example in PHP A (I got it from here: http://support.qualityunit.com/061754-How-to-make-REST-calls-in-PHP):
//next example will receive all messages for specific conversation
$service_url = 'http://localhost/test/getFrom.php?id=1';
$curl = curl_init($service_url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$curl_response = curl_exec($curl);
if ($curl_response === false) {
    $info = curl_getinfo($curl);
    curl_close($curl);
    die('error occurred during curl exec. Additional info: ' . var_export($info, true));
}
curl_close($curl);
$decoded = json_decode($curl_response);
if (isset($decoded->response->status) && $decoded->response->status == 'ERROR') {
    die('error occurred: ' . $decoded->response->errormessage);
}
echo 'response ok!';
var_export($decoded->response);
And I tried this example as well (Trying to use curl to do a GET, value being sent is always null).
In PHP B:
It will get the ID, run some script, and generate an ARRAY.
I want to get this ARRAY from B to A.
B will run only when A makes a GET request to B.
The problem is I don't know how the ARRAY can be passed from B to A.
Please give some advice. Thank you.
The code you provided is expecting back a JSON encoded array. The easiest way to do this is simply JSON encode your array in PHP B and echo it to the page.
Then cURL will be able to read the output of PHP B, and you can decode and process it as required.
// PHP B
<?php
// Check for $_GET params
// Get ID
$id = $_GET['id'];
// Do processing, query etc
....
// Format and display array as JSON
echo(json_encode($result_array));
die();
?>
Also note:
if (isset($decoded->response->status) && $decoded->response->status == 'ERROR') {
    die('error occurred: ' . $decoded->response->errormessage);
}
The code is expecting the array to be formatted in a particular way. So either match your array in PHP B to the same format, or update the code to your needs.
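For example, if PHP B just echoes json_encode($result_array) as above, the decode step in PHP A would look more like this sketch (no "response" wrapper assumed):
$decoded = json_decode($curl_response, true); // true => decode into a plain PHP array
if ($decoded === null) {
    die('error occurred: response was not valid JSON');
}
print_r($decoded); // the array produced by PHP B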