cURL: Checking for Timeout - PHP

I know how to set the timeout in cURL, but I want to alert the user that the request timed out.
I have created an AJAX script that lets the user request data from various insurance sites and aggregates the results into a list. If any of the insurance sites fails to respond within a certain time, I want to alert the user that the current quote from that company is not available at the moment.
Does cURL return anything to signal a timeout?

curl_errno() returns 28 if the operation timed out. See http://curl.haxx.se/libcurl/c/libcurl-errors.html for other error codes.
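For example, a minimal sketch of checking for error 28 after the request runs (the URL and the 5-second limit are placeholders):

$ch = curl_init('https://example.com/quote'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
$body = curl_exec($ch);
if (curl_errno($ch) === 28) { // 28 = CURLE_OPERATION_TIMEDOUT
    echo 'The quote from this company is not available at the moment.';
}
curl_close($ch);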

Another solution that covers even more cases (the server timed out, or the server errored out with a blank page) is to check whether the result of your get_url function is an empty string or FALSE.
Example get_url() function:
function get_url($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);             // exclude headers from the output
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);     // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);            // give up after 5 seconds
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE); // note: disables SSL certificate checks
    $tmp = curl_exec($ch);
    curl_close($ch);
    return $tmp;
}
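A minimal usage sketch of the check described above (the URL is a placeholder):

$quote = get_url('https://example.com/quote'); // placeholder URL
if ($quote === FALSE || $quote === '') {
    echo 'The quote from this company is not available at the moment.';
}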

Related

cURL request not working with the HasOffers API

I am using cURL to hit the HasOffers conversion URL from my server, but it is not working. When I call the same URL from a browser, it works. Can they block cURL requests? I do not understand why; is there a port-blocking issue?
Below is the PHP code that calls the URL using cURL.
<?php
function curl_get_contents($url)
{
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}

$url = "http://paravey.go2cloud.org/aff_l?offer_id=12&aff_id=1000";
$contents = curl_get_contents($url);
echo $contents;
?>
Please help me. Thanks in advance.
The URL you are curling is a pixel-tracking URL:
http://paravey.go2cloud.org/aff_l?offer_id=12&aff_id=1000
The aff_l endpoint looks for a cookie with session information (hence why it works in the browser).
If you want to create conversions with server-side code, you will need to store the session identifier (the transaction_id) in your system and use the aff_lsr endpoint to send that data to HasOffers to trigger a conversion.
The URL for this would look like this:
http://paravey.go2cloud.org/aff_lsr?transaction_id=VALUE
where VALUE is the session identifier you have stored.
I would ask the HasOffers support team if you have more issues with this.
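A hypothetical sketch of that server-side postback, reusing the curl_get_contents() function above and assuming $transactionId was stored when the user first arrived via the tracking link:

$transactionId = 'abc123'; // placeholder for the stored session identifier
$url = 'http://paravey.go2cloud.org/aff_lsr?transaction_id=' . urlencode($transactionId);
echo curl_get_contents($url);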

Force cURL request to not wait for a response in PHP

EDIT: The proper thing to do is just to send a response from Node-RED, as hardillb pointed out below.
My cURL request works fine and instantly, but I simply need the page to visit the URL and not wait around for a response. I have tried every combination I can think of, and my browser still sits waiting for a server response until it times out.
$url = 'http://example.com:1880/get?temperature=' . $temperature;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 1); // give up connecting after 1 ms
curl_setopt($ch, CURLOPT_NOBODY, true);         // send a HEAD request, skip the body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
// execute and fetch the resulting output
$output = curl_exec($ch);
// free up the curl handle
curl_close($ch);
As mentioned in the comments, the correct solution is to ensure your http-in node is paired with an http-response node in your Node-RED flow.
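If the server side cannot be changed, a common workaround is to set a very short overall timeout and deliberately ignore the resulting timeout error; a minimal sketch, assuming $temperature is set as above and the response body is not needed:

$ch = curl_init('http://example.com:1880/get?temperature=' . $temperature);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, true);  // required for sub-second timeouts on some builds
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give the request 100 ms, then move on
curl_exec($ch);                            // error 28 (timeout) is expected and ignored here
curl_close($ch);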

PHP: Call URL with a timeout?

I want to call a certain URL in my PHP script, and if I do not get a response after 10 s, the script should just continue. Does anyone know how I can do that?
I have only found two ways. One is fopen(), which stalls my script if it gets no response, and the other is cURL, which just calls the URL without waiting for/getting a response?
But how can I say: try to get the content of the URL, and if you do not get anything after 10 s, continue and ignore the URL?
You can use cURL.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "url");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // give up after 10 seconds
$response = curl_exec($ch);
curl_close($ch);
If a timeout occurs, $response will contain the boolean value FALSE.
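To continue past a URL that timed out, check for that FALSE value; a minimal sketch:

if ($response === false) {
    // Timed out (or another cURL error); ignore this URL and carry on.
} else {
    // Process $response as usual.
}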

Process Timing when Scraping

I have a site that scrapes its sister sites, but for reporting reasons I'd like to be able to work out how long the task took to run. How would I approach this with PHP? Is it even possible?
In an ideal world, if the task couldn't connect after 5 seconds, I'd like to kill the function and report the failure.
Thank you all!
If you use cURL for scraping, you can set a timeout like this:
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options, including the timeout
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the result in a string
curl_setopt($ch, CURLOPT_TIMEOUT, 5); // the maximum number of seconds the whole transfer may take
// (use CURLOPT_CONNECTTIMEOUT instead to bound just the connection phase)
// grab the info
if (!$result = curl_exec($ch))
{
    trigger_error(curl_error($ch));
}
// close cURL resource, and free up system resources
curl_close($ch);
// process the $result
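To measure how long the scrape took for reporting, wrap the curl_exec() call with microtime(true); a minimal sketch of that change:

$start = microtime(true);
$result = curl_exec($ch); // same call as above
$elapsed = microtime(true) - $start;
// cURL also tracks this itself: curl_getinfo($ch, CURLINFO_TOTAL_TIME)
printf("Scrape took %.2f seconds\n", $elapsed);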

CURL: Code 0 from proxy after CONNECT?

I am trying to test the function below, but every time I use any sort of proxy IP (I have tried about 15 now), I generally get the same error:
Received HTTP code 0 from proxy after CONNECT
Here is the function; is anything wrong with it? It could just be the proxies I am using, but I have tried several times now.
function getPage($proxy, $url, $referer, $agent, $header, $timeout)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, $header);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1); // tunnel through the proxy with CONNECT
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_REFERER, $referer);
    curl_setopt($ch, CURLOPT_USERAGENT, $agent);
    $result['EXE'] = curl_exec($ch);
    $result['INF'] = curl_getinfo($ch);
    $result['ERR'] = curl_error($ch);
    curl_close($ch);
    return $result;
}
Also, in general, is there any way I can improve it?
I appreciate all help.
Update
As I submitted this, I tried another proxy and it worked!
The other question still stands: how can I improve the above? It takes about 3-4 seconds to execute. Is there anything I can do, or is this already about as fast as it gets?
I know you sort of answered your first problem, but code 0 is not a valid HTTP status code. They should all begin with 1 (informational), 2 (success), 3 (redirection), 4 (client error), or 5 (server error). I would be really interested if anyone knows why you might get this code. Searching the libcurl site didn't bring anything up.
(More detailed information is here if you are interested:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html)
For the second question, I think you would need to find where the longest operation is. The microtime() function might be useful to you here; its documentation has some example scripts to help you use the timer.
I suspect, though, that most of the 3-4 seconds is spent waiting for the response via the proxy in curl_exec($ch).
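Since getPage() already returns curl_getinfo() in $result['INF'], you can break down where those seconds go without extra instrumentation; a minimal sketch (the argument values are placeholders):

$r = getPage($proxy, $url, $referer, $agent, 0, 10);
$info = $r['INF'];
printf("DNS lookup: %.3fs\n", $info['namelookup_time']);
printf("Proxy connect: %.3fs\n", $info['connect_time']);
printf("Time to first byte: %.3fs\n", $info['starttransfer_time']);
printf("Total: %.3fs\n", $info['total_time']);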
