I have a URL that can be accessed via HTTP or HTTPS. I want to send a HEAD or GET request, whichever is faster, and read the response code so I know whether the URL is up or down.
How do I do that using Zend_Http_Client? I tried the get_headers() function, but it is very slow against some remote servers, and I'm not sure whether it handles HTTPS.
You may not want to use Zend_Http_Client for this at all; native PHP functions (like fsockopen) may serve you better, since it seems you want this to be efficient.
That said, the following may work for you (and since Zend_Http_Client defaults to the socket adapter, it may not be much less efficient than the native functions):
$client = new Zend_Http_Client();
$response = $client->setUri($uri)->request(Zend_Http_Client::HEAD);
If that's not fast enough, you could try the cURL adapter and set its options manually:
$adapter = new Zend_Http_Client_Adapter_Curl();
$adapter->setCurlOption(CURLOPT_NOBODY, true);
$client = new Zend_Http_Client();
$client->setAdapter($adapter);
$response = $client->setUri($uri)->request(Zend_Http_Client::HEAD);
The code's not tested. Use at your own risk.
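Whichever adapter you use, once you have the Zend_Http_Response you can read the status code to decide whether the URL is up. A rough sketch (treating anything below 400 as "up" is my own assumption; connection failures surface as exceptions):
try {
    $client = new Zend_Http_Client($uri);
    $response = $client->request(Zend_Http_Client::HEAD);
    // getStatus() returns the numeric HTTP status code (200, 404, ...).
    $isUp = $response->getStatus() < 400;
} catch (Exception $e) {
    // DNS failures, refused connections and timeouts end up here.
    $isUp = false;
}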
I'm planning to use PHP for a simple requirement. I need to download XML content from a URL, for which I need to send an HTTP GET request to that URL.
How do I do it in PHP?
Unless you need more than just the contents of the file, you could use file_get_contents.
$xml = file_get_contents("http://www.example.com/file.xml");
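Note that file_get_contents() returns false on failure (and emits a warning), so a minimal sketch of an error check could look like this:
$xml = @file_get_contents("http://www.example.com/file.xml");
if ($xml === false) {
    // Handle the failure (DNS error, timeout, non-2xx response, ...).
}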
For anything more complex, I'd use cURL.
For more advanced GET/POST requests, you can use the cURL extension (http://us3.php.net/curl):
$ch = curl_init("REMOTE XML FILE URL GOES HERE"); // such as http://example.com/example.xml
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
$data = curl_exec($ch);
curl_close($ch);
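The snippet above does not check for failures. curl_exec() returns false on a transport-level error, and curl_getinfo() exposes the HTTP status code, so a sketch of the same request with basic error handling might be:
$ch = curl_init("http://example.com/example.xml");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
if ($data === false) {
    $error = curl_error($ch);                          // e.g. DNS failure, timeout
} else {
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // e.g. 200, 404, 500
}
curl_close($ch);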
http_get (from the PECL pecl_http extension) should do the trick. Its advantages over file_get_contents include the ability to view HTTP headers, access request details, and control the connection timeout.
$response = http_get("http://www.example.com/file.xml");
Remember that if you are using a proxy, you need a little trick in your PHP code (an example for a proxy without authentication):
<?php
$aContext = array(
    'http' => array(
        'proxy'           => 'proxy:8080',
        'request_fulluri' => true,
    ),
);
$cxContext = stream_context_create($aContext);
$sFile = file_get_contents("http://www.google.com", false, $cxContext);
echo $sFile;
?>
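If the proxy requires authentication, the usual approach for a plain HTTP proxy is to add a Proxy-Authorization header to the same context. A sketch with placeholder credentials (the PHP documentation gives the proxy address in tcp:// form):
<?php
$auth = base64_encode('user:password');
$aContext = array(
    'http' => array(
        'proxy'           => 'tcp://proxy:8080',
        'request_fulluri' => true,
        'header'          => "Proxy-Authorization: Basic $auth\r\n",
    ),
);
$cxContext = stream_context_create($aContext);
$sFile = file_get_contents("http://www.google.com", false, $cxContext);
echo $sFile;
?>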
Guzzle is a well-known library that makes it extremely easy to make all sorts of HTTP calls. See https://github.com/guzzle/guzzle. Install it with composer require guzzlehttp/guzzle; the code below is then enough for an HTTP GET call.
$client = new \GuzzleHttp\Client();
$response = $client->get('https://example.com/path/to/resource');
echo $response->getStatusCode();
echo $response->getBody();
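Keep in mind that, with its default settings, Guzzle throws exceptions for connection failures and for 4xx/5xx responses, so you may want to wrap the call. A minimal sketch:
$client = new \GuzzleHttp\Client();
try {
    $response = $client->get('https://example.com/path/to/resource');
    echo $response->getBody();
} catch (\GuzzleHttp\Exception\GuzzleException $e) {
    // Connection problems and (by default) 4xx/5xx responses are caught here.
    echo 'Request failed: ' . $e->getMessage();
}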
If your PHP setup allows URL wrappers in fopen (the allow_url_fopen setting), you could also simply fopen() the URL with the GET arguments in the query string (such as http://example.com?variable=value).
Edit: Re-reading the question, I'm not certain whether you're looking to pass variables or not. If you're not, you can simply send the fopen request containing http://example.com/filename.xml and ignore the variable=value part.
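For completeness, a minimal sketch of reading the URL with fopen() (this assumes allow_url_fopen is enabled):
$fp = fopen('http://example.com/filename.xml', 'r');
if ($fp !== false) {
    $xml = stream_get_contents($fp);
    fclose($fp);
}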
I like using fsockopen() for this.
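A rough sketch of what that can look like for a plain-HTTP GET (no HTTPS, redirects, or chunked-transfer handling; the host and path are placeholders):
$fp = fsockopen('example.com', 80, $errno, $errstr, 10);
if ($fp) {
    $request  = "GET /filename.xml HTTP/1.0\r\n";
    $request .= "Host: example.com\r\n";
    $request .= "Connection: close\r\n\r\n";
    fwrite($fp, $request);

    $raw = '';
    while (!feof($fp)) {
        $raw .= fread($fp, 8192);
    }
    fclose($fp);

    // Split the headers from the body.
    list($headers, $body) = explode("\r\n\r\n", $raw, 2);
}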
On the other hand, consuming servers' REST APIs is very popular in PHP. You can treat such URLs as part of a REST API and use one of the many well-designed PHP packages.
A REST API is, after all, just a way to consume a site's services, and many PHP packages have been developed to simplify REST API calls. For example, here is a very nice one:
https://github.com/romanpitak/PHP-REST-Client
Using such packages makes it easy to fetch resources. Getting the XML file you mentioned is as easy as:
$client = new Client('http://example.com');
$request = $client->newRequest('/filename.xml');
$response = $request->getResponse();
echo $response->getParsedResponse();
I have been using cURL for making network calls, and I can use the CURLOPT_COOKIEJAR option to use cookies for my requests.
Recently I've started using Zend_Rest_Client, and I'm not sure how to do this there. Can someone show me the best way to do it?
Zend_Http_Client can handle a cookie jar using Zend_Http_CookieJar. I think Zend_Rest_Client relies on Zend_Http_Client, so you should be able to use a cookie jar:
$client = new Zend_Http_Client();
$cookieJar = new Zend_Http_CookieJar();
$client->setCookieJar($cookieJar);
Zend_Rest_Client::setHttpClient($client);
This is Zend Framework code. How do I convert it to plain PHP?
$client = new Zend_Http_Client();
$client->setUri('http://someurl.com/somepage');
$request = $client->request();
if ($request->isSuccessful()) {
    // do stuff with the result
}
You'll want to look at the cURL functions.
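For example, a rough cURL equivalent of the snippet above might look like this (the isSuccessful() check is approximated by testing for a 2xx status code):
$ch = curl_init('http://someurl.com/somepage');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($body !== false && $status >= 200 && $status < 300) {
    // do stuff with $body
}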
$page = file_get_contents('http://someurl.com/somepage');
In PHP you can just grab the contents of a URI with file_get_contents($url), or even fopen($url, 'r'), etc.
You can set stream parameters using stream_context_create(), then use fopen() on that context, if you want to control things like the HTTP request method, user agent, proxy, timeouts, what to do with errors etc.
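A minimal sketch of such a context (the header, user agent, and timeout values are only placeholders; file_get_contents() accepts the same context as fopen()):
$context = stream_context_create(array(
    'http' => array(
        'method'        => 'GET',
        'header'        => "Accept: application/xml\r\n",
        'user_agent'    => 'MyClient/1.0',
        'timeout'       => 10,      // seconds
        'ignore_errors' => true,    // return the body even on 4xx/5xx responses
    ),
));
$page = file_get_contents('http://someurl.com/somepage', false, $context);
// $http_response_header now contains the response headers, including the status line.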
cURL is an alternative method that can also give you greater control - but then, that adds another dependency.
I'm using curl_multi with anonymous proxies, and I want to flag the proxies based on performance, location, etc., after the cURL handle is returned. I've tried curl_getinfo(), but that does not return information about the proxy used for that handle.
Any ideas? I've thought about identifying each handle and storing that together with the proxy used, so that when the handle has fired off and come back via curl_multi_info_read() I can look up the proxy for that handle. I'm not sure what to use as an identifier, though. Doing a dump shows the handle as resource(20), but I'm not sure that's something I can rely on.
I guess something like a getOpt() for cURL handles would be ideal, but I don't see anything like that from the research I have done.
Check the latest version of the MultiRequest library. There you can do something like this:
$request = new MultiRequest_Request($url);
$request->setCurlOption(CURLOPT_PROXY, $proxy);
// ...
$curlOptions = $request->getCurlOptions();
list($proxyIp, $proxyPort) = explode(':', $curlOptions[CURLOPT_PROXY]);
I found a parallel cURL class (by Pete Warden) that passes data for multi-cURL using the following:
$this->outstanding_requests[$ch] = array(
    'url'       => $url,
    'callback'  => $callback,
    'user_data' => $user_data,
    'proxy'     => $proxy
);
When the multi-cURL run is done, the class can use the cURL handle to look up that information in the outstanding-requests array. If you're interested in multi-cURL, check out the class; it sets everything up for you and is very customizable.
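If you would rather not pull in a library, the same idea works with your own bookkeeping: keep a map from each handle to its metadata and look it up when curl_multi_info_read() hands the handle back. A rough sketch (on PHP 5/7 a cURL handle is a resource, so casting it to int gives a stable ID to use as the array key; on PHP 8, where handles are CurlHandle objects, spl_object_id() plays the same role; $urlsWithProxies is a placeholder):
$mh = curl_multi_init();
$meta = array();    // handle ID => metadata for that request

foreach ($urlsWithProxies as $url => $proxy) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_multi_add_handle($mh, $ch);
    $meta[(int) $ch] = array('url' => $url, 'proxy' => $proxy, 'start' => microtime(true));
}

$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
    while ($info = curl_multi_info_read($mh)) {
        $ch   = $info['handle'];
        $item = $meta[(int) $ch];
        $took = microtime(true) - $item['start'];
        // Flag $item['proxy'] using $info['result'], $took, curl_getinfo($ch), ...
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
} while ($running > 0);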