I want to send a raw HTTP request to a webserver and receive its response, but I can't figure out how to do it. I'm inexperienced with sockets, and every link I find uses sockets to send UDP packets. Any help would be great.
Take a look at this simple example from the fsockopen manual page:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
The connection to the server is established with fsockopen. $out holds the HTTP request, which is then sent with fwrite. The HTTP response is then read with fgets.
If all you want to do is perform a GET request and receive the body of the response, most of the file functions accept URLs:
<?php
$html = file_get_contents('http://google.com');
?>
<?php
$fh = fopen('http://google.com', 'r');
$html = '';
while (!feof($fh)) {
    $html .= fread($fh, 8192);
}
fclose($fh);
?>
For more than simple GETs, use cURL (the cURL extension has to be compiled into or enabled in your PHP build). With cURL you can do POST and HEAD requests, as well as set various headers.
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
?>
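For example, a POST request might look like this (the endpoint and form fields below are placeholders, not part of the original example):
<?php
$ch = curl_init('http://www.example.com/submit'); // hypothetical endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('name' => 'value')); // hypothetical form fields
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
?>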
cURL is easier than implementing client side HTTP. All you have to do is set a few options and cURL handles the rest.
$curl = curl_init($URL);
curl_setopt_array($curl, array(
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (PLAYSTATION 3; 2.00)',
    CURLOPT_HTTPAUTH       => CURLAUTH_ANY,
    CURLOPT_USERPWD        => 'User:Password',
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true
    // set CURLOPT_HEADER to true if you want headers in the result.
));
$result = curl_exec($curl);
If you need to set a header that cURL doesn't have a dedicated option for, use the CURLOPT_HTTPHEADER option, passing an array of additional headers. Set CURLOPT_HEADERFUNCTION to a callback if you need to parse response headers; see the sketch below. Read the docs for curl_setopt for more options.
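A minimal sketch of both options (the URL and header values are arbitrary examples):
<?php
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// extra request headers cURL has no dedicated option for
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'X-Requested-With: XMLHttpRequest',
    'Accept-Language: en'
));
// called once for each response header line
curl_setopt($ch, CURLOPT_HEADERFUNCTION, function ($ch, $line) {
    echo $line;
    return strlen($line); // must return the number of bytes handled
});
$body = curl_exec($ch);
curl_close($ch);
?>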
Related
I want to parse a lot of URLs to only get their status codes.
So what I did is:
$handle = curl_init($url->loc);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($handle, CURLOPT_HEADER , true); // we want headers
curl_setopt($handle, CURLOPT_NOBODY , true);
curl_setopt($handle, CURLOPT_SSL_VERIFYPEER, false);
$response = curl_exec($handle);
$httpCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);
curl_close($handle);
But as soon as the "nobody"-option is set to true, the returned status codes are incorrect (google.com returns 302, other sites return 303).
Setting this option to false is not possible because of the performance loss.
Any ideas?
The default HTTP request method for curl is GET. If you want only the response headers, you can use the HTTP method HEAD.
curl_setopt($handle, CURLOPT_CUSTOMREQUEST, 'HEAD');
As noted in @Dai's answer, CURLOPT_NOBODY already makes cURL use the HEAD method, so the approach above will not work.
Another option is to use fsockopen to open a connection and write the request headers with fwrite, then read the response with fgets until the first occurrence of \r\n\r\n to get the complete header. Since you need only the status code, you just have to read the first 13 characters.
<?php
$fp = fsockopen("www.google.com", 80, $errno, $errstr, 30);
if ($fp) {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.google.com\r\n";
    $out .= "Accept-Encoding: gzip, deflate, sdch\r\n";
    $out .= "Accept-Language: en-GB,en-US;q=0.8,en;q=0.6\r\n";
    $out .= "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36\r\n";
    $out .= "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    $tmp = explode(' ', fgets($fp, 13));
    echo $tmp[1];
    fclose($fp);
}
cURL's CURLOPT_NOBODY option makes it use the HEAD HTTP verb. I'd wager the majority of non-static web applications in the wild don't handle this verb correctly, hence the differing results you're seeing. I suggest making a normal GET request and discarding the response.
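A minimal sketch of that approach (the URL is a placeholder):
<?php
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // fetch the body, then simply ignore it
curl_exec($ch);
$httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
echo $httpCode;
?>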
I suggest get_headers() instead:
<?php
$url = 'http://www.example.com';
print_r(get_headers($url));
print_r(get_headers($url, 1));
?>
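Since you only want the status codes, note that the first element of the returned array is the status line; a minimal sketch:
<?php
$headers = get_headers('http://www.example.com'); // [0] is e.g. "HTTP/1.1 200 OK"
$parts = explode(' ', $headers[0]);
echo $parts[1]; // 200
?>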
After trying all night without any success, this is the code I have; it should work, but it doesn't:
<?php
// Get cURL resource
$curl = curl_init();
// Set some options - we are passing in a useragent too here
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => 'http://api.keynote.com/keynote/',
    CURLOPT_USERAGENT => 'Codular Sample cURL Request'
));
// Send the request & save response to $resp
$resp = curl_exec($curl);
if ($resp === false) {
    die('Error: "' . curl_error($curl) . '" - Code: ' . curl_errno($curl));
}
// Close request to clear up some resources
curl_close($curl);
echo $resp;
?>
The error that I am getting is this:
Error: "Failed connect to api.keynote.com:80; No error" - Code: 7
On the server, I can manually bring up this url with any browser without any problems.
How do I make PHP connect to the internet?
The thing is that fsockopen is used for opening sockets, i.e. connecting to a specified port on a specified host/IP.
When you try to open a socket to the host "http://google.com", it is like running "ping http://google.com" - you will get an error, because "http://google.com" is not a valid hostname (the scheme is not part of the host).
What you should do is use http_get (from the pecl_http extension) or cURL:
<?php
$response = http_get("http://www.example.com/", array("timeout"=>1), $info);
print_r($info);
?>
or remove the "http://"
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
Scenario: I have a C# .NET web page. I want the user to be able to download a file placed on a remote server from a link on my page, but the download should put minimal load on my server. Hence I tried creating an HttpWebRequest instance and passed it the download.php path,
e.g.
HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create("http://servername/download.php");
myHttpWebRequest.Headers.Add("Content-disposition", "attachment;filename=XXX.pdf");
myHttpWebRequest.ContentType = "application/pdf";
I passed the HttpWebRequest object in the session; however, when reading the HttpWebResponse on another page, the content type is reset to "text/html".
Also, the PHP file reads the headers and uses a readfile() call to serve the file, which gives the following error: Warning: readfile() [function.readfile]: URL file-access is disabled in the server configuration in
I don't entirely understand the scenario, but on the PHP side, if fopen() URL access is disabled, your next port of call should be the curl family of functions. (Or, of course, activate URL access using the allow_url_fopen php.ini option, but it sounds like you can't do that.)
The text/html header is probably due to the download failing.
A very rudimentary example:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch); // $result will contain the contents of the request
curl_close($ch);
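Applied to the download scenario above, download.php could fetch the remote file with cURL and send the download headers itself; a rough sketch (the remote URL is a placeholder, the headers mirror the ones from the question):
<?php
$ch = curl_init('http://remote-server/files/XXX.pdf'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="XXX.pdf"');
header('Content-Length: ' . strlen($data));
echo $data;
?>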
You can get around allow_url_fopen restrictions by using fsockopen. Here's a (rudimentary) implementation:
function fsock_get_contents($url) {
    // Accept either a bare host ("www.google.com") or a full URL
    $host = parse_url($url, PHP_URL_HOST);
    $path = $host ? (parse_url($url, PHP_URL_PATH) ?: '/') : '/';
    if (!$host) {
        $host = $url; // the input was a bare hostname
    }
    $fp = fsockopen($host, 80, $errno, $errstr, 20);
    if (!$fp) {
        echo "$errstr ($errno)<br />\n";
        return false;
    }
    $out = "GET $path HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    $contents = '';
    while (!feof($fp)) {
        $contents .= fgets($fp, 128);
    }
    fclose($fp);
    return $contents;
}
echo fsock_get_contents('www.google.com');
I'm building a cURL web automation app and am having an issue with not getting the desired outcome from my POST action. I'm having trouble figuring out how to show the full POST request I am sending (with headers). Everything that comes up when I search is about the response headers - I want those too, but I also want the request headers, which none of the posts I find on Google seem to mention.
I know I can display the result of a cURL request using something like this (forgive me if my syntax is off; I already shut down the virtual machine with my IDE and the code to refer to):
$result = curl($curl_exect) ;
Anyway, I would greatly appreciate any advice on how to view the full headers. Thanks.
Here is all you need:
curl_setopt($curlHandle, CURLINFO_HEADER_OUT, true); // enable tracking
... // do curl request
$headerSent = curl_getinfo($curlHandle, CURLINFO_HEADER_OUT ); // request headers
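Put together, a minimal runnable sketch (the URL is a placeholder):
<?php
$curlHandle = curl_init('http://www.example.com/');
curl_setopt($curlHandle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curlHandle, CURLINFO_HEADER_OUT, true); // enable tracking of the outgoing request
curl_exec($curlHandle);
$headerSent = curl_getinfo($curlHandle, CURLINFO_HEADER_OUT); // the request headers that were sent
echo $headerSent;
curl_close($curlHandle);
?>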
You can see the information regarding the transfer by doing:
curl_setopt($curl_exect, CURLINFO_HEADER_OUT, true);
before the request, and
$information = curl_getinfo($curl_exect);
after the request
View: http://www.php.net/manual/en/function.curl-getinfo.php
You can also use the CURLOPT_HEADER in your curl_setopt
curl_setopt($curl_exect, CURLOPT_HEADER, true);
$httpcode = curl_getinfo($curl_exect, CURLINFO_HTTP_CODE);
return $httpcode == 200;
These are just some methods of using the headers.
You can save all headers sent by cURL to a file using:
$f = fopen('request.txt', 'w');
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $f);
You can also build the request headers yourself using:
// open a socket connection on port 80
$fp = fsockopen($host, 80);
// send the request headers:
fputs($fp, "POST $path HTTP/1.1\r\n");
fputs($fp, "Host: $host\r\n");
fputs($fp, "Referer: $referer\r\n");
fputs($fp, "Content-type: application/x-www-form-urlencoded\r\n");
fputs($fp, "Content-length: ". strlen($data) ."\r\n");
fputs($fp, "Connection: close\r\n\r\n");
fputs($fp, $data);
$result = '';
while (!feof($fp)) {
    // receive the results of the request
    $result .= fgets($fp, 128);
}
// close the socket connection:
fclose($fp);
This follows the usual pattern for making a raw HTTP request by hand.
I had exactly the same problem lately, so I installed Wireshark (a network monitoring tool). With it you can see everything except encrypted traffic (HTTPS).
In PHP, how can I replicate the expand/contract feature for Tinyurls as on search.twitter.com?
If you want to find out where a tinyurl is going, use fsockopen to get a connection to tinyurl.com on port 80, and send it an HTTP request like this
GET /dmsfm HTTP/1.0
Host: tinyurl.com
The response you get back will look like
HTTP/1.0 301 Moved Permanently
Connection: close
X-Powered-By: PHP/5.2.6
Location: http://en.wikipedia.org/wiki/TinyURL
Content-type: text/html
Content-Length: 0
Date: Mon, 15 Sep 2008 12:29:04 GMT
Server: TinyURL/1.6
example code...
<?php
$tinyurl = "dmsfm";
$fp = fsockopen("tinyurl.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET /$tinyurl HTTP/1.0\r\n";
    $out .= "Host: tinyurl.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $response = "";
    fwrite($fp, $out);
    while (!feof($fp)) {
        $response .= fgets($fp, 128);
    }
    fclose($fp);
    //now parse the Location: header out of the response
}
?>
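Parsing the Location: header out of $response could then look like this (a minimal sketch, assuming the header name appears exactly as "Location:"):
<?php
if (preg_match('/^Location:\s*(.*)$/mi', $response, $matches)) {
    echo trim($matches[1]); // e.g. http://en.wikipedia.org/wiki/TinyURL
}
?>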
And here is how to contract an arbitrary URL using the TinyURL API. The general call pattern looks like this; it's a simple HTTP request with a parameter:
http://tinyurl.com/api-create.php?url=http://insertyourstuffhere.com
This will return the corresponding TinyURL for http://insertyourstuffhere.com. In PHP, you can wrap this in an fsockopen() call or, for convenience, just use the file() function to retrieve it:
function make_tinyurl($longurl)
{
    return implode('', file(
        'http://tinyurl.com/api-create.php?url=' . urlencode($longurl)));
}
// make an example call
print(make_tinyurl('http://www.joelonsoftware.com/items/2008/09/15.html'));
As people have answered how to programmatically create and resolve tinyurl.com redirects, I'd like to (strongly) suggest something: caching.
In the Twitter example, if every time you clicked the "expand" button it did an XmlHttpRequest to, say, /api/resolve_tinyurl/http://tinyurl.com/abcd, and the server then opened an HTTP connection to tinyurl.com and inspected the header, it would hammer both Twitter's and TinyURL's servers.
A far more sensible method would be something like this Python-ish pseudocode:
def resolve_tinyurl(url):
    key = md5(url.lower_case())
    if cache.has_key(key):
        return cache[key]
    else:
        resolved = query_tinyurl(url)
        cache[key] = resolved
        return resolved
Where cache's items magically get backed up into memory, and/or a file, and query_tinyurl() works as Paul Dixon's answer does.
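In PHP, a rough equivalent using APCu as the cache backend might look like this (query_tinyurl() is a stand-in for whatever resolver you use, e.g. the fsockopen approach above; the APCu choice and the key prefix are assumptions):
<?php
function resolve_tinyurl($url) {
    $key = 'tinyurl_' . md5(strtolower($url));
    $resolved = apcu_fetch($key, $hit);
    if ($hit) {
        return $resolved; // served from cache, no request to tinyurl.com
    }
    $resolved = query_tinyurl($url); // hypothetical resolver, see the other answers
    apcu_store($key, $resolved, 86400); // cache for one day
    return $resolved;
}
?>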
Here is another way to decode short URLs via the cURL library:
function doShortURLDecode($url) {
    $ch = @curl_init($url);
    @curl_setopt($ch, CURLOPT_HEADER, TRUE);
    @curl_setopt($ch, CURLOPT_NOBODY, TRUE);
    @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, FALSE);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $response = @curl_exec($ch);
    preg_match('/Location: (.*)\n/', $response, $a);
    if (!isset($a[1])) return $url;
    return $a[1];
}
It's described here.
Another simple and easy way:
<?php
function getTinyUrl($url) {
    return file_get_contents('http://tinyurl.com/api-create.php?url=' . urlencode($url));
}
?>
If you just want the location, then do a HEAD request instead of GET.
$tinyurl = 'http://tinyurl.com/3fvbx8';
$context = stream_context_create(array('http' => array('method' => 'HEAD')));
$response = file_get_contents($tinyurl, null, $context);
$location = '';
foreach ($http_response_header as $header) {
    if (strpos($header, 'Location:') === 0) {
        $location = trim(strrchr($header, ' '));
        break;
    }
}
echo $location;
// http://www.pingdom.com/reports/vb1395a6sww3/check_overview/?name=twitter.com%2Fhome
In PHP there is also the get_headers() function, which can be used to decode tiny URLs.
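A minimal sketch of that (get_headers() follows redirects by default, and with the second argument set to 1 repeated headers such as Location come back as arrays):
<?php
$headers = get_headers('http://tinyurl.com/dmsfm', 1);
$location = $headers['Location'];
echo is_array($location) ? end($location) : $location;
?>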
The solution from @Pons's answer didn't work on its own on my PHP 7.3 server when resolving Stack Exchange URLs like https://stackoverflow.com/q/62317.
This solved it:
public function doShortURLDecode($url) {
    $ch = @curl_init($url);
    @curl_setopt($ch, CURLOPT_HEADER, TRUE);
    @curl_setopt($ch, CURLOPT_NOBODY, TRUE);
    @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, FALSE);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $response = @curl_exec($ch);
    $cleanresponse = preg_replace('/[^A-Za-z0-9\- _,.:\n\/]/', '', $response);
    preg_match('/Location: (.*)[\n\r]/', $cleanresponse, $a);
    if (!isset($a[1])) return $url;
    return parse_url($url, PHP_URL_SCHEME) . '://' . parse_url($url, PHP_URL_HOST) . $a[1];
}