Scenario: I have a C# .NET web page. I want the user to be able to download a file hosted on a remote server from a link on my page, but with minimal load on my server while the download runs. So I tried creating an HttpWebRequest instance and passing it the download.php path, e.g.
HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create("http://servername/download.php");
myHttpWebRequest.Headers.Add("Content-disposition", "attachment;filename=XXX.pdf");
myHttpWebRequest.ContentType = "application/pdf";
I passed the HttpWebRequest object in the session; however, when reading the HttpWebResponse on another page, the content type is reset to "text/html".
Also, the PHP file reads the headers and uses a readfile() call to serve the file. It gives the following error: Warning: readfile() [function.readfile]: URL file-access is disabled in the server configuration in
I don't entirely understand the scenario, but on the PHP side, if fopen() URL access is disabled, your next port of call should be the curl family of functions. (Or, of course, activate URL access using the allow_url_fopen php.ini option, but it sounds like you can't do that.)
The text/html header is probably due to the download failing.
A very rudimentary example:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url); // $url is the remote file to fetch
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch); // $result will contain the contents of the request
curl_close($ch);
?>
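If the goal is to hand the file straight to the browser, you can then emit the download headers yourself before echoing the body. A minimal sketch, assuming a remote URL and filename (both placeholders, not from the question):
<?php
// Sketch: fetch the remote PDF and relay it to the browser as a download.
$url = 'http://servername/files/XXX.pdf'; // placeholder URL

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);

// send the download headers on this response, then the file body
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="XXX.pdf"');
header('Content-Length: ' . strlen($result));
echo $result;
?>
Note this still routes every byte through your server; a plain redirect to the remote URL is the only way to avoid that entirely.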
You can get around allow_url_fopen restrictions by using fsockopen. Here's a (rudimentary) implementation:
function fsock_get_contents($url) {
    // parse_url() needs a scheme to split host and path correctly
    if (strpos($url, '://') === false) {
        $url = 'http://' . $url;
    }
    $host = parse_url($url, PHP_URL_HOST);
    $path = parse_url($url, PHP_URL_PATH);
    if ($path === null || $path === '') {
        $path = '/';
    }

    // fsockopen() expects a hostname, not a full URL
    $fp = fsockopen($host, 80, $errno, $errstr, 20);
    if (!$fp) {
        echo "$errstr ($errno)<br />\n";
        return false;
    }

    $out  = "GET $path HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);

    $contents = '';
    while (!feof($fp)) {
        $contents .= fgets($fp, 128);
    }
    fclose($fp);
    return $contents;
}

echo fsock_get_contents('www.google.com');
I'm trying to upload a file using cURL and the PUT method. I already have a function that works using fsockopen, but I would like to migrate it to cURL.
The fsockopen version receives the content of a file, the filename, and the credentials for auth, and makes the request:
function put_file($content, $filename, $username, $pass)
{
    $header  = "PUT /upload?username=" . urlencode($username) . "&passwd=" . urlencode($pass) . "&filename=" . urlencode($filename) . " HTTP/1.0\r\n";
    $header .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $header .= "Content-Length: " . strlen($content) . "\r\n\r\n";

    $fp = @fsockopen("ssl://URL", 443, $errno, $errstr, 30);
    if (!$fp) {
        return "ERROR";
    }

    fputs($fp, $header . $content);
    $res = '';
    while (!feof($fp)) {
        $res .= fread($fp, 1024);
    }
    fclose($fp);
    return $res;
}
I have been trying to migrate that function to cURL, but I don't know how to do it without having a "real" file on my filesystem. The only cURL options I know of for this are CURLOPT_INFILE and CURLOPT_INFILESIZE, but I don't have a file (and don't want to write one to disk just to open it again).
What I need is to send the "content" of the file, just like the fsockopen version does. How can this be achieved with cURL?
Thank you in advance.
You could use the php://temp wrapper, which is a temporary file stream in PHP.
First you write the data to the stream (don't forget to call rewind() so cURL will read all the data):
$fp = fopen("php://temp", "r+");
fputs($fp, $content);
rewind($fp);
Then, when setting up cURL, just use:
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, strlen($content));
And at the end, close the temp file handle (optional):
fclose($fp);
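Put together, a minimal sketch of the whole upload (the function name and the endpoint URL are placeholders, not a definitive implementation):
<?php
// Sketch: PUT in-memory $content via cURL through a php://temp stream.
function put_file_curl($content, $url) {
    $fp = fopen("php://temp", "r+");
    fwrite($fp, $content);
    rewind($fp); // cURL must read from the start of the stream

    $ch = curl_init($url); // $url is a placeholder endpoint
    curl_setopt($ch, CURLOPT_PUT, true);            // use the PUT method
    curl_setopt($ch, CURLOPT_INFILE, $fp);          // body comes from our stream
    curl_setopt($ch, CURLOPT_INFILESIZE, strlen($content));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $res = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $res;
}
?>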
Is there a way for PHP's cURL functions to get the contents of a website but stop at the characters we ask for? Some sort of buffer, I think, so the script doesn't have to fetch the whole page.
So the scheme is like this:
: curl execution
<html>
->
->
->
-> Title Detected
: curl close
->
->
->
->
</html>
Please note, this is not a DOM problem; the question is how to make cURL stop once it finds what we ask for.
This is my code:
function curl_download($Url){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $Url);
    curl_setopt($ch, CURLOPT_REFERER, $Url);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.46 Safari/536.5");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}
If cURL can't handle this, how about fopen? Do you have an example?
Thanks in advance; please include example code.
Here is a very simple example using fsockopen(). Extend it to fit your needs.
$host = 'www.site.com';
$port = 80;

$sock = fsockopen($host, $port, $errno, $errstr, 30);
if (!$sock) {
    die("Failed to connect. $errno: $errstr");
}

// write http request to socket:
$request = "GET /file.html HTTP/1.0\r\n"
         . "Host: $host\r\n"
         . "User-Agent: some-user-agent\r\n"
         . "Connection: close\r\n"
         . "\r\n";
fwrite($sock, $request);

$buffer = ''; // buffer for storing response
while (!feof($sock)) {
    $buffer .= fgets($sock, 1024); // read up to 1024 bytes from the socket, append to buffer
    if (strpos($buffer, '</title>') !== false) { // title was found
        break;
    }
}
fclose($sock);
So we connect to the HTTP server on the remote host, issue a simple HTTP/1.0 request, and read the response 1024 bytes at a time until the closing title tag is detected. Once it is found, the connection is closed.
Note, even though we didn't read the entire response from the socket, the underlying system (PHP and the OS socket layer) may have read more (or possibly all depending on size) of the response. In either case, you did prevent PHP from reading most of the response. If the pages are very big, closing the socket early will likely prevent a bulk of the data from actually ever being received.
Hope that helps.
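For what it's worth, cURL itself can also abort mid-transfer: a CURLOPT_WRITEFUNCTION callback is invoked for every chunk of the body, and returning a number different from the chunk's length makes curl_exec() stop. A rough sketch, not from the answer above:
$found = '';
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$found) {
    $found .= $chunk;
    if (strpos($found, '</title>') !== false) {
        return -1; // length mismatch tells cURL to abort the transfer
    }
    return strlen($chunk); // chunk handled in full, keep downloading
});
curl_exec($ch); // returns false with a write error once we abort
curl_close($ch);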
I do not think you can parse the DOM with cURL.
I advise you to use string functions like strstr(), strtok()...
http://www.php.net/manual/en/ref.strings.php
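For instance, a small sketch that pulls the <title> out of the downloaded HTML with plain string functions (reusing the curl_download() function from the question):
$html = curl_download('http://www.example.com/');
$start = strpos($html, '<title>');
$end   = strpos($html, '</title>', (int)$start);
if ($start !== false && $end !== false) {
    echo trim(substr($html, $start + 7, $end - $start - 7)); // 7 = strlen('<title>')
}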
I'm making a PHP image proxy script. I need it to not only echo the contents of the image it requests, but also identically reproduce the header of the image request.
I've seen one, and the other, but not both together... and these cURL option things confuse me. How would I do this?
Sorry, I'm not sure exactly what it is you want.
Here is an example that fetches an image URL, echoes the headers, and saves the image to a file.
But if you want a proxy, you should use a web server (Nginx, Apache, etc.); PHP is not needed.
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://img3.cache.netease.com/www/logo/logo_png.png");
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_REFERER, "http://www.163.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$return = curl_exec($ch);
curl_close($ch);
list($header, $image) = explode("\r\n\r\n", $return, 2);
echo $header;
file_put_contents("/tmp/logo.png", $image);
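If you do want PHP to relay the upstream headers to your own client, a rough sketch (hop-by-hop headers such as Transfer-Encoding may need to be dropped, and redirects are not handled):
// relay each upstream header line to our client, then send the image body
foreach (explode("\r\n", $header) as $i => $line) {
    if ($i === 0 || $line === '') {
        continue; // skip the "HTTP/1.1 200 OK" status line and blank lines
    }
    header($line);
}
echo $image;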
You can get all the headers (not as raw text) with getallheaders():
http://www.php.net/manual/en/function.getallheaders.php
Then string them back together:
$headers = "";
foreach (getallheaders() as $name => $value) {
$headers = "$name: $value\r\n";
}
$headers .= "\r\n"; // Double newline to signal end of headers (HTTP spec)
Then I think the best way is to use a socket connection rather than cURL, like so:
$response = '';
$fp = fsockopen('example.org', 80);
// the copied headers must be preceded by a request line; the path here is an example
fputs($fp, "GET /image.png HTTP/1.1\r\n" . $headers);
while (!feof($fp)) {
    $response .= fgets($fp, 128);
}
fclose($fp);
Note that you may need to modify the host/request headers (because this is an identical copy, as you asked), and you may need to implement redirect following.
I want to send a raw HTTP packet to a web server and receive its response, but I can't find a way to do it. I'm inexperienced with sockets, and every link I find uses sockets to send UDP packets. Any help would be great.
Take a look at this simple example from the fsockopen manual page:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
The connection to the server is established with fsockopen. $out holds the HTTP request, which is then sent with fwrite. The HTTP response is then read with fgets.
If all you want to do is perform a GET request and receive the body of the response, most of the file functions support URLs:
<?php
$html = file_get_contents('http://google.com');
?>
<?php
$html = '';
$fh = fopen('http://google.com', 'r');
while (!feof($fh)) {
    $html .= fread($fh, 8192); // fread() requires a length argument
}
fclose($fh);
?>
For more than simple GETs, use cURL (it has to be compiled into PHP). With cURL you can do POST and HEAD requests, as well as set various headers.
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
?>
cURL is easier than implementing client side HTTP. All you have to do is set a few options and cURL handles the rest.
$curl = curl_init($URL);
curl_setopt_array($curl, array(
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (PLAYSTATION 3; 2.00)',
    CURLOPT_HTTPAUTH       => CURLAUTH_ANY,
    CURLOPT_USERPWD        => 'User:Password',
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    // set CURLOPT_HEADER to true if you want headers in the result
));
$result = curl_exec($curl);
If you need to set a header that cURL doesn't support, use the CURLOPT_HTTPHEADER option, passing an array of additional headers. Set CURLOPT_HEADERFUNCTION to a callback if you need to parse headers. Read the docs for curl_setopt for more options.
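A short sketch of both options (the header name and URL are arbitrary examples):
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Custom-Header: hello')); // extra request header
curl_setopt($ch, CURLOPT_HEADERFUNCTION, function ($ch, $line) {
    echo "got header: $line"; // called once per response header line
    return strlen($line);     // must return the number of bytes handled
});
$body = curl_exec($ch);
curl_close($ch);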
In PHP, how can I replicate the expand/contract feature for Tinyurls as on search.twitter.com?
If you want to find out where a tinyurl is going, use fsockopen to get a connection to tinyurl.com on port 80, and send it an HTTP request like this:
GET /dmsfm HTTP/1.0
Host: tinyurl.com
The response you get back will look like
HTTP/1.0 301 Moved Permanently
Connection: close
X-Powered-By: PHP/5.2.6
Location: http://en.wikipedia.org/wiki/TinyURL
Content-type: text/html
Content-Length: 0
Date: Mon, 15 Sep 2008 12:29:04 GMT
Server: TinyURL/1.6
Example code:
<?php
$tinyurl = "dmsfm";

$fp = fsockopen("tinyurl.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET /$tinyurl HTTP/1.0\r\n";
    $out .= "Host: tinyurl.com\r\n";
    $out .= "Connection: Close\r\n\r\n";

    $response = "";
    fwrite($fp, $out);
    while (!feof($fp)) {
        $response .= fgets($fp, 128);
    }
    fclose($fp);

    // now parse the Location: header out of the response
}
?>
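Parsing that header out of the captured $response can be as simple as (a quick sketch):
if (preg_match('/^Location: (.*)$/mi', $response, $m)) {
    echo trim($m[1]); // e.g. http://en.wikipedia.org/wiki/TinyURL
}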
And here is how to contract an arbitrary URL using the TinyURL API. The general call pattern goes like this; it's a simple HTTP request with parameters:
http://tinyurl.com/api-create.php?url=http://insertyourstuffhere.com
This will return the corresponding TinyURL for http://insertyourstuffhere.com. In PHP, you can wrap this in an fsockopen() call or, for convenience, just use the file() function to retrieve it:
function make_tinyurl($longurl)
{
    return implode('', file(
        'http://tinyurl.com/api-create.php?url=' . urlencode($longurl)));
}

// make an example call
print(make_tinyurl('http://www.joelonsoftware.com/items/2008/09/15.html'));
As people have answered how to create and resolve tinyurl.com redirects programmatically, I'd like to (strongly) suggest something: caching.
In the twitter example, if every click of the "expand" button fired an XmlHttpRequest to, say, /api/resolve_tinyurl/http://tinyurl.com/abcd, and the server then opened an HTTP connection to tinyurl.com and inspected the header, it would hammer both twitter's and tinyurl's servers.
An infinitely more sensible method would be something like this Python'y pseudo-code:
def resolve_tinyurl(url):
    key = md5(url.lower_case())
    if cache.has_key(key):
        return cache[key]
    else:
        resolved = query_tinyurl(url)
        cache[key] = resolved
        return resolved
Where cache's items magically get backed up into memory, and/or a file, and query_tinyurl() works as Paul Dixon's answer does.
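A PHP rendering of the same idea, with a simple static in-memory cache; query_tinyurl() stands in for a lookup like the one in Paul Dixon's answer, and the array could be swapped for APCu, memcached, or a file to persist across requests:
function resolve_tinyurl($url) {
    static $cache = array();
    $key = md5(strtolower($url));
    if (!isset($cache[$key])) {
        // cache miss: do the real lookup (see Paul Dixon's answer)
        $cache[$key] = query_tinyurl($url);
    }
    return $cache[$key];
}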
Here is another way to decode short URLs via the cURL library:
function doShortURLDecode($url) {
    $ch = @curl_init($url);
    @curl_setopt($ch, CURLOPT_HEADER, TRUE);
    @curl_setopt($ch, CURLOPT_NOBODY, TRUE);
    @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, FALSE);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $response = @curl_exec($ch);
    preg_match('/Location: (.*)\n/', $response, $a);
    if (!isset($a[1])) return $url;
    return $a[1];
}
Another simple and easy way:
<?php
function getTinyUrl($url) {
    return file_get_contents('http://tinyurl.com/api-create.php?url=' . urlencode($url));
}
?>
If you just want the location, then do a HEAD request instead of a GET.
$tinyurl = 'http://tinyurl.com/3fvbx8';

$context = stream_context_create(array('http' => array('method' => 'HEAD')));
$response = file_get_contents($tinyurl, false, $context);

$location = '';
foreach ($http_response_header as $header) {
    if (strpos($header, 'Location:') === 0) {
        $location = trim(strrchr($header, ' '));
        break;
    }
}

echo $location;
// http://www.pingdom.com/reports/vb1395a6sww3/check_overview/?name=twitter.com%2Fhome
In PHP there is also a get_headers() function that can be used to decode tiny URLs.
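A quick sketch of that approach (the example TinyURL is the one from earlier in the thread):
// get_headers() follows redirects by default; with the second argument set,
// it returns an associative array, so the Location header is easy to read
$headers = get_headers('http://tinyurl.com/dmsfm', 1);
$location = $headers['Location'];
if (is_array($location)) {
    $location = end($location); // several redirects: take the last one
}
echo $location; // http://en.wikipedia.org/wiki/TinyURL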
The solution from @Pons above didn't work on its own on my PHP 7.3 server when resolving Stack Exchange URLs like https://stackoverflow.com/q/62317.
This solved it:
function doShortURLDecode($url) {
    $ch = @curl_init($url);
    @curl_setopt($ch, CURLOPT_HEADER, TRUE);
    @curl_setopt($ch, CURLOPT_NOBODY, TRUE);
    @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, FALSE);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    $response = @curl_exec($ch);
    // strip stray characters (e.g. \r) that confuse the regex before matching
    $cleanresponse = preg_replace('/[^A-Za-z0-9\- _,.:\n\/]/', '', $response);
    preg_match('/Location: (.*)[\n\r]/', $cleanresponse, $a);
    if (!isset($a[1])) return $url;
    return parse_url($url, PHP_URL_SCHEME) . '://' . parse_url($url, PHP_URL_HOST) . $a[1];
}