How do I load arbitrary data from a URL in PHP?

This question is simple. What function would I use in a PHP script to load data from a URL into a string?

cURL is usually a good solution: http://www.php.net/curl
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// fetch the URL; with CURLOPT_RETURNTRANSFER set, the response is stored in $html instead of being printed
$html = curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);

I think you are looking for
$url_data = file_get_contents("http://example.com/examplefile.txt");

With the HTTP stream wrapper enabled (allow_url_fopen), you can use file_get_contents to fetch HTTP resources. Out of the box that covers simple GET requests; for POST or custom headers you need to pass a stream context. For more complicated HTTP requests you can use the cURL extension if it is installed. Check php.net for more info.
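For example, here is a rough sketch of a POST request using only file_get_contents and a stream context (the URL and field names are placeholders):
// build a stream context that turns the request into a POST
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query(array('variable1' => 'value1', 'variable2' => 'value2')),
    ),
));
// fetch the response body into a string (false on failure)
$url_data = file_get_contents('http://example.com/examplefile.txt', false, $context);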

Check out Snoopy, a PHP class that simulates a web browser:
include "Snoopy.class.php";
$snoopy = new Snoopy;
$snoopy->fetchtext("http://www.example.com");
$html = $snoopy->results;

Related

Header() substitute

Hi, I am new to PHP and want to know an alternative to header('Location: mysite.php');
I am in a scenario where I am sending the request like this:
header('Location: http://localhost/(some external site).php?var=test');
What I want to do is send the values of some variables to the external site, but I don't actually want that page to open.
In other words, the variables should be sent to the external site/page, but on screen I want to be redirected to my login page. I don't know any alternative, so please guide me. Thanks.
You are looking for PHP cURL:
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
// return the response as a string instead of printing it to the browser
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// make the request in the background and keep the result
$response = curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
Set the Location header to the page you actually want to redirect the browser to, and use something like cURL to make the HTTP request to the remote site.
The usual way to do that is to send those parameters with cURL, parse the return value, and use it however you need.
Using cURL you can pass POST and GET variables to any URL, like so:
$ch = curl_init('http://example.org/?aVariable=theValue');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
Now, in $result you have the response from the URL passed to curl_init().
If you need to post data, the code needs a little more:
$ch = curl_init('http://example.org/page_to_post_to.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'variable1=value1&variable2=value2');
$result = curl_exec($ch);
curl_close($ch);
Again, the response to your POST request is saved in $result.
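Putting it together for this question, a rough sketch might send the data in the background and then redirect the visitor to the login page (the URLs and field names here are placeholders):
// send the variables to the external site in the background
$ch = curl_init('http://external-site.example/receive.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'var=test');
curl_exec($ch);
curl_close($ch);
// then redirect the browser to your own login page
header('Location: login.php');
exit;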
You could connect to another URL in the background in numerous ways. There's cURL ( http://php.net/curl - already mentioned in previous comments ), there's fopen ( http://php.net/manual/en/function.fopen.php ), and there's fsockopen ( http://php.net/manual/en/function.fsockopen.php - a little more advanced ).
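For reference, a minimal fsockopen sketch that issues a raw HTTP GET (the host and path are placeholders, and there is no error handling beyond the connection check):
// open a TCP connection to the web server
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 10);
if ($fp) {
    // send a bare-bones HTTP/1.1 GET request
    fwrite($fp, "GET / HTTP/1.1\r\nHost: www.example.com\r\nConnection: Close\r\n\r\n");
    // read the full response (headers plus body)
    $response = '';
    while (!feof($fp)) {
        $response .= fgets($fp, 1024);
    }
    fclose($fp);
}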

Accessing a task management API with PHP using cURL

I'm attempting to access an API for a task management system (Nozbe to be exact) that is outlined here: http://www.nozbe.com/api
If I access the URL in my browser, it returns the correct JSON response:
{"response":"644a40436"}
However, when I attempt to access this URL with cURL in PHP, it doesn't create the note like it would if I accessed it manually in my browser.
The normal method is outlined below:
http://www.nozbe.com/api/newnote/name-test/body-test/project_id-c4ca1/context_id-c4ca1/key-1a2b3c4d5e6f7g8h9i0j1k2l3m4n5o6
$api_key = "INSERTAPIKEYHERE";
$project_id = "73d173457";
$eventtitle = "Testing";
$descrip = "This is a test";
$url = "http://www.nozbe.com/api/newnote/name-$eventtitle/body-$descrip/project_id-$project_id/key-$api_key";
echo "$url<br/><br/>";
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// fetch the URL; the response is returned into $response instead of being printed
$response = curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
Does anyone have any advice or pointers as to why this isn't working? I know it's probably a pretty obscure API I'm attempting to access.
Note that the API says that the body and the like must be URL-encoded. Instead, you have spaces in your URL. Try running urlencode on the arguments before placing them into the URL.
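A rough sketch of that fix, reusing the variables from the question (the key and project id are still placeholders; rawurlencode may be preferable if the API does not accept '+' for spaces):
// URL-encode each argument so spaces and other characters are legal in the URL
$url = "http://www.nozbe.com/api/newnote/name-" . urlencode($eventtitle)
     . "/body-" . urlencode($descrip)
     . "/project_id-" . urlencode($project_id)
     . "/key-" . urlencode($api_key);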

Using the least RAM with curl_exec

Could you please tell me which of these code samples uses the least RAM? Here are my two examples:
$ch = curl_init();
foreach ($URLS as $url) {
    // set URL and other appropriate options
    curl_setopt($ch, CURLOPT_URL, $url . '&no_cache');
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // grab URL and pass it to the browser
    curl_exec($ch);
}
// close cURL resource, and free up system resources
curl_close($ch);
or
foreach ($URLS as $url) {
    $ch = curl_init();
    // set URL and other appropriate options
    curl_setopt($ch, CURLOPT_URL, $url . '&no_cache');
    curl_setopt($ch, CURLOPT_HEADER, 0);
    // grab URL and pass it to the browser
    curl_exec($ch);
    // close cURL resource, and free up system resources
    curl_close($ch);
}
The first one has lighter overhead, since you only instantiate the cURL handle once; but if cURL has any leaks and you're fetching a largish number of URLs, you could run out of memory.
Usually I only create a new cURL handle when the next URL to fetch needs settings that are too different from the previous one. It's easier to start with a default setup and change it than to try to "undo" the conflicting settings from the previous run.
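If you are on PHP 5.5 or later, a middle ground is to reuse one handle and call curl_reset() between requests so stale settings don't carry over. A rough sketch, assuming $URLS is your array of URLs:
$ch = curl_init();
foreach ($URLS as $url) {
    // clear any options left over from the previous iteration
    curl_reset($ch);
    curl_setopt($ch, CURLOPT_URL, $url . '&no_cache');
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    // ... process $body ...
}
curl_close($ch);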

PHP Proxy for getting other domain content

Can I write a PHP file (index.php) so that when someone points their browser to
http://www.domain.org/some?params=a&b=1
it returns the content of
http://www.OTHERdomain.org/some?params=a&b=1
Should I use cURL?
From http://www.php.net/manual/en/curl.examples-basic.php#88055, this is the code you need:
<?php
// create curl resource
$ch = curl_init();
// set url
curl_setopt($ch, CURLOPT_URL, "example.com");
//return the transfer as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $output contains the output string
$output = curl_exec($ch);
// close curl resource to free up system resources
curl_close($ch);
?>
You could create a page called proxy.php, or something like that, which takes a URL as a parameter. Then you can replace domain.org with otherdomain.org in the URL and use cURL to get the contents and return them.
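A minimal sketch of that idea, assuming the script simply forwards its own path and query string to the other domain (the domain name is a placeholder and there is no error handling):
// rebuild the requested path and query string against the other domain
$target = 'http://www.OTHERdomain.org' . $_SERVER['REQUEST_URI'];
$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$content = curl_exec($ch);
curl_close($ch);
// send the remote content back to the visitor
echo $content;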

cUrl Converting into Javascript possible?

Is there a way to convert this into JavaScript?
<?php
$url = 'http://www.yourdomain.com/';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
echo $output;
?>
Pure JavaScript? No.
JavaScript with a standard browser environment? Maybe. There is the XHR object (which includes the status property, which will tell you if it was successful or not), but there is also the same origin policy.
Not directly. You can use XMLHttpRequest to download webpages, but there are cross-domain issues to be aware of.
You can't directly, but there is a workaround:
you can save the PHP code to, let's say, mycurl.php on your server, then use AJAX (XMLHttpRequest) to pass the URL to mycurl.php and return the response to your JavaScript function.
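A rough sketch of what that mycurl.php endpoint could look like (the 'url' parameter name is an assumption; in practice you would want to whitelist the allowed URLs before fetching them):
// mycurl.php - fetches a URL passed from the JavaScript side and echoes it back
$url = isset($_GET['url']) ? $_GET['url'] : '';
// NOTE: validate/whitelist $url here; fetching arbitrary URLs is an open-proxy risk
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);
echo $output;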
