I have an HTTP GET URL:
http://test.com/test.php?name=sam&age=20
Now I'm on a different website and I want to call, or rather ping, that URL so that the GET parameters are sent without actually being redirected to that URL.
Is there a function or method in PHP to get this done?
Use cURL (see the PHP cURL reference). For example:
$urlStringData = "http://test.com/test.php?name=sam&age=20";
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the data instead of printing it to the browser
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // timeout after 10 seconds, you can increase it
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1)");
curl_setopt($ch, CURLOPT_URL, $urlStringData); // set the URL and GET string together
$return = curl_exec($ch);
curl_close($ch);
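If you build the query string from an array, http_build_query() keeps the parameters properly encoded. A minimal sketch along the same lines (not part of the original answer; the values are just the ones from the example URL):
// Build the query string from an array, then "ping" the URL the same way.
$params = array('name' => 'sam', 'age' => 20);
$url = 'http://test.com/test.php?' . http_build_query($params);
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
$response = curl_exec($ch);
if ($response === false) {
    error_log('Ping failed: ' . curl_error($ch));
}
curl_close($ch);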
Related
I use cURL. For a few days now cURL has been showing me a blank page.
It shows the main page /kenteken/ without problems, but if there is text after it, like /kenteken/TF172H, it shows a blank page.
I hope someone can help.
$ktk = $_GET['kenteken'];
// create curl resource
$ch = curl_init();
// set url
curl_setopt($ch, CURLOPT_URL, "https://finnik.nl/kenteken/" . $ktk);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)");
curl_setopt($ch, CURLOPT_REFERER, "http://www.google.com");
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
// $output contains the output string
$output = curl_exec($ch);
// close curl resource to free up system resources
curl_close($ch);
print_r($output);
If you go to the URL in your browser with the Developer Console - Network tab open, you'll see it's being redirected to a lowercase URL. You can add this line to have it follow the redirect:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
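For example, a quick sketch of how you could confirm the redirect is being followed (add the option before the existing curl_exec() call and inspect the effective URL afterwards):
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$output = curl_exec($ch);
// Shows the final (lowercase) URL that cURL actually ended up on after the redirect.
echo curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);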
Using cURL, I'm trying to transfer a zip file from another server to my server.
After authentication, this second server gives me a URL that contains parameters acting as credentials:
http://content.website.com/file_to_download.zip?nvb=20160622094506&nva=20160622095506&hash=089366e5fe3da46f9caf2
If I open this URL in another web browser or in a private session, I can download the content without problems (it doesn't ask for a login), but if I pass the URL to my function I get an empty file.
This is my function:
function download ($zipUrl, $zipFilename){
$ch = curl_init();
$fp = fopen ($zipFilename, 'w+');
$ch = curl_init($zipUrl);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:6.0.2) Gecko/20100101 Firefox/6.0.2'));
curl_setopt($ch, CURLOPT_ENCODING, "");
curl_setopt($ch, CURLOPT_POSTFIELDS, "nvb=20160622094506&nva=20160622095506&hash=089366e5fe3da46f9caf2");
curl_exec($ch);
curl_close($ch);
fclose($fp);
}
What's wrong?
If curl_exec() returns an error, you can read that error with curl_error(). Also take a look at what HTTP code the server returned, so you can see clearly what's going on.
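A minimal sketch of that check inside your download() function, in place of the bare curl_exec()/curl_close() calls:
$ok = curl_exec($ch);
if ($ok === false) {
    echo 'cURL error: ' . curl_error($ch);
} else {
    // 200 means OK; 401/403 suggests the credentials in the query string were rejected.
    echo 'HTTP code: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE);
}
curl_close($ch);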
I have the following cURL request:
$url='http://test/paynetz/epi/fts?login=160&pass=Test#123&ttype=NBFundTransfer&prodid=NSE&amt=50&txncurr=INR&txnscamt=0&clientcode=TkFWSU4%3d&txnid='.urlencode($string).'&date='.urlencode($date).'&custacc=1234567890&udf1=ajeesh&udf2=sam#zz.com&udf3=940000000&udf4=arrackaparmabilhouse&ru=http://www.zwitch.co';
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_HEADER, 0);
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, false);
echo $auth = curl_exec($curl);
I'm getting this:
http://test/paynetz/epi/ftsNBFundTransfer267050dHwIMJR%2FucGOZcnocTnwvISAVaeNZK93Y8veI%2Bb1DtY%3D11
instead of XML. I'm getting only the values, not the XML.
I had a 505 error in the response, so I used urlencode($string) instead of $string.
Have you tried adding curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)');? That makes the request look like it comes from a regular browser rather than a bot.
If you are trying to output the XML directly onto a web page, you might want to look up htmlentities().
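For example, a sketch of escaping the response for display (this assumes you switch CURLOPT_RETURNTRANSFER to true so curl_exec() returns the body instead of printing it):
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$auth = curl_exec($curl);
// Escape the markup so the XML tags are shown in the browser instead of being rendered.
echo '<pre>' . htmlentities($auth) . '</pre>';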
I am a member of Lynda.com and I want to fetch an HTML page from their site and save it to my disk. The problem is that whenever I try to fetch a page via cURL, I get the non-member page (it asks me to sign up). I can't understand why I can't get the member page. :(
My code:
get_remote_file_to_cache();
function get_remote_file_to_cache()
{
$the_site = "http://www.lynda.com/AIR-3-0-tutorials/Flex-4-6-and-Mobile-Apps-New-Features/90366-2.html";
$curl = curl_init();
$fp = fopen("cache/temp_file.html", "w");
curl_setopt($curl, CURLOPT_URL, $the_site);
curl_setopt($curl, CURLOPT_COOKIE, '/cookie.txt');
curl_setopt($curl, CURLOPT_FILE, $fp);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
$http_headers = array(
'Host: www.lynda.com',
'User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:6.0.2) Gecko/20100101 Firefox/6.0.2',
'Accept: */*',
'Accept-Language: en-us,en;q=0.5',
'Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7',
'Connection: keep-alive'
);
curl_setopt($curl, CURLOPT_HEADER, true);
curl_setopt($curl, CURLOPT_HTTPHEADER, $http_headers);
curl_exec($curl);
$httpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
if($httpCode == 404)
{
touch('cache/404_err.txt');
}
else
{
$contents = curl_exec($curl);
fwrite($fp, $contents);
}
curl_close($curl);
}
I am on Windows 7 and running on this on WAMP.
One of the things I am not sure about is whether the "cookie.txt" file is being read or not (I'm not sure the path is correct, so I put the cookie.txt file in the root of the server as well as in the directory I am running this script from).
Thanks in advance!
----------- Found some code via the online manual ---------
// $url = page to POST data
// $ref_url = tell the server which page you came from (spoofing)
// $login = true will make a clean cookie-file.
// $proxy = proxy data
// $proxystatus = do you use a proxy ? true/false
function curl_grab_page($url,$ref_url,$data,$login,$proxy,$proxystatus){
if($login == 'true') {
$fp = fopen("ryanCookie.txt", "w");
fclose($fp);
}
$ch = curl_init();
curl_setopt($ch, CURLOPT_COOKIEJAR, "ryanCookie.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "ryanCookie.txt");
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)");
curl_setopt($ch, CURLOPT_TIMEOUT, 40);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
if ($proxystatus == 'true') {
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, TRUE);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
}
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, $ref_url);
curl_setopt($ch, CURLOPT_HEADER, TRUE);
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
ob_start();
return curl_exec ($ch); // execute the curl command
ob_end_clean();
curl_close ($ch);
unset($ch);
}
echo curl_grab_page("https://www.lynda.com/login/login.aspx", "http://www.lynda.com/", "simple_username=*******&simple_password=*******", "true", "null", "false")."done!";
But it still does not work :(
This is the page where I got the above code: http://php.net/manual/en/function.curl-setopt.php
You need to understand how the internet and HTTP work. When you access a website, it usually gives you cookies to track your status. You also start as a non-logged-in member. After you hit the login button, the server updates your status to logged-in and stores this status, either in a server-side session or in your browser using cookies.
Back to your question: since you want to access a member page, you first need to learn how lynda.com's login works. The steps below are rather general (a rough sketch in code follows this list):
Load the login page and get the form information
Fill in the form with your login info and send it back to the server
Store the cookies received from the server
Load the member page (don't forget to include the cookie information from step 3) and fetch the HTML
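A rough, untested sketch of those steps with cURL's cookie jar (the login URL and field names are taken from the snippet in your question and may not match lynda.com's real login form):
$cookieFile = __DIR__ . '/ryanCookie.txt';

// Steps 1-3: submit the login form and let cURL store the session cookies.
$ch = curl_init('https://www.lynda.com/login/login.aspx'); // assumed login URL
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'simple_username=YOUR_USER&simple_password=YOUR_PASS'); // assumed field names
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieFile);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieFile);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// Step 4: request the member page with the same cookie file.
$ch = curl_init('http://www.lynda.com/AIR-3-0-tutorials/Flex-4-6-and-Mobile-Apps-New-Features/90366-2.html');
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieFile);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
file_put_contents('cache/temp_file.html', $html);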
For more information, you can look at these resources:
http://www.codingforums.com/showthread.php?t=252335
http://simpletest.sourceforge.net/en/browser_documentation.html
https://gist.github.com/3697293
Maybe you need to send an Authorization header containing your username and password for the site as part of the HTTP headers.
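That only helps if the site actually uses HTTP authentication rather than a login form, but for completeness, a sketch:
// Sketch: HTTP Basic authentication with cURL.
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, 'your_username:your_password');
// This sends: Authorization: Basic base64("your_username:your_password")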
To get the member page you need to log in on the website. To do that, you need to:
visit the login page
make the same request your browser would make to submit the login credentials
fetch the member page
Alternatively, you could try to extract the cookies from your browser after logging in and use them in cURL with curl_setopt($ch, CURLOPT_COOKIE, 'a=b; c=d');, but this might not work, as the website can also check the IP or session.
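A small sketch of that alternative (the cookie names and values are placeholders you would copy from your browser's developer tools):
$ch = curl_init('http://www.lynda.com/AIR-3-0-tutorials/Flex-4-6-and-Mobile-Apps-New-Features/90366-2.html');
curl_setopt($ch, CURLOPT_COOKIE, 'session_cookie_name=PASTE_VALUE; other_cookie=PASTE_VALUE'); // placeholders
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);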
I'm only able to fetch from a site when I use cURL with a proxy. cURL without a proxy and file_get_contents() return nothing (cURL HTTP code "0" and curl_error() reports "Empty reply from server"). I'm able to fetch other sites just fine without a proxy.
Aside from being blocked, is there any other possible explanation for why I can only access this site via a proxy?
Did you set a user agent in cURL? Sometimes websites will block you if your user agent isn't set or if your HTTP request looks suspicious.
To set the user agent in PHP:
curl_setopt($curl, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)");
Is this from your workplace or something? Many companies disable file_get_contents() on shared PHP installs, as it's quite risky.
The site probably has user agent detection. You can fake that in your cURL call; with file_get_contents() you would need to set it via a stream context. Another method sites use is to only display content once a cookie has been set, so site scrapers never see the data.
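As an aside, a minimal sketch of setting the user agent for file_get_contents() via a stream context (the URL is a placeholder):
$context = stream_context_create(array(
    'http' => array(
        'header' => "User-Agent: Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)\r\n",
    ),
));
$html = file_get_contents('http://example.com/', false, $context);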
Try this:
function curl_scrape($url,$data,$proxy,$proxystatus)
{
$fp = fopen("cookie.txt", "w");
fclose($fp);
$ch = curl_init();
curl_setopt($ch, CURLOPT_COOKIEJAR, "cookie.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt");
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)");
curl_setopt($ch, CURLOPT_TIMEOUT, 40);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
if ($proxystatus == 'on')
{
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, TRUE);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
}
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, TRUE);
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
ob_start(); // prevent any output while the request runs
$result = curl_exec($ch); // execute the curl command
ob_end_clean(); // stop preventing output
curl_close($ch);
unset($ch);
return $result;
}
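Hypothetical usage (the URL, POST data, and proxy are placeholders):
$html = curl_scrape('http://example.com/page', 'foo=bar', '127.0.0.1:8080', 'off');
echo $html;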
I'm guessing I was truly blocked. Using proxy now and it works fine.