I am trying to get the download link from the savefrom.net website. I am using the following code:
<?php
$agent = "Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.4) Gecko/20030624 Netscape/7.1 (ax)";
$url = 'http://en.savefrom.net/savefrom.php';
$ckfile = dirname(__FILE__)."/c.txt";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$url);
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true); // include headers in $output so the header/body split below works
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_USERAGENT, $agent); // send the user agent defined above
$data['sf_url'] = "http://www.dailymotion.com/video/x34uoat_hero-behind-the-scenes-part-1-making-of-the-trailer_shortfilms";
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_REFERER, 'http://en.savefrom.net/');
$output = curl_exec($ch);
$info = curl_getinfo($ch);
$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$header = substr($output, 0, $header_size);
$body = substr($output, $header_size);
//echo "<h1>body</h1><br>".$body . "<h1>Header</h1><br><br>" .$header ;
if(curl_errno($ch)){
echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
echo "output is:'" . $output . "'";
echo "<br>";
echo "<h2>info is:</h2>";
print_r($info);
?>
So far the code runs, but all I get back is an alert message box that says: "Go to the website for the direct download link!". Am I doing something wrong in the code? Is there an alternative way to get a video download link from sites like Dailymotion? Thanks for your time.
Regards
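For reference, if the POST ever returns real HTML instead of that alert, the links can usually be pulled out of the markup with DOMDocument. This is only a minimal sketch: it assumes the response body captured above is in $body and that candidate URLs appear as plain <a href> tags; savefrom.net's actual markup may differ, and the page may rely on JavaScript that cURL will not execute.

// Sketch only: the real markup of the response may differ.
$dom = new DOMDocument();
libxml_use_internal_errors(true);   // the fetched HTML is unlikely to be perfectly valid
$dom->loadHTML($body);              // $body is the response body captured above
libxml_clear_errors();

foreach ($dom->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    if (strpos($href, 'http') === 0) {
        echo $href . "<br>";        // candidate download URLs
    }
}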
Related
Can anyone help me? I am trying to hit a website using cURL.
My code is as follows:
$data = array(
'utm_source'=>'Google',
'utm_medium'=>'Google',
'utm_campaign'=>'Sales',
'utm_term'=>'united flights tickets'
);
$data = http_build_query($data, '', '&');
$proxies[] = 'user:password@71.6.46.151:443';
$proxies[] = 'user:password@14.102.19.206:61217';
$proxies[] = 'user:password@187.95.230.65:8080';
$url = 'https://www.example.com/index.php?'.$data;
$ch = curl_init();
if (isset($proxy)) { // If the $proxy variable is set, then
curl_setopt($ch, CURLOPT_PROXY, $proxy); // Set CURLOPT_PROXY with proxy in $proxy variable
}
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_REFERER, 'https://www.google.com');
curl_setopt($ch, CURLOPT_HTTPGET, 1); // plain GET; $data is already appended to $url as the query string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 500);
curl_setopt($ch, CURLOPT_URL, $url);
$results = curl_exec($ch); // Execute a cURL request
$error_msg = ''; // avoid an undefined-variable notice below when there is no error
if (curl_error($ch)) {
$error_msg = curl_error($ch);
}
$info = curl_getinfo($ch);
curl_close($ch);
print_r($error_msg);
print_r($info);
print_r($results);
It hits the website and the visit also shows up in the Google Analytics account, but the problem is that the referrer and source show as "direct" in Google Analytics.
If there is any other way to achieve this, please tell me.
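Note that in the script above $proxy is never assigned, so none of the entries in $proxies are actually used. A minimal sketch of wiring one of them in, assuming the user:password@host:port format shown above:

// Sketch: pick one proxy at random from the $proxies list defined above.
$proxy = $proxies[array_rand($proxies)];              // e.g. 'user:password@71.6.46.151:443'
list($credentials, $hostPort) = explode('@', $proxy, 2);

curl_setopt($ch, CURLOPT_PROXY, $hostPort);           // host:port
curl_setopt($ch, CURLOPT_PROXYUSERPWD, $credentials); // user:password
// curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP); // default; change if these are SOCKS proxies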
I recently installed a LAMP stack on an Ubuntu server. I am having problems with my curl script. I have the following code:
function curl_request($page, $postvalues = false) {
$ch = curl_init();
// curl_setopt($ch, CURLOPT_NOBODY, false);
curl_setopt($ch, CURLOPT_COOKIEJAR, "/var/www/html/cookie.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "/var/www/html/cookie.txt");
if (isset($postvalues) && $postvalues != false) {
curl_setopt($ch, CURLOPT_POSTFIELDS, $postvalues);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
curl_setopt($ch, CURLOPT_POST, 1);
}
curl_setopt($ch, CURLOPT_URL, $page);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.7.12) Gecko/20050915 Firefox/1.0.7");
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_VERBOSE, 1);
$result = curl_exec($ch);
echo "CURL ERROR<BR />" . curl_error($ch) . "<br /><br /><br />";
curl_close($ch);
return $result;
}
Whenever I run this script, it never writes to the /var/www/html/cookie.txt file.
The file is always blank. I've changed the permissions, changed the file name, tried using a relative path, and so on. Why is the file not being written?
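One way to narrow this down is to check whether the PHP process can actually write the jar before making the request, and to remember that cURL only flushes cookies to CURLOPT_COOKIEJAR when curl_close() is called. A small sketch using the path from the question (the temp-directory fallback is only an illustration):

// Sketch: check that the PHP process (usually www-data on Ubuntu/Apache) can write the jar.
$jar = "/var/www/html/cookie.txt";
if (!is_writable(dirname($jar)) || (file_exists($jar) && !is_writable($jar))) {
    // Illustrative fallback: the system temp directory is normally writable.
    $jar = sys_get_temp_dir() . "/cookie.txt";
}
echo "Using cookie jar: " . $jar;
// Also note: the jar stays empty unless the visited site actually sets cookies
// in its responses, and it is only written once curl_close() runs.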
The size of the saved image is 0 KB.
The code works, except with Facebook images.
function imagedownload($url,$saveto){
$ch = curl_init ($url);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $args);
$result = parse_url($url);
curl_setopt($ch, CURLOPT_REFERER, $result['scheme'].'://'.$result['host']);
curl_setopt($ch, CURLOPT_USERAGENT,'Mozilla/5.0 (Windows NT 10.0; WOW64; rv:45.0) Gecko/20100101 Firefox/45.0');
$raw=curl_exec($ch);
curl_close ($ch);
if(file_exists($saveto)){
unlink($saveto);
}
$fp = fopen($saveto,'x');
fwrite($fp, $raw);
fclose($fp);
}
$url="http://scontent-frt3-1.xx.fbcdn.net/v/t1.0-0/cp0/e15/q65/p320x320/13700038_850487491753797_6227258625184891703_n.jpg?oh=793ecde8db1a8e65789534907d08b25e&oe=57F1DDFF";
$konum="images/"
$yolla=imagedownload($url,$konum);
It works if you remove the options to POST and do a regular GET request:
<?php
function imagedownload($url, $saveto)
{
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$result = parse_url($url);
curl_setopt($ch, CURLOPT_REFERER, $result['scheme'] . '://' . $result['host']);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; WOW64; rv:45.0) Gecko/20100101 Firefox/45.0');
$raw = curl_exec($ch);
curl_close($ch);
if (file_exists($saveto)) {
unlink($saveto);
}
$fp = fopen($saveto, 'x');
fwrite($fp, $raw);
fclose($fp);
}
$url = "http://scontent-frt3-1.xx.fbcdn.net/v/t1.0-0/cp0/e15/q65/p320x320/13700038_850487491753797_6227258625184891703_n.jpg?oh=793ecde8db1a8e65789534907d08b25e&oe=57F1DDFF";
$konum = "test.jpg";
$yolla = imagedownload($url, $konum);
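A quick way to confirm the download actually produced an image rather than a 0 KB file is to check the saved file after calling the function; a small sketch reusing $konum from above:

// Sketch: verify the saved file is a non-empty, valid image.
clearstatcache();
if (file_exists($konum) && filesize($konum) > 0 && getimagesize($konum) !== false) {
    echo "Downloaded OK: " . filesize($konum) . " bytes";
} else {
    echo "Download failed or the file is not a valid image";
}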
I tried this code on my localhost and it works well.
<?php
ini_set('display_errors', 1);
function imagedownload($url,$saveto){
$ch = curl_init ($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER,1);
$raw=curl_exec($ch);
curl_close ($ch);
if(file_exists($saveto)){
unlink($saveto);
}
$fp = fopen($saveto,'x');
fwrite($fp, $raw);
fclose($fp);
}
$url="http://scontent-frt3-1.xx.fbcdn.net/v/t1.0-0/cp0/e15/q65/p320x320/13700038_850487491753797_6227258625184891703_n.jpg?oh=793ecde8db1a8e65789534907d08b25e&oe=57F1DDFF";
$konum="/var/www/html/jsPDF-master/examples/test.jpg";
$yolla=imagedownload($url,$konum);
?>
And the result: the image is saved successfully.
One note: please make sure the directory the image is saved into has write permission.
Example: chmod -R 777 /var/www/html/jsPDF-master/examples
Or ensure that allow_url_fopen is enabled in php.ini.
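For completeness, if allow_url_fopen is enabled, the same image can be fetched without cURL at all; a minimal sketch using the URL and path from above:

// Sketch: relies on allow_url_fopen = On in php.ini.
$url    = "http://scontent-frt3-1.xx.fbcdn.net/v/t1.0-0/cp0/e15/q65/p320x320/13700038_850487491753797_6227258625184891703_n.jpg?oh=793ecde8db1a8e65789534907d08b25e&oe=57F1DDFF";
$saveto = "/var/www/html/jsPDF-master/examples/test.jpg";

$raw = file_get_contents($url);
if ($raw !== false) {
    file_put_contents($saveto, $raw);
}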
cURL stopped working on my server, but it works perfectly on another server.
Here is my script:
<?php
$proxy_ip = '165.139.149.169';
$proxy_port = '3128';
$url = 'https://www.google.com/search?q=skinny+fiber+ingredients&btnG=Search&client=ubuntu&channel=fs&num=100&ie=utf-8&oe=utf-8&gfe_rd=cr&ei=sw9CVbCuPKaA8QfX0ICYBA&gws_rd=cr';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, 'https://www.google.com');
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:28.0) Gecko/20100101 Firefox/28.0');
curl_setopt($ch, CURLOPT_MAXREDIRS, 5); // Good leeway for redirections.
curl_setopt($ch, CURLOPT_PROXYPORT, $proxy_port);
curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
curl_setopt($ch, CURLOPT_PROXY, $proxy_ip);
//curl_setopt($ch, CURLOPT_PROXYUSERPWD, $loginpassw);
$data = curl_exec($ch);
curl_close($ch);
echo "abc";
echo $data;
?>
http://serpbull.com/dev/curl.php (script not working here)
http://imarkdev.com/serp/curl.php (script working here)
Edit:
The script that is not working takes a very long time to respond (cURL times out) and prints an empty response.
Use the curl_error() function to get more information about the failure.
$data = curl_exec($ch);
if ($data === false)
{
echo 'Curl error: ' . curl_error($ch);
} else {
echo 'Response: ' . $data;
}
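Since the failing server reportedly hangs until cURL times out, it may also help to set explicit timeouts so a dead or blocked proxy fails fast instead of stalling; a small sketch reusing the handle from the question:

// Sketch: fail fast instead of hanging on an unreachable proxy.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up connecting after 10 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // give up on the whole transfer after 30 seconds
$data = curl_exec($ch);
if ($data === false) {
    // curl_errno() 28 means the operation timed out, 7 means the connection failed
    echo 'Curl error (' . curl_errno($ch) . '): ' . curl_error($ch);
}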
But the other server returns the content normally.
$URL = 'http://www.youtube.com/get_video_info?&video_id='.$this->VideoId.'&asv=3&el=detailpage&hl=en_US';
function curlGet($URL) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; rv:19.0) Gecko/20100101 Firefox/19.0');
curl_setopt($ch, CURLOPT_URL, $URL);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
$contents = curl_exec($ch);
if(curl_errno($ch)) {
echo ' er: ' . curl_error($ch) ;
}
// get content
echo $contents ;
curl_close($ch);
}
Set the options to see the Request and Response headers.
It is a good idea to include CURLOPT_FOLLOWLOCATION if your URL is redirected.
Something like this will help find problems.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_FAILONERROR,true);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_ENCODING,"");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FILETIME, true);
curl_setopt($ch, CURLOPT_USERAGENT,"Mozilla/5.0 (Windows NT 5.1; rv:32.0) Gecko/20100101 Firefox/32.0");
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 100);
curl_setopt($ch, CURLOPT_TIMEOUT,100);
$data = curl_exec($ch);
$responseHeader = ''; // initialise so the echoes below are safe when cURL reports an error
$info = '';
if (curl_errno($ch)){
$data .= 'Retrieve Base Page Error: ' . curl_error($ch);
}
else {
$skip = intval(curl_getinfo($ch, CURLINFO_HEADER_SIZE));
$responseHeader = substr($data,0,$skip);
$data= substr($data,$skip);
$info = curl_getinfo($ch);
$info = var_export($info,true);
}
echo $responseHeader;
echo $info;
print_r($data);
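Because CURLINFO_HEADER_OUT is enabled above, the request headers that cURL actually sent can also be dumped after the transfer (the handle must still be open when curl_getinfo() is called); a short addition:

// Sketch: dump the request headers cURL actually sent.
// Works because CURLINFO_HEADER_OUT was set before curl_exec() above.
$requestHeader = curl_getinfo($ch, CURLINFO_HEADER_OUT);
echo "<h3>Request headers</h3><pre>" . htmlspecialchars($requestHeader) . "</pre>";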