I have the following PHP script that works perfectly 99% of the time, but it will not download an image from one particular server, which I think is running Varnish Cache.
<?php
$imglink = 'http://www.dirtrider.com/wp-content/uploads/2014/10/WD-10_1_14-001.jpg';
$ch = curl_init($imglink);
$fp = fopen('/home/path/www/tmp/test.jpg', "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_exec($ch);
curl_close($ch);
fclose($fp);
You get a 403 Forbidden error if you use cURL to load that image. You can work around this very easily: just set an alternate user agent for your cURL request:
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
Et voilà, it works like a charm. It seems Varnish Cache blocks requests that use cURL's default user agent.
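Putting both pieces together, a complete version of the script might look like this (the destination path is the one from the question; adjust it for your setup):

```php
<?php
// Download an image, sending a browser-style user agent because some
// caches (e.g. Varnish) reject cURL's default one with a 403.
function downloadImage($imglink, $dest)
{
    $ch = curl_init($imglink);
    $fp = fopen($dest, 'w');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    // Forward the visitor's user agent; fall back to a fixed browser
    // string when running from the CLI, where HTTP_USER_AGENT is unset.
    $ua = isset($_SERVER['HTTP_USER_AGENT'])
        ? $_SERVER['HTTP_USER_AGENT']
        : 'Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0';
    curl_setopt($ch, CURLOPT_USERAGENT, $ua);
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok;
}

// Usage (requires network access):
// downloadImage('http://www.dirtrider.com/wp-content/uploads/2014/10/WD-10_1_14-001.jpg',
//               '/home/path/www/tmp/test.jpg');
```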
I am trying to download an image from the internet to my local folder:
$loc = "downloaded images/" . basename($_POST["url"]);
if (file_put_contents($loc, file_get_contents($_POST["url"])))
    echo "Downloaded successfully";
else
    echo error_get_last()['message'];
The code works only on my local server. When I run it on the live server, this warning keeps popping up in my console:
Warning: file_get_contents(https://neindiabroadcast.com/wp-content/uploads/2022/11/20221122_071752.jpg): Failed to open stream: Connection timed out in
I have also tried using cURL:
set_time_limit(0); // unlimited max execution time
$loc = "downloaded images/".basename($_POST["url"]);
$host = $_POST["url"];
$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 28800); // 8 hours
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_URL, $host);
curl_setopt($ch, CURLOPT_REFERER, $host);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$result = curl_exec($ch);
curl_close($ch);
$fp = fopen($loc, 'wb');
fwrite($fp, $result);
fclose($fp);
When I try to open the downloaded file, it says the file is corrupted. What's surprising is that both snippets work perfectly when I try them on my local server.
My max_execution_time is set to unlimited, so execution time shouldn't be the issue here. Why is this happening, and how can I resolve it?
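Not part of the original post, but one way to narrow this down is to ask cURL what went wrong before writing anything to disk. The code above writes whatever curl_exec() returns; a failed transfer returns false, which fwrite() happily turns into an empty, "corrupt" file. A debugging sketch:

```php
<?php
// Debugging sketch: only save the response when the transfer succeeded,
// otherwise report cURL's error or the HTTP status.
function fetchOrExplain($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    $result = curl_exec($ch);
    if ($result === false) {
        // e.g. "Connection timed out" points at a firewall/network problem
        $error = 'cURL error: ' . curl_error($ch);
        curl_close($ch);
        return $error;
    }
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    if ($status !== 200) {
        // 403/503 etc. means the remote server refused the request
        return "Unexpected HTTP status: $status";
    }
    file_put_contents('downloaded images/' . basename($url), $result);
    return 'OK';
}

// Usage (requires network access):
// echo fetchOrExplain($_POST['url']);
```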
After hours of struggling, I finally came up with a solution. Many sites sit behind a cross-origin/anti-hotlinking policy, a security feature where the server won't serve its data to unknown clients. To get around this, all you have to do is route the request through a proxy URL:
$proxy_url = 'https://api.codetabs.com/v1/proxy?quest=';
Prepend this URL to the URL of the website whose data you want to fetch, and you are done :)
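In PHP, that prepending step looks like this (proxify is my name for the helper, not part of the proxy's API; the target URL is a placeholder):

```php
<?php
// Build a proxied URL by prepending the proxy endpoint.
function proxify($target_url)
{
    $proxy_url = 'https://api.codetabs.com/v1/proxy?quest=';
    // URL-encode the target so its own query string survives the trip
    return $proxy_url . urlencode($target_url);
}

// Usage (requires network access):
// $data = file_get_contents(proxify('https://example.com/image.jpg'));
// file_put_contents('image.jpg', $data);
```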
Here's the script I use to download an image from Facebook
function downloadImage($image_url)
{
    // Set filename
    $filename = dirname(__FILE__).'/wow.jpg';

    // Open file to save
    $file = fopen($filename, 'w');

    // Use cURL
    $ch = curl_init($image_url);

    // Set options
    curl_setopt($ch, CURLOPT_FILE, $file);
    curl_setopt($ch, CURLOPT_ENCODING, '');
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:27.0) Gecko/20100101 Firefox/27.0');
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);

    // Execute
    $data = curl_exec($ch);

    // Close curl handle and file
    curl_close($ch);
    fclose($file);
}
// Download image
downloadImage('https://graph.facebook.com/WowSuchPage/picture?width=720&height=720');
The download completes, but when I open the image file it appears to be broken. This only happens when the image source is Facebook; any other domain is fine. I don't think it has anything to do with my ISP either, because if I download the image through my browser it's all fine.
I hope you can help me on this one as this has been bugging me for some time now. Thanks!
EDIT
By the way, I'm using Wampserver 2.4 on localhost. My PHP version is 5.4.12
FIXED
Alright, I finally found the issue. It seems either cURL or the SSL component/extension in my local WampServer is the root of the problem. I only had to use "http://" rather than "https://" to fetch the image perfectly.
It would still be great to download the image correctly over https://, though, so I won't close this one until I find some answers. Thanks for your help!
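For anyone who would rather keep https:// working than downgrade to http://, the usual culprit on an old WAMP stack is an outdated or missing CA bundle. A sketch of the fix, assuming you have downloaded cacert.pem from the curl project; the path below is an assumption, adjust it to wherever you put the file:

```php
<?php
// Keep TLS verification on, but point cURL at an up-to-date CA bundle
// instead of disabling CURLOPT_SSL_VERIFYPEER.
function downloadOverHttps($url, $dest)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
    // Assumed path -- place curl's cacert.pem there yourself
    curl_setopt($ch, CURLOPT_CAINFO, 'C:/wamp/cacert.pem');
    $data = curl_exec($ch);
    curl_close($ch);
    if ($data !== false) {
        file_put_contents($dest, $data);
    }
    return $data !== false;
}

// Usage (requires network access and the CA bundle):
// downloadOverHttps('https://graph.facebook.com/WowSuchPage/picture?width=720&height=720', 'wow.jpg');
```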
I wish to download files from my web server with download progress information. For that purpose, PHP cURL seems to be the best choice.
However, I have a problem: the downloaded files are not placed into the Downloads folder, where web downloads normally go. I use the following file download routine:
$fp = fopen(dirname(__FILE__) . '/uploaded.pdf', 'w+');
$url = "file:///D:/WEB/SAIFA/WWW/PickUpTest.pdf";
$ch = curl_init(str_replace(" ","%20", $url));
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_BUFFERSIZE, 1024*8);
curl_setopt($ch, CURLOPT_NOPROGRESS, false );
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, 'progressCallback' );
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt( $ch, CURLOPT_FILE, $fp );
curl_exec( $ch );
curl_close($ch);
fclose($fp);
unset($fp);
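The snippet references a progressCallback that is never defined. A minimal version (my own sketch, not from the question) for the PHP 5.4 signature used here could be:

```php
<?php
// Progress callback for CURLOPT_PROGRESSFUNCTION (PHP 5.4 signature;
// from PHP 5.5 on, the cURL resource is passed as an extra first argument).
function progressCallback($downloadSize, $downloaded, $uploadSize, $uploaded)
{
    if ($downloadSize > 0) {
        printf("%.1f%%\r", $downloaded / $downloadSize * 100);
    }
    return 0; // returning a non-zero value aborts the transfer
}
```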
My problem is that, instead of the Downloads folder, the file is silently written into my WWW folder, where my PHP scripts (including this cURL one) reside. I don't get a File Download Save As dialog box either.
To force Save As dialog box, I added the following header, at the beginning of the script:
header("Content-Disposition: attachment; filename=\"uploaded.pdf\"");
$fp = fopen(dirname(__FILE__) . '/uploaded.pdf', 'w+');
...
After adding the header I do get the Save As dialog box; however, the file is still silently downloaded into the folder with my PHP scripts, while an 'uploaded.pdf' with a filesize of 0 lands in the Downloads folder.
My question is: how do I make PHP cURL download files properly, place them into the Downloads folder, and offer a Save As dialog box?
I use:
WAMP
Windows 7
PHP Version 5.4.12
Curl Version 7.29.0
By using the file functions you're actually asking your server to save the file, so it makes sense that the result of the cURL call ends up in your PHP folder.
What you really want, if I understand the problem, is to send the results of the cURL back to the browser. You're halfway there by sending the header(...) - which lets the user's browser know a file is coming and should be downloaded, the step you've missed is sending the cURL data with the header.
You could echo the contents of the file after you've saved it or, more efficiently (assuming you don't want an extra copy of the file), remove the code to save the file locally and remove the cURL option CURLOPT_RETURNTRANSFER. That will tell cURL to send the output directly so it will become the data for the download.
Hope that helps!
EDIT A simple example that grabs a local file (C:\test.pdf) and sends it to the user's browser (as uploaded.pdf).
<?php
header("Content-Disposition: attachment; filename=\"uploaded.pdf\"");
// Get a FILE url to my test document
$url = 'file://c:/test.pdf';
$url = str_replace(" ", "%20", $url);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_exec($ch);
curl_close($ch);
Hope that helps a bit more!
This is very weird: my PHP cURL download stops at 95% every time, and it's driving me crazy.
Here is the code I'm using, nothing fancy:
$fp = fopen($file, 'w');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.domain.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("ETag: $rddash"));
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Something I noticed: the remote website is using ETag, so I sent one too, but it still doesn't work.
What could be the reason the download stops before it completes?
Maybe a timeout issue in your php.ini settings. Use set_time_limit(0); in your code.
See the manual for more details.
Also check the PHP error log.
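Concretely, that could look like the following sketch, based on the question's code (a CURLOPT_TIMEOUT of 0 disables cURL's overall transfer limit):

```php
<?php
// Sketch: same download as in the question, but with the timeouts relaxed.
function downloadWithoutTimeout($url, $file)
{
    set_time_limit(0); // lift PHP's max_execution_time for this request
    $fp = fopen($file, 'w');
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);         // 0 = no overall transfer timeout
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30); // but still cap the connect phase
    curl_setopt($ch, CURLOPT_FILE, $fp);
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok;
}

// Usage (requires network access):
// downloadWithoutTimeout('http://www.domain.com/', '/tmp/page.html');
```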
I have sites where some XML files are stored, and I want to download them to our server. We don't have an FTP connection, so we have to download over HTTP. I have always used file($url); is there a better way to download files with PHP?
If you can access them via HTTP, file() (which reads the file into an array) and file_get_contents() (which reads the content into a string) are perfectly fine, provided the URL wrappers are enabled (allow_url_fopen = On in php.ini).
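The wrapper-based approach is just a file_get_contents/file_put_contents round trip; the URL and destination below are placeholders:

```php
<?php
// Wrapper-based download; requires allow_url_fopen = On in php.ini.
function simpleDownload($url, $dest)
{
    $data = file_get_contents($url);
    if ($data === false) {
        return false;
    }
    return file_put_contents($dest, $data) !== false;
}

// Usage (requires network access):
// simpleDownload('http://www.server.com/file.xml', '/mysite/file.xml');
```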
Using cURL could also be a nice option:
// create a new CURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.server.com/file.zip");
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
set_time_limit(300); # 5 minutes for PHP
curl_setopt($ch, CURLOPT_TIMEOUT, 300); # and also for CURL
$outfile = fopen('/mysite/file.zip', 'wb');
curl_setopt($ch, CURLOPT_FILE, $outfile);
// grab file from URL
curl_exec($ch);
fclose($outfile);
// close CURL resource, and free up system resources
curl_close($ch);