Image is broken when downloaded from Facebook using cURL in PHP

Here's the script I use to download an image from Facebook
function downloadImage($image_url)
{
    // Set filename
    $filename = dirname(__FILE__) . '/wow.jpg';

    // Open file to save
    $file = fopen($filename, 'w');

    // Initialize cURL
    $ch = curl_init($image_url);

    // Set options
    curl_setopt($ch, CURLOPT_FILE, $file);
    curl_setopt($ch, CURLOPT_ENCODING, '');
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:27.0) Gecko/20100101 Firefox/27.0');
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);

    // Execute
    $data = curl_exec($ch);

    // Close curl and the file
    curl_close($ch);
    fclose($file);
}
// Download image
downloadImage('https://graph.facebook.com/WowSuchPage/picture?width=720&height=720');
The download succeeds, but when I open the image file, it appears to be broken.
Here's the image that gets downloaded:
This only occurs when the image source is Facebook; any other domain works fine. I don't think it has anything to do with my ISP, because the image downloads fine through my browser.
I hope you can help me on this one as this has been bugging me for some time now. Thanks!
EDIT
By the way, I'm using Wampserver 2.4 on localhost. My PHP version is 5.4.12
FIXED
Alright, I finally found the issue. It seems either cURL or the SSL component/extension in my local Wampserver is the root of the problem. I only had to use "http://" rather than "https://" and the image downloaded perfectly.
But it would really be great if I could still download the image correctly over https:// as well. I won't close this one yet until I find some answers. Thanks for your help!
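A common cause of broken HTTPS downloads on a local WAMP stack is a missing CA bundle: cURL has no trusted root certificates, so HTTPS transfers fail or get mangled, and disabling CURLOPT_SSL_VERIFYPEER just papers over it. A hedged sketch of the usual fix, assuming you have downloaded Mozilla's cacert.pem bundle (available from the curl project site) and saved it next to the script; the path and function name here are my own for illustration:

```php
<?php
// Build a cURL handle that verifies HTTPS properly instead of disabling it.
// Assumes a CA bundle file (e.g. Mozilla's cacert.pem) exists at $caBundlePath.
function makeSecureHandle($url, $caBundlePath)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);   // keep verification ON
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);      // 2 = also verify the hostname
    curl_setopt($ch, CURLOPT_CAINFO, $caBundlePath);  // point cURL at the CA roots
    return $ch;
}

// Usage (path is an assumption; adjust to where you saved the bundle):
// $ch = makeSecureHandle('https://graph.facebook.com/WowSuchPage/picture?width=720&height=720',
//                        dirname(__FILE__) . '/cacert.pem');
```

With the bundle in place, the https:// URL should work without turning off peer verification.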

Related

Image exists on remote URL but curl give me 404 status

I have a WordPress site where we are facing an issue with images.
The images exist on the server, and when we access an image directly via its URL, it works and the image shows.
But when we use curl, file_get_contents, fopen, copy, etc. to fetch the file, we get a 404 status code.
Can anyone help me? I don't know if it's a server issue or a WordPress issue.
Below is a sample of my code:
$url = "http://blogbucket.in/wp-content/uploads/files/0B6IGyYw9A5RYS0tNelJrdmFsMTQ/airbitz-co1.png";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_FAILONERROR, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
curl_exec($ch);
$header = curl_getinfo($ch);
curl_close($ch);
echo "<pre>";
print_r($header);
echo "</pre>";
Below are the links I've looked at so far.
Check whether image exists on remote URL
How to check if a file exists from a url
cURL is returning 404 on a file that exists (remote server). Why?
How can one check to see if a remote file exists using PHP?
How to check if an URL exists with the shell and probably curl?
PHP Detecting if source image url link leads to a "broken" image?
The images exist on the server, when we access the images via URL, it works and the image shows.
But when we use curl, file_get_contents, fopen, copy etc command to get or access the file, we get the 404 error status code.
This URL returns a 404 response with the image as its payload. So you can disregard the response code and use the image data from the response body.
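A sketch of that approach: fetch the body regardless of the status code (which means not setting CURLOPT_FAILONERROR, since that would discard the body on a 404) and sanity-check the bytes instead of trusting the status. The looksLikeImage helper is a hypothetical name of my own, not part of the original code:

```php
<?php
// Returns true if $data starts with a known image signature (PNG, JPEG, or GIF).
function looksLikeImage($data)
{
    $signatures = array("\x89PNG", "\xFF\xD8\xFF", "GIF8");
    foreach ($signatures as $sig) {
        if (strncmp($data, $sig, strlen($sig)) === 0) {
            return true;
        }
    }
    return false;
}

// Fetch the body even on a 404 (no CURLOPT_FAILONERROR) and trust the bytes,
// not the status code.
function fetchImageIgnoringStatus($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $data = curl_exec($ch);
    curl_close($ch);
    return ($data !== false && looksLikeImage($data)) ? $data : null;
}
```

Checking magic bytes is more reliable here than the status code, since the server is lying about the 404.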

Unable to correctly download remote image file in PHP with Curl

I am trying to download a remote image using cURL. It downloads a file, however when I attempt to open the image on my Mac I get a 'could not be opened' message.
I can see the filename and extension are intact, however it hasn't saved properly: the filesize is 177 bytes, yet I'm expecting around 3 KB.
Can anyone suggest why this is? Is the remote site somehow preventing me from downloading the file? I've tried this same code with images on other sites and it works fine.
$url = 'http://www.fifaindex.com/static/FIFA16/images/crest/256/light/21.png';
$saveto = '21.png';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
$raw = curl_exec($ch);
curl_close($ch);

if (file_exists($saveto)) {
    unlink($saveto);
}
$fp = fopen($saveto, 'x');
fwrite($fp, $raw);
fclose($fp);
The website you are trying to get the image from has probably added a restriction so that the image is not served when it is requested from outside the domain.
To get around that you can specify the referrer in your cURL options, setting it to the URL of the site you want to get the image from.
In your case
curl_setopt($ch, CURLOPT_REFERER, "http://www.fifaindex.com");
I tried it myself on my local server and it worked.
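Put together, the download with the referrer set might look like the sketch below. The worthSaving helper and its 1000-byte threshold are my own assumptions (the 177-byte file was presumably a small HTML error page, so a size guard before saving seems prudent), not part of the original code:

```php
<?php
// Decide whether a cURL result is plausibly a real image rather than a small
// HTML error page. The 1000-byte threshold is an arbitrary assumption.
function worthSaving($raw, $status)
{
    return $raw !== false && $status === 200 && strlen($raw) > 1000;
}

// Download $url to $saveto, presenting the site itself as the referrer.
function downloadWithReferer($url, $referer, $saveto)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_REFERER, $referer); // pretend the request came from the site
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
    $raw    = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if (worthSaving($raw, $status)) {
        return file_put_contents($saveto, $raw) !== false;
    }
    return false;
}

// Usage:
// downloadWithReferer('http://www.fifaindex.com/static/FIFA16/images/crest/256/light/21.png',
//                     'http://www.fifaindex.com', '21.png');
```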

cURL not able to download image file from server running Varnish Cache

I have the following PHP script that works perfectly 99% of the time. But it will not download an image from this one server which I think is running Varnish Cache.
<?php
$imglink = 'http://www.dirtrider.com/wp-content/uploads/2014/10/WD-10_1_14-001.jpg';
$ch = curl_init($imglink);
$fp = fopen('/home/path/www/tmp/test.jpg', "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_exec($ch);
curl_close($ch);
fclose($fp);
You get a 403 Forbidden error if you use CURL to load that image. You can work around this error very easily. Just add an alternate user agent for your CURL request:
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
Et voilà! It works like a charm. It seems Varnish Cache blocks cURL requests that use cURL's default user agent.
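One caveat with $_SERVER['HTTP_USER_AGENT']: it is only populated when the script runs under a web server handling a browser request; from the CLI (a cron job, for instance) it is absent and the curl_setopt call would pass an empty value. A small hedged helper, with a hardcoded fallback string of my own choosing:

```php
<?php
// Pick a user agent: forward the visitor's browser string when available,
// otherwise fall back to a fixed browser-like string (arbitrary choice).
function pickUserAgent(array $server)
{
    if (isset($server['HTTP_USER_AGENT']) && $server['HTTP_USER_AGENT'] !== '') {
        return $server['HTTP_USER_AGENT'];
    }
    return 'Mozilla/5.0 (Windows NT 6.1; rv:27.0) Gecko/20100101 Firefox/27.0';
}

// Usage in the script above:
// curl_setopt($ch, CURLOPT_USERAGENT, pickUserAgent($_SERVER));
```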

File Download not Saving into Downloads Folder

I wish to download files from my web server with download progress information. For that purpose, PHP cURL seems to be the best choice.
However, the downloaded files are not placed into the Downloads folder, where web files normally end up. I use the following download routine:
$fp = fopen(dirname(__FILE__) . 'uploaded.pdf', 'w+');
$url = "file:///D:/WEB/SAIFA/WWW/PickUpTest.pdf";

$ch = curl_init(str_replace(" ", "%20", $url));
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_BUFFERSIZE, 1024 * 8);
curl_setopt($ch, CURLOPT_NOPROGRESS, false);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, 'progressCallback');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
unset($fp);
My problem is that instead of the Downloads folder, the file is silently saved into my WWW folder, where my PHP scripts (including this cURL one) reside. I get no File Download / Save As dialog box either.
To force Save As dialog box, I added the following header, at the beginning of the script:
header("Content-Disposition: attachment; filename=\"uploaded.pdf\"");
$fp = fopen(dirname(__FILE__) . 'uploaded.pdf', 'w+');
...
After adding the header I do get the Save As dialog box, however the file is still silently downloaded into the folder with my PHP scripts, and in the Downloads folder a file 'uploaded.pdf' with filesize 0 is saved.
My question is: how do I make PHP cURL download files properly, place them into the Downloads folder, and offer a Save As dialog box?
I use:
WAMP
Windows 7
PHP Version 5.4.12
Curl Version 7.29.0
By using the file functions you're actually asking your server to save the file, so it makes sense that the result of the cURL call ends up in your PHP folder.
What you really want, if I understand the problem, is to send the results of the cURL back to the browser. You're halfway there by sending the header(...) - which lets the user's browser know a file is coming and should be downloaded, the step you've missed is sending the cURL data with the header.
You could echo the contents of the file after you've saved it or, more efficiently (assuming you don't want an extra copy of the file), remove the code that saves the file locally and remove the CURLOPT_RETURNTRANSFER option. Without RETURNTRANSFER, cURL sends the response straight to the output, so it becomes the data for the download.
Hope that helps!
EDIT A simple example that grabs a local file (C:\test.pdf) and sends it to the user's browser (as uploaded.pdf).
<?php
header("Content-Disposition: attachment; filename=\"uploaded.pdf\"");

// Get a FILE url to my test document
$url = 'file://c:/test.pdf';
$url = str_replace(" ", "%20", $url);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_exec($ch);
curl_close($ch);
Hope that helps a bit more!

Error when creating a thumb

I use this opensource
http://code.google.com/p/timthumb/
But when I try to create a thumb from my hosting I get an error.
This is the image I wish to create a thumb from:
http://pixhost.info/avaxhome/94/91/00179194.jpeg
error reading file http://pixhost.info/avaxhome/94/91/00179194.jpeg from remote host: Failed writing body (37 != 1448)
Query String: src=http://pixhost.info/avaxhome/94/91/00179194.jpeg
You should have manually created two subfolders, cache and temp, in the directory where timthumb.php is found. Those subfolders need their permissions set as open as possible (777 on a Linux machine) so that the server user account can write into them. Then everything should be fine. Please note that you cannot grab pictures from the web (as far as I know); pictures should be placed on the local machine.
Example: http://myserver.com/web/timthumb.php?src=scotland.jpg
The error seems to come from a cURL command:
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_TIMEOUT, 15);
curl_setopt($ch, CURLOPT_URL, $src);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0');
curl_setopt($ch, CURLOPT_FILE, $fh);

if (curl_exec($ch) === FALSE) {
    if (file_exists($local_filepath)) {
        unlink($local_filepath);
    }
    display_error('error reading file ' . $src . ' from remote host: ' . curl_error($ch));
}
curl_close($ch);
fclose($fh);
It could come from the timeout. You may change it at line 607.
EDIT :
Alright, I tested it (since apparently pointing out the code causing the error and offering educated guesses isn't enough) and it works flawlessly on my server.
So it can be:
The timeout
The './cache' directory (which you can change in the class if needed, at line 17) not being created, or not having the right chmod, so the fopen fails. It is used to save the remote image on your server; if there is nowhere to save it, the cURL command fails.
This is simply because you ask cURL to create a file name which your operating system (or perhaps more accurately, your file system) doesn't allow it to use.
Your code specifies that if the file does not exist it proceeds to fetch a copy from the URL, yet at no point are you creating a local file within your PHP script.
Within your cURL code you have:
$fh = fopen($local_filepath, 'w');
In PHP, the w mode has the description:
Open for writing only; place the file pointer at the beginning of the file and truncate the file to zero length. If the file does not exist, attempt to create it.
The main reason files cannot be created is folder permissions. As there is a constant for that folder (define('DIRECTORY_CACHE', './cache');), you need a folder called cache within the same directory as the executing script (./).
After this, chmod the folder to 777 and run the script again.
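The prerequisites above can be checked from PHP itself. A hedged sketch (ensureCacheDir is my own helper name; the './cache' default mirrors timthumb's DIRECTORY_CACHE constant, and 0777 matches the advice above, though a tighter mode is preferable in production):

```php
<?php
// Ensure a writable cache directory exists, creating it if needed.
// Returns true when the directory exists and the PHP process can write to it.
function ensureCacheDir($dir)
{
    if (!is_dir($dir) && !@mkdir($dir, 0777, true)) {
        return false; // could not create it (parent missing or no permission)
    }
    return is_writable($dir);
}

// Usage, mirroring timthumb's default cache location:
// if (!ensureCacheDir('./cache')) { die('cache directory is not writable'); }
```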
Or maybe you need to add the domain to the allowed list, if it's external:
$allowedSites = array(
    'flickr.com',
    'picasa.com',
    'blogger.com',
    'wordpress.com',
    'img.youtube.com',
);
(See the timthumb.php file)