Saving a remote image with cURL? - php

Morning all,
There are a few questions around this, but none that really answer my question as far as I can understand. Basically I have a GD script that handles resizing and caching images on our server, but I need to do the same with images stored on a remote server.
So, I'm wanting to save the image locally, then resize and display it as normal.
I've got this far...
// build the cache filename from the last two segments of the remote URL
$file_name_array = explode('/', $filename);
$file_name_array_r = array_reverse($file_name_array);
$save_to = 'system/cache/remote/'.$file_name_array_r[1].'-'.$file_name_array_r[0];
$ch = curl_init($filename);
$fp = fopen($save_to, "wb");
// set URL and other appropriate options
$options = array(
    CURLOPT_FILE => $fp,
    CURLOPT_HEADER => 0,
    CURLOPT_FOLLOWLOCATION => 1,
    CURLOPT_TIMEOUT => 60 // 1 minute timeout (should be enough)
);
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);
fclose($fp);
This creates the image file, but does not copy it across? Am I missing the point?
Cheers guys.

Simpler: you can use file_put_contents instead of fwrite:
$file_name_array = explode('/', $filename);
$file_name_array_r = array_reverse($file_name_array);
$save_to = 'system/cache/remote/'.$file_name_array_r[1].'-'.$file_name_array_r[0];
file_put_contents($save_to, file_get_contents($filename));
or in just 2 lines :)
$file_name_array_r = array_reverse( explode('/', $filename) );
file_put_contents('system/cache/remote/'.$file_name_array_r[1].'-'.$file_name_array_r[0], file_get_contents($filename));
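Note that file_get_contents() returns false on failure (bad URL, remote error), in which case the one-liner above would happily cache an empty file. A minimal guard, assuming the same $filename and $save_to as before:
$data = file_get_contents($filename);
if ($data !== false) {
    file_put_contents($save_to, $data);
}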

Well, I sorted it! After examining my images rather than my code a little closer, it turned out some of the images were erroring on their side, rather than mine. Once I selected an image that worked, my code worked too!
Cheers as always though guys :)
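For anyone hitting the same thing: you can catch a remote-side error before trusting the cached file by checking the HTTP status after curl_exec(). A sketch based on the question's code (the status check and unlink() are my additions):
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
fclose($fp);
if ($status !== 200) {
    unlink($save_to); // the remote image errored; drop the empty/broken cache file
}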

Personally I don't like to use cURL functions that write into a file. Try this instead:
$file_name_array = explode('/', $filename);
$file_name_array_r = array_reverse($file_name_array);
$save_to = 'system/cache/remote/'.$file_name_array_r[1].'-'.$file_name_array_r[0];
$ch = curl_init($filename);
$fp = fopen($save_to, "wb");
// set URL and other appropriate options
$options = array(
    CURLOPT_HEADER => 0,
    CURLOPT_FOLLOWLOCATION => 1,
    CURLOPT_TIMEOUT => 60,
    CURLOPT_RETURNTRANSFER => true // return the transfer as a string instead of writing it out
);
curl_setopt_array($ch, $options);
//Get the result of the request and write it into the file
$res = curl_exec($ch);
curl_close($ch);
fwrite($fp, $res);
fclose($fp);
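One caveat with this approach: when CURLOPT_RETURNTRANSFER is set, curl_exec() returns false on failure, and fwrite($fp, false) silently produces an empty file. A small guard (my addition, reusing the variables above):
$res = curl_exec($ch);
curl_close($ch);
if ($res !== false) {
    fwrite($fp, $res);
}
fclose($fp);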
But you can use something simpler without cURL:
$file_name_array = explode('/', $filename);
$file_name_array_r = array_reverse($file_name_array);
$save_to = 'system/cache/remote/'.$file_name_array_r[1].'-'.$file_name_array_r[0];
$content=file_get_contents($filename);
$fp = fopen($save_to, "wb");
fwrite($fp,$content);
fclose($fp);

Related

PHP cURL downloaded file is 0 bytes on one external link while it works on another external link

I'm trying to get a file/image downloaded to a folder on my first server, from a second server. I have the following code:
$image = 'http://i1.au.reastatic.net/150x112/73fa6c02a92d60a76320d0e89dfbc1a36a6e46c818f74772dec65bae6959c62f/main.jpg';
$imageName = pathinfo( $image, PATHINFO_BASENAME );
$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, $image );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$source = curl_exec( $ch );
curl_close( $ch );
file_put_contents( './content/' . $imageName, $source );
// note: this split only yields real headers if CURLOPT_HEADER was enabled above;
// otherwise $source contains just the body
$out = preg_split('/(\r?\n){2}/', $source, 2);
$headers = $out[0];
$headersArray = preg_split('/\r?\n/', $headers);
$headersArray = array_map(function($h) {
    return preg_split('/:\s{1,}/', $h, 2);
}, $headersArray);
$tmp = [];
foreach($headersArray as $h) {
    $tmp[strtolower($h[0])] = isset($h[1]) ? $h[1] : $h[0];
}
print_r($tmp);
The linked URL is not one of my servers, but it works as expected, writing a file on my first server.
But when I use my own second server's link, for example https://example.com/demo1.png, it writes a file of 0 bytes.
On logging the headers, the external link that is not my server logs an array of many items about the image, such as "content-type" and "content-length". On logging the header response for the image on my second server, it logs an empty array.
What adjustments do I need to make to the script? I'm also okay with making adjustments on my second server, if needed.
Thank you in advance.
Thanks to @RiggsFolly, I found out the issue was related to the SSL certificate. The issue seems to be on my local PC server, and may not persist on a live server. I will move to a live server soon, and if the issue persists I will update this answer as needed. I'll mark this answer as the accepted answer, hoping to also let others know that the issue was related to SSL and can be fixed with proper measures. SO also has many useful questions/answers on the mentioned SSL issue.
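For reference, the quickest way to surface this kind of failure is to check curl_error() after the request; on a local dev box you can also point cURL at a CA bundle explicitly. A sketch (the cacert.pem path is a placeholder; the bundle can be downloaded from https://curl.se/docs/caextract.html):
$ch = curl_init($image);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// point cURL at a CA bundle if the system one is missing or outdated (common on local setups)
curl_setopt($ch, CURLOPT_CAINFO, '/path/to/cacert.pem'); // placeholder path
$source = curl_exec($ch);
if ($source === false) {
    echo 'cURL error: ' . curl_error($ch); // e.g. an SSL certificate problem
}
curl_close($ch);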

Use curl instead of file_get_contents in function

Below is my code for loading and storing an XML feed. The time-out is also important, in case the feed is offline or responds too slowly.
Some users do not have file_get_contents enabled. I'm looking for a way to change this to cURL, or to check which one is enabled and use that, without losing the ability to set a time-out. Any ideas?
function feeder()
{
    $cache_time = 3600 * 12; // 12 hours
    $cache_file = plugin_dir_path(__FILE__) . '/cache/feed.xml';
    $timedif = @(time() - filemtime($cache_file));
    $fc_xml_options = get_option('fc_xml_options');
    $xml_feed = $fc_xml_options['feed'];
    // remove white space(s) and/or space(s) from connector code
    $xml_feed = str_replace(' ', '', $xml_feed);
    $xml_feed = preg_replace('/\s+/', '', $xml_feed);
    if (file_exists($cache_file) && $timedif < $cache_time)
    {
        $string = file_get_contents($cache_file);
    }
    else
    {
        // set a time-out (5 sec) on fetch feed
        $xml_context = array('http' => array(
            'timeout' => 5,
        ));
        $pure_context = stream_context_create($xml_context);
        $string = file_get_contents($xml_feed, false, $pure_context);
        if ($f = @fopen($cache_file, 'w'))
        {
            fwrite($f, $string, strlen($string));
            fclose($f);
        }
    }
You can use cURL to read a local file. Just change the URL to a file path prefixed with file://
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "file://full_path_to_file");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $output contains the output string
$output = curl_exec($ch);
curl_close($ch);
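To directly answer the original question of using whichever mechanism is available: a minimal sketch of a fetch helper that prefers cURL and falls back to file_get_contents, keeping the 5-second time-out in both paths (the function name and structure are mine, not from the thread):
function fetch_feed($url, $timeout = 5)
{
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_FOLLOWLOCATION => true,
            CURLOPT_CONNECTTIMEOUT => $timeout,
            CURLOPT_TIMEOUT => $timeout,
        ));
        $string = curl_exec($ch);
        curl_close($ch);
        return $string; // false on failure
    }
    // fall back to file_get_contents with a stream-context time-out
    $context = stream_context_create(array('http' => array('timeout' => $timeout)));
    return @file_get_contents($url, false, $context);
}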

PHP file_get_contents and cURL raise HTTP 500 internal server error

I am using a function inside a PHP class for reading images from an array of URLs and writing them to the local computer.
Something like below:
function ImageUpload($urls)
{
    $image_urls = explode(',', $urls);
    foreach ($image_urls as $url)
    {
        $url = trim($url);
        $img_name = //something
        $source = file_get_contents($url);
        $handle = fopen($img_name, "w");
        fwrite($handle, $source);
        fclose($handle);
    }
}
It successfully reads and writes 1 or 2 images, but raises a 500 Internal Server Error when reading the 2nd or 3rd image.
There is nothing important in the Apache log file. I also replaced the file_get_contents command with the following cURL statements, but the result is the same (it seems cURL reads one more image than file_get_contents).
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 500);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
$source = curl_exec($ch);
curl_close($ch);
unset($ch);
Also, the problem only occurs when reading from http URLs; if I have the images somewhere local, there is no problem reading and writing them.
I don't see any handle for reading in the loop; your $handle = fopen($img_name, "w"); is just for writing. You would also need $handle = fopen($img_name, "r"); for reading, because you can't fread() a handle opened with fopen($img_name, "w").
Additional answer :
Could you modify to (and see if it works):
.........
$img_name = //something
$context = stream_context_create($opts); // takes an options array (not the URL list); see the full example below
$source = file_get_contents($url, false, $context);
.....
.....
I have made some changes to your code, hope that helps :)
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Content-Type: text/html; charset=utf-8"
    )
);
$context = stream_context_create($opts);
$image_urls = explode(',', $urls);
foreach ($image_urls as $url) {
    $result = file_get_contents(trim($url), false, $context);
    if ($result === FALSE) {
        print "Error with this URL : " . $url . "<br />";
        continue;
    }
    $handle = fopen($img_name, "a+");
    fwrite($handle, $result);
    fclose($handle);
}
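The thread never pins down the 500; if it persists, logging the per-URL status with cURL can help narrow down which fetch fails and why (a diagnostic sketch, variable names taken from the question):
foreach ($image_urls as $url) {
    $ch = curl_init(trim($url));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $source = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($source === false || $code !== 200) {
        error_log("Fetch failed for $url: HTTP $code, " . curl_error($ch));
    }
    curl_close($ch);
}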

Retrieve data from URL and save in PHP

I am trying to retrieve the HTML with file_get_contents in PHP, then save it to a PHP file so I can include it in my homepage.
Unfortunately my script isn't saving the data into the file. I also need to overwrite this data on a daily basis, as it will be set up with a cron job.
Can anyone tell me where I am going wrong please? I am just learning PHP :-)
<?php
$richSnippets = file_get_contents('http://website.com/data');
$filename = 'reviews.txt';
$handle = fopen($filename,"x+");
$somecontent = echo $richSnippets;
fwrite($handle,$somecontent);
echo "Success";
fclose($handle);
?>
A couple of things:
http://website.com/data gets a 404 error; it doesn't exist.
Change your code to
$site = 'http://www.google.com';
$homepage = file_get_contents($site);
$filename = 'reviews.txt';
$handle = fopen($filename,"w");
fwrite($handle,$homepage);
echo "Success";
fclose($handle);
Remove $somecontent = echo $richSnippets; since echo is a statement, not an expression, that line is a parse error and stops the script from running.
Note also the mode change from "x+" to "w": "x+" fails if the file already exists, which would break your daily overwrite, while "w" truncates and rewrites it.
If you have the proper permissions it should work.
Be sure that you're pointing to an existing webpage.
Edit
When cURL is enabled you can use the following function
function get_web_page( $url ){
    $options = array(
        CURLOPT_RETURNTRANSFER => true,     // return web page
        CURLOPT_HEADER         => false,    // don't return headers
        CURLOPT_FOLLOWLOCATION => true,     // follow redirects
        CURLOPT_ENCODING       => "",       // handle all encodings
        CURLOPT_USERAGENT      => "spider", // who am i
        CURLOPT_AUTOREFERER    => true,     // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,      // timeout on connect
        CURLOPT_TIMEOUT        => 120,      // timeout on response
        CURLOPT_MAXREDIRS      => 10,       // stop after 10 redirects
    );
    $ch = curl_init( $url );
    curl_setopt_array( $ch, $options );
    $content = curl_exec( $ch );
    curl_close( $ch );
    return $content;
}
Now change
$homepage = file_get_contents($site);
in to
$homepage = get_web_page($site);
You should use / (forward slashes) instead of \ (backslashes):
$homepage = file_get_contents('http://website.com/data');
Also this part
$somecontent = echo $richSnippets;
I don't see $richSnippets above... it's probably not declared?
You probably want to do this:
fwrite($handle,$homepage);

How to get a .jpg image from external website and store it (Storing it separate from cURL)

I am currently attempting to make a function in my class which gets data from an external server.
I am able to get the data with cURL, but I do not want cURL to store it in a file directly.
This is semi-difficult to explain, so I will show you.
This is my function for getting the image:
function getCharacterPortrait($CharID, $size){
$url = "http://image.eveonline.com/character/{$CharID}_{$size}.jpg";
$ch = curl_init($url);
curl_setopt_array($ch, array(
CURLOPT_RETURNTRANSFER => true,
CURLOPT_HEADER => false,
CURLOPT_FOLLOWLOCATION => true,
CURLOPT_ENCODING => "",
CURLOPT_AUTOREFERER => true,
));
$data = curl_exec($ch);
curl_close($ch);
return $data;
}
So, what I want to do from here is take $data, which I presume is the raw image, and store it as a .jpg at a specified location.
I have found a similar explanation for this using cURL, but there cURL was used to store it directly. I would like the script that calls the function to store the file.
Sorry if I am being a bit confusing, but I think you get the premise of what I am saying. If more explaining is needed, please do say so.
How about this?
$a = implode('', @file('http://url/file.jpg'));
$h = fopen('/path/to/disk/file.jpg', 'wb');
fwrite($h, $a);
fclose($h);
Just write all the data cURL gave you to a file with file_put_contents() for example?
[your curl code...]
file_put_contents('localcopy.jpg', $data);
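Applied to the question's function, the calling script could be as simple as this (the portraits/ directory is my example, not from the thread):
$data = getCharacterPortrait($CharID, $size);
if ($data !== false) {
    file_put_contents("portraits/{$CharID}_{$size}.jpg", $data);
}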
Edit:
Apparently there is also a cURL option to download to a file like:
$fp = fopen($path, 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
Found at http://www.phpriot.com/articles/download-with-curl-and-php
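Put together, that option looks like this (a minimal sketch, with $url and $path as above):
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // cURL writes straight into the file handle
curl_exec($ch);
curl_close($ch);
fclose($fp);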
Use this:
$url = "http://image.eveonline.com/character/{$CharID}_{$size}.jpg";
$local_path = "myfolder/{$CharID}_{$size}.jpg";
$file = file_get_contents($url);
file_put_contents($local_path, $file); // arguments are (path, data), not the other way around
