Downloading Images from URL in PHP

I have a file from which I read URLs line by line. The URLs are links to images, and all of them work in the browser. Now I want to download the images to my current directory. For the record, I need to use a PHP script.
This is how I get my Images:
for ($j = 0; $j < $count; $j++) {
    // Get the image URL as a string, then fetch it
    $image = $arr[$j];
    $newname = $j;
    getImage($image, $newname);
}

function getImage($image, $newname)
{
    $ch = curl_init($image);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $rawdata = curl_exec($ch);
    curl_close($ch);

    $fp = fopen("$newname.jpg", 'w');
    fwrite($fp, $rawdata);
    fclose($fp);
}
The problem is that while I get a file for every image, only the last one is viewable. The others won't open and are only 1 KB each.
So what did I do wrong?
Also, I need to download PNG files later on; can I then just change $newname.jpg into $newname.png?
Thanks in advance. I really need an answer fast; I've been sitting here for hours trying to figure it out.

Why not use stream_copy_to_stream?
function getImage($image, $newname, $fileType = "jpg")
{
    $in  = fopen($image, "r");
    $out = fopen("$newname.$fileType", 'w');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}
You can also play with stream_set_read_buffer
stream_set_read_buffer($in, 4096);
Just tested this with our avatar pics.
$data = [
    "https://www.gravatar.com/avatar/42eec337b6404f97aedfb4f39d4991f2?s=32&d=identicon&r=PG&f=1",
    "https://www.gravatar.com/avatar/2700a034dcbecff07c55d4fef09d110b?s=32&d=identicon&r=PG&f=1",
];

foreach ($data as $i => $image) getImage($image, $i, "png");
Works perfectly.
Debugging: read the remote headers for more info via $http_response_header:
var_dump($http_response_header);
or stream_get_meta_data
var_dump(stream_get_meta_data($in), stream_get_meta_data($out));
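And if a download still comes out as a tiny unreadable file, you most likely saved an error body instead of an image. Here's an untested sketch of the same stream approach with a timeout and basic checks (assuming allow_url_fopen is enabled; the 10-second timeout is an arbitrary pick):

function getImageChecked($image, $newname, $fileType = "jpg")
{
    // Hypothetical variant of getImage() with a read timeout and error checks.
    $context = stream_context_create(['http' => ['timeout' => 10]]);

    $in = @fopen($image, 'r', false, $context);
    if ($in === false) {
        var_dump($http_response_header ?? null); // headers, if any were received
        return false;
    }

    $out = fopen("$newname.$fileType", 'w');
    $bytes = stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    return $bytes > 0;
}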

Related

Save image from PHP URL - returns empty image

I'm working on a small project where I read data from electronic identity cards.
It might be worth mentioning I'm using the LightOpenID PHP library to get $attributes[''] with all the data from the eID.
Now I'm stuck trying to save an image which is displayed on http://my-url.com/photo.php
photo.php contains:
<?php
session_start();
$photo = $_SESSION['photo'];
header('Content-Type: image/jpeg');
echo($photo);
The variable $photo contains $_SESSION['photo'] which comes from index.php:
function base64url_decode($base64url) {
    $base64 = strtr($base64url, '-_', '+/');
    $plainText = base64_decode($base64);
    return $plainText;
}
$encodedPhoto = $attributes['eid/photo'];
$photo = base64url_decode($encodedPhoto);
$_SESSION['photo'] = $photo;
The images are both perfectly visible on index.php (<?php echo '<img src="photo.php"/>'; ?>) as well as on photo.php.
I've read up on a few similar topics and tried the following methods:
cURL
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://my-url.com/photo.php');
$fp = fopen('./photo/' . $filename . '.jpg', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
file_put_contents
$input = "my-url.com/photo.php";
$output = './photo/' . $filename . '.jpg';
file_put_contents($output, file_get_contents($input));
copy
Even tried a basic copy:
copy( "http://my-url/photo.php", './photo/' . $filename . '.jpg');
All three methods create an empty .jpg file in the directory I want them in.
Let me know if I need to provide any extra code.
Hope there's someone who can point out my mistakes.
Finally found a solution.
I decode the base64url only once with:
function base64url_decodeOnce($base64url) {
    $base64 = strtr($base64url, '-_', '+/');
    return $base64;
}
That way I can use the $base64 output for:

$data = 'data:image/jpeg;base64,' . $base64;
list($type, $data) = explode(';', $data);
list(, $data) = explode(',', $data);
$data = base64_decode($data);
file_put_contents('./photo/' . $filename . '.jpg', $data);
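Worth noting why the three fetch methods failed: cURL, file_get_contents, and copy each open a new HTTP request to photo.php without the browser's session cookie, so that request starts a fresh session in which $_SESSION['photo'] is empty. Since index.php already holds the decoded bytes in the session, a sketch like this (assuming the session is active and $filename is defined) skips HTTP entirely:

session_start();

// Write the already-decoded JPEG bytes straight to disk,
// instead of re-requesting them from photo.php.
if (!empty($_SESSION['photo'])) {
    file_put_contents('./photo/' . $filename . '.jpg', $_SESSION['photo']);
}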

How can PHP know if the image has been fully loaded?

I wrote a PHP script which simply takes URLs for images and tries to download/save them on the server.
My problem is that sometimes the image is not fully loaded and the code below only saves images partially. I did some research but couldn't figure out whether I can add something to it so it can check whether it is saving the full image or not.
Image sizes and other properties are random, so I can't check against those factors unless there is a way to get that info before downloading the image.
Thank you all.
if ($leng >= 5) {
    define('UPLOAD_DIR', dirname(__FILE__) . '/files/');
    $length = 512000;
    $handle = fopen($url, 'rb');
    $filename = UPLOAD_DIR . substr(strrchr($url, '/'), 1);
    $write = fopen($filename, 'w');
    while (!feof($handle)) {
        $buffer = fread($handle, $length);
        fwrite($write, $buffer);
    }
    fclose($handle);
    fclose($write);
} else {
    echo "failed";
}
I think that using cURL is a better solution than using fopen on a URL. Check this out:
$file = fopen($filename, 'wb'); //Write it as a binary file - like image
$c = curl_init($url);
curl_setopt($c, CURLOPT_FILE, $file);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true); //Allow cURL to follow redirects (but take note that this does not work with safemode enabled)
curl_exec($c);
$httpCode = curl_getinfo($c, CURLINFO_HTTP_CODE); //200 means OK, you may check it later, just for sure
curl_close($c);
fclose($file);
Partially based on Downloading a large file using curl
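If you want to verify completeness rather than just success, one option (a sketch, and only reliable when the server actually sends a Content-Length header) is to compare the bytes written against the advertised length:

$file = fopen($filename, 'wb');
$c = curl_init($url);
curl_setopt($c, CURLOPT_FILE, $file);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true);
$ok = curl_exec($c);

$httpCode = curl_getinfo($c, CURLINFO_HTTP_CODE);
$expected = curl_getinfo($c, CURLINFO_CONTENT_LENGTH_DOWNLOAD); // -1 when no Content-Length was sent
$received = curl_getinfo($c, CURLINFO_SIZE_DOWNLOAD);

curl_close($c);
fclose($file);

// Complete only if cURL succeeded, the server said 200,
// and the byte counts match when a length was advertised.
$complete = $ok && $httpCode === 200 && ($expected < 0 || $received >= $expected);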

How to make sure the file was saved when using cURL

<?php
$source = 'http://www.xxx.com/1.jpg';
$fileBody = date('YmdHis') . rand(1000, 9999);
$extension = pathinfo($source, PATHINFO_EXTENSION);
$fileName = $fileBody . '.' . $extension;
$ch = curl_init($source);
$fp = fopen($path . $fileName, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
clearstatcache();
return $fileName;
This is how I grab an image from the internet. The image saves successfully and I return the file name so Ajax can immediately build a thumbnail, but sometimes PHP returns $fileName while the download is still in progress, so the JavaScript shows an empty image on the page. How can I respond only after the file has actually been downloaded?
curl_exec returns true on success and false on failure. Use that information.
Also, you can check curl_getinfo to make sure the transfer completed successfully and was not empty. (You get http_code there, for example, as well as content_type and size_download.)
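Applied to your code, an untested sketch would look like this: only return $fileName once curl_exec has come back true and the transfer info looks sane, otherwise discard the partial file.

$ch = curl_init($source);
$fp = fopen($path . $fileName, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

$ok = curl_exec($ch);      // false on failure
$info = curl_getinfo($ch); // http_code, content_type, size_download, ...

curl_close($ch);
fclose($fp);               // make sure every byte is flushed to disk first

if ($ok && $info['http_code'] === 200 && $info['size_download'] > 0) {
    return $fileName;      // the file is fully written at this point
}

unlink($path . $fileName); // remove the empty/partial file
return false;

That way the Ajax side never sees a file name for a half-written image.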
$downComplete = false;
while (!$downComplete) {
    if (!file_exists($filePath)) {
        sleep(1);
    } else {
        $downComplete = true;
        break;
    }
}
Hi, the above is an idea for checking whether the file has been completely downloaded; I think it's useful. The main idea is to keep checking the saved file until the download has finished; then you can display the image on the front end. You can add it after your cURL code. I didn't run it myself, so it's just an idea.
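Note that an existence check alone can't tell a partial file from a finished one. A variant of the same polling idea (equally a sketch, untested) waits until the file size stops growing:

$lastSize = -1;
while (true) {
    clearstatcache(true, $filePath); // filesize() results are cached otherwise
    if (file_exists($filePath)) {
        $size = filesize($filePath);
        if ($size > 0 && $size === $lastSize) {
            break; // size was stable for a full second; assume the download finished
        }
        $lastSize = $size;
    }
    sleep(1);
}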

Download multiple images from remote server with PHP (a LOT of images)

I am trying to download lots of files from an external server (approx. 3700 images). These images range from 30 KB to 200 KB each.
When I use the copy() function on one image, it works. When I use it in a loop, all I get are 30-byte files (empty image files).
I tried copy, cURL, wget, and file_get_contents. Every time, I either get a lot of empty files or nothing at all.
Here are the codes I tried:
wget:
exec('wget http://mediaserver.centris.ca/media.ashx?id=ADD4B9DD110633DDDB2C5A2D10&t=pi&f=I -O SIA/8605283.jpg');
copy:
if (copy($donnees['PhotoURL'], $filetocheck)) {
    echo 'Photo '.$filetocheck.' updated<br/>';
}
cURL:
$ch = curl_init();
$source = $data['PhotoURL'];
curl_setopt($ch, CURLOPT_URL, $source);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);

$destination = $newfile;
$file = fopen($destination, "w+");
fputs($file, $data);
fclose($file);
Nothing seems to be working properly. Unfortunately, I don't have much choice to download all these files at once, and I need a way to make it work as soon as possible.
Thanks a lot, Antoine
Getting them one by one might be quite slow. Consider splitting them into packs of 20-50 images and grabbing each pack with parallel cURL transfers (curl_multi). Here's the code to get you started:
// Assumes $targets is an array of image URLs for this pack
// and $tc = count($targets).
$chs = array();
$cmh = curl_multi_init();
for ($t = 0; $t < $tc; $t++) {
    $chs[$t] = curl_init();
    curl_setopt($chs[$t], CURLOPT_URL, $targets[$t]);
    curl_setopt($chs[$t], CURLOPT_RETURNTRANSFER, 1);
    curl_multi_add_handle($cmh, $chs[$t]);
}

$running = null;
do {
    curl_multi_exec($cmh, $running);
} while ($running > 0);

for ($t = 0; $t < $tc; $t++) {
    $path_to_file = 'your logic for file path';
    file_put_contents($path_to_file, curl_multi_getcontent($chs[$t]));
    curl_multi_remove_handle($cmh, $chs[$t]);
    curl_close($chs[$t]);
}
curl_multi_close($cmh);
I used that approach to grab a few million images recently; one by one would have taken up to a month.
How many images you grab at once should depend on their expected size and your memory limits.
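A sketch of the batching idea (the pack size of 40 and the $allUrls variable are illustrative assumptions, not from the answer):

// Split all ~3700 URLs into packs and run one curl_multi
// round per pack to keep memory usage bounded.
$packSize = 40; // somewhere in the suggested 20-50 range

foreach (array_chunk($allUrls, $packSize) as $targets) {
    $tc = count($targets);
    // ... run the curl_multi code above on $targets ...
}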
I used this function for that and it worked pretty well.
function saveImage($urlImage, $title) {
    $fullpath = '../destination/' . $title;
    $ch = curl_init($urlImage);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $rawdata = curl_exec($ch);
    curl_close($ch);

    if (file_exists($fullpath)) {
        unlink($fullpath);
    }
    $fp = fopen($fullpath, 'x');
    $r = fwrite($fp, $rawdata);
    setMemoryLimit($fullpath);
    fclose($fp);

    return $r;
}
Combined with this other one to prevent memory overflow:
function setMemoryLimit($filename) {
    set_time_limit(50);
    $maxMemoryUsage = 258;
    $width = 0;
    $height = 0;
    $size = (int) ini_get('memory_limit'); // assumes a limit expressed in megabytes, e.g. "128M"
    list($width, $height) = getimagesize($filename);
    $size = $size + floor(($width * $height * 4 * 1.5 + 1048576) / 1048576);
    if ($size > $maxMemoryUsage) {
        $size = $maxMemoryUsage;
    }
    ini_set('memory_limit', $size . 'M');
}
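Usage would be something like this (a sketch; the $images url => title shape is an assumption):

// Hypothetical driver loop over the ~3700 images.
foreach ($images as $urlImage => $title) {
    if (saveImage($urlImage, $title) === false) {
        echo "Failed: $urlImage\n";
    }
}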

Need PHP script to download a file on a remote server and save it locally

Trying to download a file from a remote server and save it to a local subdirectory.
The following code seems to work for small files (< 1 MB), but larger files just time out and don't even begin to download.
<?php
$source = "http://someurl.com/afile.zip";
$destination = "/asubfolder/afile.zip";
$data = file_get_contents($source);
$file = fopen($destination, "w+");
fputs($file, $data);
fclose($file);
?>
Any suggestions on how to download larger files without interruption?
$ch = curl_init();
$source = "http://someurl.com/afile.zip";
curl_setopt($ch, CURLOPT_URL, $source);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);

$destination = "/asubfolder/afile.zip";
$file = fopen($destination, "w+");
fputs($file, $data);
fclose($file);
file_get_contents shouldn't be used for big binary files because you can easily hit PHP's memory limit. I would exec() wget instead, telling it the URL and the desired output filename (with the shell arguments escaped):

exec('wget ' . escapeshellarg($url) . ' -O ' . escapeshellarg($filename));
Since PHP 5.1.0, file_put_contents() supports writing piece-by-piece by passing a stream-handle as the $data parameter:
file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'r'));
I always use this code; it works very well.
<?php
define('BUFSIZ', 4095);
$url = 'Type The URL Of The File';
$rfile = fopen($url, 'r');
$lfile = fopen(basename($url), 'w');
while (!feof($rfile)) {
    fwrite($lfile, fread($rfile, BUFSIZ), BUFSIZ);
}
fclose($rfile);
fclose($lfile);
?>
Use this solution if you do not know the format of the file that you are going to download.

$url = 'http://www.sth.com/some_name.format';
$parse_url = parse_url($url);
$path_info = pathinfo($parse_url['path']);
$file_extension = $path_info['extension'];
$save_path = 'any/local/path/';
$file_name = 'name' . '.' . $file_extension;

file_put_contents($save_path . $file_name, fopen($url, 'r'));
Try out phpRFT: http://sourceforge.net/projects/phprft/files/latest/download?source=navbar
It has a progress bar and a simple filename detector.
A better and lighter script that streams the file:
<?php
$url = 'http://example.com/file.zip'; // Source absolute URL
$path = 'file.zip'; // Path & file name to save at the destination (here, beside the PHP script file)
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
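For very large files, a sketch like the following (the 30-second connect timeout is an arbitrary assumption) combines the streaming approach with explicit time limits so the script doesn't die mid-transfer:

<?php
set_time_limit(0); // let PHP run as long as the transfer needs

$url = 'http://someurl.com/afile.zip';
$destination = '/asubfolder/afile.zip';

$fp = fopen($destination, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // stream straight to disk, no memory spike
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // give up if no connection within 30 s
curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // 0 = no overall transfer time limit
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>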
