Save image from webpage using PHP

I know my question is quite similar to this question and I'm using the answer to the problem as my solution but I can't get it to work.
I'm using Immediatenet.com to generate a thumbnail of any given URL (example: google). And then using cURL to save the image in a folder on my server.
I'm not sure if the problem lies in the fact that the URL isn't actually an image URL (i.e. example.com/image.jpg) but an .html/.php URL, or that I'm simply not doing it right. I've never used cURL before, so it's distinctly possible that I'm not doing it right.
I'm saving the path to the image into a MySQL database which works just fine. However, the image doesn't actually save in the folder specified. The permissions are set correctly on the folder, so I don't think it's a permissions problem. I also set allow_url_fopen to On just to make sure.
My code is below. Can anyone point me in the right direction for this one?
$url = $_POST['url'];
$url_new = 'http://immediatenet.com/t/m?Size=1024x768&URL='.$url;
$img = "/users/images/".$url.".jpg";
$ch = curl_init($url_new);
$fp = fopen($img, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);

If $url contains forward slashes (as most URLs do), these will be treated as directory delimiters in the $img pathname. You either need to replace the slashes with some other character, or create all the necessary directories.
$img = "/users/images/".str_replace('/', '_', $url).".jpg";

Are you sure $img = "/users/images/".$url.".jpg"; is valid? Do you really have a users folder in your root?
If not, please correct that path and it should be ok.
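Putting the two answers together, a minimal sketch of the download step might look like the following (assuming /users/images really exists on the server and is writable; the error check on fopen() is an addition, not part of the original code):
$url = $_POST['url'];
$url_new = 'http://immediatenet.com/t/m?Size=1024x768&URL='.$url;
// Slashes in the URL would otherwise be treated as directory separators.
$img = "/users/images/".str_replace('/', '_', $url).".jpg";
$fp = fopen($img, 'wb');
if ($fp === false) {
    die("Could not open $img for writing - check the path and permissions.");
}
$ch = curl_init($url_new);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response body straight into the file
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);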

Related

Can't copy images from Tumblr to my site, connection timed out error

Tumblr image example:
https://69.media.tumblr.com/ec3874e541a767ab578495b762201a53/tumblr_ph9je0Akck1r89eh5o1_1280.jpg
Code:
<form method="get">
<input type="text" name="url"/>
<input type="submit" value=" загрузить "/>
</form>
<?php
$name = md5(date('Y-m-d H:i:s').rand(0, 1000));
$folder = 'upload/';
$source = ($_GET['url']);
$dest = $folder.$name.'.png';
copy($source, $dest);
echo 'http://mysite.example/'.$folder.$name.'.png';
?>
I found this in another question on this site:
If you can view the image in a browser that is on the same machine as your program, then it might be that the server won't send the picture unless you look like a user rather than a program. In that case, modifying the browser identification string might fix your problem.
If you cannot view the image from a browser running on the program's PC, you will need to look elsewhere for the source of your problem.
I think the problem I have is similar to this: Tumblr serves the picture to view in a browser, but doesn't allow copying it with a script.
How can I fix that? For example, sites like imgur can upload Tumblr images by URL without any problem.
P.S. Copying images from other sites with this script works fine.
Update 1:
As it turned out, the problem is with my site. When I run this code on another site, it works normally with Tumblr images. I have a free .ml domain and free hosting from Byethost. I have two guesses: the first is that my domain or hosting is blacklisted by Tumblr; the second is that some settings on my site are wrong. If the first guess is right, is there any way to make it work without changing the domain or hosting? If the second is true, what settings should I check and change?
Tumblr appears to be inspecting the HTTP request and generating different responses depending on how you request it. Your code is fine, as you know, for most sites; when I run it as-is, I get a 403 denied error.
Changing the code to use cURL instead allowed me to download your file. My guess is that the default headers used by PHP's copy() are being blocked.
<?php
function grab_image($url, $saveto){
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $raw = curl_exec($ch);
    curl_close($ch);
    if (file_exists($saveto)) {
        unlink($saveto); // remove any previous file so fopen() with 'x' can create it
    }
    $fp = fopen($saveto, 'x');
    fwrite($fp, $raw);
    fclose($fp);
}
$name = md5(date('Y-m-d H:i:s').rand(0, 1000));
$folder = 'upload/';
$source = ($_GET['url']);
$dest = $folder.$name.'.png';
grab_image($source, $dest);
The grab_image() function was a SO answer for another question.
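If the block really is header-based, forcing a browser-like User-Agent is worth trying. This is only a sketch under that assumption; which header Tumblr actually checks is not confirmed:
// Assumption: Tumblr rejects requests without a browser-like User-Agent.
// Added to the cURL version inside grab_image():
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Linux x86_64)');
// Or, staying with copy(), pass a stream context that sends the same header:
$context = stream_context_create([
    'http' => ['header' => "User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n"],
]);
copy($source, $dest, $context);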

How to download all images from a website?

I want to download multiple images from the following website:
www.bbc.co.uk
I want to do it by using PHP cURL, can someone help lead me in the right direction?
It would be nice to download all the images in one shot, but if someone can help me download just one, or a handful, that would be great!
Edit: it would be a good idea to show what I have tried:
<?php
$image_url = "www.bbc.co.uk";
$ch = curl_init();
$timeout = 0;
curl_setopt ($ch, CURLOPT_URL, $image_url);
curl_setopt ($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
// Getting binary data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$image = curl_exec($ch);
curl_close($ch);
// output to browser
header("Content-type: image/jpeg");
print $image;
?>
For some reason it is not working. It should be noted that I am an absolute amateur at PHP and programming in general.
The above code you pasted isn't doing what you think it is.
$image = curl_exec($ch);
The $image variable doesn't contain any image, it actually contains the entire HTML of that webpage as a string.
If you replace
// output to browser
header("Content-type: image/jpeg");
print $image;
with:
var_dump($image);
You will see the html.
Try to find the actual champion image sources in that HTML and parse them out accordingly; a sketch of that approach follows.
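A rough sketch of that parsing step is below; the page URL and the local images/ directory are assumptions, and fully relative image paths are skipped for brevity:
<?php
// Fetch a page, parse it for <img> tags, and download each image with cURL.
function fetch($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$page = fetch('http://www.bbc.co.uk');
$dom = new DOMDocument();
@$dom->loadHTML($page); // suppress warnings caused by sloppy real-world HTML
foreach ($dom->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    if (strpos($src, '//') === 0) {
        $src = 'http:' . $src; // resolve protocol-relative URLs
    }
    if (strpos($src, 'http') !== 0) {
        continue; // skip relative paths in this sketch
    }
    $path = parse_url($src, PHP_URL_PATH);
    if ($path === null || $path === '') {
        continue;
    }
    // Assumes an existing, writable images/ directory next to the script.
    file_put_contents('images/' . basename($path), fetch($src));
}
?>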
You just need the whole /Air/assets/images/champions folder, right?
Nothing could be easier: using Firefox and a download plugin like "Download Them All", open the FTP folder (Eambo's link) where the champions' pictures are located, right-click, and select the plugin.
It will list all the files; select them all, or just the ones you need, and start the download.
Also, if you own that game you can take a look in this path:
\League of Legends\rads\projects\lol_air_client\releases\<newest-version>\deploy\assets\images\champions
Since you will probably use that in your university, CHECK THIS and I hope you will find a solution.

How to get a remote image with iso-8859-2 characters in its name?

I'm saving and resizing tons of images from a remote server using cURL in PHP, but the previous developer didn't implement any system to make the user-uploaded pictures "safe", so users were able to upload any file format with any kind of name, and the uploaded file's name was saved to the database as-is. As a result, lots of files have non-URL-safe and iso-8859-2 characters in their names. For example:
gaght101125659_5 Gn-eŐs mtó gÁrlós.jpg
Based on this answer, I wrote some code for fetching the pictures.
private function getRemoteUrl($file)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $file);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}

$img = imagecreatefromstring($this->getRemoteUrl($url)); // $url contains the plain URL of the image.
This code works perfectly for images with a name like gaht123115516_2.jpg but for the example shown above it gives an error saying:
Warning: imagecreatefromstring() [function.imagecreatefromstring]: Data is not in a recognized format in /home/test/public_html/resizing.php on line 64
Of course, because of the fancy characters. So what should I do in getRemoteUrl($file) with the $file variable to make it work? Can this be done in PHP? If not what other methods/programming languages should I try?
Use urlencode() on the filename; don't urlencode() the entire URL.
URL-encode the specified filename, as sketched below.
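A sketch of how that might look; the helper method name is made up, and it assumes the stored filename is already in the byte encoding the remote server expects (otherwise an iconv() conversion would be needed first):
private function encodeImageUrl($url)
{
    $parts = explode('/', $url);
    $filename = array_pop($parts); // the last path segment, i.e. the image name
    $parts[] = rawurlencode($filename); // percent-encode spaces and accented characters
    return implode('/', $parts);
}

$img = imagecreatefromstring($this->getRemoteUrl($this->encodeImageUrl($url)));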

Saving thumbnails images to certain directory

I want to use thumbnails on my website, which is essentially a directory of websites.
I've been thinking of saving the URL thumbnails to a certain directory.
Example:
I'm going to use a free website-thumbnail service that gives me code to show a thumbnail image of any URL, as follows:
<img src='http://thumbnails_provider.com/code=MY_ID&url=ANY_SITE.COM'/>
This shows the thumbnail of ANY_SITE.COM.
I want to save the generated thumbnail image into a certain directory, my_site.com/thumbnails.
Why am I doing this?
My database table is like my_table {id, url, image}. I'm going to give each thumbnail a random name and store that new name in my_table together with its URL, so I can call it back anytime. I know how to do that part, but I don't know how to save the image into a certain directory.
Any help is appreciated, thanks.
Using cURL should work for you:
$file = 'the URL';
$ch = curl_init ($file);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER,1);
$rawdata=curl_exec($ch);
curl_close ($ch);
$fullpath = 'path to destination';
$fp = fopen($fullpath, 'wb'); // fopen() requires a mode; 'wb' creates the file for binary writing
fwrite($fp, $rawdata);
fclose($fp);
You could use cURL to fetch the remote image and save it with curl_setopt($handler, CURLOPT_FILE, $fp);, where $fp is a handle opened with fopen('/my/image/path/here.jpg', 'wb'). The id could be something simple like a hash of the original URL. Obviously you'd have to check to make sure the directories exist before you save the file (using is_dir() and creating them with mkdir() if they don't).
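A sketch of the whole step, assuming the thumbnail service URL from the question and a thumbnails/ directory next to the script:
$site = 'ANY_SITE.COM';
$thumb_url = 'http://thumbnails_provider.com/code=MY_ID&url='.$site;
$dir = __DIR__.'/thumbnails';
if (!is_dir($dir)) {
    mkdir($dir, 0755, true); // create the directory if it doesn't exist yet
}
$name = md5($site).'.jpg'; // simple deterministic name derived from the URL
$fp = fopen($dir.'/'.$name, 'wb');
$ch = curl_init($thumb_url);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the image straight into the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);
// $name is what you would store in my_table.image alongside the url.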

Using cURL to save external files to my Server

I have a website that shows open-source movies and videos.
I have saved the URLs in MySQL, and both the videos and the images are linked to the content server.
But users are complaining that the website is slow, since images are fetched from an external server, and most of the time Internet Explorer doesn't even display the image.
I just learned about cURL and would like to save the images as well as the videos to my own server and provide a mirror of the original website.
I've seen the " curl -O ('') ; " syntax suggested in many places for this task, but I don't know how to use it inside my PHP script.
In short:
I already have a form that saves the URL into MySQL. I want it to also save the file to a directory on my web server and store the file path in another column in MySQL.
Any sort of help is welcome.
Thanks in advance.
$local_file = "/tmp/filename.flv";//This is the file where we save the information
$remote_file = "http://www.test.com/filename.flv"; //Here is the file we are downloading
$fp = fopen($local_file, 'w+');
$ch = curl_init($remote_file);
curl_setopt($ch, CURLOPT_TIMEOUT, 50);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_ENCODING, "");
curl_exec($ch);
curl_close($ch);
fclose($fp);
I've decided to update this answer almost 7 years later.
For those who have copy() enabled for remote hosts, you can simply use:
copy("http://www.test.com/filename.flv", "/some/local/path/filename.flv");
