I am using the Twitter API to access some account details. It's been made clear that you can easily get the Twitter account image:
http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg
However, when I try to copy() this image in PHP to my local server, it doesn't work. It seems you can't even ping the address; it comes back with 'could not find host'.
How can I copy the twitter account image to my local server?
$twitsImg = 'http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg';
$twit = file_get_contents($twitsImg);
$new_img = basename($twitsImg);
file_put_contents($new_img, $twit);
The problem you are having is that the php.ini setting allow_url_fopen is turned off, as it should be. You need to use cURL to get the file instead; cURL is usually installed on web servers by default. This example works on my site:
<?php
function save_image($img, $path){
    $ch = curl_init($img);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    // Remove any existing file first, since fopen() in 'x' mode fails if the file already exists
    if(file_exists($path)){
        unlink($path);
    }
    $fp = fopen($path, 'x');
    fwrite($fp, $data);
    fclose($fp);
}
save_image("http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg","up/file2.jpg");
?>
Also make sure you have the right permissions (e.g. chmod 775) set on your destination folder.
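As a quick sanity check, you can also verify from PHP that the destination folder exists and is writable before calling save_image() (a minimal sketch; 'up' is just the example folder used above):
$dir = 'up';
if (!is_dir($dir)) {
    mkdir($dir, 0775, true); // create it with group-writable permissions
}
if (!is_writable($dir)) {
    die("Cannot write to '$dir' - check ownership and permissions.");
}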
Related
I have some code to convert a PHP page to HTML:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,"$dynamic");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($out, $file);
This works perfectly on localhost, but on the live site it either takes far too long or doesn't work at all.
I've also tried file_get_contents(), but that doesn't work either.
Note:
The http://website.net/home.php page is on the same site where the code is hosted.
curl is enabled and allow_url_fopen is On according to phpinfo(), both on localhost and on the server.
EDIT:
It works fine when using another website's page instead of my own.
The target page loads perfectly in my browser.
My site's pages load as fast as usual in the browser, but when I use curl or file_get_contents it is very slow and often returns no output at all.
I think you have a DNS resolution problem.
There are two ways to use your website's localhost instead of the external domain name.
1. If you have your own server / VPS / dedicated server, add an entry for 127.0.0.1 website.net to /etc/hosts, or try to fetch the content via localhost (127.0.0.1).
2. If you are using shared hosting, try the URL(s) below in your code:
http://localhost.mywebsite.net/~username/home.php.
OR
Try to call http://localhost.mywebsite.net/home.php
Try this to fetch the URL content:
$dynamic = "http://localhost.mywebsite.net/home.php"; // or whichever of the above URLs works for you
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if($file === false) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
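If you cannot edit /etc/hosts, a possible alternative (a sketch, assuming PHP 5.5+ with a reasonably recent libcurl) is to pin the hostname to 127.0.0.1 for this one request with CURLOPT_RESOLVE, so the public URL keeps working in your code:
$ch = curl_init("http://website.net/home.php");
// Force website.net:80 to resolve to 127.0.0.1 for this handle only
curl_setopt($ch, CURLOPT_RESOLVE, array("website.net:80:127.0.0.1"));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
curl_close($ch);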
To investigate why it is not working on the live server, try this:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if($file === false)
{
echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
I think it depends on the provider's SSL configuration.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
Check what you can find out about your provider's SSL configuration.
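For context, here is roughly where those options would slot into the request above (a sketch; note that CURLOPT_SSLVERSION set to 3 forces SSLv3, which most providers have disabled by now, so you may prefer to let curl negotiate the version and only relax verification while debugging):
$ch = curl_init("https://website.net/home.php"); // assuming the page is actually served over https
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Relax certificate checks for debugging only - not recommended in production
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
$file = curl_exec($ch);
if ($file === false) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);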
Looks like a network routing issue: basically, the server can't find a route to website.net (itself). This is a server issue, not a PHP issue. The quickest solution is to edit the hosts file on the server and point website.net at 127.0.0.1.
On Linux servers, add the following line to the bottom of /etc/hosts:
127.0.0.1 website.net
Alternatively, you can try fetching http://127.0.0.1/home.php, but that will not work if you have multiple virtual hosts on the server.
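If you do have multiple virtual hosts, a workaround (a minimal sketch) is to request 127.0.0.1 directly but send the real hostname in the Host header so the right virtual host answers:
$ch = curl_init("http://127.0.0.1/home.php");
// Tell the web server which virtual host we actually want
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Host: website.net'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
curl_close($ch);
file_put_contents("home.html", $file);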
Try this one:
file_put_contents("home.html", fopen("http://website.net/home.php", 'r'));
but curl should work as well. If you have SSH access to the server, try to resolve your domain name with ping or a similar tool. It is possible that you have a local DNS issue, which would explain why you can download from external domains but not from your own.
For my example you'll need allow_url_fopen to be On, so please verify that with phpinfo() first.
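If you don't have SSH access, you can do the same DNS check from PHP itself (a small sketch; gethostbyname() returns the unresolved name unchanged when the lookup fails):
$ip = gethostbyname('website.net');
if ($ip === 'website.net') {
    echo "DNS lookup failed on this server";
} else {
    echo "website.net resolves to $ip";
}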
I'm trying to save some pictures from a URL to my server, but I'm not able to do it.
This is my code (it's standard; I found it on the internet):
$ch = curl_init($url);
$fp = fopen($img, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
But for every link I put in the $url variable, a 26-byte image is saved on my server (which is not the original image).
What's wrong?
I can successfully download the image using your curl code. It may be that your server is not allowed to connect to outside links.
This is an equivalent of the curl code that downloads images as well; I suspect you will not be able to download the image from your server with this code either:
file_put_contents("img.jpg", file_get_contents("http://www.letteratu.it/wp-content/uploads/2014/03/cielo.jpg"));
Run your curl in verbose mode to see curl's debug messages, and show them to us:
curl_setopt($ch, CURLOPT_VERBOSE, 1);
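Since the request runs inside a web page, it may help to capture the verbose output to a file instead of stderr (a sketch using CURLOPT_STDERR; the log path is just an example):
// Set these before the curl_exec() call in your script
$log = fopen('/tmp/curl_debug.log', 'w'); // example log location
curl_setopt($ch, CURLOPT_VERBOSE, 1);
curl_setopt($ch, CURLOPT_STDERR, $log);
// ... curl_exec($ch) runs here ...
fclose($log);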
I'm pretty sure you need to include the http:// in the URL. I'm fairly certain it thinks that it is a local file without it (i.e., an implicit file://).
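If you want to guard against that, here is a small sketch that prepends a scheme when one is missing (ensure_scheme() is just an illustrative helper name):
function ensure_scheme($url) {
    // parse_url() returns no 'scheme' key for URLs like "www.example.com/img.png"
    $parts = parse_url($url);
    if ($parts === false || empty($parts['scheme'])) {
        $url = 'http://' . $url;
    }
    return $url;
}
$url = ensure_scheme($url); // then pass it to curl_init()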
I have a PHP curl script which downloads an image from a specified URL.
Save image from url with curl PHP
function grabImage($url, $saveto){
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $raw = curl_exec($ch);
    curl_close($ch);
    if(file_exists($saveto)){
        unlink($saveto);
    }
    $fp = fopen($saveto, 'x');
    fwrite($fp, $raw);
    fclose($fp);
}
grabImage("www.example.com/image/2.png", "/var/www/site/images/image_2.png");
When a person visits my site, an image is normally served from my server; however, if the image is unavailable, the script uses curl to download it from my supplier's server. I would rather have all images served from my own server directly, as I do not want to put any strain on my supplier's server.
The script works fine when run over SSH, e.g. /usr/bin/php /var/www/site/image_s.php, but when loaded in a browser it does not download the image.
e.g. www.example.com/image_s.php
What do I need to do to make sure that a browser load is enough to download the image?
Looks like a web server permissions error. When you run it on the console, you are running as the user you logged in as; this is not the same as the user that serves the website content (it could be 'nobody').
I did some more searching on the web, and it turns out that you need to change the folder permissions; by default, the web server user cannot write into the folder.
On Linux you can change the permissions with the following (a directory needs the execute bit to be written into, so use 775 rather than 666):
chmod 775 /var/www/site/folder_name
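To confirm which user the script actually runs as in each case, and whether that user can write to the target directory, you could drop something like this into the script (a sketch; it assumes the posix extension is available, and the directory path is the one from the question):
// Requires the posix extension; skipped gracefully if it is missing
if (function_exists('posix_geteuid')) {
    $user = posix_getpwuid(posix_geteuid());
    echo "Running as: " . $user['name'] . "\n";
}
$dir = '/var/www/site/images';
echo is_writable($dir) ? "$dir is writable\n" : "$dir is NOT writable\n";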
I am trying to copy a file from a URL to a folder on my system using PHP.
<?php
$image = "http://l.yimg.com/t/frontpage/10-facts-slideshow-080512-630-01.jpg";
if (!copy($image, "localfolder/mainPic1.png")) {
    echo "FAILED";
} else {
    echo "DONE";
}
It always ends up with "FAILED".
All permissions are set on the local folder.
I checked phpinfo() and found that allow_url_fopen = Off.
As it is a shared server, I do not have any control over the PHP settings. Can you please suggest an alternative?
Thanks,
Jay
If allow_url_fopen is off and you cannot enable it, you will need to use curl to retrieve the file and save it locally.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://l.yimg.com/t/frontpage/10-facts-slideshow-080512-630-01.jpg");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$imgdata = curl_exec($ch);
curl_close($ch);
// Save to disk
file_put_contents("localfolder/mainPic1.png", $imgdata);
Yes, allow_url_fopen has to be On for this:
allow_url_fopen = On
Even on a shared server you may be able to override this: create a php.ini file containing allow_url_fopen = On in the root folder. It generally overrides the default settings.
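If you are not sure whether the override took effect, you can check at runtime and fall back to curl when needed (a small sketch, reusing the $image URL from the question):
if (ini_get('allow_url_fopen')) {
    // Stream wrappers are available, so copy() works with URLs
    copy($image, "localfolder/mainPic1.png");
} else {
    // Fall back to curl as shown above
    $ch = curl_init($image);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    file_put_contents("localfolder/mainPic1.png", $data);
}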
I post pictures from other websites, and I would rather host those on my own server in case their server dies all of a sudden. Say the file is located at "www.www.www/image.gif"; how would I copy it to my "images" directory safely?
I write in PHP.
Thanks!
The following should work:
// requires allow_url_fopen
$image = file_get_contents('http://www.url.com/image.jpg');
file_put_contents('/images/image.jpg', $image);
or the cURL route:
$ch = curl_init('http://www.url.com/image.jpg');
$fp = fopen('/images/image.jpg', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
If your server is configured to allow http:// file paths (allow_url_fopen), you could use file_get_contents().
If that doesn't work, the next simplest way is to use curl, for which you will certainly find full-fledged download scripts.
Some servers you pull images from may require a User-agent that shows that you are a regular browser. There is a ready-made class in the User Contributed Notes to curl that handles that, and provides a simple DownloadFile() function.
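Without pulling in a full class, you can set a browser-like User-Agent on the curl handle directly (a minimal sketch reusing the example URL above; the UA string is just an example):
$ch = curl_init('http://www.url.com/image.jpg');
// Some hosts reject requests that carry no User-Agent header
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
file_put_contents('/images/image.jpg', $data);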