copy() not working from URL to folder in PHP

I am trying to copy a file from a URL to a folder on my system using PHP:
<?php
$image = "http://l.yimg.com/t/frontpage/10-facts-slideshow-080512-630-01.jpg";
if (!copy($image, "localfolder/mainPic1.png")) {
    echo "FAILED";
} else {
    echo "DONE";
}
It always ends up with "FAILED".
All permissions are set on localfolder.
I checked phpinfo() and found that allow_url_fopen = Off.
As it is a shared server I do not have any control over the PHP settings. Please can you suggest an alternative?
Thanks,
Jay

If allow_url_fopen is off and you cannot enable it, you will need to use cURL to retrieve the file and save it locally:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://l.yimg.com/t/frontpage/10-facts-slideshow-080512-630-01.jpg");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the image data instead of printing it
$imgdata = curl_exec($ch);
if ($imgdata === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
// Save to disk
file_put_contents("localfolder/mainPic1.png", $imgdata);

Yes, allow_url_fopen has to be On for this:
allow_url_fopen = On
Even if you are not allowed to modify the server-wide php.ini on your shared server, you can usually create your own php.ini containing allow_url_fopen = On in the root folder. It generally overrides the default settings.
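If you try the per-directory php.ini route, it is worth confirming in code whether the override actually took effect before relying on copy(). A minimal sketch, reusing the URL and path from the question:
<?php
// Check whether allow_url_fopen is actually On for this script.
if (filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN)) {
    $image = "http://l.yimg.com/t/frontpage/10-facts-slideshow-080512-630-01.jpg";
    echo copy($image, "localfolder/mainPic1.png") ? "DONE" : "FAILED";
} else {
    echo "allow_url_fopen is still Off here - use the cURL approach instead";
}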

Related

Can't load file from URL, but works with local path

I am trying to load an XML file, but for some reason I can't use a URL.
allow_url_fopen is set to On.
I used this, which didn't work:
$xml = simplexml_load_file("https://example.com/test.xml");
But this does work:
$xml = simplexml_load_file("../test.xml");
I also tried file_get_contents:
$xml = file_get_contents("https://example.com/test.xml");
and even cURL:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://example.com/test.xml");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$xml = curl_exec($ch);
curl_close($ch);
For each example I am displaying the output with var_dump($xml);, which always shows bool(false) when using a URL and shows the correct XML data when using a local path.
I would just use the local path, but this is just for a test and the actual XML file I am going to fetch is from a different site.
I tested it on a local server and it worked, but on my hosted server it doesn't work at all. They are using the same php.ini. My hosted server uses php-fpm; I'm unsure whether that could be a factor, or whether something has to be changed in it.
This might be a configuration error that I am unaware of, because the code should work, and the link definitely exists: I can click on it and see the correct XML in the browser.
What could be causing this issue?
It turns out that I had my openssl.cafile property set to a location that didn't exist.
For some reason no errors were shown when using the commands above.
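For anyone hitting the same silent failure, a small check like the following can confirm whether openssl.cafile points at a real file and surface the suppressed stream warning (the URL is just the placeholder from the question):
<?php
// A bad openssl.cafile path fails without an obvious error, so verify it exists.
$cafile = (string) ini_get('openssl.cafile');
var_dump($cafile, $cafile !== '' && file_exists($cafile));

// Retry the fetch and print the underlying warning instead of just bool(false).
$xml = @file_get_contents("https://example.com/test.xml");
if ($xml === false) {
    var_dump(error_get_last()); // e.g. "certificate verify failed" or a DNS error
}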
I think the specified URL is not valid or doesn't exist.
Read about cURL in the PHP docs:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
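That snippet echoes the response straight to the output; for the XML case in this question you would more likely capture the body and parse it, roughly like this (a sketch using the question's placeholder URL):
<?php
$ch = curl_init("https://example.com/test.xml");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // capture the body instead of echoing it
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch);
} else {
    $xml = simplexml_load_string($body); // parse the downloaded XML
    var_dump($xml);
}
curl_close($ch);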

CURL & PHP Web Scraping - cainfo

I am trying to scrape some information from the Steam Community website, but I get a certificate-related error when I try to access the URL through cURL.
I downloaded cacert.pem and modified the php.ini file to include:
[curl]
; A default value for the CURLOPT_CAINFO option. This is required to be an
; absolute path.
curl.cainfo = "D:\xampp\php\caextract.pem.txt"
PHP file:
$url = 'https://steamcommunity.com/search/?text=' . $PlayerName . '&x=0&y=0';
$ch = curl_init(); // Initialising cURL
curl_setopt($ch, CURLOPT_URL, $url); // Setting cURL's URL option with the $url variable passed into the function
curl_setopt($ch, CURLOPT_RETURNTRANSFER, FALSE); // Setting cURL's option to return the webpage data
$html = curl_exec($ch); // Executing the cURL request and assigning the returned data to the $data variable
var_dump(curl_getinfo($ch ));
var_dump(curl_errno($ch) );
var_dump(curl_error($ch));
curl_close($ch); // Closing cURL
Setup:
XAMPP 3.2.2 (Default settings)
Windows 10
Chrome
Error:
error setting certificate verify locations: CAfile: D:\xampp\htdocs\steaminfo\cacert.pem CApath: none
I would not rely too much on such settings in php.ini, as they may be overridden by: 1) a php.ini at another level; 2) .htaccess in any parent directory.
The only way to ensure your settings actually take effect is to place/run phpinfo() in the same directory as your script.
However, there is another, simpler way: set the respective cURL option with curl_setopt() in your script:
...
curl_setopt($ch, CURLOPT_RETURNTRANSFER, FALSE);
curl_setopt($ch, CURLOPT_CAINFO, 'D:\xampp\php\caextract.pem.txt'); // single quotes so the backslashes are not treated as escape sequences
...
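Assembled with the rest of the question's script, that would look roughly like the sketch below; the .pem path is the one from the question and may need adjusting, $PlayerName is just a placeholder here, and urlencode() is an addition for safety:
<?php
$PlayerName = 'SomePlayer'; // placeholder value for the sketch
$url = 'https://steamcommunity.com/search/?text=' . urlencode($PlayerName) . '&x=0&y=0';

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the HTML instead of printing it
curl_setopt($ch, CURLOPT_CAINFO, 'D:\xampp\php\caextract.pem.txt'); // CA bundle path from the question
$html = curl_exec($ch);
if ($html === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);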
Is it necessary to verify the SSL certificate? There are options (CURLOPT_SSL_VERIFYPEER and CURLOPT_SSL_VERIFYHOST) you can set so that cURL does not verify it.
Alternatively, you can try sending the request with http:// at the beginning of the URL instead.
Since you are only scraping, you don't actually need to verify the SSL certificate; it will only slow you down. Plus, you aren't logging in anywhere, so my advice would be to prevent cURL from checking the SSL certificate.
All you have to do is add:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
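If you do go that route, be aware it disables certificate checking entirely, so it is only appropriate for local experiments. A sketch of both related options, continuing with the same $ch handle:
// Insecure shortcut: skips certificate and hostname validation, so never use it
// for requests that carry credentials or private data.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);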

How to fetch the url content using php curl or file_get_content on the same server?

I have some code to convert a PHP page to HTML:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,"$dynamic");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($out, $file);
This works perfectly on localhost, but it takes too much time / doesn't work on the live site.
I've also tried file_get_contents, but that doesn't work either.
Note:
The http://website.net/home.php page is on the same site where the code is hosted.
cURL is enabled and allow_url_fopen is On according to phpinfo() both on localhost and on the server.
EDIT:
It works fine when using another website's page instead of my own.
The site's target page loads perfectly in my browser.
The web pages of my website load as fast as usual, but when I use cURL or file_get_contents it is too slow and often I get no output at all.
I think you have a DNS resolution problem.
There are two ways to use your website's localhost instead of the external domain name.
1. If you have your own server / VPS / dedicated server:
Add an entry in /etc/hosts for 127.0.0.1 website.net, or try to fetch the content via localhost (127.0.0.1).
2. If you are using shared hosting, then try the below URL(s) in your code:
http://localhost.mywebsite.net/~username/home.php.
OR
Try to call http://localhost.mywebsite.net/home.php
Try this to fetch the URL content:
$dynamic = TRY_ABOVE_URL(S)
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if ($file === false) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
To investigate why it is not working on the live server, try this:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if($file === false)
{
echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
I think it depends on the provider's SSL configuration.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
Try to find out what you can about your provider's SSL configuration.
It looks like a network routing issue: basically, the server can't find a route to website.net (itself). This is a server issue, not a PHP issue. The quickest solution is to edit the hosts file on the server and point website.net to 127.0.0.1.
On Linux servers you will need to add the following line to the bottom of /etc/hosts:
127.0.0.1 website.net
Alternatively you can try fetching http://127.0.0.1/home.php, but that will not work if you have multiple virtual hosts on the server.
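If editing the hosts file is not possible, or virtual hosts get in the way, a workaround is to request 127.0.0.1 directly but send the site's Host header so the right virtual host answers. A sketch using the domain and page from the question:
<?php
$ch = curl_init('http://127.0.0.1/home.php');
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Host: website.net')); // pick the right virtual host
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$file = curl_exec($ch);
if ($file === false) {
    echo 'Curl error: ' . curl_error($ch);
} else {
    file_put_contents('home.html', $file);
}
curl_close($ch);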
Try this one:
file_put_contents("home.html", fopen("http://website.net/home.php", 'r'));
but cURL should work as well. If you have SSH access to the server, try to resolve your domain name using ping or a similar tool. It is possible that there is a local DNS issue, which would explain why you can download from external domains.
For my example you'll need allow_url_fopen to be On, so please verify that with phpinfo() first.
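If you don't have SSH access, the same DNS check can be done from PHP itself; gethostbyname() returns the hostname unchanged when resolution fails (domain taken from the question):
<?php
$host = 'website.net';
$ip = gethostbyname($host);
if ($ip === $host) {
    echo "DNS resolution for $host failed on this server";
} else {
    echo "$host resolves to $ip";
}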

php file_get_contents returns null when allow_url_fopen is on

I get the warning message: file_get_contents failed to open stream: permission denied.
I have set all_url_open to on in the php.ini file.
My PHP file is on my Apache server and it is trying to access a URL (that returns JSON) from a Tomcat server on the same machine.
The code in the php file looks like:
$srcURL = 'http://samemachine:8080/returnjson/';
$results = file_get_contents($srcURL);
I have also tried cURL; it returns nothing and doesn't hit the Tomcat server either:
function curl($url){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Language: en-us'));
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
Why is permission denied?
Make sure it's really allow_url_fopen and not all_url_open as you wrote.
Double-check this by running phpinfo() and make sure it lists the option as enabled. If it doesn't, restart your web server.
To debug your cURL problem, run Wireshark and watch the lo interface.
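Before reaching for Wireshark, cURL itself can report what went wrong. A sketch of the question's helper with error reporting and a verbose log added (the function name is just for illustration):
<?php
function curl_debug($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_VERBOSE, true);  // log the full request/response dialogue
    $log = fopen('php://temp', 'w+');
    curl_setopt($ch, CURLOPT_STDERR, $log);   // send that log to a stream we can read back
    $data = curl_exec($ch);
    if ($data === false) {
        echo 'cURL error: ' . curl_error($ch) . "\n";
    }
    rewind($log);
    echo stream_get_contents($log);           // shows DNS, connect and TLS details
    fclose($log);
    curl_close($ch);
    return $data;
}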

How to Copy twitter account image to server?

I am using the twitter api to access some account details. It's been made clear that you can easily get the twitter account image:
http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg
However, when I try to copy() this image to my local server with PHP, it doesn't work. It seems you can't even ping the address; it reports 'could not find host'.
How can I copy the twitter account image to my local server?
$twitsImg ='http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg';
$twit=file_get_contents($twitsImg);
$new_img = basename($twitsImg);
file_put_contents($new_img,$twit);
The problem you are having is that the php.ini variable allow_url_fopen is turned off, as it should be. You need to use cURL to get the file. cURL is usually installed on web servers by default. This example works on my site:
<?php
function save_image($img, $path){
    $ch = curl_init($img);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    if (file_exists($path)) {
        unlink($path); // remove any old copy first, since mode 'x' refuses to overwrite
    }
    $fp = fopen($path, 'x');
    fwrite($fp, $data);
    fclose($fp);
}
save_image("http://a0.twimg.com/profile_images/1260994338/P3030586-2_bigger.jpg", "up/file2.jpg");
?>
Also make sure you have the right CHMOD (775) set on your destination folder.
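A quick way to confirm the destination folder is actually writable before blaming cURL (the path is the one from the example above):
<?php
$dir = 'up';
if (!is_writable($dir)) {
    echo "Cannot write to $dir - check ownership and permissions (e.g. chmod 775)";
}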
