I am trying to retrieve a user's display picture through the Graph API and save it to disk with cURL, but I am unable to succeed and get this error when I try to check the MIME type of the picture I saved:
Notice: exif_imagetype(): Read error! in
//$userPpicture = $user_profile[picture];
// Build the Graph API picture URL and a local file name to save the image to
$url = "http://graph.facebook.com/{$userId}/picture?type=large";
$dpImage = 'temp/' . $userId . '_dpImage_' . rand().'.jpg';
echo $dpImage;
function get_data($url) {
$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$data = curl_exec($ch);
curl_close($ch);
return $data;
}
$returned_content = get_data($url);
file_put_contents($dpImage, $returned_content);
echo "Type: " . exif_imagetype($dpImage);
For the updated code, which uses curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);, I am getting this error:
Warning: curl_setopt(): CURLOPT_FOLLOWLOCATION cannot be activated when in safe_mode or an open_basedir is set in /var/fog/apps/app12345/myapp.phpfogapp.com/start.php on line 178
If this requires any server-side configuration then I might not be able to do it, as I am using shared cloud hosting on PHP Fog.
Kindly help me with this.
Thank you.
The graph URL you are using, http://graph.facebook.com/4/picture?type=large, returns an HTTP 302 redirect, not the actual user image. You need to follow the redirect and download the image at the URL it points to, which looks like this: http://profile.ak.fbcdn.net/hprofile-ak-snc4/49942_4_1525300_n.jpg
As OffBySome points out, you need to follow the 302 redirect served by graph.facebook.com to the final destination, which contains the actual image data.
The simplest way to do that in this case is to add another curl_setopt call with CURLOPT_FOLLOWLOCATION set to true, i.e.
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
Check out http://us3.php.net/manual/en/function.curl-setopt.php for more details.
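Since safe_mode/open_basedir blocks CURLOPT_FOLLOWLOCATION on your host, you can follow the redirect manually instead. The sketch below is a minimal illustration of that idea (the get_data_follow name and the redirect limit are my own, not from the original code): it keeps the response headers, reads the Location header of the 302, and re-requests until it reaches the final image.

function get_data_follow($url, $maxRedirects = 5) {
    for ($i = 0; $i < $maxRedirects; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);          // keep headers so the Location header can be read
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
        $response = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        curl_close($ch);
        $headers = substr($response, 0, $headerSize);
        if ($code >= 300 && $code < 400 && preg_match('/^Location:\s*(\S+)/mi', $headers, $m)) {
            $url = trim($m[1]);                          // follow the redirect ourselves
            continue;
        }
        return substr($response, $headerSize);           // body of the final response
    }
    return false;
}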
Hello and thanks in advance. I'm trying to upload an image to a PrestaShop 1.7 shop via the webservice, and I'm unable to attach an image to a product. I don't know what is failing, because I don't get any response from the webservice, even with debug enabled (I get the XML responses for the rest of the calls, but not the ones made with cURL).
The variable $idProduct is a value passed into the function and is defined.
My code is the following:
$url = PS_SHOP_PATH."api/images/products/".$idProduct;
$dir_path_to_save = 'img/import/';
$img_path = getFile($remoteImageURL, $dir_path_to_save);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_USERPWD, PS_WS_AUTH_KEY.':');
curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => '#'.$img_path));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$response = curl_exec($ch);
echo print_r($response);
curl_close($ch);
The getFile function downloads the image to the server where PrestaShop is installed and returns the path (a string with the real path where the image is stored; already tested).
I tried making a form to upload the image (just for testing), but it returns "code 66 - unable to save this image". I don't know if this helps.
Thanks
UPDATE
A fellow programmer told me to use curl_file_create().
So I changed the image variable declaration this way:
$img = curl_file_create($dir_path_to_save.'/'.basename($img_path));
Now everything works as intended.
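For completeness, here is a minimal sketch of what the corrected upload can look like, reusing PS_SHOP_PATH, PS_WS_AUTH_KEY, $idProduct and $img_path from the question (the explicit MIME type is an assumption). The old '@'-prefix string syntax for file uploads was deprecated in PHP 5.5 and no longer works by default from PHP 5.6 on, so a CURLFile object has to be passed instead:

$url = PS_SHOP_PATH . "api/images/products/" . $idProduct;
$img = curl_file_create($img_path, 'image/jpeg', basename($img_path)); // MIME type assumed

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_USERPWD, PS_WS_AUTH_KEY . ':');
curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => $img)); // pass the CURLFile object, not a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);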
I need to have Icecast metadata auto-updated by PHP, let's say every 15 minutes, which will be done by a cPanel cron job.
I have the code below but it doesn't work (it works if I use a header Location redirect, but a cron job won't be able to do that).
<?PHP
$url="http://tgftp.nws.noaa.gov/data/observations/metar/stations/KJFK.TXT";
$info=file_get_contents($url);
$url_info = "http://username:password#icecast:8000/admin/metadata?mount=/mymount&mode=updinfo&song=" . urlencode($info);
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $url_info);
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
?>
Try checking for errors after you execute the call using curl_error:
<?php
$url="http://tgftp.nws.noaa.gov/data/observations/metar/stations/KJFK.TXT";
$info=file_get_contents($url);
$url_info = "http://username:password#icecast:8000/admin/metadata?mount=/mymount&mode=updinfo&song=" . urlencode($info);
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $url_info);
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser, check for errors
if (curl_exec($ch) === FALSE)
{
print 'Curl-Error occurred: ' . curl_error($ch).', error code: '.curl_errno($ch);
}
// close cURL resource, and free up system resources
curl_close($ch);
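Beyond transport errors, it can also help to check the HTTP status code the Icecast admin endpoint returns, since a wrong username or password typically comes back as a 401 rather than a cURL error. This is a small extension of the same idea (my own addition, not part of the original answer), reusing the $url_info built above:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url_info);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
$body = curl_exec($ch);
if ($body === FALSE)
{
    print 'Curl-Error occurred: ' . curl_error($ch) . ', error code: ' . curl_errno($ch);
}
else
{
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($status != 200)
    {
        print 'Icecast returned HTTP ' . $status . ': ' . $body;
    }
}
curl_close($ch);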
I am trying to get some country names from a website. That website's URL starts with https, so I am not able to scrape the data. Please give me a solution.
Here is my code:
$curl = curl_init('https://testing.co/india');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, TRUE);
$page = curl_exec($curl);
if (curl_errno($curl)) {
echo 'Scraper error: ' . curl_error($curl);
exit;
}
curl_close($curl);
$regex = '/<a class="startup-link">(.*?)<\/a>/s';
if (preg_match($regex, $page, $list))
echo $list[0];
else
print "Not found";
I get this error: Scraper error: SSL certificate problem: unable to get local issuer certificate
Today I solved this problem, and here is what I learned.
Below is the code that is working for me.
// Set so curl_exec returns the result instead of outputting it.
$url = "https://www.google.co.in/?gws_rd=ssl";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);  // verify the peer against the CA file passed below
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CAINFO, getcwd() . "/GeoTrustGlobalCA.crt");

// Get the response and close the channel.
$response = curl_exec($ch);
$link = fopen("data.txt", "w+");
fputs($link, $response);
fclose($link);
curl_close($ch);
You have to pass a certificate for this.
In Mozilla Firefox, on the left-hand side of the website URL there is an information icon. Click it, open the Security tab and click View Certificate, then go to the Details tab.
In the Certificate Hierarchy section, click the top-most entry and you will find an Export option below. Export that certificate and save the CA certificate to a location of your choice, making sure to select X.509 Certificate (PEM) as the save type/format.
e.g.
curl_setopt($ch, CURLOPT_CAINFO, getcwd() . "/GeoTrustGlobalCA.crt");
Now save it and run it; you will get the data.
Alternatively, use
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
Note that this disables peer certificate verification entirely, so it hides the problem rather than fixing it.
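If you would rather keep verification enabled without hunting for the site's specific CA, a common alternative (my suggestion, not from the answers above) is to point cURL at the Mozilla CA bundle published at https://curl.se/ca/cacert.pem after downloading it somewhere readable by PHP; the path below is only an example:

$curl = curl_init('https://testing.co/india');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($curl, CURLOPT_CAINFO, '/path/to/cacert.pem'); // downloaded CA bundle
$page = curl_exec($curl);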
I am using an API to get the users profile picture. The call looks something like this
https://familysearch.org/platform/tree/persons/{$rid}/portrait?access_token={$_SESSION['fs-session']}&default=https://eternalreminder.com/dev/graphics/{$default_gender}_default.svg
This link only works for about an hour because the user's session token expires. I was wondering if there is any way to retrieve the last returned URL, which would be the direct link to the image, so I could store that in a database.
I have tried Google but I don't really know where to start.
Thanks in advance!
I was able to solve my own problem. The API was doing a redirect to get the image and I just needed that final URL. Here is the code that got me there.
$url="http://libero-news.it.feedsportal.com/c/34068/f/618095/s/2e34796f/l/0L0Sliberoquotidiano0Bit0Cnews0C12735670CI0Esaggi0Eper0Ele0Eriforme0Ecostituzionali0EChiaccherano0Ee0Eascoltano0Bhtml/story01.htm";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // Must be set to true so that PHP follows any "Location:" header
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$a = curl_exec($ch); // $a will contain all headers
$url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL); // This is what you need, it will return you the last effective URL
// Uncomment to see all headers
/*
echo "<pre>";
print_r($a);echo"<br>";
echo "</pre>";
*/
echo $url; // Voila
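A small refinement of the same approach (my own suggestion, not part of the original answer): if you only need the final URL and not the image bytes, a HEAD request avoids downloading the body.

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only, no image data
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // still follow the redirect chain
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
curl_close($ch);
echo $finalUrl;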
I'm having a little trouble updating backgrounds via Twitter's API.
$target_url = "http://www.google.com/logos/11th_birthday.gif";
$ch = curl_init();
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));
curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
curl_setopt($ch, CURLOPT_URL,$target_url);
curl_setopt($ch, CURLOPT_FAILONERROR, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER,true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$html = curl_exec($ch);
$content = $to->OAuthRequest('http://twitter.com/account/update_profile_background_image.xml', array('profile_background_image_url' => $html), 'POST');
When I try to pull the raw data via cURL or file_get_contents, I get this...
Expectation Failed
The expectation given in the Expect request-header field could not be met by this server.
The client sent Expect: 100-continue but we only allow the 100-continue expectation.
OK, you can't point Twitter at a URL; it won't accept that. Looking around a bit, I've found that the best way is to download the image to the local server and then pass that over to Twitter, almost like a form upload.
Try the following code, and let me know what you get.
// The URL from an external (or internal) server we want to grab
$url = 'http://www.google.com/logos/11th_birthday.gif';
// We need to grab the file name of this, unless you want to create your own
$filename = basename($url);
// This is where we'll be saving our new file to. Replace LOCALPATH with the path you would like to save the file to, i.e. www/home/content/my_directory/
$newfilename = 'LOCALPATH' . $filename;
// Copy it over, PHP will handle the overheads.
copy($url, $newfilename);
// Now it's OAuth time... fingers crossed!
$content = $to->OAuthRequest('http://twitter.com/account/update_profile_background_image.xml', array('profile_background_image_url' => $newfilename), 'POST');
// Echo something so you know it went through
print "done";
Well, given the error message, it sounds like you should load the URL's contents yourself, and post the data directly. Have you tried that?
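To illustrate what "post the data directly" could look like, here is a rough sketch only; the parameter name ('image'), the endpoint behaviour, and whether your OAuth library lets you attach files this way are assumptions, and the OAuth signing itself is omitted. The idea is to download the image locally and attach it as a multipart file upload instead of passing a URL or raw bytes:

// Hypothetical sketch: download the image, then POST it as a multipart file upload.
$url = 'http://www.google.com/logos/11th_birthday.gif';
$localPath = sys_get_temp_dir() . '/' . basename($url);
copy($url, $localPath);

$ch = curl_init('http://twitter.com/account/update_profile_background_image.xml');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:'));  // suppress the Expect: 100-continue header
curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => curl_file_create($localPath)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// The request would still need to be OAuth-signed; that part is omitted in this sketch.
$response = curl_exec($ch);
curl_close($ch);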