There is a form on my website that lets users import an image from a remote server by entering its URL. When the user enters the link and submits, I want to check the file and make sure it has the right extension before I copy it over to my server.
I tried to use exif_imagetype() directly, but allow_url_fopen is disabled on the server, so I need another approach. I think using cURL will solve the problem, but I'm not sure how to get the image extension from the response headers.
Thanks in advance!
solution
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the response instead of printing it
curl_exec($ch);
$content_type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE); // e.g. "image/jpeg"
curl_close($ch);
echo $content_type;
Thanks to Mario!
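To actually validate the upload, the detected Content-Type can be checked against a whitelist before the file is copied. A minimal sketch building on the snippet above (the $allowed map and variable names are illustrative, and since servers can send an inaccurate Content-Type, re-checking the downloaded bytes with exif_imagetype() afterwards is still a good idea):
$allowed = array(
    'image/jpeg' => '.jpg',
    'image/png'  => '.png',
    'image/gif'  => '.gif',
);
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOBODY, true); // HEAD request: fetch headers only, skip the body
curl_exec($ch);
$content_type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);
if (isset($allowed[$content_type])) {
    $extension = $allowed[$content_type]; // safe to copy the file with this extension
} else {
    // reject the upload
}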
Based on a few responses, I was trying to display an image from a cURL request and dynamically figure out the header's Content-Type. Using what Mario did, plus some other things, here is a function that correctly displays an image from a standard URL or file and sets the header. All PHP.
function displayImage( $filename ) {
    $ch = curl_init();
    $options = array(
        CURLOPT_URL => $filename,
        CURLOPT_ENCODING => "",
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPGET => true,
        CURLOPT_CONNECTTIMEOUT => 60,
        CURLOPT_TIMEOUT => 60
    );
    curl_setopt_array($ch, $options);
    $response = curl_exec($ch);
    if(!curl_errno($ch)){
        // Send the Content-Type reported by the remote server
        header('Content-Type: ' . curl_getinfo($ch, CURLINFO_CONTENT_TYPE));
        curl_close($ch);
        // Note: imagejpeg() always re-encodes the output as JPEG; if the source
        // might be another format, echo $response directly so header and body match.
        $img = imagecreatefromstring($response);
        imagejpeg($img);
        imagedestroy($img);
        return true;
    }
    else{
        // Read the error message before closing the handle
        $error = curl_error($ch);
        curl_close($ch);
        return $error;
    }
}
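A usage sketch (the URL is just an example):
$result = displayImage('https://example.com/images/photo.jpg');
if ($result !== true) {
    error_log('cURL error: ' . $result); // the function returns the error message on failure
}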
You can also do it this way (note that getimagesize() on a remote URL requires allow_url_fopen, so if that is disabled you will need to download the file first):
$src = 'https://example.com/image.jpg';
$size = getimagesize($src);
$extension = image_type_to_extension($size[2]);
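If allow_url_fopen is off, a minimal sketch of the temp-file approach (variable names are illustrative) that combines cURL with the local checks:
$tmp = tempnam(sys_get_temp_dir(), 'img');
$fp  = fopen($tmp, 'wb');
$ch  = curl_init($src);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the download straight into the temp file
curl_exec($ch);
curl_close($ch);
fclose($fp);
$type      = exif_imagetype($tmp); // inspects the actual bytes, not just the headers
$extension = $type ? image_type_to_extension($type) : false;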
Related
I'm getting the following warning when running a script:
Warning: file_get_contents() [function.file-get-contents]: https:// wrapper is disabled in the server configuration by allow_url_fopen=0 in /home/satoship/public_html/connect.php on line 22
I know this is a server issue but what do I need to do to the server in order to get rid of the above warning?
@blytung has a nice function to replace file_get_contents():
<?php
$url = "http://www.example.org/";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
if (curl_errno($ch)) {
    echo curl_error($ch);
    echo "\n<br />";
    $contents = '';
}
curl_close($ch); // close the handle in both the success and error cases
if (!is_string($contents) || !strlen($contents)) {
    echo "Failed to get contents.";
    $contents = '';
}
echo $contents;
?>
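Wrapped up as a drop-in helper (a sketch; the name curl_get_contents is my own, not a built-in function):
function curl_get_contents($url, $timeout = 5) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $contents = curl_exec($ch);
    $error    = curl_errno($ch);
    curl_close($ch);
    return $error ? false : $contents; // false on failure, like file_get_contents()
}
$contents = curl_get_contents("http://www.example.org/");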
If you do not have the ability to modify your php.ini file, use cURL:
PHP Curl And Cookies
Here is an example function I created:
function get_web_page( $url, $cookiesIn = '' ){
    $options = array(
        CURLOPT_RETURNTRANSFER => true,     // return web page
        CURLOPT_HEADER         => true,     // return headers in addition to content
        CURLOPT_FOLLOWLOCATION => true,     // follow redirects
        CURLOPT_ENCODING       => "",       // handle all encodings
        CURLOPT_AUTOREFERER    => true,     // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,      // timeout on connect
        CURLOPT_TIMEOUT        => 120,      // timeout on response
        CURLOPT_MAXREDIRS      => 10,       // stop after 10 redirects
        CURLINFO_HEADER_OUT    => true,     // keep a copy of the outgoing request headers
        CURLOPT_SSL_VERIFYPEER => true,     // Validate SSL Cert
        CURLOPT_HTTP_VERSION   => CURL_HTTP_VERSION_1_1,
        CURLOPT_COOKIE         => $cookiesIn
    );

    $ch = curl_init( $url );
    curl_setopt_array( $ch, $options );
    $rough_content = curl_exec( $ch );
    $err           = curl_errno( $ch );
    $errmsg        = curl_error( $ch );
    $header        = curl_getinfo( $ch );
    curl_close( $ch );

    $header_content = substr($rough_content, 0, $header['header_size']);
    $body_content   = trim(str_replace($header_content, '', $rough_content));
    $pattern        = "#Set-Cookie:\\s+(?<cookie>[^=]+=[^;]+)#m";
    preg_match_all($pattern, $header_content, $matches);
    $cookiesOut     = implode("; ", $matches['cookie']);

    $header['errno']   = $err;
    $header['errmsg']  = $errmsg;
    $header['headers'] = $header_content;
    $header['content'] = $body_content;
    $header['cookies'] = $cookiesOut;
    return $header;
}
NOTE: In revisiting this function I noticed that I had disabled SSL checks in this code. That is generally a BAD thing, even though in my particular case the site I was using it on was local and safe. As a result I've modified this code to have SSL checks on by default. If for some reason you need to change that, simply update the value of CURLOPT_SSL_VERIFYPEER, but I wanted the code to be secure by default.
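A usage sketch (example URL; the array keys come from the function above plus curl_getinfo()):
$result = get_web_page('https://www.example.org/');
if ($result['errno'] === 0 && $result['http_code'] === 200) {
    echo $result['content'];        // page body
    $cookies = $result['cookies'];  // can be passed as $cookiesIn on the next call
}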
Use this at the top of your PHP script:
ini_set('allow_url_fopen', 1);
Note, however, that allow_url_fopen is a system-level setting, so ini_set() may have no effect at runtime; the reliable fix is to edit your php.ini, find allow_url_fopen and set it to allow_url_fopen = 1
I had the same issue and setting allow_url_fopen = On did not help. Using a relative file path instead of an absolute URL solved the problem for me. That means, for instance, use
$file="folder/file.ext";
instead of
$file="https://website.com/folder/file.ext";
in
$f=fopen($file,"r+");
This is a very simple problem. Here is an easy way to solve it:
Step 1: Log in to your cPanel (http://website.com/cpanel or http://cpanel.website.com).
Step 2: SOFTWARE -> Select PHP Version
Step 3: Change your current PHP version to 5.6.
Step 4: Hit 'Set as current'. Enjoy!
I am using cURL to send a GET request to a server, but it is not producing the required output.
The script works fine when loaded in a browser.
It is a script which combines the given PNG images into an animated GIF.
<?php
// generate GIF
$name   = $_GET['name_of_final_gif'];
$images = $_GET['images'];
$images = json_decode($images, true);
// include GIF maker class based on GD library
include('GIFEncoder.class.php');
/******************************************************/
$frames = array();
$framed = array();
foreach($images as $image_link) {
    // Open the source image.
    $image = imagecreatefrompng($image_link);
    // Generate GIF data from $image.
    // We want to put the binary GIF data into an array to be used later,
    // so we use the output buffer.
    ob_start();
    imagegif($image);
    $frames[] = ob_get_contents();
    $framed[] = 300; // Delay in the animation.
    ob_end_clean();
}
// Generate the animated gif and save it
$gif = new GIFEncoder($frames, $framed, 0, 2, 0, 0, 0, 'bin');
$fp = fopen("gifs/$name", 'w');
fwrite($fp, $gif->GetAnimation());
fclose($fp);
?>
Update:
Below is my cURL code, which runs on another server and sends the GET request to the script above:
$images = $class_name->get_images_links(); // get image links from database in JSON FORMAT
$name = "something.gif"; //name for output GIF image
$url = "http://example.com/make_gif.php?images=$images&&name_of_final_gif=$name";
// Get cURL resource
$curl = curl_init();
// Set some options
curl_setopt_array($curl, array(
CURLOPT_RETURNTRANSFER => 1,
CURLOPT_URL => $url
));
// Send the request & save response to $resp
$resp = curl_exec($curl);
// Close request to clear up some resources
curl_close($curl);
Try this
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/make_gif.php?images=$images&&name_of_final_gif=$name");
curl_exec($ch);
curl_close($ch);
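One difference between loading the URL in a browser and building it in PHP is encoding: the JSON in $images contains characters such as braces, quotes and brackets that must be URL-encoded, and the double && should be a single &. A sketch of building the query string safely (assuming $images holds the raw JSON string and $name the output file name, as above):
$query = http_build_query(array(
    'images'            => $images, // http_build_query() URL-encodes the JSON
    'name_of_final_gif' => $name,
));
$url = "http://example.com/make_gif.php?" . $query;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$resp = curl_exec($ch);
curl_close($ch);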
Try this:
function curl($url) {
    // Assigning cURL options to an array
    $options = array(
        CURLOPT_RETURNTRANSFER => TRUE,  // Setting cURL's option to return the webpage data
        CURLOPT_FOLLOWLOCATION => TRUE,  // Setting cURL to follow 'location' HTTP headers
        CURLOPT_AUTOREFERER    => TRUE,  // Automatically set the referer when following 'location' HTTP headers
        CURLOPT_TIMEOUT        => 120,   // Setting the maximum amount of time for cURL to execute queries
        CURLOPT_MAXREDIRS      => 10,    // Setting the maximum number of redirections to follow
        CURLOPT_USERAGENT      => "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1a2pre) Gecko/2008073000 Shredder/3.0a2pre ThunderBrowse/3.2.1.8", // Setting the useragent
        CURLOPT_URL            => $url,  // Setting cURL's URL option with the $url variable passed into the function
    );

    $ch = curl_init();                // Initialising cURL
    curl_setopt_array($ch, $options); // Setting cURL's options using the previously assigned array data in $options
    $data = curl_exec($ch);           // Executing the cURL request and assigning the returned data to the $data variable
    curl_close($ch);                  // Closing cURL
    return $data;                     // Returning the data from the function
}
$page = curl($link);
I have some short URLs. How can I get the original URLs from them?
As URL-shortener services are mostly simple redirectors, they use the Location header to tell the browser where to go.
You can use PHP's own get_headers() function to read that header:
$headers = get_headers('http://shorten.ed/fbsfS' , true);
echo $headers['Location'];
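Note that when the short URL goes through more than one redirect, get_headers() returns 'Location' as an array; a sketch that handles both cases:
$headers  = get_headers('http://shorten.ed/fbsfS', true);
$location = $headers['Location'];
$finalUrl = is_array($location) ? end($location) : $location; // the last hop is the final target
echo $finalUrl;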
Try this
<?php
$url="http://goo.gl/fbsfS";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$a = curl_exec($ch); // $a will contain the response headers followed by the body
$url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL); // This is what you need, it will return you the last effective URL
echo $url; // Redirected url
?>
You can use the curl functions for this:
// The short url to expand
$url = 'http://goo.gl/fbsfS';
// Prepare a request for the given URL
$curl = curl_init($url);
// Set the needed options:
curl_setopt_array($curl, array(
CURLOPT_NOBODY => TRUE, // Don't ask for a body, we only need the headers
CURLOPT_FOLLOWLOCATION => FALSE, // Don't follow the 'Location:' header, if any
));
// Send the request (you should check the returned value for errors)
curl_exec($curl);
// Get information about the 'Location:' header (if any)
$location = curl_getinfo($curl, CURLINFO_REDIRECT_URL);
// This should print:
// http://translate.google.com.ar/translate?hl=es&sl=en&u=http://goo.gl/lw9sU
echo($location);
Most of these services also provide an API, which you can use.
I want to upload a file from an external URL directly to an Amazon S3 bucket using the PHP SDK. I managed to do this with the following code:
$s3 = new AmazonS3();
$response = $s3->create_object($bucket, $destination, array(
'fileUpload' => $source,
'length' => remote_filesize($source),
'contentType' => 'image/jpeg'
));
Where the function remote_filesize is the following:
function remote_filesize($url) {
    ob_start(); // curl_exec() prints the response here; capture it in the output buffer
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_NOBODY, 1);
    $ok = curl_exec($ch);
    curl_close($ch);
    $head = ob_get_contents();
    ob_end_clean();
    $regex = '/Content-Length:\s([0-9].+?)\s/';
    $count = preg_match($regex, $head, $matches);
    return isset($matches[1]) ? $matches[1] : "unknown";
}
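A slightly simpler variant (a sketch; the function name is illustrative) lets cURL report the size instead of regexing the buffered headers; CURLINFO_CONTENT_LENGTH_DOWNLOAD returns -1 when the server sends no Content-Length:
function remote_filesize_via_curlinfo($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $length = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);
    return $length > -1 ? $length : "unknown";
}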
However, it would be nice if I could skip setting the filesize when uploading to Amazon, since this would save me a trip to my own server. But if I remove the 'length' property from the $s3->create_object call, I get an error saying 'The stream size for the streaming upload cannot be determined.' Any ideas how to solve this problem?
You can upload a file from a URL directly to Amazon S3 like this (my example uses a JPG picture):
1. Read the content from the URL into a binary string
$binary = file_get_contents('http://the_url_of_my_image.....');
2. Create the S3 object, passing the binary string as the body
$s3 = new AmazonS3();
$response = $s3->create_object($bucket, $filename, array(
'body' => $binary, // put the binary in the body
'contentType' => 'image/jpeg'
));
That's all and it's very fast. Enjoy!
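If file_get_contents() cannot fetch remote URLs on your server (allow_url_fopen off, as in the earlier question), the binary can be fetched with cURL instead; a sketch using the same placeholder URL and the same create_object() call as above:
$ch = curl_init('http://the_url_of_my_image.....');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$binary = curl_exec($ch);
curl_close($ch);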
Do you have any control over the remote server/host? If so, you could set up a PHP script there to read the file size locally and pass the data to you.
If not, you could use cURL to inspect the headers, like so:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://sstatic.net/so/img/logo.png');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_exec($ch);
$size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
var_dump($size);
This way you are using a HEAD request and not downloading the whole file; still, you depend on the remote server sending a correct Content-Length header.
I'm looking for the fastest and easiest way to proxy a page in PHP. I don't want the user to be redirected, I just want my script to return the same content, response code and headers as another remote URL.
echo file_get_contents('proxypage');
Would that work?
EDIT:
The first answer was a bit short, and I don't believe it will handle headers as you would like.
However, you can also do this:
function get_proxy_site_page( $url )
{
    $options = [
        CURLOPT_RETURNTRANSFER => true,  // return web page
        CURLOPT_HEADER         => true,  // return headers
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects
        CURLOPT_ENCODING       => "",    // handle all encodings
        CURLOPT_AUTOREFERER    => true,  // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,   // timeout on connect
        CURLOPT_TIMEOUT        => 120,   // timeout on response
        CURLOPT_MAXREDIRS      => 10,    // stop after 10 redirects
    ];

    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    $remoteSite = curl_exec($ch);
    $header = curl_getinfo($ch);
    curl_close($ch);

    $header['content'] = $remoteSite;
    return $header;
}
This will return an array containing a lot of information about the remote page. $header['content'] will contain both the headers and the body of the remote page, and $header['header_size'] will contain the length of the header block, so you can use substr() to split them apart.
Then it's just a matter of using echo and header() to proxy the page, as sketched below.
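A sketch of that last step (assuming the array returned by get_proxy_site_page() above; only the Content-Type header is forwarded here, since transfer-related headers no longer apply):
$result  = get_proxy_site_page('https://www.example.org/somepage');
$raw     = $result['content'];
$headers = substr($raw, 0, $result['header_size']);   // header block (all redirects included)
$body    = substr($raw, $result['header_size']);      // actual page body
http_response_code($result['http_code']);             // mirror the remote status code
foreach (explode("\r\n", $headers) as $line) {
    if (stripos($line, 'Content-Type:') === 0) {
        header($line);                                 // forward the content type
    }
}
echo $body;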
You can use the PHP cURL functions to achieve this functionality:
http://www.php.net/curl
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, 'http://www.example.com/');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// grab URL and pass it to the browser
$urlContent = curl_exec($ch);
From this point, you would grab the response header information using http://www.php.net/curl-getinfo. (There are several values you can grab, all listed in the documentation).
// Check if any error occurred
if(!curl_errno($ch))
{
$info = curl_getinfo($ch);
header('Content-Type: '.$info['content_type']);
echo $urlContent;
}
Make sure to close out the cURL handle.
// close cURL resource, and free up system resources
curl_close($ch);
You can get the HTML of the remote page with cURL and then echo the response.