cURL GET request not working in PHP

I am using cURL to send a GET request to a server, but it is not producing the required output.
The script works fine when loaded in a browser.
It is a script which combines the given PNG images into an animated GIF.
<?php
// Generate a GIF
$name = $_GET['name_of_final_gif'];
$images = $_GET['images'];
$images = json_decode($images, true);
// Include the GIF maker class based on the GD library
include('GIFEncoder.class.php');
/******************************************************/
$frames = array();
$framed = array();
foreach ($images as $image_link) {
    // Open the source image.
    $image = imagecreatefrompng($image_link);
    // Generate GIF data from $image.
    // We want to put the binary GIF data into an array to be used later,
    // so we use the output buffer.
    ob_start();
    imagegif($image);
    $frames[] = ob_get_contents();
    $framed[] = 300; // Delay in the animation.
    ob_end_clean();
    imagedestroy($image);
}
// Generate the animated GIF and save it
$gif = new GIFEncoder($frames, $framed, 0, 2, 0, 0, 0, 'bin');
$fp = fopen("gifs/$name", 'w');
fwrite($fp, $gif->GetAnimation());
fclose($fp);
?>
Update:
Below is my cURL code. It runs on a different server and sends a GET request to the script above:
$images = $class_name->get_images_links(); // get image links from the database in JSON format
$name = "something.gif"; // name for the output GIF image
$url = "http://example.com/make_gif.php?images=$images&&name_of_final_gif=$name";
// Get cURL resource
$curl = curl_init();
// Set some options
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => $url
));
// Send the request & save response to $resp
$resp = curl_exec($curl);
// Close request to clear up some resources
curl_close($curl);

Try this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/make_gif.php?images=$images&name_of_final_gif=$name");
curl_exec($ch);
curl_close($ch);
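A likely cause of the browser-versus-cURL difference is that the JSON in $images is never URL-encoded, so characters such as {, " and [ corrupt the query string (the stray && in the original URL also adds an empty parameter). A minimal sketch of building the URL safely with http_build_query(), reusing the $images and $name variables from the question:
// http_build_query() URL-encodes each value, so the JSON survives the trip.
$url = "http://example.com/make_gif.php?" . http_build_query(array(
    'images' => $images,
    'name_of_final_gif' => $name,
));
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => $url,
));
$resp = curl_exec($curl);
if ($resp === false) {
    // Surface transport errors instead of failing silently.
    echo 'cURL error: ' . curl_error($curl);
}
curl_close($curl);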

Try this:
function curl($url) {
    // Assign the cURL options to an array
    $options = array(
        CURLOPT_RETURNTRANSFER => TRUE, // return the page data instead of printing it
        CURLOPT_FOLLOWLOCATION => TRUE, // follow 'Location:' HTTP headers
        CURLOPT_AUTOREFERER => TRUE,    // automatically set the referer when following redirects
        CURLOPT_TIMEOUT => 120,         // maximum time, in seconds, cURL may spend on the request
        CURLOPT_MAXREDIRS => 10,        // maximum number of redirects to follow
        CURLOPT_USERAGENT => "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1a2pre) Gecko/2008073000 Shredder/3.0a2pre ThunderBrowse/3.2.1.8", // the user agent
        CURLOPT_URL => $url,            // the URL passed into the function
    );
    $ch = curl_init();                  // initialise cURL
    curl_setopt_array($ch, $options);   // apply the options assigned above
    $data = curl_exec($ch);             // execute the request and capture the returned data
    curl_close($ch);                    // close the handle
    return $data;                       // return the data from the function
}
$page = curl($link);
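One caveat: curl_exec() returns FALSE on failure, and the curl() helper above passes that straight through, so it is worth checking $page before using it:
// $page is FALSE when the request failed.
if ($page === false) {
    die('cURL request failed');
}
echo strlen($page) . " bytes received";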

Related

How to download a .gz file using curl?

I have written an Ajax call to download a file after clicking a download button. It hits an API over a cURL call, which returns the file as a resource stream. If it is a PDF, everything is fine: I use fopen and fwrite to write the data into a file and it works. But when I try to get a .gz file stream it does not work. The .gz file is created, but there is nothing in it, and when I try to extract it I get an error. I am using Ubuntu 18.04 and CodeIgniter 3.
private function __curl(
    $url,
    $request = "POST",
    $data = [],
    $header = ["Content-Type: application/json"],
    $fileName = "download.gz" // $fileName was undefined in the original; added here as a parameter
) {
    $curl = curl_init();
    curl_setopt_array($curl, [
        CURLOPT_URL => $this->apiUrl . $url,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_ENCODING => "",
        CURLOPT_MAXREDIRS => 10,
        CURLOPT_TIMEOUT => 30,
        CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        CURLOPT_CUSTOMREQUEST => $request,
        CURLOPT_POSTFIELDS => !empty($data) ? json_encode($data) : "",
        CURLOPT_HTTPHEADER => $header,
    ]);
    $response = curl_exec($curl);
    $err = curl_error($curl);
    curl_close($curl);
    if ($err) {
        return $err;
    } else {
        $path = "path/to/file/" . $fileName;
        $fp = fopen($path, 'w');
        fwrite($fp, $response);
        fclose($fp);
    }
}
So I am using this function to call the API. I get the .gz file as a response stream, and I want to save that stream as a .gz file, with its data intact, at the given path.
You can download a .gz file using cURL in PHP with the following code:
<?php
// Initialize cURL session
$ch = curl_init();
// Set the URL of the file to be downloaded
curl_setopt($ch, CURLOPT_URL, 'http://example.com/file.gz');
// Set cURL to return the contents of the file as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Execute cURL session and store the contents of the file in a variable
$data = curl_exec($ch);
// Close cURL session
curl_close($ch);
// Write the data to a local file (binary mode, so Windows does not mangle it)
$fp = fopen('file.gz', 'wb');
fwrite($fp, $data);
// Close the local file handle
fclose($fp);
?>
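One other thing worth checking in the __curl() function above: CURLOPT_ENCODING => "" asks libcurl to negotiate compression and transparently decompress the response. If the server marks the .gz download with Content-Encoding: gzip, cURL unpacks it in flight, and the bytes written to disk are no longer a valid gzip archive, which would match the "file is created but unextractable" symptom. A hedged sketch that streams the raw bytes straight to disk (URL and path are placeholders):
$url = 'http://example.com/file.gz';
$fp = fopen('/path/to/file.gz', 'wb'); // binary mode
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the body directly to the file handle
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// Deliberately NOT setting CURLOPT_ENCODING, so the compressed bytes
// arrive exactly as stored on the server.
if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);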

Trying to get a large xml file through api.php

I'm trying to get the contents of a large XML file from an API. My problem is that it takes forever to respond; I actually waited an hour and still nothing. Even when I try the URL in my browser, it just times out. Is there a better way to get the file?
Here is my code:
$fp = fopen('a.xml', 'wb');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Update:
I used this code, but it's the same: it keeps loading. I think it's a memory problem; it tries to load the whole file before handing it to SimpleXML. If I set RETURNTRANSFER to false, will it lower the memory usage?
$options = array(
    CURLOPT_RETURNTRANSFER => true, // return web page
    CURLOPT_HEADER => false,        // don't return headers
    CURLOPT_FOLLOWLOCATION => true, // follow redirects
    CURLOPT_ENCODING => "",         // handle all encodings
    CURLOPT_USERAGENT => "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:18.0) Gecko/20100101 Firefox/18.0", // something like Firefox
    CURLOPT_AUTOREFERER => true,    // set referer on redirect
    CURLOPT_CONNECTTIMEOUT => 0,    // timeout on connect (0 = no limit)
    CURLOPT_TIMEOUT => 0,           // timeout on response (0 = no limit)
    CURLOPT_MAXREDIRS => 10,        // stop after 10 redirects
);
$curl = curl_init('someurl.com');
curl_setopt_array($curl, $options);
$content = curl_exec($curl);
curl_close($curl);
One thing to check immediately in your code snippet: the file handle you open for a.xml must be the same variable you pass to curl_setopt with CURLOPT_FILE; a mismatch (for example, opening into $fp but referencing $fh) means nothing gets written.
How big is the XML? Depending on how much memory you allow PHP to consume, you could get the entire contents of the file with something as simple as:
$xmlString = file_get_contents('a.xml');
Or if you plan to process the XML further, you could load the file into a SimpleXMLElement:
$xml = simplexml_load_file('a.xml');
or into a DOMDocument:
$xmlDom = new DOMDocument();
$xmlDom->load('a.xml');
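Given the memory concern in the update, it may be worth avoiding loading the whole document at all. XMLReader is a streaming pull parser, so memory use stays roughly flat regardless of file size. A minimal sketch, assuming the file has already been saved to a.xml and contains repeated <item> elements (the element name is a placeholder for whatever the real feed uses):
$reader = new XMLReader();
$reader->open('a.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // Expand just this one element for convenient access.
        $node = new SimpleXMLElement($reader->readOuterXML());
        // ... process $node here ...
    }
}
$reader->close();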

cURL script in PHP to check the blacklisting of an IP using XPath

I want to make a little script that returns a result showing how heavily an IP has been blacklisted.
The result must be like 23/100, meaning that 23 lists have blacklisted that IP, or 45/100, 2/100, and so on.
First of all I fetch, through cURL, from http://whatismyipaddress.com/blacklist-check, sending some data in a POST request:
<?php
/**
 * Get a web file (HTML, XHTML, XML, image, etc.) from a URL. Return an
 * array containing the HTTP server response header fields and content.
 */
function get_web_page($url, $argument1)
{
    $options = array(
        CURLOPT_RETURNTRANSFER => true, // return web page
        CURLOPT_HEADER => false,        // don't return headers
        CURLOPT_FOLLOWLOCATION => true, // follow redirects
        CURLOPT_ENCODING => "",         // handle all encodings
        CURLOPT_USERAGENT => "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (FM Scene 4.6.1)", // who am i
        CURLOPT_AUTOREFERER => true,    // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,  // timeout on connect
        CURLOPT_TIMEOUT => 120,         // timeout on response
        CURLOPT_MAXREDIRS => 10,        // stop after 10 redirects
        CURLOPT_POST => 1,
        CURLOPT_POSTFIELDS => "LOOKUPADDRESS=" . $argument1,
    );
    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    $content = curl_exec($ch);
    $err = curl_errno($ch);
    $errmsg = curl_error($ch);
    $header = curl_getinfo($ch);
    curl_close($ch);
    $header['errno'] = $err;
    $header['errmsg'] = $errmsg;
    $header['content'] = $content;
    return $header;
}
echo "<pre>";
$result = get_web_page("http://whatismyipaddress.com/blacklist-check", "75.122.17.117");
// print_r($result['content']);
// $result['content'] now holds the whole page.
// Create an XPath object over the fetched HTML.
$doc = new DOMDocument();
libxml_use_internal_errors(true);
$doc->loadHTML($result['content']); // loadHTML, not loadHTMLFile: we have a string, not a filename
$xpath = new DOMXPath($doc);
// Get that table
$value = $xpath->evaluate("string(/html/body/div/div/div/table/text())");
echo "Table with blacklists: [$value]\n";
die;
?>
Now what I want is to parse the data with the XPath /html/body/div/div/div/table/text() and, wherever I see the (!) image, mark that entry as blacklisted; otherwise do nothing.
Can anyone help me?
I also observed that viewing the (!) image requires a token. I might switch to another site, but I like this particular website because it covers all the lists.
Thank you!
You definitely need this :)
Simple DOM Parser
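If you stay with the built-in DOM extension instead, the counting can be done directly with DOMXPath. A hedged sketch, assuming each blacklist is one <tr> of the results table and that a flagged entry contains an <img> whose src mentions "alert" (both are assumptions about the page markup that need checking against the real HTML):
// Count flagged rows in the fetched blacklist table.
$doc = new DOMDocument();
libxml_use_internal_errors(true);
$doc->loadHTML($result['content']);
$xpath = new DOMXPath($doc);
$rows = $xpath->query('/html/body/div/div/div/table//tr');
$flagged = 0;
foreach ($rows as $row) {
    // Treat a row as blacklisted if it contains the hypothetical alert icon.
    if ($xpath->query('.//img[contains(@src, "alert")]', $row)->length > 0) {
        $flagged++;
    }
}
echo $flagged . '/' . $rows->length;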

php/curl - get extension of image on remote server using headers

There is a form on my website that allows the upload of images from a remote server. When the user enters the link and submits, I want to check the file and make sure it's the right extension before I copy it off to my server.
I tried to use exif_imagetype directly, but allow_url_fopen is not enabled on the server, so I need another approach. I think using cURL will solve the problem, but I'm not sure how to get the image extension from the header.
Thanks in advance!
solution
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
$content_type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
echo $content_type;
Thanks to Mario!
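The Content-Type still has to be translated into a file extension before saving the upload. A minimal sketch with a hand-rolled map (the list of types here is illustrative, not exhaustive):
// Hypothetical mapping from the Content-Type header to a file extension.
$map = array(
    'image/jpeg' => 'jpg',
    'image/png'  => 'png',
    'image/gif'  => 'gif',
);
$extension = isset($map[$content_type]) ? $map[$content_type] : null; // null = unsupported type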
Based on a few responses, I was trying to display an image from a cURL request and dynamically figure out the Content-Type header. Using what Mario did, and some other things, here is a function which will correctly display an image from a standard URL or file and send the right header. All PHP.
function displayImage($filename) {
    $ch = curl_init();
    $options = array(
        CURLOPT_URL => $filename,
        CURLOPT_ENCODING => "",
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPGET => true,
        CURLOPT_CONNECTTIMEOUT => 60,
        CURLOPT_TIMEOUT => 60
    );
    curl_setopt_array($ch, $options);
    $response = curl_exec($ch);
    // Capture the error state before closing: curl_errno()/curl_error()
    // cannot be read from a closed handle.
    $errno = curl_errno($ch);
    $error = curl_error($ch);
    if ($errno) {
        curl_close($ch);
        return $error;
    }
    header('Content-Type: ' . curl_getinfo($ch, CURLINFO_CONTENT_TYPE));
    curl_close($ch);
    // Note: imagejpeg() re-encodes the downloaded bytes as JPEG, so strictly
    // the header above only matches when the source was a JPEG.
    $img = imagecreatefromstring($response);
    imagejpeg($img);
    imagedestroy($img);
    return true;
}
You can also do it this way:
$src = 'https://example.com/image.jpg';
$size = getimagesize($src);
$extension = image_type_to_extension($size[2]);
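Note that getimagesize() on a remote URL relies on allow_url_fopen, which the question says is disabled. A hedged workaround is to fetch the bytes with cURL first and inspect them locally with getimagesizefromstring() (available since PHP 5.4):
$src = 'https://example.com/image.jpg'; // placeholder URL
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$bytes = curl_exec($ch);
curl_close($ch);
$info = getimagesizefromstring($bytes);
if ($info !== false) {
    echo image_type_to_extension($info[2]); // e.g. ".jpeg"
} else {
    echo 'Not a recognisable image';
}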

How to proxy another page in PHP

I'm looking for the fastest and easiest way to proxy a page in PHP. I don't want the user to be redirected; I just want my script to return the same content, response code, and headers as a remote URL.
echo file_get_contents('proxypage');
Would that work?
EDIT:
The first answer was a bit short, and I don't believe it will handle headers as you would like.
However, you can also do this:
function get_proxy_site_page($url)
{
    $options = [
        CURLOPT_RETURNTRANSFER => true, // return web page
        CURLOPT_HEADER => true,         // return headers
        CURLOPT_FOLLOWLOCATION => true, // follow redirects
        CURLOPT_ENCODING => "",         // handle all encodings
        CURLOPT_AUTOREFERER => true,    // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,  // timeout on connect
        CURLOPT_TIMEOUT => 120,         // timeout on response
        CURLOPT_MAXREDIRS => 10,        // stop after 10 redirects
    ];
    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    $remoteSite = curl_exec($ch);
    $header = curl_getinfo($ch);
    curl_close($ch);
    $header['content'] = $remoteSite;
    return $header;
}
This will return an array containing lots of information on the remote page. $header['content'] will hold both the headers and the content of the website, and $header['header_size'] will contain the length of those headers, so you can use substr to split them apart.
Then it's just a matter of using echo and header() to proxy the page.
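As a concrete illustration of that split, here is a minimal sketch building on get_proxy_site_page() above (forwarding just the status code and content type is an assumption about what the proxy should replay; a fuller proxy would relay more headers):
$page = get_proxy_site_page('http://www.example.com/');
// Split the combined headers/body using the reported header size.
$rawHeaders = substr($page['content'], 0, $page['header_size']);
$body = substr($page['content'], $page['header_size']);
http_response_code($page['http_code']);           // forward the status code
header('Content-Type: ' . $page['content_type']); // forward the content type
echo $body;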
You can use the PHP cURL functions to achieve this functionality:
http://www.php.net/curl
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, 'http://www.example.com/');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// fetch the URL; with RETURNTRANSFER set, the response is stored in $urlContent
$urlContent = curl_exec($ch);
From this point, you would grab the response header information using http://www.php.net/curl-getinfo. (There are several values you can grab, all listed in the documentation).
// Check if any error occurred
if (!curl_errno($ch)) {
    $info = curl_getinfo($ch);
    header('Content-Type: ' . $info['content_type']);
    echo $urlContent;
}
Make sure to close out the cURL handle.
// close cURL resource, and free up system resources
curl_close($ch);
You can get the HTML of the page with cURL and then echo the response.
