Multithread image download with cURL, PHP

I'm trying to use cURL to download images from a URL with multiple connections to speed up the process.
Here's my code:
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();
    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $path = 'image_'.$id.'.png';
        if (file_exists($path)) { unlink($path); }
        $fp = fopen($path, 'x');

        $url = $d;
        $curly[$id] = curl_init($url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_FILE, $fp);
        fclose($fp);

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);
}
And executing:
$data = array(
    'http://example.com/img1.png',
    'http://example.com/img2.png',
    'http://example.com/img3.png'
);
$r = multiRequest($data);
So it's not really working: it creates the 3 files, but they are zero bytes (empty), it gives me the following error (3 times), and it prints some kind of content of the original .PNGs:
Warning: curl_multi_exec(): CURLOPT_FILE resource has gone away, resetting to default in /Applications/MAMP/htdocs/test.php on line 34
So please, could you let me know how to work it out?
Thanks in advance for your help!

What you are doing is creating a file handle and then closing it before the transfer has actually run, which leaves curl with no open file to write to. Try something like this:
// $fp = fopen($path, 'x');  <-- remove this line
$url = $d;
$curly[$id] = curl_init($url);
curl_setopt($curly[$id], CURLOPT_HEADER, 0);
curl_setopt($curly[$id], CURLOPT_FILE, fopen($path, 'x'));
// fclose($fp);  <-- remove this line
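For reference, here is a minimal sketch of the whole function with the file handles kept in an array and closed only after the transfers have finished; the paths, options, and variable names follow the question:

function multiRequest($data, $options = array()) {
    $curly = array();
    $files = array();
    $mh = curl_multi_init();

    foreach ($data as $id => $d) {
        $path = 'image_'.$id.'.png';
        if (file_exists($path)) { unlink($path); }

        $files[$id] = fopen($path, 'x');          // keep the handle open
        $curly[$id] = curl_init($d);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_FILE, $files[$id]);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);                   // wait for activity instead of busy-looping
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        curl_multi_remove_handle($mh, $c);
        fclose($files[$id]);                      // close only after the transfer is done
    }
    curl_multi_close($mh);
}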

Related

How to turn Curl response into Json so I can use it on the front-end?

I have a function which uses curl_multi to make several GET requests. However, I can't figure out how to turn the response into JSON so I can access it on the front end. Usually I use json_decode(); however, it is not working this time. Maybe it's because I'm trying to decode a whole array of strings instead of a single string? Currently the response looks like this on the front-end:
And here's my function:
public function getVisited($username) {
    $user = User::where('username', $username)->first();
    $visitedPlaces = $user->visitedPlaces;

    $finalPlaces = [];
    foreach ($visitedPlaces as $visitedPlace) {
        $url = "https://maps.googleapis.com/maps/api/place/details/json?placeid=" . $visitedPlace->place_id . "&key=AIzaSyDQ64lYvtaYYYNWxLzkppdN-n0LulMOf4Y";
        array_push($finalPlaces, $url);
    }

    $result = array();
    $curly = array();
    $mh = curl_multi_init();

    foreach ($finalPlaces as $id => $d) {
        $curly[$id] = curl_init();
        $url = $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($curly[$id], CURLOPT_SSL_VERIFYPEER, 0); // Skip SSL verification
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }
    curl_multi_close($mh);

    return response()->json([
        'sights' => $result
    ], 201);
}
If you are expecting a JSON string from curl_multi_getcontent($c), then you should decode it:
$result[$id] = json_decode(curl_multi_getcontent($c));
Then it should be encoded in the response appropriately.
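If you would rather work with associative arrays than stdClass objects on the PHP side, json_decode() accepts a second argument. A short sketch of the collection loop with that variant, everything else as in the question:

foreach ($curly as $id => $c) {
    // Decode each body; the second argument (true) yields associative arrays.
    $result[$id] = json_decode(curl_multi_getcontent($c), true);
    curl_multi_remove_handle($mh, $c);
}
curl_multi_close($mh);

return response()->json(['sights' => $result], 201);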

Trying to figure out how to make asynchronous GET requests with Curl_multi. Code not returning expected stuff but not giving errors either

I'm currently trying to figure out how to make asynchronous GET requests in PHP. I was pointed towards Curl_multi but I am having a hard time figuring it out.
I have an array with 3 URLs to which I want to make the GET requests. Sadly, my code gives no errors; however, the returned result is just an array with 3 empty elements " ". I'm not too familiar with cURL, but I checked a couple of articles on how to use it and cobbled this together.
Am I doing something wrong?
public function getVisited() {
    $finalPlaces = [
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=1&key=',
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=2&key=',
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=3&key='
    ];

    $curly = array();
    $result = array();
    $mh = curl_multi_init();

    foreach ($finalPlaces as $id => $d) {
        $curly[$id] = curl_init();
        $url = $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }
    curl_multi_close($mh);
}
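One thing worth checking in the snippet above: $result is filled but never returned, so if this matches the code that is actually running, the caller can never see the response bodies. A minimal sketch of the tail of the function with the return added (the json_decode() step is optional and assumes the responses are valid JSON):

foreach ($curly as $id => $c) {
    // Collect each body; decode here if you want arrays instead of raw JSON strings.
    $result[$id] = json_decode(curl_multi_getcontent($c), true);
    curl_multi_remove_handle($mh, $c);
}
curl_multi_close($mh);

return $result; // without this, the function returns null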

Need help using multi curl in php to upload files

My current script to upload photos goes like this:
foreach ($files as $file) {
    $data = base64_encode(file_get_contents($file));
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($ch);
}
I can have up to 1000 files to upload to a remote server, and it can take a long time for curl to process everything. The solution seems to be multi-curl; however, there is a unique aspect:
what I need from multi-curl is to save each response into an array like $upload_results[] = array($file, 'response')
How do I do this?
Thanks!
Essentially, this can be done by creating the handles in an array with the file names as keys and then reading the results into another array with the same keys.
function uploadLotsOfFiles($files) {
    $mh = curl_multi_init();
    $handles = array();
    $results = array();

    foreach ($files as $file) {
        $handles[$file] = curl_init();
        curl_setopt($handles[$file], CURLOPT_RETURNTRANSFER, true);
        // File reading code and other options from your question go here
        curl_multi_add_handle($mh, $handles[$file]);
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // Prevent eating CPU
    } while ($running > 0);

    foreach ($handles as $file => $handle) {
        $results[$file] = curl_multi_getcontent($handle);
        curl_multi_remove_handle($mh, $handle);
    }

    return $results;
}
If you really need the result in the format you specified (which I don't recommend, since it's less elegant than using the file name as a key), then
$results[] = array($file, curl_multi_getcontent($handle));
can be used in place of $results[$file] = curl_multi_getcontent($handle);
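For reference, a rough sketch of what the placeholder comment inside the loop might contain if you carry over the upload options from the question ($url is assumed to be the same endpoint variable as in the original script):

// Inside foreach ($files as $file), where the placeholder comment sits:
$data = base64_encode(file_get_contents($file)); // same encoding as the original loop
curl_setopt($handles[$file], CURLOPT_URL, $url);
curl_setopt($handles[$file], CURLOPT_POST, 1);
curl_setopt($handles[$file], CURLOPT_POSTFIELDS, $data);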

PHP Multi Curl is Running Very Slow On High Traffic Server Cluster

I'm running into speed issues with multi curl. I am using it to grab XML from various URLs, all with response times of under 300 ms, but my multi curl function is taking over 1 second to grab these URLs (only about 10-15 URLs). Below is the code I am using:
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();
    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();
        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($curly[$id], CURLOPT_NOSIGNAL, 1);
        curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    do {
        curl_multi_select($mh, .01);
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM || $running);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        if (curl_errno($c) == 0) {
            $result[$id] = curl_multi_getcontent($c);
        }
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
Is there anything I should be doing to speed this up? My clients throw away the request if it takes over 500 ms to complete, so I want it to run only as long as the longest request takes. I have the timeout set to 750 ms because, even though the request times are under 300 ms, my function returns no results if I set it any lower than that.
Lower the timeout here, curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750);, to 500 ms.
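That is, with everything else unchanged, something like:

curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 500); // match the 500 ms budget the clients enforce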

curl multi download image

I would like to know if it's possible to optimize this request with the multi curl handler.
Thanks
$array_img = array(
    'https://www.foooooobbbaaarrr.fr/images/1.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/2.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/3.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/4.jpeg'
);

foreach ($array_img as $k => $v) {
    $ch = curl_init($v);
    $name = ($k + 1) . '.jpeg';
    $fp = fopen($name, 'wb');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}
That's a very open-ended question, but with PHP's cURL multi functions you can at least fetch all those files in parallel instead of one at a time.
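A minimal sketch of that loop rewritten on top of the multi interface, with the same filenames and URLs as in the question and each file handle kept open until all transfers are done:

$array_img = array(
    'https://www.foooooobbbaaarrr.fr/images/1.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/2.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/3.jpeg',
    'https://www.foooooobbbaaarrr.fr/images/4.jpeg'
);

$mh = curl_multi_init();
$handles = array();
$fps = array();

foreach ($array_img as $k => $v) {
    $name = ($k + 1) . '.jpeg';
    $fps[$k] = fopen($name, 'wb');            // stays open until the transfer finishes
    $handles[$k] = curl_init($v);
    curl_setopt($handles[$k], CURLOPT_FILE, $fps[$k]);
    curl_setopt($handles[$k], CURLOPT_HEADER, 0);
    curl_multi_add_handle($mh, $handles[$k]);
}

$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);                   // wait for activity instead of spinning
} while ($running > 0);

foreach ($handles as $k => $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
    fclose($fps[$k]);
}
curl_multi_close($mh);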
