I have a lot of domain addresses and I want to check their status.
I tried multi curl, but it's too slow.
class BotCronJobs extends Controller {
    public function __construct() {
    }

    public function index() {
        $Query = servers::all();
        $urls = [];

        foreach ($Query as $item) {
            $urls[$item->id] = $item->serverUrl;
        }

        var_dump($this->test($urls));
    }

    public function test($urls = []) {
        $status = [];
        $mh = curl_multi_init();

        foreach ($urls as $key => $value) {
            $ch[$key] = curl_init($value);
            curl_setopt($ch[$key], CURLOPT_NOBODY, true);
            curl_setopt($ch[$key], CURLOPT_HEADER, true);
            curl_setopt($ch[$key], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch[$key], CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($ch[$key], CURLOPT_SSL_VERIFYHOST, false);
            curl_multi_add_handle($mh, $ch[$key]);
        }

        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        foreach (array_keys($ch) as $key) {
            $status[$key][] = curl_getinfo($ch[$key], CURLINFO_HTTP_CODE);
            $status[$key][] = curl_getinfo($ch[$key], CURLINFO_EFFECTIVE_URL);
            curl_multi_remove_handle($mh, $ch[$key]);
        }

        curl_multi_close($mh);
        return $status;
    }
}
I just need to check whether each server is online or not, and every server has an id; it's important for me to know which server is offline.
Is there any faster way?
Your function looks nearly optimal to me. I guess it's slow because some domains are not responding and the default CURLOPT_CONNECTTIMEOUT is too high. Try setting CURLOPT_CONNECTTIMEOUT to 1 and CURLOPT_TIMEOUT to 2; that should make it stall for at most 2 seconds on a non-responding domain.
Also, if you don't actually need the HTTP response code but are OK with just checking whether the server is accepting connections or not, using the socket API might be faster.
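As a rough illustration of both suggestions (a sketch only, assuming the same $urls array keyed by server id that your test() method builds; the exact timeout values are guesses):

// 1) Inside the foreach that creates the handles: fail fast on dead hosts.
curl_setopt($ch[$key], CURLOPT_CONNECTTIMEOUT, 1); // give up connecting after 1 second
curl_setopt($ch[$key], CURLOPT_TIMEOUT, 2);        // give up on the whole request after 2 seconds

// 2) Alternative: skip HTTP entirely and only test whether the port accepts connections.
//    Note this simple helper checks hosts one by one; a fully parallel version would
//    need non-blocking socket_connect() calls.
function checkServersBySocket(array $urls, int $timeoutSeconds = 2): array
{
    $status = [];
    foreach ($urls as $id => $url) {
        $host = parse_url($url, PHP_URL_HOST) ?: $url;
        $port = parse_url($url, PHP_URL_PORT)
            ?: (parse_url($url, PHP_URL_SCHEME) === 'https' ? 443 : 80);

        $errno = 0;
        $errstr = '';
        $fp = @fsockopen($host, $port, $errno, $errstr, $timeoutSeconds);

        $status[$id] = ($fp !== false); // true = accepting connections, false = offline
        if ($fp !== false) {
            fclose($fp);
        }
    }
    return $status;
}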
Related
I have a function which uses curl_multi to make several GET requests. However, I can't figure out how to turn the response into JSON so I can access it on the front end. Usually I use json_decode(), but it is not working this time. Maybe it's because I'm trying to decode a whole array of strings instead of a single string? Currently the response looks like this on the front end:
And here's my function:
public function getVisited($username) {
    $user = User::where('username', $username)->first();
    $visitedPlaces = $user->visitedPlaces;

    $finalPlaces = [];
    foreach ($visitedPlaces as $visitedPlace) {
        $url = "https://maps.googleapis.com/maps/api/place/details/json?placeid=" . $visitedPlace->place_id . "&key=AIzaSyDQ64lYvtaYYYNWxLzkppdN-n0LulMOf4Y";
        array_push($finalPlaces, $url);
    }

    $result = array();
    $curly = array();
    $mh = curl_multi_init();

    foreach ($finalPlaces as $id => $d) {
        $curly[$id] = curl_init();
        $url = $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($curly[$id], CURLOPT_SSL_VERIFYPEER, 0); // Skip SSL verification
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }
    curl_multi_close($mh);

    return response()->json([
        'sights' => $result
    ], 201);
}
If you are expecting a JSON string from curl_multi_getcontent($c), then you should decode it:
$result[$id] = json_decode(curl_multi_getcontent($c));
Then it should be encoded in the response appropriately.
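For example, applied to the loop in your function (a minimal sketch, assuming every handle returns a JSON body from the Places API):

foreach ($curly as $id => $c) {
    // Decode each body so response()->json() serializes nested data instead of raw strings.
    // json_decode() returns null if the body was empty or not valid JSON.
    $result[$id] = json_decode(curl_multi_getcontent($c), true);
    curl_multi_remove_handle($mh, $c);
}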
I'm currently trying to figure out how to make asynchronous GET requests in PHP. I was pointed towards curl_multi, but I am having a hard time figuring it out.
I have an array with 3 URLs to which I want to make GET requests. Sadly my code gives no errors, but the returned result is just an array with 3 empty elements " ". I'm not too familiar with curl, but I checked a couple of articles on how to use it and cobbled this together.
Am I doing something wrong?
public function getVisited() {
    $finalPlaces = [
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=1&key=',
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=2&key=',
        'https://maps.googleapis.com/maps/api/place/details/json?placeid=3&key='
    ];

    $curly = array();
    $result = array();
    $mh = curl_multi_init();

    foreach ($finalPlaces as $id => $d) {
        $curly[$id] = curl_init();
        $url = $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }
    curl_multi_close($mh);
}
My current script to upload photos goes like this:
foreach ($files as $file) {
    $data = base64_encode(file_get_contents($file));
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($ch);
}
I can have up to 1000 files to upload to a remote server, and it can take a long time for curl to process everything. The solution seems to be multi-curl; however, there is a unique aspect:
What I need from multi-curl is to save the response into an array like $upload_results[] = array($file, 'response').
How do I do this?
Thanks!
Essentially, this can be done by creating the handles in an array with the file names as keys and then reading the results into another array with the same keys.
function uploadLotsOfFiles($files) {
    $mh = curl_multi_init();
    $handles = array();
    $results = array();

    foreach ($files as $file) {
        $handles[$file] = curl_init();
        curl_setopt($handles[$file], CURLOPT_RETURNTRANSFER, true);
        // File reading code and other options from your question go here
        curl_multi_add_handle($mh, $handles[$file]);
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // Prevent eating CPU
    } while ($running > 0);

    foreach ($handles as $file => $handle) {
        $results[$file] = curl_multi_getcontent($handle);
        curl_multi_remove_handle($mh, $handle);
    }
    curl_multi_close($mh); // Clean up the multi handle once all results are collected

    return $results;
}
If you really need the result to be in the format you specified (which I don't recommend since it's less elegant than using the file as a key):
$results[] = array($file, curl_multi_getcontent($handle));
Can be used in place of $results[$file] = curl_multi_getcontent($handle);
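For completeness, here is a hedged sketch of what the "file reading code and other options" placeholder could look like, simply carrying over the POST options from your original single-handle loop ($url is assumed to be your upload endpoint, as in the question):

// Possible body for the placeholder inside the foreach, mirroring the original script.
$data = base64_encode(file_get_contents($file));
curl_setopt($handles[$file], CURLOPT_URL, $url);          // $url: the remote upload endpoint
curl_setopt($handles[$file], CURLOPT_POST, 1);
curl_setopt($handles[$file], CURLOPT_POSTFIELDS, $data);  // raw base64 payload, as in the question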
I'm running into speed issues with multi curl. I am using it to grab XML from various URLs, all with response times of under 300ms, but my multi curl function is taking over 1 second to grab these URLs (only about 10-15 URLs). Below is the code I am using:
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($curly[$id], CURLOPT_NOSIGNAL, 1);
        curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750);

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    do {
        curl_multi_select($mh, .01);
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM || $running);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        if (curl_errno($c) == 0)
            $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
Is there anything I should be doing to speed this up? My clients throw away the request if it takes over 500ms to complete, so I want the function to run only as long as the longest request takes. I have the timeout set to 750ms because, even though the individual request times are under 300ms, my function returns no results when the timeout is below 750ms.
Lower this line: curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750); to 500 ms.
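A minimal sketch of that change; the separate connect timeout is my own addition, not part of the original answer:

curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 500);        // whole request must finish within 500 ms
curl_setopt($curly[$id], CURLOPT_CONNECTTIMEOUT_MS, 250); // optional extra: cap connection setup (my assumption)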
For some reason curl_errno($value) always returns 0 instead of 6 when I try a URL like stkovrflow.com, which is a non-existent domain. curl is supposed to return 6, but I'm getting 0.
Can someone tell me what's wrong with my code?
This is how I check for a curl error:
if (curl_errno($value) !== 0)
{
    $handles[$key]['error_code'] = curl_errno($value);
}
Here is my full code
<?php

protected function curl($url)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($url as $link)
    {
        $handles[$link] = curl_init($link);
        curl_setopt($handles[$link], CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($handles[$link], CURLOPT_HEADER, true);
        curl_setopt($handles[$link], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($handles[$link], CURLOPT_FAILONERROR, true);
        curl_setopt($handles[$link], CURLOPT_FOLLOWLOCATION, $this->curlFollowLocation);
        curl_setopt($handles[$link], CURLOPT_MAXREDIRS, $this->curlMaxRedirects);
        curl_setopt($handles[$link], CURLOPT_TIMEOUT, $this->curlTimeout);
        curl_setopt($handles[$link], CURLOPT_USERAGENT, $this->curlUserAgent);
        curl_setopt($handles[$link], CURLOPT_AUTOREFERER, true);
        curl_multi_add_handle($mh, $handles[$link]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
        usleep(200000);
    } while ($running > 0);

    foreach ($handles as $key => $value)
    {
        $handles[$key] = false;
        $handles[$key]['error_code'] = false;

        if (curl_errno($value) !== 0)
        {
            $handles[$key]['error_code'] = curl_errno($value);
        } else {
            $response = curl_multi_getcontent($value);
            $httpCode = curl_getinfo($value, CURLINFO_HTTP_CODE);
            if ($httpCode != 200) {
                $handles[$key]['error_code'] = $httpCode;
            } else {
                $handles[$key]['html'] = $response;
            }
        }

        curl_multi_remove_handle($mh, $value);
        curl_close($value);
    }

    curl_multi_close($mh);
    return $handles;
}
Update:
Looks like curl_errno() does not work in curl_multi mode (see bug report). Instead we should use curl_multi_info_read(). When I use curl_multi_info_read() like this:
$e_code = curl_multi_info_read($mh);
var_dump($e_code);
This is my var_dump output:
array (size=3)
  'msg' => int 1
  'result' => int 6
  'handle' => resource(7, curl)
As you can see, it returns 6 correctly. But the PHP documentation says:
The data the returned resource points to will not survive calling curl_multi_remove_handle().
Unfortunately my script depends on curl_multi_remove_handle(). Any solution?
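One common pattern (a sketch on my part, not something from the original thread) is to drain curl_multi_info_read() and record each handle's result code before any curl_multi_remove_handle() call, so the error information survives:

// $handles is assumed to be the same $link => handle map built earlier in curl().
$errorCodes = array();
while (($info = curl_multi_info_read($mh)) !== false) {
    // $info['handle'] is the easy handle, $info['result'] the CURLE_* code (6 = could not resolve host).
    $key = array_search($info['handle'], $handles, true);
    if ($key !== false) {
        $errorCodes[$key] = $info['result'];
    }
}
// Now it is safe to loop over $handles, consult $errorCodes[$key] instead of curl_errno(),
// and call curl_multi_remove_handle() / curl_close() as before.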