Running into speed issues with curl_multi. I am using curl_multi to grab XML from various URLs, all with response times of under 300ms, but my multi-curl function is taking over one second to fetch them (only about 10-15 URLs). Below is the code I am using:
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($curly[$id], CURLOPT_NOSIGNAL, 1);
        curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750);

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    do {
        curl_multi_select($mh, .01);
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM || $running);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        if (curl_errno($c) == 0) {
            $result[$id] = curl_multi_getcontent($c);
        }
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
Is there anything I should be doing to speed this up? My clients throw away the request if it takes over 500ms to complete, so I want the function to run only as long as the longest request takes. I have the timeout set to 750ms because, even though the individual request times are less than 300ms, my function returns no results if I set it below 750ms.
Then lower the timeout: change curl_setopt($curly[$id], CURLOPT_TIMEOUT_MS, 750); to 500ms.
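It may also help to restructure the exec loop. Calling curl_multi_select() before the first curl_multi_exec() waits on a multi handle that has no active transfers yet, and the 10 ms select timeout keeps the loop spinning. A minimal sketch of the conventional pattern from the PHP manual, as a drop-in for the loop above (no new options assumed):

// execute the handles: perform first, then wait for activity
$running = null;
do {
    $status = curl_multi_exec($mh, $running);
} while ($status === CURLM_CALL_MULTI_PERFORM);

while ($running && $status === CURLM_OK) {
    // block (up to 1s by default) until at least one transfer has activity
    if (curl_multi_select($mh) === -1) {
        usleep(100); // select failed; back off briefly instead of spinning
    }
    do {
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM);
}

With this structure the function should return as soon as the slowest transfer finishes, so a timeout only slightly above your worst expected response time should be safe.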
I'm currently trying to figure out how to make asynchronous GET requests in PHP. I was pointed towards curl_multi, but I am having a hard time figuring it out.
I have an array with 3 URLs to which I want to make the GET requests. Sadly, my code gives no errors; however, the returned result is just an array with 3 empty elements. I'm not too familiar with cURL, but I checked a couple of articles on how to use it and cobbled this together.
Am I doing something wrong?
public function getVisited() {
    $finalPlaces = ['https://maps.googleapis.com/maps/api/place/details/json?placeid=1&key=',
                    'https://maps.googleapis.com/maps/api/place/details/json?placeid=2&key=',
                    'https://maps.googleapis.com/maps/api/place/details/json?placeid=3&key='];

    $curly = array();
    $result = array();
    $mh = curl_multi_init();

    foreach ($finalPlaces as $id => $d) {
        $curly[$id] = curl_init();

        $url = $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    curl_multi_close($mh);
}
I want to check the status of a lot of domain addresses. I tried multi curl, but it's too slow.
class BotCronJobs extends Controller {
    public function __construct() {
    }

    public function index() {
        $Query = servers::all();
        $urls = [];
        foreach ($Query as $item) {
            $urls[$item->id] = $item->serverUrl;
        }
        var_dump($this->test($urls));
    }

    public function test($urls = []) {
        $status = [];
        $mh = curl_multi_init();

        foreach ($urls as $key => $value) {
            $ch[$key] = curl_init($value);
            curl_setopt($ch[$key], CURLOPT_NOBODY, true);
            curl_setopt($ch[$key], CURLOPT_HEADER, true);
            curl_setopt($ch[$key], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch[$key], CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($ch[$key], CURLOPT_SSL_VERIFYHOST, false);
            curl_multi_add_handle($mh, $ch[$key]);
        }

        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        foreach (array_keys($ch) as $key) {
            $status[$key][] = curl_getinfo($ch[$key], CURLINFO_HTTP_CODE);
            $status[$key][] = curl_getinfo($ch[$key], CURLINFO_EFFECTIVE_URL);
            curl_multi_remove_handle($mh, $ch[$key]);
        }

        curl_multi_close($mh);
        return $status;
    }
}
I just need to check whether each server is online or not. Every server has an id, and it's important for me to know which server is offline.
Is there any faster way?
Your function looks nearly optimal to me. I guess it's slow because some domains are not responding and the default CURLOPT_CONNECTTIMEOUT is too high. Try setting CURLOPT_CONNECTTIMEOUT to 1 and CURLOPT_TIMEOUT to 2; that should make it stall for at most 2 seconds on non-responding domains.
Also, if you don't actually need the HTTP response code but are OK with just checking whether the server is accepting connections or not, using the socket API might be faster.
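For example, the timeouts would slot into the existing foreach, and a socket-level check could look roughly like the sketch below. This assumes a plain TCP connect on port 80 is enough to count a server as "online", and that you pass a bare hostname rather than a full URL (parse_url() can extract it from serverUrl):

// inside the foreach that builds the handles:
curl_setopt($ch[$key], CURLOPT_CONNECTTIMEOUT, 1); // give up connecting after 1s
curl_setopt($ch[$key], CURLOPT_TIMEOUT, 2);        // hard cap on the whole transfer

// socket-based alternative: only tests whether the host accepts TCP connections
function isAccepting($host, $port = 80, $timeoutSec = 1) {
    $errno = 0;
    $errstr = '';
    $conn = @fsockopen($host, $port, $errno, $errstr, $timeoutSec);
    if ($conn === false) {
        return false; // connection refused or timed out
    }
    fclose($conn);
    return true;
}

Note that the fsockopen() check runs sequentially per host, so it pays off mainly when most hosts answer quickly; the cURL timeouts keep the parallel approach bounded either way.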
I'm attempting some sort of multitasking using cURL. The task the cURL calls perform is to look for a string in several 4-megabyte files. Delegating one cURL call to each file was already a gain over one PHP script scanning them all: it took 3 to 4 seconds to scan all the files. Then I decided to create TWO cURL calls for each file, each one scanning half of the file. I didn't get the speed increase I expected, so I tried 4 cURL calls per file, because I wanted to check the overhead/efficiency ratio. Some profiling showed me that:
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);
takes around 4 seconds to complete 28 cURL calls, yet the longest any single call took was 0.4 s. Why the difference? I thought I was blocking it somehow, but the summed times of the 28 executions (measured independently inside the PHP scripts that were called) would easily exceed 10 s, so the 'multitasking' was clearly happening.
To make things clearer, this is what I am using:
function multiRequest($data, $options = array(), $color) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        //curl_setopt($curly[$id], CURLOPT_TIMEOUT, 1); <-- this would make my computer a zombie, seemingly worrying about NOTHING but the curls
        curl_setopt($curly[$id], CURLOPT_FRESH_CONNECT, true); //<-- I added this line cuz I heard its good
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        // post?
        if (is_array($d)) {
            if (!empty($d['post'])) {
                curl_setopt($curly[$id], CURLOPT_POST, 1);
                curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d['post']);
            }
        }

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running); //<--- grrrr
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);
}
(I've read there is a way to add a callback function that gets triggered on progress, which would let me manipulate the "get content and remove handles" part. But is it possible to skip the waiting in the curl_multi_exec loop while the cURL transfers still execute, or do we have to keep calling curl_multi_exec for the execution to proceed? What I'm imagining: 1. the main script fires cURL calls to the PHP scripts; 2. cURL doesn't wait, returns immediately, and from our point of view is done, but it did launch the desired PHP scripts; 3. those scripts write data to files at their own pace while the main program scans for those files, with cURL theoretically still dealing with its overhead while we have already returned to the client...)
Thanks in advance.
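One observation on the code as posted: the do/while loop calls curl_multi_exec() back to back with nothing to block on, so the main script burns a full CPU core while waiting, which would also explain the "zombie" effect noted next to the CURLOPT_TIMEOUT line. A minimal sketch of the same loop with curl_multi_select() added, so the process sleeps until a transfer has activity:

// execute the handles without busy-waiting
$running = null;
do {
    curl_multi_exec($mh, $running);
    // block until at least one handle is readable/writable;
    // back off briefly if select() reports failure (-1)
    if ($running && curl_multi_select($mh) === -1) {
        usleep(100);
    }
} while ($running > 0);

As for true fire-and-forget (step 2 of the idea above): curl_multi cannot simply be abandoned mid-transfer from the same process. One common, if hacky, workaround is a very short CURLOPT_TIMEOUT_MS on requests whose responses you don't need (with the called scripts using ignore_user_abort() so they keep running); a cleaner one is moving the scanning work to background processes.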
My current script to upload photos goes like this:
foreach ($files as $file) {
    $data = base64_encode(file_get_contents($file));

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($ch);
}
I can have up to 1000 files to upload to a remote server, and it can take a long time for cURL to process everything. The solution seems to be multi-curl; however, there is a unique aspect:
what I need from multi-curl is to save each response into an array like $upload_results[] = array($file, 'response')
How do I do this?
Thanks!
Essentially, this can be done by creating the handles in an array with the file names as keys and then reading the results into another array with the same keys.
function uploadLotsOfFiles($files) {
    $mh = curl_multi_init();
    $handles = array();
    $results = array();

    foreach ($files as $file) {
        $handles[$file] = curl_init();
        curl_setopt($handles[$file], CURLOPT_RETURNTRANSFER, true);
        // File reading code and other options from your question go here
        curl_multi_add_handle($mh, $handles[$file]);
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // Prevent eating CPU
    } while ($running > 0);

    foreach ($handles as $file => $handle) {
        $results[$file] = curl_multi_getcontent($handle);
        curl_multi_remove_handle($mh, $handle);
    }

    curl_multi_close($mh); // clean up the multi handle
    return $results;
}
If you really need the result in the format you specified (which I don't recommend, since it's less elegant than using the file name as the key), then
$results[] = array($file, curl_multi_getcontent($handle));
can be used in place of $results[$file] = curl_multi_getcontent($handle);
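For completeness, the elided "file reading code and other options" could look roughly like this, reusing the base64 POST approach from the question ($url is assumed to be the upload endpoint, as in the original loop):

// inside the foreach ($files as $file) loop, before curl_multi_add_handle():
$data = base64_encode(file_get_contents($file)); // same encoding as the original script
curl_setopt($handles[$file], CURLOPT_URL, $url); // $url: the upload endpoint from the question
curl_setopt($handles[$file], CURLOPT_POST, 1);
curl_setopt($handles[$file], CURLOPT_POSTFIELDS, $data);

With 1000 files it may also be worth batching the uploads (say, 50-100 handles per multi run) so the memory held by the base64-encoded bodies stays bounded.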
I have 3 web services that provide data for my hotel booking engine. It takes too long if I call them sequentially, so I wanted to run them using threads. But I'm not sure whether PHP threading will support this and whether it is safe (since all 3 processes that handle the web services will read and write shared tables).
Can anyone advise me on how I should proceed?
I generally use the following function to send simultaneous HTTP requests to web services:
<?php
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles
    // then add them to the multi-handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        // post?
        if (is_array($d)) {
            if (!empty($d['post'])) {
                curl_setopt($curly[$id], CURLOPT_POST, 1);
                curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d['post']);
            }
        }

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
?>
To consume the web services I use:
<?php
$data = array(
    'http://search.yahooapis.com/VideoSearchService/V1/videoSearch?appid=YahooDemo&query=Pearl+Jam&output=json',
    'http://search.yahooapis.com/ImageSearchService/V1/imageSearch?appid=YahooDemo&query=Pearl+Jam&output=json',
    'http://search.yahooapis.com/AudioSearchService/V1/artistSearch?appid=YahooDemo&artist=Pearl+Jam&output=json'
);

$r = multiRequest($data);

echo '<pre>';
print_r($r);
?>
Hope this helps you :)
Also check this answer here.
If AJAX is an option, use AJAX to call your web service.
Since an AJAX request is asynchronous, you can call the service without blocking and put the follow-up processing code in the success callback.
You can use AJAX like this:
$.ajax({
    type: "GET", // insert type
    dataType: "jsonp",
    url: "http://address/to/your/web/service",
    success: function(data) {
        alert(data);
    }
});
Please refer here for more info.