I have an HTTP web API whose URL is example.com/api.
I want to send 50 POST requests to it at once. Right now I'm using curl_multi to send the requests. Here is the code I'm using now:
$n = "0";
while ($n < 51) {
$data[$n] = [
'parameter1' => "value1",
'parameter2' => "value2",
];
$urls[$n] = "https://example.com/api";
$n++;
}
if (!empty($urls)) {
    // create the array of cURL handles and add them to a multi handle
    $mh = curl_multi_init();
    foreach ($urls as $key => $url) {
        $chs[$key] = curl_init($url);
        curl_setopt($chs[$key], CURLOPT_ENCODING, '');
        curl_setopt($chs[$key], CURLOPT_CONNECTTIMEOUT, 2);
        curl_setopt($chs[$key], CURLOPT_TIMEOUT, 90);
        curl_setopt($chs[$key], CURLOPT_SSL_VERIFYHOST, 0);
        curl_setopt($chs[$key], CURLOPT_SSL_VERIFYPEER, 0);
        curl_setopt($chs[$key], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($chs[$key], CURLOPT_POST, true);
        curl_setopt($chs[$key], CURLOPT_POSTFIELDS, http_build_query($data[$key]));
        curl_multi_add_handle($mh, $chs[$key]);
    }

    // running the requests
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running);

    // getting the responses
    foreach (array_keys($chs) as $key) {
        $error = curl_error($chs[$key]);
        echo $result = curl_multi_getcontent($chs[$key]);
    }
}
This code works perfectly, but I want to know whether there is a better way, or anything I can do to make it more efficient and faster. In my case the API URL is fixed but the POST content is different for each request. Should I use a single cURL handle in a loop and reuse it instead? My main concern is speed.
Thanks in advance.
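For reference, a minimal sketch of the same loop reworked along the lines usually suggested for this pattern: let curl_multi_select() block until a socket has activity instead of spinning on curl_multi_exec(), and remove and close each handle once its response has been read. This is a sketch under those assumptions, not a drop-in replacement; it reuses the fixed URL and the per-request $data payloads from the code above:

$mh = curl_multi_init();
foreach ($data as $key => $payload) {
    $chs[$key] = curl_init("https://example.com/api");
    curl_setopt_array($chs[$key], [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($payload),
        CURLOPT_CONNECTTIMEOUT => 2,
        CURLOPT_TIMEOUT        => 90,
    ]);
    curl_multi_add_handle($mh, $chs[$key]);
}

// Drive the transfers; curl_multi_select() sleeps until there is network
// activity, so the loop does not burn CPU while waiting on responses.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh, 1.0);
    }
} while ($running && $status === CURLM_OK);

// Collect the responses and free the handles.
foreach ($chs as $key => $ch) {
    $responses[$key] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);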
I was wondering if it's possible to open multiple URLs with cURL, or maybe something else.
This is what I have tried so far:
$urls = array(
    "http://google.com",
    "http://youtube.com",
);

foreach ($urls as $url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);
    curl_exec($ch);
    curl_close($ch);
}
The 200 ms timeout is there to give each site time to load fully.
Maybe you know of some alternatives.
Is it possible to open multiple URLs in PHP at the same time? Not client side, server side.
Your solution would be simultaneous cURL HTTP requests.
For a faster implementation, you can use this function (thanks to phpied):
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        // post?
        if (is_array($d)) {
            if (!empty($d['post'])) {
                curl_setopt($curly[$id], CURLOPT_POST, 1);
                curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d['post']);
            }
        }

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
And use it like this:
$data = array(
    'http://search.yahooapis.com/VideoSearchService/V1/videoSearch?appid=YahooDemo&query=Pearl+Jam&output=json',
    'http://search.yahooapis.com/ImageSearchService/V1/imageSearch?appid=YahooDemo&query=Pearl+Jam&output=json',
    'http://search.yahooapis.com/AudioSearchService/V1/artistSearch?appid=YahooDemo&artist=Pearl+Jam&output=json'
);
$r = multiRequest($data);
echo '<pre>';
print_r($r);
Hope it helps.
Also read this.
I have a script that fetches data from another server via curl_multi_exec using the code below. The script works fine, but I'm getting an out-of-memory exception.
$curly = array();  // array of curl handles
$result = array(); // data to be returned
$mh = curl_multi_init(); // multi handle

// query data for each of the sub-queries in $xmlarray
foreach ($xmlarray as $id => $d) {
    $curly[$id] = curl_init();
    curl_setopt($curly[$id], CURLOPT_URL, $url);
    curl_setopt($curly[$id], CURLOPT_POST, true);
    curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d);
    curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curly[$id], CURLOPT_TIMEOUT, 60);
    curl_setopt($curly[$id], CURLOPT_SSLVERSION, 3);
    curl_multi_add_handle($mh, $curly[$id]);
}

// execute the handles
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// get content and remove handles
foreach ($curly as $id => $c) {
    $result[$id] = curl_multi_getcontent($c);
    curl_multi_remove_handle($mh, $c);
}

$active = null;
curl_multi_close($mh);

file_put_contents('test.xml', $result);
$xmlarray here contains an array of requests, each of which covers around 500 users. When running the script for 5,000 users, everything works fine; when running it for 10,000 users, I get an out-of-memory exception, and debugging shows that most of the memory is used by curl_multi_exec().
What would be the best way for me to overcome this? Any assistance is highly appreciated. Thanks in advance.
EDIT
I tried splitting my $xmlarray into a number of smaller arrays and processing each batch separately (code below). This got me from 5k to 13k users being processed.
$xmlarrayB = array_chunk($xmlarray, 5, true);
if (is_array($xmlarrayB)) {
    foreach ($xmlarrayB as $xmlarrayBA) {
        $curly = array();  // array of curl handles
        $result = array(); // data to be returned
        $mh = curl_multi_init(); // multi handle

        // query data for each of the sub-queries in this chunk
        foreach ($xmlarrayBA as $id => $d) {
            $curly[$id] = curl_init();
            curl_setopt($curly[$id], CURLOPT_URL, $url);
            curl_setopt($curly[$id], CURLOPT_POST, true);
            curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d);
            curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($curly[$id], CURLOPT_TIMEOUT, 60);
            curl_setopt($curly[$id], CURLOPT_SSLVERSION, 3);
            curl_multi_add_handle($mh, $curly[$id]);
        }

        // execute the handles
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        // get content and remove handles
        foreach ($curly as $id => $c) {
            $result[$id] = curl_multi_getcontent($c);
            curl_multi_remove_handle($mh, $c);
        }

        $active = null;
        curl_multi_close($mh);
    }
}
Any idea how to increase that number to 50k users?
EDIT2 - sample $xmlarray just for 2 users
Accept-Encoding:
gzip&token=305c7c5be78b5c8dd583312fe20578ac&subid=test_sub_id&idomain=adk.mediaff.com&cdomain=adk.mediaff.com&request=%3Crequest%3E%3Cemail%3E%3Crecipient%3Ed3e51df8f588139fb210d898c5964c3f%3C%2Frecipient%3E%3Clist%3E23413%3C%2Flist%3E%3Cdomain%3Eicloud.com%3C%2Fdomain%3E%3Ccountrycode%3E%3C%2Fcountrycode%3E%3Cmetrocode%3E%3C%2Fmetrocode%3E%3Cpostalcode%3E%3C%2Fpostalcode%3E%3Cgender%3E2%3C%2Fgender%3E%3Ctest%3E0%3C%2Ftest%3E%3C%2Femail%3E%3Cemail%3E%3Crecipient%3E728929dfbc0d785e41316d4fa97518e9%3C%2Frecipient%3E%3Clist%3E23413%3C%2Flist%3E%3Cdomain%3Ehotmail.com%3C%2Fdomain%3E%3Ccountrycode%3E%3C%2Fcountrycode%3E%3Cmetrocode%3E%3C%2Fmetrocode%3E%3Cpostalcode%3E%3C%2Fpostalcode%3E%3Cgender%3E1%3C%2Fgender%3E%3Ctest%3E0%3C%2Ftest%3E%3C%2Femail%3E%3C%2Frequest%3E&test=0
I would suggest you split your $xmlarray into chunks, maybe with a chunk size of 500 or 5000, and then execute your cURL requests for each of those chunks. Use FILE_APPEND with file_put_contents when writing the result to the file, otherwise the file will be overwritten by each chunk; see the sketch below.
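A minimal sketch of that suggestion, built on the chunked loop from the question; the chunk size of 500 and the run_chunk() helper (standing in for the curl_multi code already shown above) are illustrative assumptions, not part of the original script:

// Process the requests in chunks so each batch's memory can be released
// before the next one starts (500 per chunk is just an example size).
foreach (array_chunk($xmlarray, 500, true) as $chunk) {
    $result = run_chunk($chunk, $url); // hypothetical helper wrapping the curl_multi loop above

    // Append each batch's output instead of overwriting the file every time.
    file_put_contents('test.xml', implode("\n", $result), FILE_APPEND);

    unset($result); // let PHP reclaim the batch's memory before the next chunk
}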
I am trying to speed up my website by processing its cURL requests more efficiently. I am running about 3 requests; two go to the same server. Here is my code:
$profile = curl_init();
curl_setopt($profile, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($profile, CURLOPT_RETURNTRANSFER, true);
curl_setopt($profile, CURLOPT_FAILONERROR, true);
curl_setopt($profile, CURLOPT_URL,"https://owapi.net/api/v2/u/".$battletag."/stats/".$mode."?platform=".$platform);
$result = curl_exec($profile); //grab API data
curl_close($profile);
$stats = json_decode($result, true); //decode JSON data
$profile1 = curl_init();
curl_setopt($profile1, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($profile1, CURLOPT_RETURNTRANSFER, true);
curl_setopt($profile1, CURLOPT_FAILONERROR, true);
curl_setopt($profile1, CURLOPT_URL,"https://api.lootbox.eu/".$platform."/us/".$battletag."/profile");
$result1 = curl_exec($profile1); //grab API data
curl_close($profile1);
$stats1 = json_decode($result1, true);
$hero_stats = curl_init();
curl_setopt($hero_stats, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($hero_stats, CURLOPT_RETURNTRANSFER, true);
curl_setopt($hero_stats, CURLOPT_FAILONERROR, true);
curl_setopt($hero_stats, CURLOPT_URL,"https://api.lootbox.eu/".$platform."/us/".$battletag."/competitive-play/heroes");
$hero_play_time = curl_exec($hero_stats); //grab API data
curl_close($hero_stats);
$heroes_info = json_decode($hero_play_time, true);
How can I process these requests at the same time without restarting the connection? I want to speed up the load time of my site, because right now it takes a long time. Any help would be appreciated. I have heard of curl_multi_init() but I am not sure how to use it properly; any help with that would be welcome.
Thanks.
Well, thanks for all the help guys, especially @Andrew.
I found a solution that ended up working. Posting it here for other people with a similar problem.
function multiRequest($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        // post?
        if (is_array($d)) {
            if (!empty($d['post'])) {
                curl_setopt($curly[$id], CURLOPT_POST, 1);
                curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d['post']);
            }
        }

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = json_decode(curl_multi_getcontent($c), true);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
$data = array(
    'https://owapi.net/api/v2/u/'.$battletag.'/stats/'.$mode.'?platform='.$platform,
    'https://api.lootbox.eu/'.$platform.'/us/'.$battletag.'/profile',
    'https://api.lootbox.eu/'.$platform.'/us/'.$battletag.'/competitive-play/heroes'
);

$r = multiRequest($data);
This worked; I just had to wrap the curl_multi_getcontent() call in json_decode().
Thanks again everyone. Really appreciate the help.
I found the following function, which I've been able to use to collect and cache share counts from various social networks. So far, I can feed Twitter, LinkedIn, Facebook, and Pinterest URLs in an array to this function, and they all kick back a response that I can parse to get the count.
In an effort to speed up the process, I recently found this approach, which uses cURL multi to fetch all the share counts at the same time instead of processing one request at a time.
However, the cURL call I had been using for Google Plus needs a lot more configuration to make it work. Is it possible to get this configuration into the function so that, as it loops through the requests, it adds all of this information to the Google Plus request when it sees it, while still running that request simultaneously with the others?
Here's the cURL multi function that I'm using:
function sw_fetch_shares_via_curl_multi($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_HEADER, 0);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);

        // post?
        if (is_array($d)) {
            if (!empty($d['post'])) {
                curl_setopt($curly[$id], CURLOPT_POST, 1);
                curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d['post']);
            }
        }

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
The array that I feed into it is basically something like this:
$request_url['pinterest'] = 'https://api.pinterest.com/v1/urls/count.json?url='.$url;
$request_url['twitter'] = 'https://urls.api.twitter.com/1/urls/count.json?url=' . $url;
And so on and so forth for the other networks. I pass those into the cURL multi function, and they send me some json that I can parse and work with.
Here's the configuration for Google Plus that I would like to integrate into the same function so that I can easily pass it in as well:
function sw_fetch_googlePlus_shares($url) {
    $url = rawurlencode($url);

    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, "https://clients6.google.com/rpc");
    curl_setopt($curl, CURLOPT_POST, true);
    curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($curl, CURLOPT_POSTFIELDS, '[{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"'.rawurldecode($url).'","source":"widget","userId":"#viewer","groupId":"#self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}]');
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_HTTPHEADER, array('Content-type: application/json'));

    $curl_results = curl_exec($curl);
    curl_close($curl);

    $json = json_decode($curl_results, true);

    return isset($json[0]['result']['metadata']['globalCounts']['count']) ? intval($json[0]['result']['metadata']['globalCounts']['count']) : 0;
}
Can I set up that loop somehow so that, if the $id is 'googlePlus', it adds all of this to that particular request? Is it possible to check whether it's Google Plus right before the curl_setopt() lines and pass these other options in instead somehow? Thanks.
It turns out that this was a lot easier than I thought:
function sw_fetch_shares_via_curl_multi($data, $options = array()) {
    // array of curl handles
    $curly = array();
    // data to be returned
    $result = array();

    // multi handle
    $mh = curl_multi_init();

    // loop through $data and create curl handles,
    // then add them to the multi handle
    foreach ($data as $id => $d) {
        $curly[$id] = curl_init();

        if ($id == 'googlePlus'):
            curl_setopt($curly[$id], CURLOPT_URL, "https://clients6.google.com/rpc");
            curl_setopt($curly[$id], CURLOPT_POST, true);
            curl_setopt($curly[$id], CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($curly[$id], CURLOPT_POSTFIELDS, '[{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"'.rawurldecode($d).'","source":"widget","userId":"#viewer","groupId":"#self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}]');
            curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($curly[$id], CURLOPT_HTTPHEADER, array('Content-type: application/json'));
        else:
            $url = (is_array($d) && !empty($d['url'])) ? $d['url'] : $d;
            curl_setopt($curly[$id], CURLOPT_URL, $url);
            curl_setopt($curly[$id], CURLOPT_HEADER, 0);
            curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, 1);
        endif;

        // extra options?
        if (!empty($options)) {
            curl_setopt_array($curly[$id], $options);
        }

        curl_multi_add_handle($mh, $curly[$id]);
    }

    // execute the handles
    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get content and remove handles
    foreach ($curly as $id => $c) {
        $result[$id] = curl_multi_getcontent($c);
        curl_multi_remove_handle($mh, $c);
    }

    // all done
    curl_multi_close($mh);

    return $result;
}
You can now use this function to fetch all five major networks over simultaneous connections instead of one at a time.
For the array, all the other networks require a full request URL, but Google Plus has its request information built in, so it only requires the URL of the page that you want information for.
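For illustration, the input array might look something like this; the Pinterest and Twitter endpoints are the ones from the question above, while the $url value is only an example:

$url = 'https://example.com/some-post/'; // page whose share counts we want (example value)

$request_url = array(
    // these networks take a full endpoint URL, as in the question above
    'pinterest'  => 'https://api.pinterest.com/v1/urls/count.json?url=' . $url,
    'twitter'    => 'https://urls.api.twitter.com/1/urls/count.json?url=' . $url,
    // Google Plus only needs the page URL; the function builds the RPC body itself
    'googlePlus' => $url,
);

$shares = sw_fetch_shares_via_curl_multi($request_url);
print_r($shares);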
I am using cURL, and I am wondering how I would send POST/submit data from my page to those websites. The website has "host, time, port". My MySQL database has a list of URLs. I was thinking of curl_multi but I am not sure.
Please could someone post examples? It has to be a fast method.
Basically it fetches each URL and posts the data.
while ($resultSet = mysql_fetch_array($SQL)) {
    $ch = curl_init($resultSet['url'] . $fullcurl);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
}
The PHP cURL reference says that the CURLOPT_POST option, set to true, makes it a POST request. CURLOPT_POSTFIELDS sets the fields that you will send in foo=bar&spam=eggs format (which one can build from an array with http_build_query).
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'foo=bar&spam=eggs');
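For instance, a small illustration of building that string from an array with http_build_query (the field names here are arbitrary):

$fields = array('foo' => 'bar', 'spam' => 'eggs');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields)); // sends foo=bar&spam=eggs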
Here is an example of how to do it with curl_multi, although you should break it up so that only a certain number of URLs (e.g. 30) go out at once; a batching sketch follows the example. I added the follow-location option, which you usually want.
$mh = curl_multi_init();
$ch = array();
$i = 0;

while ($resultSet = mysql_fetch_array($SQL)) {
    $ch[$i] = curl_init($resultSet['url'] . $fullcurl);
    curl_setopt($ch[$i], CURLOPT_TIMEOUT, 2);
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch[$i], CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch[$i]);
    $i++;
}

$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);

$num = count($ch);
for ($i = 0; $i < $num; $i++) {
    curl_multi_remove_handle($mh, $ch[$i]);
}

curl_multi_close($mh);
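A rough sketch of the batching mentioned above, assuming the URLs have already been read out of MySQL into a plain $urls array (that variable and the batch size of 30 are illustrative):

// Send the requests in batches of 30 so only that many are in flight at once.
foreach (array_chunk($urls, 30) as $batch) {
    $mh = curl_multi_init();
    $handles = array();

    foreach ($batch as $url) {
        $ch = curl_init($url . $fullcurl);
        curl_setopt($ch, CURLOPT_TIMEOUT, 2);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // free this batch's handles before starting the next one
    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}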
Give this a shot:
while ($resultSet = mysql_fetch_assoc($SQL)) {
    $ch = curl_init($resultSet['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $fullcurl);
    $response = curl_exec($ch);
    curl_close($ch);
}