Hello guys, I am checking whether each URL points to an image by using multi curl. But here's the issue: what if the $testArray array has, say, 2000 links? I do not want to make 2000 curl requests at a time; I would like to run 50 requests at a time. How can I accomplish this? Please let me know if anything in the code is unclear. Thanks a lot.
function checkImageIfExist($imageLink) {
    $imageLinkArray = array();
    $curl_arr = array();
    $mh = curl_multi_init();
    foreach ($imageLink as $key => $value) {
        $curl_arr[$key] = curl_init();
        curl_setopt($curl_arr[$key], CURLOPT_URL, $value);
        curl_setopt($curl_arr[$key], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $curl_arr[$key]);
        do {
            curl_multi_exec($mh, $running);
        } while ($running > 0);
        $httpcode = curl_getinfo($curl_arr[$key], CURLINFO_HTTP_CODE);
        if ($httpcode == 200) {
            $imageLinkArray[] = $value;
        }
    }
    print_r($imageLinkArray);
    curl_multi_close($mh);
}
This is how I call the function.
checkImageIfExist($testArray);
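A sketch of one way to batch the requests (assuming the same $testArray of URLs): split the list with array_chunk() and run one multi handle per batch of 50. The CURLOPT_NOBODY option (a HEAD request) is my addition, since the status code alone is enough to check existence; returning the array instead of print_r-ing it is also a change from the original.

```php
<?php
// Sketch: check URLs 50 at a time instead of all at once.
function checkImageIfExist(array $imageLinks, int $batchSize = 50): array
{
    $found = [];
    foreach (array_chunk($imageLinks, $batchSize) as $batch) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_NOBODY, true); // HEAD request: status code is enough
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // Drive this batch to completion before starting the next one.
        do {
            curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh); // block briefly instead of busy-waiting
            }
        } while ($running > 0);
        foreach ($handles as $url => $ch) {
            if (curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200) {
                $found[] = $url;
            }
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $found;
}
```

Each batch finishes completely before the next one starts, so at most 50 connections are open at any moment.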
I am trying to get the responses from an array of webpages in parallel, with the help of curl_multi. Here is what I have tried:
$urls = ["wrongUrl1", "http://Wrongurl2.com"];
$mh = curl_multi_init();
foreach ($urls as $key => $url) {
    $chs[$key] = curl_init($url);
    curl_setopt($chs[$key], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($chs[$key], CURLOPT_FAILONERROR, true);
    curl_multi_add_handle($mh, $chs[$key]);
}
// running the requests
$running = null;
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        // wait a short time for more activity
        curl_multi_select($mh);
    }
} while ($running);
// getting the responses
foreach (array_keys($chs) as $key) {
    $error = curl_error($chs[$key]);
    $header = curl_getinfo($chs[$key], CURLINFO_HTTP_CODE);
    $time = curl_getinfo($chs[$key], CURLINFO_TOTAL_TIME);
    $response = curl_multi_getcontent($chs[$key]); // get results
    if (!empty($error)) {
        echo "The request $key returned an error: $error" . "\n";
    } else {
        echo "The request to $urls[$key] returned in $time seconds as $header" . "<br>";
    }
    curl_multi_remove_handle($mh, $chs[$key]);
}
// close the multi handle
curl_multi_close($mh);
$error always remains an empty string. Why?
Secondly, I am able to get individual times for all the URLs. I wonder: is the total time taken by curl_multi_exec to fetch all the URLs the same as the largest value of $time?
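One likely explanation, sketched below: with curl_multi, per-transfer failures are delivered as messages on the multi handle via curl_multi_info_read(), and curl_error() on the easy handles is not reliably populated. A helper like this (the function name is mine; $mh is the multi handle from the question) can be called after the exec loop, before the handles are removed:

```php
<?php
// Sketch: read per-transfer results from the multi handle's message queue.
function drainMultiResults($mh): void
{
    while ($info = curl_multi_info_read($mh)) {
        $ch = $info['handle'];
        if ($info['result'] !== CURLE_OK) {
            // curl_strerror() turns the CURLE_* code into readable text.
            echo 'Request failed: ' . curl_strerror($info['result']) . "\n";
        } else {
            echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE)
                . ' in ' . curl_getinfo($ch, CURLINFO_TOTAL_TIME) . " seconds\n";
        }
    }
}
```

On the timing question: because the transfers overlap, the wall-clock time of the whole run is roughly the largest individual $time plus a little scheduling overhead, not the sum of all of them.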
I'm using cURL to get information out of an API and write it into a MySQL table.
The URL of the API looks like this:
https://eu.api.blizzard.com/data/wow/item-class/4/item-subclass/1?namespace=static-eu&locale=de_DE&access_token=US6XqgbtQ6rh3EVIqPsejuF62RwP8ljzWn
You can change the number in the URL part 1?namespace to another value, for example 4?namespace, to get other information from the API.
I'm using PHP's range to generate the numbers for the URL.
Problem:
Some numbers in the URL lead to a 404 response, since there is no information in the API. Example URL:
https://eu.api.blizzard.com/data/wow/item-class/4/item-subclass/18?namespace=static-eu&locale=de_DE&access_token=US6XqgbtQ6rh3EVIqPsejuF62RwP8ljzWn
These "404" pages should be ignored and nothing should be written to MySQL. How is this possible with cURL?
Complete code:
$ids = [];
foreach (range(0, 20) as $number) {
    $ids[] = $number;
}
$userAgent = 'Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0';
$mh = curl_multi_init();
$channels = [];
foreach ($ids as $id) {
    $fetchURL = 'https://eu.api.blizzard.com/data/wow/item-class/4/item-subclass/' . $id . '?namespace=static-eu&locale=de_DE&access_token=US6XqgbtQ6rh3EVIqPsejuF62RwP8ljzWn';
    $channels[$id] = curl_init($fetchURL);
    curl_setopt($channels[$id], CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($channels[$id], CURLOPT_SSL_VERIFYHOST, 0);
    curl_setopt($channels[$id], CURLOPT_SSL_VERIFYPEER, 0);
    curl_multi_add_handle($mh, $channels[$id]);
}
// execute all queries simultaneously, and continue when all are complete
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
// remove the handles from the multi handle
foreach ($ids as $id) {
    curl_multi_remove_handle($mh, $channels[$id]);
}
curl_multi_close($mh);
$response = [];
foreach ($ids as $id) {
    $res = curl_multi_getcontent($channels[$id]);
    $response[$id] = ($res === false) ? null : json_decode($res, true);
}
echo ("<pre>");
foreach ($response as $item) {
    $sqle = "REPLACE INTO `itemsubclasses`
        (`class_id`, `subclass`, `name`)
        VALUES
        ('{$item['class_id']}', '{$item['subclass_id']}', '{$item['display_name']}')";
    if ($conn->query($sqle) === TRUE) {
        echo "Geklappt"; // German for "it worked"
    } else {
        echo "Problem";
    }
}
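A sketch of one way to skip the 404s: after the multi loop finishes, check each handle's status code with curl_getinfo() and only insert the 200 responses. The helper below also swaps the interpolated SQL for a prepared statement, since API data should not be concatenated into a query. $conn, $ids and $channels follow the question's names; the function name is mine, and $conn is assumed to be a mysqli connection.

```php
<?php
// Sketch: insert only successful (HTTP 200, valid JSON) API responses.
function storeItemSubclasses(mysqli $conn, array $ids, array $channels): void
{
    $stmt = $conn->prepare(
        'REPLACE INTO `itemsubclasses` (`class_id`, `subclass`, `name`) VALUES (?, ?, ?)'
    );
    foreach ($ids as $id) {
        // A 404 (or any non-200) means the API has no data for this id: skip it.
        if (curl_getinfo($channels[$id], CURLINFO_HTTP_CODE) !== 200) {
            continue;
        }
        $item = json_decode(curl_multi_getcontent($channels[$id]), true);
        if (!is_array($item)) {
            continue; // not valid JSON
        }
        $stmt->bind_param('iis', $item['class_id'], $item['subclass_id'], $item['display_name']);
        $stmt->execute();
    }
}
```

curl_getinfo() still works after curl_multi_remove_handle(), as long as the easy handle itself has not been closed, so this can run after the cleanup loop in the question.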
I am working with an API that returns a single currency record per request. One request takes 0.5-1 seconds to respond, so 15 requests take 7-15 seconds.
As far as I know, the server can manage hundreds of requests per second.
I want to hit the server with all 15 requests at once so it responds in 1-2 seconds instead of 15, and return all the data in one array to save loading time.
Check my code:
I am using a loop, and the loop waits until the previous curl request completes. How can I tell the loop to keep going and not wait for the response?
$time_Start = microtime(true);
$ids = array(1, 2, 11, 15, 20, 21); // 6 ids in demo, 15+ ids in real
$response = array();
foreach ($ids as $key => $id) {
    $response[$id] = get_data($id);
}
echo "Time: " . (microtime(true) - $time_Start) . " sec";
// output: 5 seconds for 6 requests

function get_data($id) {
    $fcs_api_key = "API_KEY";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "https://fcsapi.com/api/forex/indicators?id=" . $id . "&period=1d&access_key=" . $fcs_api_key);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $buffer = curl_exec($ch);
    curl_close($ch);
    return $buffer;
}
You can use PHP's multi cURL: https://www.php.net/manual/en/function.curl-multi-init.php
Below is code that runs the requests in parallel.
$time_Start = microtime(true);
$ids = array(1,2,3,4,5,6); // Your forex currency ids.
$response = php_curl_multi($ids);
echo "Time: ". (microtime(true)-$time_Start)."sec";
// Time: 0.7 sec
Function
function php_curl_multi($ids) {
    $parameters = "/api/forex/indicators?period=1d&access_key=API_KEY&id="; // id is appended dynamically
    $url = "https://fcsapi.com" . $parameters;
    $ch_index = array(); // stores every curl handle
    $response = array();
    // create a cURL resource for each id
    foreach ($ids as $key => $id) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url . $id);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        $ch_index[] = $ch;
    }
    // create the multi cURL handle
    $mh = curl_multi_init();
    // add the handles
    foreach ($ch_index as $key => $ch) {
        curl_multi_add_handle($mh, $ch);
    }
    // execute the multi handle
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status == CURLM_OK);
    // collect all responses
    foreach ($ch_index as $key => $ch) {
        $response[] = curl_multi_getcontent($ch);
    }
    // remove and close the handles
    foreach ($ch_index as $key => $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $response;
}
I'm trying to download multiple PDFs with PHP. I get an array of URLs, and each URL redirects to a website that contains a PDF file; if something is wrong with a URL, it just redirects to an HTML page. After some googling I found this code to download all the PDFs to the server:
public function download($data, $save_to, $simultaneous = 1)
{
    $loops = array_chunk($data, $simultaneous, true);
    foreach ($loops as $key => $value) {
        foreach ($value as $urlkey => $urlvalue) {
            $ch[$urlkey] = curl_init($urlvalue["url"]);
            curl_setopt($ch[$urlkey], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch[$urlkey], CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch[$urlkey], CURLOPT_SSL_VERIFYHOST, false);
        }
        $mh = curl_multi_init();
        foreach ($value as $urlkey => $urlvalue) {
            curl_multi_add_handle($mh, $ch[$urlkey]);
        }
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh); // avoid busy-waiting
            }
        } while ($running);
        foreach ($value as $urlkey => $urlvalue) {
            $response = curl_multi_getcontent($ch[$urlkey]);
            file_put_contents($save_to . $urlvalue["saveas"], $response);
            curl_multi_remove_handle($mh, $ch[$urlkey]);
            curl_close($ch[$urlkey]);
        }
        curl_multi_close($mh);
    }
}
For some reason this downloads only some of the files.
Does anyone have any idea why this is not working?
Any help would be appreciated.
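One plausible cause, given that bad URLs redirect to an HTML page: some of the saved files are HTML error pages written under a .pdf name, not missing downloads. A sketch of a defensive version of the final foreach, which checks the status code and content type before writing to disk ($mh, $ch, $value and $save_to follow the question's names; the function name is mine):

```php
<?php
// Sketch: only save responses that really are PDFs.
function savePdfResponses($mh, array $ch, array $value, string $save_to): void
{
    foreach ($value as $urlkey => $urlvalue) {
        $code = curl_getinfo($ch[$urlkey], CURLINFO_HTTP_CODE);
        $type = (string) curl_getinfo($ch[$urlkey], CURLINFO_CONTENT_TYPE);
        // Skip redirects to HTML error pages and any non-200 response.
        if ($code === 200 && stripos($type, 'application/pdf') !== false) {
            file_put_contents($save_to . $urlvalue['saveas'], curl_multi_getcontent($ch[$urlkey]));
        }
        curl_multi_remove_handle($mh, $ch[$urlkey]);
        curl_close($ch[$urlkey]);
    }
}
```

Logging the skipped URLs (with their status code and content type) would show exactly which downloads are failing and why.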
I'm writing a mass YouTube link finder, which imports a list of titles from an array, generates an API URL for each, and then executes them with curl_multi.
However, curl returns blank data for each link. The links are fine, as I can access them correctly via Chrome.
file_get_contents(), tried in another script with one of those URLs, returns an ERR_EMPTY_RESPONSE in Chrome.
Any help would be much appreciated.
EDIT: Code:
function getYTUrl($urls) {
    $curls = array();
    $result = array();
    $arrjson = array();
    $mh = curl_multi_init();
    foreach ($urls as $key => $value) {
        echo $value;
        $curls[$key] = curl_init();
        curl_setopt($curls[$key], CURLOPT_URL, $value);
        curl_setopt($curls[$key], CURLOPT_HEADER, 0);
        curl_setopt($curls[$key], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curls[$key], CURLOPT_SSL_VERIFYPEER, false);
        curl_multi_add_handle($mh, $curls[$key]);
    }
    $active = null;
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($active);
    foreach ($urls as $key => $value) {
        $result[$key] = curl_multi_getcontent($curls[$value]);
        curl_multi_remove_handle($mh, $value);
    }
    curl_multi_close($mh);
}
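The blank data most likely comes from the retrieval loop rather than from curl itself: curl_multi_getcontent($curls[$value]) indexes the handle array by the URL string instead of by $key, and curl_multi_remove_handle() is passed the URL rather than the handle. A sketch of a corrected retrieval step (the function name is mine; $mh and $curls match the question, and getYTUrl() should also return what it collects):

```php
<?php
// Sketch: index handles by key, pass the handle (not the URL) to
// curl_multi_remove_handle(), and return the collected results.
function collectMultiResults($mh, array $curls): array
{
    $result = array();
    foreach ($curls as $key => $ch) {
        $result[$key] = curl_multi_getcontent($ch); // was $curls[$value]
        curl_multi_remove_handle($mh, $ch);         // was passed $value
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $result;
}
```

With the handles indexed correctly, each entry of the returned array holds the raw API response for the matching URL.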