How to run multiple cURL requests from an API - PHP

I'm trying to run multiple cURL requests against the Google Analytics API and wanted to see if there is a more efficient way of running them than manually building each one out like below. I will eventually need to build out about 10-15 requests, so I'm looking for something more reusable.
<?php
$ch1 = curl_init();
$ch2 = curl_init();
$ch3 = curl_init();
curl_setopt($ch1, CURLOPT_URL, "https://www.googleapis.com/analytics/v3/data/parameters_go_here");
curl_setopt($ch2, CURLOPT_URL, "https://www.googleapis.com/analytics/v3/data/parameters_go_here");
curl_setopt($ch3, CURLOPT_URL, "https://www.googleapis.com/analytics/v3/data/parameters_go_here");
curl_exec($ch1);
curl_exec($ch2);
curl_exec($ch3);
?>

You can use multi cURL:
$urls = array($url1, $url2, $url3);
$curl_arr = array();
$results = array();
$inits = curl_multi_init();
for ($i = 0; $i < count($urls); $i++) {
    $url = $urls[$i];
    $curl_arr[$i] = curl_init($url);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($inits, $curl_arr[$i]);
}
do {
    curl_multi_exec($inits, $running);
} while ($running > 0);
for ($i = 0; $i < count($urls); $i++) {
    $results[] = curl_multi_getcontent($curl_arr[$i]);
}
print_r($results);
Or create a function:
function doCurl($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
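For example, each report could then be fetched with a single call (the URL below is just the placeholder from the question):
$report = doCurl("https://www.googleapis.com/analytics/v3/data/parameters_go_here");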

Considering you're passing different parameters each time, you do need to make separate calls to the API. Having said that, you may benefit from wrapping the call in a function and passing the parameter in as a variable:
function getData($param = "") {
    $core_url = "https://www.googleapis.com/analytics/v3/data/";
    $target_url = $core_url . $param;
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_URL, $target_url);
    $result = curl_exec($ch);
    curl_close($ch);
    return json_decode($result);
}
This will 'prettify' your code, allowing you to make each call with a single short argument. The following shows how you can access properties of the returned information:
getData('stats')->item; /* Calls https://www.googleapis.com/analytics/v3/data/stats */
getData('info')->item; /* Calls https://www.googleapis.com/analytics/v3/data/info */
Note that this still results in the same amount of data being requested from the API, but it provides a much cleaner way to call the API each time you need to.
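If you later want those 10-15 requests to run concurrently rather than one after another, you could combine this wrapper idea with curl_multi. A rough sketch under the same assumptions as above (base URL, JSON responses); the function name getDataMulti and the parameter strings are just placeholders:
function getDataMulti(array $params) {
    $core_url = "https://www.googleapis.com/analytics/v3/data/";
    $mh = curl_multi_init();
    $handles = array();
    foreach ($params as $i => $param) {
        $ch = curl_init($core_url . $param);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        // wait for activity instead of spinning the CPU
        if (curl_multi_select($mh) === -1) {
            usleep(1000);
        }
    } while ($running > 0);
    $results = array();
    foreach ($handles as $i => $ch) {
        $results[$i] = json_decode(curl_multi_getcontent($ch));
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
// e.g. $reports = getDataMulti(array('stats', 'info'));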
Hope this helps! :)

Related

PHP cURL request for the same URL 10K times

I have an Excel file of approx. 1.29 MB with 11,500 rows of data.
In my PHP code I first read the whole file into arrays via the PHPExcel/IOFactory library.
Then, in a foreach loop, I build the $postvars for cURL:
gonderi_referans=REF&alici=Sarah&alici_telefon=55544448885&alici_adres=Krg&alici_ulke=DE
The foreach loop builds $postvars and makes a cURL request 11,500 times:
$url = "https://$_SERVER[HTTP_HOST]/api572.php";
$ch = curl_init();
curl_setopt($ch,CURLOPT_URL,$url);
curl_setopt($ch,CURLOPT_POST, 1);
curl_setopt($ch,CURLOPT_POSTFIELDS,$postvars);
curl_setopt($ch,CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch,CURLOPT_CONNECTTIMEOUT ,30);
curl_setopt($ch,CURLOPT_TIMEOUT, 20);
$response =json_decode(curl_exec($ch), true);
This operation takes approx. 5 minutes and I want to do it faster. How can I?
Reading the Excel file takes about 10 seconds; the cURL requests are what take most of the time.
Try some batch processing to minimize the number of requests sent. A totally basic proof of concept:
excel-reader.php:
<?php
// Dummy data START
$dummyExcelRows = [];
for ($i = 0; $i < 11500; $i++) {
    $dummyExcelRows[] = "gonderi_referans=REF&alici=Sarah&alici_telefon=55544448885&alici_adres=Krg&alici_ulke=DE&id=$i";
}
// Dummy data END
// Serialize multiple rows into one payload START
$data = [];
foreach ($dummyExcelRows as $postvars) {
    $data[] = $postvars;
}
$dataToSend = ['payload' => json_encode($data)];
// Serialize multiple rows into one payload END
$url = "http://$_SERVER[HTTP_HOST]/test/api572.php";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $dataToSend);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_TIMEOUT, 20);
// $response = json_decode(curl_exec($ch), true);
$response = curl_exec($ch);
var_dump($response);
api572.php:
<?php
$decoded = json_decode($_POST['payload']);
foreach ($decoded as $row) {
    echo "Do something with $row <br/>";
    echo '<pre>';
    parse_str($row, $fakePost);
    print_r($fakePost);
    echo '</pre>';
}

PHP curl multi handle doesn't work for some websites, but they do work when not multi handled

I have this piece of simple code. It works, except for Facebook in the multi handle: I get no response from it, even though I do get a response from Facebook when it is not multi handled. What could possibly be wrong? I use XAMPP and no cURL error appears. Another thing I have to mention is that this is the only way of executing a multi handle that works for me; the other approaches either looped infinitely or gave no response.
<?php
$url = 'http://m.facebook.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
$page = curl_exec($ch);
echo $page;
$ch1 = curl_init();
$ch2 = curl_init();
$ch3 = curl_init();
curl_setopt($ch1, CURLOPT_URL, "https://jbzdy.pl/");
curl_setopt($ch2, CURLOPT_URL, "http://m.facebook.com/");
curl_setopt($ch3, CURLOPT_URL, "https://www.reddit.com");
$mh = curl_multi_init();
curl_multi_add_handle($mh,$ch1);
curl_multi_add_handle($mh,$ch2);
curl_multi_add_handle($mh,$ch3);
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);
$page1 = curl_multi_getcontent($ch1);
$page2 = curl_multi_getcontent($ch2);
$page3 = curl_multi_getcontent($ch3);
echo $page1;
echo $page2;
echo $page3;
echo curl_error($ch2);
?>
I think the problem is the 301 redirect; try adding:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
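Note that in the multi-handle version above there is no single $ch, so the option would need to be set on each handle before it is added, for example:
curl_setopt($ch1, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch2, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch3, CURLOPT_FOLLOWLOCATION, true);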

How to combine multiple cURL requests into one?

This is the code I am currently using
function curl_get_contents($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_URL, $url);
    $data = curl_exec($ch);
    return $data;
}
function meta_scrap($filename, $other, $programming) {
    $link = 'https://graph.facebook.com/?id=' . $filename . '&scrape=true&method=post';
    $output = curl_get_contents($link);
    $output = json_decode($output);
    $ogtitle = $output->title;
}
I call meta_scrap($filename); eight times on a single webpage, which makes the page load really slow. Is there something I can do about it? I read about curl_multi_init() and tried to use it like this:
function curl_get_contents($pages) {
    $ch = curl_init();
    $ch = array();
    $mh = curl_multi_init();
    for ($i = 0; $i < count($pages); $i++) {
        $page = $pages[$i];
        $ch[$i] = curl_init();
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_URL, $page);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
        curl_multi_add_handle($mh, $ch[$i]);
    }
    $running = 0;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);
    $data = curl_exec($ch);
    $results = reset(json_decode(curl_multi_getcontent($ch[$i]), true));
    $resultCount = count($results);
    curl_close($ch);
    return $data;
}
With this I get no output? Could anyone help me modify my code so that it gives correct output?
You seem to have started working with multiple requests then cut and pasted a single curl request handler here:
$data = curl_exec($ch);
$results = reset(json_decode(curl_multi_getcontent($ch[$i]), true));
$resultCount = count($results);
...but I can't imagine where you would have found a piece of code that uses reset() like this.
You try to put the response into $results, yet you then throw this array away and return something completely different.
Try this....
...
$running = count($pages);
do {
    curl_multi_exec($mh, $running);
    usleep(5000);
} while ($running > 0);
$responses = array();
for ($i = 0; $i < count($pages); $i++) {
    $responses[$i] = json_decode(curl_multi_getcontent($ch[$i]), true);
}
return $responses;
If you will be reusing the function, you should also remove and close each cURL handle and then close the multi-handle before returning.
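A sketch of that cleanup, assuming the $ch array and $mh multi-handle from the code above:
// release each easy handle, then the multi handle
foreach ($ch as $handle) {
    curl_multi_remove_handle($mh, $handle);
    curl_close($handle);
}
curl_multi_close($mh);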
See also my recent blog post about curl_multi_exec().

php CURL not working with dynamic URLs

I can't work out why this URL is not being found by cURL. The cURL request is simply taken to a 400 error page.
My code is very simple and works fine with non-dynamic URLs.
I am hoping it's something easy to spot, for example a missing cURL option.
I have tried using $url = urlencode($url), but that didn't work either.
Here's the code:
$url = 'http://www.destinations-uk.com/accommodations.php?link=accommodations&country=england&category=Reviews&id=1';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_TIMEOUT, 15);
$r = curl_exec($ch);
$r = explode("\n", $r);
$keys = array();
if(!empty($r)) $keys[] = array_shift($r);
foreach ($r as $line) {
    preg_match('/.+:\s/', $line, $match);
    if ($match) $keys[substr($match[0], 0, -2)] = preg_replace('/.+:\s/', '', $line);
}
print_r($keys);
Perhaps this is something done on the server side to prevent automated requests.
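If that is the case, one thing worth trying is making the request look more like a browser, for example by sending a User-Agent header (this is only a guess; the string below is arbitrary):
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)');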

PHP - How to send requests to websites?

I am using cURL, and I am wondering how I would send/post the data on my page to those websites. Each website has a "host, time, port". My MySQL database has a list of URLs. I was thinking of curl_multi, but I am not sure.
Could someone please post examples? It has to be a fast method.
Basically it fetches each URL and posts to it:
while ($resultSet = mysql_fetch_array($SQL)) {
    $ch = curl_init($resultSet['url'] . $fullcurl);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
}
The PHP cURL reference says that the CURLOPT_POST option, set to true, makes it a POST request. CURLOPT_POSTFIELDS sets the fields that you will send in foo=bar&spam=eggs format (which one can build from an array with http_build_query).
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'foo=bar&spam=eggs');
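The same fields could also be built from an array with http_build_query, which keeps the URL encoding correct:
$fields = array('foo' => 'bar', 'spam' => 'eggs');
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));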
Here is an example of how to do it with curl_multi. You should break it up so that only a certain number of URLs go out at once (e.g. 30). I added the follow-location directive, which you usually want.
$mh = curl_multi_init();
$ch = array();
$i = 0;
while ($resultSet = mysql_fetch_array($SQL)) {
    $ch[$i] = curl_init($resultSet['url'] . $fullcurl);
    curl_setopt($ch[$i], CURLOPT_TIMEOUT, 2);
    curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch[$i], CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch[$i]);
    $i++;
}
$running = null;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);
$num = count($ch);
for ($i = 0; $i < $num; $i++) {
    curl_multi_remove_handle($mh, $ch[$i]);
}
curl_multi_close($mh);
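To keep only a limited number of URLs in flight at once, as suggested above, you could first collect the URLs and then process them in chunks. A rough sketch (the chunk size of 30 and the $urls array are assumptions, reusing $SQL and $fullcurl from the question):
$urls = array();
while ($resultSet = mysql_fetch_array($SQL)) {
    $urls[] = $resultSet['url'] . $fullcurl;
}
foreach (array_chunk($urls, 30) as $batch) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 2);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);
    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}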
Give this a shot:
while ($resultSet = mysql_fetch_assoc($SQL)) {
    $ch = curl_init($resultSet['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $fullcurl);
    $response = curl_exec($ch);
    curl_close($ch);
}
