I can get curl_init() working, but not curl_multi_init() - php

This is my first time using curl_multi_init(), so I'm probably misunderstanding something. Learning to use it properly matters more to me than solving this particular problem, because this function will solve a lot of my problems in future.
This particular call is for uploading Etsy photos. Etsy's documentation for this call is here.
It works fine in Postman. The code snippet Postman generates for "PHP - cURL" works fine, and it keeps working fine even after my edits to it.
The trouble is, I've got well over a thousand high-resolution images to upload, so running the entire snippet from start to finish and then looping it a thousand times will time out no matter how generous my php.ini settings are.
So, line by line, I merged the existing code with a synchronous snippet, and I must have done something wrong. This example is almost exactly the live code; I've just deleted/simplified irrelevant things and redacted personal information. (Hopefully I didn't delete/simplify the bug.)
<?php
include_once 'databaseStuff.php';
include_once 'EtsyTokenStuff.php';

$result = mysqli_query($conn, "SELECT product, listing_id, alt_text, dataStuff;");

$multiCurl = [];
$multiResult = [];
$multiHandle = curl_multi_init();

if (mysqli_num_rows($result) > 0) {
    while ($row = mysqli_fetch_assoc($result)) {
        for ($image = 1; $image <= 2; $image++) {
            $multiCurl[$row['product'] . "_" . $image] = curl_init();
            curl_setopt_array($multiCurl[$row['product'] . "_" . $image],
                array(
                    CURLOPT_URL => "https://openapi.etsy.com/v3/application/shops/$myShopNumber/listings/" . $row['listing_id'] . "/images",
                    CURLOPT_RETURNTRANSFER => true,
                    CURLOPT_ENCODING => '',
                    CURLOPT_MAXREDIRS => 10,
                    CURLOPT_TIMEOUT => 0,
                    CURLOPT_FOLLOWLOCATION => true,
                    CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
                    CURLOPT_CUSTOMREQUEST => 'POST',
                    CURLOPT_POSTFIELDS => array(
                        "image" => new CURLFILE(
                            [
                                1 => "img/imagePathStuff/" . $row['product'] . ".jpg",
                                2 => "img/differentImagePathStuff/" . $row['product'] . ".jpg"
                            ][$image]
                        ),
                        // "listing_image_id" =>,
                        "rank" => $image,
                        "overwrite" => true,
                        // "is_watermarked" =>,
                        "alt_text" => $row['alt_text']
                    ),
                    CURLOPT_HTTPHEADER => array(
                        "x-api-key: $myAPIKey",
                        "authorization: Bearer {$etsyAccessToken}"
                    ),
                )
            );
            curl_multi_add_handle($multiHandle, $multiCurl[$row['product'] . "_" . $image]);
        }
    }

    $index = null;
    do {
        curl_multi_exec($multiHandle, $index);
    } while ($index > 0);

    foreach ($multiCurl as $k => $curlHandle) {
        $multiResult[$k] = curl_multi_getcontent($curlHandle);
        curl_multi_remove_handle($multiHandle, $curlHandle);
    }
    curl_multi_close($multiHandle);
}
Once it starts working I'll probably break it out into functions, but I prefer to edit broken code in this format and add the function calls later.
Having never worked with these functions before, I'm not sure how they're supposed to behave, but here is the behaviour I've noticed:
All of the code executes, start to finish, with no fatal errors.
The do-while loop executes once, then loops once more. (Maybe it's supposed to, or maybe it's supposed to loop once per photo. I couldn't get that clarified anywhere.)
It's supposed to update photos. Unfortunately the first test was on very minor edits, but trying again with a deliberately wrong photo included, I at least know that that particular photo didn't update, so probably none of them did.
curl_multi_getcontent($curlHandle) always returns an empty string.
curl_multi_exec($multiHandle, $index) always returns 0. (My previous claim that it returned 1002 was incorrect; 1002 was actually the value of the second argument, $index, after running the function.)
This particular call normally returns a very detailed response for 201 and at least returns the error for 400, 401, 403, 404, 409, and 500, but I don't think my code is even getting far enough to make the call. I haven't figured out how to get the response codes at all.
For a script that transfers well over one thousand high-resolution images from my server to Etsy's server, it certainly executes very fast.
The $multiHandle seems to be built as intended. At the very least, a var_dump($multiHandle) reveals all the correct file names in there.
I've lost an entire work day to this issue, but it wouldn't surprise me if it's a minor typo causing it. What is it?

This is just a bit of a guess, but if your problem is that you are timing out, then the following loop you coded may be the problem:
$index = null;
do {
    curl_multi_exec($multiHandle, $index);
} while ($index > 0);
You are making repeated calls to curl_multi_exec, which burns CPU the whole time you are waiting for your uploads to complete. Instead, you should only check the status of your uploads periodically and go into a wait state in between. This should reduce your total CPU time:
while (TRUE) {
    $status = curl_multi_exec($multiHandle, $activeCount);
    if ($status == CURLM_OK && $activeCount) {
        // Wait up to one second for activity before checking again:
        curl_multi_select($multiHandle, 1.0);
    } else {
        break;
    }
}
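Since you also mention not knowing how to get at the response codes: once the loop above finishes, each easy handle still carries its transfer state, so the HTTP status can be read with curl_getinfo() before the handle is removed. A minimal sketch, reusing the handle names from your question:
// After the multi loop has finished, collect the body, HTTP status and any
// transport-level error for every handle before cleaning up.
foreach ($multiCurl as $k => $curlHandle) {
    $multiResult[$k] = array(
        'body'   => curl_multi_getcontent($curlHandle),            // response body (needs CURLOPT_RETURNTRANSFER)
        'status' => curl_getinfo($curlHandle, CURLINFO_HTTP_CODE), // e.g. 201 on success, 4xx/5xx on failure
        'error'  => curl_error($curlHandle),                       // non-empty if the transfer itself failed
    );
    curl_multi_remove_handle($multiHandle, $curlHandle);
    curl_close($curlHandle);
}
curl_multi_close($multiHandle);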

Related

Guzzle / Laravel cURL error 6: Could not resolve host: api.coingecko.com [duplicate]

OK, so I am a little stuck with this issue. I have a foreach loop (usually 50 results) that queries an API using Guzzle via Laravel's Http facade, and I am getting really inconsistent results.
I monitor the inserts in the database as they come in; sometimes the process seems slow, and other times it fails with the following error after x number of returned results.
cURL error 6: Could not resolve host: api.coingecko.com
The following is the actual code I'm using to fetch the results.
foreach ($json_result as $account) {
    var_dump($account['name']);

    $name = $account['name'];
    $coingecko_id = $account['id'];
    $identifier = strtoupper($account['symbol']);

    $response_2 = Http::get('https://api.coingecko.com/api/v3/coins/'.urlencode($coingecko_id).'?localization=false');

    if ($response_2->successful()) {
        $json_result_extra_details = $response_2->json();

        if (isset($json_result_extra_details['description']['en'])) {
            $description = $json_result_extra_details['description']['en'];
        }
        if (isset($json_result_extra_details['links']['twitter_screen_name'])) {
            $twitter_screen_name = $json_result_extra_details['links']['twitter_screen_name'];
        }
    } else {
        // Throw an exception if a client or server error occurred...
        $response_2->throw();
    }

    $crypto_account = CryptoAccount::updateOrCreate(
        [
            'identifier' => $identifier
        ],
        [
            'name' => $name,
            'identifier' => $identifier,
            'type' => "cryptocurrency",
            'coingecko_id' => $coingecko_id,
            'description' => $description,
        ]
    );

    //sleep(1);
}
Now, I know I am within the API rate limit of 100 calls a minute, so I don't think that is the issue. I am wondering whether this is a server/API issue that I don't really have any control over, or whether it is related to my code and how Guzzle is implemented.
When I do single queries I don't seem to have a problem; the issue only appears inside the foreach loop.
Any advice would be great. Thanks.
EDIT
OK, to update the question: I am now wondering if this is Guzzle/Laravel related. I changed the API to point to the Twitter API instead and I am getting the same error after 80 synchronous requests.
I think it's better to use asynchronous requests directly with Guzzle.
// Create one client and reuse it for every request.
$client = new \GuzzleHttp\Client();
$request = new \GuzzleHttp\Psr7\Request('GET', 'https://api.coingecko.com/api/v3/coins?localization=false');

$promises = [];
for ($i = 0; $i < 50; $i++) {
    // Queue the request without blocking.
    $promises[] = $client->sendAsync($request)
        ->then(function ($response) {
            echo 'I completed! ' . $response->getBody();
        });
}

// Wait for all queued requests to finish.
\GuzzleHttp\Promise\Utils::all($promises)->wait();
more information on Async requests: Doc
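If the requests are built dynamically (as in the foreach from the question), Guzzle's Pool can queue them and cap how many run at once. A minimal sketch, assuming a plain Guzzle client and a hypothetical $coingeckoIds array of ids:
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Build one request per coin id ($coingeckoIds is a stand-in for your own list).
$requests = function (array $ids) {
    foreach ($ids as $id) {
        yield new Request('GET', 'https://api.coingecko.com/api/v3/coins/' . urlencode($id) . '?localization=false');
    }
};

$pool = new Pool($client, $requests($coingeckoIds), [
    'concurrency' => 5, // never more than 5 requests in flight at once
    'fulfilled' => function ($response, $index) {
        // Handle the decoded body here.
        $data = json_decode((string) $response->getBody(), true);
    },
    'rejected' => function ($reason, $index) {
        // A request failed; $reason is an exception.
        error_log("Request $index failed: " . $reason->getMessage());
    },
]);

// Start the transfers and wait for completion.
$pool->promise()->wait();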
I had a similar problem to yours.
I was doing HTTP requests in a loop, and the first 80 requests were okay.
But the 81st started throwing this "Could not resolve host" exception.
It was very strange to me, because the domain resolves perfectly fine on my machine.
So I started digging into the code.
I ended up finding that Laravel's Http facade keeps creating a new client for every request.
And I guess this eventually triggers the DNS resolver's rate limit?
So I have the following workaround:
// Not working:
// this way Laravel keeps getting a new HTTP client from Guzzle on every call.
foreach ($rows as $row) {
    $response = Http::post();
}

// Workaround: create one Guzzle client and reuse it.
$client = new GuzzleHttp\Client();
foreach ($rows as $row) {
    $response = $client->post();
    // don't forget to use $response->getBody();
}
I believe it's because $client caches the DNS resolution result, so it reduces the calls to the DNS resolver and doesn't trigger the rate limit?
I'm not sure whether that's right, but it's working for me.
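Applied to the coingecko loop from the question, the same idea might look roughly like this (a sketch only; error handling and the updateOrCreate call are left out):
// Reuse one Guzzle client for every iteration instead of calling Http::get().
$client = new \GuzzleHttp\Client();

foreach ($json_result as $account) {
    $response = $client->get(
        'https://api.coingecko.com/api/v3/coins/' . urlencode($account['id']) . '?localization=false'
    );

    // getBody() returns a stream; cast it to string before decoding.
    $details = json_decode((string) $response->getBody(), true);

    $description = $details['description']['en'] ?? null;
    $twitter_screen_name = $details['links']['twitter_screen_name'] ?? null;

    // ... updateOrCreate as in the original loop ...
}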

code needs to loop over minimum 2000 times in php foreach

I have a foreach loop that will run a minimum of 2000 times:
foreach ($giftCardSchemeData as $keypreload => $preload) {
    for ($i = 0; $i < $preload['quantity']; $i++) {
        $cardid = new CarddetailsId($uuidGenerator->generate());
        $cardnumber = self::getCardNumber();
        $cardexistencetype = ($key == "giftCardSchemeData") ? "Physical" : "E-Card";

        $giftCardSchemeDataDb = array('preload' => array('value' => $preload['value'], 'expirymonths' => $preload['expiryMonths']));
        $otherdata = array(
            'cardnumber' => $cardnumber,
            'cardexistencetype' => $cardexistencetype,
            'isgiftcard' => true,
            'giftcardamount' => $preload['value'],
            'giftCardSchemeData' => json_encode($giftCardSchemeDataDb),
            'expirymonths' => $preload['expiryMonths'],
            'isloyaltycard' => false,
            'loyaltypoints' => null,
            'loyaltyCardSchemeData' => null,
            'loyaltyRedeemAmount' => null,
            'pinnumber' => mt_rand(100000, 999999)
        );
        $output = array_merge($data, $otherdata);
        // var_dump($output);

        $carddetailsRepository = $this->get('oloy.carddetails.repository');
        $carddetails = $carddetailsRepository->findByCardnumber($cardnumber);
        if (!$carddetails) {
            $commandBus->dispatch(
                new CreateCarddetails($cardid, $output)
            );
        } else {
            self::generateCardFunctionForErrorException($cardid, $output, $commandBus);
        }
    }
}
Like the foreach above, I have five of them in total. When I call the function, all five foreach loops run and only then is the response returned. It takes so long that PHP's maximum execution time is exceeded.
Is there any way to send the response first and then run the foreach loops on the server side, so they don't hit the maximum execution time? The foreach also needs optimizing.
Also, in Symfony I have tried the try/catch method for the existence check in the above code, and it returns an "Entity closed" error. I have temporarily used the existence check in the DB, but it needs optimizing.
There seems to be a lot wrong (or to be optimized) with this code, but let's focus on your questions:
First, I think this code shouldn't live in code that is triggered by a visitor.
You should separate this into two processes:
1. A cronjob that generates everything that must be generated and saves the generated info to a database. The cronjob can take as much time as it needs. Look at Symfony's console component (see the command sketch after this list).
2. A page that displays only the generated info by fetching it from the database and passing it to a Twig template.
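For the cronjob part, a minimal sketch of such a Symfony console command (the class and command names here are made up for illustration), which cron could then run via bin/console:
// src/Command/GenerateGiftCardsCommand.php (hypothetical)
namespace App\Command;

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class GenerateGiftCardsCommand extends Command
{
    protected static $defaultName = 'app:generate-gift-cards';

    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        // Move the card-generation foreach loops here; there is no web
        // request involved, so max_execution_time is no longer a concern.
        $output->writeln('Generating cards...');

        return Command::SUCCESS;
    }
}
A crontab entry such as */10 * * * * php bin/console app:generate-gift-cards would then run it every ten minutes.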
However, looking at the code you posted, I think it can be greatly optimized as is. You have a foreach loop that fetches variable data, and inside it a for loop that does not generate much variability at all.
So most of the code inside the for loop is currently executed over and over again without making any actual changes.
Here is a concept that would give much higher performance. Of course, since I don't know the actual context of your code, you will have to "fix it":
$carddetailsRepository = $this->get('oloy.carddetails.repository');
$cardexistencetype = ($key == "giftCardSchemeData") ? "Physical" : "E-Card";

foreach ($giftCardSchemeData as $keypreload => $preload) {
    $cardnumber = self::getCardNumber();
    $carddetails = $carddetailsRepository->findByCardnumber($cardnumber);

    $giftCardSchemeDataDb = array('preload' => array(
        'value' => $preload['value'],
        'expirymonths' => $preload['expiryMonths']
    ));
    $otherdata = array(
        'cardnumber' => $cardnumber,
        'cardexistencetype' => $cardexistencetype,
        'isgiftcard' => true,
        'giftcardamount' => $preload['value'],
        'giftCardSchemeData' => json_encode($giftCardSchemeDataDb),
        'expirymonths' => $preload['expiryMonths'],
        'isloyaltycard' => false,
        'loyaltypoints' => null,
        'loyaltyCardSchemeData' => null,
        'loyaltyRedeemAmount' => null,
        'pinnumber' => 0
    );
    $output = array_merge($data, $otherdata);

    for ($i = 0; $i < $preload['quantity']; $i++) {
        $cardid = new CarddetailsId($uuidGenerator->generate());
        $output['pinnumber'] = mt_rand(100000, 999999);

        if (!$carddetails) {
            $commandBus->dispatch(
                new CreateCarddetails($cardid, $output)
            );
        } else {
            self::generateCardFunctionForErrorException($cardid, $output, $commandBus);
        }
    }
}
Also: if this code triggers any database inserts or updates, you don't want to run them one at a time, each iteration. Start some kind of database transaction and flush the queries every X iterations instead; see the sketch below.
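A minimal sketch of that batching idea, assuming Doctrine's EntityManager is available as $em (the variable and $cardsToPersist are stand-in names, not from the original code):
$batchSize = 100;
$i = 0;

foreach ($cardsToPersist as $card) {
    $em->persist($card);

    // Flush and detach managed entities every $batchSize iterations
    // so memory stays flat and the database sees larger batches.
    if ((++$i % $batchSize) === 0) {
        $em->flush();
        $em->clear();
    }
}

// Flush whatever remains after the loop.
$em->flush();
$em->clear();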

Random Content Array seems stuck

I have a random content script that has worked perfectly but now seems to have a glitch.
It's the "Spotlight On:" story in the upper left-hand corner at http://fiction.deslea.com/index2.php, and the code is as follows:
$storyspotlights = array("bluevial", "biophilia", "real", "edgeofreality",
                         "limitsofperception", "markofcain", "spokenfor", "closer",
                         "feildelm", "purgatory", "elemental");
$randomstoryID = array_rand($storyspotlights);
$randomstory = $storyspotlights[$randomstoryID];

switch ($randomstory) {
    case ($randomstory == 'closer'):
        $storyspotlightheader = "<div class='storyspotlightheader'>Closer</div>";
        $storyspotlighttext = "snip";
    //some stories snipped
    case ($randomstory == 'bluevial'):
        $storyspotlightheader = "<div class='storyspotlightheader'>The Blue Vial</div>";
        $storyspotlighttext = "snip";
        break;
    //more stories snipped
}
print($storyspotlightheader);
print($storyspotlighttext);
My problem is: all the stories from Blue Vial to Spoken For appear when you refresh the page, in random order (although Blue Vial seems to stick a fair bit). These were the stories originally in the script.
Since then I have added the last four stories to the array and to the switch/case fragment that generates the content, but these last four never, ever appear in the randomiser. I've literally sat and refreshed for hours. I've confirmed over and over that the updated script is on the server, and even deleted and re-uploaded it.
I did try unset() and also $storyspotlights = array() at the beginning of the script at various stages of troubleshooting, but to no avail. I also tried moving the new stories to the start of the array; no change there either.
What am I missing?
It's surprising this works at all. That's not how you use switch..case.
switch (<value to compare>) {
    case <value to compare against>:
        ...
}
That means you should write this:
switch ($randomstory) {
    case 'closer':
        ...
}
With what you've written, it's actually executing like:
if ($randomstory == ($randomstory == 'closer')) ...
Also make sure you have not actually forgotten some break statements, which would make the code fall through to the next case and indeed make certain cases "stickier" than others.
Also, I'd simplify the whole thing to this:
$stories = array(
    array('header' => '...', 'text' => '...'),
    array('header' => '...', 'text' => '...'),
    ...
);
$story = $stories[array_rand($stories)];
echo $story['header'];
echo $story['text'];

Disappearing PHP Variables

I am creating a 3D Secure PHP project. I am having a rather bizarre issue in that the "MD" code goes missing when re-submitting the array of data.
My code is as follows:
$paRes = $_REQUEST['PaRes'];
$md = $_REQUEST['MD'];

require "payment_method_3d.php";

x_load('cart', 'crypt', 'order', 'payment', 'tests');

/*
 * For debugging purposes only.
 * echo "The Value Of PaRes is : ";
 * echo $paRes;
 */

$soapClient = new SoapClient("https://www.secpay.com/java-bin/services/SECCardService?wsdl");

$params = array(
    'mid'      => '',
    'vpn_pswd' => '',
    'trans_id' => 'TRAN0095', // Transaction ID MUST match what was sent in payment_cc_new file
    'md'       => $md,
    'paRes'    => $paRes,
    'options'  => ''
);
It seems that the $_REQUEST['MD'] value goes missing AFTER the SOAP call, although I am having difficulty printing it out to the screen. The strange thing is that the $paRes variable works without issue.
Any ideas why this would be the case?
Check your case. PHP array keys are case-sensitive. From this little bit of code it looks as if the request variable may be 'md' instead of 'MD'.
Try $md = $_REQUEST['md'];
PHP array keys are case-sensitive, so this should work:
$md = $_REQUEST['md'];
Thanks for your responses, guys.
What was happening was that the included page was sitting in front of the request reads and causing issues loading the $_REQUEST values on the page.

Prevent timeout during large request in PHP

I'm making a large request to the Brightcove servers to do a batch change of metadata on my videos. It seems like it only made it through 1000 iterations and then stopped. Can anyone help in adjusting this code to prevent a timeout from happening? It needs to make about 7000 to 8000 iterations.
<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);
//print_r($videos);

foreach ($videos as $video) {
    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;

    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );

            # Update a video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated. <br />";
            break;
    }
}
?>
Thanks!
Try the set_time_limit() function. Calling set_time_limit(0) will remove any time limit on the execution of the script.
Also use ignore_user_abort() to bypass a browser abort: the script will keep running even if you close the browser (use with caution).
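A minimal sketch of how those two calls might sit at the top of the batch script:
<?php
// Remove PHP's execution time limit for this long-running batch job.
set_time_limit(0);

// Keep running even if the browser disconnects (use with caution).
ignore_user_abort(true);

// ... the Echove batch-update loop from the question follows here ...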
Try sending a 'Status: 102 Processing' header every now and then to prevent the browser from timing out (your best bet is about 15 to 30 seconds in between). After the request has been processed you may send the final response.
The browser shouldn't time out any more this way.
