Clickatell batch templates slow (PHP, HTTP)

I'm trying to set up Clickatell at the moment for sending out batch SMS messages. I've got it working, but it's quite slow: about 20 seconds to send 5 test SMS and 30 seconds for 10.
$nums = array(
    "44-227811116" => "1",
    "44-227819885" => "2",
    "44-227819314" => "3",
    "44-227815413" => "4",
    "44-227819326" => "5"
);

// login
$url = "https://api.clickatell.com/http/auth?api_id=xxxxx&user=xxxxx&password=xxxxx";
$page = Utilities::getWebPage($url);

// session
$clicksessionparts = explode(":", $page);
$clicksession = trim($clicksessionparts[1]);

// batch
$from = "xxxxx";
$batchTemplate = urlencode("Test message #field1#");
$url = "https://api.clickatell.com/http_batch/startbatch?session_id=$clicksession&template=$batchTemplate&from=$from&deliv_ack=1";
$page = Utilities::getWebPage($url);
$batchId = explode(":", $page);
$batchId = trim($batchId[1]);

foreach ($nums as $k => $v) {
    $start = new DateTime();
    print_r($start->format("H i:s"));

    $url = "https://api.clickatell.com/http_batch/senditem?session_id=$clicksession&batch_id=$batchId&to=xxxxx&field1=$v";
    $page = Utilities::getWebPage($url);

    echo "<pre>";
    print_r($page);
    echo "</pre>";

    $end = new DateTime();
    print_r($end->format("H i:s"));
    echo "<br><br>";
}

You should be able to submit over 100 messages per second to the HTTP API comfortably.
Creating HTTPS connections is a very slow process (compared to HTTP). If you want better performance with HTTPS, you will have to reuse the connections.
I am guessing that Utilities::getWebPage() is creating a new HTTPS connection each time? For PHP I would suggest you look at using cURL.
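For illustration, here is a minimal sketch of that idea, reusing one cURL handle for every senditem call (the URL parameters mirror the question's code; using the array key $k as the destination number is an assumption, since the original value was redacted):
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

foreach ($nums as $k => $v) {
    // Only the URL changes; cURL keeps the HTTPS connection (and its
    // TLS session) alive between requests, so the handshake cost is paid once.
    $url = "https://api.clickatell.com/http_batch/senditem?session_id=$clicksession&batch_id=$batchId&to=$k&field1=$v";
    curl_setopt($ch, CURLOPT_URL, $url);
    $page = curl_exec($ch);
    // ... check $page for errors ...
}

curl_close($ch);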
If you want to go another step further (I doubt you need to go this far), you can consider using curl_multi. It's a bit more work, though, and most people don't need that much speed (some find it easier to use another API, like the SMTP API, so they can put many messages in one email).
Also, you technically do not need to use the batch commands on the HTTP API to send your messages (unless you want to). You can send millions with just api.clickatell.com/http/sendmsg?..., in which case there is no need to do a startbatch call.
With something like the SMTP API, you can put 100,000 messages in one email (if you need unique text per message, you would use the batch facility on that API).

Google Translate API POST 11-second delay?

EDIT: 11.5 seconds for 28 messages
Single requests work fine. The code below takes 11 seconds, measured with Postman hitting the API route.
Am I doing something wrong? I feel as though it shouldn't take 11 seconds even without a cache.
$xs = ChatMessage::where('chat_room_id', '=', $roomId)
    ->with('user')
    ->orderBy('created_at', 'DESC')
    ->get();

foreach ($xs as $r) {
    $translate = new TranslateClient([
        'key' => 'xxxxxxxxxxxxxxxxxxxxxxx'
    ]);
    $result = $translate->translate($r->message_english, [
        'target' => 'es',
        'source' => 'en',
    ]);
    $r->message = $result['text'];
}

return $xs;
I think you can easily speed up the process by moving the client out of the foreach loop. You are creating a new client on every iteration, which is not optimal: you can reuse the same client for each translate call. That should speed up your translation process. You can find samples of this usage in the official GitHub client project.
Here is a sketch of that change in PHP:
$translate = new TranslateClient([
    'key' => 'xxxxxxxxxxxxxxxxxxxxxxx'
]);

foreach ($xs as $r) {
    $result = $translate->translate($r->message_english, [
        'target' => 'es',
        'source' => 'en',
    ]);
    $r->message = $result['text'];
}
Also, how long are your texts? You should pass as much text as possible into a single call (as far as the client library allows), so you also reduce the number of calls made to the API.
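As a sketch of that, the google/cloud-translate PHP client (which the code above appears to use) has a translateBatch() method that accepts an array of strings, so every message can go out in one request; treating $xs as a Laravel collection here is an assumption:
// One client, one API call for the whole room (error handling omitted).
$translate = new TranslateClient(['key' => 'xxxxxxxxxxxxxxxxxxxxxxx']);

$texts = $xs->pluck('message_english')->all(); // collection -> plain array
$results = $translate->translateBatch($texts, [
    'source' => 'en',
    'target' => 'es',
]);

// Results come back in the same order as the input strings.
foreach ($results as $i => $result) {
    $xs[$i]->message = $result['text'];
}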
If you still have issues, you can use multiple requests in parallel, as mentioned in the comments.
Some useful links about this:
PHP Documentation Overview
Translate Client

What type of configuration is required to run multi-requests in PHP?

I want to run 2500+ calls at the same time, so I have created batches of 100 (2500/100 = 25 batches in total).
// REQUEST_BATCH_LIMIT = 100
$insert_chunks = array_chunk(['array', 'i want', 'to', 'insert'], REQUEST_BATCH_LIMIT);

$mh = $running = $ch = [];
foreach ($insert_chunks as $chunk_key => $insert_chunk) {
    $mh[$chunk_key] = curl_multi_init();
    $ch[$chunk_key] = [];
    foreach ($insert_chunk as $ch_key => $_POST) {
        $ch[$chunk_key][$ch_key] = curl_init('[Dynamic path of API]');
        curl_setopt($ch[$chunk_key][$ch_key], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }

    do {
        curl_multi_exec($mh[$chunk_key], $running[$chunk_key]);
        curl_multi_select($mh[$chunk_key]);
    } while ($running[$chunk_key] > 0);

    foreach (array_keys($ch[$chunk_key]) as $ch_key) {
        $response = curl_getinfo($ch[$chunk_key][$ch_key]);
        $returned_data = curl_multi_getcontent($ch[$chunk_key][$ch_key]);
        curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }
    curl_multi_close($mh[$chunk_key]);
}
When I run this locally, the whole system hangs.
But a batch limit like 100 or 500 does not behave the same on different devices and servers. What is the reason for that, and what should I change to increase it?
If I add 1000 records in batches of 50, every batch should insert 50 records, but batches insert random amounts like 40, 42, 48, etc. Why are calls being skipped? (If I send one record at a time with a simple cURL loop, it works fine.)
P.S. I am using this code for the BigCommerce API.
The BigCommerce API definitely throttles requests. The limits are different depending on which plan you are on.
https://support.bigcommerce.com/s/article/Platform-Limits
The "Standard Plan" is 20,000 per hour. I'm not sure how that is really implemented, though, because in my own experience, I've been throttled before hitting 20,000 requests in an hour.
As Nico Haase suggests, the key is for you to log every response you get from the BigCommerce API. While not a perfect system, they do usually provide a response that is helpful to understand the failure.
I run a process that makes thousands of API requests every day. I do sometimes have requests that fail as if the BigCommerce API simply dropped the connection.
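As a sketch of that logging, the inner result loop from the question could check each handle's HTTP status and record anything that is not a success, which makes throttling (HTTP 429) and dropped connections visible; the error_log() destination is just an example:
foreach (array_keys($ch[$chunk_key]) as $ch_key) {
    $info = curl_getinfo($ch[$chunk_key][$ch_key]);
    $body = curl_multi_getcontent($ch[$chunk_key][$ch_key]);

    if ($info['http_code'] == 429) {
        // Throttled by the API: log it and re-queue the request for a retry.
        error_log("Rate limited: {$info['url']}");
    } elseif ($info['http_code'] >= 400 || $body === '') {
        // Other failures (or an empty body from a dropped connection).
        error_log("Failed ({$info['http_code']}): {$info['url']}");
    }

    curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
}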

Message sending Telegram bot (PHP)

I know it's somewhat weird to ask something like this, but I'm trying to program a Telegram bot in PHP.
The bot is in a channel (e.g. Channel A) and I'm going to send messages in that channel; the bot should copy X of those messages to another channel (Channel B) every Y minutes.
Example:
X = 5
Y = 60
Channel A = ID .....
Channel B = ID .....
So it will COPY 5 messages from A to B every hour...
Can anybody write me a template, please? I think I can configure the VPS and the webhook stuff (SSL, etc.).
If you need to send messages every few minutes and receive messages from Telegram callbacks, you should read about queues (ZMQ, Redis, Gearman, etc.).
Create daemons; these are your bots. They read messages from the queue and send callbacks.
Write a controller that receives the callback from Telegram. It takes the message and pushes it onto the queue.
Install the Ev or Event extension for PHP. (You can also use ReactPHP; it is a simple way to create a timer. A sketch follows below.)
Bot1 creates a timer and listens for messages. If there are more than 5 messages, the timer can push them onto the queue for Bot2.
You can use reactphp/zmq and nrk/predis-async to help with this task.
P.S. This is the simplest solution, but you could also use pthreads (instead of creating daemon processes) or a plain socket to send messages to the bot.
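For the timer itself, a minimal sketch assuming the react/event-loop package (the queue helpers popFromQueue() and sendToChannelB() are placeholders, not real library functions):
use React\EventLoop\Loop;

$x = 5;  // messages per run
$y = 60; // minutes between runs

Loop::addPeriodicTimer($y * 60, function () use ($x) {
    // Pop up to $x queued messages from Channel A and
    // forward each one to Channel B.
    foreach (popFromQueue($x) as $message) {
        sendToChannelB($message);
    }
});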
If you want to use webhooks, you can do this.
Write some sample code like this:
<?php
// $update_array comes from the webhook update (see note below)
$texts_from_other_channel = [];
array_push($texts_from_other_channel, $update_array['message']['text']);

$t_size = sizeof($texts_from_other_channel);
for ($i = 0; $i < $t_size; $i++) {
    $post_prs = ['chat_id' => $channel_id, 'text' => $texts_from_other_channel[$i]];
    send_reply($sendmessag_url, $post_prs);
}
?>
Other things, like the send_reply() function or $update_array, are up to you; I leave them to you.
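As a side note, the Bot API also has a copyMessage method that copies a single message from one chat to another, which fits this task. A minimal sketch, assuming you stored the message IDs from your webhook updates ($botToken, $channelA, $channelB and $messageId come from your own config):
// Copy one message from Channel A to Channel B via the Bot API.
function copyMessage($botToken, $fromChatId, $toChatId, $messageId) {
    $ch = curl_init("https://api.telegram.org/bot$botToken/copyMessage");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, [
        'chat_id'      => $toChatId,
        'from_chat_id' => $fromChatId,
        'message_id'   => $messageId,
    ]);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}

copyMessage($botToken, $channelA, $channelB, $messageId);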

slow php imap email fetching ( >10 seconds )

I am building my own webmail client, like Roundcube or SquirrelMail. The problem is that my version is very slow, while Roundcube is fast, and I cannot understand why (Roundcube's source is very big and I am unable to dive into it).
The goal: fetch the last 50 messages from the mailbox. My strategy:
Get the number of messages in the mailbox with imap_num_msg()
Make an array of sequence numbers from max down to (max-50)
For each sequence number, run imap_header() and imap_fetchstructure()
It takes 10-15 seconds. It lets me get each message's subject, date, whether it has attachments, from, to, and other information.
However, Roundcube displays the same info with a load time of only 3 seconds or so. My strategy seems to be very wrong. How can I do it faster? I'm pretty sure that running imap_header() and imap_fetchstructure() for every sequence number is what makes it slow, but I think there is no other way to get that information. I'm doing something like this:
function getMessageBySequenceNumber($imapStream, $sequence_number) {
    $header = imap_header($imapStream, $sequence_number);
    $structure = imap_fetchstructure($imapStream, $sequence_number);
    /*
    ... some code parsing $structure to find out whether this email has any attachments or not
    */
    return [
        'uid' => imap_uid($imapStream, $sequence_number),
        'subject' => $header->subject,
        'timestamp' => $header->udate,
        'unseen' => $header->Unseen,
        'star' => $header->Flagged,
        'draft' => $header->Draft,
        'size' => $header->size,
        'has_attachments_bool' => $has_attachments_bool,
    ];
}

$imapStream = imap_open(/* mailbox, username, password */);
$first_sequence_number = imap_num_msg($imapStream); // let's imagine it returns 100
$last_sequence_number = $first_sequence_number - 50;
$sequence_numbers = [100, 99, 98, ..., 51, 50];

$messages = [];
foreach ($sequence_numbers as $sequence_number) {
    $messages[] = getMessageBySequenceNumber($imapStream, $sequence_number);
}
return $messages;
You are fetching the messages one by one. This means that your PHP code has to wait for the remote IMAP server to answer, process the (partial) response, send the next request to the server, and so on.
Use an IMAP library that allows batched operations, and read RFC 3501 to understand how to use it.
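With PHP's built-in IMAP extension, one option along those lines is imap_fetch_overview(), which accepts a whole sequence range, so the header data for all 50 messages comes back in a single round trip. A sketch (the attachment check would still need the body structure):
// One round trip for the headers of the last 50 messages.
$total = imap_num_msg($imapStream);
$first = max(1, $total - 49);

$messages = [];
foreach (imap_fetch_overview($imapStream, "$first:$total") as $o) {
    $messages[] = [
        'uid'       => $o->uid,
        'subject'   => $o->subject ?? '',
        'timestamp' => $o->udate,
        'unseen'    => !$o->seen,
        'star'      => $o->flagged,
        'draft'     => $o->draft,
        'size'      => $o->size,
    ];
}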

Facebook Batch API insight requests

For a project I have to grab the insights of a Facebook page over a long period of time (e.g. 1-2 years).
I first tried a single request, but it turned out that requesting only
/PAGE_ID/insights?since=xxx&until=xxx
doesn't return all the data I want (it somehow suppresses data, as if there were some limit to the size of the answer).
I then tried to split up the date range (e.g. 01.04.2011-01.04.2012 -> 01.04.2011-01.08.2011-01.12.2011-01.04.2012), which didn't quite work like I wanted it to either.
My next approach was to request only the insight values I need, like 'page_stories, page_impressions, ...'. The requests looked like this:
/PAGE_ID/insights/page_impressions/day?since=xxx&until=xxx
This actually worked, but not with AJAX. It sometimes seemed to drop some requests (especially if I changed the browser tab in Google Chrome), and I need to be sure that all requests return an answer. A synchronous solution would take way too much time: one request needs at least 2 seconds, and with a date range of 2 years I may have about 300 single requests, so it takes far too long to complete.
Lastly, I stumbled over Facebook's ability to do batch requests, which is exactly what I need. It can pack up to 50 requests into one call, which significantly lowers the bandwidth. And here's where I'm stuck. The Facebook API gives some examples of how to use it, but none of them worked when I tested them in the Graph Explorer or via the PHP Facebook SDK. I tried to pack this request
PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600
into a batch request, but failed.
It seems that the API is bugged. It's always giving me this error when I use a question mark '?' in the 'relative_url' field:
{
  "error": {
    "message": "batch parameter must be a JSON array",
    "type": "GraphBatchException"
  }
}
Here is what I tried:
This one gives the 'must be a JSON array' error:
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600"}]
These actually return data, but they ignore the parameters:
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000 until=1333695600"}]
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000,until=1333695600"}]
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":{"since":"1332486000","until":"1333695600"}}]
And this one tells me that this is an 'Unsupported post request':
?batch=[{"method":"POST","relative_url":"/PAGE_ID/insights/page_fan_adds/day","body":"since=1332486000 until=1333695600"}]
Can someone help?
I finally found the solution to my problem. It's not mentioned in the Facebook documentation, but for this request
?batch=[{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332486000&until=1333695600"}]
to work properly, we have to use a function like urlencode() to encode the JSON part. This way the queries work like a charm. A PHP example:
$insights = $facebook->api(
    '?batch=[' . urlencode('{"method":"GET","relative_url":"/PAGE_ID/insights/page_fan_adds/day?since=1332572400&until=1333782000"}') . ']',
    'post',
    array('access_token' => $facebook->getAccessToken())
);
which results in this:
?batch=[%7B%22method%22%3A%22GET%22%2C%22relative_url%22%3A%22%2FPAGE_ID%2Finsights%2Fpage_fan_adds%2Fday%3Fsince%3D1300086000%26until%3D1307862000%22%7D]
This example is for using an array of IDs to make a batch request with urlencoding.
$postIds = [
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
    'XXXXXXXXXXXXXXX_XXXXXXXXXXXXXXX',
];

$queries = [];
foreach ($postIds as $postId) {
    $queries[] = [
        'method' => 'GET',
        'relative_url' => '/' . $postId . '/comments?summary=1&filter=stream&order=reverse_chronological',
    ];
}

$requests = $facebook->post('?batch=' . urlencode(json_encode($queries)))->getGraphNode();
