Non-blocking transactions when reading from ActiveMQ queue with STOMP - php

I'm interacting with ActiveMQ via STOMP. I have one process which publishes messages and multiple processes that subscribe and process the messages (about 10 parallel instances).
After reading a message I want to be sure that if, for some reason, my application fails or crashes, the message will not be lost. So naturally, I turned to transactions. Unfortunately, I discovered that once a consumer reads a message as part of a transaction, none of the following messages are delivered to the other consumers until the transaction ends.
Test case: the abc queue holds 100 messages. If I run the following code in two different browser tabs, the first will return in 10 seconds and the second will return in 20 seconds.
<?php
// Reader.php
$con = new Stomp("tcp://localhost:61613");
$con->connect();
$con->subscribe(
    "/queue/abc",
    array()
);

// Use a unique transaction id per run
$tx = "tx3" . microtime();
echo "TX:$tx<BR>";
$con->begin($tx);

$messages = array();
for ($i = 0; $i < 10; $i++) {
    $t = microtime(true);
    $msg = $con->readFrame();
    if (!$msg) {
        die("FAILED!");
    }
    $t = microtime(true) - $t;
    echo "readFrame() took $t seconds to complete<BR>";
    array_push($messages, $msg);
    $con->ack($msg, $tx);
    sleep(1);
}
$con->abort($tx);
Is there something I'm missing code-wise? Is there a way to configure ActiveMQ (or send a header) that will make the transaction remove the item from the queue, allow other processes to consume the other messages, and, if the transaction fails or times out, put the item back in?
PS: I thought about creating another queue - a DetentionQueue - for each reading process, but I'd really rather not do that if I have a choice.

You will probably want to adjust the prefetch size of the subscription so that ActiveMQ doesn't send the messages on the queue to client 1 before client 2 gets a chance to get any. By default it's set to 1000, so it's best to tune it for your use case.
You can set the prefetch size via the "activemq.prefetchSize=1" header on the SUBSCRIBE frame. Refer to the ActiveMQ Stomp page for all the frame options.
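For example, with the same Stomp extension used in the question, passing the header on subscribe might look like this (a minimal sketch; a prefetch of 1 makes the broker dispatch only one unacknowledged message at a time to each consumer):
<?php
// Sketch: subscribe with a prefetch of 1 so the broker doesn't
// pre-dispatch a large batch of messages to a single consumer.
$con = new Stomp("tcp://localhost:61613");
$con->connect();
$con->subscribe(
    "/queue/abc",
    array("activemq.prefetchSize" => 1)
);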

Related

Telegram bot goes into an infinite loop when iterating over a big list on the backend

Hello, I am trying to add a broadcast command to my Telegram bot which broadcasts a specific message to all my bot subscribers, whose IDs are saved in a MySQL database. But the loop never seems to end, and it restarts after a random number of sent messages.
For example: the bot starts messaging, then stops at 987 users (or some other number) and restarts the loop over and over.
This is the code that I am using:
<?php
http_response_code(200);
$admin_id = ''; // my id
$bot_token = ''; // my bot token
// $con (mysqli connection), startsWith(), sendmessage() and editmessage()
// are defined elsewhere in the bot.
$message_object = json_decode(file_get_contents("php://input"), true);
$message_text = $message_object['message']['text'];
if (startsWith($message_text, '/broadcast') !== false) {
    $text_to_send = trim(str_replace('/broadcast', '', $message_text));
    $start_message = sendmessage($admin_id, 'Broadcasting Started', $bot_token, 'markdown');
    $start_message_id = $start_message['result']['message_id'];
    $query = mysqli_query($con, "SELECT * FROM users");
    if ($query and mysqli_num_rows($query) >= 1) {
        $all_users = mysqli_fetch_all($query, MYSQLI_ASSOC);
        $sent = 0;
        foreach ($all_users as $user) {
            $user_id = $user['userid'];
            $sent += 1;
            sendmessage($user_id, $text_to_send, $bot_token, 'markdown');
            sleep(1);
            editmessage($admin_id, "Messages Sent : $sent", $bot_token, $start_message_id);
        }
        sendmessage($admin_id, 'finished broadcasting ' . $sent . ' messages', $bot_token, 'markdown');
    }
}
?>
I never manage to get to the end of the loop to receive the "finished broadcasting" message; it's stuck in an infinite loop.
The same issue happens when I try to import more than 50 items into the MySQL database using the same method as the broadcast.
I think that's a problem with PHP's maximum execution time: by default PHP has a max execution time of 30s. After 30 seconds the script is terminated and an error is reported to the client who made the initial request (in this case the Telegram API). Telegram sees the error thrown by your script and repeats the same request, so the script is executed again and again every 30 seconds. A possible solution may be the following:
Add the following code before $admin_id = '';
set_time_limit(100);         // Raise the max execution time
ignore_user_abort(true);     // Keep running even if the client disconnects
header('Connection: close'); // Tell the client the response is complete ...
flush();                     // ... and flush the output to it
fastcgi_finish_request();    // On PHP-FPM, close the connection immediately
This code immediately closes the connection with Telegram, so it doesn't have to wait until the script terminates, and it doesn't call the script again if an error occurs. With the set_time_limit(100) function you can raise the execution limit (for example to 100 seconds) so PHP doesn't kill everything before you have sent the broadcast message to everyone. If this operation takes more than 100 seconds, just increase the limit, or set it to 0 so no limit is enforced; according to the set_time_limit docs: "If set to zero, no time limit is imposed."

How to send a large number of emails?

$i = 1;
foreach ($recipients as $email => $name) {
    $mail->ClearAddresses();
    $mail->AddBCC($email, $name);
    if (!$mail->send()) {
        $send = 0;
    } else {
        $send = 1;
    }
    $query = "INSERT INTO `newsletter_send`(`email`, `id_newsletter`, `date`, `send`) VALUES ('$email', $id_newsletter, NOW(), $send)";
    $stmt = $link->prepare($query) or die('error');
    $stmt->execute();
    $mail->clearAllRecipients();
    // Pause for a minute after every 100 emails
    if (($i % 100) == 0) {
        sleep(60);
    }
    $i++;
}
What is the best way to send a large amount of emails without sleep() and without waiting for the page to finish loading? Besides a cron job, do you have other ideas?
EDIT: I have 680 users who will receive the email, but after a while I get a 500 Internal Server Error. Why? Maybe it's the time limit?
Message queues.
beanstalkd is a good solution.
You can then use an SDK like pheanstalk to handle the queue and its jobs.
EDIT: If you have restricted access to your server (for example, if you are using a shared hosting) message queues as a service are also an option.
IronMQ
CloudAMQP
AWS (Amazon Web Services) SQS
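For illustration, a producer/worker split with pheanstalk might look roughly like this (a sketch, assuming pheanstalk v4's API, a local beanstalkd on the default port, and a hypothetical sendEmail() helper):
<?php
// producer.php - enqueue one job per recipient instead of mailing inline
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
foreach ($recipients as $email => $name) {
    $pheanstalk->useTube('emails')->put(json_encode(array(
        'email' => $email,
        'name'  => $name,
    )));
}

<?php
// worker.php - run from the CLI (e.g. under supervisord), not a web request
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->watch('emails');
while ($job = $pheanstalk->reserve()) {
    $data = json_decode($job->getData(), true);
    sendEmail($data['email'], $data['name']); // hypothetical mail helper
    $pheanstalk->delete($job);                // drop the job once it's done
}
The web request returns as soon as the jobs are queued; the worker drains the queue at its own pace, so no sleep() in the page is needed.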
A good way to send a large amount of emails at a fast pace is to have a lot of worker scripts doing the job instead of one PHP page (GiamPy gave a good example of one way that can be done, and I won't repeat it since I don't want to be redundant).
One simple (though somewhat hacky) option is for you to make 20 PHP scripts in a folder. You could name them mailer1.php, mailer2.php, ..., mailer20.php. Then, you could create a folder called mail and put two files inside:
mail/config.txt
and
mail/email.txt
Inside mail/config.txt, you would include the following lines of text:
T
15
where the first line has a T for TRUE, meaning you want the mailers to send the mail out as fast as they can, at intervals of 15 seconds each. You can obviously change the interval time to whatever you like.
And in mail/email.txt you would have the complete email you want to send.
After having done all that, you make the mailer files. You can make one first, write the code, and then copy-paste it 19 times to have 20 scripts in total. The code inside could look something like this:
<?php
$pathconfig = "mail/config.txt";
$pathemail = "mail/email.txt";
$email = file_get_contents($pathemail); // now you have the email saved

// Note: fgets() takes the file handle, not the path
$filehandleconfig = fopen($pathconfig, "r");
$bool = trim(fgets($filehandleconfig));
$sleeptime = (integer) trim(fgets($filehandleconfig));
fclose($filehandleconfig);

while ($bool === 'T') {
    //... code that sends the email

    // recheck if 'T' is still 'T':
    $filehandleconfig = fopen($pathconfig, "r");
    $bool = trim(fgets($filehandleconfig));
    fclose($filehandleconfig);

    sleep($sleeptime);
}
?>
So what the previous code basically does is extract the email that needs to be sent at the beginning, along with the time it should sleep after sending each email and whether it should continue sending at all.
That means the mail/config.txt file is your control panel: if you change 'T' to anything other than 'T' (like 'F', for instance), all the scripts will terminate.
The downside of this option is that it's a bit hacky, though the upside is that it can be developed in a matter of minutes.

PHP msg_send can't send more than 525 messages to queue?

When sending a message to the queue via msg_send, everything works fine except for one thing.
Running the function below, if more than 525 messages are put in the queue, the browser keeps loading until any message beyond 525 is processed by the worker script. If I put another 525 messages into another queue ID, there's no problem. Any ideas?
function que($message, $value) {
    if (!defined('QUEUE')) define('QUEUE', 16388);
    // add message to queue
    $queue = msg_get_queue(QUEUE);
    // create dummy message object
    $object = new stdClass;
    $object->message = $message;
    $object->value = $value;
    $object->id = uniqid();
    // send message to queue
    //if (msg_send($queue, 1, $object)) { } else { }
    msg_send($queue, 1, $object);
}
msg_send is just a PHP wrapper for the System V msgsnd call; the default limit for a queue's message buffer is about 16 KB (check MSGMNB on your system).
PHP stores each message as a simple record: a 4- or 8-byte field (32-bit or 64-bit, respectively) describing the type, followed by the message itself.
You can do the calculation yourself, depending on the data you store.
In a C program you can call msgctl() to change this limit, or pass the right one when creating the queue, but I am not sure you can do that from PHP.
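As it happens, PHP does expose the queue parameters via msg_stat_queue() and msg_set_queue(); a minimal sketch (raising msg_qbytes above the system default usually requires root privileges):
<?php
// Sketch: inspect and (privileges permitting) raise the queue's byte limit.
// msg_qbytes corresponds to the System V MSGMNB value for this queue.
$queue = msg_get_queue(16388);

$stats = msg_stat_queue($queue);
echo "current limit: {$stats['msg_qbytes']} bytes\n";

if (!msg_set_queue($queue, array('msg_qbytes' => 65536))) {
    echo "could not raise the queue limit\n";
}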

Prevent PHP from sending multiple emails when running parallel instances

This is more of a logic question than a language question, though the approach might vary depending on the language. In this instance I'm using ActionScript and PHP.
I have a Flash graphic that is getting data stored in a MySQL database served from a PHP script. This part is working fine. It cycles through database entries every time it is fired.
The graphic is not on a website, but is being used at 5 locations, set to load and run at regular intervals (all 5 locations fire at the same time, or at least within <500ms of each other). This is real-time info, so time is of the essence; currently the script loads and parses at all 5 locations in 30-300ms (depending on the distance from the server).
I was originally having a pagination problem, where each of the 5 locations would pull a different database entry, since I was moving to the next entry every time the script ran. I solved this by setting the script to only move to the next entry after a certain amount of time has passed.
However, I also need the script to send an email every time it displays a new entry, and I only want it to send one email. I've attempted to solve this by adding a "has been emailed" boolean to the database. But since all the scripts run at the same time, this rarely works (it does sometimes); most of the time I get 5 emails. The timeliness of this email doesn't have to match how fast the graphic gets info from the script; a 5-10 second delay is fine.
I've been trying to come up with a solution for this. Currently I'm thinking of spawning a Python script through PHP with a random delay (between 2 and 5 seconds), hopefully alleviating the problem. However, I'm not quite sure how to run an exec() command from PHP without the script waiting for the command to finish. Or is there a better way to accomplish this?
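(Aside: on a Unix-like host, exec() returns immediately if you redirect the command's output and background it; a sketch, with a hypothetical script path:
exec('python /path/to/delayed_email.py > /dev/null 2>&1 &');
Without the redirection and the trailing &, exec() blocks until the command finishes.)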
UPDATE: here is my current logic (relevant code only):
//get the top "unread" information from the database
//(`Read` is a reserved word in MySQL, so it needs backticks)
$query = "SELECT * FROM `database` WHERE `Read` = '0' ORDER BY Entry ASC LIMIT 1";
// ... run the query and fetch $row from the result ...

//DATA
$emailed = $row["emailed"];
$Entry = $row["databaseEntryID"];
if ($emailed == 0)
{
    // **CODE TO SEND EMAIL**
    $EmailSent = "UPDATE `database` SET emailed = '1' WHERE databaseEntryID = '$Entry'";
    $mysqli->query($EmailSent);
}
Thanks!
You need to use some kind of locking, e.g. database locking:
function send_email_sync($message)
{
    sql_query("UPDATE email_table SET email_sent=1 WHERE email_sent=0");
    $result = FALSE;
    if (number_of_affected_rows() == 1) {
        send_email_now($message);
        $result = TRUE;
    }
    return $result;
}
The functions sql_query and number_of_affected_rows need to be adapted to your particular database.
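With mysqli, for example, the adaptation might look roughly like this (a sketch; it assumes a connected $mysqli, the email_table from above, and send_email_now() is still a placeholder):
function send_email_sync(mysqli $mysqli, $message)
{
    // Only one process can flip the flag from 0 to 1; the others
    // see zero affected rows and skip sending.
    $mysqli->query("UPDATE email_table SET email_sent = 1 WHERE email_sent = 0");
    if ($mysqli->affected_rows == 1) {
        send_email_now($message); // placeholder from the answer above
        return TRUE;
    }
    return FALSE;
}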
Old answer:
Use file-based locking: (only works if the script only runs on a single server)
function send_email_sync($message)
{
    $fd = fopen(__FILE__, "r");
    if (!$fd) {
        die("something bad happened in ".__FILE__.":".__LINE__);
    }
    $result = FALSE;
    if (flock($fd, LOCK_EX | LOCK_NB)) {
        if (!email_has_already_been_sent()) {
            actually_send_email($message);
            mark_email_as_sent();
            $result = TRUE; //email has been sent
        }
        flock($fd, LOCK_UN);
    }
    fclose($fd);
    return $result;
}
You will need to lock the row in your database by using a transaction.
Pseudo code:
start transaction
select row ... for update
update row
commit
if (mysqli_affected_rows($connection) == 1)
    send_email();
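Concretely, with mysqli and the question's schema, it might look something like this (a sketch; the first process to lock the row sends the email, later ones see emailed = 1 and skip it):
$mysqli->begin_transaction();

// FOR UPDATE blocks other transactions on this row until we commit
$result = $mysqli->query(
    "SELECT emailed FROM `database` WHERE databaseEntryID = '$Entry' FOR UPDATE"
);
$row = $result->fetch_assoc();

if ($row && $row['emailed'] == 0) {
    // **CODE TO SEND EMAIL**
    $mysqli->query(
        "UPDATE `database` SET emailed = '1' WHERE databaseEntryID = '$Entry'"
    );
}

$mysqli->commit();
Note that this requires InnoDB tables; FOR UPDATE row locks have no effect on MyISAM.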

Too slow Http Client in Zend Framework 1.12

I want to send ~50 requests to different pages on the same domain, and then I'm using a DOM object to gather the URLs of articles.
The problem is that this number of requests takes over 30 seconds.
for ($i = 1; $i < 51; $i++)
{
    $url = 'http://example.com/page/'.$i.'/';
    $client = new Zend_Http_Client($url);
    $response = $client->request();
    $dom = new Zend_Dom_Query($response); // without these two lines, execution also takes too long
    $results = $dom->query('li');
}
Is there any way to speed this up?
It's a general problem by design, not the code itself. If you're doing a for-loop over 50 items, each opening a request to a remote URI, things get pretty slow, since every request waits for the response from the remote URI. E.g.: if a request takes ~0.6 sec to complete, multiply this by 50 and you get an execution time of 30 seconds!
Another problem is that most web servers limit their (open) connections per client to a specific amount. So even if you were able to do 50 requests simultaneously (which you currently aren't), things wouldn't speed up measurably.
In my opinion there is only one solution (without any deep-going changes):
Change the amount of requests per execution. Make chunks of e.g. only 5-10 per (script) call and trigger them by an external call (e.g. run them via cron).
Todo:
Build a wrapper function which is able to save the state of its current run ("I did requests 1-10 on my last run, so now I have to call 11-20") into a file or database, and trigger this function via cron.
Example code (untested) for illustration:
[...]
private $_chunks = 10; // amount of calls per run

public function cronAction() {
    $lastrun = 0; // TODO: load the last-run parameter from a local file or database
    $this->crawl($lastrun);
}

private function crawl($lastrun) {
    $limit = $this->_chunks + $lastrun;
    for ($i = $lastrun; $i < $limit; $i++)
    {
        [...] //do stuff here
    }
    // TODO: save the new $lastrun value to the local file / database
}
[...]
I can't think of a way to speed it up, but you can extend the timeout limit in PHP if that is your concern:
for ($i = 1; $i < 51; $i++) {
    set_time_limit(30); // This resets the timer to 30 seconds starting now
    // Do long things here
}
