PHP msg_send can't send more than 525 messages to a queue?

When sending a message to the queue via msg_send, everything works fine except for one thing.
Running the function below, if more than 525 messages are put in the queue, the browser hangs loading until any message beyond 525 has been processed by the worker script. If I put another 525 messages into a different queue ID, there is no problem. Any ideas?
function que($message, $value) {
    if (!defined('QUEUE')) define('QUEUE', 16388);
    // add message to queue
    $queue = msg_get_queue(QUEUE);
    // create dummy message object
    $object = new stdClass;
    $object->message = $message;
    $object->value = $value;
    $object->id = uniqid();
    // send message to queue
    //if (msg_send($queue, 1, $object)) { } else { }
    msg_send($queue, 1, $object);
}

msg_send is just a PHP wrapper around the System V msgsnd call, and the default limit for a queue's message buffer is about 16 KB (check MSGMNB on your system).
PHP stores each message as a simple record: a 4- or 8-byte field (on 32-bit or 64-bit systems respectively) describing the type, followed by the message body itself.
You can do the calculation yourself, depending on the data you store.
In a C program you can call msgctl() to change this limit, or pass the right size when creating the queue; from PHP, msg_set_queue() may let you adjust msg_qbytes, though raising the limit above the system ceiling typically requires root privileges.
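A minimal sketch of inspecting and adjusting the limit from PHP (assuming the sysvmsg extension is loaded; whether the raise succeeds depends on your privileges and system maximum):

```php
<?php
// Sketch: inspect the queue's byte limit and attempt to raise it.
// Raising msg_qbytes above the system ceiling usually requires root.
$queue = msg_get_queue(16388);

$stat = msg_stat_queue($queue);
echo "bytes queued: {$stat['msg_cbytes']}, limit: {$stat['msg_qbytes']}\n";

// Try to double the limit; check the return value, since this can
// fail for permission reasons.
if (!msg_set_queue($queue, array('msg_qbytes' => $stat['msg_qbytes'] * 2))) {
    echo "could not raise msg_qbytes\n";
}
```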

Related

Can a Queue be rescheduled in OctoberCMS

Task:
I am creating a way for a frontend user to send a message and schedule a time for it to be delivered. To accomplish this, I am storing the message info in database tables and then setting a queued job to fire a send function at the appropriate time.
Question:
If the user changes their mind about the time to send the message after this code is executed, is there a way to remove this job from the queue and then re-add it to fire at a different time?
Example
$data = ['message_id' => $this->messageModel->id];
$queue = Queue::later($this->send_at, 'KurtJensen\Twilio\Classes\SendQueue', $data);
// ==== Everything works great up to this point =======
// Don't know if this will work
// Can I get a queue identifier here?
$this->messageModel->queue_id = $queue->id;
$this->messageModel->save();
Then later to change time:
$this->messageModel= Message::find($id);
$q_id = $this->messageModel->queue_id;
// ==== I doubt this would work or if canceling a queue is possible =======
Queue::cancel($q_id);
$queue = Queue::later($new_time, 'KurtJensen\Twilio\Classes\SendQueue', $data);
$this->messageModel->queue_id = $queue->id;
$this->messageModel->save();
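Laravel's Queue facade (which October uses under the hood) has no Queue::cancel(). One hedged workaround, valid only for the database queue driver (the `jobs` table and the id returned by Queue::later are driver internals, so treat this as a sketch):

```php
<?php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Queue;

// Schedule the job; with the database driver, later() returns the
// inserted job id, which we persist for later rescheduling.
$jobId = Queue::later($this->send_at, 'KurtJensen\Twilio\Classes\SendQueue', $data);
$this->messageModel->queue_id = $jobId;
$this->messageModel->save();

// Rescheduling: delete the still-pending row, then queue a replacement.
// If a worker already reserved the job, the delete simply affects 0 rows.
DB::table('jobs')->where('id', $this->messageModel->queue_id)->delete();
$newId = Queue::later($new_time, 'KurtJensen\Twilio\Classes\SendQueue', $data);
$this->messageModel->queue_id = $newId;
$this->messageModel->save();
```

An alternative that avoids driver internals entirely: keep the send time in your own table and have the queued job re-check it when it wakes up, bailing out (or re-queueing itself) if the time changed.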

How to send a large email?

$i = 1;
foreach ($recipients as $email => $name) {
    $mail->ClearAddresses();
    $mail->AddBCC($email, $name);
    $send = $mail->send() ? 1 : 0;
    // bind the values instead of interpolating them into the SQL
    $query = "INSERT INTO `newsletter_send` (`email`, `id_newsletter`, `date`, `send`) VALUES (?, ?, NOW(), ?)";
    $stmt = $link->prepare($query) or die('error');
    $stmt->bind_param('sii', $email, $id_newsletter, $send);
    $stmt->execute();
    $mail->clearAllRecipients();
    if (($i % 100) == 0) {
        sleep(60);
    }
    $i++;
}
What is the best way to send a large number of emails without sleep() and without waiting for the page to finish loading? Besides a cron job, do you have other ideas?
EDIT: I have 680 users who will receive the email, but after a while I get a 500 Internal Server Error. Why? Could it be max_execution_time?
Message queues.
beanstalkd is a good solution.
You can then use an SDK like pheanstalk to handle the queue and its jobs.
EDIT: If you have restricted access to your server (for example, if you are using a shared hosting) message queues as a service are also an option.
IronMQ
CloudAMQP
AWS (Amazon Web Services) SQS
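As a sketch of the beanstalkd route (the pheanstalk API differs a bit between versions; the tube name and payload shape here are illustrative):

```php
<?php
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

// Producer (your web page): enqueue one job per recipient and return
// immediately, instead of mailing inline.
$queue = Pheanstalk::create('127.0.0.1');
$queue->useTube('newsletter');
foreach ($recipients as $email => $name) {
    $queue->put(json_encode(array('email' => $email, 'name' => $name)));
}

// Worker (separate long-running CLI process; run as many as you like):
$worker = Pheanstalk::create('127.0.0.1');
$worker->watch('newsletter');
while ($job = $worker->reserve()) {
    $payload = json_decode($job->getData(), true);
    // ... send the single email and log it to newsletter_send here ...
    $worker->delete($job); // remove the job only after a successful send
}
```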
A good way to send a large amount of emails at a fast pace is to have a lot of worker scripts doing the job instead of one PHP page (GiamPy gave a good example of one way that can be done, and I won't repeat it).
One simple (though somewhat hacky) option is to make 20 PHP scripts in a folder. You could name them mailer1.php, mailer2.php, ..., mailer20.php. Then, you could create a folder called mail and put two files inside:
mail/config.txt
and
mail/email.txt
Inside mail/config.txt, you would include the following lines of text:
T
15
where the first line has a T for TRUE, meaning you want the mailers to send the mail out as fast as they can in intervals of 15 seconds each. You can obviously change the interval time to whatever you like.
And in mail/email.txt you would have the complete email you want to send
After having done all that, you make the mailer files. You can write the code for the first one, then copy-paste it 19 times to have 20 scripts in total. The code inside could look something like this:
<?php
$pathconfig = "mail/config.txt";
$pathemail  = "mail/email.txt";
$email = file_get_contents($pathemail); // now you have the email saved

$filehandleconfig = fopen($pathconfig, "r");
// note: fgets() takes the file handle, not the path string
$bool = trim(fgets($filehandleconfig));
$sleeptime = (integer) trim(fgets($filehandleconfig));
fclose($filehandleconfig);

while ($bool === 'T') {
    // ... code that sends the email

    // recheck if 'T' is still 'T':
    $filehandleconfig = fopen($pathconfig, "r");
    $bool = trim(fgets($filehandleconfig));
    fclose($filehandleconfig);

    sleep($sleeptime);
}
?>
So what the previous code does is extract the email that needs to be sent at the beginning, along with the time to sleep after each send and whether it should continue sending at all.
That means the mail/config.txt file is your control panel: if you change 'T' to anything other than 'T' (like 'F', for instance), all the scripts will terminate.
The downside of this option is that it's a bit hacky, though the upside is that it can be developed in a matter of minutes.

How to build a mysql queuing system with count downs?

I am working on a browser/mobile game and I am trying to build a system that automatically ends queued tasks after a certain time has passed. It's the basic research schema used in most games.
Research A costs $100 and takes 1 hour to complete. Do I have to check every second for tasks that are at or past their completion time and trigger an event to clear them and increment the level number? Is there a better or more optimal way? This idea works by itself, but what happens if you need to run 5 or 6 different queues in the game design? Should I abstract them enough to get them all in one table?
I apologize if I seem a little vague or erratic with my questions. I am trying to figure out where to start with this concept.
I'm not very familiar with it, but I believe you could use WebSockets (or NodeJS) to create a callback event, which you could then trigger from a PHP socket server.
You can base yourself off this tutorial: http://www.sanwebe.com/2013/05/chat-using-websocket-php-socket
Steps
First, identify the message type using the websocket.onmessage callback; something similar to this should work:
websocket.onmessage = function(ev)
{
    var msg = JSON.parse(ev.data); // assuming you'll encode the message components in JSON with PHP
    if (msg.type == "research_end")
    {
        FinishResearch(msg.content); // assuming the content element of the JSON object contains the ID of the research
    }
};
Secondly, make the server send the actual message. To keep this from getting too complicated or long, I'll just pretend that sendMessage($msg, $client) is a function that sends a message to a client.
However, as explained in the tutorial, each client socket is stored in an array called $clients; you'll have to add some kind of identifier to each research so it's easy to know which research belongs to which client.
Now, here's an important part. On the server there will be a variable called $research which will be structured as such:
$research['peername'][0]['time'] = 60000
$research['peername'][0]['type'] = 20
You can add a research by sending an outgoing message to the websocket server like this:
var msg = {message: '20', type: 'research', time: '300000'}; // create the request object
websocket.send(JSON.stringify(msg)); // send it to the socket server as a JSON string; decode it with json_decode once it arrives
Then, when it gets to the server and is identified as a research request, we call a callback called doResearch, which takes two arguments:
// loop through all connected sockets
foreach ($changed as $changed_socket) {
    // check for any incoming data
    while (socket_recv($changed_socket, $buf, 1024, 0) >= 1) {
        $received_text = unmask($buf); // unmask data
        $msg_array = json_decode($received_text, true); // decode the JSON string into an array
        doResearch($msg_array, $changed_socket); // let's say this function contains all the procedures to do the research
    }
}
doResearch would be similar to this:
function doResearch($msg_array, $socket)
{
    global $research; // the server-wide research table

    socket_getpeername($socket, $addr); // fills $addr with the peer's address
    $count = isset($research[$addr]) ? count($research[$addr]) : 0;
    $research[$addr][$count]['time'] = time() + $msg_array['time'] / 1000; // absolute finish time in seconds
    $research[$addr][$count]['type'] = $msg_array['type'];
    $research[$addr][$count]['socket'] = $socket; // so the finish message can be sent back
}
And finally, you would have to add a conditional like this inside the main server loop:
foreach ($research as $name => $entries)
{
    foreach ($entries as $key => $entry)
    {
        if (time() >= $entry['time']) // assuming 'time' holds the absolute finish timestamp
        {
            $sql->query("INSERT INTO researches (`peer`, `researchid`) VALUES ('".$name."', '".$entry['type']."')");
            sendMessage('Research type '.$entry['type'].' has finished.', $entry['socket']);
            unset($research[$name][$key]); // don't report the same research again
        }
    }
}
Then, that would check if a research has been finished and insert it into the database.
Hope this helps.

Prevent PHP from sending multiple emails when running parallel instances

This is more of a logic question than language question, though the approach might vary depending on the language. In this instance I'm using Actionscript and PHP.
I have a flash graphic that is getting data stored in a mysql database served from a PHP script. This part is working fine. It cycles through database entries every time it is fired.
The graphic is not on a website but is used at 5 locations, set to load and run at regular intervals (all 5 locations fire at the same time, or at least within <500 ms of each other). This is real-time info, so time is of the essence; currently the script loads and parses at all 5 locations in 30-300 ms (depending on the distance from the server).
I was originally having a pagination problem, where each of the 5 locations would pull a different database entry, since I was moving to the next entry every time the script ran. I solved this by setting the script to only move to the next entry after a certain amount of time had passed.
However, I also need the script to send an email every time it displays a new entry, I only want it to send one email. I've attempted to solve this by adding a "has been emailed" boolean to the database. But, since all the scripts run at the same time, this rarely works (it does sometimes). Most of the time I get 5 emails sent. The timeliness of sending this email doesn't have to be as fast as the graphic gets info from the script, 5-10 second delay is fine.
I've been trying to come up with a solution for this. Currently I'm thinking of spawning a Python script through PHP with a random delay (between 2 and 5 seconds), hopefully alleviating the problem. However, I'm not quite sure how to run an exec() command from PHP without the script waiting for the command to finish. Or is there a better way to accomplish this?
UPDATE: here is my current logic (relevant code only):
//get the top "unread" information from the database
$query = "SELECT * FROM `database` WHERE `Read` = '0' ORDER BY Entry ASC LIMIT 1";
//DATA
$emailed = $row["emailed"];
$Entry = $row["databaseEntryID"];
if ($emailed == 0)
{
    // ** CODE TO SEND EMAIL **
    $EmailSent = "UPDATE `database` SET emailed = '1' WHERE databaseEntryID = '$Entry'";
    $mysqli->query($EmailSent);
}
Thanks!
You need to use some kind of locking, e.g. database locking:
function send_email_sync($message)
{
    // atomic test-and-set: only one instance can flip the flag
    sql_query("UPDATE email_table SET email_sent = 1 WHERE email_sent = 0");
    $result = FALSE;
    if (number_of_affected_rows() == 1) {
        send_email_now($message);
        $result = TRUE;
    }
    return $result;
}
The functions sql_query and number_of_affected_rows need to be adapted to your particular database.
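Applied to the question's schema, the same test-and-set idea with mysqli might look like this (send_email_now() is a hypothetical mail helper):

```php
<?php
// Sketch: only the instance whose UPDATE actually flips the flag
// (affected rows == 1) wins the race and sends the email.
function send_email_sync(mysqli $mysqli, $entryId, $message)
{
    $stmt = $mysqli->prepare(
        "UPDATE `database` SET emailed = 1 WHERE databaseEntryID = ? AND emailed = 0"
    );
    $stmt->bind_param('i', $entryId);
    $stmt->execute();

    if ($stmt->affected_rows === 1) {
        send_email_now($message); // hypothetical mail helper
        return true;
    }
    return false; // another instance already claimed this row
}
```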
Old answer:
Use file-based locking (this only works if the script runs on a single server):
function send_email_sync($message)
{
    $fd = fopen(__FILE__, "r");
    if (!$fd) {
        die("something bad happened in ".__FILE__.":".__LINE__);
    }
    $result = FALSE;
    if (flock($fd, LOCK_EX | LOCK_NB)) {
        if (!email_has_already_been_sent()) {
            actually_send_email($message);
            mark_email_as_sent();
            $result = TRUE; // email has been sent
        }
        flock($fd, LOCK_UN);
    }
    fclose($fd);
    return $result;
}
You will need to lock the row in your database by using a transaction.
Pseudo code:
Start transaction
select row .. for update
update row
commit
if (mysqli_affected_rows($connection) == 1)
    send_email();
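A hedged sketch of that pseudo code with mysqli (SELECT ... FOR UPDATE makes concurrent instances block on the row lock, so only the first committer sees emailed = 0):

```php
<?php
// Sketch: pessimistic row locking. Requires an InnoDB table; MyISAM
// has no row locks, so this degrades silently there.
$mysqli->begin_transaction();

$res = $mysqli->query(
    "SELECT emailed FROM `database` WHERE databaseEntryID = '$Entry' FOR UPDATE"
);
$row = $res->fetch_assoc();

if ($row && $row['emailed'] == 0) {
    // ** CODE TO SEND EMAIL **
    $mysqli->query("UPDATE `database` SET emailed = '1' WHERE databaseEntryID = '$Entry'");
}

$mysqli->commit(); // releases the row lock either way
```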

Non-blocking transactions when reading from ActiveMQ queue with STOMP

I'm interacting with ActiveMQ via STOMP. I have one process that publishes messages and multiple processes that subscribe and process them (about 10 parallel instances).
After reading a message I want to be sure that if, for some reason, my application fails/crashes, the message will not be lost. So naturally, I turned to transactions. Unfortunately, I discovered that once a consumer reads a message as part of a transaction, all the following messages are not sent to the other consumers until the transaction ends.
Test case: the abc queue has 100 messages. If I run the following code in two different browser tabs, the first will return in 10 seconds and the second in 20 seconds.
<?php
// Reader.php
$con = new Stomp("tcp://localhost:61613");
$con->connect();
$con->subscribe(
    "/queue/abc",
    array()
);
$tx = "tx3".microtime();
echo "TX:$tx<BR>";
$con->begin($tx);
$messages = array();
for ($i = 0; $i < 10; $i++) {
    $t = microtime(true);
    $msg = $con->readFrame();
    if (!$msg) {
        die("FAILED!");
    }
    $t = microtime(true) - $t;
    echo "readFrame() took $t seconds to complete<BR>";
    array_push($messages, $msg);
    $con->ack($msg, $tx);
    sleep(1);
}
$con->abort($tx);
Is there something I'm missing code-wise? Is there a way to configure ActiveMQ (or send a header) that will make the transaction remove the item from the queue, allow other processes consume the other messages, and if the transaction fails or is timed-out, will put the item back in?
PS: I thought about creating another queue - DetentionQueue for each reading process but I really rather not do it if I have a choice.
You will probably want to adjust the prefetch size of the subscription so that ActiveMQ doesn't dispatch all the messages on the queue to client 1 before client 2 gets a chance at any. By default it's set to 1000, so it's best to tune it for your use case.
You can set the prefetch size via the "activemq.prefetchSize=1" header on the SUBSCRIBE frame. Refer to the ActiveMQ Stomp page for all the frame options.
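With the Stomp client from the question's Reader.php, that header goes on the subscribe call:

```php
<?php
// Sketch: each consumer asks the broker for one message at a time,
// so undelivered messages stay available to the other 9 workers.
$con = new Stomp("tcp://localhost:61613");
$con->connect();
$con->subscribe(
    "/queue/abc",
    array("activemq.prefetchSize" => 1)
);
```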
