Throttling PHPMailer for use in Elgg

I'll be using the social networking software Elgg for an organization that needs to send mass emails to specific groups. The number of emails can range from 10 to 1000 depending on the group. My web host only allows 500 emails per hour, so I need to throttle the script to send one email every 8 seconds.
I'm using PHPMailer with Elgg. The PHPMailer documentation says I should use these two scripts (code below) in conjunction with each other in order to throttle the mailing. I know how I'm going to use the code in the mailing script; I'm just unsure about a couple of things.
1) I don't really understand the purpose of the safe mode check.
2) After looking up set_time_limit(), it looks like I should set this to an amount of time that allows all potential emails to be sent, whether it's 10 or 1000? Or is it a maximum of 30 seconds per loop iteration in case it needs to time out?
3) How should I set this up to get what I need?
Links to the PHPMailer tips describing the code:
http://phpmailer.worxware.com/index.php?pg=tip_ext
http://phpmailer.worxware.com/index.php?pg=tip_pause
<?php
/* The following code snippet will set the maximum execution time
 * of your script to 300 seconds (5 minutes).
 * Note: set_time_limit() does not work with safe_mode enabled.
 */
$safeMode = ( @ini_get("safe_mode") == 'On' || @ini_get("safe_mode") === 1 ) ? TRUE : FALSE;
if ( $safeMode === FALSE ) {
    set_time_limit(300); // sets maximum execution time to 5 minutes (300 seconds)
    // ini_set("max_execution_time", "300"); // this does the same as set_time_limit(300)
}
echo "max_execution_time " . ini_get('max_execution_time') . "<br>";
/* If you are using a loop to execute your mailing list (for example,
 * reading from a database), put the command inside the loop.
 */
while (1 == 1) { // placeholder loop condition
    set_time_limit(30); // sets (or resets) the maximum execution time to 30 seconds
    // .... put code to process in here
    if (1 != 1) { // placeholder exit condition
        break;
    }
}
?>
and
<?php
/* Note: set_time_limit() does not work with safe_mode enabled */
while (1 == 1) { // placeholder loop condition
    set_time_limit(30); // sets (or resets) the maximum execution time to 30 seconds
    // .... put code to process in here
    usleep(1000000); // sleep for 1 million microseconds (1 second) - will not work on Windows servers / PHP 4
    // sleep(1); // sleep for 1 second (use on Windows servers / PHP 4)
    if (1 != 1) { // placeholder exit condition
        break;
    }
}
?>

Safe mode is deprecated as of PHP 5.3 and removed in PHP 5.4, so if your install is relatively recent, it's a moot point: http://php.net/manual/en/ini.sect.safe-mode.php#ini.safe-mode
Calling set_time_limit() resets the counter, so as long as your code reaches the set_time_limit() call in less time than the previous limit allowed (e.g. it gets there in 29 seconds, leaving 1 second on the clock), the timer resets and the code gets another 30 seconds. However, since you don't want your code to be racy, you should simply disable the time limit entirely.
Personally, I wouldn't dump out one email every 8 seconds. I'd blast out the 500 we're allowed, then have a scheduled job fire up the script once an hour and resume from where the blast left off. This will make things a bit bursty for the mail server, but potentially more efficient in the long run, as it could batch together emails for the same recipient domains. E.g. all @aol.com mails in the group of 500 can go together, rather than forcing the server to connect to AOL multiple times to deliver individual mails.
As well, if you're batching like this, a server failure will only be 'bad' during the few seconds when the script's actually running and building emails. The rest of the time the PHP script won't even be running, and it'll be up to the SMTP server to do its thing.
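A minimal sketch of that batch-and-resume idea, run hourly from cron; get_group_recipients(), send_email(), and the offset file are hypothetical stand-ins for your existing Elgg/PHPMailer code and state storage:
<?php
// Run from cron once an hour, e.g.:  0 * * * * php /path/to/send_batch.php
define('BATCH_SIZE', 500); // the host's hourly limit

$offsetFile = __DIR__ . '/send_offset.txt'; // remembers where the last run stopped
$recipients = get_group_recipients();       // hypothetical: full list for the group
$offset     = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

foreach (array_slice($recipients, $offset, BATCH_SIZE) as $recipient) {
    send_email($recipient); // hypothetical wrapper around the PHPMailer call
    $offset++;
}

if ($offset >= count($recipients)) {
    @unlink($offsetFile); // done: reset for the next mailing
} else {
    file_put_contents($offsetFile, $offset); // resume here next hour
}
?>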

I might not be of quick and particular help, but I would consider an asynchronous approach.
This involves storing the task to send an email in a queue and having workers that process those tasks.
The simplest way is to just store the emails in a database and have a cron job running on the server which sends the emails in batches (see the sketch after this answer).
The better (but more complex) solution would be to use some sort of message queue system, like ZeroMQ or the heavier-weight RabbitMQ.
The last, and maybe most comfortable, option off the top of my head is to use a web service like MailChimp or Postmark.
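For the simplest option, here is a minimal sketch of the database queue, assuming a hypothetical email_queue table and a send_email() wrapper around your mail library; a cron job would run this script once an hour:
<?php
// Assumed table (illustrative):
//   CREATE TABLE email_queue (
//       id INT AUTO_INCREMENT PRIMARY KEY,
//       recipient VARCHAR(255), subject TEXT, body TEXT,
//       sent_at DATETIME NULL
//   );
$dbh = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Grab the next unsent batch (sized to a 500-per-hour host limit).
$rows = $dbh->query(
    "SELECT id, recipient, subject, body FROM email_queue
     WHERE sent_at IS NULL ORDER BY id LIMIT 500"
)->fetchAll(PDO::FETCH_ASSOC);

$mark = $dbh->prepare("UPDATE email_queue SET sent_at = NOW() WHERE id = :id");
foreach ($rows as $row) {
    send_email($row['recipient'], $row['subject'], $row['body']); // hypothetical
    $mark->execute([':id' => $row['id']]); // mark as sent so a crash can resume
}
?>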

Related

PHPList parallel processing

I've been using phpList and I'm trying to get parallel processing to work, but it seems that even if I set batch processing and mail queue throttling as well as parallel processing, only one campaign seems to run, even after the batch limit is reached. Here's the mail processing section of the config.php file below. Please note I used the config_extend file to further customise this. I also have a cron job that runs every 5 minutes to check and process the queue.
/* =========================================================================
Queue and Load management
=========================================================================
*/
// If you set up your system to send the message automatically (from commandline),
// you can set this value to 0, so "Process Queue" will disappear from the site
// this will also stop users from loading the page on the web frontend, so you will
// have to make sure that you run the queue from the commandline
// check README.commandline how to do this
define('MANUALLY_PROCESS_QUEUE', 1);
// This setting will activate an initial setup choice for processing the queue
// When "true" it will allow a choice between remote queue processing with the
// phpList.com service or processing it locally in the browser.
// when the value is "false", you can use remote processing in your own way
// as explained at https://resources.phplist.com/system/remote_processing
define('SHOW_PQCHOICE',false);
// batch processing
// if you are on a shared host, it will probably be appreciated if you don't send
// out loads of emails in one go. To do this, you can configure batch processing.
// Please note, the following two values can be overridden by your ISP by using
// a server wide configuration. So if you notice these values to be different
// in reality, that may be the case
// max messages to process
// if there are multiple messages in the queue, set a maximum to work on
define('MAX_PROCESS_MESSAGE', 999);
// process parallel
// if there are multiple messages in the queue, divide the max batch across them
// instead of sending them one by one.
// this only works if you use batch processing. It will divide the batch between the
// campaigns that need sending.
define('PROCESSCAMPAIGNS_PARALLEL',true);
// define the amount of emails you want to send per period. If 0, batch processing
// is disabled and messages are sent out as fast as possible
define('MAILQUEUE_BATCH_SIZE', 360);
// define the length of one batch processing period, in seconds (3600 is an hour)
define('MAILQUEUE_BATCH_PERIOD', 3600);
// to avoid overloading the server that sends your email, you can add a little delay
// between messages that will spread the load of sending
// you will need to find a good value for your own server
// value is in seconds, and you can use fractions, eg "0.5" is half a second
// (or you can play with the autothrottle below)
define('MAILQUEUE_THROTTLE', 10);
// Mailqueue autothrottle. This will try to automatically change the delay
// between messages to make sure that the MAILQUEUE_BATCH_SIZE (above) is spread evenly over
// MAILQUEUE_BATCH_PERIOD, instead of firing the Batch in the first few minutes of the period
// and then waiting for the next period. This only works with mailqueue_throttle off
// and MAILQUEUE_BATCH_PERIOD being a positive value
// it still needs tweaking, so send your feedback to mantis.phplist.com if you find
// any issues with it
define('MAILQUEUE_AUTOTHROTTLE', 0);
// Domain Throttling
// You can activate domain throttling, by setting USE_DOMAIN_THROTTLE to 1
// define the maximum amount of emails you want to allow sending to any domain and the number
// of seconds for that amount. This will make sure you don't send too many emails to one domain
// which may cause blacklisting. Particularly the big ones are tricky about this.
// it may cause a dramatic increase in the amount of time to send a message, depending on how
// many users you have that have the same domain (eg hotmail.com)
// if too many failures for throttling occur, the send process will automatically add an extra
// delay to try to improve that. The example sends 1 message every 2 minutes.
define('USE_DOMAIN_THROTTLE', 0);
define('DOMAIN_BATCH_SIZE', 1);
define('DOMAIN_BATCH_PERIOD', 120);
// if you have very large numbers of users on the same domains, this may result in the need
// to run processqueue many times, when you use domain throttling. You can also tell phplist
// to simply delay a bit between messages to increase the number of messages sent per queue run
// if you want to use that set this to 1, otherwise simply run the queue many times. A cron
// process every 10 or 15 minutes is recommended.
define('DOMAIN_AUTO_THROTTLE', 0);
// MAX_PROCESSQUEUE_TIME
// to limit the time, regardless of batch processing or other throttling of a single run of "processqueue"
// you can set the MAX_PROCESSQUEUE_TIME in seconds
// if a single queue run exceeds this amount, it will stop, just to pick up from where it left off next time
// this allows multiple installations each to run the queue, but slow installations (eg with large emails)
// set to 0 to disable this feature.
define('MAX_PROCESSQUEUE_TIME', 0);
My goal is to have 2 campaigns sending out mail. I know it can't send both campaigns out at once unless I have 2 installs, but even after sending the max batch it doesn't switch to the second campaign. Does anyone have experience with this?
Thanks
After much tinkering and trying to figure out why the emails were being sent out so slowly, it came down to a bandwidth problem. I thought it was something to do with phpList and the scripts, but after checking with our hosting provider I can confirm they had been having bandwidth issues. So it's all fixed and working now.

How do I observe a rate limit in PHP?

So, I'm requesting data from an API.
So far, my API key is limited to:
10 requests every 10 seconds
500 requests every 10 minutes
Basically, I want to request a specific value from every game the user has played.
That is, for example, about 300 games.
So I have to make 300 requests with my PHP. How can I slow them down to observe the rate limit?
(It can take time; the site does not have to be fast.)
I tried sleep(), which resulted in my script crashing. Any other ways to do this?
I suggest setting up a cron job that executes every minute, or even better, using Laravel's scheduling, rather than using sleep or usleep to imitate a cron.
Here is some information on both:
https://laravel.com/docs/5.1/scheduling
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
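A minimal sketch of the cron idea under stated assumptions: the crontab line runs the script once a minute, each run makes at most 10 requests (which stays inside both the 10-requests/10-seconds and 500-requests/10-minutes limits), and fetch_game() plus the pending-IDs file are hypothetical stand-ins for your own API call and bookkeeping:
<?php
// crontab entry:  * * * * * php /path/to/fetch_batch.php
$pendingFile = __DIR__ . '/pending_game_ids.txt'; // one game ID per line
$ids = array_filter(file($pendingFile, FILE_IGNORE_NEW_LINES));

$batch = array_splice($ids, 0, 10); // take up to 10 IDs for this run
foreach ($batch as $id) {
    $data = fetch_game($id); // hypothetical: your API call, e.g. via cURL
    // ... store $data somewhere ...
}

// Write back whatever is left for the next cron run.
file_put_contents($pendingFile, implode("\n", $ids));
?>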
This sounds like a perfect use for the set_time_limit() function. This function allows you to specify how long your script can execute, in seconds. For example, if you say set_time_limit(45); at the beginning of your script, the script can run for at most 45 seconds. One great feature of this function is that you can allow your script to execute indefinitely (no time limit) by saying set_time_limit(0);.
You may want to write your script using the following general structure:
<?php
// Ignore user aborts and allow the script
// to run forever
ignore_user_abort(true);
set_time_limit(0);

// Define a constant for how much time must pass between batches of connections:
define('TIME_LIMIT', 10); // seconds between batches of API requests

$tLast = 0;
while ( /* some condition to check if there are still API connections that need to be made */ ) {
    if (time() >= ($tLast + TIME_LIMIT)) { // check if TIME_LIMIT seconds have passed since the last connection batch
        // TIME_LIMIT seconds have passed since the last batch of connections
        /* Use cURL multi to make 10 asynchronous connections to the API */
        // Once all of those connections are made and processed, save the current time:
        $tLast = time();
    } else {
        // TIME_LIMIT seconds have not yet passed
        // Calculate the number of seconds remaining until TIME_LIMIT seconds have passed:
        $timeDifference = $tLast + TIME_LIMIT - time();
        sleep($timeDifference); // sleep for the calculated number of seconds
    }
} // END WHILE-LOOP

/* Do any additional processing, computing, and output */
?>
Note: In this code snippet, I am also using the ignore_user_abort() function. As noted in the comment on the code, this function just allows the script to ignore a user abort, so if the user closes the browser (or connection) while your script is still executing, the script will continue retrieving and processing the data from the API anyway. You may want to disable that in your implementation, but I will leave that up to you.
Obviously this code is very incomplete, but it should give you a decent understanding of how you could possibly implement a solution for this problem.
Don't slow the individual requests down.
Instead, you'd typically use something like Redis to keep track of requests per-IP or per-user. Once the limit is hit for a time period, reject (with an HTTP 429 status code, perhaps) until the count resets.
http://redis.io/commands/INCR coupled with http://redis.io/commands/expire would easily do the trick.
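A minimal sketch of that INCR + EXPIRE pattern, assuming the phpredis extension is installed; the key naming and the $userId variable are illustrative, not part of either Redis command:
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

function allow_request(Redis $redis, string $user): bool
{
    // One counter per user per 10-second window.
    $key = 'rate:' . $user . ':' . intdiv(time(), 10);

    $count = $redis->incr($key);
    if ($count === 1) {
        $redis->expire($key, 10); // let the window clean itself up
    }
    return $count <= 10; // at most 10 requests per 10 seconds
}

if (!allow_request($redis, $userId /* hypothetical user identifier */)) {
    http_response_code(429); // Too Many Requests
    exit;
}
?>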

PHP Cron job execution time limits fail

I've got a cron job scheduled to run every 7 minutes. The script loops over a number of users and sends email via SMTP or does some API calls via cURL.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux, and I was experiencing hangs in the script that were always fixed by restarting it (I'm also still looking for the cause of the hangs, but haven't been successful so far).
Because the script still sometimes ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that emails, each of which maps to one round of the loop, still go out).
The only trick I have left is pcntl_fork. I'm reluctant to employ it because of the platform dependence, and because I figured that using a loop and microtime(true) should track time spent outside the process too; I'd like to understand why this isn't the case here.
The max_execution_time setting limits script execution time. However, it does not count time spent in system calls. See the manual page.
You can also use set_time_limit() to properly set the maximum amount of time a script can run. This only changes the time limit for the script in scope. Have you tried this as well?
You can find another question that has been asked, which may help slightly, at this link.
I ended up going outside PHP to get the time. This is simple, works, and doesn't require me to learn to deal with the pcntl functions.
$time_comp = microtime(true); // save start time

// Run this function to see how much time has passed.
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
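Hypothetical usage, assuming the 6-minute budget from the question; $users, $dbh, and send_email() stand in for the existing loop and mail code:
// Break out of the mail loop once ~6 minutes (360 s) of wall-clock time have passed.
foreach ($users as $user) {
    if (check_time_against_mysql($dbh, $time_comp) > 360) {
        break; // pick up from here on the next cron run
    }
    send_email($user); // stand-in for the SMTP / cURL work
}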

PHP: using a loop to restart the time limit for script execution

I'm running a foreach loop in PHP which takes longer to execute than my maximum execution time of 30 seconds. The loop sends individual emails to users.
Instead of running cron jobs every 30 seconds and creating queues for records, is it unethical to just restart the counter in the loop using set_time_limit(30)?
$i = 0; // start count from 0
foreach ($users as $user):
    // limit emails sent (break once 100 have gone out)
    if ($i++ == 100) break; // ends execution of the loop
    set_time_limit(30);     // restart timeout counter
    send_email($user);      // send email to user
endforeach;
I'm new to this, but with the code above I think I'm giving each email 30 seconds to complete while also breaking the loop once 100 emails are sent, so the script doesn't run forever.
Update: set_time_limit(0) goes against my hosting TOS. I believed that restarting the timeout counter would restart the script, much as cron would.
Running set_time_limit() in a foreach loop brings and solves a few problems at the same time.
I see the greatest pro of this solution in making sure that no request takes more than 30 seconds (and when you have a full queue, I believe it's even desirable to cut off every script that takes that long).
The problem it brings is that not all jobs will necessarily be executed. Maybe you'll experience some problem in the middle of the job queue and it will all fail.
I would go with this:
# crontab
0,30 * * * * php /path/to/your/script.php
And would use your script.
If you need to execute jobs as fast as possible, I'd create a bash script that keeps executing the PHP script (without any timeout) until it finishes with exit(0) (all jobs executed successfully) or returns "Done!" or whatever you like.
Example bash script:
#!/bin/bash
# Note that false sets $? to 1
false
while [ $? -ne 0 ]; do
    php /path/to/your/script.php >> log.log
done
And if you need to make sure no two instances will run at the same time, google one of these (just off the top of my head); there's a sketch of the first one after this list:
.pid file
MySQL LOCK TABLE
PS: Whichever method you use, make sure your script still works if it crashes in the middle.
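A minimal sketch of the .pid-file / lock-file idea using flock(); the lock path is illustrative, and the assumption is that each run is started by cron or the bash loop above:
<?php
// Take an exclusive, non-blocking lock so two instances never overlap.
$lock = fopen('/tmp/mailer.lock', 'c'); // illustrative lock-file path
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit(1); // another instance is still running; let it finish
}

// ... process the job queue here ...

flock($lock, LOCK_UN); // also released automatically when the process exits
?>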
Just disable the time limit altogether at the start of your script:
set_time_limit(0);
If your host's TOS preclude unlimited scripts, they're almost certainly going to object to resetting the timer as well. The only real choices would be to send the emails in parallel, or move to a different host.

Delay email sending with PHP

I'm using a for loop to send emails from an array of 250.
for ($counter = 0; $counter < 250; $counter++) {
    // send email function[$counter]
}
I thought about the sleep() function, but since the server has a limited execution time, it isn't an option.
Please help me with this!
To delay sending emails in a loop, you can create your own wait() function with a delay loop inside it and call it before each iteration. If the reason you want to wait is to avoid problems with an ISP, then read this SO answer:
Sending mass email using PHP
Without some kind of scheduler you're always going to hit your execution limit.
You may want to store the emails in a database and then have a cron job send them.
Or you can up your execution time:
<?php
// replace 600 with how many seconds you need
ini_set('max_execution_time', 600);
// ... loop through emails
?>
Why do you need to delay them anyways?
Apparently (untested) the sleep() function takes control away from PHP, so the max execution time does not apply.
From: http://www.hackingwithphp.com/4/11/0/pausing-script-execution
"Note that the default maximum script execution time is 30 seconds, but you can use sleep() and usleep() to make your scripts go on for longer than that because technically PHP does not have control during the sleep operation."
Use cron - almost all hosts let you use it (except the free ones), and they should be more than happy to help you set it up if you need assistance (if they don't help you, don't give them your money).
