Delay email sending with PHP

I'm using a for loop to send emails from a 250-element array:
for ($counter = 0; $counter < 250; $counter++) {
    // send email function[$counter]
}
I thought about the sleep() function, but since the server limits execution time, that isn't an option.
Please help me with this!

To delay sending emails in a loop, you can create your own wait() function with a loop inside it and call it before each iteration. If the reason you want to wait is to avoid problems with an ISP, then read this SO answer:
Sending mass email using PHP

Without some kind of scheduler you're always going to hit your execution limit.
You may want to store the emails in a database then have cron execute them.
Or you can up your execution time:
<?php
// replace 600 with however many seconds you need
ini_set('max_execution_time', 600);
... loop through emails
?>
Why do you need to delay them, anyway?
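The store-in-a-database-and-let-cron-send idea above could be sketched like this. Everything here is an assumption for illustration: the email_queue table, its columns, and the pendingBatch() helper are not part of the original answer.

```php
<?php
// Pure helper: pick up to $limit rows that have not been sent yet.
function pendingBatch(array $rows, int $limit): array
{
    $pending = array_filter($rows, fn ($r) => $r['sent_at'] === null);
    return array_slice(array_values($pending), 0, $limit);
}

// In the real cron worker the rows would come from the database, e.g.:
// $rows = $pdo->query("SELECT * FROM email_queue")->fetchAll(PDO::FETCH_ASSOC);
$rows = [
    ['id' => 1, 'recipient' => 'a@example.com', 'subject' => 'Hi', 'body' => '...', 'sent_at' => null],
    ['id' => 2, 'recipient' => 'b@example.com', 'subject' => 'Hi', 'body' => '...', 'sent_at' => '2024-01-01'],
];

foreach (pendingBatch($rows, 50) as $row) {
    // mail($row['recipient'], $row['subject'], $row['body']);
    // ...then UPDATE email_queue SET sent_at = NOW() WHERE id = :id
}
```

Run by cron every few minutes (e.g. */5 * * * * php /path/to/send_queue.php), each invocation sends one small batch and exits well within any execution limit.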

Apparently (untested) the sleep() function takes control away from PHP, so the max execution time does not apply.
From: http://www.hackingwithphp.com/4/11/0/pausing-script-execution
"Note that the default maximum script execution time is 30 seconds, but you can use sleep() and usleep() to make your scripts go on for longer than that because technically PHP does not have control during the sleep operation."

Use cron - almost all hosts let you use it (except the free ones), and they should be more than happy to help you set it up if you need assistance (if they won't help you, don't give them your money).

Related

prevent php timeout in block of code

I wanted to know if there is a way to prevent a PHP timeout from occurring once a certain part of the code has started being processed.
Let me explain:
I have a script that takes way too long to execute, even when using
ini_set('max_execution_time', 0);
set_time_limit(0);
The code is built to allow it to time out and restart where it was, but I have two lines of code that need to be executed together for that to happen:
$email->save();
$this->_setDoneWebsiteId($user['id'], $websiteId);
Is there a way in PHP to tell it that it has to finish executing both of them even if the timeout is reached?
I got an idea as I was writing this: I could use a timeout of 120 seconds, start a timer, and stop if there are fewer than 20 seconds left before the timeout. I just wanted to know if I was missing something.
Thank you for your inputs.
If your code is not synchronous and some task takes more than 100 seconds, you'll not be able to check the execution time.
I see only one truly hacky option (be careful, and test it with php -f in a console so you are able to kill the process):
<?php
// Any preparations here
register_shutdown_function(function () {
    if (error_get_last()) { // there was a timeout-exceeded error
        // Call the rest of your system.
        // But note: you have no stack, no context, no valid previous frame - nothing!
    }
});
One thing you could do is use the date/time functions to monitor your average execution time (it builds up with each iteration, assuming you have a loop).
If the average iteration time is longer than however much time you have left (counting the time already used against your maximum execution time), you would trigger a restart and let it pick up from where it left off.
However, if you are experiencing timeouts, you might want to look at ways to make your code more efficient.
No, you can't abort the timeout handler, but I'd say 20 seconds is quite a lot of time if you're not parsing something huge. However, you can do the following:
Get the time at the start of execution ($start = $_SERVER['REQUEST_TIME'], or just $start = microtime(true); at the beginning of your controller).
Assert that the execution time is less than 100 seconds before running $email->save(), or halt/skip the code if necessary. This is as easy as if (microtime(true) - $start < 100) { $email->save(); ... }. You will probably want more abstraction around this, however.
(If necessary) check the execution time after both methods have run and halt execution if it has timed out.
This requires time_limit to be set to a big value, or even turned off, plus tedious work to prevent overly long execution. In most cases the timeout is your friend: it tells you that your work is taking too much time and you should rethink your architecture and inner processes. If you're running out of 120 seconds, you'll probably want to hand that work to a daemon.
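The budget check described in the steps above can be wrapped in a small helper. A minimal sketch, where the withinBudget() name and the 100-second figure are just this answer's example, not anything standard:

```php
<?php
// True while the elapsed time since $start is still under the budget.
function withinBudget(float $start, float $budgetSeconds): bool
{
    return (microtime(true) - $start) < $budgetSeconds;
}

$start = microtime(true);
// ... long-running work ...
if (withinBudget($start, 100.0)) {
    // safe to run both $email->save() and _setDoneWebsiteId() together
}
```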
Thank you for your input; as I thought, the timer solution is the best way.
What I ended up doing was the following. This is not the actual code, as it's too long to make a good answer, but it gives the general idea:
ini_set('max_execution_time', 180);
set_time_limit(180);
$startTime = date('U'); // gives me the current timestamp
while (true) {
    // gather all the data I need for the email
    // I know it's overkill to break the loop with 1 min remaining,
    // but I really don't want to take chances
    if (date('U') < ($startTime + 120)) {
        $email->save();
        $this->_setDoneWebsiteId($user['id'], $websiteId);
    } else {
        return false;
    }
}
I could not use the idea of measuring the average time of each cycle, as it varies too much.
I could have made the code more efficient, but the number of cycles is based on the number of users and websites in the framework. It should grow big enough to need multiple runs to complete anyway.
I'll have to do some research to understand register_shutdown_function, but I will look into it.
Again, thank you!

PHP Cron job execution time limits fail

I've got a CronJob scheduled to run every 7 minutes. The script runs a loop over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux, which is why I was experiencing hangs in my script that were always fixed by restarting it (I'm also looking for the cause of the hangs, but haven't been successful so far).
Because the script still sometimes ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that emails, which map to one round of the loop, still go off).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured that a loop checking microtime(true) should track time spent outside the process too; I'd like to understand why this isn't the case here.
max_execution_time is used for limiting script execution time; however, it does not count time spent in system calls. See the manpage.
You can also use set_time_limit() to properly set the maximum amount of time a script can run. This only changes the time limit for the script in scope. Have you tried this as well?
Another question that has been asked may help slightly; you can find it at this link.
I ended up going outside PHP to get the time. This is simple, works, and doesn't require me to learn to deal with the pcntl functions.
$time_comp = microtime(true); // save start time

// run this function to see how much time has passed
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}

How long can a PHP script execute?

I don't know if this is the right question.
The thing is, I have a list of users and their emails stored in my database.
Now I want to send an email to all of them with the help of a loop, setting a time-interval delay (maybe 5-10 seconds).
foreach ($users as $user) {
    // code to send the email
}
I have roughly around 50 users. So will the code execute until the loop completes? I mean, is this the correct way to do it?
What if I have hundreds or even thousands of users in the future?
The default max execution time is 30 seconds. You can easily modify this by going to your php.ini file and changing max_execution_time option.
If you wish to modify the max execution time within your PHP script, you can use the function set_time_limit(int $seconds); you can also revert the change later.
// Setting the time limit to 0 will make the script execute without a time limit (see comments).
set_time_limit(0);
foreach ($users as $user) {
    // code to send the email
}
set_time_limit(30);
http://php.net/manual/en/info.configuration.php
By default, max_execution_time is 30 seconds. But I don't understand why you would want to stall the script for 5-10 seconds per user when you're sending an e-mail. You may want to look at a cron job instead.
This problem comes down to the server's configured maximum execution time. You can read more here: How can I set the maximum execution time for a PHP script?
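For hundreds or thousands of users, one option (a sketch under the assumption that progress can be persisted between runs, not the only approach) is to process the list in fixed-size chunks so each batch stays well within the limit:

```php
<?php
// Placeholder list; in practice these addresses come from the database.
$emails = array_map(fn ($i) => "user$i@example.com", range(1, 50));

$chunks = array_chunk($emails, 25); // 25 sends per batch

foreach ($chunks as $batch) {
    foreach ($batch as $to) {
        // mail($to, $subject, $body); // the real send goes here
    }
    // persist progress here, or hand the remainder to the next cron run
}
```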

Throttling PHPmailer for use in Elgg

I'll be using the social networking software, Elgg, for an organization that needs to send mass emails to specific groups when they need to. The number of emails can range from 10-1000 depending on the group. Web host only allows 500 emails per hour, so I need to throttle the script to send one email every 8 seconds.
I'm using PHPmailer with Elgg. PHPmailer says that I should use these two scripts (code below) in conjunction with each other in order to throttle the mailing. I know how I'm going to use the code in the mailing script, I'm just unsure about a couple things.
1) I don't really understand the purpose of the safe-mode check.
2) After looking up set_time_limit, it looks like I should set this to an amount of time that allows all potential emails to be sent, whether there are 10 or 1000? Or is it a maximum of 30 seconds per loop iteration in case it needs to time out?
3) How should I set this to get what I need?
Links to PHPmailer describing code:
http://phpmailer.worxware.com/index.php?pg=tip_ext
http://phpmailer.worxware.com/index.php?pg=tip_pause
<?php
/* The following code snippet will set the maximum execution time
 * of your script to 300 seconds (5 minutes)
 * Note: set_time_limit() does not work with safe_mode enabled
 */
$safeMode = ( @ini_get("safe_mode") == 'On' || @ini_get("safe_mode") === 1 ) ? TRUE : FALSE;
if ($safeMode === FALSE) {
    set_time_limit(300); // sets maximum execution time to 5 minutes (300 seconds)
    // ini_set("max_execution_time", "300"); // this does the same as set_time_limit(300)
}
echo "max_execution_time " . ini_get('max_execution_time') . "<br>";
/* if you are using a loop to execute your mailing list (for example, from a database),
 * put the command in the loop
 */
while (1 == 1) {
    set_time_limit(30); // sets (or resets) the maximum execution time to 30 seconds
    // .... put code to process in here
    if (1 != 1) {
        break;
    }
}
?>
and
<?php
/* Note: set_time_limit() does not work with safe_mode enabled */
while (1 == 1) {
    set_time_limit(30); // sets (or resets) the maximum execution time to 30 seconds
    // .... put code to process in here
    usleep(1000000); // sleep for 1 million microseconds - will not work on Windows servers / PHP 4
    // sleep(1); // sleep for 1 second (use on Windows servers / PHP 4)
    if (1 != 1) {
        break;
    }
}
?>
Safe mode is deprecated as of PHP 5.3 and removed in PHP 5.4, so if your install is relatively recent, it's a moot point: http://php.net/manual/en/ini.sect.safe-mode.php#ini.safe-mode
Doing a set_time_limit() will reset the counter, so as long as your code reaches the set_time_limit() call in less time than the limit was set previously (e.g. gets there in 29 seconds, leaving 1 second on clock), the code will reset the timer and get another 30 seconds. However, since you don't want your code to be racy, you should simply disable the time limit entirely.
Personally, I wouldn't dump out one email every 8 seconds. I'd blast out the 500 we're allowed, then have a scheduled job to fire up the script once an hour and resume from where the blast left off. This will make things be a bit bursty for the mail server, but potentially more efficient in the long run, as it could batch together emails for the same recipient domains. e.g. all #aol.com mails in the group of 500 can go together, rather than forcing the server to connect to aol multiple times to deliver individual mails.
As well, if you're batching like this, a server failure will only be 'bad' during the few seconds when the script's actually running and building emails. The rest of the time the PHP script won't even be running, and it'll be up to the smtp server to do its thing.
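The resume-where-the-blast-left-off strategy could be sketched like this; the offset file and the nextBatch() helper are illustrative assumptions, not PHPMailer features:

```php
<?php
// The slice of recipients this hourly run is allowed to send.
function nextBatch(array $recipients, int $offset, int $perHour = 500): array
{
    return array_slice($recipients, $offset, $perHour);
}

$offsetFile = '/tmp/mail_offset';
$offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$recipients = []; // e.g. the group's addresses, loaded from the database
foreach (nextBatch($recipients, $offset) as $to) {
    // $mail->addAddress($to); $mail->send(); // PHPMailer calls go here
    $offset++;
}
file_put_contents($offsetFile, (string) $offset); // next hourly cron run resumes here
```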
I might not be of quick and particular help, but I would consider an asynchronous approach.
This involves storing the task to send an email in a queue and having workers which process those tasks.
The simplest way is to just store the emails in the database and have a cron job running on the server which sends them in batches.
The better (but more complex) solution would be to use some sort of message queue system, like ZeroMQ or the heavier-weight RabbitMQ.
The last and maybe most comfortable option off the top of my head is to use a web service like MailChimp or Postmark.

Avoid PHP execution time limit

I need to create a script in PHP which performs permutations of numbers. But PHP has an execution time limit set to 60 seconds. How can I run the script so that if it needs to run for more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the execution time of the script in advance.
A friend suggested that I sign in and log out frequently from the server, but I have no idea how to do this.
Any advice is welcome. Example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script is launched and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file and create another file based on the lines of the opened one, and so on. Because this takes too long, I need to avoid the script being interrupted by the server.
@emanuel:
When your friend suggested "sign in and log out frequently from the server", he/she must have meant "split your script's computation into x pieces of work and run them separately".
For example, with this script you can execute the page 150 times to compute 150! (factorial) and show the result:
// script name: calc.php
<?php
session_start();
if (!isset($_SESSION['times'])) {
    $_SESSION['times'] = 1;
    $_SESSION['result'] = 1; // must start at 1, not 0, or every product stays 0
    header('Location: calc.php');
} elseif ($_SESSION['times'] < 150) {
    $_SESSION['times']++;
    $_SESSION['result'] = $_SESSION['result'] * $_SESSION['times'];
    header('Location: calc.php');
} elseif ($_SESSION['times'] == 150) {
    echo "The result is: " . $_SESSION['result'];
    die();
}
?>
BTW (@Davmuz), you can only use the set_time_limit() function on Apache servers; it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try to put the calls you want to make in a queue, which you serialize to a file (or a memory cache?) when an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it keeps doing the work and finishes the task.
The drawbacks of this approach are problems with adding to the queue, file locking and such, and if you need the results immediately this can prove troublesome. If you are adding the results to a DB, it might work out. Also, it is not very efficient.
Use set_time_limit(0), but you have to disable safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed time (set_time_limit(300)), because if there is a problem in the script (endless loops or memory leaks) it cannot become a source of problems.
The web server, such as Apache, also has a maximum time limit (often 300 seconds), so you may have to change that too. If you want to build a Comet application, it may be better to choose a web server other than Apache that can handle long request times.
If you need a long execution time for a heavy algorithm, you can also implement parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute it with another external script, via cron or whatever else.
