Setting an amount of time to pass between sending emails in a loop - PHP

Forgive me for the noob question, but is there a setting that enforces a certain amount of time (e.g. milliseconds or seconds) between emails sent from a script? What is that setting called, and where do I change it?
I'll give an example:
I used to have a PHP script that sends emails like so:
for ($i = 0; $i < count($emails); $i++) {
    mail($emails[$i], 'test', 'test');
}
It turned out that not all emails were sent successfully, because the script ran so fast that it didn't leave the pause between sends that the server required.
Did I make sense?

You can use one of these functions to do nothing for a while:
sleep(): delays program execution for the given number of seconds.
usleep(): delays program execution for the given number of microseconds.
Putting one of those in your loop should help.
For example:
for ($i = 0; $i < count($emails); $i++) {
    mail($emails[$i], 'test', 'test');
    usleep(100 * 1000); // 100 milliseconds
}

This script is untested, but the idea is sound: after each call to mail(), retry with a short delay until the send reports success. I've set a retry limit so the loop can't run forever; with the echo on failure, execution time shouldn't be a problem.
for ($i = 0; $i < count($emails); $i++) {
    $sent = mail($emails[$i], 'test', 'test');
    $count = 0;
    while ($sent == false) {
        usleep(500000); // half a second - tune this until the minimum is found
        $sent = mail($emails[$i], 'test', 'test'); // retry the send
        $count++;
        if ($count == 1000) {
            echo "Email to " . $emails[$i] . " failed due to timeout<br />";
            break;
        }
    }
}
does that help?

Related

PHP execution time: How to measure in a Symfony system

I am trying to send out a large number of emails using PHP in a Symfony 3.4 system. The script currently stops due to PHP's execution time limit, which of course I need to work around.
My idea is to put all emails to be sent into an email_queue table and then send them after they are saved in the queue.
So I save all these emails, display a page with a progress bar, and make an AJAX call which should send emails until just under the time limit, report back how many it sent (to update the progress bar), and, if something is left in the queue, the script is called again until everything is sent.
So far so good. Now I try to measure the execution time of the AJAX script part so I can stop sending before I reach the 30-second time limit.
For some reason the measurement is always much smaller than 30 seconds: before the measured execution time even reaches something like 2 seconds, the script stops due to reaching the time limit. Why is that?
This is how I do my measurement:
$executionTimes = [];
$executionPoints = [];
$timeStart = $_SERVER['REQUEST_TIME_FLOAT'];
foreach ($mailQueue as $mail) {
    // do some mail configuration
    // send the email
    $now = microtime(true);
    $executionTime = $now - $timeStart;
    $executionTimes[] = $executionTime;
    $executionPoints[] = $now;
    if ($executionTime >= 10) {
        break;
    }
}
// return data
// return data
Using $_SERVER['REQUEST_TIME_FLOAT'] I thought I'd get the real execution time, including all the Symfony overhead. Is that not correct?
Here is the result of one run:
{
    "timeStart": 1525702394.248,
    "executionPoints": [
        1525702394.863065,
        1525702394.866609,
        1525702394.870812,
        1525702394.874702,
        1525702394.878718,
        1525702394.882434,
        1525702394.886418,
        1525702394.890428,
        1525702394.894365,
        1525702394.899119
    ],
    "executionTimes": [
        0.6150650978088379,
        0.6186091899871826,
        0.622812032699585,
        0.626702070236206,
        0.6307179927825928,
        0.6344339847564697,
        0.6384181976318359,
        0.6424281597137451,
        0.6463651657104492,
        0.6511189937591553
    ]
}
So the last measured execution time was under 0.7 seconds, but that is where script execution stopped.
I have done a lot of research already, but I just can't figure out why my script is doing things it shouldn't. Any ideas would be highly appreciated.

In PHP, how to stop a function, wait, and recursively restart it until some condition is met?

If some condition is met, how can I make the function:
Stop executing the rest of the function
Wait X time
Restart the function
Would it be something like:
function someFunc() {
    if ($x == 0) {
        sleep(60);
        someFunc();
        return;
    }
    // ...other code, only to be run if the above is false...
}
someFunc();
// ...other code, only to be run if the above function finishes running completely...
In case it's relevant (and there is some library to handle API limits or something), I am doing this for an API connection. First I receive a webhook hit via
file_get_contents('php://input')
which contains a URL. Then I hit the URL with
file_get_contents($url)
and, after parsing $http_response_header into a $headers array, check its headers as if ($headers['api_limit'] == 0) ... (in the above example this is x). If "x" is 0, I want the function to wait a minute until the limit cycle resets, then run the second file_get_contents($url) and the parsing that follows it again.
The main reason I wanted to handle it this way is to avoid having to record anything. The webhook I receive via file_get_contents('php://input') only happens once. If the API rate limit is hit and I try to use the URL in the webhook but fail, the URL is lost. So I was hoping the function would just wait X time until the rate limit resets before trying the webhook-received URL with file_get_contents($url) again. Is this bad practice somehow?
With rate-limited resources you usually want to cache a copy of the data for blocks of X minutes so that the limit is never actually exceeded. For example, with a maximum of 10 requests per hour you would cache the response for at least 6 minutes before attempting to fetch a new one.
It is not a good idea to stall the entire PHP interpreter until the rate limit is lifted.
As for "repeat an attempt to do something until it works" in general, this is not something PHP handles well: you usually want PHP's request/response cycle to be as fast as possible so it can move on to the next request. Your PHP application should instead give an immediate yes/no response to an external utility that triggers the task at a given interval.
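A minimal sketch of the caching idea (the function name cached_fetch and the temp-file cache location are my assumptions, not from the answer; any persistent store such as APCu, Redis, or a database table would do):

```php
<?php
// Serve a cached copy of a rate-limited resource while it is younger than
// $ttl seconds, so the endpoint is hit at most once per TTL window.
function cached_fetch(string $url, int $ttl = 360): string
{
    $cacheFile = sys_get_temp_dir() . '/api_cache_' . md5($url);

    // Fresh enough: return the cached body without making a request.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }

    $body = file_get_contents($url); // one real request per TTL window
    if ($body !== false) {
        file_put_contents($cacheFile, $body);
    }
    return (string) $body;
}
```

With a 10-requests-per-hour limit, a TTL of 360 seconds guarantees at most 10 fetches per hour no matter how often the function is called.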
I solved it like this:
// This will be the testing variable; in the actual script
// we'd check the response code of file_get_contents
$count = 0;

function funcTwo( &$count ) {
    // Here I'd run file_get_contents and parse the headers
    $count++;
    echo "funcTwo() running $count... \n";
    // Here I'd test for response code 429; if true, restart
    if ($count !== 5) {
        echo "Count only = $count so we're gonna take a nap... \n";
        sleep(1);
        echo "Waking from sleep $count and rerunning myself... \n";
        funcTwo($count);
        return;
    }
    echo "Count finally = $count, exiting funcTwo... \n";
}

// This function does the main work that relies on a successful response from funcTwo()
function funcOne( $count ) {
    echo "funcOne() running! \n";
    // The function will be delayed here until a successful response is returned
    funcTwo($count);
    echo "Count finally = $count, so now we can finally do the work that \n";
    echo "depends on a successful response from funcTwo() \n";
    // Do main work
    echo "Work done, exiting funcOne and script... \n";
}

funcOne($count);

Batch processing of a long-execution-time script

I searched around but couldn't find a solution other than set_time_limit(0), which won't work on most shared hosting.
Basically I have a script that sends messages to my users' friends when they want. Some of my users have 4000+ friends and the script gets into trouble.
Currently I'm calling this script in the background with AJAX. As I don't need/want the user to wait until this finishes, I would love to have some kind of background processing.
My current code:
global $client, $emails, $subject, $message;
_info("got on_auth_success cb, jid " . $client->full_jid->to_string());
$client->set_status("available!", "dnd", 10);
set_time_limit(60 * 10);
if (count($emails) < 40) {
    foreach ($emails as $email) {
        $msg = new XMPPMsg(array('to' => '-' . $email . '@chat.facebook.com'), $message);
        $client->send($msg);
        sleep(1);
    }
} else {
    $counter = 0;
    // Let's create batches
    foreach ($emails as $email) {
        $counter++;
        $msg = new XMPPMsg(array('to' => '-' . $email . '@chat.facebook.com'), $message);
        $client->send($msg);
        sleep(1);
        if ($counter == 50) {
            sleep(10);
            $counter = 0;
        }
    }
}
$client->send_end_stream();
Would it be a good solution to use exec? For example:
exec("doTask.php $arg1 $arg2 $arg3 >/dev/null 2>&1 &");
I need a solution that works on most hosts, as this is a WordPress plugin that can be installed anywhere. Thanks!
It would be ideal to put this into some sort of cron. Build a list of emails to send, store them in a queue, and have a cron script of some sort process that queue. WordPress does have its own cron mechanism (see wp_cron), but on low-traffic sites it might be difficult to run it frequently enough to send that number of emails. Granted, you could use cron proper to make sure wp_cron runs. See How to execute a large PHP Script? as it's very related.
If you really want it to be synchronous, since you are using background AJAX you could also make a number of smaller AJAX calls. For example, your AJAX script could send 20 emails at a time and then return the number of emails remaining. The client code in the browser then knows there are some left and calls the background PHP via AJAX again, perhaps with a counter indicating the current position. [Edit: But this relies on your users being patient enough to keep their browsers open until the send completes, and they may well need a relatively stable internet connection.]
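The chunked approach can be sketched like this (the function name process_chunk and the array shapes are my assumptions, not from the answer):

```php
<?php
// Each AJAX call processes at most $chunkSize queued addresses and reports
// how many remain, so the browser-side loop knows whether to call again.
function process_chunk(array $queue, int $chunkSize, callable $send): array
{
    $batch = array_splice($queue, 0, $chunkSize); // take up to N addresses
    foreach ($batch as $address) {
        $send($address); // in the real script: mail($address, $subject, $body)
    }
    return ['sent' => count($batch), 'remaining' => count($queue)];
}

// The endpoint would emit the result as JSON for the client-side loop, e.g.:
// echo json_encode(process_chunk($queue, 20, fn ($a) => mail($a, 'subj', 'msg')));
```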

PHP - Stop and catch code which takes too long

I'd like to limit a specific section of PHP to X seconds - if it takes longer, kill the currently executing code (just the section, not the entire script) and run an alternate code.
Pseudo code example (Example use case here is an unstable API which is sometimes fast and other times its a black hole):
$completed = 1;
$seconds = 60;
while ($completed != -1 && $completed < 5) {
    limit ($seconds) {
        $api = new SomeAPI('user', 'secret', 'key');
        $data = $api->getStuff('user="bob"');
        $completed = -1;
    } catch () {
        $completed++;
        sleep(10);
    }
}
if ($completed === 5) echo "Error: API black-hole'd 5 times.\n";
else {
    // Notice: data processing is OUTSIDE of the time limit
    foreach ($data as $row) {
        echo $row['name'] . ': ' . $row['message'] . "\n";
    }
}
HOWEVER, this should work for anything. Not just API/HTTP requests. E.g. an intensive database procedure.
In case you're reading too fast: set_time_limit and max_execution_time are not the answer as they affect the time limit for the entire script rather than just a section (unless I'm wrong on how those work, of course).
For an API call, I would suggest using cURL, which lets you set a specific timeout on the request.
For generic use, you can look at forking processes, which would give you the ability to time each process and kill it if it exceeds the expected time.
Of course if the section of code might be subject to long execution times due to a highly repetitive loop structure, you can provide your own timers to break out of the loop after a specified time interval.
I might not have directly answered your question, but really the point I wanted to get to is that you might have to use a different approach depending on what the code block actually does.
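For the cURL route, here is a sketch of bounding a single call (the helper name fetch_with_timeout is mine; the cURL options are standard ones):

```php
<?php
// Bound a single HTTP/API call: CURLOPT_CONNECTTIMEOUT caps the connection
// phase, CURLOPT_TIMEOUT caps the whole transfer. On timeout curl_exec()
// returns false and curl_errno() reports CURLE_OPERATION_TIMEDOUT (28).
function fetch_with_timeout(string $url, int $seconds): ?string
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,     // return the body instead of printing it
        CURLOPT_CONNECTTIMEOUT => 5,        // give up connecting after 5 s
        CURLOPT_TIMEOUT        => $seconds, // hard cap on the entire request
    ]);
    $body  = curl_exec($ch);
    $errno = curl_errno($ch);
    curl_close($ch);

    return $errno === 0 ? (string) $body : null; // null = treat as a black hole
}
```

In the pseudocode above, a null return would play the role of the catch branch: increment the counter, sleep, and retry.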

breaking loop process into parts by delaying them, need advice

I wanted to break a process (loop) down into parts. For example, if I have 128 emails to send:
function subs_emails() {
    $subscribers = //find subscribers
    if (!empty($subscribers)) {
        foreach ($subscribers as $i => $subscriber) {
            sendEmail($subscriber->id);
            if ($i % 15 == 0) { //<-- send email per 15
                sleep(60); //to pause the process for 60 seconds
            }
        }
        return true;
    } else {
        return false;
    }
}
Will this work, or is there a better approach? Any advice would be appreciated, thanks.
The usual approach is to send only a few emails at once and mark the sent ones in the background (via a database flag such as sent=1, for example).
Then call the script every few minutes via a cronjob.
This way you don't run into problems with PHP timeouts when sending emails to a lot of subscribers.
sleep() will cause the script to stall for the given number of seconds (60 in your example). It won't really break the loop, merely delay it.
A possible solution is to make a note of which subscribers have already been sent an email. Then you can have your script execute at regular intervals via cron and only load a small batch of those who have not yet been emailed. For example:
Script executes every 10mins
Load 15 subscribers who have not been flagged as already notified
Loop through all 15 loaded subscribers and send each an email
Set flag on all 15 to say they have been sent the email
Script will then run 10mins later to process the next 15 subscribers
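The steps above can be sketched like this (the field name 'notified' and the helper name run_batch are assumptions; in a real plugin the flags would live in the database, not in a PHP array):

```php
<?php
// One cron run: pull up to $limit subscribers that have not been notified,
// send to each, and flag them so the next run picks up the remainder.
function run_batch(array &$subscribers, int $limit, callable $send): int
{
    $processed = 0;
    foreach ($subscribers as &$sub) {
        if ($processed >= $limit) {
            break; // leave the rest for the next cron run
        }
        if (!$sub['notified']) {
            $send($sub['email']); // in the real script: mail($sub['email'], ...)
            $sub['notified'] = true;
            $processed++;
        }
    }
    unset($sub); // break the reference left by foreach
    return $processed; // 0 means the queue is drained
}
```

Run every 10 minutes with a limit of 15, this works through the whole list without any single PHP request approaching the execution limit.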
