I don't know if this is the right question.
The thing is, I have a list of users and their email addresses stored in my database.
Now I want to send an email to all of them with the help of a loop, adding a time-interval delay (maybe 5-10 seconds) between messages.
foreach($users as $user){
//code to send the email
}
I have roughly around 50 users. So will the code keep executing until the loop completes? I mean, is this the correct way to do it?
What if I have hundreds or even thousands of users in the future?
The default max execution time is 30 seconds. You can easily modify this by opening your php.ini file and changing the max_execution_time option.
If you wish to modify the max execution time within your PHP script, you can use the function set_time_limit(int $seconds); you can also revert the change later.
// Setting the time limit to 0 will make the script execute without a time limit (see comments).
set_time_limit (0);
foreach($users as $user){
//code to send the email
}
set_time_limit (30);
http://php.net/manual/en/info.configuration.php
By default the max_execution_time is 30 seconds. But I don't understand why you would want to stall a script for 5-10 seconds for every user when you're sending an e-mail. You may want to look at a cron job instead.
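A rough sketch of what the cron-driven version could look like (get_pending_users() and mark_as_sent() are hypothetical helpers you would write against your own user table):
// Run this script from cron (e.g. every minute) instead of sleeping inside a loop.
$users = get_pending_users(10); // hypothetical: fetch a small batch of users per run
foreach ($users as $user) {
    if (mail($user['email'], 'Subject', 'Message body')) {
        mark_as_sent($user['id']); // hypothetical: so the next cron run skips this user
    }
}
Each run handles only a few addresses, so no single execution comes anywhere near the 30-second limit.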
This problem comes down to the server's maximum-execution-time setting. You can read more here: How can I set the maximum execution time for a PHP script?
Related
I have a PHP script which sends a message to a number of users, with a 6-minute delay in between. So if there are 30 users, the script should run for 3 hours to finish sending the message to all 30 of them. I tried background jobs with workers in PHP, but the script stops after 30 minutes. Is there any way to increase the timeout, or is there any other way to achieve this?
set_time_limit — Limits the maximum execution time
set_time_limit(10800);
So in the end, what I did was set up a cron job that calls this script every 5 minutes. In the script, I check a file to see whether it contains a message; if it does, I send it to the users, and if not I do nothing.
So when I set up a message to be sent to users, I save the message and the list of users in separate files. After sending the message to each user, I delete that user from the list. As soon as the user list becomes empty, I delete the message from its file too. That way the script knows it has sent the message to all the users. So far it's working well. I am using cron-job.org for creating the cron job; it's free there.
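In case it helps, the per-run logic amounts to something like this (message.txt and users.txt are just placeholder file names, and this version sends to one user per cron run):
$message = @file_get_contents('message.txt');
if ($message === false || $message === '') {
    exit; // no message queued for this cron run
}

$users = file('users.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if (empty($users)) {
    unlink('message.txt'); // everyone has been sent the message
    exit;
}

$email = array_shift($users); // send to one user per run
mail($email, 'Subject', $message);
file_put_contents('users.txt', implode("\n", $users)); // drop that user from the list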
I've got a CronJob scheduled to run every 7 minutes. The script runs a loop over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside what max_execution_time tracks on Linux. I was also experiencing hangs in the script that were always fixed by restarting it (I'm still looking for the cause of the hangs, but without success so far).
Because the script sometimes still ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
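Roughly, the check amounts to something like this (send_to_user() is just a placeholder for the SMTP/curl work):
$start = microtime(true);
foreach ($users as $user) {
    if (microtime(true) - $start > 360) { // roughly 6 minutes
        break; // stop and let the next cron run pick up the rest
    }
    send_to_user($user); // placeholder for the SMTP/curl work
}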
Still, the script sometimes runs for 37 minutes (even though I can see that the emails, which map to one round of the loop, still go out).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of its platform dependence, and because I figured that a loop plus microtime(true) should also track time spent outside the process, and I'd like to understand why that isn't the case here.
The max_execution_time setting limits script execution time; however, it does not affect system calls. See the manual page.
You can also use set_time_limit() to properly set the maximum amount of time a script can run. This only changes the time_limit for the script in scope. Have you tried this as well?
There is another question that may help slightly at this link.
I ended up going outside PHP to get the time, this is simple, works and doesn't require me to learn to deal with pcntl functions.
$time_comp = microtime(true); // save the start time

// Run this function to see how much time has passed, as measured by MySQL.
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);

    return floatval($time_passed['time_in_seconds']);
}
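Inside the mail loop it can then be used like this (the 360-second budget and send_to_user() are just examples):
foreach ($users as $user) {
    if (check_time_against_mysql($dbh, $time_comp) > 360) { // example budget: 6 minutes
        break; // over budget, stop and let the next cron run continue
    }
    send_to_user($user); // placeholder for the actual sending code
}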
I have a PHP script containing a long loop that lists about 60,000 email addresses from MySQL, but after around 30,000 the script stops working and breaks, and sometimes the whole page renders as white (my page has an image background). I increased the PHP memory limit to unlimited, but it didn't help.
What's the problem?
The default execution time limit is 30 seconds, or the *max_execution_time* value from php.ini. Is your script taking longer than that? If so, you can increase the limit with set_time_limit.
The better question is, what are you doing with 60,000 email addresses that is taking so long? Chances are, there is a much better way to do whatever is taking too long.
There are two things you can try here. Firstly, set_time_limit(0). Setting it to zero means there is essentially no time limit.
Secondly, you might want to use ignore_user_abort(true), which tells PHP not to abort the script when the client disconnects.
Both will ensure that your script keeps running for as long as it needs... or until your server packs up :)
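A minimal sketch combining the two, with the loop body left as a placeholder:
set_time_limit(0);       // remove the execution time limit
ignore_user_abort(true); // keep running even if the browser disconnects
foreach ($emails as $email) { // $emails is a placeholder for your address list
    // ... build and output/send the entry for $email ...
}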
The problem is not memory, I think; it's the execution time.
set_time_limit(VERY_BIG_NUMBER);
Update:
add
ini_set("display_errors","on");
error_reporting(E_ALL);
and inspect the errors, if any.
Also check "max_execution_time".
How long does the script run before breaking? 30 seconds?
Use set_time_limit() and increase the max execution time for the script.
I'm using a for loop to send emails from an array of 250 addresses.
for ($counter = 0; $counter < 250; $counter++) {
    // send email function[$counter]
}
I thought about the sleep() function, but since the server has an execution time limit, that isn't an option.
Please help me with this!
To delay sending emails in a loop, you can create your own wait() function with a loop inside it and call it on each iteration. If the reason you want to wait is to avoid problems with an ISP, then read this SO answer:
Sending mass email using PHP
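A very crude wait() along those lines, plus the loop, might look like the sketch below. Note that it busy-waits on microtime(), which burns CPU and, unlike sleep(), does count toward max_execution_time, so treat it purely as an illustration of the idea:
// Crude wait() built from a loop, as described above; busy-waiting burns CPU.
function wait($seconds)
{
    $until = microtime(true) + $seconds;
    while (microtime(true) < $until) {
        // spin until enough time has passed
    }
}

for ($counter = 0; $counter < 250; $counter++) {
    // send email function[$counter]
    wait(5); // pause roughly 5 seconds between sends
}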
Without some kind of scheduler you're always going to hit your execution limit.
You may want to store the emails in a database and then have a cron job send them.
Or you can up your execution time:
<?php
// replace 600 with however many seconds you need
ini_set('max_execution_time', 600);
// ... loop through the emails here
?>
Why do you need to delay them anyways?
Apparently (untested) the sleep() function takes control away from PHP, so the max execution time does not apply.
From: http://www.hackingwithphp.com/4/11/0/pausing-script-execution
"Note that the default maximum script execution time is 30 seconds, but you can use sleep() and usleep() to make your scripts go on for longer than that because technically PHP does not have control during the sleep operation."
Use cron - almost all hosts let you use it (except the free-host ones) and they should be more than happy to help you set it up if you need assistance (if they don't help you, don't give them your money)
My PHP script creates thumbnails of images. Sometimes, when it handles a lot of images, the script has to run for a long time, and it ends after 60 seconds because of the time limit on my server.
Can I tell the script to time out after 59 seconds and then repeat itself?
I need some ideas.
Edit:
I don't think my web hosting allows me to change max_execution_time
I can't believe this is my answer..
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet though may be to look at set_time_limit() like others have suggested.
Important note: this loop will never end. You could, however, have a session variable, say $_SESSION['stopLoop'], to flag whenever the script successfully finishes. Before printing the meta-refresh line you can check the value of that.
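Putting that flag together with the refresh tag might look roughly like this ($allWorkDone is a placeholder for however your script detects that it has finished):
<?php
session_start();

if (empty($_SESSION['stopLoop'])) {
    // Only keep reloading while there is still work left to do.
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}

/* code that is timing out here */

if ($allWorkDone) {               // placeholder: set this however your script detects completion
    $_SESSION['stopLoop'] = true; // the next load will not print the refresh tag
}
?>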
If you don't want to modify your code, use set_time_limit(0), which removes the time limit entirely.
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend using it, though; as your system grows, this would take a very long time to process. It is better to restructure the code so that it runs only when your server has low traffic, and to control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much as you can in a time window once per night.
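A sketch of that queue idea, with hypothetical helpers fetch_next_job(), create_thumbnail() and mark_done(), could be:
// Process queued jobs for at most ~50 seconds per run (e.g. from a nightly cron).
$start = microtime(true);
while (microtime(true) - $start < 50) {
    $job = fetch_next_job(); // hypothetical: pull the next image from a queue table
    if ($job === null) {
        break; // the queue is empty
    }
    create_thumbnail($job);  // hypothetical: the actual image work
    mark_done($job);         // hypothetical: flag the job so it isn't processed again
}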
set_time_limit(0);
Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are in a loop processing multiple files, call set_time_limit each time before you process a file, setting it to the expected duration; i.e., if a file is assumed to take at most 60s to process, use set_time_limit(60) before processing it.
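In a loop that could look like this (process_file() is a placeholder; each set_time_limit() call restarts the timer for that file):
foreach ($files as $file) {
    set_time_limit(60);  // allow up to 60s for this file
    process_file($file); // placeholder for the per-file work
}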
If your server is not running PHP in safe mode, you can use set_time_limit($time), where $time is the time you need to execute the script; for example, you can use set_time_limit(240); to set the timeout to 4 minutes.
Or you can track in your script how much time has passed and, before the timeout expires, add some more seconds (set_time_limit(10);) until the script finishes (each call to set_time_limit() restarts the timer, so the allowed time of each call adds up).
That way you can calculate the elapsed time and, for each image, compare it against the server timeout.
You can also modify your script to log the images that have already been processed and, in case of a timeout, carry on with the task from the same point.
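One simple way to keep that log is a plain text file of finished images (processed.log and create_thumbnail() are just placeholders):
// Skip images that a previous (timed-out) run already finished.
$done = file_exists('processed.log')
    ? file('processed.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : array();

foreach ($images as $image) {
    if (in_array($image, $done)) {
        continue; // already handled in an earlier run
    }
    create_thumbnail($image); // placeholder for the real work
    file_put_contents('processed.log', $image . "\n", FILE_APPEND);
}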
You can also run your script from a cron job on your server (if it allows you to) to make it run periodically. Take a look at this article.