I have created a Job Queue module that processes jobs and constructs "social-network" type emails.
It consists of 2 processes:
Building the custom emails (Views), e.g. "User A and User B have commented on your post" or "User B and User C also like User C's post". Each recipient gets a different email. I initially created a new Swift Mailer instance, added the message content, subject and recipient, and then added these instances to the database.
A cron job runs to fetch and send these emails at a later time.
While benchmarking, I realised it was sending out 2 emails per second on average. So I tried storing Swift_Message instances in the database instead. No luck though; it still takes just as long.
Currently, the code (roughly as sketched below):
Creates a new Swift_SmtpTransport.
Creates a new Swift_Mailer instance.
Loops through the Swift_Message messages retrieved from the DB
Sends each email.
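Roughly, that loop looks like this (a minimal sketch, assuming Swift Mailer 5's newInstance() factories; the SES SMTP endpoint, the credentials, and the serialized-message column are all placeholders):

```php
<?php
// One transport and one mailer are reused for the whole batch,
// so the connection is not re-established per message.
$transport = Swift_SmtpTransport::newInstance('email-smtp.us-east-1.amazonaws.com', 587, 'tls')
    ->setUsername('SES_SMTP_USER')
    ->setPassword('SES_SMTP_PASS');
$mailer = Swift_Mailer::newInstance($transport);

foreach ($rows as $row) {                    // $rows = queued messages fetched from the DB
    $message = unserialize($row['message']); // the stored Swift_Message instance
    $mailer->send($message);                 // each send() blocks on SES for ~0.5-1s
}
```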
But it still averages about 2 emails a second. Is there any way I can improve on the process to speed up delivery? I am using Amazon SES as my SMTP transport and I know it can at least handle 5 emails a second.
So it is probably something I am doing wrong. Any thoughts appreciated.
EDIT
Please keep in mind that the messages differ for each recipient. I could try out the Swift_Decorator plugin, but that would mean changing the way the views are generated. I am just looking for other alternatives to speed the process up.
I am using Amazon SES as my SMTP transport
In my experience using SES, the minimum API response time I've seen has been in the half-second range, while the average hovered around one second. This is not a limitation of the connection, of TCP/IP, of available bandwidth, etc., but of their processing of requests per connection. Others on the official support forums report the same. The SMTP transport isn't any faster.
The only way to send faster is to send in parallel. This is the approach they've been recommending on their official support forums.
The SES API permits multiple simultaneous connections as long as you stay under the "mails per second" quota. If you don't know your current per-second quota, check your send stats. I don't know how the SMTP transport handles going over this limit.
In order to send a higher volume of mail in parallel, we modified our queue processor to be able to run in parallel. To ensure that mails were never sent twice, we added a "PID X grabbed this" column to the table and ran a query similar to UPDATE Queue SET selected_pid = ? WHERE target_ts < NOW() AND selected_pid IS NULL LIMIT X. We'd then look for the mails we had claimed, send them all, and then try again until we ran out of mails that needed to be sent in that period.
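A minimal sketch of that claim-then-send pattern against MySQL (which allows LIMIT on UPDATE); the Queue table, selected_pid, and target_ts columns come from the query above, while the batch size, the sendMail() helper, and deleting rows after sending are assumptions:

```php
<?php
// Each parallel worker stamps a batch of rows with its own PID,
// then only ever sends the rows it stamped.
$pdo = new PDO('mysql:host=localhost;dbname=mailer', 'user', 'pass');
$pid = getmypid();
$batchSize = 10; // keep this under your SES "mails per second" quota

$claim = $pdo->prepare(
    'UPDATE Queue SET selected_pid = ?
     WHERE target_ts < NOW() AND selected_pid IS NULL
     LIMIT ' . $batchSize
);
$claim->execute([$pid]);

$rows = $pdo->prepare('SELECT * FROM Queue WHERE selected_pid = ?');
$rows->execute([$pid]);

$done = $pdo->prepare('DELETE FROM Queue WHERE id = ?');
foreach ($rows->fetchAll(PDO::FETCH_ASSOC) as $row) {
    sendMail($row);               // hypothetical: your existing Swift Mailer / SES send
    $done->execute([$row['id']]);
}
```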
We also modified the code that populates the queue to ensure that there would never be more mails in the queue than we could send. We were able to do this because we send mails in batches. When you have a constant trickle of mails, chances are that you won't need to jump through that hoop. Just make sure your sending code knows how to properly respond to the various over-quota errors.
As an alternative to a cron-based queue processor, consider using Gearman to set up a set of asynchronous background workers to send mail.
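For example, with the pecl gearman extension it could look something like this (a sketch only; the 'send_email' job name and the payload shape are made up for illustration):

```php
<?php
// worker.php - run several of these processes in parallel to send in parallel.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

$worker->addFunction('send_email', function (GearmanJob $job) {
    $msg = json_decode($job->workload(), true);
    // Replace this with your real Swift Mailer / SES sending code.
    mail($msg['to'], $msg['subject'], $msg['body']);
});

while ($worker->work());

// In the code that builds the queue, push background jobs instead of sending inline:
// $client = new GearmanClient();
// $client->addServer('127.0.0.1', 4730);
// $client->doBackground('send_email', json_encode([
//     'to'      => 'someone@example.com',
//     'subject' => 'New comments on your post',
//     'body'    => '...',
// ]));
```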
Related
I'm creating a site that sends out a daily news email to about 800 users, at a time they can specify. My problem is that my script takes a long time to run and times out, so I'm looking for some advice on how I could be approaching this better.
Current approach:
Users are placed in a 'mailing queue' database table with their ID, receive time, and a sent flag.
I'm then running a CRON script every minute which does the following:
Grab all users from the mailing queue with a 'receive time' less than or equal to now that haven't been sent
Loop through the users, joining a preferences table to get their chosen categories (up to about 30 per user).
For each category, find the latest 3 articles
Prepare an HTML email with this content using PHPMailer
PHPMailer is using Mailgun SMTP to avoid overloading my SMTP server
Send mail to user, mark as sent in database
My observations so far are:
When testing the script by running in-browser, it runs incredibly slowly for a few minutes then times out (without sending any emails).
When running every minute via CRON, it sends far more than the expected number of emails (about 1,400) over the course of 40 minutes, I guess because the script is overlapping itself and the sent flag is not reliably updated.
The majority of users are set to receive their email at the same time, so I'm doing 'worst case scenario' testing on this basis
Questions
Is my script far too heavy, by querying the database and generating the HTML email content for each user on the fly? I'm wondering if it would be better to generate the content ahead of time and store against the user in the mailing queue.
Would a queue manager like Beanstalkd help? I've had a look into it, but am struggling to see how to implement into my routine.
Ultimately I need the emails to be sent reliably to each user at the time they expect.
Any advice much appreciated!
You can do this in PHP, but you probably shouldn't. You're trying to build a mail server when there are much better ways of doing this, which mainly involve using a mail server.
Sending high volumes of email during page loads is not workable – it can be troublesome even for single messages, yet many still try. Approach it like this:
Store your list in a database.
When you want to send, generate a record representing each message to send (essentially a copy of the list).
Have a daemon (a long-running task) or cron job that sends the messages in chunks.
Create messages one at a time and submit them to a local mail server.
Use DKIM signatures.
As each message is sent, mark it as sent in the database, but be very aware of how database transactions and locks work so that this is safe and avoids duplicates – done right, overlapping processes work just fine (see the sketch after this list).
You can generate messages as fast as you like, and your mail server will deal with queuing, onward delivery, retries, bounces.
Use VERP addressing, and feed bounces into a bounce handler (be warned, writing these is not fun!), and have that prevent sending to bouncing addresses in future.
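A minimal sketch of the "mark as sent without duplicates" step, assuming MySQL 8+ or PostgreSQL (for SKIP LOCKED); the message_queue table and the submitToLocalMta() helper are hypothetical:

```php
<?php
// Claim a chunk inside a transaction; rows we lock are skipped by other
// workers, so overlapping processes never send the same message twice.
$pdo = new PDO('mysql:host=localhost;dbname=mailer', 'user', 'pass');
$pdo->beginTransaction();

$rows = $pdo->query(
    "SELECT id, recipient, subject, body
     FROM message_queue
     WHERE status = 'pending'
     ORDER BY id
     LIMIT 100
     FOR UPDATE SKIP LOCKED"
)->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare(
    "UPDATE message_queue SET status = 'sent', sent_at = NOW() WHERE id = ?"
);

foreach ($rows as $row) {
    submitToLocalMta($row); // hypothetical: hand the message to the local mail server
    $mark->execute([$row['id']]);
}

$pdo->commit();
```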
This approach works well – it is exactly how my own system ([Smartmessages.net](https://info.smartmessages.net/), which is built in PHP) works, and I can sustain over 200 messages per second using multiple message generators running in parallel (database transactions FTW!).
If you find all this a bit too much (it is very difficult), you're probably better off using a commercial sending service (like my own) or hosting a DIY solution, such as Mailcoach by my good friends at Spatie. Either of these would work well, and your list is pretty small – I'm often handling lists of over 100,000.
I have a large number of users who agreed to receive a daily newsletter. The content of the newsletter is auto-generated, so the only thing left to do is to set up a cron job that sends the e-mails.
However, if there are e.g. 10,000 users, such a cron job would kill my server. What can be done to solve this problem?
Is sleep(1) after sending 100 e-mails enough? (And, of course, setting the execution time limit to 1 day.)
Have a look at http://php.net/manual/en/function.mail.php
Note:
It is worth noting that the mail() function is not suitable for larger volumes of email in a loop. This function opens and closes an SMTP socket for each email, which is not very efficient.
For the sending of large amounts of email, see the » PEAR::Mail and » PEAR::Mail_Queue packages.
So simply use the Mail_Queue package, which queues every mail and then simply works through them.
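Basic PEAR Mail_Queue usage looks roughly like this (the DSN, table name, SMTP settings, and sample recipient below are placeholders):

```php
<?php
require_once 'Mail/Queue.php';

$db_options = [
    'type'       => 'db',
    'dsn'        => 'mysql://user:pass@localhost/newsletter',
    'mail_table' => 'mail_queue',
];
$mail_options = [
    'driver' => 'smtp',
    'host'   => 'localhost',
    'port'   => 25,
];

$queue = new Mail_Queue($db_options, $mail_options);

// When generating the newsletter: queue one mail per recipient.
$recipient = 'user@example.com';
$body      = 'Today\'s auto-generated newsletter content';
$headers   = [
    'From'    => 'news@example.com',
    'To'      => $recipient,
    'Subject' => 'Daily newsletter',
];
$queue->put('news@example.com', $recipient, $headers, $body);

// In the cron job: work through the queue (or pass a limit instead of MAILQUEUE_ALL).
$queue->sendMailsInQueue(MAILQUEUE_ALL);
```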
I made a system for sending emails for a project a few months ago, and I did the following:
In database, I have 3 tables:
users
user_emails (some users have multiple emails)
email_campaign (this is the table in which I store data temporarily while sending a campaign; when it finishes, I truncate everything in it)
When I start sending a campaign, I insert a row into the email_campaign table for every user I have finished sending email to.
This way, if an error occurs before the campaign finishes, I know where to continue, whom I have already emailed, and who still needs to be emailed.
In practice I was able to send 45,000 emails in 2 hours, without overloading the server.
I use sleep() after every 100 emails, like you wanted to do.
I also send campaigns at 2 am, when my server load is lowest.
You could also configure your email server to send a limited amount of emails per hour.
This would slow sending, but it would reduce server load.
I started managing a website with a member list of over 50,000 registered members.
I am having this problem:
Each time I try sending the emails, the page runs for several minutes (more than 20 minutes) and eventually ends up with an 'Internal Server Error'.
Now, I do not know how many mails were sent, or whether any were sent at all.
How do i send the emails:
I select all the emails from the database, run a loop, and send them one after the other. I chose this option because:
I do not want to add several emails in the 'To' field, so people's privacy will be maintained.
Adding the emails in Bcc will make the message appear as spam.
Please, how can I handle this?
Thank You
All you need to do is break the job into manageable chunks, instead of reading in all 50,000 emails and trying to send them all.
1. Read in only X emails from the database at a time.
2. Send that batch of X emails individually (if X = 10, you will send 10 separate emails).
3. Mark each email as "sent" as soon as it is sent; otherwise, if there is an error, you may want to increment "send attempts" on that particular email.
4. After the X emails have been sent (or attempted), update your statistics to indicate that X emails have been sent.
5. Go back to 1.
I suggest starting with X = 1
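A rough sketch of that loop, meant to be run from cron or the command line rather than a page load (the newsletter_queue table, the column names, and the sendOne() helper are assumptions):

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$batchSize = 10; // X

while (true) {
    // 1. Read in only X unsent emails at a time (skip ones that keep failing).
    $batch = $pdo->query(
        "SELECT id, email, subject, body FROM newsletter_queue
         WHERE sent = 0 AND send_attempts < 3
         LIMIT $batchSize"
    )->fetchAll(PDO::FETCH_ASSOC);

    if (!$batch) {
        break; // 5. ...until there is nothing left to send
    }

    foreach ($batch as $row) {
        // 2. Send each email individually.
        if (sendOne($row)) { // hypothetical wrapper around mail() or PHPMailer
            // 3. Mark as sent as soon as it goes out.
            $pdo->prepare('UPDATE newsletter_queue SET sent = 1 WHERE id = ?')
                ->execute([$row['id']]);
        } else {
            // ...or count the failed attempt so it can be retried or dropped later.
            $pdo->prepare('UPDATE newsletter_queue SET send_attempts = send_attempts + 1 WHERE id = ?')
                ->execute([$row['id']]);
        }
    }
    // 4. Update your sending statistics here, then go back to 1.
}
```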
I did this by making a script that calls itself with the id of the user it last processed.
E.g. sendmail.php will then call sendmail.php?id=1 which will then call sendmail.php?id=2 and so on.
That will fix your timeout issue.
Within the script you can echo out the result of the mail command. If it's false, you could echo out an error.
I also added a field to my user table for the mail send date and time, so I could have confirmation of where the script got to in case the browser died.
Your SQL for getting the next user to send email to would then be something along the lines of:
SELECT useremail FROM users WHERE sendmaildatetime IS NULL LIMIT 1
assuming the default for the sendmaildatetime column is null (i.e. null means not yet sent).
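Putting those pieces together, a minimal sketch (the table layout, the $body content, and the meta-refresh hand-off are assumptions):

```php
<?php
// sendmail.php - send to one user, stamp them, then hand off to the next run.
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$body = 'Newsletter content goes here';

$row = $pdo->query(
    'SELECT id, useremail FROM users WHERE sendmaildatetime IS NULL LIMIT 1'
)->fetch(PDO::FETCH_ASSOC);

if (!$row) {
    echo 'All done.';
    exit;
}

$ok = mail($row['useremail'], 'Newsletter', $body);
echo $ok ? "Sent to {$row['useremail']}<br>" : "FAILED for {$row['useremail']}<br>";

// Record when we mailed this user, so a crashed run can be resumed.
$pdo->prepare('UPDATE users SET sendmaildatetime = NOW() WHERE id = ?')
    ->execute([$row['id']]);

// Call the script again for the next user instead of looping, which avoids the timeout.
echo '<meta http-equiv="refresh" content="0;url=sendmail.php?id=' . $row['id'] . '">';
```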
Sorry if I'm not detailed enough; I'm writing this from my iPhone and it's a PITA :)
hamlin11's response is right: if you want to manage sending this much email, you MUST do it in small chunks and handle it as an asynchronous task.
To manage asynchronous tasks you have a lot of options: a subprocess that checks on each request whether any async jobs are waiting, a separate cron job, a call to a cron.php script from a crontab, or even psynnott's answer, with an external script relaunching itself at the end.
But you could also use the right tool for the right task, if you have some control over the system your website runs on: send one simple email to a mailing list manager that does the job for you. This only requires that you create the right user list in the mailing list manager. External mailing list managers include, for example, Mailman or Sympa. These tools contain robots that you can talk to in order to feed the mailing lists with recipients. psynnott's answer can be seen as an external PHP script performing very simple mailing-list-manager tasks. If you want to alter the content of the email depending on some user parameters, you will almost certainly have to write your own separate process to manage the task.
But you could also look for web services that handle this job for you. Spam management is a hard job, and running a serious large-scale mailer is a hard job as well, but such services are not free, of course.
Okay, so here's my problem:
I have a list of members on a website, and periodically one of the admins of my site (who are not very web or tech savvy) will send a newsletter to the memberlist.
My current memberlist is well over 800 individuals long.
So, I wrote an email script that sends the email to the full memberlist, with the members listed in the Bcc header.
However, I've discovered that my host server has a limit of 300 emails per hour, which I apparently exceed even though the members are listed in the Bcc field. (I wasn't previously aware that the behaviour of Bcc was to send separate emails for each name on the list...)
After some thought, I've come to the conclusion that my only solution is to have my script send the email to only the first 300 addresses, wait an hour, send it to the next three hundred, wait another hour, and so on until I've sent it to the whole memberlist.
Looking around on the internet, I've seen some other solutions people have come up with for delaying emails in PHP. Sleep() is obviously not an option, because I can't just leave the script open and running for three or four hours. I've seen some people suggest cron jobs, but I'm not sure how feasible it would be to create three new cron jobs every time I send an email, use them once, and then delete them afterward.
The final (and what I think is the smartest) solution I've seen, is to have a table in my database to temporarily store the emails to be delayed and sent later, and then create a cron job that checks this sql table every hour or so, compares the timestamp of the row to the current timestamp, and then sends the email if an hour has passed.
So I'm asking you all which method you would recommend. Is there an easier solution that I've completely looked over (aside from getting a different hosting plan. ha!), or is there a cleaner way to do it than the database / cron job approach?
tl;dr: I have >800 emails to send in an hour on a server that limits me to 300/hr. Using PHP, find a way to get around this problem such that the person sending the email only needs to click "send."
You could send this into a gearman queue and then have a gearman worker with the appropriate sleep calls. See http://gearman.org/ and http://php.net/gearman
Sounds like you need to set up a batching function that pulls from a pool of messages to be sent and processes X of them every time it's run via cron. You would then have a table that tracks which messages were sent and to whom, so you can keep track of who has received emails.
I would recommend that you create a queue and process X number of items from the queue each time you need to send email. The sender of the messages just places the email in the queue, and your processing code picks up the items, sending the maximum number allowed in that period. Occasionally you will have failures, and use of the queue allows simple recovery. You only remove items from the queue when they are processed.
You can use a simple database table to act as the queue but you may prefer to use a specialist queuing solution.
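As a sketch of that queue idea applied to the 300/hour limit (the email_queue table, the 290 safety margin, and plain mail() are assumptions):

```php
<?php
// Hourly cron: send at most 290 queued messages, and only remove an item
// from the queue once it has actually been processed.
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$hourlyLimit = 290; // stay just under the host's 300/hour cap

$rows = $pdo->query(
    "SELECT id, recipient, subject, body FROM email_queue ORDER BY id LIMIT $hourlyLimit"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    if (mail($row['recipient'], $row['subject'], $row['body'])) {
        $pdo->prepare('DELETE FROM email_queue WHERE id = ?')->execute([$row['id']]);
    }
    // Failures stay in the queue and are retried on the next hourly run.
}
```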
Another recommendation would be to look into external email services like Strongmail. These will help you send more email per hour.
What is the proper way to send a minimum of 1,000 or more emails in PHP? Is there any reliable email queuing technique that can handle that?
You could just insert your emails into a Mail Queue database table, and have a separate process check the queue and batch send a certain number at once.
There's a tested solution for that: PEAR Mail_Queue
Works fine for me.
As mercutio suggested, I would insert a new record into a mail queue table for each email waiting to be sent, and then use a separate process (like a cron) to check the table periodically for any queued items.
If any emails are queued (and the email is not customised for each recipient), I would then group the emails by domain and send blocks together to reduce the total number of emails that have to be sent; i.e. if you have 1,000 emails queued and 250 are to Gmail accounts, I would send the 250 in 25 blocks of 10 (remember to Bcc recipients to avoid them seeing each other).
To actually send the mail I would use PEAR Mail rather than PHP's mail() function.
After sending the email, either delete the record(s) from the queue or change a status flag to show it was sent, and loop. I would also add a counter to keep track of emails that failed to send and remove them after X failed attempts.
To overcome timeout issues I would (depending on the situation) either:
- set set_time_limit() to X seconds and keep track of the script execution time (killing the script after X-1 seconds),
- call the script from the command line to avoid timeouts, or
- set a limit on the number of emails the script can send in one execution.
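A rough sketch of the domain-grouping idea with PEAR Mail, which only applies when the message is identical for every recipient (the mail_queue table, the block size of 10, and the SMTP settings are assumptions):

```php
<?php
require_once 'Mail.php';

$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$mail = Mail::factory('smtp', ['host' => 'localhost', 'port' => 25]);
$body = 'The same newsletter body for everyone';

// Group queued addresses by domain so each block can go out as one message.
$byDomain = [];
foreach ($pdo->query("SELECT email FROM mail_queue WHERE status = 'queued'") as $row) {
    $domain = substr(strrchr($row['email'], '@'), 1);
    $byDomain[$domain][] = $row['email'];
}

foreach ($byDomain as $domain => $addresses) {
    foreach (array_chunk($addresses, 10) as $block) {
        // The block is passed only as envelope recipients, so the addresses
        // never appear in the headers (effectively Bcc).
        $headers = [
            'From'    => 'news@example.com',
            'To'      => 'undisclosed-recipients:;',
            'Subject' => 'Newsletter',
        ];
        $mail->send($block, $headers, $body);
        // ...then flag these addresses as sent, or bump their failure counter.
    }
}
```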
Sure, the database table might be an idea. But what about sending 1,000 e-mails with a 2 MB attachment? You'd have to take that into account as well. I had the problem myself, and I eventually resorted to writing the e-mails to the database and the files to the filesystem. The e-mail script then read the database records and tried to fetch the attachments to send.
Are you sure you need to do this mail queuing yourself?
Just deliver all mail to the local machine's mail transfer agent (sendmail...) and let that take care of the queuing and sending. After all, that's what it was designed for.
In other words: don't worry about it!
I created Emailqueue, a server that allows you to add emails to a queue so your app is relieved of the stress of the mailing. It also provides useful additional options, like the ability to schedule emails to be sent in the future, or to set per-email sending priorities. I think this might very well be what you're searching for.
Emailqueue is available here: https://github.com/tin-cat/emailqueue
And there is also a Docker version that allows you to set up a working Emailqueue server in just a few minutes, here: https://github.com/tin-cat/emailqueue-docker
I've generally relied on a hack.
I have a database list of email addresses and then use a meta-redirect to self with an increasing 'offset' parameter that specifies which row in the database I am up to. Server redirects cause problems because browsers assume that the time taken indicates an infinite loop.
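For what it's worth, that hack looks something like this (the members table, the batch size of 50, and the parameter name are placeholders):

```php
<?php
// send.php?offset=0 - send one batch, then meta-redirect to the next offset.
$pdo    = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$offset = (int) ($_GET['offset'] ?? 0);
$batch  = 50;
$body   = 'Newsletter content goes here';

$emails = $pdo->query("SELECT email FROM members ORDER BY id LIMIT $batch OFFSET $offset")
              ->fetchAll(PDO::FETCH_COLUMN);

foreach ($emails as $email) {
    mail($email, 'Newsletter', $body);
}

if (count($emails) === $batch) {
    // A meta refresh (rather than a server redirect) keeps the browser from
    // deciding the chain of requests is an infinite loop.
    echo '<meta http-equiv="refresh" content="1;url=send.php?offset=' . ($offset + $batch) . '">';
} else {
    echo 'Finished.';
}
```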