What is the best approach for sending the highest email rate possible with Swiftmailer?
We own an email automation tool and sometimes there are single sendouts of 40,000 emails. Our average rate with the spool:send command is ~50 emails/min. I tried duplicating the same command in the cron 5 times and it worked (i.e. it was sending ~250 emails/min), but it looks like the SMTP server got dizzy, because some contacts were receiving emails with another contact's information (any idea what could be causing that?).
So now I was thinking about setting up 5 different mailers that spool emails to different folders, and running 5 commands with a cron, one per mailer. Would that work? Any other recommended solution?
If you're sending 250 emails per minute, you need something more resilient than cron and the Swiftmailer spool. The spool is great if you're sending no more than a couple of emails a minute, but beyond that it's hard to scale, a nightmare to debug, and not very flexible.
Instead, use a job queue like PHP-Resque or RabbitMQ (both are open source). You can replicate the 'spool' by having a queue of emails that need to be sent, and you can add multiple workers and queues. You could also have a second queue that actually adds the jobs to the first queue.
The advantage is that RabbitMQ comes with a management interface, so you can see things like how many emails are being sent, how many are failing, etc. It's also easier to scale up and down by adding and removing workers when you're under heavy load, for example.
Kacper from Sensio Labs actually gave a talk on RabbitMQ with Symfony last year - http://www.slideshare.net/cakper/2014-0821-symfony-uk-meetup-scaling-symfony2-apps-with-rabbit-mq.
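For illustration, a minimal producer/worker sketch with the php-amqplib client might look like the following; the queue name, connection credentials and the sendEmail() helper are placeholders, not anything prescribed by RabbitMQ itself.

    <?php
    // producer.php - push one email job per message onto a durable 'emails' queue.
    require_once __DIR__ . '/vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel = $connection->channel();
    $channel->queue_declare('emails', false, true, false, false); // durable queue

    $job = json_encode(['to' => 'user@example.com', 'subject' => 'Hi', 'body' => 'Hello there']);
    $channel->basic_publish(new AMQPMessage($job, ['delivery_mode' => 2]), '', 'emails');

    $channel->close();
    $connection->close();

    <?php
    // worker.php - run several of these in parallel to scale up; each takes one job at a time.
    require_once __DIR__ . '/vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;

    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel = $connection->channel();
    $channel->queue_declare('emails', false, true, false, false);
    $channel->basic_qos(null, 1, null); // hand each worker one unacknowledged job at a time

    $callback = function ($msg) {
        $job = json_decode($msg->body, true);
        sendEmail($job['to'], $job['subject'], $job['body']); // placeholder for your Swift Mailer send
        $msg->delivery_info['channel']->basic_ack($msg->delivery_info['delivery_tag']);
    };

    $channel->basic_consume('emails', '', false, false, false, false, $callback);
    while (count($channel->callbacks)) {
        $channel->wait();
    }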
Related
I know that for large volumes of email it is recommended to use PEAR, but I'm wondering whether it's worth digging into in my case (I installed it, but I get many errors coming from PEAR).
I need to send emails to my subscribers (around 20K), but my host only allows 200 emails per hour. That's OK because I don't need everyone to get the email at the same time; I can send all of these within a month, I'm not in a hurry.
In that case, would it be OK to have a really simple loop that sends one email with mail() and then sleeps for 18 seconds (to stay under 200 emails/hour)? Basically, I'm thinking of simply doing something like this:
for ($i = 0; $i < count($recipient); $i++) {
    mail($recipient[$i], /* ...subject, message, headers omitted in the question... */);
    sleep(18);
}
Is this OK versus using PEAR (which would require much more time)?
Have you looked at using PEAR's Mail_Queue package (http://pear.php.net/package/Mail_Queue)? You can set it to send X emails in one process and then rerun the same script to send the next X.
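As a rough sketch of the usual Mail_Queue pattern (queue rows into a DB table, then deliver a fixed-size batch per run); the DSN, table name and SMTP settings below are placeholders you'd swap for your own:

    <?php
    require_once 'Mail/Queue.php';

    // Where the queued mails are stored, and how they are ultimately delivered.
    $db_options   = array('type' => 'db',
                          'dsn'  => 'mysql://user:pass@localhost/mydb', // placeholder DSN
                          'mail_table' => 'mail_queue');
    $mail_options = array('driver' => 'smtp', 'host' => 'localhost', 'port' => 25);

    $mail_queue = new Mail_Queue($db_options, $mail_options);

    // Queue a message (normally done in a loop over your recipients).
    $headers = array('From' => 'you@example.com', 'To' => 'user@example.com', 'Subject' => 'Newsletter');
    $mail_queue->put('you@example.com', 'user@example.com', $headers, 'Message body');

    // Deliver at most 200 queued mails on this run, then exit; cron re-runs the script later.
    $mail_queue->sendMailsInQueue(200);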
I certainly wouldn't use the native mail function for sending anything more than concise emails, perhaps for notifying you of exceptional conditions in an app.
If your server is linux based, you might get away with that (See this question).
That doesn't really sound like a great solution, though, especially since it seems you'd be looping over all 20k recipients in a single run.
If you don't want to use PEAR, you might want to set up a cron job that runs every hour and somehow (using a text file, or a database) remembers the last user it sent an email to, then sends the next batch of 200.
In that case you might want to schedule the cron job every 125 minutes just to be sure you don't reach the limit. Also, using sleep(1); after each mail() will spare the CPU.
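For example, a minimal sketch of that cron approach, assuming a plain text file to remember the offset and a recipients.txt holding one address per line (both file names are made up here):

    <?php
    // Run from cron every hour (or every 125 minutes). Sends the next 200 addresses
    // and records how far it got in a small state file.
    $batchSize  = 200;
    $stateFile  = __DIR__ . '/last_offset.txt';   // hypothetical state file
    $recipients = file(__DIR__ . '/recipients.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    $offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;
    $batch  = array_slice($recipients, $offset, $batchSize);

    foreach ($batch as $address) {
        mail($address, 'Newsletter', 'Message body here');
        sleep(1); // as suggested above, go easy on the CPU and the local MTA
    }

    file_put_contents($stateFile, $offset + count($batch));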
Check out this question for performance considerations.
First, I wouldn't say that the limitations of good old mail()...
Manual encoding of almost everything
Poor error handling
No authentication (though not an issue for you)
... are related to volume.
Second, I've never used PEAR Mail, so I can't speak to its performance or overhead, but your use case is precisely one of a low-capacity e-mail server. You don't need high performance to do things slowly, do you?
So I'd dare say you're using the wrong criteria to evaluate tools.
My advice is that you leave mail() for extremely simple and unimportant tasks (and subscriber communication does not qualify as such) and use a proper third-party mail library, not necessarily PEAR's.
In particular, Swift Mailer has a Throttler plugin that's designed to do exactly what you're asking for:
If your SMTP server has restrictions in place to limit the rate at
which you send emails, then your code will need to be aware of this
rate-limiting. The Throttler plugin makes Swift Mailer run at a
rate-limited speed.
Many shared hosts don't open their SMTP servers as a free-for-all.
Usually they have policies in place (probably to discourage spammers)
that only allow you to send a fixed number of emails per-hour/day.
The Throttler plugin supports two modes of rate-limiting and with each, you will need to do the math to figure out the values you want.
The plugin can limit based on the number of emails per minute, or the
number of bytes-transferred per-minute.
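Registering the plugin only takes a couple of lines. As a sketch against the Swift Mailer 5.x API (the SMTP host and the 3-per-minute figure below are placeholders; pick numbers that fit your host's 200/hour limit):

    <?php
    require_once 'vendor/autoload.php'; // or Swift Mailer's own autoloader

    $transport = Swift_SmtpTransport::newInstance('smtp.example.com', 25); // placeholder host
    $mailer    = Swift_Mailer::newInstance($transport);

    // Throttle to ~3 messages per minute, i.e. comfortably under 200 per hour.
    $mailer->registerPlugin(new Swift_Plugins_ThrottlerPlugin(
        3, Swift_Plugins_ThrottlerPlugin::MESSAGES_PER_MINUTE
    ));

    // Optional: for long runs, the AntiFlood plugin reconnects every 100 sends,
    // pausing 30 seconds in between.
    $mailer->registerPlugin(new Swift_Plugins_AntiFloodPlugin(100, 30));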
I am creating a system where a list of thousands of emails will be sent periodically. I know that the mail() function in PHP is quite heavy, especially if it is called many times in quick succession.
Roughly, the way my system works is that I create a queue of emails in MySQL and send them in batches of 25 using mail(), removing the 25 sent from the top of the table. I wait 2 seconds between each batch of 25.
Is this too much strain on the server, or can I push it a bit further?
Let's say 50 per second? Or is there a better way of sending many emails in less time without sacrificing server performance?
I have a dedicated server without any mail() call limit.
There are other factors to consider besides performance, but the short answer is: there are better options. Amazon SES and MailChimp are the two I know of and have heard positive feedback about.
Look at j08691's answer regarding the performance, but other issues with using mail() for this purpose include:
Scalability (eventually you will hit a wall that no single SMTP server can handle, and you're already thinking about it)
Reputation - You are much more likely to get flagged as spammy when rolling your own mass mailer, especially using mail(), since it uses the local sendmail by design.
Cost/benefit and ROI - the reliable mass mailers get it right, at a competitive rate. At some point, the hours you spend maintaining your mail server when it crashes, getting off blacklists, hand-crafting email layouts, and on general upkeep cost more than the mass-mailer service would.
Overall, the big issue is that you have to do all the work yourself, and you're likely to get flagged as spam, all for the benefit of not paying for a service that can send hundreds of emails a second versus a hundred a minute when PHP isn't busy doing everything else it handles for your web app.
Personal anecdote (not an endorsement of SES specifically, just of mass mailers): we had a client that sent 100k+ emails per campaign, with 1-3 campaigns per day minimum. They started complaining that their customers were getting emails about "daily deals" 2 days late. It wasn't because the mailer library was slow (even that app avoided plain mail()); it was that it couldn't reliably send all of the emails for every campaign before the email became irrelevant. We switched them over to SES (with some optimizing on our end, but not much), and they could clear a campaign in under an hour.
From the PHP manual:
Note:
It is worth noting that the mail() function is not suitable for larger
volumes of email in a loop. This function opens and closes an SMTP
socket for each email, which is not very efficient.
For the sending of large amounts of email, see the » PEAR::Mail, and »
PEAR::Mail_Queue packages.
Try using PHPMailer. I used it to send about 100 emails every day without any problem.
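For reference, a minimal PHPMailer loop over SMTP with a kept-alive connection might look like this; the host, credentials and the $recipients array are placeholders, not anything from the answer above.

    <?php
    use PHPMailer\PHPMailer\PHPMailer;

    require 'vendor/autoload.php';

    $mail = new PHPMailer(true);               // throw exceptions on failure
    $mail->isSMTP();
    $mail->Host          = 'smtp.example.com'; // placeholder SMTP settings
    $mail->SMTPAuth      = true;
    $mail->Username      = 'user';
    $mail->Password      = 'secret';
    $mail->Port          = 587;
    $mail->SMTPKeepAlive = true;               // reuse one SMTP connection for the whole loop
    $mail->setFrom('you@example.com', 'Your Name');
    $mail->Subject = 'Newsletter';
    $mail->Body    = 'Message body';

    foreach ($recipients as $address) {        // $recipients is assumed to be loaded elsewhere
        try {
            $mail->addAddress($address);
            $mail->send();
        } catch (Exception $e) {
            error_log("Send to $address failed: " . $mail->ErrorInfo);
        }
        $mail->clearAddresses();               // reset the To: list for the next iteration
    }
    $mail->smtpClose();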
I have a large board with 1 million+ members and I'm experiencing significant lag in sending emails to members. At the current rate it would literally take 3 months to send emails to all 1 million members.
My machine (dedicated):
dual quad xeon
32 gigs of ram
Centos 5.4
vBulletin
I've tried configuring it a number of ways and it is still slow.
DNS resolution is done locally, so I don't think that's the issue. Any suggestions?
vBulletin shows progress as it sends out the emails (500 at a time), so I know the script isn't timing out and it isn't a memory issue. Completing a page of 500 takes 10 minutes. I am using PHP's mail() function, which is the only option I have other than SMTP. On previous servers, which I had not configured myself, it had always been fast. Now, with sendmail (PHP's mail() function), it is very slow.
Check your /etc/hosts file.
If you have an entry for your external IP address that points to your local hostname for example:
75.23.123.21 my-server-hostname
Change it to:
127.0.0.1 my-server-hostname
Then try running the PHP mail() function again.
I'm going to say that if you have 1 million subscribers you need to reach, perhaps it's better not to do it yourself. Instead, why not use a service like MailChimp, whose primary focus is delivering email?
Think about the advantages:
You don't worry about bandwidth, infrastructure and maintenance.
You get comprehensive analytics on how your email campaigns are performing and on the health of your list - you say you have a million emails, but how many of them bounce? How many are opened? What is the open rate per country? How many are marked as spam, etc.?
Depending on what your business is, you can A/B test your campaigns and optimize reads/clicks/conversions.
You will obviously pay extra for this service, on top of your current hosting costs, but with MailChimp you pay for what you use. Also, if you can reach a million humans, you have probably figured out how to monetize that (if not, you really should), so using a third-party service might pay for itself.
MailChimp is one of many services out there (I mention it because I use it and am very happy with it). You might also want to check out SendGrid, Campaign Monitor and Aweber and weigh the pros and cons.
Probably not the answer you were expecting, but this is just my $0.02.
P.S: Mailchimp also gives you an API so you can seamlessly integrate your app with their services.
From the PHP Manual
It is worth noting that the mail() function is not suitable for larger volumes of email in a loop. This function opens and closes an SMTP socket for each email, which is not very efficient.
For the sending of large amounts of email, see the » PEAR::Mail, and » PEAR::Mail_Queue packages.
I'm far from an expert, but the mail() function uses a lot more CPU and memory than normal web requests, and having 1 million+ users may already be placing a significant load (CPU and I/O) on your server. That could impact the speed of sending emails, especially if you're on an older Xeon.
From what I know, dual quad-core Xeons are relatively new, and sending those emails shouldn't take anywhere near as long as it does.
From what I've read, a lower-end single-CPU dedicated server should be able to send out about 500-700 emails per minute... but that is a system dedicated only to sending email. On a mid-range server like I suspect you have, I'd expect the emails to go out in hours, not months.
It may be a configuration or a load issue which could be on many different levels.
I have developed a web application where students across the country come and register for some academic purpose. The users are expected to be around 100k within next year.
I need to send all of these people periodic mails. The web app is developed using CodeIgniter. The PHP script can run for 3000 seconds, but the app is still unable to send mails to more than 100 users.
The machine I run is in the cloud and has 256MB of RAM. I used the free -m command to check memory usage, but that doesn't seem to be the problem. Everything works fine for 10-20 mails.
What would be the best solution? Is there any way I can hand this job off to some other app/program/shell script?
If you cannot use an external service for your emails, I would just set up a cron job that sends a couple of emails every n seconds. It's pretty cumbersome to send a lot of emails with PHP, as you have discovered, but the cron job solution works every time as far as I know.
So you have a list of email addresses and a cron job that iterates over that list and sends the emails.
Sure you can send the emails yourself from a server, but that is only half the battle.
If you are sending bulk emails, as opposed to the transactional type, it's best to use a third-party service that is already whitelisted by mail servers. The primary reason is that you might get blacklisted by the major mail providers as a spammer. If this happens, you will have to work with each of them individually to get removed from their blacklists.
Also if you are operating in the United States you should be familiar with CAN-SPAM: http://business.ftc.gov/documents/bus61-can-spam-act-Compliance-Guide-for-Business
MailChimp is a viable candidate for this. Sending mail is a time-consuming task, and sending to up to 100k email addresses will be an arduous task for your server.
They provide a very capable PHP API.
https://developer.mailchimp.com/
It is very appropriate to get this out of your web server threads and into something that runs standalone. Typically for stuff like this, I have tables in the DB that the web site writes the appropriate information to, so when I am ready to e-mail, something on the backend can assemble the e-mails and send them out. If you are sending out 100,000 e-mails, you are going to want something multithreaded.
It might be good in this case to use one of the many off-the-shelf tools for this, rather than reinventing the wheel. We use an old version of Campaign Enterprise here, and I am able to throw queries at it which I can use to pull data from my web DB directly, over ODBC. That may or may not work well for you, considering you are in the cloud.
Edit: You can also write a PHP script to do this and call PHP from the shell. Perhaps you can get around your timeout limit this way? (This assumes you are referring to some service-level timeout. If you are talking about the regular PHP timeout, it can be worked around with set_time_limit().)
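A bare-bones version of that DB-table-plus-standalone-script pattern could look like the sketch below; the outgoing_mail table and its columns are invented for the example.

    <?php
    // send_pending.php - invoked from the shell or cron, not from a web request.
    set_time_limit(0); // no PHP timeout when run standalone

    // Hypothetical table: outgoing_mail(id, recipient, subject, body, sent_at NULL until sent)
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

    $rows = $pdo->query(
        "SELECT id, recipient, subject, body FROM outgoing_mail
         WHERE sent_at IS NULL ORDER BY id LIMIT 500"
    )->fetchAll(PDO::FETCH_ASSOC);

    $mark = $pdo->prepare("UPDATE outgoing_mail SET sent_at = NOW() WHERE id = ?");

    foreach ($rows as $row) {
        if (mail($row['recipient'], $row['subject'], $row['body'])) {
            $mark->execute(array($row['id']));
        }
    }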
You might be able to do this using pcntl_fork or by creating a daemon process.
Fork: using the fork approach, you could split the emails into batches and send them out, with each batch in its own forked child process (see the sketch below).
Daemon: using a daemon, you could create a batch of emails and hand it to the daemon for processing; a daemon could run multiple batches at once.
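A rough sketch of the forking idea (requires the pcntl extension and the CLI; $recipients and the batch size of 500 are placeholders):

    <?php
    // Split the recipient list into chunks and send each chunk from its own child process.
    $batches  = array_chunk($recipients, 500); // $recipients assumed to be loaded already
    $children = array();

    foreach ($batches as $batch) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        } elseif ($pid === 0) {
            // Child: send this batch, then exit so it never falls through to the next fork.
            foreach ($batch as $address) {
                mail($address, 'Newsletter', 'Message body');
            }
            exit(0);
        }
        $children[] = $pid; // parent records the child and moves on to the next batch
    }

    // Parent: wait for every child to finish before exiting.
    foreach ($children as $pid) {
        pcntl_waitpid($pid, $status);
    }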
I am new to PEAR::Mail, and I am looking for a tutorial that can teach me how to send bulk mail (10K+ emails). "Using mail() in PHP is not efficient, as it opens and closes SMTP sockets" is what I read from internet sources (can't find the link now, grrr).
So I am thinking of doing it manually using a mail library available for PHP, and I found PEAR::Mail. On the PEAR site itself there is a simple "Sending Multiple Recipients" tutorial, where all recipients are put into an array and then sent to. Is this the way to send 10k+ emails? I remember something called a "mail queue", but I really don't know how to use it with PEAR::Mail. Can anyone help me?
I don't think Facebook uses a for loop to send bulk emails (notifications), right? (Well, that's what I thought.)
There's more to bulk email than which language you implement your sender in. As far as the library suggested by rich goes, you would be looking at using an SMTP relay to queue and throttle your mail.
As I discovered when I wrote the mass mailer for my company, the major problem any bulk mailer faces is the speed at which mail can be pushed out into the ether and how it manages retries for mail that has been greylisted or whatever.
So number one you need a good solid SMTP server which can run the mailout job. You will also want some way to throttle the service and monitor it. On a standard Windows Server running IIS and connected to a reasonably large pipe we can clear 5k mails every 15 minutes. If you're looking to implement all that in 48 hours you're going to be pushed.
The fact is there are hard limits to how fast you can push data and further artificial limits imposed by ISPs and so on and so forth. This makes throttling, correct DNS records and the like absolutely vital if you don't want the job to run at snail's pace. The minimum time I could push 10k mails out the door (and the mails are about 50kb in size so that gives you a further idea on throughput) is half an hour and we've got top of the line kit and a connection into a vast distribution pipe backing us up.
In the early days of our company when they used to mail the stuff out from our local broadband it took about 12-14 hours to send 7000 mails. So you've got to understand that physical resources are really important.
Also you will inevitably end up with a minimum of about 50 mails per 10k that just won't deliver first time out. And about 10 of those are not going anywhere ever. The existence of these mails in the retry queue can have a bit of a drag effect on the delivery of further batches of mail, it's minimal but significant.
Also you can't just bang 10k mail files into any server and expect it to be entirely happy about it. We've found through experimentation that dripping 1k mails every three minutes gives us the optimal queue to send ratio. Your mileage will vary depending on your hardware.
Frankly, your choice of software library is the least of your worries at this stage.
Be really, REALLY careful with email stuff; there is a hell of a lot to think about with regard to spam and data protection. With PEAR, there seems to be little useful documentation anywhere, though this may help you:
http://www.phpmaniac.net/wiki/index.php/Pear_Mail
Though you may be better off using something like Campaign Monitor, especially if you are short on time.