PHPMailer - Should I send emails at runtime, or via a cron job? - php

I am using the PHPMailer library to handle the sending of emails from within my application.
The problem is that when certain emails are triggered (such as when a contact form is submitted or a new user registers), it can take 1-3 seconds for the page to load while the email is sent. If there is ever a problem sending the mail, the delay can be even longer.
I was thinking of saving any emails that need to be sent into a pending_emails table in my database, then having a cron job run every minute which would send out all those emails and remove them from the table.
My question is, does this seem like a logical thing to do? Are there any potential resource concerns I should have with a cron job running every minute versus sending the email at runtime? (I need to run the cron job often, as someone may be waiting on an urgent message, for example a "reset password" email.)

You got everything right already.
Sending at runtime, just when you respond to the user's HTTP request, is the easiest thing to do. But the response is slowed down a bit by this, of course. That's not too bad in a small application, because sending email is faster than one might think. It definitely works.
Implementing a message queue is the more elegant and scalable approach, of course. But it takes a little more work. Your idea of using a pending_emails database table is totally valid. There are libraries and components for such queues, but you don't have to use them.
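To make that concrete, here is a minimal sketch of what the cron-driven sender could look like, assuming a hypothetical pending_emails table with id, to_address, subject and body columns, and PHPMailer sending over SMTP. All table, column, host and credential names below are placeholders, not a prescribed schema:

```php
<?php
// cron_send_pending.php - run from cron, e.g.:  * * * * * php /path/to/cron_send_pending.php
// Minimal sketch; table and column names (pending_emails, to_address, ...) are assumptions.

require 'vendor/autoload.php';

use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$rows = $pdo->query('SELECT id, to_address, subject, body FROM pending_emails ORDER BY id LIMIT 100')
            ->fetchAll(PDO::FETCH_ASSOC);

$delete = $pdo->prepare('DELETE FROM pending_emails WHERE id = ?');

foreach ($rows as $row) {
    $mail = new PHPMailer(true);
    try {
        $mail->isSMTP();
        $mail->Host     = 'smtp.example.com';   // placeholder SMTP settings
        $mail->SMTPAuth = true;
        $mail->Username = 'user';
        $mail->Password = 'secret';
        $mail->setFrom('noreply@example.com', 'My App');
        $mail->addAddress($row['to_address']);
        $mail->Subject = $row['subject'];
        $mail->Body    = $row['body'];
        $mail->send();

        // Only remove the row once the message has been handed off successfully.
        $delete->execute([$row['id']]);
    } catch (Exception $e) {
        // Leave the row in place so the next cron run retries it; log the failure.
        error_log('Mail to ' . $row['to_address'] . ' failed: ' . $mail->ErrorInfo);
    }
}
```

Rows are only deleted after a successful send, so a failed message is simply retried on the next run.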

This is a very opinion-based question, so you're going to get a lot of different, conflicting answers. Some will tell you it's OK to make a user wait 1-3 seconds since it's not that long, but I tend to disagree with that. What I typically do instead is use a queue.
There are ways to create a queue WITHOUT using 3rd-party software, but there are some excellent tools out there such as RabbitMQ, Iron.io or Beanstalkd which can be extremely helpful for performing tasks in the background. These services push your task into a queue, and the items in the queue are processed in the background in a timely manner, while the user gets an almost immediate response (depending on what you're doing). This is how I usually handle more resource-intensive tasks, like sending an email, in the background to avoid holding up the response to a user.
Best of luck.
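As a concrete illustration of the "push your task into a queue" step, here is a rough producer-side sketch with Beanstalkd and the pda/pheanstalk client (v3-style API assumed; the tube name and payload fields are invented for the example):

```php
<?php
// Producer side: runs inside the normal HTTP request and returns almost immediately.
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$pheanstalk = new Pheanstalk('127.0.0.1');   // v3-style constructor; v4+ uses Pheanstalk::create()

$job = [
    'type' => 'welcome_email',               // made-up payload fields
    'to'   => 'member@example.com',
    'data' => ['user_id' => 123],
];

// put($payload, $priority, $delay, $ttr) - lower priority number = more urgent
$pheanstalk->useTube('emails')->put(json_encode($job), 1024, 0, 120);
```

A separate worker process watches the same tube and does the actual sending, so the web request never waits on SMTP.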

Look into threading (PHP Threading). I would suggest you create a new thread that handles sending the email. This way, you can return a response to the user without waiting for the email to be sent, while the email sending runs in parallel in another thread.
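For reference, a bare-bones sketch of that idea with the pthreads extension is below. Note that pthreads requires a thread-safe (ZTS) PHP build and, in recent versions, only works from the CLI, so it is not available under a typical Apache/PHP-FPM setup; the mail-sending body is a placeholder.

```php
<?php
// Requires the pthreads extension on a ZTS build of PHP (CLI only in pthreads v3).

class EmailJob extends Thread
{
    private $to;
    private $subject;
    private $body;

    public function __construct(string $to, string $subject, string $body)
    {
        $this->to      = $to;
        $this->subject = $subject;
        $this->body    = $body;
    }

    public function run()
    {
        // Placeholder: build and send the message here (e.g. with PHPMailer).
        mail($this->to, $this->subject, $this->body);
    }
}

$job = new EmailJob('user@example.com', 'Welcome', 'Thanks for registering!');
$job->start();   // runs in parallel; continue responding to the user without waiting
```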

Related

Best approach for PHP mass-mailing routine?

I'm creating a site that sends out a daily news email to about 800 users, at a time they can specify. My problem is that my script takes a long time to run and times out, so I'm looking for some advice on how I could be approaching this better.
Current approach:
Users are placed in a 'mailing queue' database table with their ID, receive time, and a sent flag.
I'm then running a CRON script every minute which does the following:
Grab everything from the mailing queue with a 'receive time' less than or equal to now that hasn't been sent
Loop through the users, joining a preferences table to get their chosen categories (up to about 30 per user).
For each category, find the latest 3 articles
Prepare an HTML email with this content using PHPMailer
PHPMailer is using Mailgun SMTP to avoid overloading my SMTP server
Send mail to user, mark as sent in database
My observations so far are:
When testing the script by running in-browser, it runs incredibly slowly for a few minutes then times out (without sending any emails).
When running every minute via CRON, it sends far more than the expected number of emails (about 1400) over the course of 40 minutes, I guess because the script is overlapping itself and the sent flag is not reliably updated.
The majority of users are set to receive their email at the same time, so I'm doing 'worst case scenario' testing on this basis
Questions
Is my script far too heavy, by querying the database and generating the HTML email content for each user on the fly? I'm wondering if it would be better to generate the content ahead of time and store against the user in the mailing queue.
Would a queue manager like Beanstalkd help? I've had a look into it, but am struggling to see how to implement into my routine.
Ultimately I need the emails to be sent reliably to each user at the time they expect.
Any advice much appreciated!
You can do this in PHP, but you probably shouldn't. You're trying to build a mail server when there are much better ways of doing this, which mainly involve using a mail server.
Sending high volumes of email during page loads is not workable – it can be troublesome even for single messages, yet many still try. Approach it like this:
Store your list in a database.
When you want to send, generate a record representing each message to send (essentially a copy of the list).
Have a daemon (a long-running task) or cron job that sends the messages in chunks.
Create messages one at a time and submit them to a local mail server.
Use DKIM signatures.
As each message is sent, mark it as sent in the database, but you need to be very aware of how database transactions and locks work for this to be safe and avoid duplicates – do this right and overlapping processes work just fine (see the sketch after this list).
You can generate messages as fast as you like, and your mail server will deal with queuing, onward delivery, retries, bounces.
Use VERP addressing, and feed bounces into a bounce handler (be warned, writing these is not fun!), and have that prevent sending to bouncing addresses in future.
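On the "mark as sent / avoid duplicates" point above, one common pattern (a sketch under assumed names, not the author's exact implementation) is to have each worker atomically claim a batch of rows before touching them, so overlapping cron runs never pick up the same messages:

```php
<?php
// Sketch: atomically claim a batch of queued messages so concurrent workers don't overlap.
// Assumes a mail_queue table with (id, status, claimed_by, send_at, recipient, ...) columns.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$workerId = gethostname() . ':' . getmypid();

// Single atomic UPDATE: only rows still 'pending' can be claimed, so two workers
// running at the same time can never claim the same row.
$claim = $pdo->prepare(
    "UPDATE mail_queue
        SET status = 'claimed', claimed_by = :worker
      WHERE status = 'pending' AND send_at <= NOW()
      LIMIT 200"
);
$claim->execute(['worker' => $workerId]);

// Now fetch exactly the rows this worker claimed.
$stmt = $pdo->prepare(
    "SELECT id, recipient, subject, body FROM mail_queue
      WHERE claimed_by = ? AND status = 'claimed'"
);
$stmt->execute([$workerId]);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    // ... generate the message and submit it to the local mail server ...
    $pdo->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?")->execute([$row['id']]);
}
```

On MySQL 8+ or PostgreSQL, SELECT ... FOR UPDATE SKIP LOCKED inside a transaction is another common way to get the same claim-then-process behaviour.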
This approach works well – it is exactly how my own system ([Smartmessages.net](https://info.smartmessages.net/), which is built in PHP) works, and I can sustain over 200 messages per second using multiple message generators running in parallel (database transactions FTW!).
If you find all this a bit too much (it is very difficult), you're probably better off using a commercial sending service (like my own) or hosting a DIY solution, such as Mailcoach by my good friends at Spatie. Either of these would work well, and your list is pretty small – I'm often handling lists of over 100,000.

Mailing queue in PHP

I'm working on an application that will handle a lot of email sending, and I'm looking for a minimal email queue solution.
What the sending code will do is get the "To", "From", "Subject", "Text", and "Format" fields from the queue, generate the headers, and send the email. If the send is not successful, it can be retried. I would also like a priority system, with at least two levels of priority.
I've been thinking and the ideas I got are these:
MySQL: as everything else in the system goes through MySQL, I thought of using a MySQL table as a queue. The problem is that the sender must constantly poll the table, which causes high processor load.
Files: a queue can be done through XML files in a directory. This is bad for everything (performance, server life...)
FIFOs: I've used FIFOs in C applications, but this is probably too low-level for a high-level application, and raw data is a bit harder to process (sizes, order of parameters...).
So I'm looking for ideas on how to do this email queue in an easy way. The system is written in PHP, and I'd like the solution to be in PHP too, if there is one.
Thanks in advance.
I've developed an email queue system for PHP that does exactly what you're asking for; check it out here: https://github.com/tin-cat/emailqueue
I've done something similar to this before to send about 200,000 emails in a day. Since they weren't time critical, I generated them (with Mail_Mime) and stored them all in a database with Mail_Queue, sending them out with a shell script that kept re-running itself as long as the load average of the machine was OK.
Today, I'd do it with a Symfony-based system around Swiftmailer and White October SwiftMailer DB Bundle.
To have it avoid the database (which isn't optimum, but it does Just Work) I would use the DBBundle as a base and instead have it go through a queue system, such as Beanstalkd (it wouldn't be a big job to send it to a queue instead of a database table). The sending system can just delete the job if it decides it's 'too old'. Adding priorities to the queue job is also very easy - it's built right into Beanstalkd.
You could also elect to simply have the message in the queue be, "send user X an update email" - and the queue-runner goes to the original DB to assemble the email just before it is sent.
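To make the priority and "too old" points concrete, here is a rough worker-side sketch for Beanstalkd with the pda/pheanstalk client (v3-style API assumed; the tube name and payload fields are invented for the example):

```php
<?php
// Worker: reserve jobs from a Beanstalkd tube, drop ones that are too old, send the rest.
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$pheanstalk = new Pheanstalk('127.0.0.1');
$pheanstalk->watch('emails')->ignore('default');

while (true) {
    $job  = $pheanstalk->reserve();            // blocks until a job is available
    $data = json_decode($job->getData(), true);

    if (time() - $data['queued_at'] > 3600) {  // 'too old' - just discard it
        $pheanstalk->delete($job);
        continue;
    }

    try {
        // ... assemble the email from the original DB and send it (e.g. via Swiftmailer) ...
        $pheanstalk->delete($job);             // done - remove from the queue
    } catch (\Exception $e) {
        $pheanstalk->bury($job);               // keep it around for inspection/retry
    }
}
```

Priorities are set on the producer side: the second argument to put() is the priority, and beanstalkd reserves lower numbers first, so a password-reset job can jump ahead of bulk notifications.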

best way to send mass emails without slow page load

My script sends notification emails about a new comment. This could go to 50 members, meaning 50 emails need to be sent, which could take 20 seconds, which is way too long for the user to wait! What's the best way in PHP to do this? Is there a way to do it asynchronously?
A simple way might be to store the necessary information (email addresses, content) in a database and have a batch process run every minute or so with a cron job. The batch process can query the database for pending emails and, if any are to be sent, go through them and then delete the database entries.
Is there a way to do it asynchronously?
Yes, there is!
exec('wget -q -O /dev/null PATH_TO_YOUR_SCRIPT_THAT_SENDS_THE_NEWSLETTER > /dev/null 2>&1 &');
Note that the database alternative is a pretty good one too, but this works as well if you're on Linux (and doesn't require a database). The -q and -O /dev/null flags stop wget from saving the fetched page to disk, and redirecting the output before the trailing & is what lets exec() return immediately instead of waiting for the script to finish.
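A variation on the same fire-and-forget idea, skipping wget and HTTP entirely, is to call the PHP CLI directly and hand it an identifier rather than the data itself (the script path and argument here are placeholders):

```php
<?php
// Fire-and-forget: start a CLI script in the background and return immediately.
// Redirecting stdout/stderr and appending & is what lets exec() come back right away.
$commentId = 123; // placeholder: whatever the background script needs to look up
exec('php /path/to/send_notifications.php ' . escapeshellarg((string) $commentId)
   . ' > /dev/null 2>&1 &');
```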
I'd use something like RabbitMQ. Your website acts like a producer sending the email requests to Rabbit; then have a consumer running that processes the requests from Rabbit.
Advantages: if your consumer falls over, then when you restart it, it will pick up from where it left off (the last acknowledged request).
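A minimal sketch of the producer side with the php-amqplib client (queue name, credentials and payload are placeholders) might look like this:

```php
<?php
// Producer (inside the web request): publish the email job and return immediately.
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('email_jobs', false, true, false, false);   // durable queue

$payload = json_encode(['to' => 'member@example.com', 'comment_id' => 123]);
$channel->basic_publish(
    new AMQPMessage($payload, ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]),
    '',
    'email_jobs'
);

$channel->close();
$connection->close();
```

The consumer is a separate long-running CLI worker that calls basic_consume() on the same queue and only acknowledges each message after the send succeeds, which is exactly what gives you the "pick up from the last acknowledged request" behaviour described above.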
Indeed it can be done asynchronously.
The simplest way is to insert the email data into a database rather than actually sending the emails, and then have a cron job that periodically sends them out.
There are of course other ways too, but that will probably be the most straight forward.
You can use a cURL POST to start an asynchronous script. Set the timeout to a short period so your script can resume after the POST request has been made. You can put the email information in the POST request or store it in a database table.
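That trick can look something like the sketch below (the URL and POST fields are placeholders). Because the very short timeout deliberately abandons the request, the receiving script should call ignore_user_abort(true) at the top so it keeps running once the caller disconnects:

```php
<?php
// Fire an HTTP POST at a local "send the emails" endpoint and don't wait for it to finish.
$ch = curl_init('https://example.com/internal/send_notifications.php'); // placeholder URL
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(['comment_id' => 123]),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_NOSIGNAL       => true,   // needed for sub-second timeouts with some resolvers
    CURLOPT_TIMEOUT_MS     => 200,    // give up almost immediately; the remote script carries on
]);
curl_exec($ch);        // will "fail" with a timeout - that's expected here
curl_close($ch);
```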

I am running into a PHP background process problem

In my application, the user needs to register through a form, at which point I have to send three emails and do some other (heavy) database checks. This takes a lot of time. Is it possible to run the whole task as a background process, or is there some other alternative?
If your database activities take too long, then you need to rethink your design. However, if the delay is due to the emails, then just store the emails in the DB or in files. Create a cron job that sends out these queued emails every 5/10/15 minutes (and then deletes them).
Maybe, once a user is registered, you can flag them as pending in your database.
Then you could defer the work to a Python or PHP routine running continuously in the background, which would look for any pending request, do the checks, send the emails and finally update the database accordingly.
During this time the user would be in a registered-but-pending status, but at least from a visitor's point of view, they are not stuck waiting for everything to be processed.
You cannot turn a PHP script that has been started by the webserver into a background process.
I would check whether I can optimize the database (probably you have insufficient indexes), and if that doesn't fly, build a second process that gets started regularly (maybe once every five minutes or so) on the CLI side with a cron job, while showing the user a "Thank you for your registration" page...
As per my comment elsewhere, spawning a long-running process from PHP is a practical solution, bearing in mind a few caveats, if the performance problems are unavoidable.
However, sending 3 emails should not take an appreciable amount of time (I don't know what the database checks are). You need to spend some time looking at optimizing the existing process.
Other ways to solve the problem would be conventional batch processing, offloading the heavy lifting to a multi-process/multi-threaded daemon via a network call or asynchronous messaging system, or even a single threaded job processor using a message queue.

Implementing Email Notification

I have a web application where users can create topics and also comment on other topics (similar to what we have here on stackoverflow). I want to be able to send notifications to participating users of a discussion.
I know the easiest way to go about it is to hook the notification into the script executed when a user interacts with a discussion. As easy as that seems, I believe it's not the most appropriate way, since the user will need to wait until all the email notifications are sent (i.e. the notification script finishes execution) before getting the status of their action.
Another alternative I know of is to schedule the execution of the notification script with a cron job. For the notifications to stay relevant, the script would be scheduled to execute every 3 to 7 minutes so as to make sure users get notifications in a reasonable time.
Now my concern is, will setting a cron job to run a script every 3 minutes consume a reasonable amount of system resources, taking into consideration that my application is still running on a shared hosting platform?
Also, I am wondering whether it is possible to have a scenario whereby the comment script triggers a notification script to send notifications to the specified email addresses, while the comment script continues its execution without having to wait for the notification script to complete. If this is achievable, then I think it would be the best choice for me.
Thank you very much for your time.
Unless your notification script is enormously resource-intensive and sends dozens or hundreds of messages on each run, I would not worry about scheduling it every 3-7 minutes on a shared host. Indeed, if you scheduled it for every 3 minutes and found performance sagging on your site, you could increase the interval to 4 minutes for a 25% reduction in resource usage. It's pretty unlikely to be a problem, though.
As far as starting a background process, you can achieve that with a system call to exec(). I would direct you to this question for an excellent answer.
IMO adding a "hook" to each "discussion interaction" is by far the cleanest approach, and one trick to avoid making users wait is to send back a Content-Length header in the HTTP response. Well-behaved HTTP clients are supposed to read the specified number of octets and then close the connection, so if you send back your "status" response with the proper Content-Length HTTP header (and set ignore_user_abort), the end user won't notice that your server-side script actually continues on its merry way, generating email notifications (perhaps even for several minutes) before exiting.
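A rough sketch of that Content-Length trick (the response body and the notification loop are placeholders):

```php
<?php
// Send the user their response immediately, then keep running to send notifications.
ignore_user_abort(true);            // keep going even after the client disconnects
ob_start();

echo json_encode(['status' => 'comment saved']);   // placeholder response body

header('Content-Length: ' . ob_get_length());
header('Connection: close');
ob_end_flush();
flush();                            // well-behaved clients stop reading here

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();       // under PHP-FPM this closes the connection outright
}

// From here on, the user is no longer waiting.
set_time_limit(0);
// foreach ($participants as $email) { ... send notification ... }
```

fastcgi_finish_request() is only available under PHP-FPM; the Content-Length / Connection: close combination covers the mod_php case.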
