PHP-based web application and mass mailing - php

I have developed a web application where students across the country come and register for some academic purpose. The users are expected to number around 100k within the next year.
I need to send all of these people periodic mails. The web app is developed using CodeIgniter. The PHP script can run for 3000 seconds, but the app is still unable to send mails to more than 100 users.
The machine I run it on is in the cloud and has 256 MB of RAM. I used the free -m command to check the memory usage, but that doesn't seem to be the problem. Everything works fine for 10-20 mails.
What would be the best solution? Is there any way I can hand this job off to some other app/program/shell script?

If you cannot use an external service for your emails, I would just set up a cron job that sends a couple of emails every n seconds. It's pretty cumbersome to send a lot of emails with PHP, as you have discovered, but the cron job solution works every time as far as I know.
So you have a list of emails/addresses and a cron job that iterates over that list and sends the emails.
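A minimal sketch of that cron approach, assuming a MySQL table named subscribers with id, email, and mail_sent columns; those names, the PDO credentials, and the message are placeholders, not details from the question:

<?php
// cron_send.php - run from crontab, e.g.:  */5 * * * * php /path/to/cron_send.php
// Sends a small batch per run so each invocation finishes well within the interval.

$batchSize = 50;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Fetch the next batch of addresses that have not been mailed yet.
$rows = $pdo->query(
    "SELECT id, email FROM subscribers WHERE mail_sent = 0 LIMIT $batchSize"
)->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare("UPDATE subscribers SET mail_sent = 1 WHERE id = ?");

foreach ($rows as $row) {
    $ok = mail(
        $row['email'],
        'Periodic update',
        'Hello, this is your periodic update.',
        'From: noreply@example.com'
    );
    if ($ok) {
        $mark->execute(array($row['id'])); // mark only after mail() accepts the message
    }
}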

Sure, you can send the emails yourself from a server, but that is only half the battle.
If you are sending bulk emails, as opposed to the transactional type, it's best to use a third-party service that is already whitelisted on mail servers. The primary reason is that you might get blacklisted by the major mail providers as a spammer. If this happens, you will have to work with each of them individually to get removed from their blacklists.
Also, if you are operating in the United States, you should be familiar with CAN-SPAM: http://business.ftc.gov/documents/bus61-can-spam-act-Compliance-Guide-for-Business

MailChimp is a viable candidate for this. Sending mail is a time-consuming task, and sending to up to 100k email addresses will be an arduous job for your server.
They provide a very capable PHP API.
https://developer.mailchimp.com/

It is very appropriate to get this out of your web server threads and into something that runs standalone. Typically for stuff like this, I have tables in the DB where the appropriate information is written to from the web site, so when I am ready to e-mail, something on the backend can assemble the e-mails and send them out. If you are sending out 100,000 e-mails, you are going to want something multithreaded.
It might be good in this case to use one of the many off-the-shelf tools for this, rather than reinventing the wheel. We use an old version of Campaign Enterprise here, and I am able to throw queries at it which I can use to pull data from my web DB directly, over ODBC. That may or may not work well for you, considering you are in the cloud.
Edit: You can also write a PHP script to do this and call PHP from the shell. Perhaps you can get around your timeout limit this way? (This assumes you are referring to some service-level timeout. If you are talking about the regular PHP timeout, it can be worked around with set_time_limit().)
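To illustrate the edit, here is roughly what running the mailer from the shell could look like; the table name is made up and the loop body is only a stub:

<?php
// send_all.php - meant to be started from the shell, not through the web server:
//   nohup php send_all.php > send_all.log 2>&1 &

set_time_limit(0);   // lift the regular PHP execution time limit for this run

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

foreach ($pdo->query("SELECT email FROM mail_outbox WHERE sent_at IS NULL") as $row) {
    mail($row['email'], 'Newsletter', 'Body goes here', 'From: noreply@example.com');
    // ...mark the row as sent, log failures, throttle if needed, etc.
}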

You might be able to do this using pcntl_fork or by creating a daemon process.
Fork: By forking, you could batch the emails into groups and send them out; each batch could run in its own forked child process.
Daemon: With a daemon, you could create a batch of emails and hand it to the daemon for processing. A daemon could run multiple batches at once.
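A rough sketch of the fork variant, assuming the pcntl extension is available and the script is run from the CLI (pcntl is generally not available under web SAPIs); the recipient list is a placeholder:

<?php
// Split the recipients into batches and let each child process send one batch.

$recipients = array('a@example.com', 'b@example.com' /* ... */);
$batches    = array_chunk($recipients, 500);

foreach ($batches as $batch) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("Could not fork\n");
    }
    if ($pid === 0) {
        // Child: send this batch only, then exit so it doesn't fork further batches.
        foreach ($batch as $address) {
            mail($address, 'Subject', 'Body', 'From: noreply@example.com');
        }
        exit(0);
    }
    // Parent: continue the loop and fork the next batch.
}

// Parent: wait for all children to finish before exiting.
while (pcntl_waitpid(0, $status) !== -1) {
    // reaping children
}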

Related

SendGrid for PHP is slow. Are non-blocking requests possible?

We are currently developing a mobile app for iOS and Android. For this, we need stable web services.
Requirements: based on PHP and MySQL; must be blazing fast; must be scalable.
I've created simple custom-coded web services with multiple endpoints to allow passing data from the app to our database, and vice versa.
My Question:
Our average response time with my custom-coded solution is below 100ms (measured using New Relic) for normal requests (say, updating a DB field or performing an INSERT INTO). This is without any load, however (below 100 users daily). When we create outbound requests (specifically, sending e-mail using the SendGrid PHP framework), we see a response time of > 1000ms. It appears that the request is "waiting" for a response from SendGrid. Is it possible to tell the script not to "wait for a response"? This is not really ideal. My idea was to store all "pending" requests in a separate table and then use a cron job to run through all "pending" requests and mark them as "completed". Is this a viable solution? And will one cron run each minute be enough for processing the requests (a possible delay of 1 min for each e-mail)?
As always, any replies or suggestions are very appreciated. Thanks in advance!
To answer the first part of your question: yes, you can make asynchronous requests with PHP, and even ignore the service's response. However, as you correctly say, it's not a great solution.
Asynchronous Requests
This excellent blog post on PHP Asynchronous Requests by Segment.io comes to several conclusions:
You can open a socket and write to it, as described by this Stack Overflow Topic - However, it seems that this is actually blocking and fairly slow (300ms in their tests).
You can write to a log file and then process it in another way (essentially a queue, like you describe) - However, this requires another process to read the log and process it. Using the file system can be slow, and shared files can cause all sorts of problems.
You can fork a cURL request - However, this means you aren't waiting for a response, so if SendGrid (or some other service) responds with an error, you can't catch it and react.
Opinion Land
We're now entering semi-opinion land, but queues as you describe (such as a MySQL one with a cron job, or a text file, or something else) tend to be very scalable, as you can throw workers at the queue if you need it to process faster. These can live outside your user-facing system (and therefore not share resources).
Queues
With a queue, you'd have a separate service that would be responsible for sending an email with SendGrid (e.g.). It would pull tasks off a queue (e.g. "send an email to Nick") and then execute them.
There are several ways to implement queues that you can process.
You can write your own - As you seem to want to stay on PHP/MySQL, if you do this you'll need to take into account a bunch of queueing problems and weird edge cases. However, you'll have absolute control, and for a simple application this may well work (see the sketch after this list).
You can implement a self-hosted task queue - Celery is meant to be a distributed task queue; øMQ (ZeroMQ) and RabbitMQ can also be used as task queues. These are meant to be fast and distributed and have had a lot of thought put into them. You'd need to benchmark them in your system to see if they speed it up. It also means you have to host additional pieces yourself. This, however, is likely to be the fastest solution from a communication standpoint.
You can pass things off to a hosted task queue - IronMQ and Amazon SQS are both cool hosted solutions, which means you wouldn't need to dedicate resources to them; additionally, with IronWorker (e.g.) you could have the worker side taken care of too. However, since you're trying to optimize a request to an external service, this probably isn't the solution in this scenario.
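As a concrete (if simplified) sketch of the "write your own" option, matching the pending-requests table you describe: the web request only INSERTs a row, and a cron-run worker does the slow outbound call. The table name, columns, and the sendViaSendGrid() helper below are placeholders, not SendGrid's actual API:

<?php
// process_pending.php - run from cron (e.g. every minute).

function sendViaSendGrid($to, $subject, $body) {
    // Placeholder: call whatever SendGrid client code your app already uses
    // and return true on success, false on failure.
    return true;
}

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$jobs = $pdo->query(
    "SELECT id, recipient, subject, body FROM email_jobs
     WHERE status = 'pending' ORDER BY id LIMIT 100"
)->fetchAll(PDO::FETCH_ASSOC);

$done = $pdo->prepare("UPDATE email_jobs SET status = 'completed' WHERE id = ?");

foreach ($jobs as $job) {
    if (sendViaSendGrid($job['recipient'], $job['subject'], $job['body'])) {
        $done->execute(array($job['id']));
    }
    // On failure the row stays 'pending' and is retried on the next run.
}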
Queueing Emails
On the topic of queueing emails specifically, this is common practice for email senders. As with everything else, it means better reliability (because if a service down the line fails, you can keep the message in the queue and retry).
With email, however, there are some specific services for queueing messages: SMTP servers. Theoretically you can set up a server like sendmail and then configure SendGrid as your "smarthost" or relay, and have the server send to SendGrid. It then queues, deals with service interruptions, and sends mail with little additional code. However, SMTP servers are a pain to deal with, even if they're just forwarding messages. Additionally, SMTP is even slower than HTTP at establishing a connection, and therefore probably not what you want, but it's good to know about.
Another possible solution, if you control your own server environment, that will speed up both your email sending and your application, is to install a mail server such as Postfix locally. You then configure Postfix to use your SendGrid credentials, so any email sent will go from your server to SendGrid.
This is not a PHP solution, but it removes the need to write your own custom solution. If you set Postfix as the default mail server, you can then just use the PHP mail() function to send email.
https://sendgrid.com/docs/Integrate/Mail_Servers/postfix.html
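Assuming Postfix has already been set up as the local MTA relaying to SendGrid (per the link above), the application side can stay as plain mail(); the addresses below are placeholders:

<?php
// Postfix accepts the message immediately and handles queueing/retries itself,
// so this call returns quickly even if SendGrid is slow or briefly unreachable.

$headers  = "From: noreply@example.com\r\n";
$headers .= "Reply-To: support@example.com\r\n";

$accepted = mail('customer@example.com', 'Your receipt', 'Thanks for your order!', $headers);

if (!$accepted) {
    error_log('mail() could not hand the message off to the local MTA');
}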

Using PHP mail() vs. PEAR for many emails (but at a slow frequency)

I know that for large amounts of email it is recommended to use PEAR, but I'm wondering if it is worth digging into in my case (I installed it, but I get many errors from PEAR).
I need to send emails to my subscribers (around 20K), but my host only allows 200 emails per hour. This is OK because I don't need everyone to get the e-mail at the same time; I can send all these mails within a month, I'm not in a hurry.
In that case, would it be OK to have a really simple loop that sends one email with mail() and then sleeps for 18 seconds (to stay under 200 emails/hour)? Basically, what I'm thinking is simply to do something like this:
for ($i = 0; $i < count($recipient); $i++)
{
    mail($recipient[$i].....);
    sleep(18);
}
Is this OK vs. using PEAR (which requires much more time)?
Have you looked at using PEAR's Mail_Queue package (http://pear.php.net/package/Mail_Queue)? You can set it to send X emails in one process and then rerun the same script to send the next X.
I certainly wouldn't use the native mail() function for anything more than concise emails, perhaps for notifying you of exceptional conditions in an app.
If your server is Linux-based, you might get away with that (see this question).
This doesn't really sound like a great solution, though, especially considering that you are looping over all your recipients (20k) in a single for loop.
If you don't want to use PEAR, you might want to try setting up a cron job every hour that somehow (using a text file, or a database) remembers the last user it sent an email to, and sends the next batch of 200 (sketched below).
In that case you might want to set up the cron job every 125 minutes just to be sure you don't reach the limit. Also, using sleep(1); after each mail() will spare some CPU.
Check out this question for performance considerations.
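To make the batch-of-200 cron idea concrete, here is a sketch using a one-row bookmark table to remember the last subscriber mailed; the table and column names are made up:

<?php
// hourly_batch.php - run from cron roughly every 1-2 hours.
// Bookmark table: CREATE TABLE mail_progress (last_id INT NOT NULL);

$limit = 200;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$lastId = (int) $pdo->query("SELECT last_id FROM mail_progress")->fetchColumn();

$stmt = $pdo->prepare(
    "SELECT id, email FROM subscribers WHERE id > ? ORDER BY id LIMIT $limit"
);
$stmt->execute(array($lastId));

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    mail($row['email'], 'Newsletter', 'Body goes here', 'From: news@example.com');
    $lastId = $row['id'];
    sleep(1); // as suggested above, a short pause after each mail()
}

$pdo->prepare("UPDATE mail_progress SET last_id = ?")->execute(array($lastId));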
First, I wouldn't say that the limitations of good old mail()...
Manual encoding of almost everything
Poor error handling
No authentication (though not an issue for you)
... are related to volume.
Second, I've never used PEAR Mail, so I can't speak about its performance or overhead, but your use case stems precisely from a low-capacity e-mail server. You don't need high performance to do things slowly, do you?
So I'd dare say you're using the wrong criteria to evaluate tools.
My advice is that you leave mail() for extremely simple and unimportant tasks (and subscriber communication does not qualify as such) and use a proper third-party mail library, not necessarily PEAR's.
In particular, Swift Mailer features a Throttler plugin that's designed to do exactly what you ask for:
If your SMTP server has restrictions in place to limit the rate at which you send emails, then your code will need to be aware of this rate-limiting. The Throttler plugin makes Swift Mailer run at a rate-limited speed.
Many shared hosts don't open their SMTP servers as a free-for-all. Usually they have policies in place (probably to discourage spammers) that only allow you to send a fixed number of emails per hour/day.
The Throttler plugin supports two modes of rate-limiting, and with each you will need to do the math to figure out the values you want. The plugin can limit based on the number of emails per minute, or the number of bytes transferred per minute.
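For illustration, registering the Throttler plugin with Swift Mailer looks roughly like this (Swift Mailer 4/5-style API; the SMTP host, rate, and recipient list are placeholders you would need to adjust):

<?php
require_once 'lib/swift_required.php'; // or vendor/autoload.php with Composer

$recipients = array('a@example.com', 'b@example.com'); // ...your subscriber list

$transport = Swift_SmtpTransport::newInstance('localhost', 25);
$mailer    = Swift_Mailer::newInstance($transport);

// ~200 per hour is roughly 3 per minute; the plugin sleeps between sends as needed.
$mailer->registerPlugin(new Swift_Plugins_ThrottlerPlugin(
    3, Swift_Plugins_ThrottlerPlugin::MESSAGES_PER_MINUTE
));

foreach ($recipients as $address) {
    $message = Swift_Message::newInstance('Newsletter')
        ->setFrom(array('news@example.com' => 'My Site'))
        ->setTo(array($address))
        ->setBody('Plain-text body goes here');

    $mailer->send($message);
}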

PHP: Sending huge quantity of emails in batch

Putting aside the disdain for junk marketing, I need to send around 15,000 emails to customers. My coworker has tried to send them through a PHP mail loop, but obviously it gets stuck quickly. Is there a conventional way (i.e. through a PHP script) to accomplish this swiftly? If not, how do you suggest I do it (maybe through exec) without too much overhead?
Thanks!
I've used PEAR's Mail_Queue to queue up 200,000+ mails at a time. Populating the database is easy and quick, even with customised content, and then a fairly simple script sends around 250 at a time, if the load average isn't too high. Then it loops around and sends the next batch.
You won't send them any faster than is usually possible, but it will do it without any problems.
The tutorial gives you almost everything you need; just loop around the 'send_messages.php' script (running it from the command line is better) until the database queue is empty.
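Roughly, the two halves look like this with Mail_Queue (adapted from the PEAR tutorial; the DSN, table name, and SMTP settings are placeholders, and the exact option keys can differ between versions, so check the package docs):

<?php
require_once 'Mail/Queue.php';

$db_options = array(
    'type'       => 'mdb2',
    'dsn'        => 'mysql://user:pass@localhost/app',
    'mail_table' => 'mail_queue',
);
$mail_options = array(
    'driver' => 'smtp',
    'host'   => 'localhost',
    'port'   => 25,
    'auth'   => false,
);

$mail_queue = new Mail_Queue($db_options, $mail_options);

// Populating side: queue one message per recipient (quick, even for 200k rows).
$from = 'news@example.com';
$to   = 'customer@example.com';
$hdrs = array('From' => $from, 'To' => $to, 'Subject' => 'Newsletter');
$body = 'Body goes here';
$mail_queue->put($from, $to, $hdrs, $body);

// Sending side (the send_messages.php loop): send up to 250 per run.
$mail_queue->sendMailsInQueue(250);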
You could look at using something like Gearman to create a queue system, as recommended here. Another option would be to look at a paid service like Amazon's Simple Email Service (SES).
No matter how you implement immediate delivery, it'll be a lengthy process that's always subject to interruptions, and you can't afford to restart the delivery and send the same message twice to 5,000 customers.
I believe that a reliable system must use queues. The main script simply adds recipients to a queue, and then you have a secondary process that picks items from the queue, gets them sent, and finally marks them as sent. This secondary process can be launched manually (maybe from the command line) or via crontab.
I've never used it, but I have this in my bookmarks: http://ledscripts.com/free/php/phpledmailer
Are you running it through CGI or as a script on the command line? It's best to run it as a script on the command line.
If you say that it gets stuck, try calling set_time_limit(0); to keep PHP from quitting when execution takes too long.

How can I send regularly scheduled emails without setting up a cron job?

I'm setting up a subscription site using PHP that will allow admins to set up schedules (monthly, weekly, daily, all down to the minute) for when emails will be sent out to subscribers. Is there a plugin, extension, or even a library, that allows me to send these emails based on a predetermined schedule?
I thought of setting up a cron script to check every minute for pending emails, but with the amount of traffic I'm estimating for my site, I'm worried that it would put too much of a strain on my servers.
It would also be nice if this plugin, extension, etc, could provide analytic data in return. But that's definitely not vital.
The cron service itself takes few resources. It is the work done by the scripts it calls that has any impact on your server. This means that whatever approach you use to send emails regularly, a cron job or something else, the load the server takes from sending those mails will be the same. So if your server is slowing down too much, you probably need a dedicated server or more resources. The cron service itself doesn't slow anything down (unless you are calling it something like 10 times a second).
You could call the PHP script from outside, from somewhere you are able to set up scheduled calls (maybe your PC or another hosting service). Most shared hosts support schedules. Look in your control panel (cPanel, Plesk, ...).

Threads in PHP?

I am creating a web application using Zend. I have built an interface from which user A can send email to more than one user. It works well, but it slows down execution, so user A waits too long for the acknowledgment response (which is shown after the emails have been sent).
In Java there are "threads" with which we can perform such a task (sending emails) without slowing down the rest of the application.
Is there any technique in PHP/Zend, like threads in Java, by which we can split off tasks that could take a long time, e.g. sending emails?
EDIT (thanks #Efazati, there seems to be new development in this direction)
http://php.net/manual/en/book.pthreads.php
Caution (from here, at the bottom of the page):
pthreads was, and is, an experiment with pretty good results. Any of its limitations or features may change at any time; [...]
/EDIT
No threads in PHP!
The workaround is to store jobs in a queue (say, rows in a table with the emails) and have a cron job call your PHP script at a given interval (say, 2 minutes) to poll for jobs. When jobs are present, fetch a few (depending on your PHP install's timeout) and send the emails.
The main idea is to defer execution:
main script adds jobs in the queue
cron script sends them in tiny slices
Gotchas:
make sure you don't send an email without deleting it from the queue (the worst case would be a user receiving the same mail at 2-minute intervals ...); see the sketch after this list
make sure you don't delete a job without executing it first ...
handle bouncing emails using a scoring algorithm
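One way to cover the first two gotchas is to claim rows with a status change before sending, so an overlapping or crashed cron run can neither skip a job nor send it twice. A sketch with made-up table and column names:

<?php
// worker.php - called by cron every couple of minutes.

$workerId = uniqid('worker_', true);

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// 1. Claim a small slice of queued jobs for this run only (a single UPDATE is atomic in MySQL).
$claim = $pdo->prepare(
    "UPDATE email_queue SET status = 'sending', claimed_by = ?
     WHERE status = 'queued' ORDER BY id LIMIT 25"
);
$claim->execute(array($workerId));

// 2. Load only the jobs this run claimed.
$jobs = $pdo->prepare(
    "SELECT id, email, subject, body FROM email_queue
     WHERE status = 'sending' AND claimed_by = ?"
);
$jobs->execute(array($workerId));

$done = $pdo->prepare("UPDATE email_queue SET status = 'sent' WHERE id = ?");

foreach ($jobs->fetchAll(PDO::FETCH_ASSOC) as $job) {
    if (mail($job['email'], $job['subject'], $job['body'], 'From: noreply@example.com')) {
        // 3. Mark as sent (or delete) only after mail() accepts the message.
        $done->execute(array($job['id']));
    }
}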
You could look into using multiple processes, such as with fork. The communication between them wouldn't be as simple as with threads (but then, it won't come with all of its pitfalls either), but if you're just sending emails, it might not be necessary to communicate much, if at all.
Watch out for doing forks on an Apache process. You may get some behaviors that you are not expecting. If you are looking to do any kind of asynchronous execution it should be via some kind of queuing mechanism. Gearman is one. Zend Server Job Queue is another. I have some demo code at Do you queue? Introduction to the Zend Server Job Queue. Cron can be used, but you'll have the problem of depending on your cron scheduler to run tasks whereas asynchronous computing often needs to be run immediately. Using a queuing system allows you to do that without threading.
There is a Threading extension being developed based on PThreads that looks promising at https://github.com/krakjoe/pthreads
There is pcntl, which allows you to create sub-processes, but PHP doesn't work very well for this kind of architecture. You're probably better off creating a long-running script (a daemon) and spawning multiple copies of it.
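A bare-bones sketch of such a daemon, meant to be started from the shell and left running (php mail_daemon.php &); the queue table and its columns are illustrative, and if you spawn several copies you would also need to claim rows (e.g. with a status column, as in the earlier sketch) so two daemons don't send the same message:

<?php
// mail_daemon.php - polls the queue forever and sends one message per iteration.

set_time_limit(0);

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

while (true) {
    $job = $pdo->query(
        "SELECT id, email, subject, body FROM email_queue ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if (!$job) {
        sleep(5);   // nothing queued: idle briefly instead of busy-looping
        continue;
    }

    if (mail($job['email'], $job['subject'], $job['body'], 'From: noreply@example.com')) {
        $pdo->prepare("DELETE FROM email_queue WHERE id = ?")->execute(array($job['id']));
    } else {
        sleep(1);   // back off a little on failure so we don't hammer the MTA
    }
}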
As of now, PHP has no threads. However, you can have a look at this roundabout approach:
http://www.alternateinterior.com/2007/05/multi-threading-strategies-in-php.html
You may want to use a queue system for your email sending and send the email from another system which supports threads. PHP is just a tool, and you should use the tool that is best suited for the job.
PHP doesn't include threading as part of the language; there are some methods that can emulate it, but they aren't foolproof.
This Google search shows a few potential workarounds.
