I am building a website where users will be able to send newsletters. They will need to track the sending progress and also to stop it if required. Since they may have a lot of emails, the actual sending will be delegated to a script which will run in the background and will be started by the user. This script will handle all users' newsletters; only the arguments passed to it will differ.
For the user to be able to see how many mails have been sent and to stop the sending process, I was thinking of implementing some sort of inter-process communication between the website and the scripts running in the background, but I'm not sure how to do it.
Any help will be much appreciated.
The process:
When sending the newsletter, queue all target addresses in a table.
Process a fixed amount of the queue in a cron job that runs every few minutes.
Then, when displaying the sent amount, just count the number of entries in your queue that have been processed.
Working with a queue will also fix any performance issues your script encounters when everyone clicks 'send' simultaneously.
Stop mailing
If the customer would like to stop mailing, you only have to remove all queued emails belonging to that mailing. You could also flag them as 'cancelled' if you would like to keep the data (you will then have to make sure the processor only picks up 'pending' mails from the queue).
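A minimal sketch of what that could look like in PHP, assuming a hypothetical newsletter_queue table with a status column holding 'pending', 'sent' or 'cancelled':

<?php
// Sketch only -- table and column names (newsletter_queue, status, ...) are
// illustrative, and $recipients / $newsletterId are assumed to come from the
// newsletter form.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Queue one row per recipient when the user clicks "send".
$insert = $pdo->prepare(
    "INSERT INTO newsletter_queue (newsletter_id, email, status)
     VALUES (?, ?, 'pending')"
);
foreach ($recipients as $email) {
    $insert->execute([$newsletterId, $email]);
}

// Progress display: count how many rows have already been processed.
$sent = $pdo->prepare(
    "SELECT COUNT(*) FROM newsletter_queue
     WHERE newsletter_id = ? AND status = 'sent'"
);
$sent->execute([$newsletterId]);
echo $sent->fetchColumn() . " mails sent\n";

// Stop mailing: flag the remaining pending rows as cancelled.
$cancel = $pdo->prepare(
    "UPDATE newsletter_queue SET status = 'cancelled'
     WHERE newsletter_id = ? AND status = 'pending'"
);
$cancel->execute([$newsletterId]);

The cron job that drains the queue then only ever selects rows with status = 'pending', so cancellation takes effect on its next run.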
Related
I am developing a mass-mailing system. We send 2-4K emails at a time, and the email contacts are imported in similar quantities using the PHPExcel library.
Last night, while sending 2k emails, we got a "500 Internal Server Error".
I think I should develop a new process for the email handling and the contact importing, am I right? If so, how should I do it? Is there any other way to overcome such 500 errors?
The PHP script is called from the web browser; the browser keeps loading for 5-10 minutes and then the 500 error occurs. I am using the PHPMailer library for sending the mail.
Calling a long-running PHP script from a web browser is not really the same as running PHP in the background. That will lock up an Apache thread, and is likely to be subject to whatever timeouts PHP has configured. My guess is the timeout is being hit before the send has completed.
It would be better to do this on a cron. Here are some general pointers:
Every ten minutes, select the next unsent set of email addresses from your data store, maybe 100 of them.
Send an email to each one, logging to a database what you have done
Pause for a few seconds. This is helpful as it makes it less likely your mail will be directed to spam bins
If your script has been running for more than five minutes, exit (it will do the next set of email addresses on the next cron call)
Otherwise, loop back to the start
That will be much more reliable. For bonus points, write a web page to show you what emails are sent and which are still waiting. Also, you may wish to use a third-party mailing system like MailChimp, to increase your delivery reliability. Make sure all of your recipients really have opted into receiving email from you.
I've suggested the script should batch in groups of 100, run for five minutes, be called every ten minutes, and pause for a few seconds after each send -- but these are just examples. If you don't mind sending more slowly (e.g. overnight) then you can change these figures to suit. Sending more slowly is generally more reliable, so do this if you can.
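Putting those pointers together, a rough sketch of such a cron script (the mail_queue table, the plain mail() call and the exact figures are all illustrative; swap in PHPMailer and your own schema as needed):

<?php
// Cron worker sketch: processes pending addresses in batches of 100, pauses
// between sends, and exits after roughly five minutes so the next cron call
// can take over. Table and column names are illustrative.
$pdo     = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$started = time();

while (time() - $started < 5 * 60) {
    $rows = $pdo->query(
        "SELECT id, email FROM mail_queue WHERE status = 'pending' LIMIT 100"
    )->fetchAll(PDO::FETCH_ASSOC);

    if (!$rows) {
        break; // nothing left to send until more mail is queued
    }

    $mark = $pdo->prepare(
        "UPDATE mail_queue SET status = ?, sent_at = NOW() WHERE id = ?"
    );

    foreach ($rows as $row) {
        $ok = mail($row['email'], 'Newsletter subject', 'Newsletter body');
        $mark->execute([$ok ? 'sent' : 'failed', $row['id']]);
        sleep(2); // short pause between sends helps avoid spam filtering

        if (time() - $started >= 5 * 60) {
            exit; // the next cron run picks up where we left off
        }
    }
}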
Here is the 'use case':
Admin goes to newsletter.php, fills in the email form, i.e. the subject and the group of users to whom the email is sent, writes the message, and clicks the "Send" button.
Problem:
The number of emails sent per hour should be limited to, let's say, 400. That is, one email should be sent approx. every 10 seconds. Besides, sent and not sent emails should be tracked.
Question:
Will cron job do the trick?
The code is written in the Yii framework. Can a cron job be activated when the user clicks the "Send" button, or only from the command line?
If a cron job can do the things above, can it be activated only in a specific action of a specific controller, or does it affect the whole script?
Thank you
1- Yes, it is a cron job.
2- You can do it in Yii; refer to ConsoleCommand. I don't know how much you know about cron, but make a script using a console command which runs every 10 seconds, gets a single email address from the queue, sends an email to that address, and removes it from the queue (see the sketch after this list).
3- Yes, in an action just build a queue (hint: a MySQL table) with the email addresses you need.
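A sketch of what such a console command could look like in Yii 1.x, using an illustrative email_queue table. Note that standard cron runs at one-minute granularity at best, so a 10-second cadence would need e.g. a sleep() loop inside the command or several staggered crontab entries:

<?php
// protected/commands/SendmailCommand.php -- sketch only; the email_queue
// table and the plain mail() call are illustrative assumptions.
class SendmailCommand extends CConsoleCommand
{
    public function run($args)
    {
        // Take one pending address from the queue.
        $row = Yii::app()->db
            ->createCommand("SELECT id, email FROM email_queue WHERE status = 'pending' LIMIT 1")
            ->queryRow();

        if ($row === false) {
            return 0; // queue is empty, nothing to do this run
        }

        mail($row['email'], 'Newsletter subject', 'Newsletter body');

        // Remove the processed entry (or flag it as sent if you want history).
        Yii::app()->db
            ->createCommand("DELETE FROM email_queue WHERE id = :id")
            ->execute(array(':id' => $row['id']));

        return 0;
    }
}

It would then be invoked from cron with something like ./protected/yiic sendmail.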
I think you have the concept of a cron job wrong.
Think of the process this way:
Admin presses the send button.
...That script creates a queue, i.e. adds the email addresses etc. to a database table, let's call it Queue, and finishes.
You setup a cron job to run every 5 minutes for example.
...The cron starts a PHP script which processes X entries from the Queue table, sends those emails, removes the entries from the queue table, and stops.
...The cron job starts again automatically after 5 minutes and repeats the process...
All you have to do is work out how many emails to send in each execution of the cron job, and how often to run it, so you don't exceed your limits or get flagged as a spammer.
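For example, with a provider limit of 400 emails per hour and a cron entry running every 5 minutes (12 runs per hour), each run should send at most 400 / 12 ≈ 33 emails.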
My suggestion is that if you wanted to use cron to do such a thing, you would want your newsletter.php to write out a file to disk or a database table which contains the list of users. You would write a simple PHP script, triggered by cron, which would be responsible for calling sendmail to send the messages. As each recipient is mailed, your script would remove them from the list.
Alternately, you might want to look into some basic mailing list software. These often support a notion of throttled message sending.
I assume you are trying to manage limits imposed by your hosting provider on the number of emails sent per hour (antispam controls).
Yes, cron jobs can be used in such a scenario. I personally have developed newsletter systems which use this methodology; however, I send 350 per run with a cron job every hour.
You should check at the start of the script whether there are any emails to be sent, and leave the cron job running every hour. There is no need to activate the cron job when the send button is clicked.
cron itself just runs programs on a schedule. It doesn't have any logic in it that can know whether somebody clicked on a button, or logic that can know how many emails you've sent this hour, or which emails have or have not been sent.
cron is run by a daemon (crond). Although you probably could enable and disable it through a controller, you probably don't want to do that. If you're using cron, you usually need it to run all the time.
Instead, put the logic and the constraints into the program that cron runs.
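A sketch of how the cron'd script itself could enforce an hourly cap, assuming a hypothetical email_log table with a row per sent mail and the 400/hour figure from the question:

<?php
// Sketch: the script, not cron, enforces the hourly limit.
// The email_log table and the 400/hour figure are illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$sentLastHour = (int) $pdo->query(
    "SELECT COUNT(*) FROM email_log WHERE sent_at > NOW() - INTERVAL 1 HOUR"
)->fetchColumn();

$budget = 400 - $sentLastHour;
if ($budget <= 0) {
    exit; // already at the limit; cron will call this script again later
}

// ...select at most $budget pending addresses from the queue, send them,
// and insert a row into email_log for each successful send...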
If you can use the system() or exec() functions, you can run a cron job or anything else you could run in a terminal.
For example: system("ps aux | grep crond");
I'm writing a web app in PHP + Laravel + MySQL.
In the system, a user can schedule emails (and other API calls) at arbitrary times (much like how you schedule posts in WordPress). I can use CRON to inspect the database every 5min or so to find emails that should be sent, send them, and update their status.
However, this is a SaaS app. So the amount of emails to be sent at a particular time can grow rapidly. I can create a "lock file" every time the CRON script runs so that only one instance of it is running at a time. The lock file will be deleted after a script finishes execution.
But with potentially large data, I would want a way to process multiple messages simultaneously, potentially using multiple "workers." Is there any existing solution to manage such a queue?
Yes! Task/Message/Job queues are what you are looking for! They allow you to put tasks into queues from which workers retrieve and process them; this scales horizontally, as each worker pulls a new task once it's finished with the previous one.
You should have a cron job, running maybe every minute or two, that just pushes the tasks (and what needs to be done) onto the queue. This keeps the cron run itself very quick.
Take a look at Iron.io. Here is an extract from the website which gives a nice overview of these kinds of systems:
An easy-to-use scalable task queue that gives cloud developers a simple way to offload front-end tasks, run scheduled jobs, and process tasks in the background and at scale.
Gearman is also a great solution that you can run yourself and is very simple. You can submit the message from many different languages and process it in a different language, say PHP -> C, etc.
The Wikipedia link will tell you everything you need to know, here is a quick excerpt:
Message queues provide an asynchronous communications protocol, meaning that the sender and receiver of the message do not need to interact with the message queue at the same time. Messages placed onto the queue are stored until the recipient retrieves them.
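As a concrete illustration, a minimal producer/worker pair using the PHP Gearman extension; the job name, payload format and server address are made up for the example, and several copies of the worker can run in parallel:

<?php
// producer.php (sketch) -- e.g. called by the cron that drains your MySQL queue.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('send_newsletter_email', json_encode(array(
    'to'      => 'user@example.com',
    'subject' => 'Newsletter subject',
    'body'    => 'Newsletter body',
)));

<?php
// worker.php (sketch) -- run as many of these processes as you need.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_newsletter_email', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    mail($payload['to'], $payload['subject'], $payload['body']);
});
while ($worker->work());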
I was wondering if there is a way to run a PHP loop in order to send a few hundred emails to subscribers in the background. My goal is to format the newsletter, click send, and then close the browser or change page. Of course, the actual process of sending the e-mails would keep running in the background and would not be interrupted by closing the browser.
I know this can be made with a cron job reading from a queue saved in MySQL or text file, but this way, even if there is no queue for a long period, the cron will always be running, looking for the queue...
I've seen this functionality in a script called Pommo (https://github.com/soonick/poMMo) but can't seem to understand how it's done.
Does anyone have an idea for this?
I was going to add a comment to your question, but then I didn't have enough space there to format and give the example.
Here is an idea I believe might work:
1 - Load all the emails you want to send to a database or file.
2 - From your web application click on the button to send emails. This will submit an Ajax request to the server. You can define the number of emails you want to send within a certain timeframe. Remember that most hosts have limits on number of emails you can send every hour.
3 - Create a PHP script that will receive the Ajax request and send all the emails within the parameters you define.
4 - I believe you can close your web browser, because the PHP script will run through the whole list and will not return until it has finished sending all the emails (see the sketch below).
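If you try that Ajax route, the receiving script usually needs to be told explicitly to keep running after the browser disconnects. A minimal sketch, assuming the recipient list from step 1 has already been loaded into $recipients:

<?php
// Sketch: keep sending even if the client that triggered the Ajax call goes away.
ignore_user_abort(true);  // don't stop when the browser disconnects
set_time_limit(0);        // lift PHP's execution time limit (if the host allows it)

// $recipients comes from the file/database prepared in step 1 (illustrative).
foreach ($recipients as $email) {
    mail($email, 'Newsletter subject', 'Newsletter body');
    sleep(5); // throttle to respect the host's hourly sending limit
}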
The above might work, however I would never do it this way. I would use a cronjob as stated above. Your cronjob would only have to check if there are emails to send or not. This is not resource intensive.
If you decide to implement the ideas above, please make sure you let us know. I am curious if that would work.
Good luck!
I know this can be made with a cron job reading from a queue saved in MySQL or text file, but this way, even if there is no queue for a long period, the cron will always be running, looking for the queue...
That pretty much defeats the purpose of cron. You should create a job that runs, say, every 15 minutes and checks the queue for mails that need to be sent. If there are no mails, let the script die; it'll run again in 15 minutes.
If there are mails to be sent, update the rows to indicate that you're processing them before you start sending, so a run taking more than 15 minutes won't cause another script instance to send the same mails.
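One way to do that claiming step, with an illustrative mail_queue table that has status and run_id columns:

<?php
// Sketch: claim a batch before sending so an overlapping cron run won't pick
// up the same rows. Table and column names are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Tag up to 50 pending rows with this run's identifier in a single statement.
$runId = uniqid('run_', true);
$claim = $pdo->prepare(
    "UPDATE mail_queue SET status = 'processing', run_id = :run
     WHERE status = 'pending' LIMIT 50"
);
$claim->execute(array(':run' => $runId));

// Only the rows this run claimed are selected and sent.
$rows = $pdo->prepare("SELECT id, email FROM mail_queue WHERE run_id = :run");
$rows->execute(array(':run' => $runId));

$done = $pdo->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?");
foreach ($rows as $row) {
    mail($row['email'], 'Newsletter subject', 'Newsletter body');
    $done->execute(array($row['id']));
}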
You need a queue system. There is e.g. Beanstalkd for Linux, which you can feed from PHP.
I have developed a web application where students across the country come and register for some academic purpose. The users are expected to number around 100k within the next year.
I need to send all of these people periodic mails. The web app is developed using CodeIgniter. The PHP script can run for 3000 seconds, but the app is still unable to send mails to more than 100 users.
The machine I run it on is in the cloud and has 256MB of RAM. I used the free -m command to check the memory usage, but that doesn't seem to be the problem. Everything works fine for 10-20 mails.
What would be the best solution? Is there any way I can transfer this job to some other app/program/shell script?
If you cannot use an external service for your emails, I would just set up a cron job that sends a couple of emails every n seconds. It's pretty cumbersome to send a lot of emails with PHP, as you have discovered, but the cron job solution works every time as far as I know.
So you have a list of email addresses and a cron job that iterates over that list and sends the emails.
Sure you can send the emails yourself from a server, but that is only half the battle.
If you are sending bulk emails, as opposed to the transactional type, it's best to use a third party service that is already whitelisted on mail servers. The primary reason being, you might get blacklisted by the major mail servers as a spammer. If this happens, you will have to work with them individually to get removed from the blacklists.
Also if you are operating in the United States you should be familiar with CAN-SPAM: http://business.ftc.gov/documents/bus61-can-spam-act-Compliance-Guide-for-Business
MailChimp is a viable candidate for this. Sending mail is a time-consuming task, and sending to up to 100k email addresses will be arduous for your server.
They provide a very capable PHP API.
https://developer.mailchimp.com/
It is very appropriate to get this out of your web server threads and into something that runs standalone. Typically for stuff like this, I have tables in the DB where the appropriate information is written to from the web site, so when I am ready to e-mail, something on the backend can assemble the e-mails and send them out. If you are sending out 100,000 e-mails, you are going to want something multithreaded.
It might be good in this case to use one of the many off-the-shelf tools for this, rather than reinventing the wheel. We use an old version of Campaign Enterprise here, and I am able to throw queries at it which I can use to pull data from my web DB directly, over ODBC. That may or may not work well for you, considering you are in the cloud.
Edit: You can also write a PHP script to do this and call PHP from the shell. Perhaps you can get around your timeout limit this way? (This is assuming you are referring to some service-level timeout. If you are talking about the regular PHP timeout, this can be worked around with set_time_limit().)
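For example, the web request can hand off to a detached command-line PHP process; the script path below is illustrative, and PHP run from the CLI has no max_execution_time by default, so the web-server timeout no longer applies:

<?php
// Sketch: kick off a long-running CLI script from the web request and return
// immediately. Output is discarded and the process is detached with '&'.
exec('nohup php /var/www/scripts/send_newsletter.php > /dev/null 2>&1 &');
echo "Sending started in the background.\n";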
You might be able to do this using pcntl_fork or by creating a daemon process.
Fork: By forking the process you could batch the emails into groups and send them out; each batch could be handled in its own forked child process (see the sketch below).
Daemon: By using a daemon you could create a batch of emails and hand it to the daemon for processing. A daemon could run multiple batches at once.
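A minimal pcntl_fork() sketch for the batching idea; it needs PHP's pcntl extension (normally only available from the command line), and the batch size and $recipients list are illustrative:

<?php
// Sketch: split the recipient list into batches and fork one child per batch.
$batches  = array_chunk($recipients, 500);
$children = array();

foreach ($batches as $batch) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("could not fork\n");
    }
    if ($pid === 0) {
        // Child process: send its own batch, then exit.
        foreach ($batch as $email) {
            mail($email, 'Newsletter subject', 'Newsletter body');
        }
        exit(0);
    }
    $children[] = $pid; // parent keeps track of its children
}

// Parent waits for all children to finish before exiting.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}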