PHP and background processes

I am developing a mass-mailing system. We send 2-4K emails at a time, and the contacts are imported in similar quantities using the PHPExcel library.
Last night, while sending 2k emails, we got a "500 Internal Server Error".
I think I should build a separate process for handling the emails and importing the contacts; am I right? If so, how should I do it? Is there any other way to avoid such 500 errors?
The PHP script is called from a web browser; the browser loads it for 5-10 minutes and then the 500 error occurs. I am using the PHPMailer library for sending the mail.

Calling a long-running PHP script from a web browser is not really the same as running PHP in the background. That will lock up an Apache thread, and is likely to be subject to whatever timeouts PHP has configured. My guess is the timeout is being hit before the send has completed.
It would be better to do this on a cron. Here are some general pointers; a rough sketch of the cron script follows them:
Every ten minutes, select the next unsent set of email addresses from your data store, maybe 100 of them.
Send an email to each one, logging to a database what you have done
Pause for a few seconds. This is helpful as it makes it less likely your mail will be directed to spam bins
If your script has been running for more than five minutes, exit (it will do the next set of email addresses on the next cron call)
Otherwise, loop back to the start
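A minimal sketch of such a cron script, assuming a mail_queue table with id, email and sent_at columns, PHPMailer installed via Composer, and illustrative connection details (none of these names come from the question):

    <?php
    // cron-batch-mailer.php -- intended to be run every 10 minutes from cron.
    // Assumes a mail_queue table (id, email, sent_at); adjust to your own schema.
    require 'vendor/autoload.php';

    use PHPMailer\PHPMailer\PHPMailer;

    $pdo     = new PDO('mysql:host=localhost;dbname=newsletter', 'user', 'pass');
    $started = time();

    while (time() - $started < 5 * 60) {               // stop after ~5 minutes
        $rows = $pdo->query(
            "SELECT id, email FROM mail_queue WHERE sent_at IS NULL LIMIT 100"
        )->fetchAll(PDO::FETCH_ASSOC);

        if (!$rows) {
            break;                                     // nothing left to send
        }

        foreach ($rows as $row) {
            try {
                $mail = new PHPMailer(true);
                $mail->setFrom('news@example.com', 'Newsletter');
                $mail->addAddress($row['email']);
                $mail->Subject = 'Newsletter';
                $mail->Body    = 'Hello!';
                $mail->send();

                // log what was done so the next run skips this address
                $pdo->prepare("UPDATE mail_queue SET sent_at = NOW() WHERE id = ?")
                    ->execute([$row['id']]);
            } catch (Exception $e) {
                // leave sent_at NULL so the next run retries this address
            }

            sleep(2);                                  // short pause between sends
        }
    }

Called from cron with something like */10 * * * * php /path/to/cron-batch-mailer.php, each run simply picks up wherever the previous run left off.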
That will be much more reliable. For bonus points, write a web page to show which emails have been sent and which are still waiting. Also, you may wish to use a third-party mailing system such as MailChimp to increase your delivery reliability. Make sure all of your recipients really have opted into receiving email from you.
I've suggested the script should batch in groups of 100, run for five minutes, be called every ten minutes, and pause for a few seconds after each send -- but these are just examples. If you don't mind sending more slowly (e.g. overnight) then you can change these figures to suit. Sending more slowly is generally more reliable, so do this if you can.

Related

Sending large number of emails in PHP with a sleep command and preventing the domain from hanging during the job

I have an email list of people I have to send out emails to every morning. The number is about 1000 at the moment and it is on my own VPS LINUX server.
The problem is that when the job is running the websites I have on the server seem to hang and timeout quite a lot.
I thought adding a sleep command of a couple of seconds between each iteration of the loop would help, but a PHP developer at my work just told me that this wouldn't do anything to help Apache with regard to memory and that it would be better to just run the script without any sleep command.
I have read articles on here and other sites where people recommend adding a sleep command in between the loop iterations but if this just means that the Apache process consumes more memory and causes my site to hang then I don't want that to happen.
I am calling this job from my domain from a CRON job with a secure hash. The reason I am calling it externally through my site, e.g. http://www.example.com/sendemails.php?secretcode=_--1033-449, is that I often need to debug it to see what is going on, or if the job hasn't run I can run it manually and read the debug output on the page.
Is there an issue with sleep and Apache / website timeouts, or should I do it another way, e.g. batching by starting the script every 10 minutes, sending out 100 emails, and updating a flag in the DB against each person so I know they have had an email sent (True/False), then starting from that point in the next batch?
Then at night, after a certain time, e.g. 6pm, when I know no emails should be sent out, I could easily reset all their flags to False ready for tomorrow, ensuring no emails are sent after 6pm of course.
What is the preferred way of doing things like this, as it does seem that the website hangs/times out while the job runs?
Thanks
Rob
It seems that, in my case anyway, having NO SLEEP between loop iterations works a whole lot better than having a 1-second sleep.
I have warning emails set up on my VPS, and yesterday during the sending of the batch I received one saying timeouts were occurring from specific locations (Rackspace does 3 tests, from London, Chicago and Dallas); 2 locations returned times of over 25 seconds to connect to my server, not an HTTP ping but just a normal connection ping.
I send success emails at the end of the batch job to tell me what happened, when the job started and finished and how many emails were sent out.
This was yesterday's email with a 1 second sleep between each iteration in the loop.
Date of Job Starting: 2015-May-19 11:15:02
We successfully sent out 945 emails to subscribers
Date of Job Finishing: 2015-May-19 11:50:54
So with a 1-second delay, and calling the PHP script through an Apache process, it took 35 minutes to send out the emails.
Plus I got website / server issues during the job.
Checking my debug script showed that there were gaps longer than 1 second between the sending of a lot of the emails, so the job was obviously causing issues with the server.
However today I had no delay between each loop iteration and the job completed within a minute.
I sent all emails out successfully with no warnings from my VPS. My VPS does both Connection PINGS and HTTP Pings (HEAD Requests) to my server.
Date of Job Starting: 2015-May-20 11:16:01
We successfully sent out 945 emails to subscribers
Date of Job Finishing: 2015-May-20 11:16:54
I guess this shows that you can send 1000 emails in less than a minute.
This is with a standard MySQL loop to get the email/name of each person to send to from the database, and using an Apache process, e.g. a CRON call to a webpage which holds the script, rather than an internal call to the PHP file.
The script is configurable so I can set the sleep above 0 to add a wait, and all debug messages are stored in an array during the job and then piped out in one file_put_contents call to my debug file at the end, rather than constantly opening and closing the debug file, which I have found is always a performance killer.
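As a rough illustration of that buffered-debug approach (the variable and file names below are mine, not from the original script):

    <?php
    // Buffer debug messages in memory during the send loop, then write them out
    // in a single call at the end. Repeatedly opening/appending the log file on
    // every iteration is the performance killer mentioned above.
    // ($subscribers is a placeholder; in the real job it comes from MySQL.)
    $subscribers = [
        ['email' => 'one@example.com'],
        ['email' => 'two@example.com'],
    ];

    $debug = [];

    foreach ($subscribers as $subscriber) {
        // ... send the email to $subscriber['email'] here ...
        $debug[] = date('H:i:s') . ' sent to ' . $subscriber['email'];
    }

    // One write instead of hundreds of small appends.
    file_put_contents('/tmp/newsletter-debug.log', implode(PHP_EOL, $debug) . PHP_EOL);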
So I guess the answer, FOR ME at least, is to remove the SLEEP and just get the job done as quickly as possible so there is no build-up of Apache connections waiting to use the server while the script runs.
If I do get to the stage where I have issues, I am going to move the CRON job to an internal CRON call to the PHP file so all Apache processes are free for use. However, speed is the key issue.
Thanks for your ideas though.

Sending huge amount of email from SMTP

I have an issue: I need to send around 3000 emails per request using SMTP, but only 30-40 reach their destination.
Do you have any idea what the problem could be and how to solve it? As the server-side script I am using PHP.
How the big e-mail service providers (Constant Contact, WhatCounts, etc.) do mass e-mails is to put the "campaign" in a queue and send it at a later time. They have dedicated, high-performance delivery software for looking through the queue for new campaigns to send and then send them out at rates exceeding 50,000 messages per minute. Anything you might do in PHP won't even compare.
If you are trying to send from your local computer, that won't work. DNSRBL lookups will identify your computer as being on a "DUN" (Dial-Up Network) and will block the message. Most PHP scripts also have a timeout of 30 seconds in a web server environment but running via cron a PHP script can run as long as it needs to.
You shouldn't send mass e-mails from your main e-mail server either. That's a nice, fast way to get on global blacklists so that you can't send regular e-mail to common hosts (e.g. Hotmail, GMail, etc). The big e-mail service providers have dedicated staff whose job it is to remove themselves from global blacklists. That's a full time job. You are better off paying for the service (don't forget to set up SPF records correctly if you do go this route).
Warnings and advice aside, to answer the question, use a cron job for your PHP script and put the e-mails to be sent out in a queue.
If you want to get as many emails into inboxes as possible, you are not sending spam, and you prefer to use your own SMTP server, a dedicated SMTP relay service is worth checking out.

Mass mailing queue (possibly with PHP interprocess communication)

I am building a website where users will be able to send newsletters. They will need to track the sending process and also to stop it if required. Since they may have lots of emails the actual sending will be delegated to a script which will run in the background and will be executed by the user. This script will handle all users' newsletters, only the arguments to it will be different.
For the user to be able to see how many mails have been sent and to stop the sending process I was thinking of implementing some sort of interprocess communication between the website and the scripts running in the background but I'm not sure how to do it.
Any help will be much appreciated.
The process:
When sending the newsletter, queue all target addresses in a table.
Process a fixed amount of the queue in a cronjob that runs every few minutes.
Then, when displaying the sent amount, just count the number of entries in your queue that have been processed.
Working with a queue will also fix any performance issues your script encounters when everyone clicks 'send' simultaneously.
Stop mailing
If the customer would like to stop mailing, you only have to remove all queued emails where the mailing matches. You could also flag them as 'cancelled' if you would like to keep the data. (You will have to make sure your queue only processes 'pending' mails.)
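A minimal sketch of that queue worker, assuming a newsletter_queue table with a status column holding 'pending', 'sent' or 'cancelled' (the table and column names are illustrative, not from the answer):

    <?php
    // Cron worker: process a fixed slice of the queue, only 'pending' rows.
    $pdo          = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $newsletterId = 42;                      // example newsletter being processed

    $rows = $pdo->prepare(
        "SELECT id, email FROM newsletter_queue
         WHERE newsletter_id = ? AND status = 'pending' LIMIT 200"
    );
    $rows->execute([$newsletterId]);

    foreach ($rows->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // mail($row['email'], ...) or PHPMailer would go here
        $pdo->prepare("UPDATE newsletter_queue SET status = 'sent' WHERE id = ?")
            ->execute([$row['id']]);
    }

    // Progress for the UI: how many of this newsletter's addresses are done.
    $sent = $pdo->prepare(
        "SELECT COUNT(*) FROM newsletter_queue WHERE newsletter_id = ? AND status = 'sent'"
    );
    $sent->execute([$newsletterId]);
    echo $sent->fetchColumn() . " sent\n";

    // "Stop mailing": if the user cancels, flag the remaining pending rows
    // instead of deleting them, so the data is kept.
    $pdo->prepare(
        "UPDATE newsletter_queue SET status = 'cancelled'
         WHERE newsletter_id = ? AND status = 'pending'"
    )->execute([$newsletterId]);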

Send mails in background without cron

I was wondering if there is a way to run a PHP loop in order to send a few hundred emails to subscribers in the background. My goal is to format the newsletter, click send, and then close the browser or change page. Of course, the actual process of sending the e-mails would run in the background and would not be interrupted by the browser closing.
I know this can be done with a cron job reading from a queue saved in MySQL or a text file, but this way, even if there is no queue for a long period, the cron will always be running, looking for the queue...
I've seen this functionality in a script called Pommo (https://github.com/soonick/poMMo) but can't seem to understand how it's done.
Does anyone have an idea for this?
I was going to add a comment to your question, but then I didn't have enough space there to format and give the example.
Here is an idea I believe might work:
1 - Load all the emails you want to send to a database or file.
2 - From your web application click on the button to send emails. This will submit an Ajax request to the server. You can define the number of emails you want to send within a certain timeframe. Remember that most hosts have limits on number of emails you can send every hour.
3 - Create a PHP script that will receive the Ajax request and send all the emails within the parameters you define.
4 - I believe you can kill your web browser because the PHP script will run through the whole list and will not return until it finishes sending all the emails.
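Whether the script really keeps going after the browser closes depends on PHP's configuration; by default PHP may abort a request when the client disconnects. A small sketch of how to make that behaviour explicit (my addition, not part of the original answer; the file name is made up):

    <?php
    // send-all.php -- hypothetical target of the Ajax request in step 2.
    // Keep running even if the user closes the browser or the connection drops,
    // and lift the normal execution time limit for the long send loop.
    ignore_user_abort(true);
    set_time_limit(0);

    // Answer the Ajax call straight away so the page isn't left waiting...
    header('Content-Type: application/json');
    echo json_encode(['status' => 'sending started']);

    // ...then flush the response and carry on working in the same request.
    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request();   // available under PHP-FPM
    } else {
        flush();
    }

    // Long-running send loop goes here: read the list, send, log progress.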
The above might work; however, I would never do it this way. I would use a cronjob as stated above. Your cronjob would only have to check whether there are emails to send or not. This is not resource intensive.
If you decide to implement the ideas above, please make sure you let us know. I am curious if that would work.
Good luck!
I know this can be done with a cron job reading from a queue saved in MySQL or a text file, but this way, even if there is no queue for a long period, the cron will always be running, looking for the queue...
That pretty much defeats the purpose of cron. You should create a job that runs, say, every 15 minutes and checks the queue for mails that need to be sent. If there are no mails, let the script die; it'll run again in 15 minutes.
If there are mails to be sent, update the rows to indicate that you're processing them before you start sending, so a run taking more than 15 minutes won't cause another script instance to send the same mails.
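One way to do that "claiming" step, sketched with PDO; the mail_queue table, status values and run_token column are assumptions for illustration, not from the answer:

    <?php
    // Claim a batch atomically so an overlapping cron run cannot pick the same rows.
    $pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $token = uniqid('run_', true);

    $pdo->prepare(
        "UPDATE mail_queue SET status = 'processing', run_token = ?
         WHERE status = 'pending' LIMIT 500"
    )->execute([$token]);

    // Only this run's rows are selected for sending.
    $stmt = $pdo->prepare(
        "SELECT id, email FROM mail_queue WHERE run_token = ? AND status = 'processing'"
    );
    $stmt->execute([$token]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // send the mail here, then mark it done:
        $pdo->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?")
            ->execute([$row['id']]);
    }

If a run takes longer than 15 minutes, the next run only sees rows that are still 'pending', so nothing is sent twice.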
You need a queue system. There is, for example, Beanstalkd on Linux, which you would feed jobs to from PHP.

Shared Hosting mail limits workaround

So I have a couple of sites on paid shared hosting, and my host limits mail to 300 per hour.
One of my sites has more than 500 subscribers.
My question is: how can I send the newsletter to all my subscribers? Is there a way, or a script I can use, to send the email to the first 300 users and then, after an hour, send the rest...?
I've also considered making a Gmail account to send the newsletters via SMTP. Do you guys know the limit of free Gmail SMTP?
You shouldn't circumvent the restrictions placed by your host. I would suggest you pace your sends and record your last-sent-id, picking up from there in your next hour. That, or you can place sufficient sleep time between sends to allow the entire thing to go out at a rate of about 300/hr.
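A rough sketch of the last-sent-id approach, run hourly from cron and kept just under the 300/hour cap; the subscribers and mailer_state tables are my own illustrative names, not from the answer:

    <?php
    // hourly-batch.php -- cron: 0 * * * *  php /path/to/hourly-batch.php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Where did the previous hour's run stop?
    $lastId = (int) $pdo->query("SELECT last_sent_id FROM mailer_state")->fetchColumn();

    // Next slice of subscribers, staying under the 300/hour host limit.
    $stmt = $pdo->prepare(
        "SELECT id, email FROM subscribers WHERE id > ? ORDER BY id LIMIT 290"
    );
    $stmt->execute([$lastId]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        mail($row['email'], 'Newsletter', 'Hello!');   // or PHPMailer
        $lastId = $row['id'];
    }

    // Remember where to pick up next hour.
    $pdo->prepare("UPDATE mailer_state SET last_sent_id = ?")->execute([$lastId]);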
Thank you for all your replies, guys... they really helped me find a solution for this inconvenience. I personally can't afford VPS hosting, nor can I pay extra for an external mail server...
Considering Jonathan's solution and William's comments, I ended up developing a small PHP application based on XML that sends separate batches of 250 recipients each, with a gap of 65 minutes.
The way it works: by default it only enables the first batch link to be clicked, sends the newsletters to the first batch of users, and records the exact time this was sent in an XML file.
Then, using the info in the XML file, the next link registers that the batch before it was sent and starts a countdown of 65 minutes, with the time in the XML as reference.
So a script will not be running for hours, and the browser can be safely closed, since all the required info is in the XML file.
This may sound simple, but it is a complex and efficient app that dynamically adapts to growth (new subscribers), as it queries the master table in the database using the SQL LIMIT clause to make the different batches. So it doesn't require maintenance.
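Roughly how the timing side of such an app might look; the XML layout, file names and table names below are my guesses for illustration, not the poster's actual code:

    <?php
    // batches.xml is assumed to look like:
    // <state><batch>2</batch><sentAt>1431861300</sentAt></state>
    $state  = simplexml_load_file('batches.xml');
    $batch  = (int) $state->batch;
    $sentAt = (int) $state->sentAt;

    // Enable the next batch only after the 65-minute gap has passed.
    if (time() - $sentAt < 65 * 60) {
        exit('Next batch not yet available, please wait.');
    }

    // Batches of 250 taken straight from the master table, so new subscribers
    // are picked up automatically without maintenance.
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare("SELECT email FROM subscribers ORDER BY id LIMIT 250 OFFSET ?");
    $stmt->bindValue(1, $batch * 250, PDO::PARAM_INT);
    $stmt->execute();

    foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $email) {
        mail($email, 'Newsletter', 'Hello!');
    }

    // Record the batch number and exact send time back into the XML file.
    $state->batch  = $batch + 1;
    $state->sentAt = time();
    $state->asXML('batches.xml');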
If anyone is interested in the source code, feel free to contact me at admin@thechozenfew.net
Google Mail does have limits, see:
Sending limits: In an effort to fight spam and prevent abuse, Google will temporarily disable your account if you send messages to more than 500 recipients or if you send a large number of undeliverable messages. If you use a POP or IMAP client (Microsoft Outlook or Apple Mail, e.g.), you may only send a message to 100 people at a time. Your account should be re-enabled within 24 hours.
Source: http://mail.google.com/support/bin/answer.py?hl=en&answer=22839
To get around the problem, you could create a queue table in your db with a list of all the users you're sending the newsletter to. Then send e-mails out in bulk (500, for example), removing the e-mails from the queue table as they're sent. You could use a cron (if on Linux and your host allows it) to run a PHP script every hour that sends e-mails based on the queue.
I'd seek a place just to park your MX (not sure of Google's limits, but that could be a start). It's very common for mailing list managers to queue mails to fit within sending limits, i.e. a cron job queries a database, picks up 250 emails to send and sends them out.
The problem arises when you have 10,000 subscribers and need to send non-automated emails from the same MX. I.e., if your limit is consumed getting out a newsletter, what happens to your ability to reply to your own e-mail?
A lot of companies offer MX-only hosting; I'd go with one of them and move the whole business of sending the list over there. Or just get yourself a VPS (it's going to be about the same monthly price).
