I'm using an Amazon web server. I have a function for sending mail like the one below:
function mail_send($to) {
    require_once("class.phpmailer.php");
    require_once("class.smtp.php");
    $mail = new PHPMailer();   // instantiate PHPMailer before using it
    $mail->AddAddress($to);
    $mail->Send();
}
Whenever I want to send mail, I just call this function:
mail_send("example@xyz.com");
So if I try to send mail in a loop, it takes a long time: roughly 5 seconds per message. But on AWS the maximum execution time in the browser is 60 seconds, after which it shows an empty response. So I need to trigger the mail and have it run in the background: if I'm sending mail to 10 members, the page should just trigger the mail function 10 times and finish loading without waiting for the sends.
I have tried cURL, but it also waits for the response of each mail being sent, so it takes the same time to execute.
Send from a cron script or other scheduled task that does not have a timeout - search on here for how to do that.
Send more efficiently - see the mailing list example provided with PHPMailer (a rough sketch of that approach follows after these suggestions).
Get your local mail server to work for you - submit messages to it (which will be very fast) and let it deal with slow deliveries - it's what mail servers are for.
I can see you've based your code on an obsolete example and are using an old version of PHPMailer, so get the latest version.
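For illustration only, here is a rough sketch of that mailing-list approach using a current, namespaced PHPMailer with SMTPKeepAlive, so one SMTP connection is reused for the whole batch; the Composer autoloader, the SMTP host, the credentials, and the $recipients array are placeholders, not part of the original question.
use PHPMailer\PHPMailer\PHPMailer;

require 'vendor/autoload.php';

$mail = new PHPMailer(true);
$mail->isSMTP();
$mail->Host = 'smtp.example.com';      // placeholder SMTP host
$mail->SMTPAuth = true;
$mail->Username = 'user@example.com';  // placeholder credentials
$mail->Password = 'secret';
$mail->SMTPKeepAlive = true;           // keep the SMTP connection open between messages
$mail->setFrom('user@example.com', 'Example Sender');
$mail->Subject = 'Hello';

foreach ($recipients as $address) {    // $recipients is an assumed array of addresses
    $mail->addAddress($address);
    $mail->Body = 'Message body';
    $mail->send();
    $mail->clearAddresses();           // reset the recipient list for the next message
}

$mail->smtpClose();
Reusing the connection avoids repeating the SMTP handshake for every recipient, which is where most of the per-message delay usually goes.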
I built an online shop. When my customers buy something and reach checkout, I use the hyperpay API: when they pay by Visa, my server sends a request to the hyperpay server and gets the result back. If the payment succeeds, I send an email message to the customer and store the payment and product data on another server via an API.
Not always, but sometimes my server stops after sending the emails, or stops before sending the data to the second server, or I get a response from hyperpay and the operation is cut off. I don't know why.
So I think it's a time-out problem, but I am still not sure.
Another problem: my jQuery script responses take a long time.
So I want to know: is the problem in my code or in my server?
There could be numerous reasons for that to happen. From your description, it looks like sending the email is taking too long or sometimes halting. You can work through these steps:
Check laravel.log inside the storage directory for any relevant information.
Debug the SMTP connection (or whatever transport you are using to send the email).
Send heavy tasks such as email sending to a queue: https://laravel.com/docs/8.x/mail#queueing-mail so the other tasks can run without any problem while the SMTP call is being made (a minimal sketch follows after this list).
If none of this helps, look into both your web server and your mail server.
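As a minimal sketch of the queueing point above, assuming a hypothetical OrderConfirmation mailable for this shop, queueing the mail in Laravel could look roughly like this:
use App\Mail\OrderConfirmation;          // hypothetical Mailable class for this shop
use Illuminate\Support\Facades\Mail;

// Push the mail onto the queue instead of sending it during the HTTP request.
Mail::to($customer->email)->queue(new OrderConfirmation($order));
A queue worker (for example php artisan queue:work) then delivers the message outside the request, so the checkout response is not held up by the SMTP call.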
I am working on an email system. Everything goes fine and I can send email without any error, but I have more than 3000 email addresses and I want to send email to all of them. When I press the send button the operation takes more than one hour, so I decided to use the PEAR mail queue package. The package now inserts the queued emails into the database, but how can I run the send function in the background after inserting them, so the user is not confused by the long wait?
Using a loop to send emails one by one is a slow process; it is better to send them in bulk, and SwiftMailer is well suited to that.
Use SwiftMailer, which has HTML support, support for different MIME types, and SMTP authentication (so there is a much lower chance your email ends up in spam).
Also note that a long-running PHP process times out after a while; if you want to stick with your loop-based approach, you have to increase the execution time in php.ini.
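A rough SwiftMailer 6 sketch of that idea, reusing one SMTP connection for the whole list; the host, credentials, and the $addresses array are placeholders, not part of the original question:
require 'vendor/autoload.php';

// Placeholder SMTP host and credentials.
$transport = (new Swift_SmtpTransport('smtp.example.com', 587, 'tls'))
    ->setUsername('user@example.com')
    ->setPassword('secret');

$mailer = new Swift_Mailer($transport);

set_time_limit(0);  // lift the script time limit for the long-running loop

foreach ($addresses as $address) {   // $addresses is an assumed list of recipients
    $message = (new Swift_Message('Newsletter'))
        ->setFrom(['noreply@example.com' => 'Example Shop'])
        ->setTo([$address])
        ->setBody('<p>Hello!</p>', 'text/html');

    $mailer->send($message);
}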
I am using CakeEmail to send about 7000 emails in a loop. When I send emails with small attachments of about 1 kB it works fine, but when I use a bigger attachment file of around 800 kB, it just quits after sending some of the emails, maybe 23 or 60, without completing the process.
The page quits with the message "This webpage is not available".
The code is in a loop where a CakeEmail instance is initialized and sent to each email address.
I tried using
set_time_limit(0);
But it didn't work.
Can anyone help me figure out why it is not able to send with an attachment of 800 kB?
Some shared servers do not allow long-running scripts and do not let you override the time limit.
Maybe you can send 10 emails at a time and then redirect the browser to the same page to send the next 10; just fetch those 10 emails using a DB LIMIT clause (see the sketch after this answer).
Or you can create a cron job to run the script through the shell.
PS - kindly provide the code you are using so I can help in modifying it
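Since the original code was not posted, here is a minimal sketch of the 10-at-a-time idea; the email_queue table, its id, address, and sent columns, the database credentials, and the send_batch.php filename are all illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Fetch the next 10 unsent emails.
$batch = $pdo->query('SELECT id, address FROM email_queue WHERE sent = 0 ORDER BY id LIMIT 10')
             ->fetchAll(PDO::FETCH_ASSOC);

if (!$batch) {
    exit('All emails sent.');
}

foreach ($batch as $row) {
    // Send with CakeEmail (or mail()) here, then mark the row as sent.
    $pdo->prepare('UPDATE email_queue SET sent = 1 WHERE id = ?')->execute([$row['id']]);
}

// Reload the same page so the next batch of 10 is processed.
header('Location: send_batch.php');
exit;
Each request stays well under the server's time limit because it only ever handles 10 messages before handing off to the next page load.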
So I have a problem with SMTP mail. I have a Zend Framework 2 application, and when a user signs up on the site I send them a confirmation email.
The problem is that when the user clicks submit, the page takes about 3-5 seconds to load, and that's because of the SMTP email being sent; if I take out the part that sends the email, the response is instant.
I'm using Gmail's SMTP; do you have any tips on how to solve this?
Actually, the PHP documentation doesn't recommend using PHP's mail() function to send an email during page load. Instead, it is much faster to send emails in the background. For example, create an outgoing_mail table in your database and save your messages there when the page loads. Then create a cron scheduled task that runs a PHP script which enumerates all pending messages and sends them in turn. This way you get a quick page load and the mail still gets sent.
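That cron-driven script could look roughly like the following sketch; the outgoing_mail table name comes from the answer, while the column names and database credentials are assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Pick up every message that has not been delivered yet.
$pending = $pdo->query('SELECT id, recipient, subject, body FROM outgoing_mail WHERE sent_at IS NULL')
               ->fetchAll(PDO::FETCH_ASSOC);

foreach ($pending as $msg) {
    if (mail($msg['recipient'], $msg['subject'], $msg['body'])) {
        // Mark the message as sent so it is not delivered twice.
        $pdo->prepare('UPDATE outgoing_mail SET sent_at = NOW() WHERE id = ?')
            ->execute([$msg['id']]);
    }
}
A crontab entry such as * * * * * php /path/to/send_outgoing_mail.php (the path is a placeholder) would then run it every minute.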
I have a PHP script which sends the user an email if their purchase is successful. The problem is that the page loads slowly because of mail(). I know there are ways around this, like putting the mails in a database table and then using a cron job to send them, but the purchase frequency can be high and I want the mail to be sent right away, so that doesn't look like a good option.
The purchase request gets processed by the same page from which the user purchased, and they can only do so once. The user doesn't control any part of the mail other than the purchase details. I thought of using Ajax: the script would send the data to the client side and call an Ajax function which then calls another mail script, but this would let the user see what is being sent, and it could be tampered with. Is there any other way I can use Ajax safely without letting the user know what's being sent and where? Are there any better workarounds?
mail() should return immediately upon queuing the message to your mail server, which (generally) should take no time at all, so I think your problem here is definitely the mail server and not your PHP. I'm guessing that the mail server you're submitting to is running some anti-spam checks, like reverse DNS lookups. It might also be throttling you based on usage. Can you try sending through another mail server to verify?
Also, if you have shell access, try sending a message from the shell (e.g., echo test | mail -v -s test me@whatever.com) to see what it's doing and how long it takes.
Edit: Just noticed your comment re: Windows. In this case, at least you can try a telnet from the PHP server to port 25 on the mail host to see how long it takes to connect and get the 220 greeting header. (I bet you'll connect immediately but won't get the 220 header for a while.)
You're on to the right idea. Here's what I would do in this scenario:
The user makes a purchase (no ajax unless you want to at this point)
In processing that purchase, an email row is inserted into the email table with an "id" and a "sent" column, plus all the other details.
User is brought to a success page
The success page kicks off ajax in the background to send the email with that id from the db - and doesn't care about the result
The php page in charge of emailing sends the email and marks the sent column
If someone ever calls that endpoint with an id that hasn't been sent yet, that email needed to go out anyway, so it's fine. If the id doesn't match anything or the sent column is already marked true, you can ignore the request. No worries about people trying to send spam through your system.
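A minimal sketch of the endpoint the success page would call in the background; the emails table, its id, recipient, subject, body, and sent columns, and the database credentials are illustrative assumptions rather than anything from the original question.
$id = isset($_POST['id']) ? (int) $_POST['id'] : 0;

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'secret');
$stmt = $pdo->prepare('SELECT recipient, subject, body FROM emails WHERE id = ? AND sent = 0');
$stmt->execute([$id]);
$email = $stmt->fetch(PDO::FETCH_ASSOC);

// Unknown id or already sent: silently do nothing, so the endpoint cannot be abused.
if (!$email) {
    exit;
}

if (mail($email['recipient'], $email['subject'], $email['body'])) {
    $pdo->prepare('UPDATE emails SET sent = 1 WHERE id = ?')->execute([$id]);
}
Because the recipient, subject, and body all come from the row written during purchase processing, the Ajax call only ever passes an id, so the user never sees or controls the mail content.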