Issues with sending large email with Laravel job - php

This is my controller that gets over 50k contact emails, iterates over them, and batches them to the queue.
This is the job that sends the email.
The problem is that the emails send fine to fewer than 500 contacts, but anything above 1000 contacts results in a maximum execution timeout.
How can I solve this problem?

This statement is going to retrieve and store all 50k of your contacts in memory:
$contacts = Contact::where(...)->where(...)->get();
I suggest chunking the results to avoid memory exhaustion.
Back to your problem: I am thinking about chunking the results (about 500 each) and dispatching an intermediary job that will eventually send the emails.
class SendEmailsInChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public $batch, public $contacts) {}

    public function handle()
    {
        foreach ($this->contacts as $contact) {
            $this->batch->add(new BroadCampaignJob(..., $contact, ...));
        }
    }
}
Then you can chunk the results and dispatch the above job with each chunk (a sketch of how $batch itself might be created follows after the snippet):
$batch = ...;

Contact::where(...)->chunk(500, function ($contacts) use ($batch) {
    SendEmailsInChunk::dispatch($batch, $contacts);
});
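For context, here is a minimal sketch of how the elided $batch = ...; might be created with Laravel's job batching; this is an assumption about the setup, not part of the original answer (the batch name and the where-constraint are placeholders, and it presumes the job batches table has been migrated). One caveat: a Batch instance does not always serialize cleanly into a queued job's payload, so passing $batch->id and re-resolving the batch with Bus::findBatch() inside SendEmailsInChunk::handle() is a common, safer variant.
use App\Jobs\SendEmailsInChunk;          // assumed job namespace
use App\Models\Contact;                  // assumed model namespace
use Illuminate\Support\Facades\Bus;

// Create the batch up front; the chunk jobs then add the individual
// BroadCampaignJob instances to it as they are processed.
$batch = Bus::batch([])
    ->name('broadcast-campaign')         // placeholder name
    ->allowFailures()
    ->dispatch();

Contact::where('subscribed', true)       // placeholder constraint
    ->chunk(500, function ($contacts) use ($batch) {
        SendEmailsInChunk::dispatch($batch, $contacts);
    });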

Related

Laravel Excel queued export failing

I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as mentioned in the docs. It works fine when I export a smaller amount of rows, but when I try to do 60-80k rows, it fails every time. While the jobs are being processed, I watch the temp file that is created, and I can see that the size of the file is increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. It seems that the jobs take incrementally more time until the job fails. I don't get why the first several jobs are quick, and then they start taking more and more time to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And then my controller to create the export
(new NewExport($client, $year))->queue('public/exports/' . $name)->allOnQueue('exports')->chain([
    new NotifyUserOfCompletedExport($request->user(), $name),
]);
I'm using:
Laravel 5.8,
PHP 7.2,
Postgresql 10.10
I should also mention that I have played around with the chunk size a bit, but in the end I've always run into the same problem. I tried chunk sizes of 500, 2000, 10000 but no luck.
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also got exceptions for InvalidArgumentException File '/path/to/temp/file' does not exist. I'm not quite sure what else to do. I guess I could make it so it doesn't timeout, but that seems like it will just cause more problems. Any help would be appreciated.
EDIT
Here is the content of my Export Class:
class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    private $client;
    private $year;

    public function __construct($client, $year)
    {
        $this->year = $year;
        $this->client = $client;
    }

    public function query()
    {
        return $this->getDataQuery();
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];

        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];

        foreach ($row as $key => $value) {
            if (is_bool($value)) {
                if ($value) {
                    $mapping[$key] = "Yes";
                } else {
                    $mapping[$key] = "No";
                }
            } else {
                $mapping[$key] = $value;
            }
        }

        return $mapping;
    }

    private function getDataQuery()
    {
        return \DB::table('my_table')->orderBy('my_field');
    }
}
The NotifyUserOfCompletedExport class just creates a job that emails the logged-in user that the export is finished, with a link to download it.
class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, and I am still getting the InvalidArgumentException File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder, though, because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.
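One setting worth checking here, offered as a guess rather than a confirmed diagnosis: with the database queue driver, retry_after on the queue connection in config/queue.php must be longer than the slowest job, otherwise the worker hands a still-running job out again and it eventually fails with MaxAttemptsExceededException. A sketch of the relevant block (600 is an arbitrary example value):
// config/queue.php
'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        // Should comfortably exceed both the supervisor --timeout value
        // and the longest-running export job.
        'retry_after' => 600,
    ],
],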

Sending bulk email using laravel queue

We are trying to send bulk email (100k) with the PHP Laravel framework. Which is the correct way to send bulk email with the Laravel queue?
Case 1.
//controller
public function runQueue() {
    dispatch(new ShootEmailJob());
}

//job
public function handle() {
    $emails = EmailList::get(['email']);
    foreach ($emails as $email) {
        Mail::to($email)->send(new ShootMail($email->email));
    }
}
Case 2.
//controller
public function runQueue() {
    $emailList = EmailList::get(['email']);
    foreach ($emailList as $emailAddress) {
        dispatch(new ShootEmailJob($emailAddress->email));
    }
}

//job (the address is stored on the job by its constructor)
public function handle() {
    Mail::to($this->emailAddress)->send(new ShootMail($this->emailAddress));
}
Which one is the correct approach case 1 or case 2?
The first approach will first fetch all emails and then send them one by one in one "instance" of a job that is run as a background process if you queue it.
The second approach will run n "instances" of jobs, one for each email on the background process.
So performance-wise option 1 is the better approach. You could also wrap each individual send in a try-catch block so that the job does not stop when one of the emails fails, e.g.:
$emails = EmailList::get(['email']);

foreach ($emails as $email) {
    try {
        Mail::to($email)->send(new ShootMail($email->email));
    } catch (\Exception $e) {
        // Log the error and flag this address for a retry,
        // then carry on with the remaining emails.
        continue;
    }
}

How to unlink files after they have been attached and sent? (When using Mail::queue)

I switched from sending my mails immediately to adding them to the queue. Here is my code; $attachments is an array of temporary paths. I've commented out what I've tried, which throws errors about files not existing.
Mail::queue($view, $data, function (\Illuminate\Mail\Message $message) use ($mail, $attachments) {
    foreach ($mail->getRecipients() as $recipient) {
        $message->to($recipient);
    }

    $message->subject($mail->getSubject());

    foreach ($attachments as $attachment) {
        $message->attach($attachment);
        //this deletes the attachment before being sent
        //unlink($attachment);
    }
});

/* This code only works when using Mail::send() instead of Mail::queue()
foreach ($attachments as $attachment) {
    unlink($attachment);
}
*/
Basically, I want to clean up and remove my temporary attachments after the mail has been sent. I am guessing this would not work with the out-of-the-box Laravel mail solutions. How can I trigger code after a queued mail has been sent?
Expanding on Deric Lima's answer a bit, you don't necessarily need a new Job class for this. You can do it with a Mailable object as well. Just override the send method.
/**
 * @param MailerContract $mailer  (MailerContract is Illuminate\Contracts\Mail\Mailer)
 */
public function send(MailerContract $mailer)
{
    parent::send($mailer);

    // $this->clearAttachments is something you can define in your constructor,
    // making it the responsibility of whatever is applying the attachment
    // to know whether it needs to remain intact after the email is transmitted.
    if ($this->clearAttachments) {
        foreach ($this->attachments as $attachment) {
            if (\File::exists($attachment['file'])) {
                \File::delete($attachment['file']);
            }
        }
    }
}
Personally, I'd make a BaseMailable class that all other Mailable classes extend, as opposed to extending Illuminate\Mail\Mailable directly. Then you don't even have to worry about it from then on.
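A minimal sketch of what such a base class could look like, reusing the overridden send() from above; the class name and the $clearAttachments default are illustrative rather than taken from any package:
use Illuminate\Contracts\Mail\Mailer as MailerContract;
use Illuminate\Mail\Mailable;

abstract class BaseMailable extends Mailable
{
    // Concrete mailables flip this when their attachments are temporary files.
    protected $clearAttachments = false;

    public function send(MailerContract $mailer)
    {
        parent::send($mailer);

        if ($this->clearAttachments) {
            foreach ($this->attachments as $attachment) {
                if (\File::exists($attachment['file'])) {
                    \File::delete($attachment['file']);
                }
            }
        }
    }
}
Concrete mailables then extend BaseMailable instead of Mailable and opt in by setting $clearAttachments to true.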
You have to wait until the queue is processed before removing the file.
Without knowing the implementation details of the queue it is hard to answer your question, but if your queue is processed before the script ends, you can use register_shutdown_function (http://www.php.net/manual/en/function.register-shutdown-function.php) to run a cleanup function that removes the file:
register_shutdown_function(function () use ($filename) {
    if (file_exists($filename)) {
        unlink($filename);
    }
});
I had a similar problem and I solved it using Laravel Jobs. Basically, you can create a Job class to send the email:
class MailJob extends Job implements SelfHandling, ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    public function handle()
    {
        // $view, $data, $mail and $attachments are assumed to be passed to the
        // job's constructor and stored as properties.
        $mail = $this->mail;
        $attachments = $this->attachments;

        Mail::send($this->view, $this->data, function (\Illuminate\Mail\Message $message) use ($mail, $attachments) {
            foreach ($mail->getRecipients() as $recipient) {
                $message->to($recipient);
            }

            $message->subject($mail->getSubject());

            foreach ($attachments as $attachment) {
                // Only attach here; unlinking inside the closure would delete
                // the file before the mailer has read it.
                $message->attach($attachment);
            }
        });

        // Mail::send() has completed synchronously inside this queued job,
        // so the temporary files can be removed now.
        foreach ($attachments as $attachment) {
            unlink($attachment);
        }
    }
}
And then you just dispatch the job from the controller from which you want to send the email:
$this->dispatch(new MailJob());
P.S.: The job runs asynchronously in the background, so I used Mail::send instead of Mail::queue.

Fastest or most robust way to make 7 soap api requests in parallel

My web app requires making 7 different SOAP WSDL API requests to complete one task (I need the users to wait for the result of all the requests). The average response time is 500 ms to 1.7 seconds for each request. I need to run all these requests in parallel to speed up the process.
What's the best way to do that:
pthreads or
Gearman workers
fork process
curl multi (I have to build the XML SOAP body)
Well, the first thing to say is that it's never really a good idea to create threads in direct response to a web request; think about how far that will actually scale.
If you create 7 threads for everyone who comes along and 100 people turn up, you'll be asking your hardware to execute 700 threads concurrently, which is quite a lot to ask of anything really...
However, scalability is not something I can usefully help you with, so I'll just answer the question.
<?php
/* the first service I could find that worked without authorization */
define("WSDL", "http://www.webservicex.net/uklocation.asmx?WSDL");

class CountyData {
    /* this works around simplexmlelements being unsafe (and shit) */
    public function __construct(SimpleXMLElement $element) {
        $this->town = (string) $element->Town;
        $this->code = (string) $element->PostCode;
    }

    public function run() {}

    protected $town;
    protected $code;
}

class GetCountyData extends Thread {
    public function __construct($county) {
        $this->county = $county;
    }

    public function run() {
        $soap = new SoapClient(WSDL);

        $result = $soap->getUkLocationByCounty(array(
            "County" => $this->county
        ));

        foreach (simplexml_load_string(
            $result->GetUKLocationByCountyResult) as $element) {
            $this[] = new CountyData($element);
        }
    }

    protected $county;
}

$threads = [];
$thread = 0;
$threaded = true; # change to false to test without threading

$counties = [ # will create as many threads as there are counties
    "Buckinghamshire",
    "Berkshire",
    "Yorkshire",
    "London",
    "Kent",
    "Sussex",
    "Essex"
];

while ($thread < count($counties)) {
    $threads[$thread] =
        new GetCountyData($counties[$thread]);
    if ($threaded) {
        $threads[$thread]->start();
    } else $threads[$thread]->run();
    $thread++;
}

if ($threaded)
    foreach ($threads as $thread)
        $thread->join();

foreach ($threads as $county => $data) {
    printf(
        "Data for %s %d\n", $counties[$county], count($data));
}
?>
Note that the SoapClient instance is not, and cannot be, shared between threads; this may well slow you down, so you might want to enable caching of WSDLs...
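For reference, a small sketch of how WSDL caching is usually enabled, either per client via the cache_wsdl option or globally via php.ini; these are stock PHP settings, not something from the answer above:
// Per client: cache the parsed WSDL on disk and in memory.
$soap = new SoapClient(WSDL, [
    'cache_wsdl' => WSDL_CACHE_BOTH,
]);

// Or globally, e.g. in php.ini:
// soap.wsdl_cache_enabled = 1
// soap.wsdl_cache_dir     = "/tmp"
// soap.wsdl_cache_ttl     = 86400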

Polling Azure queue returns Broken pipe

When long polling Azure Queue Storage with azure-sdk-for-php, if my requests are more than 30 seconds apart, the library dies with this error:
PHP Notice: fwrite(): send of 277 bytes failed with errno=32 Broken pipe in ..../vendor/pear-pear.php.net/HTTP_Request2/HTTP/Request2/SocketWrapper.php on line 188
If I set the sleep function to 30 seconds, everything goes great, but I'm making LOTS of requests that I don't need.
My worker's code:
use WindowsAzure\Common\ServicesBuilder;
use WindowsAzure\Common\ServiceException;

class Worker_Task {

    public $queueRestProxy;
    public $servicesBuilder;
    public $connectionString;

    public function __construct() {
        $this->connectionString = Config::get('azure.connection_string');
        $this->servicesBuilder = ServicesBuilder::getInstance();
        $this->queueRestProxy = $this->servicesBuilder->createQueueService($this->connectionString);
    }

    public function emails() {
        $this->write('Processing mails...');

        while (true) {
            $this->queueRestProxy = $this->servicesBuilder->createQueueService($this->connectionString);

            // Get message.
            $listMessagesResult = $this->queueRestProxy->listMessages("emails");
            $messages = $listMessagesResult->getQueueMessages();

            foreach ($messages as $message) {
                // Process Message
                $content = json_decode($message->getMessageText());

                // Dispatch Email

                // Delete Job
                $this->deleteJob('emails', $message->getMessageId(), $message->getPopReceipt());
            }

            sleep(30);
        }
    }
}
This looks like an overload or timeout problem: the PHP library can't write to the Azure socket. You are probably exceeding Azure load limits with the infinite while(true) { ... } loop.
Don't create the queue service on each loop iteration, because you are closing and opening the PEAR socket endlessly.
Try moving this line:
$this->queueRestProxy = $this->servicesBuilder->createQueueService($this->connectionString);
so that it runs once, before while(true) {:
$this->write('Processing mails...');

$this->queueRestProxy = $this->servicesBuilder->createQueueService($this->connectionString);

while (true) {
    // Get message.
If that doesn't help, then the azure-sdk-for-php library probably uses the socket mechanism incorrectly; create a new issue at https://github.com/WindowsAzure/azure-sdk-for-php OR use this approach: AzurePHP - Polling an Azure Queue
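As a rough sketch of what such a polling loop often boils down to (an assumption about the general shape, not the content of the linked post): create the proxy once, and back off while the queue is empty while staying under the roughly 30-second gap after which the socket was observed to break.
$this->queueRestProxy = $this->servicesBuilder->createQueueService($this->connectionString);

$delay = 1;

while (true) {
    $messages = $this->queueRestProxy->listMessages("emails")->getQueueMessages();

    if (count($messages) === 0) {
        // Queue is empty: back off, but stay below the ~30 s idle limit.
        sleep($delay);
        $delay = min($delay * 2, 25);
        continue;
    }

    $delay = 1;

    foreach ($messages as $message) {
        $content = json_decode($message->getMessageText());
        // ... dispatch the email ...
        $this->deleteJob('emails', $message->getMessageId(), $message->getPopReceipt());
    }
}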
