Hello, I need help with Laravel Excel 3.1 on Laravel 7.
I have to export an Excel file with around 300,000 records or more, and either download it directly or save it in the public folder for download.
With this package everything works fine on localhost, but on the server it returns a 500 error.
I set memory_limit = -1 and max_execution_time = 1800.
The 500 error takes a while to appear, around 10 minutes or less.
This is my code:
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;

class ProductsExport implements FromQuery
{
    use Exportable;

    protected $filters;

    public function __construct($filters)
    {
        $this->filters = $filters;
    }

    public function query()
    {
        return Product::query()
            ->where($this->filters)
            ->orderBy('name');
    }
}
Controller
(new ProductsExport($filters))->queue('products.xlsx');
return back()->withSuccess('Export started!');
In excel.php I have only changed chunk_size to 5000; I don't know what else needs to be configured there.
I know that I need to use a queue for big exports, and I did, but I don't know what else to do.
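For reference, a minimal sketch of a queued export setup, assuming a non-sync queue connection (database or redis) is configured and a worker is running (php artisan queue:work --timeout=1800); the NotifyUserOfCompletedExport job is only an example name, not something defined here:

// Controller: queue the export to a disk and chain a follow-up job.
(new ProductsExport($filters))
    ->queue('exports/products.xlsx', 'public') // second argument is the filesystem disk
    ->chain([
        new NotifyUserOfCompletedExport(auth()->user()), // hypothetical notification job
    ]);

return back()->withSuccess('Export started!');

With a queued export, Laravel Excel splits the work into chunk jobs based on the chunk_size from excel.php; one common cause of a 500 on the web request is that QUEUE_CONNECTION is still set to sync, in which case the whole export runs inside the web request anyway.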
Related
I want to import a CSV/XLSX file which has 50 columns and 1M rows.
I tried with the laravel-excel package and a Laravel queue, but somehow I can't import the data; Nginx times out. I already raised the max execution time in my PHP settings.
In the import controller:

public function import(Request $request)
{
    Excel::import(new LeadsImport, $request->file);
}

In LeadsImport:

public function collection(Collection $rows)
{
    dispatch(new ImportJob($rows));
}
You should not run this kind of heavy task from the browser, because you will hit a timeout or, worse, an out-of-memory error, due to the nature of web requests and the typical server configuration.
I would write a custom command that does the heavy work of importing the 1M rows, and if you use Eloquent, make use of its cursor or chunk methods.
Hope this helps.
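As a rough illustration of that suggestion (the command name, signature and column mapping are made up, and the import class is assumed to use Laravel Excel's chunk reading):

use Illuminate\Console\Command;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Facades\Excel;

// Console command, run as: php artisan leads:import /path/to/leads.xlsx
class ImportLeads extends Command
{
    protected $signature = 'leads:import {path}';
    protected $description = 'Import leads from a large spreadsheet';

    public function handle()
    {
        Excel::import(new LeadsImport, $this->argument('path'));
        $this->info('Import finished.');
    }
}

// Reading in chunks keeps memory usage flat even for 1M rows.
class LeadsImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        return new Lead([
            'name'  => $row[0], // column mapping is illustrative
            'email' => $row[1],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}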
I'm trying to migrate a Laravel 5 API with MySQL 5.7 to a Laravel 9 API with MySQL 8.
Almost everything is working well, except for a few queries that try to load records together with their parents, recursively.
On Laravel 5, I came up with the following solutions: Recursive Eloquent Models | Laravel ORM from self referencing table get N level hierarchy JSON
It was working like a charm, but on Laravel 9 I get an HTTP 500 error from Apache, with the following error in the logs:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 262144 bytes)
I tried at first to increase the memory limit in php.ini, but that made things worse: the server became very laggy, so I had to restart Apache and go back to the default 128M value. Also, on my Laravel 5 environment I did not need to increase memory.
I also suspected the MySQL 8 upgrade to be involved in this problem, but after connecting to my MySQL 5.7 database I had the same issue, so I think it comes from the way Laravel 9 loads relations.
Here is my model code:
<?php
namespace App\Models\Consommation;
use Illuminate\Database\Eloquent\Model;
class ConsoRequestAttribut extends Model
{
protected $table = 'conso_request_attribut';
public $incrementing = false;
protected $primaryKey = 'id_attribut';
public $timestamps = false;
const ERROR_DELETE_ATTRIBUTE = 1;
const SUCCESS_DELETE_ATTRIBUTE = 0;
protected $fillable = [
'id_attribut',
'code_type_attribut',
'valeur',
'id_parent_attribut'
];
public function parent_attribut() {
return $this->belongsTo('App\Models\Consommation\ConsoRequestAttribut', 'id_parent_attribut', 'id_attribut');
}
public function parent() {
return $this->parent_attribut()->with('parent');
}
...
}
So on my Laravel 9 app, if I remove the ->with('parent') in my parent() function, the query result is returned and I don't get an HTTP 500 error, so I think the problem is with the recursive loading.
Any idea?
Thanks
It is better to load nested relationships like this:
public function parent() {
return $this->with('parent_attribut.parent');
}
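Along the same lines, nested eager loading with dot notation can also be limited to a fixed depth from the calling side, which avoids the unbounded recursion of a parent() relation that eager loads itself (the three-level depth and the $id variable here are just an illustration):

// Eager load up to three ancestor levels only.
$attribut = ConsoRequestAttribut::with('parent_attribut.parent_attribut.parent_attribut')
    ->findOrFail($id);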
I did not succeed in loading the parent entity recursively with my ConsoRequestAttribut model, as I'm still stuck with the same memory problem, so it's not really "resolved".
As an alternative, in my ConsoRequestAttributRepository class I wrote a function that loads the parent of an entity recursively, which works perfectly:
public function retrieveRecursivelyConsoRequestAttributeParent(ConsoRequestAttribut $attribut)
{
    $parent = ConsoRequestAttribut::where('id_attribut', $attribut->id_parent_attribut)
        ->first();

    $attribut->parent = $parent;

    // Walk up the hierarchy until an attribute without a parent is reached.
    if ($parent !== null && $parent->id_parent_attribut !== null) {
        $this->retrieveRecursivelyConsoRequestAttributeParent($parent);
    }
}
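For completeness, a call site might look like this (how the repository instance is resolved is assumed):

$attribut = ConsoRequestAttribut::findOrFail($id);
$repository->retrieveRecursivelyConsoRequestAttributeParent($attribut);
// $attribut->parent, $attribut->parent->parent, ... are now populated.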
I'm using Laravel 5.8 and I tried creating a custom command like this:
php artisan make:command app exportappresults
And the Command goes like this:
protected $signature = 'app:exportappresults';
protected $description = 'Export App Exam Result Into Excel';
public function __construct()
{
parent::__construct();
}
public function handle()
{
return Excel::download(new TavanmandAppUserExamExport,'userexamlist.xlsx');
}
So as you can see, I have used Laravel Excel and tried exporting data into an Excel file.
Note that this code works fine in a controller and properly exports an Excel file.
But now I don't know where the exported Excel file goes when I run it from the console command.
So if you know, please let me know...
Thanks.
Excel::download is used in HTTP controllers to pass exported data to the HTTP response. If you need to store the file on disk, use Excel::store instead:
public function handle()
{
Excel::store(new TavanmandAppUserExamExport, 'userexamlist.xlsx');
}
The file will be stored on the default storage disk configured in Laravel's config/filesystems.php; by default that is ./storage/app.
You may also specify another disk as the third argument of the Excel::store method, see "Storing exports on disk" in the documentation.
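For example, to put the file on the public disk (the exports/ sub-folder is just an illustration):

Excel::store(new TavanmandAppUserExamExport, 'exports/userexamlist.xlsx', 'public');
// With the default filesystems config this ends up in storage/app/public/exports/userexamlist.xlsx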
I would like to know whether re-queueing a Laravel job is a bad idea or not. I have a scenario where I need to pull users' posts from Facebook once they have connected their Facebook account to my application, and I want to pull {x} days of historic data. The Facebook API, like any other API, limits requests per minute. I keep track of the request headers, and once the rate limit is reached I save that information in the database; on each re-queue I check whether I am eligible to make another call to the Facebook API.
Here is the code snippet, for better visualization:
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class FacebookData implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public $timeout = 120;

    public $userid;

    public function __construct($id)
    {
        $this->userid = $id;
    }

    public function handle()
    {
        // $fbhelper is the Facebook API helper used elsewhere in the app
        // (its creation/injection is omitted here, as in the original snippet).
        if ($fbhelper->canPullData()) {
            $res = $fbhelper->getData($this->userid);

            if ($res['code'] == 429) {
                // Rate limited: remember the Retry-After info and re-queue the job.
                $fbhelper->storeRetryAfter($res);
                self::dispatch($this->userid);
            }
        }
    }
}
The above snippet is a rough idea. Is this a good idea? The reason I post this question is that the self::dispatch() call looks like recursion, and the job will keep retrying until $fbhelper->canPullData() returns true, which will probably take about 6 minutes. I am worried about the impact this could have on my application. Thanks in advance.
Retrying a job is not a bad idea; it is built into the job design already. Laravel has retries for exactly this reason, so that jobs can perform unreliable operations.
As an example, in a project I have been working on, an external API we integrate with returns 1-5 HTTP 500 errors per 100 requests we send. This is handled by Laravel's built-in retry functionality.
As of Laravel 5.4 you can set the number of tries on the class like so; this will do exactly what you want without you having to define the logic yourself. Finally, for spacing out the retries, you can define a retryAfter() method, which specifies how long to wait before the job is retried.
class FacebookData {
public $tries = 5;
public function retryAfter() {
//wait 6 minutes
return 360;
}
}
If you want to keep your logic where you only retry on 429 errors, I would use the inverse of that and delete the job if it is anything other than a 429:
if ($res['code'] !== 429) {
$this->delete();
}
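As a related option, instead of dispatching a brand-new job on a 429, the same job can be released back onto the queue with a delay (the retry_after key is illustrative, taken from whatever your helper stores):

if ($res['code'] == 429) {
    // Put this job back on the queue and retry it after the given number of seconds.
    $this->release($res['retry_after'] ?? 360);
}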
I'm deploying my application to my production environment and it's not working as expected. I've narrowed the issue down to one line inside this loop in my controller:
foreach($temp_table_data as $a_payment) {
//array_push($payments, $a_payment->payment); //big collection object
array_push($payments, $a_payment->payment->first()->attributesToArray()); //smaller object
}
The error I get is Call to a member function attributesToArray() on a non-object. This seems crazy to me because, as the old saying goes, it works fine on my machine.
My dev environment is Ubuntu trusty64 with PHP 5.5.21 and my production is Red Hat Linux with PHP 5.5.11. I thought these differences were very minor (maybe I'm wrong?).
If I do a print_r($temp_table_data) then I get a big collection returned, the same on both servers. So at some point it just stops liking either payment (that's a method) or first().
Here is part of my TempTable.php model, with the payment method:
public function payment(){
return $this->hasMany('App\Models\Payment', 'Vendor ZIP', 'postcode');
}
And my Payment.php model (part of it):
class Payment extends Model {
protected $table = 'headquarters_data';
public function tempTable()
{
return $this->belongsTo('App\Models\TempTable', 'postcode', 'Vendor ZIP');
}
One of the TempTable models doesn't have a Payment, so the attributesToArray() call is failing.
Try this and see if it works.
foreach($temp_table_data as $a_payment) {
$payment = $a_payment->payment->first();
if(!is_null($payment)){
$payments[] = $payment->attributesToArray();
}
}
The problem is that in production the data in your database is probably different, so calling the first() method returns null, and then you are trying to call attributesToArray() on null, which is wrong!
You should do a null check before calling attributesToArray(). Note that isset() cannot be used directly on the result of a function call, so store the result in a variable first:

$payment = $a_payment->payment->first();
if ($payment !== null) {
    array_push($payments, $payment->attributesToArray());
}