Control Excel export - PHP

So basically I have a Laravel project and I need to export tables to Excel. I'm using the maatwebsite/excel package for the export. So far I only export from an array, or in my case:
public function exportGames()
{
    $export = Games::all();

    Excel::create('Games Data', function($excel) use($export) {
        $excel->sheet('Games', function($sheet) use($export) {
            $sheet->fromArray($export);
        });
    })->export('xlsx');
}
This, however, returns the full table data, including some fields I want gone (timestamps etc.). It also can't display related tables (those only show in the actual table on the page). I've spent hours in the documentation and still don't understand how to do it. I want to rip my hair out. How do I control which columns are exported? Thanks in advance.

Seems simple enough:
$export = Games::select(['client_game_id', 'client_game_name', 'gameid', 'description', 'game_type', 'db_name', 'short_name'])->get();
You'll only see those columns in your exported file.

You can do it this way:
$export = Games::all();

Excel::create('Games Data', function($excel) use($export) {
    $excel->sheet('Games', function($sheet) use($export) {
        $exports = []; // build rows containing only the fields you want
        foreach ($export as $key => $value) {
            $exports[] = array('email' => $value['email'], 'name' => $value['name']);
        }
        $sheet->fromArray($exports);
    });
})->export('xlsx');
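Neither answer above touches the related-tables part of the question. As a rough sketch of one way to handle it (publisher here is a hypothetical relation name, not taken from the original code): eager-load the relation and build each row by hand, pulling columns off the related model.

$export = Games::with('publisher')->get();

Excel::create('Games Data', function($excel) use($export) {
    $excel->sheet('Games', function($sheet) use($export) {
        $rows = [];
        foreach ($export as $game) {
            $rows[] = array(
                'Game' => $game->client_game_name,
                'Type' => $game->game_type,
                // 'publisher' is a hypothetical relation; guard against
                // games that have no related record.
                'Publisher' => $game->publisher ? $game->publisher->name : '',
            );
        }
        $sheet->fromArray($rows);
    });
})->export('xlsx');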

Related

maatwebsite~2.1.0: Datetime column in exported xls file should have Excel date formatting

I am trying to export a time column in Excel time format, but for some reason it doesn't work.
$array_tytan_info_ntt = TYTAN_INFO_NTT::whereDate('fec_startdate', '>=', $variable)
    ->selectRaw(" TIME(fec_startdate) as 'STARTDATE1' ")
    ->get();

$excel_tytan_info_ntt = json_decode(json_encode($array_tytan_info_ntt), true);

Excel::create('ntts_mes_'.Carbon::now()->toDateTimeString(), function($excel) use($excel_tytan_info_ntt) {
    $excel->sheet('nttsmes', function($sheet) use($excel_tytan_info_ntt) {
        $sheet->setColumnFormat(array(
            'A' => 'hh:mm:ss',
        ));
        $sheet->fromArray($excel_tytan_info_ntt);
    });
})->export('xls');
As you can see, the value comes back as a string. The expected behaviour is for Excel to treat it as a real time value.
Many thanks for any help. Regards.
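No answer was recorded for this one, but note that setColumnFormat() only controls how a cell is displayed; the TIME() result arrives as a string, so there is no numeric value for Excel to format. A minimal, untested sketch that converts each value to the fraction-of-a-day number Excel uses for times (assuming STARTDATE1 is always an HH:MM:SS string), placed just before the fromArray() call:

foreach ($excel_tytan_info_ntt as &$row) {
    // Excel stores times as fractions of an 86400-second day.
    list($h, $m, $s) = explode(':', $row['STARTDATE1']);
    $row['STARTDATE1'] = ($h * 3600 + $m * 60 + $s) / 86400;
}
unset($row); // drop the reference left over from the loop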

How to export a child array element to Excel in Laravel

I'm building an application in Laravel 5.5 with the maatwebsite/excel plugin, where I have an array to be exported into a CSV file. I get the values fine when I assign a single string to each key inside the array, but currently the array contains a child array element, and including it gives me a blank Excel sheet.
Without array elements:
public function investor(Request $request)
{
    $record_set = [];
    $tables = json_decode($request->investorTable);

    foreach ($tables as $table) {
        $record_set[] = array(
            'schedule'   => $table->schedule,
            'Event type' => $table->event_type,
            'Venue'      => $table->venue,
            'Nature'     => $table->nature,
        );
    }

    return Excel::create('investor', function($excel) use ($record_set) {
        $excel->sheet('mySheet', function($sheet) use ($record_set) {
            $sheet->fromArray($record_set);
        });
    })->download('csv');
}
This works perfectly, but I also want to include an array element whose name key I want to export. I don't know how to implement it; when I do, it gives me a blank sheet:
$record_set[] = array(
    'schedule'       => $table->schedule,
    'Company Name'   => $table->companyName,
    'Contact Person' => $table->contact_participants,
    'Event type'     => $table->event_type,
    'Venue'          => $table->venue,
    'Nature'         => $table->nature,
);
My complete array element looks something like this, and the table in my HTML looks something like this (screenshots not reproduced here). I want exactly the same output in Excel. Help me out with this.
Thanks.
You can use:

// Export to Excel5 (xls)
Excel::create('Filename', function($excel) {
    // ...
})->export('xls');
// or
->download('xls');

// Export to Excel2007 (xlsx)
->export('xlsx');
// or
->download('xlsx');

// Export to CSV
->export('csv');
// or
->download('csv');
Please refer to the documentation:
http://www.maatwebsite.nl/laravel-excel/docs/export
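The documentation excerpt above doesn't explain the blank sheet itself. A plausible cause is that fromArray() cannot render a nested array as a single cell, so one hedged fix is to flatten the child element into a string first; this sketch assumes contact_participants is an array of plain strings:

$record_set[] = array(
    'schedule'       => $table->schedule,
    'Company Name'   => $table->companyName,
    // Flatten the child array to one comma-separated cell. The (array)
    // cast and implode() assume an array of plain strings.
    'Contact Person' => implode(', ', (array) $table->contact_participants),
    'Event type'     => $table->event_type,
    'Venue'          => $table->venue,
    'Nature'         => $table->nature,
);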

How to import Excel sheets into a Symfony Doctrine entity

I want to import an Excel sheet into my DB using Symfony/Doctrine (I imported the ddeboer data-import bundle).
What is the best practice for importing the data while first checking whether it has already been imported?
I was thinking of two possibilities:
1)
$numarray = $repo->findAllAccounts();

foreach ($reader as $readerobjectkey => $readervalue) {
    $import = true; // reset for every row, not once before the loop
    foreach ($numarray as $numkey) {
        if ($numkey->getNum() == $readervalue['number']) {
            $import = false;
        }
    }
    if ($import) {
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $readervalue['number'],
                    'name'    => $readervalue['name'],
                    'company' => $companyid,
                )
            )
            ->finish();
    }
}
2)
foreach ($reader as $row => $value) {
    // check if already imported
    $check = $this->checkIfExists($repo, 'num', $value['number']);
    if ($check) {
        echo $value['number']." Exists <br>";
    } else {
        echo $value['number']." newly Imported <br>";
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $value['number'],
                    'name'    => $value['name'],
                    'company' => $companyid,
                )
            )
            ->finish();
    }
}

public function checkIfExists($repo, $field, $value)
{
    $check = $repo->findOneBy(array($field => $value));
    return $check;
}
The problem is that with big Excel sheets (3000+ rows) I get a timeout with both solutions:
Error: Maximum execution time of 30 seconds exceeded
In general, for performance: is it preferable to generate 1000 queries to check whether a value exists (findOneBy), or to use two foreach loops to compare values?
Any help would be awesome!
Thanks in advance...
You can try checking the filemtime of the file: http://php.net/manual/en/function.filemtime.php
I'm not sure it would work properly, but it's worth a shot to see if the modified date behaves as expected.
Otherwise you should think of another way than checking the data like this; it takes a lot of resources. Maybe add some metadata to the Excel file:
http://docs.typo3.org/typo3cms/extensions/phpexcel_library/1.7.4/manual.html#_Toc237519906
Any approach other than looping or querying the database row by row is better for large data.
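To answer the either/or question directly: neither. A third option, sketched below with the findAllAccounts()/getNum() methods from the question, is to load the existing numbers once, flip them into a hash set, and check each row in O(1) instead of issuing a query or running an inner loop per row:

$existing = array();
foreach ($repo->findAllAccounts() as $account) {
    $existing[$account->getNum()] = true;
}

foreach ($reader as $readervalue) {
    if (isset($existing[$readervalue['number']])) {
        continue; // already imported
    }
    $doctrineWriter->disableTruncate()
        ->prepare()
        ->writeItem(
            array(
                'num'     => $readervalue['number'],
                'name'    => $readervalue['name'],
                'company' => $companyid,
            )
        )
        ->finish();
    $existing[$readervalue['number']] = true; // also catches duplicates within the file
}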

Update a JSON file dynamically whenever MySQL is updated

I am going to create a live scoreboard using PHP and MySQL, but I want a JSON file which updates automatically whenever I update my table in the database; that is, new scores added to the tables should be added to my JSON file too. I'm really confused; sorry for not having much code. Hoping for some solutions, thank you. I have some code which I use to insert into the database:
$data = array(
    'name'    => $name,
    'score'   => $score,
    'comment' => $comment
);

$result = $this->db->insert('score', $data);
I don't quite understand what you want, but I developed this code; see if it is what you need.
https://gist.github.com/juniorb2ss/7431040
Example:

public function __construct()
{
    $this->load->helper('json');
}

public function index()
{
    $data = array(
        'name'    => $name,
        'score'   => $score,
        'comment' => $comment
    );

    save($data);

    $result = $this->db->insert('score', $data);
}
The file is created at application/cache/score.cache. Its content is all the scores in the application.
You can insert a new score or update one, and the file will be updated.
Example cache content:
{"teste":{"name":"teste","score":"32","comment":"score comment"}}
The most direct solution would be to dump the whole of your score table as a JSON object on each insert. The drawback of this approach is that if you have a large amount of data, you'll be selecting all of it on every insert.
function SelectScores()
{
    $query = $this->db->query('SELECT * FROM score');
    return $query->result();
}
You can then json_encode this result and save it as a file.
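For completeness, a minimal sketch of that last step, reusing the SelectScores() helper above (the score.json path is illustrative); call it right after each successful insert:

function SaveScoresAsJson()
{
    // Re-dump the whole table on every call; fine for a small scoreboard.
    $scores = $this->SelectScores();
    file_put_contents('application/cache/score.json', json_encode($scores));
}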

What is the best way to define a large array of data in PHP?

I have a PHP application which requires large 2D arrays of hard-coded data. I currently just have this defined in a script which is included at the start of every script execution.
I would like to know if anyone has a better idea for how to do this.
My script looks something like the following, except that I have many more rows and each row has many more fields.
function getData() {
    return array(
        1 => array('name'=>'a something', 'price'=>123, 'field1'=>1, 'field2'=>3, 'field3'=>2),
        2 => array('name'=>'b something', 'price'=>123, 'field1'=>3, 'field2'=>3, 'field3'=>2),
        3 => array('name'=>'c something', 'price'=>234, 'field1'=>2, 'field2'=>3, 'field3'=>2),
        4 => array('name'=>'d something', 'price'=>345, 'field1'=>8, 'field2'=>3, 'field3'=>2),
        5 => array('name'=>'e something', 'price'=>655, 'field1'=>12, 'field2'=>3, 'field3'=>2),
        6 => array('name'=>'f something', 'price'=>124, 'field1'=>11, 'field2'=>3, 'field3'=>2),
    );
}
Each row has the same fields. So it is very much like a DB table result set. I suppose I could put it in a DB table, but I find this script easier to edit and I would think it's much faster to run than querying a DB.
The problem with my current solution is that it can be hard to read because there are so many fields. What I would like is a system that is easy to read and edit, but it must also be very fast.
Would a DB table be better? Or perhaps reading in a CSV file from a spreadsheet?
As a general idea, write your data in any way that's convenient, possibly CSV or something similar. Create a function that can process that data into the format you need, and cache the processed result so this doesn't have to be done on every request.
function getData() {
    $cache = 'data.dat';
    if (file_exists($cache)) {
        return unserialize(file_get_contents($cache));
    }

    $data = <<<DATA
a something, 123, 1, 3, 2
b something, 123, 3, 3, 2
...
DATA;

    $data = array_map(function ($row) {
        $row = array_map('trim', explode(',', $row));
        return array_combine(array('name', 'price', 'field1', 'field2', 'field3'), $row);
    }, explode("\n", $data));

    file_put_contents($cache, serialize($data));
    return $data;
}
In the above example I'm just using a HEREDOC string to store the data. You may want to put that into an external CSV file, or an XML file that you can generate automatically from somewhere or whatever fits your needs.
If it's very large, storing it in a database may be a better idea. Something lightweight like SQLite will do.
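If you go that route, here is a rough sketch of the SQLite option using PDO (the file name and table layout are illustrative, not prescribed):

$db = new PDO('sqlite:' . __DIR__ . '/data.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS items (
    id INTEGER PRIMARY KEY, name TEXT, price INTEGER,
    field1 INTEGER, field2 INTEGER, field3 INTEGER
)');

// Fetch a single row by key instead of loading the whole array into memory.
$stmt = $db->prepare('SELECT * FROM items WHERE id = ?');
$stmt->execute(array(3));
$row = $stmt->fetch(PDO::FETCH_ASSOC);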
