PHPExcel - data looping? - php

I have an array of arrays of data.
So the basic format is:
$sheet = array(
    array(
        'a1 data',
        'b1 data',
        'c1 data',
        'd1 data',
    ),
    array(
        'a2 data',
        'b2 data',
        'c2 data',
        'd2 data',
    ),
    array(
        'a3 data',
        'b3 data',
        'c3 data',
        'd3 data',
    )
);
When I am passed the array I have no idea how many columns or rows there will be.
What I want to do is use PHPExcel to create an Excel sheet out of the array.
From what I have seen, the only way to set data is to use
$objPHPExcel->getActiveSheet()->setCellValue('A1', $value);
So my question is:
How would you loop over the cells?
Remember that there could be, say, 30 columns and 70 rows, which would run out to cell AD70. So how do you loop over that?
Or is there a built-in function to turn an array into a sheet?

You can set the data from an array like so:
$objPHPExcel->getActiveSheet()->fromArray($sheet, null, 'A1');
fromArray() works with 2D arrays: the first argument is the 2D array, the second argument is the value to write wherever an array entry is null, and the last argument is the cell where the top-left corner of the data should go.
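For instance, here is a small sketch of those last two arguments (the 'N/A' placeholder and the B2 start cell are made-up values for illustration):
// Hypothetical data with a missing value; 'N/A' is written wherever the source is null,
// and the block is anchored with its top-left corner at B2 instead of A1.
$data = array(
    array('a1 data', null, 'c1 data'),
    array('a2 data', 'b2 data', 'c2 data'),
);
$objPHPExcel->getActiveSheet()->fromArray($data, 'N/A', 'B2');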
Otherwise you would need to loop through the data:
$worksheet = $objPHPExcel->getActiveSheet();
foreach ($sheet as $row => $columns) {
    foreach ($columns as $column => $data) {
        // PHPExcel's column index is 0-based and its row index is 1-based,
        // so the 0-based array keys can be used directly for the column.
        $worksheet->setCellValueByColumnAndRow($column, $row + 1, $data);
    }
}

$rowID = 1;
foreach ($sheet as $rowArray) {
    $columnID = 'A';
    foreach ($rowArray as $columnValue) {
        $objPHPExcel->getActiveSheet()->setCellValue($columnID . $rowID, $columnValue);
        // PHP's string increment rolls 'Z' over to 'AA', so 30 columns ends at 'AD'.
        $columnID++;
    }
    $rowID++;
}
or
$objPHPExcel->getActiveSheet()->fromArray($sheet);

As far as I know, you don't need to know the height and width of the final Excel worksheet when populating it with PHPExcel. You can just use a nested foreach loop to read all the nested arrays, build up the cell references with suitable variables, and add the values to the worksheet. The library will handle the sheet size.
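For completeness, here is a minimal end-to-end sketch assuming the classic PHPExcel 1.8 API (the include path and output file name are illustrative, not from the question):
require_once 'PHPExcel.php';

$objPHPExcel = new PHPExcel();
$worksheet = $objPHPExcel->getActiveSheet();

// $sheet is the 2D array from the question; fromArray() sizes the worksheet
// automatically, however many rows and columns the array happens to have.
$worksheet->fromArray($sheet, null, 'A1');

// Write the workbook out as an .xlsx file.
$writer = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$writer->save('export.xlsx');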

Related

Why is the Laravel Eloquent insert function missing values after 50k entries while using chunks?

I have an Excel sheet that has more than 1 million rows in it. I want to insert and update these rows of data in a table. I used the "Maatwebsite" Excel library and imported the Excel file into a temporary table. After this, I want to insert the temporary table data into the actual table.
Following is the code for fetching data from the temp table.
$temp_products = ProductTemp::limit(100000)->get()->map(function (ProductTemp $temp_product) {
    return [
        'msrp' => $temp_product->msrp,
        'price' => $temp_product->unit_price,
        'product_overview' => $temp_product->part_description,
        'manufacturer_id' => $temp_product->manufacturer_id,
        'upc' => $temp_product->upc,
        'manufacturer_part_number' => $temp_product->part_no
    ];
});
Then I tried both the chunk() and array_chunk() functions to divide data into parts.
//$chunks = $temp_products->chunk(5000);
$arr_chunks = array_chunk($temp_products->toArray(), 5000);
Then I loop through them to insert rows into the actual table, like this with the array_chunk() function:
foreach ($arr_chunks as $chunk)
{
    Product::insert($chunk);
}
and like this with the chunk() function:
foreach ($chunks as $chunk)
{
    Product::insert($chunk->toArray());
}
I have also tried the sleep() function with this, but it doesn't help. After 50,000 entries it's missing values or adding extras. I also tried chunk sizes of 100 and 1,000, but the result is the same and the page never finishes loading.
Here is the full function code; it might help show what else I have tried and what I am missing.
public function insertFromTempTable()
{
    // dd("HERE 1111111 22222222222 4444444444");
    $temp_products = ProductTemp::all()->map(function (ProductTemp $temp_product) {
        return [
            'msrp' => $temp_product->msrp,
            'price' => $temp_product->unit_price,
            'product_overview' => $temp_product->part_description,
            'manufacturer_id' => $temp_product->manufacturer_id,
            'upc' => $temp_product->upc,
            'manufacturer_part_number' => $temp_product->part_no,
        ];
    });
    // dd($temp_products);
    $inc = 0;
    dump("Started on: " . Carbon::now()->toDateTimeString());
    // sleep(10);
    // foreach ($temp_products as $value) {
    //     Product::insert($temp_products[$inc]);
    //     $inc++;
    // }
    // dd($temp_products->toArray());
    // Product::insert($temp_products->toArray());
    // $chunks = $temp_products->chunk(100);
    $chunk_sub = 5000;
    $arr_chunks = array_chunk($temp_products->toArray(), $chunk_sub);
    // dd($arr_chunks[0]);
    // dd(gettype($chunks[0]->toArray()));
    // dd($chunks[0]->toArray());
    $chunk_temp = 0;
    foreach ($arr_chunks as $chunk) {
        // dd($chunk->toArray());
        // dump(count($chunk->toArray()));
        // Product::insert($chunk->toArray());
        $chunk_temp = $chunk_temp + $chunk_sub;
        Product::insert($chunk);
    }
    // $temp_products->chunk(2, function ($subset) {
    //     dd("HHHHHHH");
    //     $subset->each(function ($item) {
    //         dd($item);
    //     });
    // });
    dump("Ended on: " . Carbon::now()->toDateTimeString());
}
After inserting, I want to compare against an Excel sheet on a daily basis that might have more than 1 million rows; that comparison will check the MSRP and price columns and update them. Please let me know what the best solution for all of this would be.
UPDATE 01:
I have tried using a raw query. There were 1,048,575 rows in the temp table, but after the query ran there were 1,039,229 rows in the products table. Following is the code for the raw query.
$results = DB::table('products')->insertUsing(
    ['msrp', 'price', 'product_overview', 'manufacturer_id', 'upc', 'manufacturer_part_number'],
    function ($query) {
        $query
            ->select(['msrp', 'unit_price', 'part_description', 'manufacturer_id', 'upc', 'part_no'])
            ->from('products_temp');
    }
);
When I dump and die the $results variable, it outputs 1,048,575.

How to export a child array element to Excel in Laravel

I'm building an application in Laravel 5.5 with the maatwebsite/excel plugin, where I have an array that I need to export into a CSV file. I'm able to get the values when I assign a single string to each key inside the array, but currently there is a child array element inside the array, and once it is included I get a blank Excel sheet.
Without array elements:
public function investor(Request $request)
{
    $record_set = [];
    $tables = json_decode($request->investorTable);
    foreach ($tables as $table)
    {
        $record_set[] = array(
            'schedule' => $table->schedule,
            'Event type' => $table->event_type,
            'Venue' => $table->venue,
            'Nature' => $table->nature,
        );
    }
    return Excel::create('investor', function($excel) use ($record_set) {
        $excel->sheet('mySheet', function($sheet) use ($record_set) {
            $sheet->fromArray($record_set);
        });
    })->download('csv');
}
This works perfectly, but I also want to include an array element whose name key I want to export. I don't know how to implement it; when I try, it gives me a blank sheet:
$record_set[] = array(
    'schedule' => $table->schedule,
    'Company Name' => $table->companyName,
    'Contact Person' => $table->contact_participants,
    'Event type' => $table->event_type,
    'Venue' => $table->venue,
    'Nature' => $table->nature,
);
My complete array element looks something like this:
and the table looks something like this in my HTML:
I want exactly the same output in Excel; please help me out with this.
Thanks.
You can use:
Export to Excel5 (xls)
Excel::create('Filename', function($excel) {
})->export('xls');
// or
->download('xls');
Export to Excel2007 (xlsx)
->export('xlsx');
// or
->download('xlsx');
Export to CSV
->export('csv');
// or
->download('csv');
Please refer to the documentation:
http://www.maatwebsite.nl/laravel-excel/docs/export
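Putting that together with the question's own code, a minimal sketch of an xlsx export (the same maatwebsite/excel 2.x calls the question already uses):
return Excel::create('investor', function($excel) use ($record_set) {
    $excel->sheet('mySheet', function($sheet) use ($record_set) {
        $sheet->fromArray($record_set);
    });
})->export('xlsx'); // or ->download('xlsx') to send the file to the browser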

Laravel chunk skipping 1 record from every chunk

So I have a weird problem using Laravel Excel and importing some data. I want to split the import into chunks, but every time I define a chunk size it skips 1 record from every chunk.
Here is a piece of the code:
Excel::selectSheets('Sheet1')->load($tmp_path)->chunk(3, function($result) use ($product) {
    foreach ($result as $row) {
        $row->dump();
    }
});
So I'm just splitting the collection into chunks of 3 records to demonstrate the problem; screenshot below.
Update:
'import' => [
    'heading' => false,
    'startRow' => 3
]
So if I define startRow I see the desired number of items per chunk, but unnecessary data at the beginning...
Well, it seems that https://github.com/Maatwebsite/Laravel-Excel has some problems with its chunk method, so I used Laravel's collection chunk() instead, like this:
$tmp_path = $request->file('import_data')->getRealPath();
$results = Excel::load($tmp_path)->get();
$chunks = $results->chunk(3);
$chunks->toArray();
foreach ($chunks as $rows)
{
    foreach ($rows as $row)
    {
        $row->dump();
    }
}

Very Complex SQL Data Result Merge via PHP (Multi-Dimensional Arrays)

Sorry, I had a hard time writing a proper thread title for this problem.
Here's my question:
In short: how to merge array items and sum their values.
Here's the full explanation.
1) I have a (WP custom) database to track statistics: visits, unique visits, etc., stored per post per date.
To make it easier to understand, here's a screenshot of the table:
2) This is the example data when I queried it:
https://gist.github.com/turtlepod/8e7dc93bae7f0b665fd5aea8a9694998
So in this example we have multiple post IDs: "90", "121", & "231".
We have multiple dates in the db: "2017-03-20", "2017-03-21", "2017-03-22".
We have multiple stats: "visits" and "unique_visits".
We also have a "stat_value" for each item.
Each item has a unique ID.
All data is created dynamically when an event happens, so not every post_id has both stats or all of the above dates.
Note: keep in mind that in the real code, we have a lot more data and variations than the example above.
3) I need to merge the data:
The post_id "121" is the same as post "231", so we need to merge them, adding their "stat_value" together into one entry and removing the "231" entry.
What is the best way to do this (dynamically) via PHP?
I have this data:
$raw_data = array( ... ); // the one in github gist
$post_groups = array(
'121' => array( '121', '231' ), // main post_id => array of alias.
);
It needs to return the same data format as $raw_data, but with the "231" entries removed and their "stat_value" summed into "121".
Thank you.
Try it with this:
function david_transform_data($data, $groups) {
    if (empty($groups) === true) {
        return $data;
    }

    // Transform groups into a more useful format
    $transformed_groups = array();
    foreach ($groups as $post_id => $aliases) {
        foreach ($aliases as $alias) {
            if (absint($post_id) === absint($alias)) {
                continue;
            }
            $transformed_groups[absint($alias)] = $post_id;
        }
    }

    // Replace aliases with the real post id
    foreach ($data as $index => $stat) {
        if (isset($transformed_groups[absint($stat->post_id)]) === false) {
            continue;
        }
        $data[$index]->post_id = $transformed_groups[absint($stat->post_id)];
    }

    // Go through stats and merge those with the same post_id, stat_id
    // and stat_date
    $merged_stats = array();
    $index_tracker = 0;
    $stats_hash = array();
    foreach ($data as $index => $stat) {
        $hash_key = sprintf(
            '%s-%s-%s',
            $stat->post_id,
            $stat->stat_id,
            $stat->stat_date
        );

        if (isset($stats_hash[$hash_key]) === true) {
            $merged_stats[$stats_hash[$hash_key]]->stat_value += absint($stat->stat_value);
            continue;
        }

        $merged_stats[] = $stat;
        $stats_hash[$hash_key] = $index_tracker;
        $index_tracker++;
    }

    return $merged_stats;
}
var_dump(david_transform_data($raw_data, $post_groups));
There might be a faster solution but this is the first thing that came to my mind.
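For reference, a tiny self-contained check of the merge behaviour (the stat objects below are made-up stand-ins for the gist data, and absint() is assumed to be available since the code runs inside WordPress):
// Two "visits" stats for the same date, one recorded under the alias post_id 231.
$raw_data = array(
    (object) array('post_id' => 121, 'stat_id' => 1, 'stat_date' => '2017-03-20', 'stat_value' => 10),
    (object) array('post_id' => 231, 'stat_id' => 1, 'stat_date' => '2017-03-20', 'stat_value' => 5),
);
$post_groups = array(
    '121' => array('121', '231'), // main post_id => array of aliases
);

// Expected result: a single record for post 121 with stat_value 15.
var_dump(david_transform_data($raw_data, $post_groups));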

How to pull data from multiple arrays to construct one nice array?

I have 4 arrays:
qb_array = { sku, stock } size: 20803
valid_array = { sku, name, price, status } size: 199803
By intersecting qb_array && valid_array based on sku, I then have:
intersect_sku_array { sku } size: 18795
Now, there is some data that I want to grab out of my first 2 arrays.
From qb_array, I want stock.
From valid_array, I want name, price, and status.
Then I decided to create my 4th array and called it:
4 - inserted_array (because I will use this array to insert into my database)
I tried to construct it, and now I am stuck.
Here is my approach:
First, I did
foreach ($intersect_sku_array as $key) {
    $inserted_array[] = $key;
}
So far, so good - everything is working; when I dd($inserted_array); I see all the stuff in it.
Then, moving on to add the other 3 fields from my valid_array:
foreach ($valid_array as $key => $value) {
    if (in_array($key, array_map('strtolower', $intersect_sku_array))) {
        $inserted_array[] = $value['name'];
        $inserted_array[] = $value['price'];
        $inserted_array[] = $value['status'];
    }
}
Then I did dd($inserted_array); at the end, and it hangs on me.
After about 5 minutes I got this error:
Maximum execution time of 300 seconds exceeded
Is it because I have too much data, or is my code stuck in an infinite loop somehow?
Can someone please explain all of this in detail?
Maybe this would help:
foreach ($intersect_sku_array as $sku)
{
    $qbRow = $qb_array[array_search($sku, $qb_array)];
    $validRow = $valid_array[array_search($sku, $valid_array)];
    $inserted_array[] = array($sku, $qbRow[1], $validRow[1], $validRow[2], $validRow[3]);
}
Although I think it would be easier for you to use named arrays like the following:
$qb_array = ['sku' => ['stock' => 'actual stock']];
$valid_array = ['sku' => ['name' => 'actual name', 'price' => 'actual price', 'status' => 'actual status']];
Like so.
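A short sketch of what the lookup could then look like with those sku-keyed arrays (the field names follow the question; this is an assumption about the intended shape, not tested code):
// Direct index lookups replace array_search() once both arrays are keyed by sku.
foreach ($intersect_sku_array as $sku) {
    $inserted_array[] = array(
        'sku'    => $sku,
        'stock'  => $qb_array[$sku]['stock'],
        'name'   => $valid_array[$sku]['name'],
        'price'  => $valid_array[$sku]['price'],
        'status' => $valid_array[$sku]['status'],
    );
}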
For one thing, you are running the lowercasing inside the loop, which is certainly not the best way to save time.
Then you are constructing a "flat" array that will contain (sku, name, price, status) quadruplets (and no stock) in sequence.
I would rather have the database do a join on both tables, since SQL is a much better tool than PHP for that kind of job.
Now, if for some reason you can't do that, it is better to use sku as a key for your two arrays.
foreach ($qb_array as $val) $k_qb[strtolower($val['sku'])] = $val['stock'];
foreach ($valid_array as $val) $k_va[strtolower($val['sku'])] = array($val['name'], $val['price'], $val['status']);
(If you must lowercase something, better to do it late than never, but frankly this should also be done in the database, unless you're forced to work with garbage data.)
Then you can do the join manually, without any intermediate intersection array, like so:
foreach ($k_qb as $sku => $stock) // use the smallest array
{
    if (!isset($k_va[$sku])) continue; // skip non-matching records
    $record = $k_va[$sku];
    $inserted_array[] = array(
        'sku' => $sku,
        'stock' => $stock,
        'name' => $record[0],
        'price' => $record[1],
        'status' => $record[2]);
}
The complexity of the algorithm will then be roughly O(N + M) instead of O(N × M), where M and N are the sizes of your two arrays, since keyed lookups are effectively constant time.
