I need to upload an Excel file and import its content into the related database table.
I have already created a script for the import, and the script below works properly:
$excelObj = $this->get('phpexcel')->createPHPExcelObject($path_file);
$sheet = $excelObj->getActiveSheet()->toArray(null, true, true, true);
$em = $this->getDoctrine()->getManager();

// READ EXCEL FILE CONTENT
foreach ($sheet as $i => $row) {
    if ($i !== 1) { // skip the heading row
        $user = $em->getRepository('ExcelBundle:Users')->findOneByUsername($row['A']);
        if (!$user) { // no existing account: create a new user
            $user = new Users();
        }
        $user->setName($row['A']);
        $user->setUsername($row['B']);
        $user->setEmail($row['C']);
        //... and so on
        $em->persist($user);
        $em->flush();
    }
}
Now, instead of reading column A, column B, etc. of the Excel file, I need to reference each column by the heading in its first row:

name username email ...and so on

$user->setName($row['name']);
$user->setUsername($row['username']);
$user->setEmail($row['email']);

How can I do it?
So you need to map the headings row (row #1) to the actual values in each subsequent row:
foreach ($sheet as $i => $row) {
    if ($i == 1) {
        $headings = $row;
    } else {
        $row = array_combine($headings, $row);
        // ... do the rest of your stuff here
        $user->setName($row['name']);
        $user->setUsername($row['username']);
        $user->setEmail($row['email']);
        // ...
    }
}
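Putting the two together, here is a minimal sketch of the full loop (assuming the same ExcelBundle:Users entity as above and that row 1 of the sheet holds the headings):

$sheet = $excelObj->getActiveSheet()->toArray(null, true, true, true);
$em = $this->getDoctrine()->getManager();
$headings = [];

foreach ($sheet as $i => $row) {
    if ($i === 1) {
        $headings = $row; // e.g. ['A' => 'name', 'B' => 'username', 'C' => 'email']
        continue;
    }
    $row = array_combine($headings, $row); // keys are now the heading names

    $user = $em->getRepository('ExcelBundle:Users')->findOneByUsername($row['username']);
    if (!$user) {
        $user = new Users();
    }
    $user->setName($row['name']);
    $user->setUsername($row['username']);
    $user->setEmail($row['email']);
    $em->persist($user);
}
$em->flush(); // a single flush after the loop is cheaper than one per row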
How can I insert an employee table from my database into my template document, as above?
I have successfully filled the company field.
My code is like below:
require_once APPPATH.'libraries/autoload.php';
use PhpOffice\PhpWord\TemplateProcessor;

public function cetaksurat()
{
    $templateProcessor = new TemplateProcessor('source.docx');
    $templateProcessor->setValues([
        'company' => 'my company',
        //how to insert employee table from my database array result?
        //.....
    ]);
    header("Content-Disposition: attachment; filename=result.docx");
    $templateProcessor->saveAs('php://output');
}
Thank you!
In my case I would use:

$news2 = $this->phpword_model->get_employe(); // fetch one employee record
foreach ($news2 as $key => $values) {
    $array_values1 = $values['id'];
    $array_values2 = $values['name'];
}
$templateProcessor->setValue('id', $array_values1);
$templateProcessor->setValue('name', $array_values2);
$templateProcessor->saveAs($filename);

The first example is for a single record. This next example is for multiple records:
$news7 = $this->phpword_model->get_employes();
$templateProcessor->cloneRow('id', count($news7));

$i = 1;
foreach ($news7 as $key => $values) {
    $templateProcessor->setValue('id#' . $i++, $values['id']);
}

$i2 = 1;
foreach ($news7 as $key => $values) {
    $templateProcessor->setValue('name#' . $i2++, $values['name']);
}
$templateProcessor->saveAs($filename);
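Note that for cloneRow('id', $n) to work, the .docx template must contain a table row with the ${id} placeholder in one cell (and ${name} in another); PHPWord clones that whole row $n times and renames the placeholders to ${id#1}, ${id#2}, and so on, which is what the setValue('id#' . $i, ...) calls fill in. The two loops above can also be collapsed into one; a sketch under the same assumptions as the answer:

$employees = $this->phpword_model->get_employes();
$templateProcessor->cloneRow('id', count($employees));

$i = 1;
foreach ($employees as $values) {
    // fill both cloned placeholders for row $i in a single pass
    $templateProcessor->setValue('id#' . $i, $values['id']);
    $templateProcessor->setValue('name#' . $i, $values['name']);
    $i++;
}
$templateProcessor->saveAs($filename);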
In a Laravel 5.5 project I want to export some of my tables to CSV files, and I am using this http://www.maatwebsite.nl/laravel-excel/docs/export#export library with code like:
$path = $directoriesArray[count($directoriesArray) - 1];
Excel::create($filename, function ($excel) use ($dataArray) {
    $excel->sheet('file', function ($sheet) use ($dataArray) {
        $sheet->fromArray($dataArray);
    });
})->store('csv', $path)->export('csv');
Exporting one file works fine, but since I need to export several tables, each to its own file, I run the exporting function in a loop, and only the first file is downloaded. Is this a restriction of the browser (I tried Chromium and Firefox), or is there a way to get all the files?
1) Is there a way to just write the CSV files to disk without triggering a download?
2) Is there some way to buffer the output data (like ob_start) and write it to the files manually?
3) Are there other tools that could do this?
Thanks!
The solution was to call

})->store('csv', $path);

without the trailing

->export('csv')

:)

Originally I did it like this:
<?php
$tablesArray = [ // array of tables to export
    'settings',
    'groups',
    'users',
];
$now_label = '_1'; //strftime('%Y-%m-%d_%H_%M_%S');
$directories = [public_path('uploads'), public_path('uploads/csv'), public_path('uploads/csv/loan_dump_'.$now_label)]; // directories into which the files are exported
foreach ($tablesArray as $next_key => $next_table_name) { // all tables
    $columnListing = \Schema::getColumnListing($next_table_name); // table columns
    $dataRows = null;
    if ($next_table_name == 'settings') {
        $dataRows = Settings::all();
    }
    if ($next_table_name == 'groups') {
        $dataRows = Group::all();
    }
    if ($next_table_name == 'users') {
        $dataRows = User::all();
    }
    if ($dataRows == null) {
        die("-1 XXZ INVALID DATA");
    }

    $dataRowsArray = $dataRows->toArray();
    $writeRowsArray = [];
    $row = 0;
    foreach ($dataRowsArray as $next_key => $nextDataRow) {
        if (!empty($nextDataRow['created_at'])) {
            $nextDataRow['created_at'] = $this->dateTimeToDbFormat($nextDataRow['created_at']);
        }
        if (!empty($nextDataRow['updated_at'])) {
            $nextDataRow['updated_at'] = $this->dateTimeToDbFormat($nextDataRow['updated_at']);
        }
        if ($row == 0) { // first row keeps its keys to serve as the file header
            $writeRowsArray[] = $nextDataRow;
        } else {
            $writeRowsArray[] = array_values($nextDataRow);
        }
        $row++;
    }
    $this->writeArrayToCsvFile($writeRowsArray, $next_table_name, $directories);
} // foreach ($tablesArray as $next_key => $next_table_name) — all tables
public function writeArrayToCsvFile(array $dataArray, string $filename, array $directoriesArray): int
{
    self::createDir($directoriesArray);
    $path = $directoriesArray[count($directoriesArray) - 1];
    Excel::create($filename, function ($excel) use ($dataArray) {
        $excel->sheet('file', function ($sheet) use ($dataArray) {
            $sheet->fromArray($dataArray);
        });
    })->store('csv', $path)->export('csv');
    return 1;
}
?>
Only the first file in the $tablesArray array is exported correctly. If I comment out the first table, the next one exports fine, and so on.
What is also strange is that each file is written twice: once to the $directories path and once to the "Downloads" directory (I am on Kubuntu).
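The explanation: export('csv') streams the file to the browser as a download, and one HTTP request can only deliver one download, so only the first export() in the loop wins; store('csv', $path) writes the file to disk on the server, which is all that is needed here. A sketch of the fixed helper, assuming the same Laravel Excel 2.x API as above:

public function writeArrayToCsvFile(array $dataArray, string $filename, array $directoriesArray): int
{
    self::createDir($directoriesArray);
    $path = $directoriesArray[count($directoriesArray) - 1];
    Excel::create($filename, function ($excel) use ($dataArray) {
        $excel->sheet('file', function ($sheet) use ($dataArray) {
            $sheet->fromArray($dataArray);
        });
    })->store('csv', $path); // write to disk only; no ->export('csv')
    return 1;
}

This also explains the double copy: store() wrote the file to the $directories path while export() simultaneously triggered the browser download into "Downloads".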
Currently I am using Google BigQuery to match two tables and export the query result as a CSV file to GCS.
I can't find a way to directly export the query result as CSV to GCS through the API, whereas this is readily available in the Google Cloud console.
So I am creating a table dynamically, with the same column names and types as the query output, and inserting the records into it. However, as soon as I do this, a streaming buffer is created before the actual table is populated with data, and it takes around 90 minutes for the table to mature.
If I export that table (with an active streaming buffer) as CSV to GCS, sometimes the exported file contains only the column header and no row data.
Is there a way to overcome this situation?
I am using the Google Cloud PHP API for my code.
Below is the sample working code.
1. We create two tables with schemas dynamically:
$bigQuery = new BigQueryClient();
try {
    $res = $bigQuery->dataset(DATASET)->createTable($owner_table, [
        'schema' => [
            'fields' => $ownerFields
        ]
    ]);
} catch (Exception $e) {
    //echo 'Message: ' . $e->getMessage();
    //die();
    $custom_error_message = "Error while creating owner table schema. Trying to create with " . $ownerFields;
    sendErrorEmail($e->getMessage(), $custom_error_message);
    return false;
}
2. We import two CSVs from GCS into the created tables:
$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$table = $dataset->table($tableId);
// load the storage object
$storage = new StorageClient([
    'projectId' => $projectId,
]);
$object = $storage->bucket($bucketName)->object($objectName);
// create the import job
$job = $table->loadFromStorage($object, $options);
// poll the job until it is complete; ExponentialBackoff retries the closure
// only when it throws, so the exception below is what drives the polling
$backoff = new ExponentialBackoff(10);
$backoff->execute(function () use ($job) {
    //print('Waiting for job to complete' . PHP_EOL);
    $job->reload();
    if (!$job->isComplete()) {
        throw new Exception('Job has not yet completed', 500);
    }
});
3. We run a BigQuery query to find matches between the two tables:
{
//Code omitted for brevity
}
4. BigQuery returns the result as an array in PHP. Then we create a third table and insert the query result set into it:
$queryResults = $job->queryResults();
if ($queryResults->isComplete()) {
    $i = 0;
    $rows = $queryResults->rows();
    $dataRows = [];
    foreach ($rows as $row) {
        //pr($row); die();
        // printf('--- Row %s ---' . PHP_EOL, ++$i);
        ++$i;
        $inlineRow = [];
        $inlineRow['insertId'] = $i;
        $arr = []; // reset per row so columns never leak between rows
        foreach ($row as $column => $value) {
            // printf('%s: %s' . PHP_EOL, $column, $value);
            $arr[$column] = $value;
        }
        $inlineRow['data'] = $arr;
        $dataRows[] = $inlineRow;
    }
    /* Create a new result table to store the query result */
    if (createResultTable($schema, $type)) {
        if (count($dataRows) > 0) {
            sleep(120); // force sleep to let the result table mature
            $ownerTable = $bigQuery->dataset(DATASET)->table($tableId);
            $insertResponse = $ownerTable->insertRows($dataRows);
            if (!$insertResponse->isSuccessful()) {
                //print_r($insertResponse->failedRows());
                sendErrorEmail($insertResponse->failedRows(), "Failed to create resulted table for " . $type);
                return 0;
            } else {
                return 1;
            }
        } else {
            return 2;
        }
    } else {
        return 0;
    }
    //printf('Found %s row(s)' . PHP_EOL, $i);
}
5. We try to export the third table, which contains the result set, to GCS:
function export_table($projectId, $datasetId, $tableId, $bucketName, $objectName, $format = 'CSV')
{
    $bigQuery = new BigQueryClient([
        'projectId' => $projectId,
    ]);
    $dataset = $bigQuery->dataset($datasetId);
    $table = $dataset->table($tableId);
    // load the storage object
    $storage = new StorageClient([
        'projectId' => $projectId,
    ]);
    $destinationObject = $storage->bucket($bucketName)->object($objectName);
    // create the export job
    $options = ['jobConfig' => ['destinationFormat' => $format]];
    $job = $table->export($destinationObject, $options);
    // poll the job until it is complete; the throw is what makes
    // ExponentialBackoff retry
    $backoff = new ExponentialBackoff(10);
    $backoff->execute(function () use ($job) {
        //print('Waiting for job to complete' . PHP_EOL);
        $job->reload();
        if (!$job->isComplete()) {
            throw new Exception('Job has not yet completed', 500);
        }
    });
    // check if the job has errors
    if (isset($job->info()['status']['errorResult'])) {
        $error = $job->info()['status']['errorResult']['message'];
        //printf('Error running job: %s' . PHP_EOL, $error);
        sendErrorEmail($job->info()['status'], "Error while exporting resulted table. File Name: " . $objectName);
        return false;
    } else {
        return true;
    }
}
The problem is in step 5. Because the third table from step 4 is still in the streaming buffer, the export job does not work reliably: sometimes the table is exported correctly and sometimes it is not. We have tried sleeping for about 120 seconds between steps 4 and 5, but the problem remains.
I have seen that a table populated dynamically via the API can stay in the streaming buffer for 90 minutes, but this is far too long.
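One way to sidestep the streaming buffer entirely (a sketch, assuming the same google/cloud-bigquery PHP client used above): instead of inserting the query results with insertRows(), let the query job itself write its output to a destination table. A query job materializes the table directly, with no streaming buffer, so it can be exported immediately afterwards. The table name match_result is a placeholder:

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => $projectId]);
$dataset = $bigQuery->dataset($datasetId);
$resultTable = $dataset->table('match_result'); // hypothetical result table

// Configure the matching query (the one from step 3) to land its
// result in a real table instead of returning rows to PHP.
$queryJobConfig = $bigQuery->query($matchSql)
    ->destinationTable($resultTable)
    ->writeDisposition('WRITE_TRUNCATE'); // overwrite on each run

$bigQuery->runQuery($queryJobConfig); // blocks until the query finishes

// The table is now fully materialized; export_table() from step 5 can run at once.
export_table($projectId, $datasetId, 'match_result', $bucketName, $objectName);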
I am trying to import data via an Excel sheet into the database in a CodeIgniter application, using PHPExcel. The code seems right, but I am getting an error which states:
Error Number: 1054
Unknown column 'joker' in 'field list'
INSERT INTO studentsaccount (joker) VALUES ('')
Filename: C:/xampp/htdocs/Nalanda_Library/system/database/DB_driver.php
Line Number: 691
My code is as follows. For the controller:
public function studentaccountimport() {
    $this->load->model('Department');
    $file = $_FILES['upload']['tmp_name'];
    //load the excel library
    $this->load->library('excel');
    //read file from path
    $objPHPExcel = PHPExcel_IOFactory::load($file);
    //get only the Cell Collection
    $cell_collection = $objPHPExcel->getActiveSheet()->getCellCollection();
    //extract to a PHP readable array format
    foreach ($cell_collection as $cell) {
        $column = $objPHPExcel->getActiveSheet()->getCell($cell)->getColumn();
        $row = $objPHPExcel->getActiveSheet()->getCell($cell)->getRow();
        $data_value = $objPHPExcel->getActiveSheet()->getCell($cell)->getValue();
        //header will/should be in row 1 only
        if ($row == 1) {
            $header[$row][$column] = $data_value;
        } else {
            $arr_data[$row][$column] = $data_value;
            $this->Department->modeluploadation($data_value);
        }
    }
}
And for the model:
public function modeluploadation($data) {
    $this->db->insert('studentsaccount', $data);
}
I am a novice in CodeIgniter, so please help.
You need to specify column names in the insert query. Try:
if ($row == 1) {
    $header[$row][$column] = $data_value;
} else {
    $arr_data[$row][$column] = $data_value;
}

// then, after the foreach loop:
$data['header'] = $header;
$data['values'] = $arr_data;
$this->Department->modeluploadation($data);
OK, the reason for your problem is that you need to pass an array of key => value pairs as the parameter to insert().
I'm not sure why Safin's answer has been marked down, because he is right. You should define modeluploadation() as:

public function modeluploadation($data) {
    $this->db->insert('studentsaccount', array('field_name' => $data));
}
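For completeness, a sketch of how the controller loop and model could fit together so that each spreadsheet row becomes one key => value insert (this assumes the column names in row 1 of the sheet match the studentsaccount table columns):

// Controller: collect headers and data first, then insert row by row
$header = [];
$arr_data = [];
foreach ($cell_collection as $cell) {
    $column = $objPHPExcel->getActiveSheet()->getCell($cell)->getColumn();
    $row = $objPHPExcel->getActiveSheet()->getCell($cell)->getRow();
    $data_value = $objPHPExcel->getActiveSheet()->getCell($cell)->getValue();
    if ($row == 1) {
        $header[$column] = $data_value; // e.g. 'A' => 'name'
    } else {
        $arr_data[$row][$column] = $data_value;
    }
}
foreach ($arr_data as $columns) {
    $rowData = [];
    foreach ($columns as $col => $value) {
        $rowData[$header[$col]] = $value; // key each value by its column heading
    }
    $this->Department->modeluploadation($rowData);
}

// Model: $data is now an associative array of column => value
public function modeluploadation($data) {
    $this->db->insert('studentsaccount', $data);
}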
I am working on importing a CSV of product reviews into my Magento store.
The CSV is in my Magento shell folder. I created a product_review.php script inside my shell directory:
<?php
require '../app/Mage.php';
umask(0);
Mage::app()->setCurrentStore(Mage_Core_Model_App::ADMIN_STORE_ID);
ini_set('max_execution_time', 84000);
ini_set('memory_limit', '5120M');
$fp = fopen('final_available_dup.csv', 'r');
//Mage::app()->setCurrentStore(4); //desired store id
while ($line = fgetcsv($fp)) {
    $review = Mage::getModel('review/review');
    $review->setEntityPkValue($line[0]); //previews1.csv
    $review->setCreatedAt($line[1]); //previews1.csv
    $review->setStatusId($line[2]); //approved
    $review->setTitle($line[3]);
    $review->setNickname($line[4]);
    $review->setDetail($line[5]);
    $review->setEntityId($line[6]);
    $review->setStoreId(Mage::app()->getStore()->getId());
    //$review->setStatusId($line[5]);
    $review->setCustomerId($line[7]); //null is for administrator
    $review->setReviewsCount($line[8]);
    $review->setReviewId($review->getId());
    $review->setStores(array(Mage::app()->getStore()->getId()));
    $review->save();
    $review->aggregate();
}
?>
When I go to the shell folder and run product_review.php, a blank page comes up, which I guess means it ran correctly.
But when I go into my back end and check, I cannot see any reviews, and I am not able to tell which product a review is attached to.
Is there anything more I should do?
You can use the following code for importing product reviews.
My CSV is in the following format:
"created_at","Sku","status_id","title","detail","nickname","customer_id","option_id","entity_id"
"2016-04-01 19:42:09","1991474","2","Top","Blij van!!","claes wassenaar","","1:4#3:15#2:9","24582"
So make sure you edit the CSV path in the following code and assign proper values to the variables.
<?php
require_once 'app/Mage.php';
Mage::app();
$fileLocation = "var/import/import_review.csv"; // Set your CSV path here
$fp = fopen($fileLocation, 'r');
$count = 1;
while ($data = fgetcsv($fp)) {
    if ($count > 1) { // skip the header row
        //initiate required variables
        $_createdAt = $data[0];
        $_sku = $data[1];
        $_catalog = Mage::getModel('catalog/product');
        $_productId = $_catalog->getIdBySku($_sku);
        $_statusId = $data[2];
        $_title = $data[3];
        $_detail = $data[4];
        $_customerId = NULL;
        $_nickname = $data[5];
        //load magento review model and assign values
        $review = Mage::getModel('review/review');
        $review->setCreatedAt($_createdAt); // created date and time
        $review->setEntityPkValue($_productId); // product id
        $review->setStatusId($_statusId); // status id
        $review->setTitle($_title); // review title
        $review->setDetail($_detail); // review detail
        $review->setEntityId(1); // leave it 1
        $review->setStoreId(Mage::app()->getStore()->getId()); // store id
        $review->setCustomerId($_customerId); // null is for administrator
        $review->setNickname($_nickname); // customer nickname
        $review->setReviewId($review->getId()); // set current review id
        $review->setStores(array(Mage::app()->getStore()->getId())); // store ids
        $review->save();
        $review->aggregate();
        //set review ratings, e.g. "1:4#3:15#2:9" => rating_id:option_id pairs
        if ($data[7]) {
            $arr_data = explode("#", $data[7]);
            if (!empty($arr_data)) {
                foreach ($arr_data as $each_data) {
                    $arr_rating = explode(":", $each_data);
                    if ($arr_rating[1] != 0) {
                        Mage::getModel('rating/rating')
                            ->setRatingId($arr_rating[0])
                            ->setReviewId($review->getId())
                            ->setCustomerId($_customerId)
                            ->addOptionVote($arr_rating[1], $_productId);
                    }
                }
            }
            $review->aggregate();
        }
    }
    // if ($count == 5) {
    //     die("total $count reviews are imported!");
    // }
    $count++;
}
echo "total " . ($count - 2) . " reviews are imported!"; // minus the header row and the final increment
?>
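One gap worth guarding against in the loop above: if a SKU from the CSV does not exist in the catalog, getIdBySku() returns false and the review would be saved without a product, which is one reason reviews can appear unattached in the back end. A small hypothetical guard, placed right after the SKU lookup:

$_productId = $_catalog->getIdBySku($_sku);
if (!$_productId) {
    // skip rows whose SKU is not in the catalog
    $count++;
    continue;
}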