Laravel Raw Bulk insert - php

I wrote a function in a controller used in the Laravel framework. The function gets the file path of a *.csv file and then inserts all rows of this *.csv file into the database with a raw bulk insert statement. The problem is that the rows are not put into the database table when the function is executed, and I don't get any errors either. When I execute the query in SQL Server Management Studio, it inserts the rows without problems. What am I doing wrong, and is there a better way to bulk insert the contents of a CSV file into the database?
Here is the code for the function:
public static function bulkInsertCSV($filePath)
{
    $sql = "use [testDatabase] BULK INSERT [dbo].[testTable] "
        . "FROM '" . $filePath . "' WITH (FIELDTERMINATOR = ';', "
        . "ROWTERMINATOR = '\\n');";

    //DB::statement($sql);
    DB::statement(DB::raw($sql));
}
Best regards,
Yalcin

I like using the laravel-excel library for dealing with my CSV files; it makes them a bit easier to manage and feels more "laravel-y" to me.
The laravel-excel library: http://www.maatwebsite.nl/laravel-excel/docs/getting-started#installation
What you'd then do is fairly simple:
Excel::filter('chunk')->load($filePath)->chunk(500, function($rows) {
    // getFillable() is an instance method, so call it on a model instance.
    $fillables = (new App\TestTable)->getFillable();

    foreach ($rows as $row) {
        $data = collect([]);

        foreach ($fillables as $fillable) {
            if (isset($row[$fillable])) {
                $data->put($fillable, $row[$fillable]);
            }
        }

        App\TestTable::create($data->toArray());
    }
});
Notes:
Using chunk ensures we don't run out of memory.
The getFillable() method assumes you have defined the protected $fillable = []; property on your TestTable model.
This assumes the column names in your testTable match the column names in your CSV file.
Hopefully this helps you.
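
If the per-row create() calls turn out to be a bottleneck, one variation (just a sketch, assuming the same TestTable model and that every CSV row yields the same columns) is to collect each chunk into a single multi-row insert via the query builder:

Excel::filter('chunk')->load($filePath)->chunk(500, function($rows) {
    $fillables = (new App\TestTable)->getFillable();
    $batch = [];

    foreach ($rows as $row) {
        $record = [];
        foreach ($fillables as $fillable) {
            if (isset($row[$fillable])) {
                $record[$fillable] = $row[$fillable];
            }
        }
        $batch[] = $record;
    }

    // One multi-row INSERT per chunk instead of up to 500 single-row queries.
    // Note: DB::table()->insert() bypasses Eloquent events and timestamps.
    DB::table('testTable')->insert($batch);
});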

Related

How can I import csv file to MySQL database more efficiently with PHP?

Let me explain: I have a Symfony2 project and I need to import users from a CSV file into my database. I have to do some work on the data before importing it into MySQL. I created a service for this and everything works fine, but it takes too much time to execute and slows down my server if I give it my entire file. My files usually have between 500 and 1500 rows, so I have to split each file into ~200-row files and import them one by one.
I need to handle related users that can be in the file and/or already in the database. Related users are usually a parent and a child.
Here is my simplified code:
$validator = $this->validator;
$members = array();
$children = array();
$mails = array();

$handle = fopen($filePath, 'r');
$datas = fgetcsv($handle, 0, ";"); // first line (header) is read and discarded

while (($datas = fgetcsv($handle, 0, ";")) !== false) {
    $user = new User();

    // If there is a related user
    if ($datas[18] != "") {
        $user->setRelatedMemberEmail($datas[18]);
        $relation = array_search(ucfirst(strtolower($datas[19])), UserComiti::$RELATIONSHIPS);
        if ($relation !== false) {
            $user->setParentRelationship($relation);
        }
    } else {
        $user->setRelatedMemberEmail($datas[0]);
        $user->addRole("ROLE_MEMBER");
    }

    // $mail, $lastName, etc. are extracted from $datas (omitted here for brevity)
    $user->setEmail($mail);
    $user->setLastName($lastName);
    $user->setFirstName($firstName);
    $user->setGender($gender);
    $user->setBirthdate($birthdate);
    $user->setCity($city);
    $user->setPostalCode($zipCode);
    $user->setAddressLine1($adressLine1);
    $user->setAddressLine2($adressLine2);
    $user->setCountry($country);
    $user->setNationality($nationality);
    $user->setPhoneNumber($phone);

    // Entity validation
    $listErrors = $validator->validate($user);

    // In case of errors
    if (count($listErrors) > 0) {
        foreach ($listErrors as $error) {
            $nbError++;
            $errors .= "Line " . $line . " : " . $error->getMessage() . "\n";
        }
    } else {
        if ($mailParent != null) {
            $children[] = $user;
        } else {
            $members[] = $user;
            $nbAdded++;
        }
    }

    foreach ($members as $user) {
        $this->em->persist($user);
        $this->em->flush();
    }

    foreach ($children as $child) {
        // If the related user is already in DB
        $parent = $this->userRepo->findOneBy(array('username' => $child->getRelatedMemberEmail(), 'club' => $this->club));
        if ($parent !== null) { // findOneBy() returns null, not false, when nothing is found
            // Check if someone related to this parent already has the same first and last name.
            // If so, we can guess that this user has already been created.
            $testName = $this->userRepo->findByParentAndName($child->getFirstName(), $child->getLastName(), $parent, $this->club);
            if (!$testName) {
                $child->setParent($parent);
                $this->em->persist($child);
                $nbAdded++;
            } else {
                $nbSkipped++;
            }
        } else {
            // The related user is neither in the file nor in the database:
            // create a placeholder user who will be able to update his profile later.
            $newParent = clone $child;
            $newParent->setUsername($child->getRelatedMemberEmail());
            $newParent->setEmail($child->getRelatedMemberEmail());
            $newParent->setFirstName('Unknown');
            $this->em->persist($newParent);
            $child->setParent($newParent);
            $this->em->persist($child);
            $nbAdded += 2;
            $this->em->flush();
        }
    }
}
This isn't my whole service, because I don't think the rest would help here, but if you need more information, ask me.
While I do not have the means to quantitatively determine the bottlenecks in your program, I can suggest a couple of guidelines that will likely significantly increase its performance.
Minimize the number of database commits you are making. A lot happens when you write to the database. Is it possible to commit only once at the end? (See the batching sketch after this list.)
Minimize the number of database reads you are making. Similar to the previous point, a lot happens when you read from the database.
If after considering the above points you still have issues, determine what SQL the ORM is actually generating and executing. ORMs work great until efficiency becomes a problem and more care needs to go into ensuring optimal queries are generated. At that point, becoming more familiar with the ORM and SQL would be beneficial.
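For the first point, a minimal batching sketch with Doctrine (assuming the $this->em EntityManager and $members array from the question; the batch size is an arbitrary choice):

$batchSize = 100;

foreach ($members as $i => $user) {
    $this->em->persist($user);

    // Flush in batches instead of once per entity.
    if (($i + 1) % $batchSize === 0) {
        $this->em->flush();
        $this->em->clear(); // detaches managed entities to free memory
    }
}

$this->em->flush(); // flush whatever remains
$this->em->clear();

Note that clear() detaches all managed entities, so re-fetch any parents you still need afterwards.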
You don't seem to be working with too much data, but if you were, MySQL alone supports reading CSV files.
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed.
https://dev.mysql.com/doc/refman/5.7/en/load-data.html
You may be able to access this MySQL specific feature through your ORM, but if not, you would need to write some plain SQL to utilize it. Since you need to modify the data you are reading from the CSV, you would likely be able to do this very, very quickly by following these steps:
Use LOAD DATA INFILE to read the CSV into a temporary table.
Manipulate the data in the temporary table and other tables as required.
SELECT the data from the temporary table into your destination table.
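A sketch of those three steps driven from PHP through PDO (the table and column names here are made-up placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass', array(
    PDO::MYSQL_ATTR_LOCAL_INFILE => true, // required for LOAD DATA LOCAL INFILE
));

// 1. Read the CSV into a temporary staging table.
$pdo->exec("CREATE TEMPORARY TABLE staging_users (
    email VARCHAR(255), last_name VARCHAR(255), first_name VARCHAR(255)
)");
$pdo->exec("LOAD DATA LOCAL INFILE '/path/to/users.csv'
    INTO TABLE staging_users
    FIELDS TERMINATED BY ';'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES");

// 2. Manipulate the data in the temporary table as required.
$pdo->exec("UPDATE staging_users SET email = LOWER(TRIM(email))");

// 3. Copy the result into the destination table.
$pdo->exec("INSERT INTO users (email, last_name, first_name)
    SELECT email, last_name, first_name FROM staging_users");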
I know this is a very old topic, but some time ago I created a bundle which can help import entities from CSV to a database, so maybe it will be helpful to anyone who finds this topic:
https://github.com/jgrygierek/BatchEntityImportBundle
https://github.com/jgrygierek/SonataBatchEntityImportBundle

Seemingly identical sql queries in php, but one inserts an extra row

I generate the below query in two ways, but use the same function to insert into the database:
INSERT INTO person VALUES('','john', 'smith','new york', 'NY', '123456');
The method below results in CORRECT inserts, with no extra blank row in the SQL database:
foreach ($_POST as $item) {
    $statement .= "'$item', ";
}

// Note: count() on a string returns 1 here, so $size - 3 is -2 and
// substr() strips the trailing ", " (strlen() was probably intended).
$size = count($statement);
$statement = substr($statement, 0, $size - 3);
$statement .= ");";
The code below should generate an identical query to the one above (they echo identically), but when I use it, an extra blank row (with an id) is inserted into the database after the correct row with data. So two rows are inserted each time.
$mytest = "INSERT INTO person VALUES('','$_POST[name]', '$_POST[address]','$_POST[city]', '$_POST[state]', '$_POST[zip]');";
Because I need to run validations on the posted form items and do some manipulation before storing them in the database, I need to be able to use the second query method.
I can't understand how the two could be different. I'm using the exact same functions to connect and insert into the database, so the problem can't be there.
Below is my insert function for reference:
function do_insertion($query) {
    $db = get_db_connection();
    if (!($result = mysqli_query($db, $query))) {
        #die('SQL ERROR: ' . mysqli_error($db));
        write_error_page(mysqli_error($db));
    } #end if
}
Thank you for any insight/help on this.
Using $_POST directly in your query opens you up to a lot of bad things; it's just bad practice. You should at least do something to clean your data before it goes to your database.
The $_POST array can often contain additional values depending on the browser and the form submission. Have you tried doing a null/empty check in your foreach?
!~ Pseudo Code DO NOT USE IN PRODUCTION ~!
$statement = "INSERT INTO person VALUES(";

foreach ($_POST as $item) {
    // Skip empty values so they don't end up in the query.
    if (isset($item) && $item != "") {
        $statement .= "'$item', ";
    }
}

// Close the statement outside the loop, not inside it.
$statement = substr($statement, 0, -2); // strip the trailing ", "
$statement .= ");";
Please read #tadman's comment about using bind_param and protecting yourself against SQL injection. For the sake of answering your question: it's likely your $_POST contains empty data that is being put into your query, resulting in the added row.
As #yycdev stated, you are at risk of SQL injection. Start by reading this and rewrite your code to properly protect your database. SQL injection is not fun and will produce many bugs. A prepared-statement version of the insert is sketched below.
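For illustration, a minimal prepared-statement sketch with mysqli, reusing the question's own helper functions (the column list and types are assumptions based on the example INSERT above):

$db = get_db_connection();

// Explicit column list; the auto-increment id no longer needs a '' placeholder.
$stmt = $db->prepare(
    "INSERT INTO person (name, address, city, state, zip) VALUES (?, ?, ?, ?, ?)"
);
$stmt->bind_param(
    'sssss',
    $_POST['name'],
    $_POST['address'],
    $_POST['city'],
    $_POST['state'],
    $_POST['zip']
);

if (!$stmt->execute()) {
    write_error_page($stmt->error);
}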

Generic fast-coded PHP MySQL dynamic insert/update query, creating table/fields named as variables if they don't exist

I'm looking for a way to make MySQL insert/update queries more dynamic and faster to code, since sometimes one just needs another field in a form (when, for example, prototyping an application). This might be a dumb question.
My idea is to do an insert, or an update if the ids match, and to create the table/fields dynamically with one function if they don't exist.
<?php
// $l is set with some db-login stuff

// Creates the table and inserts
$f[] = nf(1, 'this_id_x'); // this_id_* could be a prefix for ids
$f[] = nf('value yep', $fieldname_is_this2);
$tbl_name = "it_didnt_exist";
nyakilian_fiq($l, $tbl_name, $f);
// Done!

// This would do an update on the above
$fieldname_is_this2 = "this is now updated";
$f[] = nf(1, 'this_id_x');
$f[] = nf($fieldname_is_this2); // the function takes the variable name as the field name
$tbl_name = "it_didnt_exist";
nyakilian_fiq($l, $tbl_name, $f);
?>
I have been using this function with success. It doesn't add columns, but that is deliberate given the structure of my MVC framework. Try something like this:
public function save(DatabaseConnection &$db)
{
    $properties = get_object_vars($this);
    $table = $this->getTableName();
    $cols = array();
    $placeholders = array();
    $values = array();
    $updateCols = array();

    foreach ($properties as $key => $value) {
        $cols[] = "`$key`";
        $placeholders[] = '?';
        $values[] = $value;
        if ($value != NULL) {
            // Re-use the inserted value when the key already exists.
            $updateCols[] = "`$key` = VALUES(`$key`)";
        }
    }

    $sql = 'INSERT INTO `'.$table.'` ('.implode(', ', $cols).') VALUES ('
        .implode(', ', $placeholders).') ON DUPLICATE KEY UPDATE '
        .implode(', ', $updateCols);

    // Bind the values instead of concatenating them into the SQL string.
    $stmnt = $db->prepare($sql);
    return $stmnt->execute($values);
}
I have a model abstract class that I extend with a child class for each database table. This function sits in the model abstract. Each child class contains a public property [so I can use PDO::fetchObject()] that corresponds to a column name in the table. If I need to create the table on the fly, I add a function to the child class to do so.
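For illustration, a hypothetical child class for a users table might look like this (the ModelAbstract name, the columns, and the $db connection are made up for the example):

class UserModel extends ModelAbstract
{
    // Public properties matching the table's columns, so PDO::fetchObject() can hydrate them.
    public $id;
    public $email;
    public $name;

    public function getTableName()
    {
        return 'users';
    }
}

$user = new UserModel();
$user->email = 'jane@example.com';
$user->name = 'Jane';
$user->save($db); // INSERT ... ON DUPLICATE KEY UPDATE via the abstract save()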
This is quite an unusable approach.
You are trying to mix into one single function (not even a class!) functionality that belongs to a decent framework. That's just impossible (or unusable in parts).
Yet it resembles major frameworks' Models in many aspects.
So, I can give just some recommendations:
Do not create tables dynamically. The data structure is the backbone of the application and has to be solid.
Do not build in too many special considerations (like "if an ID is passed"); it will tie your hands in any more complex case.
Take a look at some major frameworks - it seems your wishes are already fulfilled by their code-generation features (one of the ugliest things that ever existed on Earth, in my private opinion). They work pretty much the same way you're describing: you only have to define a Model and the rest is done by the framework's methods.

Automatically build MySQL table upon CSV upload

Automatically build a MySQL table upon a CSV file upload.
I have an admin section where an admin can upload CSV files with different column counts and different column names.
It should then build a MySQL table in the db: read the first line to create the columns, then import the data accordingly.
I am aware of a similar issue, but this is different because of the following specs.
The name of the table should be the name of the file (minus the extension [.csv])
Each CSV file can be different
It should build a table with the column count and column names taken from the CSV file
It should add the data from the second line onward
Here is a design sketch
Maybe there are known frameworks that make this easy.
Thanks.
$file = 'filename.csv';
$table = 'table_name';

// Get the structure from the CSV and insert into the db
ini_set('auto_detect_line_endings', TRUE);
$handle = fopen($file, 'r');

// First row: structure
if (($data = fgetcsv($handle)) === FALSE) {
    echo "Cannot read from csv $file";
    die();
}

$fields = array();
$field_count = 0;
for ($i = 0; $i < count($data); $i++) {
    $f = strtolower(trim($data[$i]));
    if ($f) {
        // Normalize the field name, strip to 20 chars if too long
        $f = substr(preg_replace('/[^0-9a-z]/', '_', $f), 0, 20);
        $field_count++;
        $fields[] = $f . ' VARCHAR(50)';
    }
}
$sql = "CREATE TABLE $table (" . implode(', ', $fields) . ')';
echo $sql . "<br /><br />";
// $db->query($sql);

// Remaining rows: data
while (($data = fgetcsv($handle)) !== FALSE) {
    $fields = array();
    for ($i = 0; $i < $field_count; $i++) {
        $fields[] = '\'' . addslashes($data[$i]) . '\'';
    }
    $sql = "INSERT INTO $table VALUES(" . implode(', ', $fields) . ')';
    echo $sql;
    // $db->query($sql);
}

fclose($handle);
ini_set('auto_detect_line_endings', FALSE);
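The question also asks for the table name to come from the uploaded file's name; a small addition (a sketch, reusing the $file variable above) could derive it the same way the column names are normalized:

// Derive the table name from the file name (minus the .csv extension),
// normalized to safe characters like the column names above.
// 64 chars is MySQL's identifier length limit.
$table = strtolower(pathinfo($file, PATHINFO_FILENAME));
$table = substr(preg_replace('/[^0-9a-z]/', '_', $table), 0, 64);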
Maybe this function will help you:
fgetcsv (PHP 4, PHP 5) — Gets line from file pointer and parse for CSV fields
http://php.net/manual/en/function.fgetcsv.php
http://bytes.com/topic/mysql/answers/746696-create-mysql-table-field-headings-line-csv-file has a good example of how to do this.
The second example should put you on the right track; there isn't an automatic way to do it, so you're going to need to do a little programming, but it shouldn't be too hard once you implement that code as a starting point.
Building a table is a query like any other and theoretically you could get the names of your columns from the first row of a csv file.
However, there are some practical problems:
How would you know what data type a certain column is?
How would you know what the indexes are?
How would you get data out of the table / how would you know what column represents what?
As you can't relate your new table to anything else, you are kind of defeating the purpose of a relational database, so you might as well just keep and use the CSV file.
What you are describing sounds like an ETL tool. Perhaps Google for MySQL ETL tools... You are going to have to decide what OS and style you want.
Or just write your own...

Insert image binary from xml data to mysql in PHP

I have some photos (not big, only 8 KB each) in a MySQL database on my desktop; the field type is BLOB. I want to export the table to an XML file and then upload it to my website's database, but it does not succeed. Here is what I have done:
Exporting the data to XML (on my desktop computer):
FileStream fs = new FileStream(filename, FileMode.Create, FileAccess.Write, FileShare.None);
StreamWriter sw = new StreamWriter(fs, Encoding.ASCII);
ds.WriteXml(sw); // write the XML from the DataSet ds
Then I upload the XML from my Joomla website. I load the XML before inserting it into the database:
...
$obj = simplexml_load_file($filename);
$cnt = count($obj->mydata); // mydata is the table name in the xml tag

for ($i = 0; $i < $cnt; $i++) {
    ...
    $myphoto = 'NULL';
    if (!empty($obj->mydata[$i]->myphoto)) {
        $myphoto = base64_encode($obj->mydata[$i]->myphoto);
    }

    // Insert into the database
    $sqlinsert = "insert into jos_myphoto (id,myphoto) values(" . $i . "," . $myphoto . ")";
    ...
}
...
It keeps telling me 'DB function failed'. When the value of $myphoto is NULL the query works well, but if $myphoto is not NULL the error appears. I think there is something wrong with the line
$myphoto = base64_encode($obj->mydata[$i]->myphoto);
I tried removing the base64_encode() call but it didn't work. How can I solve this problem? Sorry for my bad English.
Your data may contain characters which need escaping; use the mysql_real_escape_string() function and try again.
It is always a good habit to store data using this function, which also saves you from SQL injection.
And put quotes around the column data:
$sqlinsert = "insert into jos_myphoto (id, myphoto)
    values(" . $i . ", '" . mysql_real_escape_string($myphoto) . "')";
