Insert 10,000 rows of CSV data into MySQL without skipping any rows - PHP

I need to insert my CSV data into MySQL with an INSERT query. My CSV currently has 9,976 rows of data, but after running the query, 9-10 rows are skipped seemingly at random after the first 1,000 rows, and it is very hard to find out which rows were skipped.
First I access the folder, locate the file, and take the first CSV row as the header for my table. Then I run the query to insert the data into MySQL, but it skips some random rows.
$dir = "C:\Users\\" . strtolower($username) . "\Downloads";
$fp = opendir($dir);
$latest_file = glob($dir . "\\Filter_ Tempo-jql-AP*");
closedir($fp);
$filepath = $latest_file[0];
$tablename = "aht_tracker";
$dbname = "ford_resource_capacity";
$conn = mysqli_connect("localhost", "root", "", $dbname) or die(mysqli_connect_error());
$fields = "";
$fieldsinsert = "";
if (($h = fopen($filepath, "r")) !== FALSE) {
    // first row: find the columns we need and build the column list
    if (($data = fgetcsv($h, 100000, ",")) !== FALSE) {
        $issuekey = array_search("Issue key", $data);
        $hours = array_search("Hours", $data);
        $username = array_search("Username", $data);
        $issuetype = array_search("Issue Type", $data);
        $workdescription = array_search("Work Description", $data);
        $new_array = array($issuekey, $hours, $username, $issuetype, $workdescription);
        $fieldsinsert .= '(';
        foreach ($new_array as $key => $value) {
            $fieldsinsert .= ($key == 0) ? '' : ', ';
            $fieldsinsert .= "`" . str_replace(" ", "_", $data[$value]) . "`";
            $fields .= "`" . str_replace(" ", "_", $data[$value]) . "` varchar(250) DEFAULT NULL,";
        }
        $fieldsinsert .= ')';
    }
    // remaining rows: build one INSERT per row and run it
    while (($data = fgetcsv($h, 100000, ",")) !== FALSE) {
        $fieldsInsertvalues = "";
        foreach ($new_array as $key => $value) {
            $fieldsInsertvalues .= ($key == 0) ? '(' : ', ';
            $fieldsInsertvalues .= "'" . $data[$value] . "'";
        }
        $fieldsInsertvalues .= ')';
        $sql1 = "INSERT INTO " . $tablename . " " . $fieldsinsert . " VALUES " . $fieldsInsertvalues;
        mysqli_query($conn, $sql1);
    }
    fclose($h);
    //unlink($filepath);
}
Show me some code which will help me insert all the rows from my CSV data, or give me some idea whether it is possible to insert the CSV data in batches of 500 rows at a time.
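For reference, MySQL's multi-row INSERT syntax does make 500-row packets possible. Below is a sketch of just the batching and string-building logic, kept free of the database call so it runs standalone; addslashes() stands in for mysqli_real_escape_string($conn, ...), and the table/column names are only examples taken from the question.

```php
<?php
// Build multi-row INSERT statements in chunks of $batchSize rows.
// addslashes() is a stand-in escaper so this sketch runs without a DB;
// against a live connection, use mysqli_real_escape_string($conn, $v).
function buildBatchedInserts(array $rows, string $table, array $cols, int $batchSize): array
{
    $queries = [];
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $values = array_map(function (array $row) {
            $escaped = array_map(fn($v) => "'" . addslashes($v) . "'", $row);
            return '(' . implode(', ', $escaped) . ')';
        }, $chunk);
        $queries[] = "INSERT INTO `$table` (`" . implode('`, `', $cols) . "`) VALUES "
                   . implode(', ', $values);
    }
    return $queries;
}

// 1200 fake rows -> 3 statements (500 + 500 + 200 rows each)
$rows = array_map(fn($i) => ["AP-$i", "1.5"], range(1, 1200));
$queries = buildBatchedInserts($rows, 'aht_tracker', ['Issue_key', 'Hours'], 500);
echo count($queries); // 3
```

Each resulting statement can then be passed to mysqli_query() in turn; one round trip per 500 rows is far fewer than one per row.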

https://dev.mysql.com/doc/refman/8.0/en/load-data.html
It is more effective to use the MySQL LOAD DATA command to import the data from the CSV directly. You can specify the field enclosure character, the separator, and so on.
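For a file like the one in the question, such a statement might look roughly like this (the table and column names come from the question; the path, separator, enclosure, and line terminator are assumptions you would adjust to your file):

```sql
-- Hypothetical sketch: adjust path, separator and enclosure to your file.
LOAD DATA LOCAL INFILE 'C:/Users/me/Downloads/export.csv'
INTO TABLE aht_tracker
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(Issue_key, Hours, Username, Issue_Type, Work_Description);
```

Because the parser understands the enclosure character, quotes inside field values do not break the import the way they break hand-concatenated INSERT strings.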

Now I understand why my program skips some random lines.
It is not actually skipping random lines: it skips the whole row whenever that row contains a single quote in any of its columns.
So I am asking all of you again: does anyone have a solution for handling a column that contains a single quote? I do not want to skip the whole row, I just want to skip that one cell from the CSV.
Please update my code so the program can skip the particular cells that contain a single quote.
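Rather than skipping the offending cell, the usual fix is to escape every value before it is concatenated into the SQL string, so a quote inside the data no longer terminates the string literal. A minimal sketch of just the row-building step; addslashes() is a stand-in here so the snippet runs without a DB connection, and against a live connection you would use mysqli_real_escape_string($conn, $value) instead:

```php
<?php
// Quote and escape each cell before concatenating it into VALUES (...).
// addslashes() is a stand-in; with a live mysqli connection, use
// mysqli_real_escape_string($conn, $value) for correct escaping.
function buildValuesRow(array $cells): string
{
    $escaped = array_map(fn($v) => "'" . addslashes($v) . "'", $cells);
    return '(' . implode(', ', $escaped) . ')';
}

echo buildValuesRow(["AP-1", "O'Brien", "1.5"]);
// ('AP-1', 'O\'Brien', '1.5')
```

A prepared statement with bound parameters (mysqli_prepare / bind_param) avoids manual escaping entirely and is the safer long-term choice.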

Related

Php Upload CSV and Get Column data

What I am trying to do is upload a CSV file with PHP. The first line holds the column names and the data follows below (of course). The column names can change depending on what the end user uploads, so the columns we need can change position (A1 or B1, etc.). So let's say the column I need is B1, and I need to get all the data in column B. I'm not sure how to go about it. So far this is what I have. Any ideas?
ini_set("allow_url_fopen", 1);
$handle = fopen($_FILES['fileToUpload']['tmp_name'], 'r') or die('cannot open the file');
while (!feof($handle)) {
    $data[] = fgetcsv($handle);
}
var_dump($data);
fclose($handle);
UPDATE:
I am importing this file from .CSV into PHP.
I need to search for the column headers that start with "SKU" and then "COST".
Once those are found, I want the whole columns (B and E here). But those column letters can change, depending on how the file is exported by the end user. I do not need the rows, just the columns.
Once the file is uploaded to the server, use something like the following code to parse it and actually use it as an array:
Code:
$filename = "upload/sample.csv";
if (($handle = fopen($filename, 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 1000, ",")) !== FALSE) {
        print_r($row);
    }
}
That's one way of doing it; you could also read more about it here.
If you want the value of a specific column for each row, you need to loop through the results and pick it out. It looks like you are getting an array of arrays, so... (EDITED to get the column based on the header name):
$header = $data[0];
unset($data[0]); // delete the header row so those values don't show in results
$sku_index = '';
$cost_index = '';
// get the index of the desired columns by name
for ($i = 0; $i < count($header); $i++) {
    if ($header[$i] == 'SKU') {
        $sku_index = $i;
    }
    if ($header[$i] == 'COST') {
        $cost_index = $i;
    }
}
// loop through each row and grab the values for the desired columns
foreach ($data as $row) {
    echo $row[$sku_index];
    echo $row[$cost_index];
}
That should get you what you want.
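As a side note, the index-lookup loop above can be shortened with array_search(), which returns the position of a header name (or false if the header is missing). A standalone check with a made-up header line:

```php
<?php
// array_search() returns the numeric index of a header value, or false
// when the header is absent (so check for false before using it).
$header = str_getcsv("ID,SKU,NAME,PRICE,COST");
$sku_index = array_search('SKU', $header);
$cost_index = array_search('COST', $header);
echo $sku_index;  // 1
echo $cost_index; // 4
```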

Want to insert data in oracle table from 2nd row of a text file in php

I want to insert data into an Oracle database from a text file. The first row of the text file contains the header, and I want to skip that first line. Below is my code.
for ($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++) {
    // the query uses placeholders for the data
    $sql_insert = oci_parse($conn, 'insert into auto_debit_data_input (input_id, req_by, company_name) values (auto_debit_input_id_seq.nextval, :req_by, :company_name)');
    oci_bind_by_name($sql_insert, ':req_by', $data[0]);
    oci_bind_by_name($sql_insert, ':company_name', $data[1]); // second column, not $data[0]
    $result = oci_execute($sql_insert);
    if (!$result) {
        $errmsg = "No data inserted. Please check all fields";
        //exit;
    }
}
Below is my file data.
REQ_BY,Name
Mr X, Bangladesh
Mr Y, India
My code starts inserting from the header line of the file, but I want it to start from the 2nd line. Please help me fix this.
I have also tried the line below, but with no luck:
for($i =2;($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++)
Just add the following lines at the start of your loop (if you are starting from $i = 1):
if ($i == 1) {
    continue;
}
Not tested, but I hope it will work.
Logic:
Skip the first iteration of the loop... :)
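An equivalent approach, with no counter at all, is to read and throw away the header line once before the loop starts. A standalone sketch using the sample data from the question, written to an in-memory stream so it runs without a real file:

```php
<?php
// Discard the header with a single fgetcsv() call before the loop.
// php://memory gives us a file handle without touching the disk.
$handle = fopen('php://memory', 'r+');
fwrite($handle, "REQ_BY,Name\nMr X, Bangladesh\nMr Y, India\n");
rewind($handle);

fgetcsv($handle, 10000, ','); // read and ignore the header row

$rows = [];
while (($data = fgetcsv($handle, 10000, ',')) !== false) {
    $rows[] = $data; // only data lines land here
}
fclose($handle);

echo count($rows); // 2
echo $rows[0][0];  // Mr X
```

In the original code, that one discarding fgetcsv() call would go right before the for loop, and the $i bookkeeping could be dropped.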

How to loop through CSV mysql update quicker?

I have a CSV file with 26,000 rows which I'm looping through, updating records (sometimes multiple per row) in a table with 250,000+ records. At the moment it takes ages! I was wondering if there is an alternative way to do this more quickly (in code, in MySQL, etc.).
$row = 1;
if (($handle = fopen("zip-codes-database-DELUXE-BUSINESS2.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        if ($row > 1) {
            // area code is in column 20, name in column 21
            $insert = "UPDATE " . TBLPREFIX . "cities SET data = '" . escape(serialize($data)) . "' WHERE area_code = " . $data[20] . " AND title = '" . trim(strtoupper($data[21])) . "'";
            mysql_query($insert) or die(mysql_error());
        }
        $row++;
    }
    fclose($handle);
}
Based on nothing, I might try:
get the CSV into a table via the command line or LOAD DATA INFILE
update the records into a temp table using an INSERT ... SELECT where you join the old and new
move the temp table back onto the original (delete/rename)
Seems like it would be faster, if a bit kludgy.
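The steps above might look roughly like this in SQL. All table and column names beyond the question's cities/area_code/title are assumptions, and the UPDATE ... JOIN variant shown here collapses the temp-table shuffle into a single statement:

```sql
-- 1. bulk-load the CSV into a scratch table (hypothetical layout)
LOAD DATA LOCAL INFILE 'zip-codes-database-DELUXE-BUSINESS2.csv'
INTO TABLE tmp_zip
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES;

-- 2. one joined UPDATE instead of 26,000 single-row UPDATEs
UPDATE cities c
JOIN tmp_zip t ON c.area_code = t.area_code AND c.title = t.title
SET c.data = t.data;

-- 3. clean up
DROP TABLE tmp_zip;
```

The win comes from replacing 26,000 client round trips with one bulk load plus one set-based update; an index on (area_code, title) makes the join fast.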

Get specific data from csv file based on mysql value using php

Hi,
so this is the setup: I need to update some prices from a CSV file called pricelist.csv. The database table is called products and has a column called product_id, which holds the product IDs that can also be found in the first column of the CSV file. The prices I need are in the 7th column of the CSV file, and I need to write them to the price column of my database.
I have tried my best to come up with the code, but it just seems too much for my skill level. Here is what I made:
<?php
include("admin/include/db.php");
$res = mysql_query("select * from products");
$row = 1;
$mycsvfile = array(); // define the main array
if (($handle = fopen("pricelist.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $num = count($data);
        $row++;
        $mycsvfile[] = $data;
    }
    fclose($handle);
}
$row['product_id'] = $mycsvfile[$which_row][1]; // trying to find the row in the csv
$mycsvfile[$which_row][7] = $price; // should get the price, but the previous line does not work
while ($row = mysql_fetch_array($res)) {
    mysql_query("update products set price='" . $price . "', isavailable='1' where id='" . $row['id'] . "'");
}
?>
Any sort of help is welcome! Thanks.
I think you are looking for this (but I didn't test it):
<?php
include("admin/include/db.php");
if (($handle = fopen("pricelist.csv", "r")) !== FALSE) {
    while (($r = fgetcsv($handle, 1000, ";")) !== FALSE) {
        // backticks (not double quotes) are what MySQL expects around identifiers
        mysql_query('UPDATE products SET `price`="' . $r[6] . '", `isavailable`="1" WHERE `id`="' . $r[0] . '"');
    }
}
Disclaimer: yes, I know I didn't sanitize the data, but I don't feel like working with the outdated mysql functions.
You can use file() to read the CSV file into an array. Then use str_getcsv() to turn each item of that array into an array representing a row from the CSV file. Then you pull the data from that row array into an update query and run it.
Like this:
$id = 0;
$price = 6;
$csv_file = file('pricelist.csv');
foreach ($csv_file as $row) {
    $data = str_getcsv($row, ';'); // the question's file is semicolon-delimited
    $query = "UPDATE products SET price = {$data[$price]}, isavailable = '1' WHERE `id` = {$data[$id]}";
    // run the query
}
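One detail worth flagging: str_getcsv() splits on a comma by default, while pricelist.csv in the question is semicolon-delimited, so the separator must be passed explicitly. A quick standalone check (the line of data is made up):

```php
<?php
// str_getcsv() defaults to "," as the separator; pass ";" explicitly
// for a semicolon-delimited pricelist line (sample data is made up).
$line = "1001;Widget;;;;;19.99";
$data = str_getcsv($line, ';');
echo $data[0]; // product id -> 1001
echo $data[6]; // price      -> 19.99
```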

Parsing CSV File to MySQL DB in PHP

I have a CSV file of some 350 lines with all sorts of vendors that fall into Clothes, Tools, Entertainment, etc. categories. Using the following code I have been able to print out my CSV file.
<?php
$fp = fopen('promo_catalog_expanded.csv', 'r');
echo '<tr><td>';
echo implode('</td><td>', fgetcsv($fp, 4096, ','));
echo '</td></tr>';
while (!feof($fp)) {
    list($cat, $var, $name, $var2, $web, $var3, $phone, $var4, $kw, $var5, $desc) = fgetcsv($fp, 4096);
    echo '<tr><td>';
    echo $cat . '</td><td>' . $name . '</td><td>' . $web . '</td><td>' . $phone . '</td><td>' . $kw . '</td><td>' . $desc . '</td>';
    echo '</td></tr>';
}
fclose($fp);
show_source(__FILE__);
?>
The first thing you will probably notice is the extraneous vars within the list(). This is because of how the Excel spreadsheet/CSV file is laid out:
Category,,Company Name,,Website,,Phone,,Keywords,,Description
,,,,,,,,,,
Clothes,,4imprint,,4imprint.com,,877-466-7746,,"polos, jackets, coats, workwear, sweatshirts, hoodies, long sleeve, pullovers, t-shirts, tees, tshirts,",,An embroidery and apparel company based in Wisconsin.
,,Apollo Embroidery,,apolloemb.com,,1-800-982-2146,,"hats, caps, headwear, bags, totes, backpacks, blankets, embroidery",,An embroidery sales company based in California.
One thing to note is that the last line starts with two commas, as it also falls within the "Clothes" category.
My concern is that I am going about the CSV output wrong.
Should I be using a foreach loop instead of this list() approach?
Should I first get rid of any unnecessary blank columns?
Please advise on any flaws you may find and improvements I can make, so I can be ready to import this data into a MySQL DB.
I'm not sure of the overall structure of your CSV - it's hard to make rule assumptions based on two lines... but something like the following should work:
$fp = fopen('promo_catalog_expanded.csv', 'r');
// normalize the column names (lowercase the header values)
$columns = array_map('strtolower', fgetcsv($fp, 0, ','));
$lastCategory = null;
while (false !== ($data = fgetcsv($fp, 0, ','))) {
    $data = array_combine($columns, $data); // make it an assoc array
    // if category has no value, reuse the last category seen
    if (empty($data['category']) && null !== $lastCategory) {
        $data['category'] = $lastCategory;
    }
    // if we have started a new set of entries for a category, make it the $lastCategory
    if ($lastCategory !== $data['category'] && null !== $data['category']) {
        $lastCategory = $data['category'];
    }
    // your sql to do the insert
}
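To check the category carry-forward idea without a file or database, the same logic can be run against in-memory lines (sample rows adapted from the question, using numeric indexes instead of the assoc array):

```php
<?php
// Standalone sketch of the category carry-forward: parse in-memory CSV
// lines and fill an empty first column with the last non-empty category.
$lines = [
    "Clothes,,4imprint,,4imprint.com",
    ",,Apollo Embroidery,,apolloemb.com",
];
$lastCategory = null;
$rows = [];
foreach ($lines as $line) {
    $data = str_getcsv($line);
    if ($data[0] === '' && $lastCategory !== null) {
        $data[0] = $lastCategory; // inherit the category from the row above
    } else {
        $lastCategory = $data[0];
    }
    $rows[] = $data;
}
echo $rows[1][0]; // Clothes
```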
