I'm trying to insert an array into a MySQL table... but my code doesn't work.
$File = 'testfile.csv';
$arrResult = array();
$handle = fopen($File, "r");
$row = 0;
if (empty($handle) === false) {
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $arrResult[] = $data;
        $num = count($data); // 2100 results in my test file
        $row++;
        if ($row > 1) { // ignore header line
            for ($c = 0; $c < $num; $c++) { // start loop
                $sql = '
                    INSERT INTO MyTable (name, class, level, ability)
                    VALUES ("'.$data[0].'","'.$data[1].'","'.$data[2].'","'.$data[3].'")
                ';
                $Add = $db->query($sql);
            }
        }
    }
    fclose($handle);
}
Result in MyTable:
1,Hero1, Warrior, 65, vitality;
2,Hero1, Warrior, 65, vitality;
3,Hero1, Warrior, 65, vitality;
4,Hero1, Warrior, 65, vitality;
...
You don't need the inner for loop. You're inserting the same row multiple times, since $num is the number of fields in the CSV row, not the number of rows.
And instead of checking $row each time through the loop, you can simply read the first line and ignore it before the loop.
if (empty($handle) === false) {
    fgets($handle); // skip header line
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $sql = '
            INSERT INTO MyTable (name, class, level, ability)
            VALUES ("'.$data[0].'","'.$data[1].'","'.$data[2].'","'.$data[3].'")
        ';
        $Add = $db->query($sql);
    }
}
Or, keeping your original structure, just remove the `for` loop and leave the $row check:
if ($row > 1) { // ignore header line
    $sql = '
        INSERT INTO MyTable (name, class, level, ability)
        VALUES ("'.$data[0].'","'.$data[1].'","'.$data[2].'","'.$data[3].'")
    ';
    $Add = $db->query($sql);
}
And of course move to prepared statements to make your code more secure.
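For example, a minimal sketch of the same loop rewritten with a PDO prepared statement (assuming $db is a PDO connection; if you use mysqli the calls differ slightly):
if (empty($handle) === false) {
    fgets($handle); // skip header line
    $stmt = $db->prepare('INSERT INTO MyTable (name, class, level, ability) VALUES (?, ?, ?, ?)');
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        // the driver escapes bound parameters, so quoting bugs and
        // SQL injection are no longer a concern
        $stmt->execute(array($data[0], $data[1], $data[2], $data[3]));
    }
    fclose($handle);
}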
$arr = array();
$row = -1;
if (($handle = fopen("out.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) {
            $arr[$row][$c] = $data[$c];
        }
    }
    fclose($handle);
}
I'm using this code to read data exported from an Excel file; it splits each row into elements divided by a comma (,):
Name, Surname, Num, Tel
Name
Surname
Num
tel
but in one field I have the word Orginal, and this code also splits that element, like this:
Orgina
l
and that way I receive wrong elements. Any help?
In the line
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
the 1000 is the maximum line length to read. As your data (in the example posted) has lines longer than that, the lines get split.
To let it read the data properly you can just leave the second and third parameters at their defaults (null for the length, in other words no limit; and the default delimiter is a comma anyway):
while (($data = fgetcsv($handle)) !== FALSE) {
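In context, the whole read loop then becomes (same variables as in your code, just with the defaults):
$arr = array();
$row = -1;
if (($handle = fopen("out.csv", "r")) !== FALSE) {
    // no length limit: each full line is read no matter how long it is
    while (($data = fgetcsv($handle)) !== FALSE) {
        $arr[++$row] = $data;
    }
    fclose($handle);
}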
I have a problem with this code: the inner while loop only runs once, while the outer while loop iterates correctly. What could be the problem?
Note: $producto_id is an array of ids.
$st_column = 0;
$nd_column = 1;
$posicionArray = 0;
if (($handle = fopen($ruta, "r")) != FALSE) {
    fgetcsv($handle);
    mysqli_query($link, "BEGIN");
    while ($producto_id[$posicionArray]) {
        $ins_producto = mysqli_query($link, "INSERT INTO productos (encuesta_id, producto_id, nom_producto) VALUES ('".$encuesta_id."', '".$producto_id[$posicionArray]."', '".$nombre_producto[$posicionArray]."')");
        while (($data = fgetcsv($handle, 0, "$delimiter")) != FALSE) {
            if ($producto_id[$posicionArray] == $data[$st_column]) {
                $ins_cupon = mysqli_query($link, "INSERT INTO cupones (encuesta_id, producto_id, cupon, estado) VALUES ('".$encuesta_id."', '".$producto_id[$posicionArray]."', '".$data[$nd_column]."', 0)");
            }
        }
        $posicionArray++;
    }
    fclose($handle);
}
I believe you have a CSV that holds ids and coupons, and that you're trying to go through your producto_id array and check whether each id exists in the CSV. The reason the inner while only runs once is that its first pass reads the file to the end; on every later outer iteration the file pointer is already at EOF, so fgetcsv immediately returns FALSE. Reading the CSV into an array first avoids that. This is what I would do:
CSV:
id,cupon
1,15165165
1,16516151
2,16841684
PHP:
function turn_csv_to_array($csv) {
    $result = array();
    if (($h = fopen($csv, "r")) != FALSE) {
        $header = fgetcsv($h);
        while ($row = fgetcsv($h)) {
            $result[] = array_combine($header, $row);
        }
        fclose($h);
    }
    return $result;
}
$coupons = turn_csv_to_array("test.csv");
$product_ids = [1, 2, 3, 4];
foreach ($product_ids as $pid) {
    // INSERT PRODUCT TO PRODUCT_DB
    foreach ($coupons as $c) {
        if ($pid == $c['id']) {
            // INSERT PRODUCT TO COUPONS_DB
        }
    }
}
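Alternatively, if you want to keep your nested loops, the direct fix is to move the file pointer back to the start before each inner pass. A sketch of that idea (note it re-reads the file once per product id, so the array approach above is still preferable):
while ($producto_id[$posicionArray]) {
    rewind($handle);  // back to the start of the file
    fgetcsv($handle); // skip the header line again
    while (($data = fgetcsv($handle, 0, $delimiter)) !== FALSE) {
        // ... compare $data[$st_column] against $producto_id[$posicionArray] ...
    }
    $posicionArray++;
}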
It all depends on what you meant by:
($data = fgetcsv($handle, 0, "$delimiter"))
If you meant:
($data == fgetcsv($handle, 0, "$delimiter"))
that is, comparing the value of $data with the result of fgetcsv, then your code is wrong and you should switch to "==".
If you really meant the assignment:
($data = fgetcsv($handle, 0, "$delimiter"))
and fgetcsv can return a falsy value such as "0", then that is why the loop ends early.
Using assignments in the middle of if conditions is generally considered bad practice. When you do it, the assigned value itself is evaluated as the boolean expression, and PHP treats 0, "0", "", null, false and an empty array as false; anything else counts as true. That is why the usual idiom compares the result strictly with !== FALSE.
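A small standalone illustration of that truthiness rule (not from the question's code):
$values = array("abc", "0", "", null, 7);
foreach ($values as $v) {
    // "abc" and 7 print truthy; "0", "" and null print falsy
    echo var_export($v, true) . " is " . ($v ? "truthy" : "falsy") . "\n";
}
// which is why loops over fgetcsv should compare strictly:
// while (($data = fgetcsv($handle)) !== FALSE) { ... }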
I currently have some code like this:
$handle = fopen($_FILES['file']['tmp_name'], "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    if ($i > 0) {
        $sql = "
            insert into TABLE(A, B, C, D)
            values ('$data[0]', '$data[1]', '$data[2]', '$data[3]')
        ";
        $stmt = $dbh->prepare($sql);
        $stmt->execute();
    }
    $i++;
}
fclose($handle);
This allows me to write the contents of a CSV file to a certain table, excluding the first row, where all the column names are. I want to process only the filled rows and skip the blank ones. How would I do that with this code?
fgetcsv returns an array consisting of a single null if the row is empty (see http://www.php.net/manual/en/function.fgetcsv.php), so you should be able to do a check based on that:
if ($data[0] === null) {
    continue;
}
or something like that
fgetcsv() returns array(null) for blank lines, so you can do something like below.
$handle = fopen($_FILES['file']['tmp_name'], "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    if (array(null) === $data) { // ignore blank lines
        continue;
    }
    if ($i > 0) {
        $sql = "
            insert into TABLE(A, B, C, D)
            values ('$data[0]', '$data[1]', '$data[2]', '$data[3]')
        ";
        $stmt = $dbh->prepare($sql);
        $stmt->execute();
    }
    $i++;
}
fclose($handle);
Based on the documentation, fgetcsv will return an array consisting of a single null value for empty rows, so you should be able to test the return value against that and skip blank lines that way.
The following example code will skip processing blank lines. Note that I have changed the file and removed some other logic to make it more easily testable.
<?php
$handle = fopen("LocalInput.txt", "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    if ($data == array(null)) continue;
    var_dump($data);
    $i++;
}
fclose($handle);
?>
I am trying to import a CSV file. Due to the program we use, the first row is basically all headers, which I would like to skip since I've already added my own headers via HTML. How can I get the code to skip the first row of the CSV? (The strpos check is there to cut off the first field in all the rows.)
<?php
$row = 1;
if (($handle = fopen("ptt.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++) {
            if (strpos($data[$c], 'Finished') !== false) {
                $c++;
                echo "<TR> <TD nowrap>" . $data[$c] . "</TD>";
            } else {
                echo "<TD nowrap>" . $data[$c] . "</TD>";
            }
        }
    }
    fclose($handle);
}
?>
Rather than using an if condition to check whether it is the first row, a better solution is to add one extra line of code just before the while loop:
....
.....
fgetcsv($handle); // reads and discards the first line, so the loop below starts from the second line
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
.......
.......
It is just that simple.
As you are keeping track of the row number anyway, you can use continue to skip the rest of the loop for the first row.
For example, add this at the start of your while loop (just above $num = count($data)):
if($row == 1){ $row++; continue; }
There are other ways to do this, but just make sure that when you continue, $row still gets incremented, or the check will match every row and skip them all!
Please use the following lines of code:
$flag = true;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if ($flag) { $flag = false; continue; }
    // your code for the insert
}
Starting with $flag set to true and flipping it to false on the first iteration skips the first line of the CSV file. This is simple and easy to implement.
Put this inside your while loop, directly after the $row++ line (with $row starting at 1 as in your code, the first data row is the one where $row has just become 2):
if ($row == 2) continue;
Add this in the body of the while loop above the $row++;, incrementing inside it so the check doesn't keep matching:
if ($row == 1) {
    $row++;
    continue;
}
$count = 0;
while (($fields = fgetcsv($handle, 0, ",")) !== FALSE) {
    $count++;
    if ($count == 1) { continue; }
this worked for me:
$count = 0;
while (!feof($file)) {
    $entry = fgetcsv($file, 0, ';');
    if ($count > 0) {
        // not the first (header) line: process $entry here
    }
    $count++;
}
Use this code:
<?php
// mysql hostname
$hostname = 'localhost';
// mysql username
$username = 'root';
// mysql password
$password = '';

if (isset($_FILES['file'])) {
    // get the csv file and open it up
    $file = $_FILES['file']['tmp_name'];
    // $handle is a valid file pointer to a file successfully opened by fopen(), popen(), or fsockopen()
    $handle = fopen($file, "r");

    try {
        // database connection using PDO
        $dbh = new PDO("mysql:host=$hostname;dbname=clasdb", $username, $password);
        // prepare for insertion
        $STM = $dbh->prepare('INSERT INTO statstrackertemp (ServerName, HiMemUti, AvgMemUti, HiCpuUti, AvgCpuUti, HiIOPerSec, AvgIOPerSec, HiDiskUsage, AvgDsikUsage) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)');
        if ($handle !== FALSE) {
            // fgets() reads the first line from $handle so the loop below skips it
            fgets($handle);
            while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
                $STM->execute($data);
            }
            fclose($handle);
        }
    } catch (PDOException $e) {
        die($e->getMessage());
    }
    echo 'Data imported';
} else {
    echo 'Could not import Data';
}
?>
I'm having a really troublesome time trying to import a large CSV file into mysql on localhost.
The CSV is about 55 MB and has about 750,000 rows.
I've rewritten the script so that it parses the CSV and dumps the rows one by one.
Here's the code:
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            $arr = explode('|', $data[$c]);
            $postcode = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat = mysql_real_escape_string($arr[6]);
            $lng = mysql_real_escape_string($arr[7]);
            mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
The problem is that it's taking forever to execute. Any suggested solutions would be appreciated.
You are reinventing the wheel. Check out the mysqlimport tool, which comes with MySQL. It is an efficient tool for importing CSV data files.
mysqlimport is a command-line interface for the LOAD DATA LOCAL INFILE SQL statement.
Either should run 10-20x faster than doing INSERT row by row.
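To give an idea, here is a sketch of the kind of statement mysqlimport wraps, run through mysqli (assumptions: a pipe-delimited line layout matching your explode('|', ...) fields, LOCAL INFILE enabled via mysqli.allow_local_infile, and the slug columns filled in a separate pass since they come from your toAscii() helper):
$mysqli = new mysqli('localhost', 'db_user', 'my_password', 'mydb');
$sql = "LOAD DATA LOCAL INFILE 'postal_codes.csv'
        INTO TABLE cities
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        (@id, postcode, city_name, prov_name, prov_abbr, @skip, lat, lng)";
$mysqli->query($sql) or die($mysqli->error);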
Your problem is likely that you have autocommit on (by default) so MySQL is committing a new transaction for each insert. You should turn autocommit off with SET autocommit=0;. If you can switch to using the mysqli library (and you should if possible), you can use mysqli::autocommit(false) to turn off autocommitting.
$mysqli = new mysqli('localhost','db_user','my_password','mysql');
$mysqli->autocommit(false);
// bind_param takes its arguments by reference, so bind plain variables
// once and re-assign them inside the loop before each execute()
$stmt = $mysqli->prepare("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
    values (?, ?, ?, ?, ?, ?, ?, ?)");
$stmt->bind_param('ssssssdd', $postcode, $city_name, $city_slug, $prov_name, $prov_slug, $prov_abbr, $lat, $lng);
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            $arr = explode('|', $data[$c]);
            $postcode = $arr[1];
            $city_name = $arr[2];
            $city_slug = toAscii($arr[2]);
            $prov_name = $arr[3];
            $prov_slug = toAscii($arr[3]);
            $prov_abbr = $arr[4];
            $lat = $arr[6];
            $lng = $arr[7];
            $stmt->execute();
        }
    }
    fclose($handle);
}
$mysqli->commit();
It will be much faster to use LOAD DATA if you can.
Otherwise, try to do it in one query.
A single huge statement can be limited by your my.cnf (MySQL configuration) though, in particular by max_allowed_packet.
<?php
$row = 1;
// build one multi-row INSERT: the column list appears once, followed by
// a comma-separated list of value tuples
$query = "insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`) values ";
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            $arr = explode('|', $data[$c]);
            $postcode = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat = mysql_real_escape_string($arr[6]);
            $lng = mysql_real_escape_string($arr[7]);
            $query .= "('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng'),";
        }
    }
    fclose($handle);
}
mysql_query(rtrim($query, ",")) or die(mysql_error());
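With around 750,000 rows a single statement will almost certainly exceed max_allowed_packet, so a safer variant flushes the accumulated tuples every N rows. A sketch in the same mysql_* style (the batch size of 1000 is an arbitrary assumption; tune it to your packet limit):
$batch = array();
$batchSize = 1000;

function flush_batch(&$batch) {
    if (empty($batch)) return;
    mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`) values "
        . implode(",", $batch)) or die(mysql_error());
    $batch = array();
}

// inside the for loop, instead of appending to one giant $query:
$batch[] = "('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')";
if (count($batch) >= $batchSize) {
    flush_batch($batch);
}

// after the loops have finished, send whatever is left:
flush_batch($batch);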
If that doesn't work, you can try this (disable automatic commit):
mysql_query("SET autocommit = 0");
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            $arr = explode('|', $data[$c]);
            $postcode = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat = mysql_real_escape_string($arr[6]);
            $lng = mysql_real_escape_string($arr[7]);
            mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
mysql_query("COMMIT"); // don't forget to commit, or the inserts are lost when the connection closes
I did this with SQL Server: I used the SQL bulk-insert command combined with data tables.
Data tables reside in memory and are built from reading rows inside the file.
Each data table is built from a chunk of rows, not the entire file.
Keep track of the chunk processed by remembering the last row read and the maximum chunk size.
While reading the file, exit the loop whenever the row id exceeds the last row plus the chunk size.
Then keep looping, and keep inserting.
Also, when you are using LOAD DATA the import will stop if it hits errors (such as duplicate keys); the IGNORE keyword turns them into warnings so the import continues:
LOAD DATA INFILE 'file Path' IGNORE INTO TABLE YOUR_Table
I had a similar situation where it was NOT feasible to use LOAD DATA. Transactions were at times unacceptable as well, as data needed to be checked for duplicates. Yet, the following drastically improved the processing time for some of my import files.
Before your while loop (CSV Lines) set autocommit to 0 and start a transaction (InnoDB only):
mysql_query('SET autocommit=0;');
mysql_query('START TRANSACTION;');
After your loop, commit and reset autocommit back to 1 (default):
mysql_query('COMMIT;');
mysql_query('SET autocommit=1;');
Replace mysql_query() with whatever Database object your code is using. I hope this helps others.
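For instance, a PDO version of the same pattern might look like this (a sketch; $pdo, the table and the columns are placeholders for your own):
// assuming $pdo is an existing PDO connection to an InnoDB table
$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO cities (postcode, city_name) VALUES (?, ?)');
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $stmt->execute(array($data[0], $data[1]));
}
$pdo->commit(); // one commit for the whole batch instead of one per row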