I have an HTML select element filled by a SQL query using SELECT DISTINCT...
The idea is to avoid showing duplicate values from the database, and it almost works, but in some cases it gives a problem. To fill the SQL columns I'm using a PHP script that reads a TXT file with delimiters and the explode() function. If I have 10 duplicate values in the TXT file, my HTML shows 2 options instead of only 1... And I noticed that one option matches 9 of the entries in the database, and the other matches 1 entry, which is always the last line of the TXT file.
In short: the last line of the TXT file always shows up as a duplicate in the HTML select.
Checking the database, everything looks OK; I really don't know why it always duplicates on the last one.
I mention the PHP that makes the SQL entry because I'm not sure whether the problem is in the PHP that contains the HTML select or in the PHP that fills the database... I believe the problem is in the PHP with the HTML select, since I'm looking at the database and everything is OK. The SQL query in this PHP is like this:
<td class="formstyle"><select name="basenamelst" class="formstyle" id="basenamelst">
<option value="Any">Any</option>
<?
$sql = mysql_query("SELECT DISTINCT basename FROM dumpsbase where sold=0");
while($row = mysql_fetch_assoc($sql))
{
if($row['basename'] == "")
{
echo '<option value="'.htmlspecialchars($row['basename'], ENT_QUOTES, 'UTF-8').'">unknOwn</option>';
}
else
{
echo '<option value="'.htmlspecialchars($row['basename'], ENT_QUOTES, 'UTF-8').'">'.htmlspecialchars($row['basename'], ENT_QUOTES, 'UTF-8').'</option>';
}
}
?>
</select>
Remember: if I upload 10 duplicate values to the database, the select shows 2 options. One matches 9 entries, and the other matches 1 entry (always the last line of my TXT file)...
Many people told me to trim() the columns and it still shows duplicates... So I came to the conclusion that I have some issue while loading the TXT into the database. Here is the code where I get the values to put into the database:
$file = fopen($targetpath, "r") or exit("Unable to open uploaded file!");
while (!feof($file)) {
    $line = fgets($file);
    $details = explode(" | ", $line);
    foreach ($details as &$value) { // clean each field
        $value = mysql_real_escape_string($value);
        if ($value == "") {
            $value = "NONE";
        }
    }
    unset($value);
    mysql_query("INSERT INTO dumpsbase VALUES ('NULL', '$details[0]', '$details[1]', '$details[2]', '$details[3]', '$details[4]', '0', '$price', 'NONE', now(), 'NONE', 'NONE')") or die("Uploading Error!");
}
fclose($file);
It sounds to me like the error occurs when you populate the table from the file, and that one of the values ends up subtly different from the others.
The fact that it's the last line that differs makes me wonder whether newline characters are being included in each value: fgets() keeps the trailing newline, so the last field of every line except the final one would end in \n, and DISTINCT would then see two different values.
If this is the case, you should be able to correct it by running trim() or similar on the data in your DB.
[Edit] Ideally, you want to do this as early as possible, i.e. correct the data rather than remembering it's wrong every time you access it. If you can't find why the initial import is messing it up, you could correct the data immediately afterwards with UPDATE dumpsbase SET basename = TRIM(basename)
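You can also fix it at import time; a minimal sketch based on the cleaning loop from the question, where only the trim() call is new:
foreach ($details as &$value) { // clean each field
    $value = trim($value);      // strip the trailing "\n" that fgets() keeps
    $value = mysql_real_escape_string($value);
    if ($value == "") {
        $value = "NONE";
    }
}
unset($value);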
Try changing your query to the following:
SELECT DISTINCT TRIM(basename) AS basename FROM dumpsbase WHERE sold=0
(The AS basename alias keeps $row['basename'] working in the loop above.)
Hope this helps.
I've run into a problem that is making me go a bit crazy. I have imported some CSV data into a table in my phpMyAdmin database and am now using a PHP script with mysql_query() to run a simple SELECT query on the database and convert the result into JSON format, e.g. SELECT clients FROM `TABLE 29`.
Basically, some of the columns in the table produce a JSON string after passing them through mysql_query(), but others simply return a blank. I have fiddled for hours now and can't figure out why. The last bit of my code looks like this:
$myquery = "SELECT `clients` FROM `TABLE 29`";
$query = mysql_query($myquery) or die(mysql_error());
if (!$query) {
    echo mysql_error();
    die;
}
$data = array();
for ($x = 0; $x < mysql_num_rows($query); $x++) {
    $data[] = mysql_fetch_assoc($query);
}
echo json_encode($data);
mysql_close($server);
Any help would be greatly appreciated. Could it be something about the data in the table? I'm at a loss.
Thank you!
UPDATE: the length of the strings in the clients column seems to have an effect. When I replace all the text with something shorter (e.g. aaa instead of something like company name 111 - 045 - project name - currency - etc.) it works. However, I need it to be able to handle long strings, as I want it to take whatever users happen to import... What am I doing wrong?
No, it's not about the table, it's about how you loop over the rows. Example:
$data = array();
while ($row = mysql_fetch_assoc($query)) { // while a row of data exists, fetch it into $row as an associative array
    $data[] = $row;
}
echo json_encode($data);
mysql_close($server);
exit;
Note: the mysql extension is deprecated and no longer maintained. Use its improved successor, mysqli, or use PDO instead.
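For reference, a minimal sketch of the same loop with mysqli (the connection credentials are placeholders; fetch_all() needs the mysqlnd driver):
$mysqli = new mysqli('localhost', 'user', 'pass', 'dbname'); // placeholder credentials
$result = $mysqli->query("SELECT `clients` FROM `TABLE 29`") or die($mysqli->error);
$data = $result->fetch_all(MYSQLI_ASSOC); // every row as an associative array
echo json_encode($data);
$mysqli->close();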
After checking all the rows of the data I discovered that the source of the problem was an 'é', yes, an 'e' with an accent. Once I replaced it with a regular 'e' the problem went away. So much time lost for something so tiny :(
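That symptom fits json_encode(): it requires valid UTF-8 and yields an empty result when a string contains bytes in another encoding, such as a latin1 'é'. A sketch of a fix that keeps the accented characters, assuming the data is meant to be UTF-8:
// Tell MySQL to hand rows back as UTF-8 before running the query;
// json_encode() fails on strings that are not valid UTF-8.
mysql_set_charset('utf8', $server);
$query = mysql_query($myquery) or die(mysql_error());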
Here is a function I have in my PHP file:
function deleteLocation() {
    global $con;
    $val = $_POST['id'];
    $escaped = mysqli_real_escape_string($con, $val);
    $sql = "DELETE FROM settings WHERE value = '" . $escaped . "'";
    if (!mysqli_query($con, $sql)) {
        die("Query failed:" . mysqli_error($con));
    } else {
        die("DELETE FROM settings WHERE value = '" . $escaped . "' / num rows affected: " . mysqli_affected_rows($con));
    }
}
Here is the text that is returned on the page
DELETE FROM settings WHERE value = 'asdasd ' / num rows affected: 0
If I take the first part and run it on my phpMyAdmin page,
DELETE FROM settings WHERE value = 'asdasd '
it will correctly delete the row, but as you can see from the output, 0 rows are affected when the script is run on the page.
If anyone can help to fix this I will be very grateful.
PS: The connection string and user permissions are indeed set up correctly, because every other function in this file works properly
EDIT: Got it. The space at the end of the string was a newline character that was sent from my JavaScript.
I tried re-creating your problem on my machine, and the only time I get the same message as you is when that item was already deleted from the table (or when it wasn't there in the first place).
This was a pretty unique situation, so I don't know how much this will help other people, but:
I had an array of strings that all had \n at the end, so I had to run
str[value] = str[value].trim();
for each value in the array. It turns out that this was not a PHP problem, but rather a JS one.
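A server-side safeguard works too; a minimal sketch of the same deleteLocation() function with the input trimmed before escaping:
function deleteLocation() {
    global $con;
    $val = trim($_POST['id']); // strip any stray newline the client sends
    $escaped = mysqli_real_escape_string($con, $val);
    $sql = "DELETE FROM settings WHERE value = '" . $escaped . "'";
    if (!mysqli_query($con, $sql)) {
        die("Query failed:" . mysqli_error($con));
    }
    die("num rows affected: " . mysqli_affected_rows($con));
}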
So I'm trying to make it possible to update a MySQL database by importing a CSV file. The only problem is that some of my data contains commas, which causes values to be imported into the wrong columns. Here's my existing import code.
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");
    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                    '" . addslashes($data[0]) . "',
                    '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
    // redirect
    header('Location: import.php?success=1');
    die;
}
Is there a way I can set it to ignore the commas, quotes and apostrophes in the CSV file?
I would also like to set it to ignore the first line of the CSV, seeing as it's just column information, if that is at all possible.
** EDIT **
For example, the CSV contains data such as "last name, first name" or "User's Data"; these are literally just examples of the data that's actually in there. The data is imported each month and we've just noticed this issue.
Sample Data:
Column 1, Column 2
Item 1, Description
Item 2, Description
Item, 3, Description
Item, 4, Description
"Item 5", Description
"Item, 6", Description
Above is the sample data that was requested.
You might want to use MySQL's built-in LOAD DATA INFILE statement, which will not only run faster but will also let you use the clause FIELDS OPTIONALLY ENCLOSED BY '"' to handle that kind of file.
So your query will be something like this:
mysql_query(<<<SQL
LOAD DATA LOCAL INFILE '{$_FILES['csv']['tmp_name']}'
INTO TABLE songdb
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES (artist, title)
SQL
) or die(mysql_error());
If your data is dirty, the easiest way to handle this will be to clean it up manually, and either use data entry forms that strip out bad characters and/or escape the input data, or tell the users who are generating this data to stop putting commas in fields.
Your example has an inconsistent column count and inconsistent fields, due to a lack of escaping in whatever was used to generate this data.
That said, you could apply some more advanced logic to ignore any comma after "Item" but before a space or digit, using regular expressions (sketched below), but that is getting kind of ridiculous, and depending on the number of rows it may be easier to clean the data up manually before importing.
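A hedged sketch of that regex idea, applied to each raw line before it is parsed as CSV (the pattern is an assumption based on the sample data above):
// Turn "Item, 3, Description" into "Item 3, Description" before the line is split on commas.
$line = preg_replace('/^Item,\s*(\d+)/', 'Item $1', $line);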
In terms of skipping the header row, you can do this:
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");
    $firstRow = true; // must start as true, or the header row is never skipped
    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            // skip header row
            if ($firstRow) {
                $firstRow = false;
                continue;
            }
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                    '" . addslashes($data[0]) . "',
                    '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
    // redirect
    header('Location: import.php?success=1');
    die;
}
Oh, I just read your comment: 5 GB. Wow. Manual cleanup is not an option. You need to look at the range of possible ways the data is screwed up and really assess what logic you need to use to capture the right columns.
Is your example above a representative sample, or could other fields without enclosures have commas?
Try this; it is working fine for me:
ini_set('auto_detect_line_endings', TRUE);
$csv_data = array();
$file_handle = fopen($_FILES['file_name']['tmp_name'], 'r');
while (($data = fgetcsv($file_handle)) !== FALSE) {
    $update_data = array(
        'first'  => $data[0],
        'second' => $data[1],
        'third'  => $data[2],
        'fourth' => $data[3],
    );
    // save this array in your database
}
fclose($file_handle);
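The commented "save" step might look like this inside the loop; the table name, column names, and the $con mysqli connection are assumptions:
// Hypothetical table and columns; $con is an open mysqli connection.
$stmt = mysqli_prepare($con, "INSERT INTO csv_import (first, second, third, fourth) VALUES (?, ?, ?, ?)");
mysqli_stmt_bind_param($stmt, 'ssss', $update_data['first'], $update_data['second'], $update_data['third'], $update_data['fourth']);
mysqli_stmt_execute($stmt);
mysqli_stmt_close($stmt);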
I have a TXT file which holds one line of tab-delimited (\t) data at a time; the file is updated in a loop, and at the end of each iteration its contents are loaded into a table. The current data I am working with has 24 columns and 25 lines, and it all enters fine apart from the 2 lines which have identical first values; of those, only 1 is entered.
I do not have anything set as a PRIMARY KEY or UNIQUE value, yet it just doesn't enter the line with the identical first value, even though values 3, 4 and 5 are all the same as well.
Here is the code in its entirety:
<?php
$conn = mysql_connect('localhost', 'root', '');
mysql_select_db('amazondb', $conn);
mysql_query("TRUNCATE TABLE imported_orders");
$lines = file('F:/xamptest/htdocs/UniProject/upload/Amazon output.txt');
$lineCount = count($lines);
for ($arrayCounter = 0; $lineCount > $arrayCounter; $arrayCounter++) {
    $lineEx = explode("\t", $lines[$arrayCounter]);
    $lineEx[2] = substr($lineEx[2], 0, -15);
    $lineEx[3] = substr($lineEx[3], 0, -15);
    $lineEx[4] = substr($lineEx[4], 0, -15);
    $lineEx[5] = substr($lineEx[5], 0, -15);
    $arri = implode("\t", $lineEx);
    file_put_contents('upload/ImportLine.txt', $arri);
    $r = mysql_query("LOAD DATA INFILE 'F:/xamptest/htdocs/UniProject/upload/ImportLine.txt' INTO TABLE imported_orders FIELDS TERMINATED BY '\t'", $conn);
    echo "Array Counter Loop: " . $arrayCounter . " Order ID: " . $lineEx[0] . "<br>";
}
echo "<br>WLoop has ran: " . $arrayCounter . " Times";
?>
Even using only a loop and the single-line file on its own, it just adds one row. I tried adding a first column as a PK with AUTO_INCREMENT, which doesn't work how I assumed it would: it tries to put the first value from the file into it.
This has had me stumped for days, as there is no apparent reason for it not to work. Ideally I'd like to INSERT the imploded array value directly and define the values by \t like I do now, but from what I've searched that doesn't seem to be possible.
EDIT: I made values 1 and 2 a composite primary key, and it worked; now it doesn't. I just ran it again, for the 6th time, and now it only enters 24 rows again.
Sorry for my clear lack of experience with PHP and SQL.
Thanks for any help.
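For what it's worth, a direct INSERT built from the exploded values is possible; a minimal sketch using the question's own variables (the column layout is assumed to match the table):
// Escape each field and build a plain INSERT, avoiding the temp file + LOAD DATA INFILE round trip.
$escaped = array_map('mysql_real_escape_string', $lineEx);
mysql_query("INSERT INTO imported_orders VALUES ('" . implode("','", $escaped) . "')", $conn)
    or die(mysql_error());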
I have one problem here, and I don't even have a clue what to Google or how to solve this.
I am making a PHP application to export and import data from one MySQL table into another. And I have a problem with these tables.
The source table looks like this:
And my destination table has ID_, and pr0, pr1, pr2 as columns. So it looks like this:
Now the problem is the following: if I just copy (insert every value of the 1st table as a new row in the second), it will have around 20,000 rows instead of 1,000, for example.
Even if I copy every record as a new row in the second database, is there any way I can fuse rows? Basically I need to check whether a value already exists in the last row with that ID_: if that row already has a value in the column (pr2, for example), insert a new row with the value; but if the last row with the same ID_ does not have a value in the pr2 column, just update that row's pr2 column with the value.
I need an idea of how to do it in PHP or MySQL.
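A hedged sketch of that fuse logic in PHP and MySQL (ID_ and pr2 come from the question; the destination table name and an auto-increment id column to define the "last row" are assumptions):
// Try to fill pr2 on the newest row for this ID_ that still lacks a value ...
// ($id and $value are assumed to be escaped already)
mysql_query("UPDATE dest_table SET pr2 = '$value'
             WHERE ID_ = '$id' AND (pr2 IS NULL OR pr2 = '')
             ORDER BY id DESC LIMIT 1");
// ... and only insert a fresh row if nothing could be updated.
if (mysql_affected_rows() == 0) {
    mysql_query("INSERT INTO dest_table (ID_, pr2) VALUES ('$id', '$value')");
}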
So you've got a few problems:
1) Copying the table from SQL to PHP: pay attention to memory usage, and check your script with PHP's memory_get_usage(). It will show you that importing SQL data can be expensive. Look this up. Another thing: PHP doesn't release memory when you overwrite array values, which will be useful to know later on.
2) I didn't understand whether the values are unique at the source or should be unique at the destination table, so I will assume that all the source rows need to be in the destination as-is.
I will also assume that pr = pr0 and quant = pr1.
3) You have mismatched names; that can also be an issue, so take care of that too.
4) I will use the mysql extension as the SQL connector, and assume $db is connected.
SCRIPT:
<?php
$result = mysql_query("SELECT * FROM Table_source");
$insert_data = array();
$bulk = 2000; // flush to the database every 2000 rows
$start_query = "REPLACE INTO DEST_TABLE (ID_, pr0, pr1, pr2)";

while ($row = mysql_fetch_row($result)) {
    // escape each field, then append a '0' for the last column
    $row = array_map('mysql_real_escape_string', $row);
    $insert_data[] = "('" . implode("','", $row) . "','0')";

    if (count($insert_data) >= $bulk) {
        mysql_query($start_query . ' VALUES ' . implode(',', $insert_data))
            or die(mysql_error()); // check the syntax, mostly the SQL part; see that the query is OK
        $insert_data = array();
    }
}

if (count($insert_data) > 0) { // if there are any extra pieces left over...
    mysql_query($start_query . ' VALUES ' . implode(',', $insert_data))
        or die(mysql_error());
    $insert_data = array();
}
?>
It's off the top of my head, but check this idea and tell me if it works. The bugs might be in small things I forgot in the query structure; print the query and paste it into phpMyAdmin or your DB query tool to see that it's all good, but this concept will save a lot of problems.