Value in Column - php

I am developing a small hobby application. Though I've worked with MySQL and PostgreSQL before, I'm more of a n00b here and would appreciate any help.
I have a table in my MySQL database called "TECH". This table has two columns: "ID" (primary key) and "name" (name of the tech - not a key of any sort). Here are a couple of example rows:
+----+--------+
| ID | name   |
+----+--------+
|  1 | Python |
|  2 | ASP    |
|  3 | java   |
+----+--------+
Here is the code that creates TECH:
CREATE TABLE TECH (
    id INT(5),
    name VARCHAR(20),
    PRIMARY KEY (id)
);
I have developed an HTML form for the user to input a new technology into TECH. However, I would like to ensure that duplicate entries do not exist in TECH. For example, the user should not be allowed to enter "Python" again and have it assigned ID 4. Further, the user should also not be allowed to enter "pYthon" (or any other variant of capitalization) under another ID.
Currently, I have the following code that does this (on the PHP side, not the MySQL side):
// I discovered that MySQL is not case sensitive with TECH.name
$rows = 0;
$result = mysql_query("SELECT * FROM tech AS T WHERE T.name='python'");
while ($row = mysql_fetch_array($result)) {
    $rows += 1;
}
if ($rows != 0) {
    echo "'python' cannot be inserted as it already exists";
} else {
    // insertion code
}
Now, I know that the correct way to do this would be to constrain TECH.name to be UNIQUE by doing UNIQUE (name) and catching an "insert error" on the PHP side.
However, I have the following two questions regarding this process:
Does defining the UNIQUE constraint maintain the apparent case-insensitivity addressed above?
How do I go about catching exactly such an insert error on the PHP side?
I'd appreciate any help with this or any better ideas that anyone has.

When you manipulate MySQL from PHP (i.e. by doing an INSERT or UPDATE), you can call mysql_affected_rows(), which returns the number of rows affected. If nothing was inserted because of the UNIQUE constraint (for example with INSERT IGNORE), the affected row count will be 0.
http://php.net/manual/en/function.mysql-affected-rows.php
I usually check the number of rows returned from that function. The same check can be applied if you take the INSERT IGNORE approach.
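A minimal sketch of that check, using the legacy mysql_* API from the question (it assumes a UNIQUE index has already been added on tech.name):

// assumes: ALTER TABLE tech ADD UNIQUE (name); has been run
mysql_query("INSERT IGNORE INTO tech (id, name) VALUES (4, 'Python')");
if (mysql_affected_rows() === 0) {
    // the row was silently skipped, so the name already exists
    echo "'Python' cannot be inserted as it already exists";
} else {
    // insertion succeeded
}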

Try:
INSERT IGNORE INTO mytable
(primaryKey, field1, field2)
VALUES
('abc', 1, 2),
('def', 3, 4),
('ghi', 5, 6);
Duplicate rows will be ignored.

Changing the collation of the column to a _ci (case-insensitive) or _cs (case-sensitive) one determines whether a unique key on it is case-insensitive or case-sensitive.
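For example, a minimal sketch (assuming the table from the question and a utf8 character set; utf8_general_ci compares case-insensitively, utf8_bin compares in a case-sensitive, binary way):

ALTER TABLE TECH MODIFY name VARCHAR(20) CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE TECH ADD UNIQUE (name);

With that in place, inserting 'pYthon' after 'Python' raises duplicate-key error 1062.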
As for catching the error, you should try using mysqli or PDO to run db queries: http://www.php.net/manual/en/pdo.exec.php
You can catch a duplicate error entry with PDO like so:
try {
    $dbh->exec($mySqlQuery);
    // insert was successful...
} catch (PDOException $e) {
    if ($e->errorInfo[1] == 1062) {
        // a 'duplicate' error occurred...
    } else {
        // a non 'duplicate' error occurred...
    }
}
Edit:
If you're not using PDO, this should work after your mysql_query:
if (mysql_errno() == 1062)
{
// you have a duplicate error...
}


Mysql auto increment issue

function randomUnique(){
    return rand(0, 9999); // generate random key
}

function insert($uid, $name, $email){
    $link = mysqli_connect("localhost", "root", "", "dummy");
    $query = "insert into `usertbl`(`uid`,`name`,`email`)
              values('".$uid."','".$name."','".$email."');";
    if(mysqli_query($link, $query)){
        return 1;
    }else if(mysqli_errno($link) == 1062){
        // duplicate uid: retry with a new random key
        return insert(randomUnique(), $name, $email);
    }else{
        return 2; // unsuccessful query
    }
}

$uid = randomUnique();
$name = "sam";
$email = "sam@domain.com";
$msg_code = insert($uid, $name, $email);
echo $msg_code;
I have 4 columns in the table:
id (PK, AI), uid (varchar, unique), name (varchar), email (varchar).
When I want to create a new user entry, a random key is generated using the function 'randomUnique()'. The column 'id' is set to AUTO_INCREMENT, so it tries to input the details, but if the key repeats, error number 1062 is returned from MySQL. Everything runs well except for the id column, which is set to AI: its value is skipped once whenever a key is a duplicate.
The above code is a recursive function so the number of values skipped in column 'id' is directly proportional to the number of times the function is called.
Example:
id | uid  | name | email
1  | 438  | dan  | dan@domain.com
2  | 3688 | nick | nick@domain.com
4  | 410  | sid  | sid@domain.com
Here we can see that id 3 has been skipped, because the random number function produced 438 or 3688 again, which threw back an error; our recursive function repeated once, skipping number 3 and entering 4 the next time on successful execution.
I need to fix the auto increment so it enters the values in the proper sequence.
I cannot change the structure of the table.
You can check whether an entry already exists with that uid before performing the INSERT operation, e.g.
SELECT COUNT(*) FROM table WHERE uid = '$uid';
This will return you the count of records that have the newly generated uid. You can check this count and perform the INSERT only if the count is 0. If not, you can call the function again to generate another random value.
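A minimal sketch of that approach with mysqli (a sketch only; it reuses the table, column and function names from the question and, like the question's code, assumes the values are safe to interpolate):

$link = mysqli_connect("localhost", "root", "", "dummy");

// keep generating keys until one is not present in usertbl
do {
    $uid = randomUnique();
    $result = mysqli_query($link, "SELECT COUNT(*) AS c FROM usertbl WHERE uid = '$uid'");
    $row = mysqli_fetch_assoc($result);
} while ($row['c'] > 0);

// $uid is now unused, so the INSERT should not hit error 1062
// and no AUTO_INCREMENT value is burned on a failed attempt
mysqli_query($link, "INSERT INTO usertbl (uid, name, email) VALUES ('$uid', '$name', '$email')");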
On each function call you are creating a new DB link; maybe for this situation PHP provides mysqli_close($link);
Either close the connection:
if(mysqli_query($link, $query)){
    return 1;
}else if(mysqli_errno($link) == 1062){
    mysqli_close($link);
    return insert(randomUnique(), $name, $email);
}else{
    return 2; // unsuccessful query
}
Or simply move the DB connection out of the function:
$link = mysqli_connect("localhost", "root", "", "dummy");
function insert($uid,$name,$email){
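Note that with the connection created outside, the function needs access to that link; one way (a sketch) is to pass it in as a parameter instead of reconnecting on every call:

$link = mysqli_connect("localhost", "root", "", "dummy");

function insert($link, $uid, $name, $email){
    // run the query on the shared $link instead of creating a new connection here
    // ... rest of the original function body ...
}

$msg_code = insert($link, randomUnique(), $name, $email);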
Use PHP's uniqid() function; it generates a proper unique id.
http://php.net/manual/en/function.uniqid.php
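For example (a sketch; note that uniqid() returns a string, so the uid column would need to be a varchar wide enough to hold it):

$uid = uniqid();         // 13-character string based on the current time, e.g. "51f58a29d2d3a"
$uid = uniqid('', true); // longer value with extra entropy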
What is this being used for? You may be able to use the id column instead, which will perform much faster and is already guaranteed to be unique.

SQL: INSERT if not exists and UPDATE if exists

I have a program that performs inserts and updates on the database; I get the data from an API.
This is a sample of the data I get:
$uname = $get['userName'];
$oname = $get['offerName'];
$visitdata= $get['visits'];
$convdata = $get['conversion'];
I save this data to the SQL database (successfully). This is a sample:
$sql = "INSERT INTO data_tester(username_data, name_offer_data, visit_data, conversion_data) VALUES('$uname','$oname', '$visitdata', '$convdata')";
Sample data in database table
id | username_data | name_offer_data | visit_data | conversion_data
1  | MOJOJO        | XXX AU          | 177        | 13
2  | MOJOJO        | XX US           | 23         | 4
Now, I want to insert $uname, $oname, $visitdata, $convdata if the row does NOT EXIST, and UPDATE $visitdata and $convdata for the matching $uname and $oname if it does EXIST.
How can I do this with a simple query?
Please give me an example.
Thank you.
The feature you are looking for is commonly called UPSERT (the SQL standard covers it via the MERGE statement). However, not all DBMSs implement it, and some implement it differently.
For instance, on MySQL you can use the INSERT ... ON DUPLICATE KEY UPDATE syntax (link to docs) or the REPLACE INTO syntax (link to docs).
These methods require you to have a proper PRIMARY KEY or UNIQUE key: (username_data, name_offer_data) in your case.
Some PHP frameworks support this feature too, provided you are using an ActiveRecord (or similar) class. In Laravel it is called updateOrCreate and in Yii it is called save(). So if you are using a framework, check its documentation.
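For instance, in Laravel/Eloquent it could look roughly like this (a sketch; it assumes an Eloquent model named DataTester mapped to the data_tester table):

DataTester::updateOrCreate(
    ['username_data' => $uname, 'name_offer_data' => $oname],    // columns to match on
    ['visit_data' => $visitdata, 'conversion_data' => $convdata] // columns to insert or update
);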
If you are using neither a framework nor a modern DBMS, you have to implement the method yourself: run SELECT count(*) FROM data_tester WHERE username_data = ? AND name_offer_data = ?, check whether the count is zero, and run the appropriate UPDATE or INSERT SQL.
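A minimal PDO sketch of that manual approach (table and column names are taken from the question; the connection details are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$check = $pdo->prepare('SELECT COUNT(*) FROM data_tester WHERE username_data = ? AND name_offer_data = ?');
$check->execute(array($uname, $oname));

if ($check->fetchColumn() > 0) {
    $pdo->prepare('UPDATE data_tester SET visit_data = ?, conversion_data = ?
                   WHERE username_data = ? AND name_offer_data = ?')
        ->execute(array($visitdata, $convdata, $uname, $oname));
} else {
    $pdo->prepare('INSERT INTO data_tester (username_data, name_offer_data, visit_data, conversion_data)
                   VALUES (?, ?, ?, ?)')
        ->execute(array($uname, $oname, $visitdata, $convdata));
}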
It's simple, try this:
if(isset($get['userName'])){
    $userName = $get['userName'];
    $sql = "SELECT * FROM data_transfer WHERE userName = '".$userName."';";
    $result = connection()->query($sql);
    $rs = mysqli_fetch_array($result);
    connection()->close();
    // if it is not empty, this username already exists
    if ($rs != ''){
        mysqli_free_result($result);
        // UpdateData
    }
    else{
        mysqli_free_result($result);
        // InsertData
    }
}
* Check that you use your primary key (or a unique column) in the WHERE clause to ensure there is only one such row. If you use an ID and you don't get it via $_GET, you'll have to modify something to ensure non-duplicated data, for example by making sure userName cannot be duplicated or something similar.
You can simply use the REPLACE INTO command instead of the INSERT INTO command.
$sql = "REPLACE INTO data_tester(username_data, name_offer_data, visit_data, conversion_data) VALUES('$uname','$oname', '$visitdata', '$convdata')";
It is one of MySQL's good and useful features; I have used it many times.
Please ensure there is a unique key on the column username_data; if so, MySQL's ON DUPLICATE KEY UPDATE is suitable for this case. The SQL statement looks like this:
$sql = "INSERT INTO data_tester (username_data, name_offer_data, visit_data, conversion_data)
        VALUES ('$uname', '$oname', '$visitdata', '$convdata')
        ON DUPLICATE KEY UPDATE username_data = '$uname', name_offer_data = '$oname',
        visit_data = '$visitdata', conversion_data = '$convdata'";
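If that unique key does not exist yet, it has to be added first; given the question's requirement (update when both $uname and $oname match), it would arguably need to cover both columns, e.g.:

ALTER TABLE data_tester ADD UNIQUE (username_data, name_offer_data);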

MySQL stops running queries after if statement

I've been stuck on this for a few hours now ...
Here's my code:
$SQLQuery1 = $db_info->prepare("SELECT COUNT(ID) FROM menusize WHERE typesize=:typesize");
$SQLQuery1->bindValue(':typesize',$_POST['typesize'],PDO::PARAM_STR);
$SQLQuery1->execute();
if ($SQLQuery1->fetchColumn() > 0) {
    $SQLQuery2 = $db_info->prepare("INSERT INTO menucatagorysize (menucatagory_ID,menusize_ID) VALUES (:catagoryid,(SELECT ID FROM menusize WHERE typesize=:typesize))");
    $SQLQuery2->bindValue(':typesize',$_POST['typesize'],PDO::PARAM_STR);
    $SQLQuery2->bindValue(':catagoryid',$_POST['catagoryid'],PDO::PARAM_STR);
    $SQLQuery2->execute();
} else {
    $SQLQuery2 = $db_info->prepare("INSERT INTO menusize (typesize) VALUES (:typesize);
        SET @menusizeid=LAST_INSERT_ID();
        INSERT INTO menucatagorysize (menusize_ID,menucatagory_ID) VALUES (@menusizeid,:catagoryid)");
    $SQLQuery2->bindValue(':typesize',$_POST['typesize'],PDO::PARAM_STR);
    $SQLQuery2->bindValue(':catagoryid',$_POST['catagoryid'],PDO::PARAM_STR);
    $SQLQuery2->execute();
}
$SQLQuery3 = $db_info->prepare("SELECT DISTINCT(menuitem_ID) FROM menuprice WHERE menucatagory_ID=:catagoryid");
$SQLQuery3->bindValue(':catagoryid',$_POST['catagoryid'],PDO::PARAM_STR);
$SQLQuery3->execute();
$rows = $SQLQuery3->fetchAll(PDO::FETCH_ASSOC);
So, it will run through the if statement fine, running $SQLQuery1 and $SQLQuery2 (whichever one is required) without any problems, errors or warnings. But if it runs the else part of the code, it will not run $SQLQuery3. Any thoughts?
Thanks :D
EDIT: Got it to work by doing $SQLQuery2 = NULL in the else statement ... Sucks that I still can't figure out why it wouldn't work the original way.
It appears that you're trying to enforce a uniqueness constraint over the typesize column of your menusize table from within your application code. However, the database can do this for you—which will make your subsequent operations much simpler:
ALTER TABLE menusize ADD UNIQUE (typesize)
Now, one can simply attempt to insert the posted value into the table and the database will prevent duplicates arising. Furthermore, as documented under INSERT ... ON DUPLICATE KEY UPDATE Syntax:
If a table contains an AUTO_INCREMENT column and INSERT ... ON DUPLICATE KEY UPDATE inserts or updates a row, the LAST_INSERT_ID() function returns the AUTO_INCREMENT value. Exception: For updates, LAST_INSERT_ID() is not meaningful prior to MySQL 5.1.12. However, you can work around this by using LAST_INSERT_ID(expr). Suppose that id is the AUTO_INCREMENT column. To make LAST_INSERT_ID() meaningful for updates, insert rows as follows:
INSERT INTO table (a,b,c) VALUES (1,2,3)
ON DUPLICATE KEY UPDATE id=LAST_INSERT_ID(id), c=3;
Therefore, you can do:
$db_info->prepare('
INSERT INTO menusize (typesize) VALUES (:typesize)
ON DUPLICATE KEY UPDATE typesize=LAST_INSERT_ID(typesize)
')->execute(array(
':typesize' => $_POST['typesize']
));
$db_info->prepare('
INSERT INTO menucatagorysize
(menusize_ID, menucatagory_ID)
VALUES
(LAST_INSERT_ID(), :catagoryid)
')->execute(array(
':catagoryid' => $_POST['catagoryid']
));
$stmt = $db_info->prepare('
SELECT DISTINCT menuitem_ID
FROM menuprice
WHERE menucatagory_ID = :catagoryid
');
$stmt->execute(array(
':catagoryid' => $_POST['catagoryid']
));
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
// etc.
}
(As an aside, the English word is spelled "category", not "catagory".)

Help in uniquely identifying combined columns

How can I uniquely identify a combination of two or more columns? I have a table named address in the database; address has fields like street name, suite name and street number. For example:
strnum | strnam | sutname
1      | xyz    | 32
1      | xyz    | 32
Now how can I make these three columns unique in combination? That is, I want to check whether these three column values have already been inserted or not. If any field value is different then it's OK, it will insert a new row; but not when all three fields are the same. Help me to identify these three fields in combination.
You do it by adding a unique constraint.
ALTER TABLE your_table ADD UNIQUE(strnum, strnam, sutname);
Then you do the following:
INSERT IGNORE INTO your_table (strnum, strnam, sutname) VALUES ('1', 'xyz', 'etc');
If the value exists already - no insert will happen and no errors will be raised (that's what the IGNORE part is).
By the way why do you use such short and vague column names? It's not the DOS era any more, you can be descriptive with your column names.
$query = "SELECT * FROM `address` WHERE `strnum` = '$strnum' AND `strnam` = '$strnam' AND `sutname` = '$sutname' LIMIT 1";
$result = mysql_query($query);
if (!mysql_num_rows($result)) {
    // If you get to here, there is no existing record
    $query = "INSERT INTO `address` (`strnum`,`strnam`,`sutname`) VALUES ('$strnum','$strnam','$sutname')";
    if (!mysql_query($query)) print('Insert failed!');
} else print('Record already exists!');
EDIT: I just added a missing ; so this parses...
Just add them as a unique key in the table structure and you won't be able to insert two identical rows.
You can do something like this:
SELECT * FROM table WHERE col1 = $something AND col2 = $something2 AND col3 = $something3
(remember about escaping PHP variables)
If a record is returned it means it exists. You can also add LIMIT 1 to make it faster.
If your question is about ENSURING that no duplicates occur in the table (for those 3 columns), then probably the best solution is to add a UNIQUE index on those three columns.

Combine many MySQL queries with logic into data file

Background:
I am parsing a 330 MB XML file into a DB (Netflix catalog) using a PHP script from the console.
I can successfully add about 1,500 titles every 3 seconds until I add the logic to add actors, genres and formats. These are separate tables linked by an associative table.
Right now I have to run many, many queries for each title, in this order (I truncate all tables first, to eliminate old titles, genres, etc.):
1. add new title to 'titles' and capture insert id
2. check actor table for existing actor
3. if present, get id; if not, insert actor and get insert id
4. insert title id and actor id into associative table
(steps 2-4 are repeated for genres too)
This drops my speed down to about 10 titles per 3 seconds, which would take an eternity to add the ~250,000 titles.
So how would I combine the 4 queries into a single query, without adding duplicate actors or genres?
My goal is to just write all queries into a data file, and do a bulk insert.
I started by writing all associative queries into a data file, but it didn't do much for performance.
I start by inserting the title, and saving the ID:
function insertTitle($nfid, $title, $year){
    $query = "INSERT INTO ".$this->titles_table." (nf_id, title, year) VALUES ('$nfid','$title','$year')";
    mysql_query($query);
    $this->updatedTitleCount++;
    return mysql_insert_id();
}
That is then used in conjunction with each actor's name to create the association:
function linkActor($value, $title_id){
    // check if we already know value
    $query = "SELECT * FROM ".$this->persons_table." WHERE person = '$value' LIMIT 0,1";
    //echo "<br>".$query."<br>";
    $result = mysql_query($query);
    if($result && mysql_num_rows($result) != 0){
        while ($row = mysql_fetch_assoc($result)) {
            $value_id = $row['id'];
        }
    }else{
        // no value known, add to persons table
        $query = "INSERT INTO ".$this->persons_table." (person) VALUES ('$value')";
        mysql_query($query);
        $value_id = mysql_insert_id();
    }
    //echo "linking title:".$title_id." with rel:".$value_id;
    $query = "INSERT INTO ".$this->title_persons_table." (title_id,person_id) VALUES ('$title_id','$value_id');";
    //mysql_query($query);
    // write query to data file to be read in bulk style
    fwrite($this->fh, $query);
}
This is a perfect opportunity for using prepared statements.
Also take a look at the tips at http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html, e.g.
To speed up INSERT operations that are performed with multiple statements for nontransactional tables, lock your tables
You can also decrease the number of queries. E.g. you can eliminate the SELECT...FROM persons_table to obtain the id by using INSERT...ON DUPLICATE KEY UPDATE and LAST_INSERT_ID(expr).
( sorry, running out of time for a lengthy description, but I wrote an example before noticing the time ;-) If this answer isn't downvoted too much I can hand it in later. )
class Foo {
    protected $persons_table = 'personsTemp';
    protected $pdo;
    protected $stmts = array();

    public function __construct($pdo) {
        $this->pdo = $pdo;
        $this->stmts['InsertPersons'] = $pdo->prepare('
            INSERT INTO
                '.$this->persons_table.'
                (person)
            VALUES
                (:person)
            ON DUPLICATE KEY UPDATE
                id=LAST_INSERT_ID(id)
        ');
    }

    public function getActorId($name) {
        $this->stmts['InsertPersons']->execute(array(':person'=>$name));
        return $this->pdo->lastInsertId('id');
    }
}

$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// create a temporary/test table
$pdo->exec('CREATE TEMPORARY TABLE personsTemp (id int auto_increment, person varchar(32), primary key(id), unique key idxPerson(person))');
// and fill in some data
foreach(range('A', 'D') as $p) {
    $pdo->exec("INSERT INTO personsTemp (person) VALUES ('Person $p')");
}

$foo = new Foo($pdo);
foreach( array('Person A', 'Person C', 'Person Z', 'Person B', 'Person Y', 'Person A', 'Person Z', 'Person A') as $name) {
    echo $name, ' -> ', $foo->getActorId($name), "\n";
}
prints
Person A -> 1
Person C -> 3
Person Z -> 5
Person B -> 2
Person Y -> 6
Person A -> 1
Person Z -> 5
Person A -> 1
(someone might want to start a discussion whether a getXYZ() function should perform an INSERT or not ...but not me, not now....)
Your performance is glacially slow; something is very wrong. I assume the following:
- You run your dedicated, otherwise-idle database server on respectable hardware
- You have tuned it to some extent (i.e. at least configured it to use a few gigs of RAM properly); engine-specific optimisations will be required
You may be being stung by doing lots of tiny operations with autocommit on; this is a mistake as it generates an unreasonable number of disc IO operations. You should do a large amount of work (100, 1000 records etc) in a single transaction then commit it.
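A minimal sketch of that batching idea with PDO (a sketch only; it assumes a PDO connection in $pdo, that the parsed titles are already in a $titles array, and that the table is named titles as in step 1 above; the batch size of 1000 is just an example):

$stmt = $pdo->prepare('INSERT INTO titles (nf_id, title, year) VALUES (?, ?, ?)');

$pdo->beginTransaction();
$count = 0;
foreach ($titles as $t) {
    $stmt->execute(array($t['nf_id'], $t['title'], $t['year']));
    if (++$count % 1000 === 0) {   // commit every 1000 rows to limit disc IO
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();                     // commit the remaining rows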
The lookups may be slowing things down because of the simple overhead of doing the queries (the queries themselves will be really easy as you'll have an index on actor name).
I also question your method of assuming that no two actors have the same name - surely your original database contains a unique actor ID, so you don't get them mixed up?
Can you use a language other than PHP? If not, are you running this as a PHP stand-alone script or through a webserver? The webserver is probably adding a lot of overhead you don't need.
I do something very similar at work, using Python, and can insert a couple thousand rows (with associative table lookups) per second on a standard 3.4 GHz, 3 GB RAM machine. The MySQL database isn't hosted locally but within the LAN.
