Send NULL if key does not exist - PHP

I have CSV files that I parse into arrays and pass to a MySQL table.
Some CSV files do not contain all of the columns that exist in the database, so when I convert one to an array I end up with fewer columns than the table has and I get a "syntax error".
From the controller I call:
function sendHistoric(){
    $this->load->model('Historic_model');
    $this->load->library('csvreader');
    foreach($this->divisions as $div){
        $result = $this->csvreader->parse_file("assets/csv/1516{$div}.csv"); // path to CSV file
        $this->Historic_model->loadCSVtoDB($result);
        //var_dump($result);
    }
}
In the model I have:
function loadCSVtoDB($data){
    $sql = "call ins_historic(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)";
    foreach($data as $row){
        var_dump($row);
        $this->db->query($sql, $row);
    }
    //echo $this->db->conn_id->error_message();
}
Can I somehow send NULL where data is missing, or do I need to check whether each element of the array exists and set it to NULL if not?
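A minimal sketch of the pad-with-NULL idea described above (the assumptions are flagged in the comments; this is not tested against your schema):

```php
<?php
// Sketch: pad each CSV row with NULLs so it always has as many elements
// as the stored procedure has placeholders (62 here). This ASSUMES the
// missing CSV columns are always the TRAILING ones; if columns can be
// missing in the middle, you must map values by header name instead.
const PARAM_COUNT = 62;

function padRow(array $row, int $count = PARAM_COUNT): array
{
    // array_pad() appends NULLs; rows already long enough are untouched
    return array_pad($row, $count, null);
}

$short  = array_fill(0, 59, 'x'); // a row missing three trailing columns
$padded = padRow($short);         // now 62 elements, the last three NULL
```

The padded row could then be bound as usual, e.g. `$this->db->query($sql, padRow($row));` inside the existing loop.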

One option to consider is not writing a slower PHP loop but harnessing the flexible processing possible inside a LOAD DATA INFILE block, such as the processing seen in this answer link here. The functions and conversions possible are endless.
Maybe six of one, half a dozen of the other, but it is nice for simpler routines too, and I would suggest it for fast data ingest.
Edit: (based on OP comment below)
create table t62
( c1 int not null,
c2 varchar(10) not null,
c3 int not null,
c4 char(5) not null,
someOther int not null,
-- an additional 57 columns here
primary key(c1,c2,c3,c4)
);
-- Mimic the LOAD DATA INFILE or PHP loop:
insert t62(c1,c2,c3,c4,someOther) values (1,'t',1,'01742',777);
insert t62(c1,c2,c3,c4,someOther) values (1,'t',1,'01742',777); -- error 1062: Dupe PK
insert t62(c1,c2,c3,c4,someOther) values (1,'t',2,'01742',777); -- happy
So I have to ask: how would you suggest the code pick that composite PK for you? And if it did, how would that made-up data play with its relationships to other data?


MysqlError: Duplicate entry '1-5' for key 'PRIMARY' on insert unsure of how

I am getting the error MysqlError: Duplicate entry '1-5' for key 'PRIMARY' as shown below in the code. It only happened once (that I could detect; it was random, and New Relic reported it), I cannot reproduce it, and I don't have much more information beyond the line number and the error. The schema and code are below.
num_rows() is somehow returning a value other than 1 even though it shouldn't. If someone can give some insight into how to debug or fix this, that would be helpful.
Here is my schema for location_items:
CREATE TABLE `phppos_location_items` (
`location_id` int(11) NOT NULL,
`item_id` int(11) NOT NULL,
`location` varchar(255) COLLATE utf8_unicode_ci NOT NULL DEFAULT '',
`cost_price` decimal(23,10) DEFAULT NULL,
`unit_price` decimal(23,10) DEFAULT NULL,
`promo_price` decimal(23,10) DEFAULT NULL,
`start_date` date DEFAULT NULL,
`end_date` date DEFAULT NULL,
`quantity` decimal(23,10) DEFAULT '0.0000000000',
`reorder_level` decimal(23,10) DEFAULT NULL,
`override_default_tax` int(1) NOT NULL DEFAULT '0',
PRIMARY KEY (`location_id`,`item_id`),
KEY `phppos_location_items_ibfk_2` (`item_id`),
CONSTRAINT `phppos_location_items_ibfk_1` FOREIGN KEY (`location_id`) REFERENCES `phppos_locations` (`location_id`),
CONSTRAINT `phppos_location_items_ibfk_2` FOREIGN KEY (`item_id`) REFERENCES `phppos_items` (`item_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
And the code:
//Lock tables involved in sale transaction so we do not have deadlock
$this->db->query('LOCK TABLES '.$this->db->dbprefix('customers').' WRITE, '.$this->db->dbprefix('receivings').' WRITE, 
'.$this->db->dbprefix('store_accounts').' WRITE, '.$this->db->dbprefix('receivings_items').' WRITE, 
'.$this->db->dbprefix('giftcards').' WRITE, '.$this->db->dbprefix('location_items').' WRITE, 
'.$this->db->dbprefix('inventory').' WRITE, 
'.$this->db->dbprefix('people').' READ,'.$this->db->dbprefix('items').' WRITE
,'.$this->db->dbprefix('employees_locations').' READ,'.$this->db->dbprefix('locations').' READ, '.$this->db->dbprefix('items_tier_prices').' READ
, '.$this->db->dbprefix('location_items_tier_prices').' READ, '.$this->db->dbprefix('items_taxes').' READ, '.$this->db->dbprefix('item_kits').' READ
, '.$this->db->dbprefix('location_item_kits').' READ, '.$this->db->dbprefix('item_kit_items').' READ, '.$this->db->dbprefix('employees').' READ , '.$this->db->dbprefix('item_kits_tier_prices').' READ
, '.$this->db->dbprefix('location_item_kits_tier_prices').' READ, '.$this->db->dbprefix('suppliers').' READ, '.$this->db->dbprefix('location_items_taxes').' READ
, '.$this->db->dbprefix('location_item_kits_taxes'). ' READ, '.$this->db->dbprefix('item_kits_taxes'). ' READ');
// other code for inserting data into other tables that are not relevant.
foreach($items as $line=>$item)
{
    $cur_item_location_info->quantity = $cur_item_location_info->quantity !== NULL ? $cur_item_location_info->quantity : 0;
    $quantity_data = array(
        'quantity'=>$cur_item_location_info->quantity + $item['quantity'],
        'location_id'=>$this->Employee->get_logged_in_employee_current_location_id(),
        'item_id'=>$item['item_id']
    );
    $this->Item_location->save($quantity_data,$item['item_id']);
}
// other code for inserting data into other tables that are not relevant.
$this->db->query('UNLOCK TABLES');
class Item_location extends CI_Model
{
    function exists($item_id,$location=false)
    {
        if(!$location)
        {
            $location = $this->Employee->get_logged_in_employee_current_location_id();
        }
        $this->db->from('location_items');
        $this->db->where('item_id',$item_id);
        $this->db->where('location_id',$location);
        $query = $this->db->get();
        return ($query->num_rows()==1);
    }

    function save($item_location_data,$item_id=-1,$location_id=false)
    {
        if(!$location_id)
        {
            $location_id = $this->Employee->get_logged_in_employee_current_location_id();
        }
        if (!$this->exists($item_id,$location_id))
        {
            $item_location_data['item_id'] = $item_id;
            $item_location_data['location_id'] = $location_id;
            //MysqlError: Duplicate entry '1-5' for key 'PRIMARY'
            return $this->db->insert('location_items',$item_location_data);
        }
        $this->db->where('item_id',$item_id);
        $this->db->where('location_id',$location_id);
        return $this->db->update('location_items',$item_location_data);
    }
}
function get_logged_in_employee_current_location_id()
{
    if($this->is_logged_in())
    {
        //If we have a location in the session
        if ($this->session->userdata('employee_current_location_id')!==FALSE)
        {
            return $this->session->userdata('employee_current_location_id');
        }
        //Return the first location the user is authenticated for
        return current($this->get_authenticated_location_ids($this->session->userdata('person_id')));
    }
    return FALSE;
}
It's not a good idea to check for existence prior to inserting data outside a transaction, as this leaves open the possibility of the data changing in the meantime. The fact that you've seen this error once but it isn't easily repeatable makes me wonder whether this might have happened.
I would suggest changing the code beneath the first if block in the save function to something that generates the following SQL instead:
INSERT INTO location_items (item_id, location_id, quantity)
VALUES ($item_id, $location_id, $quantity)
ON DUPLICATE KEY UPDATE quantity = VALUES(quantity)
This covers the existence check and insert or update in a single atomic statement. (To take this any further and say how to actually implement it I'd need access to the db code.)
EDIT: Sorry, only just noticed the db code is CodeIgniter. Am new to this framework but the above method looks perfectly possible from a brief look here. Something like this:
$sql = "INSERT INTO location_items (item_id, location_id, quantity)"
     . " VALUES (?, ?, ?)"
     . " ON DUPLICATE KEY UPDATE quantity = VALUES(quantity)";
$this->db->query($sql, array($item_id, $location_id, $quantity));
(If for some reason you prefer not to do this, another way to keep it atomic would be to wrap the statements in a transaction instead: $this->db->trans_start(); before the existence check and $this->db->trans_complete(); after the insert/update. But IMO this introduces unnecessary complexity; personally I much prefer the first method.)
Looks like a race condition. What likely happened is two roughly simultaneous calls to:
save($data,5);
Both get to the exists check at the same time and see that there is no existing entry. Both then try to insert, and the fastest gun wins.
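The race can be shown without a database. This is a simplified, single-process sketch (the `exists_in()`/`insert_into()` helpers are hypothetical stand-ins for the real queries): both "requests" pass the existence check before either one inserts, so the second insert hits the duplicate key.

```php
<?php
// Simplified single-process sketch of the check-then-insert race:
// both "requests" pass the existence check before either one inserts,
// so the second insert hits the duplicate primary key.
$table = []; // rows keyed by primary key, e.g. "location_id-item_id"

function exists_in(array $table, string $pk): bool
{
    return isset($table[$pk]);
}

function insert_into(array &$table, string $pk, array $row): void
{
    if (isset($table[$pk])) {
        throw new RuntimeException("Duplicate entry '$pk' for key 'PRIMARY'");
    }
    $table[$pk] = $row;
}

$pk = '1-5';

// Request 1 and request 2 both run the exists check first ...
$request1SeesRow = exists_in($table, $pk); // false
$request2SeesRow = exists_in($table, $pk); // false

// ... then both try to insert; the slower one fails.
insert_into($table, $pk, ['quantity' => 1]); // request 1: succeeds
$raceDetected = false;
try {
    insert_into($table, $pk, ['quantity' => 1]); // request 2: duplicate
} catch (RuntimeException $e) {
    $raceDetected = true;
}
```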
You are not going to get a solution so long as the following conditions exist:
You cannot reproduce this problem yourself.
You do not share your source code and database for someone else to attempt to replicate.
I am not asking you to share your full source code. Rather, I am saying this to temper your expectations.
That being said, duplicates can occur for numerous reasons. It would help your question if you provided your MySQL version, but I did find one reported cause: memory too low. This could be reproducible if you lower your available memory or put a high strain on your system. If you've had a hard time reproducing the error, memory could well be why, as you may not have tried simulating that.
Other things to consider:
You may be wasting your time trying to duplicate something that just will not be duplicated.
If you are concerned you will experience this issue again, you should really consider logging. That can help you to track down the query which caused the issue. I would advise that you not have logging in a production environment and only in development, because it will probably lead to performance penalties that could well be significant. If this is a one-off issue you may never see it again, but it doesn't hurt to be prepared and armed with more information if the issue appears again.
Ultimately, debugging requires the ability to reproduce the error. A bug is part of a computer program, which means there are certain situations and environments in which this will occur, which can be reproduced. When you have no idea how or why a bug was caused there is nowhere to work back from. It is helpful to look to auxiliary concerns as the potential source of your issue by exploring bug reports, etc. If that fails, implement tools like logging that give you more information. This is the only way you will be able to find the root cause of this issue, or get any more specific insight from the SO community on how to do so.
I suspect that the problem could be related to the caching of where statements.
This suspicion comes from this Stack Overflow question.
Basically I think it could happen that:
- in one cycle this code is executed at the end of save method:
$this->db->where('item_id',$item_id);
$this->db->where('location_id',$location_id);
return $this->db->update('location_items',$item_location_data);
- in the subsequent cycle this code is executed in the exists method:
$this->db->from('location_items');
$this->db->where('item_id',$item_id);
$this->db->where('location_id',$location);
$query = $this->db->get();
When the exists code runs, the cache may still contain the where clauses of the previous statement, and the new (different) ones will be added on top.
The result will then be empty, making it look as if the row is not in the table.
Try to use $this->db->flush_cache(); after the update in the save method.
Also try echo $this->db->last_query(); to see what the exists query is actually doing.
Maybe the phppos_location_items table existed in the past and a DELETE statement was executed over it: DELETE FROM phppos_location_items;
In that case the primary key column does not accept previous values; if you truncate the table instead, all previous records will be removed.
Sorry, it's slightly long for a comment.
What submits the form? I assume it's a button somewhere on a page. When I have had a similar error it was due to a user double-clicking a button, so that two requests were sent very close together in time. This caused a check similar to yours to hit this situation:
Request 1: Check, Insert
Request 2: Check, Insert
As Request 2 was the last request (because the second click took priority), the error was shown even though the first request completed all of the work.
I use this code:
$("form").submit(function() {
    $(this).submit(function() {
        return false;
    });
    return true;
});
from this question
How to prevent form from submitting multiple times from client side?
Try:
$insert = $this->db->insert('location_items', $item_location_data);
if ($insert)
{
    $this->db->reset();
    // or try $this->db->_reset_write(); to flush all traces of the query
    return $insert;
}
Solution 1:
Duplicate entry means you have a row whose primary key is the same as that of some previous row.
You can ignore this with:
INSERT IGNORE INTO ... (the rest is the same; just add IGNORE)
Solution 2:
If you wanna overwrite previous row with new one then you need to follow this query:
INSERT INTO TABLE (f1,f2) VALUES ('f1','f2') ON DUPLICATE KEY UPDATE f1='f1',f2='f2'
Solution: 3
Change your primary key by following these queries:
Create new field for primary key:
ALTER TABLE tablename ADD new_primary_key BIGINT NOT NULL FIRST;
Drop the existing primary key:
ALTER TABLE tablename DROP PRIMARY KEY;
Now make the new field created earlier the primary key, with auto-increment:
ALTER TABLE tablename MODIFY new_primary_key BIGINT AUTO_INCREMENT PRIMARY KEY;
(this will not affect other queries in the code, you can keep them as it is and just add LIMIT 1 in select statements)

update cell value if value not already in

I have to update the column File in the table TEST. This column contains the files related to the row, each separated by a |.
An example could be
ID NAME FILE
1 apple fruit.png | lemon.png
Now when I add a new file to the FILE column I use this query:
$link->query("UPDATE TEST SET File = CONCAT(File, '$dbfilename') WHERE id = '$p_id'")
where $dbfilename can be e.g. pineapple.jpg |
The problem is that if $dbfilename is already among the File values, it will be added another time, resulting in a duplicate.
How can I check whether File already contains $dbfilename and, if it does, not add it, or even not execute the query at all?
This is not a good way of storing information in a database. But I'll get to that in a second. To directly answer your question, you could use this as your SQL query:
UPDATE TEST SET File = CONCAT(File, '$dbfilename')
WHERE id='$p_id'
AND File NOT LIKE '%$dbfilename%'
However, this may cause issues when one file is pineapple.jpg and you try to add another-pineapple.jpg.
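The false positive is easy to demonstrate without a database, since LIKE '%...%' is essentially a substring test. In this plain-PHP sketch, containsFile() is a hypothetical helper shown only to illustrate a delimiter-aware check:

```php
<?php
// LIKE '%pineapple.jpg%' behaves like a substring test, so it also
// matches "another-pineapple.jpg". Splitting the stored value on the
// "|" delimiter and comparing whole names avoids the false positive.
function containsFile(string $fileList, string $name): bool
{
    $files = array_map('trim', explode('|', $fileList));
    return in_array($name, $files, true);
}

$list = 'fruit.png | another-pineapple.jpg';

$substringHit = strpos($list, 'pineapple.jpg') !== false; // true  (false positive)
$exactHit     = containsFile($list, 'pineapple.jpg');     // false (correct)
```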
Really, I think you should consider how this is a horribly bad approach to databases. Consider breaking the files off into a second table. For example:
# a table for the fruit names
CREATE TABLE fruits (
id INT UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT,
name VARCHAR(250) NOT NULL,
UNIQUE INDEX(name)
);
# a table for file names
CREATE TABLE files (
fileid INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
fruitid INT UNSIGNED,
filename VARCHAR(250),
UNIQUE INDEX(fruitid, filename)
);
# find all of the fruits with their associated files
SELECT fruits.id, fruits.name, files.filename
FROM fruits LEFT JOIN files ON fruits.id=files.fruitid
# add a file to a fruit
INSERT INTO files (fruitid, filename)
VALUES ('$fruitID', '$filename')
ON DUPLICATE KEY UPDATE fileid=LAST_INSERT_ID(fileid)
You will have to select the File value for the id,
then use explode to break it into an array,
then use in_array to determine whether the file should be added or not.
Here is some (untested) code for guidance:
$stmt = $link->query("SELECT File FROM TEST WHERE id = '$p_id'");
$rec = $stmt->fetch_assoc();
$files = explode(" | ", $rec["File"]);
if (!in_array($dbfilename, $files)) {
    // add to File
} else {
    // it's already there
}
I would redesign your table structure and add a new table File with the following columns, instead of using a varchar field for multiple values:
Table Test
TestId, Name
Table File
FileId, TestId, FileName

How to handle big arrays?

I am developing an application in PHP for which I need to implement a big file handler.
Reading and writing the file is not a problem, but checking the content of the file is a problem.
I built a recursive function which checks whether or not a variable is already used in the same document.
private function val_id($id){
    if(!isset($this->id)){
        $this->id = array();
    }
    if(in_array($id, $this->id)){
        return $this->val_id($id+1);
    }else{
        $this->id[] = $id;
        return $id;
    }
}
When in_array($id, $this->id) returns FALSE, the $id is added to $this->id (the array containing all used ids) and a valid id is returned.
When it returns TRUE, the function calls itself with parameter $id+1.
Since we are talking about over 300,000 records at a time, PHP does not seem able to store such big arrays: it appears to stop writing lines in the documents I generate when this array gets too big, but I don't receive any error messages about it.
Since the generated documents are SQL files with multiple INSERT rows, another solution could be to check whether the id already exists in the database. Can MySQL catch these exceptions and retry the entries with 1 added to the id? How?
Kind regards,
Wouter
Make error messages appear.
Increase memory_limit.
Instead of storing the ids as values, store them as keys - then you can use isset($array[$this->id]) instead of in_array().
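The last suggestion (ids as keys) can be sketched like this; val_id_keyed() is a hypothetical rewrite of the question's val_id(), shown only to illustrate the technique:

```php
<?php
// Sketch of the "ids as keys" suggestion: isset() on an array key is a
// hash lookup, while in_array() scans every element on every call.
function val_id_keyed(array &$used, int $id): int
{
    while (isset($used[$id])) {
        $id++;            // same "try id+1" logic, but iterative
    }
    $used[$id] = true;    // store the id as a KEY, not a value
    return $id;
}

$used = [];
$a = val_id_keyed($used, 5); // 5 is free  -> returns 5
$b = val_id_keyed($used, 5); // 5 is taken -> returns 6
```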
Use INSERT IGNORE to skip the duplicate-key error in MySQL and remove your key check in PHP. Your statement could look like this:
INSERT IGNORE INTO tbl_name SET key1 = 1, col1 = 'value1'
If you want to add 1 to the id always you could use ON DUPLICATE KEY to increment your key by one:
INSERT INTO table (a,b,c) VALUES (1,2,3)
ON DUPLICATE KEY UPDATE c=c+1;
Why should 300,000 records be a problem? Each record in a standard PHP array takes about 144 bytes; for 300,000 that would mean roughly 42,188 kByte. No big deal.
Otherwise, Your Common Sense's idea with the array key is worth a thought, because it's faster.

MySQL stops after inserting 255 rows

I have a pretty large DB table that I need to split into smaller tables for various reasons.
The handling happens via PHP, close to this example:
// Note: It's an example and not working code - the actual function is much larger
function split_db()
{
    $results = "
        SELECT *
        FROM big_table
    ";
    foreach ( $results as $result )
    {
        // Here I split the big_table's contents and ...
        $some_val = $result->SomeVal;
        // ...
        $another_val = $result->AnotherVal;
        // ... here I insert the contents into the different tables
        $sql = "
            INSERT
            INTO first_small_table
            // ...
            VALUES
            // ...
        ";
    }
}
Problem: the query inserts only 255 rows, no matter whether I'm in the local environment or on the test server.
Question: why? What am I doing wrong, or what am I missing? And how do I avoid this?
Info about MySQL-Client-Version:
Dev-Server: 5.0.32,
Local Dev-Env.: 5.1.41
I'm no MySQL hero, so any help and explanation is appreciated, as Google brought up nothing meaningful (to me). Thanks!
I bet your primary key is of unsigned TINYINT type, which has a maximum value of 255.
So change it to INT (keeping the auto-increment if you had one):
ALTER TABLE first_small_table MODIFY id INT UNSIGNED NOT NULL AUTO_INCREMENT;
I can't say why you're limited to 255 rows, but what I can say is that you can add the rows from your big table into your small table in a single INSERT INTO ... SELECT query:
INSERT INTO first_small_table (col1, col2, col3)
SELECT col1, col2, col3
FROM big_table;
If you don't need to use PHP, then by all means don't use it. It's faster to use only SQL.

serialize not working for me in drupal

I am trying to insert data into the database, but it removes the braces '{}' while inserting. I am using this code:
require_once './includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_DATABASE);

$aa['alt'] = "happy alt";
$aa['title'] = "happy title";
$sldata = serialize($aa);
$sql = "Insert into test(pval) values('" . $sldata . "')";
echo $sql;
db_query($sql);
My DB structure is as follows:
CREATE TABLE IF NOT EXISTS `test` (
`sl` int(11) NOT NULL AUTO_INCREMENT,
`pval` text NOT NULL,
PRIMARY KEY (`sl`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
Please suggest what is wrong here.
Drupal uses {} around table names so it can do some manipulations on those names -- like prefixing them, if you have configured it to do so.
So you must not use {} in your query -- except around table names, of course.
Instead of brutally injecting your serialized string into the SQL query, you should use placeholders in it -- and pass the corresponding values to db_query(), which will take care of escaping what has to be escaped:
$sldata = serialize($aa);
$sql = "insert into {test} (pval) values('%s')";
db_query($sql, $sldata);
Here:
As the pval field is a string in the database, I used a %s placeholder.
The first value passed to db_query() (after the SQL query itself, of course) will be injected by Drupal to replace that first (and only, here) placeholder.
For more information, you might want to take a look at the Database abstraction layer.
Instead of just serialize, you could base64_encode to stop the curly braces being a problem:
http://php.net/manual/en/function.base64-encode.php
base64_encode(serialize($aa));
Then on the retrieving side of the data
unserialize(base64_decode($db_data));
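A quick round-trip sanity check of that approach, in plain PHP (no Drupal needed):

```php
<?php
// Round trip: serialize and base64_encode for storage, then
// base64_decode and unserialize on the way back out. The base64
// alphabet (A-Z, a-z, 0-9, +, /, =) contains no curly braces.
$aa = ['alt' => 'happy alt', 'title' => 'happy title'];

$stored   = base64_encode(serialize($aa));
$restored = unserialize(base64_decode($stored));
// $restored is identical to $aa, and $stored is safe to store as text
```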
