Does anybody have an idea how to do multiple inserts depending on a variable?
$diff = abs(strtotime($datumtot) - strtotime($datumvan));
$days = floor(($diff - $years * 365*60*60*24 - $months*30*60*60*24) / (60*60*24));
Here I get the result back; it can be 1 to 14.
Now I want to insert that many rows into the database.
My insert statement is as follows (let's say days == 3):
INSERT INTO example
(example_id, name, value, other_value)
VALUES (100,'Name 1', 'Value 1', 'Other 1'),
(101, 'Name 2', 'Value 2', 'Other 2'),
(102, 'Name 3', 'Value 3', 'Other 3');
Is there a better way to do this: check the variable and then insert that many rows?
Thanks
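One way to do this is to build a single multi-row INSERT whose size depends on $days. A minimal sketch, assuming a PDO connection in $pdo; the row values below are placeholders only:
// Build one multi-row INSERT sized by $days (1..14).
$rows = array();
$params = array();
for ($i = 0; $i < $days; $i++) {
    $rows[] = '(?, ?, ?, ?)';
    $params[] = 100 + $i;            // example_id (placeholder)
    $params[] = 'Name ' . ($i + 1);  // name (placeholder)
    $params[] = 'Value ' . ($i + 1); // value (placeholder)
    $params[] = 'Other ' . ($i + 1); // other_value (placeholder)
}
$sql = 'INSERT INTO example (example_id, name, value, other_value) VALUES '
     . implode(', ', $rows);
$stmt = $pdo->prepare($sql);
$stmt->execute($params);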
Related
Kindly enlighten me: between CI batch insert and loop insert, which is the faster/better approach, or are they just the same?
$data = array(
    array(
        'title' => 'My title',
        'name'  => 'My Name',
        'date'  => 'My date'
    ),
    array(
        'title' => 'Another title',
        'name'  => 'Another Name',
        'date'  => 'Another date'
    )
);
batch insert:
$this->db->insert_batch('mytable', $data);
/* produces: INSERT INTO mytable (title, name, date)
VALUES ('My title', 'My name', 'My date'),
('Another title', 'Another name', 'Another date'); */
loop insert (php):
for ($i = 0; $i < count($data); $i++)
{
    $this->db->query(
        "INSERT INTO mytable (title, name, date) VALUES (?, ?, ?)",
        array($data[$i]['title'], $data[$i]['name'], $data[$i]['date'])
    );
}
thanks!
Batch inserts are usually faster since they process the data in one go, whereas each individual INSERT has some overhead (e.g., the SQL optimizer cannot skip certain steps). That said, you need to process a tremendous number of rows to create a notable difference.
If you're curious whether it would matter anyway, then don't forget to also measure the time it costs the framework to map the classes to the database table(s). There's a good chance the ORM consumes more resources than a looped SQL INSERT.
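If you want to measure it on your own data, a rough sketch, assuming a CodeIgniter 3 controller and the $data array from the question (note this inserts every row twice, so use a throwaway table):
$start = microtime(true);
$this->db->insert_batch('mytable', $data); // one multi-row INSERT
$batch = microtime(true) - $start;

$start = microtime(true);
foreach ($data as $row) {
    $this->db->insert('mytable', $row);    // one query per row
}
$loop = microtime(true) - $start;

log_message('info', "batch: {$batch}s, loop: {$loop}s");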
I have an array which I will use for a dropdown in a form. My array is something like this...
$data = array('1' => 'Option 1', '2' => 'Option 2', '3' => 'Option 3');
Now I want to prepend this to the $data array
'' => 'Please select'
How do I do this? I've tried array_unshift, but that adds a numeric key to my option, which I don't want because of my form validation.
Can anyone help?
Thanks
All array values have a key, so you can't add one without. Note that array_unshift reindexes numeric keys, which is why it doesn't work here; the union operator below is the better way to do this.
You can use the array union operator:
$data = array('1' => 'Option 1', '2' => 'Option 2', '3' => 'Option 3');
$prependArray = array('' => 'Please select');
$result = $prependArray + $data;
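The union operator keeps the existing keys intact, unlike array_unshift, which renumbers numeric keys. For example:
$data = array('1' => 'Option 1', '2' => 'Option 2', '3' => 'Option 3');
$result = array('' => 'Please select') + $data;
// $result is:
// array('' => 'Please select', 1 => 'Option 1', 2 => 'Option 2', 3 => 'Option 3')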
I would like to know how to efficiently update multiple rows of data with the UPDATE statement. I know I can insert multiple records like this one below.
INSERT INTO example
(example_id, name, value, other_value)
VALUES
(100, 'Name 1', 'Value 1', 'Other 1'),
(101, 'Name 2', 'Value 2', 'Other 2'),
(102, 'Name 3', 'Value 3', 'Other 3'),
(103, 'Name 4', 'Value 4', 'Other 4');
But how does this work with UPDATE? Or do I have to loop and run the query one row at a time?
At the moment I use foreach to loop over the rows and execute each UPDATE statement separately.
foreach ($rows as $row) {
    // build the update statement....
    // ...and execute the query
}
You can use CASE WHEN inside UPDATE.
An example:
UPDATE users
SET value = CASE
WHEN id in (1,4) THEN 53
WHEN id = 2 THEN 65
WHEN id in (3,5) THEN 47
END
WHERE id IN (1,2,3,4,5)
Refer to this SO question and this one for more.
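If the id/value pairs live in a PHP array, a sketch of building that query dynamically; the $changes map is illustrative, and integer ids/values are assumed:
// Build "CASE id WHEN ... THEN ... END" from an id => value map.
$changes = array(1 => 53, 2 => 65, 3 => 47, 4 => 53, 5 => 47);
$cases = '';
foreach ($changes as $id => $value) {
    $cases .= sprintf(' WHEN %d THEN %d', $id, $value);
}
$ids = implode(',', array_keys($changes));
$sql = "UPDATE users SET value = CASE id{$cases} END WHERE id IN ({$ids})";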
$data = array(
    array(100, 'Name 1', 'Value 1', 'Other 1'),
    array(101, 'Name 2', 'Value 2', 'Other 2'),
    array(102, 'Name 3', 'Value 3', 'Other 3')
);

for ($i = 0; $i < count($data); $i++) {
    // Unpack one row: id, name, value, other_value.
    $field1 = $data[$i][0];
    $field2 = $data[$i][1];
    $field3 = $data[$i][2];
    $field4 = $data[$i][3];
    // One UPDATE per row.
    query("UPDATE TABLE SET.... WHERE id = $field1");
}
You can use CASE in MySQL as well:
Example:
UPDATE example
SET example_id = CASE
WHEN example_id=100 THEN 1000
WHEN example_id=101 THEN 1001
WHEN example_id=102 THEN 1002
ELSE example_id*10
END
Here is another example: there is a table Person in which Gender was entered wrongly, so you need to correct it by updating Male to Female and Female to Male.
So here is the code:
UPDATE Person
SET Gender = CASE
    WHEN Gender = 'Male' THEN 'Female'
    WHEN Gender = 'Female' THEN 'Male'
    ELSE Gender
END
I am trying to write PHP code which takes an NSArray of NSDictionaries and adds the records to a database. Unfortunately, this takes around 20 seconds to process around 500 records with a total size of around 2 MB.
function uploadTracks($tracks, $partyID, $pName) {
$tracks = json_decode($tracks, true);
$itemInfo = array();
foreach($tracks as $itemInfo){
$track = $itemInfo['songTitle'];
$artist = $itemInfo['artistName'];
$album = $itemInfo['albumName'];
$artwork = $itemInfo['artwork'];
$result = query("INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) VALUES ('%s','%s','%s','%s','%s')" ,$partyID,$track,$artist,$artwork,$pName);
}
}
Is there any way to optimize the above code? Could json_decode be what is taking the most time?
You can insert multiple records in a single INSERT query. That way your db indexes are updated once, which gains you a performance boost:
INSERT INTO partyRecords
(Pkey, songName, songArtist, imageBinary, partyName)
VALUES
(1, 'Name 1', 'Artist 1', 'Image 1', 'Party 1'),
(2, 'Name 2', 'Artist 2', 'Image 2', 'Party 2'),
(3, 'Name 3', 'Artist 3', 'Image 3', 'Party 3'),
(4, 'Name 4', 'Artist 4', 'Image 4', 'Party 4');
I would say it's 95% the queries. You need to compile the queries and insert them 100 or so at a time. It would be best to escape the query variables yourself (for mysql: mysql_real_escape_string()).
INSERT INTO partyRecords (Pkey, songName, songArtist, imageBinary, partyName) VALUES (2, '12312', '12312321', '12312332423', '23423432');
INSERT INTO partyRecords (Pkey, songName, songArtist, imageBinary, partyName) VALUES (2, '12312', '12312321', '12312332423', '23423432');
...
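A sketch of that batching idea, assuming a mysqli connection in $db and the decoded $tracks array from the question, with chunks of 100 rows per statement:
foreach (array_chunk($tracks, 100) as $chunk) {
    $values = array();
    foreach ($chunk as $t) {
        // Escape every value yourself before inlining it.
        $values[] = sprintf("('%s','%s','%s','%s','%s')",
            $db->real_escape_string($partyID),
            $db->real_escape_string($t['songTitle']),
            $db->real_escape_string($t['artistName']),
            $db->real_escape_string($t['artwork']),
            $db->real_escape_string($pName));
    }
    // One multi-row INSERT per chunk instead of one query per track.
    $db->query('INSERT INTO partyRecords (Pkey, songName, songArtist, imageBinary, partyName) VALUES '
        . implode(',', $values));
}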
Given this type of User table structure, where you are storing many user values (such as phone #'s, preferences, contact info) in a key/value table:
Table: User
ID | Key | Value
$values = [
    1 => 'A value for key 1',
    2 => 'Hello',
    8 => 'Meow',
];
// Update values
$stmt = $pdo_db->prepare('
    INSERT INTO table (UID, KEY, VALUE)
    VALUES (:UID, :KEY, :VALUE)
    ON DUPLICATE KEY UPDATE VALUE = :VALUE');
foreach ($values as $key => $value) {
    $stmt->bindParam(':UID', $uid); // bind the user id too ($uid assumed set elsewhere)
    $stmt->bindParam(':KEY', $key);
    $stmt->bindParam(':VALUE', $value);
    $stmt->execute();
}
If you have 150 different pairs, that's 150 queries per update. How would I optimize this code? Would building one giant SQL statement make the work easier on the MySQL side? Should I be looking at changing the structure itself?
Take this query as a guide:
INSERT INTO example
(example_id, name, value, other_value)
VALUES
(100, 'Name 1', 'Value 1', 'Other 1'),
(101, 'Name 2', 'Value 2', 'Other 2'),
(102, 'Name 3', 'Value 3', 'Other 3'),
(103, 'Name 4', 'Value 4', 'Other 4');
It allows you to insert multiple records at once.
If you know which records are new, it could speed things up to separate those into a plain INSERT; otherwise MySQL has to spend extra time reconciling the ON DUPLICATE KEY UPDATE.
Also think about using UPDATE or REPLACE if you're only updating.
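A sketch of the combined approach for the key/value question above, assuming the $pdo_db handle and $values map from the question, plus a $uid set elsewhere; this sends one statement instead of 150:
$placeholders = array();
$params = array();
foreach ($values as $key => $value) {
    $placeholders[] = '(?, ?, ?)';
    $params[] = $uid;
    $params[] = $key;
    $params[] = $value;
}
// VALUES(col) refers to the value this row would have inserted.
$sql = 'INSERT INTO `table` (`UID`, `KEY`, `VALUE`) VALUES '
     . implode(', ', $placeholders)
     . ' ON DUPLICATE KEY UPDATE `VALUE` = VALUES(`VALUE`)';
$stmt = $pdo_db->prepare($sql);
$stmt->execute($params);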