I'd like to pass multiple variables in a foreach loop so that the values from $array_sma[] get added to my database. So far I can only insert the values from $short_smas, while I'd also like to insert the values from $mid_smas. I have tried a nested foreach, but it multiplies the values.
$period = array(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15);
$sma = array(6,9);
foreach ($sma as $range) {
    $sum = array_sum(array_slice($period, 0, $range));
    $result = array($range - 1 => $sum / $range);
    for ($i = $range, $n = count($period); $i != $n; ++$i) {
        $result[$i] = $result[$i - 1] + ($period[$i] - $period[$i - $range]) / $range;
    }
    $array_sma[] = $result;
}
list($short_smas, $mid_smas) = $array_sma;
foreach ($short_smas as $short_sma) {
    $sql = "INSERT INTO sma (short_sma)
            VALUES ('$short_sma')";
    if ($con->query($sql) === TRUE) {
        echo "New record created successfully<br><br>";
    } else {
        echo "Error: " . $sql . "<br>" . $con->error;
    }
}
The code in my question works fine, i.e. the values from the first sub-array ($short_smas) of $array_sma[] get inserted into the short_sma column of my MySQL database. The problem I have is when I try to insert the second sub-array, $mid_smas (see list()), from $array_sma[] into the second column of my database, called mid_sma.
I think this is close to what I want to achieve, but still nothing gets inserted into the DB; source: php+mysql: insert a php array into mysql
I don't have any MySQL syntax error.
$array_sma[] = $result;
$sql = "INSERT INTO sma (short_sma, mid_sma) VALUES ";
foreach ($array_sma as $item) {
    $sql .= "('" . $item[0] . "','" . $item[1] . "'),";
}
$sql = rtrim($sql, ",");
The main problem is that $short_smas and $mid_smas have different sizes. Moreover, they are associative arrays, so either you pick the unique keys from both and allow empty values for keys that have a value in only one array, or you pick only the keys present in both arrays. The code below implements the first option.
// first let's pick the unique keys from both arrays
$unique_keys = array_unique(array_merge(array_keys($short_smas), array_keys($mid_smas)));
// alternatively we can pick only those present in both
// $intersect_keys = array_intersect(array_keys($short_smas), array_keys($mid_smas));
// now let's build the sql in a loop as Marcelo Agimóvel suggested
// first we need the base command:
$sql = "INSERT INTO sma (short_sma, mid_sma) VALUES ";
// now we add value pairs to a comma-separated list of values to
// insert, using keys from the prepared keys array
foreach ($unique_keys as $key) {
    $mid_sma = array_key_exists($key, $mid_smas) ? $mid_smas[$key] : "";
    $short_sma = array_key_exists($key, $short_smas) ? $short_smas[$key] : "";
    // here we build the comma-separated list of value pairs to insert
    $sql .= "('$short_sma', '$mid_sma'),";
}
$sql = rtrim($sql, ",");
// with the data provided in the question, $sql should hold the string:
// INSERT INTO sma (short_sma, mid_sma) VALUES ('3.5', ''), ('4.5', ''), ('5.5', ''), ('6.5', '5'), ('7.5', '6'), ('8.5', '7'), ('9.5', '8'), ('10.5', '9'), ('11.5', '10'), ('12.5', '11')
// now we execute just one sql command
if ($con->query($sql) === TRUE) {
    echo "New records created successfully<br><br>";
} else {
    echo "Error: " . $sql . "<br>" . $con->error;
}
// don't forget to close connection
Marcelo Agimóvel also suggested that instead of multiple inserts like this:
INSERT INTO tbl_name (a,b,c) VALUES (1,2,3);
it's better to use a single one:
INSERT INTO tbl_name
(a,b,c)
VALUES
(1,2,3),
(4,5,6),
(7,8,9);
That's why I append the value pairs to $sql in the foreach loop and execute the query outside the loop.
It's also worth mentioning that instead of executing raw SQL it's better to use prepared statements, as they are less prone to SQL injection.
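As a hedged sketch of that prepared-statement advice (the table and column names follow the question, but the in-memory SQLite connection only stands in for the MySQL $con, and the sample arrays are invented for illustration):

```php
<?php
// PDO with bound parameters instead of string-built SQL
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE sma (short_sma TEXT, mid_sma TEXT)');

// invented sample data mimicking the two SMA arrays with different keys
$short_smas = [3 => 3.5, 4 => 4.5];
$mid_smas   = [4 => 5.0];

$stmt = $pdo->prepare('INSERT INTO sma (short_sma, mid_sma) VALUES (?, ?)');
$unique_keys = array_unique(array_merge(array_keys($short_smas), array_keys($mid_smas)));
foreach ($unique_keys as $key) {
    // values are bound by the driver, so no manual quoting/escaping is needed
    $stmt->execute([
        array_key_exists($key, $short_smas) ? $short_smas[$key] : '',
        array_key_exists($key, $mid_smas) ? $mid_smas[$key] : '',
    ]);
}
$count = $pdo->query('SELECT COUNT(*) FROM sma')->fetchColumn();
```

The statement is prepared once and executed per key pair; with MySQL the only change would be the PDO DSN and credentials.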
Related
This is my code to insert dynamic data into a MySQL database.
$name = $_POST['name'];
for ($i = 0; $i < count($name); $i++) {
    if ($name[$i] != "") {
        $test = implode(", ", (array)$name[$i]);
        print_r($test);
        $sql = "INSERT INTO employee_table (name)
                VALUES ('$test')";
        if ($conn->query($sql) === true) {
            echo ('ok');
        }
    }
}
$conn->close();
I used implode(", ", (array)$name[$i]) to get a comma-separated string from $name, but print_r($test); outputs this:
AlexBrownHelloHugo
I have 2 problems and hope for your help:
The result of print_r($test); should be Alex,Brown,Hello,Hugo
Store $test [Alex,Brown,Hello,Hugo] in the same row in the database.
Thanks all.
Something like this:
$names = empty($_POST['name']) ? [] : $_POST['name'];
foreach ($names as $name) {
    if (!empty($name)) {
        $test = '[' . implode(", ", (array)$name) . ']';
        print_r($test);
        $sql = "INSERT INTO employee_table (name)
                VALUES ('$test')";
        if ($conn->query($sql) === true) {
            echo ('ok');
        }
    }
}
I wanted to repost this comment I made:
It's a bad idea to store data as a delimited list when you can make it a related table. In any case I would save it as ,Alex,Brown,Hello,Hugo, with leading and trailing delimiters; that way, when you query it, you can do field LIKE '%,Alex,%'. The difference is that if you have foo,some,bar and foo,something,bar and you do field LIKE '%some%' (note: no ,), you will find both of those, some and something. To query the first and last items as I showed above with the , they need the , around them. You can just use trim($field, ',') to remove the delimiters before explode, etc.
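A minimal sketch of that delimiter trick (the names are the sample data from the question; the stored string stands in for the column value):

```php
<?php
// store the list with leading and trailing delimiters
$names = ['Alex', 'Brown', 'Hello', 'Hugo'];
$stored = ',' . implode(',', $names) . ',';     // ",Alex,Brown,Hello,Hugo,"

// exact-item search: anchoring on the commas keeps "Al" from matching "Alex",
// the same way LIKE '%,Alex,%' works on the SQL side
$hasAlex = strpos($stored, ',Alex,') !== false; // found
$hasAl   = strpos($stored, ',Al,') !== false;   // not found

// strip the outer delimiters before exploding back into an array
$roundTrip = explode(',', trim($stored, ','));
```

The same anchored match works in SQL as field LIKE '%,Alex,%'; the trim/explode pair recovers the original array.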
UPDATE
And this one
It's unclear what the structure of $name is: implode($name[$i]) or implode($name)? You use the first one in your code, which implies name is [['foo','bar'], [...]], not ['foo','bar', ...]. If it's the second, you're also storing it multiple times, which you probably don't want.
So you may be able to do just this:
//$_POST['name'] = ['foo','bar', ...]
//remove the loop
//we can assign $name in the if condition and save a line or 2
//the second part, the assignment, will always return true.
if (!empty($_POST['name']) && $name = $_POST['name']) {
    $test = '[' . implode(',', (array)$name) . ']'; //changed mainly this line
    print_r($test);
    $sql = "INSERT INTO employee_table (name) VALUES ('$test')";
    if ($conn->query($sql) === true) {
        echo 'ok';
    }
}
With no loop, because when you loop over the count of names, you're inserting the same data each time, up to the number of items in the names variable.
Explaining your code
So with my example data $_POST['name'] = ['foo','bar', ...] and a simplified version of your original code, you would be doing this:
Assuming you meant implode($name) and not implode($name[$i]) in your original code, which is the only sane thing if your data looks like my example data
//canned example data
$name = ['foo','bar'];
for ($i = 0; $i < count($name); $i++) {
    if ($name[$i] != "") {
        $test = implode(", ", (array)$name); //changed from $name[$i]
        //just output this stuff so we can see the results
        print_r($test);
        echo "\nINSERT INTO employee_table (name) VALUES ('$test')\n";
    }
}
Outputs:
foo, bar
INSERT INTO employee_table (name) VALUES ('foo, bar')
foo, bar
INSERT INTO employee_table (name) VALUES ('foo, bar')
It should be obvious, but if you changed the line $test = implode(", ", (array)$name); to $test = '['.implode(',', (array)$name).']'; in the above code, the output would be this:
foo, bar
INSERT INTO employee_table (name) VALUES ('[foo,bar]')
foo, bar
INSERT INTO employee_table (name) VALUES ('[foo,bar]')
Which still saves it more than once. So we need to dump that loop, which basically forces us into the code I put at the top of this update.
Hopefully that all makes sense.
Cheers
Try this code. In fact $_POST['name'] already stores your values as an array, so you don't need to cast it.
$name = $_POST['name'];
for ($i = 0; $i < count($name); $i++) {
    if ($name[$i] != "") {
        $test = '[' . implode(', ', $_POST['name']) . ']';
        print_r($test);
        $sql = "INSERT INTO employee_table (name)
                VALUES ('$test')";
        if ($conn->query($sql) === true) {
            echo ('ok');
        }
    }
}
$conn->close();
Sorry for the complicated heading. I am learning PHP and got stuck. I have a database table table_name:
id (primary key)   name   ip
1                  a      192.168.0.1,192.168.0.5,171.87.65   // separated by commas
2                  b      192.168.0.1,175.172.2.6,164.77.42
Now I want to append an array of values, ip[0] and ip[1], coming from two different textareas, to the end of the ips of each name, updating only the ip column of each row, so it just appends the new values to the previous ones.
name a<textarea rows="4" cols="40" name="ip[]"></textarea>
name b<textarea rows="4" cols="40" name="ip[]"></textarea>
<input type="submit" />
This is how it's inserted:
if (isset($_POST['submit'])) {
    $ip_details = $_POST['ip'];
    $values = array(
        array('id' => '"1"', 'name' => '"a"', 'ip' => '"' . $ip_details[0] . '"'),
        array('id' => '"2"', 'name' => '"b"', 'ip' => '"' . $ip_details[1] . '"'),
    );
    $columns = implode(', ', array_keys($values[0]));
    foreach ($values as $value) {
        $value = implode(', ', $value);
        $statement = "INSERT INTO `center_listt` (id, name, ip) VALUES ($value)";
        $res = mysql_query($statement);
        echo "success";
    }
}
I need to update each row of names a and b with the new values coming from the textareas, appended to the previous values.
I was thinking of array_push after fetching ip from the table in a while loop, but could not really make it work. I get "Warning: array_push() expects parameter 1 to be array, integer given" because the $row['ip'] fetched in the while loop is not the array that array_push expects.
And it will only add the new values into different new rows each time, which I don't want. Can someone please help with what to do?
<?php
if (isset($_POST['submit'])) {
    //print_r($ips); die;
    $i = 0;
    foreach ($_POST['ip_details'] as $ipaddr) {
        $ips[$i] = $ips[$i] . $ipaddr;
        $i++;
    }
    $r = 1;
    foreach ($ips as $ip) {
        //echo "UPDATE center_listt SET ipdetails = '$ip' WHERE `id_center` = '$r'"; die;
        if (mysql_query("UPDATE center_listt SET ipdetails = '$ip' WHERE `id_center` = '$r'")) {
            echo "IP Address Updated <br />";
        } else {
            echo 'error occurred';
        }
        $r++;
    }
}
$sql = "select * from center_listt";
$res = mysql_query($sql);
if (!$res) {
    die('could not connect' . mysql_error());
}
while ($row = mysql_fetch_assoc($res)) {
    echo $row['ipdetails'];
}
?>
It's bad practice to insert form values from a hard-coded array; you can fetch them from the db, because if in the future you want to add new form values you'd need to rewrite the array again, while fetching from the db only requires inserting the new values into the db.
My query will append the ips to your specific column in a single row, only updating the ip with the new values.
You could do this:
$values = array(...); // WARNING: escape `$ip_details` here!!
$to_insert = array();
foreach ($values as $row) {
    $to_insert[] = "(" . implode(", ", $row) . ")";
}
$statement = "insert into `center_listt` (`id`, `name`, `ip`)
    values " . implode(", ", $to_insert) . "
    ON DUPLICATE KEY UPDATE `ip` = concat(`ip`, ',', values(`ip`))";
mysql_query($statement);
This will perform a multi-insert (far more efficient than individual queries), and when you try to insert the same ID twice it will instead concatenate the values.
It should be noted that this is bad database design, though :p
I'm trying to write a Top-10 list with categories, but it doesn't work the way I want it to. There's an array with a dynamic number (n) of items, and I want to loop over each item in this array 10 times to write n*10 rows into a MySQL table ($i also increments the game's rank).
If I echo, print_r or var_dump the code it works, but when I try to write it to the MySQL table it doesn't work.
Here's the code:
for ($i = 1; $i <= 10; $i++) {
    foreach ($titles as $val) {
        $query .= "INSERT INTO charts (game_place, game_name, game_preweek, game_developer, game_release, game_link, game_image, game_category, charts_updated) VALUES (" . $i . ", '', '', '', '', '', '', '" . $val . "', '" . time() . "');";
        mysql_query($query);
    }
}
Does somebody know the answer to my problem?
$query .= "INSERT ...
You're adding each new query onto the end of the previous query. That's going to produce invalid SQL after the first iteration. You just need to assign the query instead:
$query = "INSERT ...
You should also look at using PDO or mysqli_ instead - this sort of thing is an ideal use for a prepared statement.
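A hedged sketch of that prepared-statement version (an in-memory SQLite connection stands in for MySQL here, only the columns the original loop actually fills are kept, and $titles is invented sample data):

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE charts (game_place INT, game_category TEXT, charts_updated INT)');

$titles = ['Action', 'Puzzle']; // invented sample categories
$now = time();

// the statement is parsed once and executed n*10 times with fresh values
$stmt = $pdo->prepare('INSERT INTO charts (game_place, game_category, charts_updated) VALUES (?, ?, ?)');
foreach ($titles as $val) {
    for ($i = 1; $i <= 10; $i++) {
        $stmt->execute([$i, $val, $now]);
    }
}
$rows = $pdo->query('SELECT COUNT(*) FROM charts')->fetchColumn();
```

Note that $query is assigned fresh per execute() call's bound values, so the string-concatenation bug above cannot occur.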
$query = "INSERT INTO `charts` (`game_place`, `game_category`, `charts_updated`) VALUES ";
foreach ($titles as $key => $val) {
    for ($i = 1; $i <= 10; $i++) {
        $query .= "(" . $i . ", '" . $val . "', " . time() . "),";
    }
}
$query = substr($query, 0, -1); //remove last comma
mysql_query($query);
I am looking for the fastest way to INSERT 1-3,000 rows into a MySQL database using PHP. My current solution is taking around 42 seconds to insert the rows which I think that could be much faster.
I am using a self-written DB class; the insert() method takes two params, (string) $table and (array) $vars. The $vars array is an associative array where the key is the column name in the table and the value is the value to insert. This works really well for me because I sometimes have 30 columns in a table and already have the data there in an array. The insert() method is below:
function insert($table, $vars) {
    if (empty($this->sql_link)) {
        $this->connection();
    }
    $cols = array();
    $vals = array();
    foreach ($vars as $key => $value) {
        $cols[] = "`" . $key . "`";
        $vals[] = "'" . $this->esc($value) . "'";
    }
    //join the columns and values to insert into sql
    $fields = join(', ', $cols);
    $values = join(', ', $vals);
    $insert = mysql_query("INSERT INTO `$table` ($fields) VALUES ($values);", $this->sql_link);
    return $insert;
}
It should be self-explanatory but basically I take the keys and values from $vars and create an INSERT statement. It works, I think the problem I am having is sending the queries one at a time.
Should I build a long query string?
INSERT INTO table (field, field2, etc) VALUES (1, 2, etc); INSERT INTO table (field, field2, etc) VALUES (1, 2, etc); ... and send it all at one time? If so, can this handle 3,000 insert statements in one call?
Is there another way I am not looking at? Any info is appreciated.
Thanks
The most performant way is to use the multiple-row insert syntax:
INSERT INTO table (field, field2, etc) VALUES (1, 2, etc),(1, 2, etc),(1, 2, etc);
Manual:
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas. Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
The values list for each row must be enclosed within parentheses.
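To connect the manual's syntax to the PHP side, a minimal sketch of building that multi-row statement from an array of rows ($rows and the table/column names are invented; intval() stands in for proper escaping, since this demo data is numeric):

```php
<?php
// each inner array is one row of values for columns (a, b, c)
$rows = [[1, 2, 3], [4, 5, 6], [7, 8, 9]];
$chunks = [];
foreach ($rows as $row) {
    // intval() keeps the demo safe; real code must escape/bind user data
    $chunks[] = '(' . implode(',', array_map('intval', $row)) . ')';
}
$sql = 'INSERT INTO tbl_name (a,b,c) VALUES ' . implode(',', $chunks);
```

The result is a single statement with one parenthesized list per row, exactly the shape the manual describes.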
Two ways to improve insertion speed:
At the start, before any INSERT, do a mysql_query("START TRANSACTION"); or the simpler mysql_query("BEGIN");. At the end, do a mysql_query("COMMIT");. These two lines speed up the bulk insertion by 5-10x.
If the table backend is MyISAM (NOT InnoDB), do the INSERTs with the word DELAYED. For example, instead of INSERT INTO table use INSERT DELAYED INTO table for an additional 10-15x speed-up.
If you combine the 2 methods, it's possible to achieve a speed-up of 100 times.
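A minimal sketch of the transaction advice (PDO with in-memory SQLite purely for illustration; with the mysql_* API in the answer it would literally be mysql_query("BEGIN") and mysql_query("COMMIT") around the loop):

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE t (v INT)');

$pdo->beginTransaction();          // START TRANSACTION / BEGIN
$stmt = $pdo->prepare('INSERT INTO t (v) VALUES (?)');
for ($i = 0; $i < 1000; $i++) {
    $stmt->execute([$i]);          // no per-row commit inside the transaction
}
$pdo->commit();                    // COMMIT: the whole batch is flushed once

$n = $pdo->query('SELECT COUNT(*) FROM t')->fetchColumn();
```

The speed-up comes from paying the commit/flush cost once for the batch instead of once per INSERT.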
MySQL can import data directly from a file, which can significantly speed up importing data. See:
LOAD DATA INFILE Syntax
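For reference, a hedged sketch of what such an import could look like (the file path, table name, and column list are invented; the exact options depend on your file format and server settings):

```sql
LOAD DATA LOCAL INFILE '/tmp/rows.csv'
INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(a, b, c);
```

Because the server parses the file directly, this typically beats even multi-row INSERTs for large imports.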
<?php
$data = "data/fullz.txt";
$db = new PDO("sqlite:db/ssninfo.db");
$db->beginTransaction();
$stmt = $db->prepare('INSERT INTO ssninfo (fname,lname,ssn,address,city,state,zip,phone,birth,email) VALUES (?,?,?,?,?,?,?,?,?,?)');
if ($file = fopen($data, "r")) {
    while (!feof($file)) {
        $line = fgets($file);
        if ($line === false || trim($line) === '') {
            continue; // skip blank/trailing lines so execute() always gets a full row
        }
        $part = explode('|', $line);
        $stmt->execute($part);
    }
}
$db->commit();
As usual, it depends; you don't even mention which engine you're using, which is a big determinant. But I've found the MySQL manual guidance pretty reliable.
http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
Auto-discovering the maximum amount of inserts.
To insert that kind of amount (3000) there should not be any problem doing something like this (assuming you use PDO):
$stmt = $dbh->prepare("INSERT INTO yourtable(name, id) VALUES " . str_repeat('(?,?),', $amountOfRows - 1) . '(?,?)');
You can improve that to create a generic way to build big statements like the one above for tables with a different number of fields:
$fields = array("name", "id");
$fieldList = implode(", ", $fields);
$params = '(' . str_repeat('?,', count($fields) - 1) . '?)';
$values = str_repeat($params . ',', $amountOfRows - 1) . $params;
$stmt = $dbh->prepare("INSERT INTO $table($fieldList) VALUES " . $values);
but the problem with the above solution is that it won't work with every combination of rows and number of fields.
It seems that MySQL is not only limited by the number of rows; the number of parameters is also taken into account.
But you don't want to be changing your code whenever a new MySQL release changes the limit on parameters, rows, or even the size of the SQL sentence.
So, a much better approach to create a generic way to generate big statements would be to probe the underlying database engine:
/**
 * Creates an insert sql with the maximum allowed number of parameters
 * @param string $table
 * @param array $attributeList
 * @param int $max
 * @param int &$amountInserts returns the amount of inserts
 * @return \PDOStatement
 */
public static function getBiggestInsertStatement($table, $attributeList, $max, &$amountInserts)
{
    $previousSize = null;
    $size = 10;
    $sql = 'INSERT INTO ' . $table . '(' . implode(',', $attributeList) . ') values ';
    $return = null;
    $params = '(' . str_repeat('?,', count($attributeList) - 1) . '?)';
    do {
        try {
            $previousSize = $size;
            $values = str_repeat($params . ',', $size - 1) . $params;
            $return = Db::getInstance()->prepare($sql . $values);
            if ($size > $max) {
                $values = str_repeat($params . ',', $max - 1) . $params;
                $return = Db::getInstance()->prepare($sql . $values);
                $amountInserts = $max;
                break;
            }
            $amountInserts = $size;
            $size *= 2;
        } catch (\Exception $e) {
        }
    } while ($previousSize != $size);
    return $return;
}
One thing that you must keep in mind is that, since you don't know the limits, one query might only be able to push a lower amount of items than all you need to insert.
So you would have to create a strategy like the one below to successfully insert them all in any possible scenario:
$insert = Db::getBiggestInsertStatement($table, array('field1', 'field2'), $numrows, $maximumInserts);
$i = 0;
$values = array();
for ($j = 0; $j < $numrows; $j++) {
    if ($i === $maximumInserts) {
        $insert->execute($values);
        $i = 0;
        $values = array();
    }
    $values[] = "value1" . $j;
    $values[] = "value2" . $j;
    $i++;
}
if ($i > 0) {
    $insertRemaining = Db::getBiggestInsertStatement($table, array('field1', 'field2'), $i, $maximumInserts);
    $insertRemaining->execute($values);
}
I have tried to insert 1,000,000 rows into a table with a single column, and it's done within seconds, against the minutes it would take to insert them one by one.
The standard technique for speeding up bulk inserts is to use a prepared SQL statement inside a loop inside a transaction. That will make it pretty well optimal. After that you could try tweaking it in various ways, but you are probably wasting your time.
I have two arrays with anywhere from 1 to 5 set values. I want to insert these values into a table with two columns.
Here's my current query, given to me in another SO question:
INSERT INTO table_name (country, redirect)
VALUES ('$country[1]', '$redirect[1]'),
('$country[2]', '$redirect[2]'),
('$country[3]', '$redirect[3]'),
('$country[4]', '$redirect[4]'),
('$country[5]', '$redirect[5]')
ON DUPLICATE KEY UPDATE redirect=VALUES(redirect)
I'm a little concerned however with what happens if some of these array values aren't set, as I believe the above assumes there's 5 sets of values (10 values in total), which definitely isn't certain. If a value is null/0 does it automatically skip it?
Would something like this work better, or would it be a lot more taxing on resources?
for($i = 0, $size = $sizeof($country); $i <= size; $i++) {
$query = "INSERT INTO table_name (country, redirect) VALUES ('$country[$i]', '$redirect[$i]) ON DUPLICATE KEY UPDATE redirect='$redirect[$i]'";
$result = mysql_query($query);
}
Questions highlighted in bold ;). Any answers would be very much appreciated :) :)!!
Do something like this:
$vals = array();
foreach ($country as $key => $country_val) {
    if (empty($country_val) || empty($redirect[$key])) {
        continue;
    }
    $vals[] = "('" . mysql_real_escape_string($country_val) . "','" . mysql_real_escape_string($redirect[$key]) . "')";
}
$val_string = implode(',', $vals);
$sql = "INSERT INTO .... VALUES $val_string";
That'll build up the values section dynamically, skipping any that aren't set. Note, however, that there is a length limit on MySQL query strings, set by the max_allowed_packet setting. If you're building a "huge" query, you'll have to split it into multiple smaller ones if it exceeds this limit.
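A minimal sketch of that splitting advice, using array_chunk (the sample pairs are invented, and a chunk size of 2 is deliberately tiny for the demo; in practice you would size chunks so each statement stays safely under max_allowed_packet):

```php
<?php
$pairs = [
    ['US', 'us.example.com'],
    ['DE', 'de.example.com'],
    ['FR', 'fr.example.com'],
];
$queries = [];
foreach (array_chunk($pairs, 2) as $chunk) {
    $vals = [];
    foreach ($chunk as $pair) {
        // real code must escape these; the sample values are trusted literals
        $vals[] = "('" . $pair[0] . "','" . $pair[1] . "')";
    }
    $queries[] = "INSERT INTO table_name (country, redirect) VALUES " . implode(',', $vals);
}
// each element of $queries is one bounded-size statement to send separately
```

Each statement is then sent on its own, so no single query string can exceed the packet limit.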
If you are asking whether PHP will automatically skip inserting your values into the query when they are null or 0, the answer is no. Why don't you loop through the countries and, if they have a matching redirect, include that portion of the insert statement? Something like this (not tested, just an example). It's one query, all values. You can also incorporate some checking, or default to null if they do not exist.
$query = "INSERT INTO table_name (country, redirect) VALUES ";
for ($i = 0, $size = sizeof($country); $i < $size; $i++) {
    if (array_key_exists($i, $country) && array_key_exists($i, $redirect)) {
        if ($i + 1 != $size) {
            $query .= "('" . $country[$i] . "', '" . $redirect[$i] . "'),";
        } else {
            $query .= "('" . $country[$i] . "', '" . $redirect[$i] . "')";
        }
    }
}
$query .= " ON DUPLICATE KEY UPDATE redirect=VALUES(redirect);";
$result = mysql_query($query);