Let me explain my problem. I have a table where patient reports get stored, and a patient can have more than one test, so the result for every report should be different on print. The result is being inserted differently for each test, but the remark and nor fields are getting the same value for more than one test.
This is the input field image of the report:
The number of field rows can increase according to the number of tests the patient has taken.
Right now I am using this code for inserting into the table:
function save_report_content()
{
    $R  = DIN_ALL($_REQUEST);
    $dt = time();
    foreach ($R as $k => $v)
    {
        if (strstr($k, 'rep_result_'))
        {
            $test_id = str_replace('rep_result_', '', $k);
            $content = $v;
            $SQL = "INSERT INTO report SET
                        rep_te_id  = '$test_id',
                        rep_result = '$content',
                        record_id  = '$R[payment_id]',
                        remark     = '$R[remark]',
                        nor        = '$R[nor]',
                        rep_date   = '$dt'";
            // run one insert per test result
            mysql_query($SQL);
        }
    }
}
Now the result goes into the table differently per test, but remark and nor are the same for more than one test.
I have spent a lot of time trying to fix this problem but did not succeed. If I have missed any relevant info regarding this question, feel free to ask me. Thanks in advance; any idea will be highly appreciated.
What's the structure of your form?
<input name='nor[]' />
It should be an array so that each row's value comes through; otherwise only the last row's value will arrive.
Also, when you are inside the foreach you shouldn't use $R[remark], since you are iterating with $k => $v;
it should be $v['remark'].
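To make that concrete, here is a minimal sketch of the fix, assuming the form is changed so that remark and nor are submitted as arrays keyed by test id, and that DIN_ALL() passes nested arrays through (the field names and the test id 101 below are illustrative, not from the original form):

<input name='rep_result_101' />
<input name='remark[101]' />
<input name='nor[101]' />

function save_report_content()
{
    $R  = DIN_ALL($_REQUEST);
    $dt = time();
    foreach ($R as $k => $v) {
        if (strstr($k, 'rep_result_')) {
            $test_id = str_replace('rep_result_', '', $k);
            // pick the remark/nor that belong to THIS test, not a single shared value
            $remark = isset($R['remark'][$test_id]) ? $R['remark'][$test_id] : '';
            $nor    = isset($R['nor'][$test_id])    ? $R['nor'][$test_id]    : '';
            $SQL = "INSERT INTO report SET
                        rep_te_id  = '$test_id',
                        rep_result = '$v',
                        record_id  = '$R[payment_id]',
                        remark     = '$remark',
                        nor        = '$nor',
                        rep_date   = '$dt'";
            mysql_query($SQL);
        }
    }
}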
I have an app that is receiving sensor data. As an illustration, let's say the expected range is between 0.01 and 10. In this case, in the migration I may have something like:
$table->float('example', 5, 2)
This way I am able to handle an order of magnitude outside the expected range. However, it is possible for the sensor to glitch and send values of, say, 10 000. As the sensor is sending an array of values, it is likely that not all the data is incorrect, so it is preferred to still write the data to the DB. For the initial insert I am using the code below, which is working as expected:
DB::table($tableName)->insertOrIgnore($insert_array);
However, in some circumstances the sensor can resend the data, in which case the record needs to be updated. It's possible that the value(s) that are out of range remain in the array, in which case the update statement below will throw an out-of-range error:
DB::table($tableName)->where($some_id, '=', $another_id)->update($insert_array);
I have not been able to find something akin to an "updateOrIgnore" functionality. What is the best way to handle updating this record? Note that I cannot simply put it in a try/catch, since this table will be a parent table to some children, and ignoring the failure would result in orphaned entries.
Do you have some unique points of data to tie the records together, such as a timestamp or an id from the origin of the data? If so, you can use updateOrInsert
DB::table($tableName)
    ->updateOrInsert(
        ['remote_id' => $insert_array['remote_id']],
        [
            'example_field_1' => $insert_array['example_field_1'],
            'example_field_2' => $insert_array['example_field_2'],
        ]
    );
This is discussed at:
https://laravel.com/docs/master/queries#updates
Thanks to James Clark Developer for helping me get to the solution. Below is some sample code that resolves the problem:
$values = array();
// set up "?" placeholders for binding, to prevent SQL injection
foreach ($insert_array as $item) {
    $values[] = '?';
}
// the placeholders need to be in a flat, comma-separated string
$flat_values = implode(", ", $values);
// wrap the column names in backticks
$columns = implode("`, `", array_keys($insert_array));
// write the SQL statement
$sql = "INSERT IGNORE INTO `example_table` (`$columns`) VALUES
        ($flat_values) ON DUPLICATE KEY UPDATE id = '$id'";
DB::insert($sql, array_values($insert_array));
I have a bunch of photos on a page, and I'm using jQuery UI's Sortable plugin to allow them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma delimited string, consists of the photo ID and the order position, separated by a colon. When the user has completely finished their reordering, I'm posting this order sequence to a PHP page via AJAX, to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the incorrect way to achieve what I want, and will suffer hugely in performance and resources - I'm hoping somebody could advise me as to what would be the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
    $exploded_order = explode(',', $sorted_order);
    foreach ($exploded_order as $order_part) {
        $exploded_part = explode(':', $order_part);
        $part_count = 0;
        foreach ($exploded_part as $part) {
            $part_count++;
            if ($part_count == 1) {
                $photo_id = $part;
            } elseif ($part_count == 2) {
                $order = $part;
            }
            $SQL  = "UPDATE article_photos ";
            $SQL .= "SET order_pos = :order_pos ";
            $SQL .= "WHERE photo_id = :photo_id;";
            ... rest of PDO stuff ...
        }
    }
}
My concerns arise from the nested foreach loops and also from running so many database updates. If a given sequence contained 150 items, would this script cry for help? If it will, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can use one UPDATE with some clever code, like so:
Create the array $data['order'] in the loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach($data['order'] as $sort => $id){
$q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (".implode(",",$data['order']).")";
A little clearer, perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
    WHEN 1 THEN 999
    WHEN 2 THEN 1000
    WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing: updating sort orders.
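Putting it together, here is a minimal sketch of the whole flow, assuming the posted sequence uses the id:position format from the question ($sorted_order comes from the POST; the integer casts double as sanitisation for this query):

$data = array('order' => array());
// turn "1030:0,1031:1,1032:2" into $data['order'][position] = photo_id
foreach (explode(',', $sorted_order) as $order_part) {
    list($photo_id, $position) = explode(':', $order_part);
    $data['order'][(int)$position] = (int)$photo_id;
}

// one single UPDATE for every photo in the sequence
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach ($data['order'] as $sort => $id) {
    $q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END) WHERE photo_id IN (" . implode(",", $data['order']) . ")";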
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated this; if not, you should =), so just do:
if (count($exploded_part) == 2) {
    $id  = $exploded_part[0];
    $seq = $exploded_part[1];
    /* rest of code */
} else {
    /* error - data does not conform despite validation */
}
As for update hammering: do your DB updates in a transaction. Your db will queue the ops, but not commit them to the main DB until you commit the transaction, at which point it'll happily do the update "for real" at lightning speed.
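As a rough sketch of both points, assuming a PDO connection in $pdo (the question already uses PDO placeholders): prepare the statement once, then run every update inside one transaction:

$pdo->beginTransaction();
$stmt = $pdo->prepare(
    "UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id"
);
foreach (explode(',', $sorted_order) as $order_part) {
    $exploded_part = explode(':', $order_part);
    if (count($exploded_part) == 2) {
        // each execute is queued; nothing hits the table "for real" until commit()
        $stmt->execute(array(
            ':photo_id'  => $exploded_part[0],
            ':order_pos' => $exploded_part[1],
        ));
    }
}
$pdo->commit();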
I suggest making your script even simpler and changing the names of the variables, so the code is much more readable:
$parts = explode(',', $sorted_order);
foreach ($parts as $part) {
    list($id, $position) = explode(':', $part);
    // now you can work with $id and $position
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but you will not suffer any performance issues this way; you send less data, so there is less overhead overall.
However, the drawback of your data structure is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure cleanly.
I have a problem here, and I don't even have a clue what to Google or how to solve this.
I am making a PHP application to export and import data from one MySQL table into another, and I have a problem with these tables.
In the source table it looks like this: [screenshot]
And my destination table has ID, with pr0, pr1, pr2 as columns, so it looks like this: [screenshot]
Now the problem is the following: if I just copy (insert every value of the first table as a new row in the second), it will have around 20,000 rows instead of, for example, 1,000.
Even if I copy every record as a new row into the second database, is there any way I can fuse rows? Basically I need to check whether a value exists in the last row with that ID_: if that row and column (pr2, for example) already hold a value, insert a new row; but if the last row with the same ID_ does not have a value in the pr2 column, just update that row's pr2 column with the value.
I need an idea of how to do this in PHP or MySQL.
So you've got a few problems:
1) Copying the table from SQL to PHP: pay attention to memory usage, and check it with PHP's memory_get_usage(). It will show you that importing SQL data can be expensive; look this up. Another thing is that PHP doesn't release memory when you assign new values to an array; that will be useful to know later on.
2) I didn't understand whether the values are unique at the source or should be unique at the destination table, so I will assume that everything at the source needs to be in the destination as-is.
I will also assume that pr = pr0 and quant = pr1.
3) You have mismatched names; that can also be an issue, so I would take care of that too.
4) I will use mysqli as the SQL connector and assume $mysqli is already connected.
SCRIPT:
<?php
// pull all rows from the source table into memory (watch memory usage here)
$result      = $mysqli->query("SELECT * FROM Table_source");
$data_source = array();
while ($array_data = $result->fetch_row()) {
    $data_source[] = $array_data;
}

$bulk        = 2000; // flush to the DB every 2000 rows
$insert_data = array();
$start_query = "REPLACE INTO DEST_TABLE (`ID_`, `pr0`, `pr1`, `pr2`)";

foreach ($data_source as $data) {
    // build one "(id, pr0, pr1, 0)" group per source row;
    // the trailing 0 fills the pr2 column the source does not provide
    $row           = array_map(array($mysqli, 'real_escape_string'), $data);
    $insert_data[] = "('" . implode("','", $row) . "','0')";

    if (count($insert_data) >= $bulk) {
        $insert_query = $start_query . ' VALUES ' . implode(',', $insert_data);
        $mysqli->query($insert_query);
        $insert_data = array();
    }
}

if (count($insert_data) > 0) { // flush any leftover rows
    $insert_query = $start_query . ' VALUES ' . implode(',', $insert_data);
    $mysqli->query($insert_query);
}
?>
It's off the top of my head, but check this idea and tell me if it works. The bugs might be in small things I forgot in the query structure: print the query and paste it into phpMyAdmin or your DB's query tool to see that it's all good. But this concept will save a lot of problems.
$url = mysql_real_escape_string($_POST['url']);
$shoutcast_url = mysql_real_escape_string($_POST['shoutcast_url']);
$site_name = mysql_real_escape_string($_POST['site_name']);
$site_subtitle = mysql_real_escape_string($_POST['site_subtitle']);
$email_suffix = mysql_real_escape_string($_POST['email_suffix']);
$logo_name = mysql_real_escape_string($_POST['logo_name']);
$twitter_username = mysql_real_escape_string($_POST['twitter_username']);
With all those options in a form, they are pre-filled (from the database), but users can choose to change them, which updates the original database. Would it be better for me to update all the columns despite the chance that some of them have not changed, or to do an if ($original_db_entry == $possible_new_entry) check on each (which would be a query in itself)?
Thanks
I'd say it doesn't really matter either way: the size of the query you send to the server is hardly relevant here, and there is no per-column "last updated" information that an unnecessary update would falsify, so...
By the way, what I like to do when working with such loads of data is create a temporary array.
$fields = array("url", "shoutcast_url", "site_name", "site_subtitle" , ....);
foreach ($fields as $field)
$$field = mysql_real_escape_string($_POST[$field]);
The only thing to be aware of here is that you have to be careful not to put variable names into $fields that would overwrite existing variables.
Update: Col. Shrapnel makes the correct and valid point that using variable variables is not good practice. While I think it is perfectly acceptable to use variable variables within the scope of a function, it is indeed better not to use them at all. The better way to sanitize all incoming fields and have them in a usable form would be:
$sanitized_data = array();
$fields = array("url", "shoutcast_url", "site_name", "site_subtitle", ....);
foreach ($fields as $field)
    $sanitized_data[$field] = mysql_real_escape_string($_POST[$field]);
this will leave you with an array you can work with:
$sanitized_data["url"] = ....
$sanitized_data["shoutcast_url"] = ....
Just run a single query that updates all columns:
UPDATE table SET col1='a', col2='b', col3='c' WHERE id = '5'
I would recommend that you execute the UPDATE with all column values. It'd be less costly than trying to confirm that the value is different than what's currently in the database. And that confirmation would be irrelevant anyway, because the values in the database could change instantly after you check them if someone else updates them.
If you issue an UPDATE against MySQL and the values are identical to values already in the database, the UPDATE will be a no-op. That is, MySQL reports zero rows affected.
MySQL knows not to do unnecessary work during an UPDATE.
If only one column changes, MySQL does need to do work. It only changes the columns that are different, but it still creates a new row version (assuming you're using InnoDB).
And of course there's some small amount of work necessary to actually send the UPDATE statement to the MySQL server so it can compare against the existing row. But typically this takes only hundredths of a millisecond on a modern server.
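A quick way to observe that no-op behavior (the table and values here are just an illustration): run the same UPDATE twice and compare the affected-row counts.

mysql_query("UPDATE settings SET site_name = 'My Site' WHERE id = 5");
echo mysql_affected_rows(); // 1 on the first run: the value actually changed
mysql_query("UPDATE settings SET site_name = 'My Site' WHERE id = 5");
echo mysql_affected_rows(); // 0 on the second run: values identical, nothing to change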
Yes, it's ok to update every field.
A simple function to produce SET statement:
function dbSet($fields) {
    $set = '';
    foreach ($fields as $field) {
        if (isset($_POST[$field])) {
            $set .= "`$field`='" . mysql_real_escape_string($_POST[$field]) . "', ";
        }
    }
    return substr($set, 0, -2); // trim the trailing ", "
}
and usage:
$fields = explode(" ","name surname lastname address zip fax phone");
$query = "UPDATE $table SET ".dbSet($fields)." WHERE id=$id";
I have a table in MySQL with "text", "date_posted", and "user" columns. I currently query all text from user=Andy and call those the questions. All of the other text fields, from other users, are answers to the most recent question.
What I want is to associate those answers with the most recent question, with a loop roughly like "for each text where user=Andy, find the text where user!=Andy until date > the next user=Andy (question)".
This seems awfully contrived, and I'm wondering if it can be done roughly as I've outlined, or if I can save myself some trouble in how I'm storing the data or something.
Thanks for any advice.
EDIT: I've added in the insert queries I've been using.
$url = "http://search.twitter.com/search.json?q=&ands=&phrase=&ors=¬s=RT%2C+%40&tag=andyasks&lang=all&from=amcafee&to=&ref=&near=&within=1000&units=mi&since=&until=&tude%5B%5D=%3F&rpp=50)";
$contents = file_get_contents($url);
$decode = json_decode($contents, true);
foreach($decode['results'] as $current) {
$query = "INSERT IGNORE INTO andyasks (questions, date, user) VALUES ('$current[text]','$current[created_at]','Andy')";
mysql_query($query);
}
$url2 = "http://search.twitter.com/search.json?q=&ands=&phrase=&ors=¬s=RT&tag=andyasks&lang=all&from=&to=amcafee&ref=&near=&within=15&units=mi&since=&until=&rpp=50";
$contents2 = file_get_contents($url2);
$decode2 = json_decode($contents2, true);
foreach($decode2['results'] as $current2) {
$query2 = "INSERT IGNORE INTO andyasks (questions, date, user) VALUES ('$current2[text]','$current2[created_at]','$current2[from_user]')";
mysql_query($query2);
}
And then on the SELECT side, this is where I am currently:
$results = mysql_query("SELECT * FROM andyasks");
// note: the column name must not be quoted as a string literal here
$answers = mysql_query("SELECT * FROM andyasks WHERE user != 'Andy'");
while ($row = mysql_fetch_array($results))
{
    if ($row['user'] == 'Andy') {
        print(preg_replace($pattern, $replace, "<p>" . $row["questions"] . "</p>"));
    }
}
while ($row = mysql_fetch_array($answers))
{
    print(preg_replace('/#amcafee/', '', "<p>" . $row["questions"] . "</p>"));
}
What you have in mind could, I believe, be done with subtle use of JOIN or nested SELECT, ORDER BY, LIMIT, etc, but, as you surmise, it would be "awfully contrived" and likely pretty slow.
As you suspect, you would save yourself a lot of trouble at SELECT time if you added a column to the table which, for answers, holds the primary key of the question they're answering (easily obtained at INSERT time, since it's the latest entry with user equal to Andy). Then the retrieval would be much easier!
If you can alter your schema this way but need help with the SQL, please comment or edit your question to indicate that and I'll be happy to follow up (similarly, I'd be happy to follow up if you're stuck with this schema and need the "awfully contrived" SQL -- I just don't know which of the two possibilities applies!).
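For illustration, the schema change can be as small as one nullable column; the column name answering matches the INSERT in the edit below, and NULL marks the rows that are questions themselves:

ALTER TABLE andyasks ADD COLUMN answering INT NULL;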
Edit: since the schema's changed, the INSERT could be (using the form :name to indicate parameters you should bind):
INSERT IGNORE INTO andyasks
(questions, date, user, answering)
SELECT :text, :created_at, :from_user,
IF(:from_user='Andy', NULL, aa.id)
FROM andyasks AS aa
WHERE user='Andy'
ORDER BY date DESC
LIMIT 1
i.e.: use `INSERT INTO ... SELECT` to do a query-within-insertion, which picks the latest post by Andy. I'm assuming you also have an auto-increment primary key `id`, which is the normal arrangement of things.
Later to get all answers to a given question, you only need to select rows whose answering attribute equals that question's id.
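In other words, the retrieval side reduces to a single query along these lines, with :question_id bound to the question's primary key:

SELECT questions, date, user
FROM andyasks
WHERE answering = :question_id
ORDER BY date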
If I understand you correctly you want something like:
$myArr = array("bob", "joe", "jennifer", "mary");
$something = current($myArr);   // start at the first element; next() alone would skip "bob"
while ($something !== false) {
    if ($nextone = next($myArr)) {
        // do something with $something and $nextone
        prev($myArr);           // step back so the next iteration continues from $nextone
    }
    $something = next($myArr);
}
See http://jp2.php.net/next as well as the sections on prev, reset and current.