I currently have a relatively large HTML form (100+ fields). I want to take the data from that form and upload it to a MySQL database when the user hits submit. I have created the PHP code below and have been slowly adding fields and testing to see if the connection is successful. Everything was working through $skilled_nursing, but when I added the next set of values I am no longer successfully creating database entries. All of my echo statements are displayed and I am not getting failures in my error log, but the data is not being received in the database.
Can anyone see what is going wrong? I have checked multiple times for spelling errors, but I haven't seen any. I am wondering if I am somehow timing out with the connection or if I am trying to stick too many values into the execute command.
<?php
echo 'started ok';
// configuration
$dbtype = "mysql";
$dbhost = "localhost";
$dbname = "dbname";
$dbuser = "dbuser";
$dbpass = "userpass";
echo 'variables assigned ok';
// database connection
$conn = new PDO("mysql:host=$dbhost;dbname=$dbname",$dbuser,$dbpass);
echo 'connection established';
// new data
$facility_name = $_POST['facility_name'];
$facility_street = $_POST['facility_street'];
$facility_county = $_POST['facility_county'];
$facility_city = $_POST['facility_city'];
$facility_state = $_POST['facility_state'];
$facility_zipcode = $_POST['facility_zipcode'];
$facility_phone = $_POST['facility_phone'];
$facility_fax = $_POST['facility_fax'];
$facility_licensetype = $_POST['facility_licensetype'];
$facility_licensenumber = $_POST['facility_licensenumber'];
$facility_email = $_POST['facility_email'];
$facility_administrator = $_POST['facility_administrator'];
$skilled_nursing = $_POST['skilled_nursing'];
$independent_living = $_POST['independent_living'];
$assisted_living = $_POST['assisted_living'];
$memory_care = $_POST['memory_care'];
$facility_type_other = $_POST['facility_type_other'];
$care_ratio = $_POST['care_ratio'];
$nurse_ratio = $_POST['nurse_ratio'];
// query
$sql = "INSERT INTO Facilities (facility_name, facility_street, facility_county, facility_city, facility_state, facility_zipcode, facility_phone, facility_fax, facility_licensetype, facility_licensenumber, facility_email, facility_administrator, skilled_nursing, independent_living, assisted_living, memory_care, facility_type_other, care_ratio, nurse_ratio) VALUES (:facility_name, :facility_street, :facility_county, :facility_city, :facility_state, :facility_zipcode, :facility_phone, :facility_fax, :facility_licensetype, :facility_licensenumber, :facility_email, :facility_administrator, :skilled_nursing, :independent_living, :assisted_living, :memory_care, :facility_type_other, :care_ratio, :nurse_ratio)";
$q = $conn->prepare($sql);
$q->execute(array(':facility_state'=>$facility_name,
':facility_street'=>$facility_street,
':facility_county'=>$facility_county,
':facility_city'=>$facility_city,
':facility_state'=>$facility_state,
':facility_name'=>$facility_name,
':facility_zipcode'=>$facility_zipcode,
':facility_phone'=>$facility_phone,
':facility_fax'=>$facility_fax,
':facility_licensetype'=>$facility_licensetype,
':facility_licensenumber'=>$facility_licensenumber,
':facility_email'=>$facility_email,
':facility_administrator'=>$facility_administrator,
':skilled_nursing'=>$skilled_nursing,
':independent_living'=>$independent_living,
':assisted_living'=>$assisted_living,
':memory_care'=>$memory_care,
':facility_type_other'=>$facility_type_other,
':care_ratio'=>$care_ratio,
':nurse_ratio'=>$nurse_ratio));
echo 'query parsed';
?>
This doesn't exactly answer what's going wrong with your code, but it might help solve it.
I would do this a bit differently. You say that you have a lot of fields. Your code is likely to get very long and repetitive. Since it looks like your form field names already correspond with your table columns, I would do something more like this (not tested):
// get a list of column names that exist in the table
// (the extra table_schema check avoids picking up a Facilities table from another database)
$sql = "SELECT column_name FROM information_schema.columns WHERE table_name = 'Facilities' AND table_schema = '$dbname'";
$q = $conn->prepare($sql);
$q->execute();
$columns = $q->fetchAll(PDO::FETCH_COLUMN, 0);
// if a field is passed in that doesn't exist in the table, remove it
foreach ($_POST as $key => $value) {
    if (!in_array($key, $columns)) {
        unset($_POST[$key]);
    }
}
$cols = array_keys($_POST);
$sql = "INSERT INTO Facilities (" . implode(", ", $cols) . ") VALUES (:" . implode(", :", $cols) . ")";
$q = $conn->prepare($sql);
// prefix each key with a colon so it matches the named placeholders
// (array_walk() can't rewrite keys, so build a new parameter array instead)
$params = array();
foreach ($_POST as $key => $value) {
    $params[":{$key}"] = $value;
}
$q->execute($params);
This way, you could have 10, 100, or 1000 fields and this code won't have to change at all. You also reduce your chance for typo errors because there's only one place where the column name is specified. You don't have to worry about SQL injection on the column names because you check to make sure that the column exists before allowing it to be used in your query.
This does, of course, assume that all fields passed in via $_POST correspond with column names in your table. If this isn't the case, it may be easiest to just store those particular field values that aren't columns in separate variables and unset() them from the $_POST array.
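For example, if the form also posts a hypothetical confirm_email field and a submit button that aren't columns in Facilities, something like this before building the query would do:
// keep any non-column values you still need in their own variables ...
$confirmEmail = isset($_POST['confirm_email']) ? $_POST['confirm_email'] : null;
// ... then drop them so only real column names remain in $_POST
unset($_POST['confirm_email'], $_POST['submit']);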
What's wrong with the following syntax:
if( isset($_POST['save_changes']) ) {
// Get current id of customer
$currentID = $_GET['id'];
// Get Input Values
$newfirstName = validateInputData($_POST['first_name']);
$newlastName = validateInputData($_POST['last_name']);
$newemail = validateInputData($_POST['email']);
$newphone = validateInputData($_POST['phone_number']);
$newaddressOne = validateInputData($_POST['address_one']);
$newaddressTwo = validateInputData($_POST['address_two']);
$newcounty = validateInputData($_POST['county']);
$newcity = validateInputData($_POST['city']);
$newzipCode = validateInputData($_POST['zip_code']);
$newprovince = validateInputData($_POST['province']);
$newstate = validateInputData($_POST['state']);
// Queries
$query = "UPDATE customers
SET
first_name='$newfirstName',
last_name='$newlastName',
email='$newemail',
phone='$newphone'
WHERE id='$currentID'
";
$conn->query($query) or die($conn->error.__LINE__);
$query = "UPDATE addresses
SET
address_one='$newaddressOne',
address_two='$newaddressTwo',
county='$newcounty',
city='$newcity',
province='$newprovince',
zip_code='$newzipCode',
state='$newstate'
WHERE customer_id='$currentID'
";
$conn->query($query) or die($conn->error.__LINE__);
// Bring user back to index
header("Location: index.php?alert=savechanges");
// Close connection to database
$conn->close();
}
The above query runs fine, but the row is not updated. All the field names are correct, and when the same query is tried in phpMyAdmin the row is updated.
Please help, thank you.
Your validateInputData() function is not doing any validation. Hopefully it is at least doing some escaping, which would mean it is reaching your database connection object through global scope. You didn't tell us what type of database object this is. Your error checking is poor, and you don't do an explicit exit after the redirect.
Apart from that, the SQL looks OK.
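As a minimal illustration of that last point, a redirect should be followed by an explicit exit so the rest of the script cannot keep running:
header("Location: index.php?alert=savechanges");
exit; // without this, any code after the redirect still executes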
I have a MySQL table with the following fields:
ID
PHONE
NAME
CITY
COUNTRY
Using PHP, I am reading a comma separated dump of values off a text document, parsing the values and inserting records to the table. For reference, here's the code:
<?php
// Includes
require_once 'PROJdbconn.php';
// Read comma-separated text file
$arrindx = 0;
$i = 0;
$filehandle = fopen(PROJCDUMPPATH.PROJCDUMPNAME,"rb");
while (!feof($filehandle)){
$parts = explode(',', fgets($filehandle));
$contnames[$arrindx] = $parts['0'];
$contnumbers[$arrindx] = preg_replace('/[^0-9]/','',$parts['1']);
$arrindx += 1;
}
fclose($filehandle);
$arrindx -= 1;
$filehandle = NULL;
$parts = NULL;
// Build SQL query
$sql = "INSERT INTO Contact_table (PHONE, NAME) VALUES ";
for ($i = 0; $i < $arrindx; ++$i){
$sql .= "('".$contnumbers[$i]."', '".$contnames[$i]."'),";
}
$i = NULL;
$arrindx = NULL;
$contnames = NULL;
$contnumbers = NULL;
$sql = substr($sql,0,strlen($sql)-1).";";
// Connect to MySQL database
$connect = dbconn(PROJHOST,PROJDB,PROJDBUSER,PROJDBPWD);
// Execute SQL query
$query = $connect->query($sql);
$sql = NULL;
$query = NULL;
// Close connection to MySQL database
$connect = NULL;
?>
Now, this code, as you can see, blindly dumps all records into the table. However, I need to modify the code logic as such:
Read text file and parse records into arrays (already doing)
For each record in the text file:
    Check if PHONE exists in the table
    If yes:
        For each field in the text file record:
            If the text file field != NULL, update the corresponding field in the table
            Else, skip it
    If no:
        INSERT the record (already doing)
I apologize if the logic isn't terribly clear; feel free to ask me if any aspect confuses you. I understand this logic would involve an enormous number of SELECT, UPDATE, and INSERT queries, depending on the number of fields (I intend to add more fields in future) and records. Is there any way to either morph them into a single query or at least optimize the code by minimizing the number of queries?
What you're trying to do is called an "upsert" (update/insert).
MySQL INSERT else if exists UPDATE
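As a rough sketch of what that looks like in MySQL (this assumes PHONE has a UNIQUE index; the COALESCE(VALUES(...)) pattern keeps the existing column value whenever the incoming value is NULL, which matches your "skip if NULL" rule):
$sql = "INSERT INTO Contact_table (PHONE, NAME, CITY, COUNTRY)
        VALUES (:phone, :name, :city, :country)
        ON DUPLICATE KEY UPDATE
            NAME    = COALESCE(VALUES(NAME), NAME),
            CITY    = COALESCE(VALUES(CITY), CITY),
            COUNTRY = COALESCE(VALUES(COUNTRY), COUNTRY)";
If your dbconn() wrapper returns a PDO connection, you can prepare this once and execute it once per parsed record instead of deciding between INSERT and UPDATE in PHP.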
I am really trying to wrap my head around this and failing miserably. What I want to do is build a MySQL query based on the parameters passed in the URL. I am trying to create a reusable, dynamic script that can do what it needs to do based on the URL parameters.
This is what I have come up with, and it appears that it does what it is supposed to do (no errors or anything), but nothing actually gets inserted in the database. I know I have made a dumb mistake somewhere (or thought something out wrong), so hopefully one of you guys can point me in the right direction.
Thanks!
//List all possible variables you can expect the script to receive.
$expectedVars = array('name', 'email', 'score', 'age', 'date');
// This is used for the second part of the query (WHERE, VALUES, ETC)
$fields = array('uName','uEmail','uScore','uAge','uDate');
// Make sure some fields are actually populated....
foreach ($expectedVars as $Var)
{
if (!empty($_GET[$Var]))
{
$fields[] = sprintf("'%s' = '%s'", $Var, mysql_real_escape_string($_GET[$Var]));
}
}
if (count($fields) > 0)
{
// Construct the WHERE Clause
$whereClause = "VALUES " . implode(",",$fields);
//Create the SQL query itself
$sql = ("INSERT INTO $mysql_table ($fields) . $whereClause ");
echo "1"; //It worked
mysql_close($con);
}
else
{
// Return 0 if query failed.
echo "0";
}
?>
You missed mysql_query($sql):
if(!mysql_query($sql)){
//die(mysql_error());
}
Please consider using PDO or MySQLi with parameterized queries, because the mysql_* functions are deprecated.
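A minimal sketch of what that looks like with MySQLi prepared statements (the connection credentials are placeholders, and the column names are taken from your $fields array):
$mysqli = new mysqli('localhost', 'dbuser', 'dbpass', 'dbname');
$stmt = $mysqli->prepare("INSERT INTO {$mysql_table} (uName, uEmail) VALUES (?, ?)");
$stmt->bind_param('ss', $_GET['name'], $_GET['email']);
$stmt->execute();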
Your SQL is all wrong. You're using the field = value syntax for an INSERT, then you're concatenating an array as if it were a string ($fields), and you're missing a couple of parentheses around the values.
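For reference, MySQL accepts either of these INSERT forms (the column names come from your $fields array; the values are just examples):
// columns listed separately from the values ...
$sql = "INSERT INTO {$mysql_table} (uName, uEmail) VALUES ('Alice', 'alice@example.com')";
// ... or the SET form, which is where field = value pairs belong
$sql = "INSERT INTO {$mysql_table} SET uName = 'Alice', uEmail = 'alice@example.com'";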
A couple of things: I've found that for PHP <-> MySQL it's important to see exactly what is going into MySQL and to experiment directly with those queries in phpMyAdmin when I get stuck.
1 - In my code I output mysql_error() when the query fails or when a debug flag is set. This usually explains the SQL issue in a way that points me to a misspelled field name, etc.
2 - That way I can feed the query directly into phpMyAdmin and tweak it until it gives me the results I want. (While I'm there I can also use EXPLAIN to see if I need to optimize the table.)
As for the specifics in your code: unlike in C-style languages, variables are interpolated directly into double-quoted strings, so sprintf() isn't needed. Here's how I'd write your code:
// List all possible variables you can expect the script to receive.
$expectedvars = array('name', 'email', 'score', 'age', 'date');
// This is used for the second part of the query (WHERE, VALUES, ETC)
// $fields = array('uName','uEmail','uScore','uAge','uDate');
$fields = array();
// Set only the variables that were populated ...
foreach ($expectedvars as $var) {
if (!empty($_GET[$var])) {
$name = "u" + ucwords($var); // convert var into mysql field names
$fields[] = "{$name} = " . mysql_real_escape_string($_GET[$var]);
}
}
// only set those fields which are passed in, let the rest use the mysql default
if (count($fields) > 0) {
// Create the SQL query itself
$sql = "INSERT INTO {$mysql_table} SET " . implode("," , $fields);
$ret = mysql_query($sql);
if (!$ret) {
var_dump('query_failed: ', $sql, $ret);
echo "0"; // Query failed
} else {
echo "1"; // It worked
}
} else {
// Return 0 if nothing to do
echo "0";
}
mysql_close($con);
I'm using PostgreSQL 9.2 and PHP 5.5 on Linux. I have a database with "patient" records in it, and I'm displaying the records on a web page. That works fine, but now I need to add interactive filters so it displays only certain types of records depending on which filters the user engages: something like having 10 checkboxes from which I build an ad hoc WHERE clause and then rerun the query in real time. I'm a bit unclear on how to do that.
How would one approach this using PHP?
All you need to do is receive the user's selected filters with $_POST or $_GET and then write a small function with a loop to concatenate everything the way your query needs it.
Something like this, in the case where you have only ONE field in your DB to match against. It's a simple scenario; with more fields you'll need to add whichever field is relevant in each case, nothing too complex.
<?php
// receive all the filters and save them in an array
$keys[] = isset($_POST['filter1']) ? $_POST['filter1'] : ''; // this sends empty if the filter is not set.
$keys[] = isset($_POST['filter2']) ? $_POST['filter2'] : '';
$keys[] = isset($_POST['filter3']) ? $_POST['filter3'] : '';
// Go through the array and concatenate the string you need. Of course, you might need AND instead of OR, depending on what your needs are.
$filters = '';
foreach ($keys as $id => $value) {
    if ($id > 0) {
        $filters .= " OR ";
    }
    $filters .= " your_field = '" . $value . "' ";
}
// at this point $filters has a string with all your conditions
//Then make the connection and send the query. Notice how the select concatenates the $filters variable
$host = "localhost";
$user = "user";
$pass = "pass";
$db = "database";
$con = pg_connect("host=$host dbname=$db user=$user password=$pass")
or die ("Could not connect to server\n");
$query = "SELECT * FROM table WHERE ".$filters;
$rs = pg_query($con, $query) or die("Cannot execute query: $query\n");
while ($row = pg_fetch_row($rs)) {
echo "$row[0] $row[1] $row[2]\n";
//or whatever way you want to print it...
}
pg_close($con);
?>
The above code gets the variables from a form that sent three filters (assuming all of them correspond to the SAME field in your DB) and builds a string to use as your WHERE clause.
If you have more than one field of your DB to filter through, all you need to do is be careful about how you match the user input with your fields.
NOTE: I did not add it here for brevity, but please, please sanitize user input. ALWAYS sanitize user input before using user-controlled data in your queries.
Good luck.
Don't do string concatenation. Once you have the values just pass them to the constant query string:
$query = "
select a, b
from patient
where
($x is not null and x = $x)
or
('$y' != '' and y = '$y')
";
If the user did not supply a value, pass it as null or empty. In the above query the x = $x condition will be ignored if $x is null, and the y = '$y' condition will be ignored if $y is empty.
With that said, a check box will always be either true or false. What is the exact problem you are facing?
Always sanitize the user input or use a driver to do it for you!
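In PHP's pgsql extension the driver can do exactly that via pg_query_params(), so user values are never concatenated into the SQL at all. A rough sketch, assuming hypothetical filter columns city and state on your patient table ($con is your pg_connect() connection):
// build a whitelist-driven WHERE clause with $1, $2, ... placeholders
$params = array();
$conds  = array();
foreach (array('city', 'state') as $field) {           // hypothetical filter columns
    if (!empty($_POST[$field])) {
        $params[] = $_POST[$field];
        $conds[]  = $field . ' = $' . count($params);   // e.g. "city = $1"
    }
}
$where = $conds ? ' WHERE ' . implode(' AND ', $conds) : '';
$rs = pg_query_params($con, 'SELECT * FROM patient' . $where, $params);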
I have created a WHERE clause builder exactly for that purpose. It comes with the Pomm project, but you can use it standalone.
<?php
$where = Pomm\Query\Where::create("birthdate > ?", array($date->format('Y-m-d')))
->andWhere('gender = ?', array('M'));
$where2 = Pomm\Query\Where::createWhereIn('something_id', array(1, 15, 43, 104))
->orWhere($where);
$sql = sprintf("SELECT * FROM my_table WHERE %s", $where2);
$statement = $pdo->prepare($sql);
$statement->execute($where2->getValues()); // PDOStatement has no bind() method; pass the values to execute()
$results = $statement->fetchAll();
This way, your values are escaped and you can build your WHERE clause dynamically. You will find more information in Pomm's documentation.
So I have a flatfile db in the format of
username:$SHA$1010101010101010$010110010101010010101010100101010101001010:255.255.255.255:1342078265214
Each record is on a new line, about 5000+ lines in total. I want to import it into a MySQL table. Normally I'd do this using phpMyAdmin and "file import", but now I want to automate the process by using PHP to download the db via FTP, clean up the existing table data, and upload the updated db.
id (AUTO_INCREMENT) | username | password | ip | lastlogin
The script I've got below for the most part works, although PHP will generate an error:
"PHP Fatal error: Maximum execution time of 30 seconds exceeded". I believe I could just increase this limit, but on a remote server I doubt I'll be allowed to, so I need to find a better way of doing this.
Only about 1000 records get inserted into the database before that timeout.
The code I'm using is below. I will say right now that I'm not a pro in PHP and this was mainly gathered up and cobbled together. I'm looking for some help to make it more efficient, as I've heard that doing inserts like this is just bad. It also sounds bad, literally: there is a lot of disk scratching when I run this script on my local PC. Why does it hammer the HDD for such a seemingly simple task?
<?php
require ('Connections/local.php');
$wx = array_map('trim',file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();
foreach($wx as $i => $line) {
$tmp = array_filter(explode(':',$line));
$username[$i] = $tmp[0];
$password[$i] = $tmp[1];
$ip[$i] = $tmp[2];
$lastlogin[$i] = $tmp[3];
mysql_query("INSERT INTO authdb (username,password,ip,lastlogin) VALUES('$username[$i]', '$password[$i]', '$ip[$i]', '$lastlogin[$i]') ") or die(mysql_error());
}
?>
Try this, with bound parameters and PDO.
<?php
require ('Connections/local.php');
$wx = array_map('trim',file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();
try {
    // $dbHost, $database, $dbUsername and $dbPassword are assumed to be defined in Connections/local.php
    // (don't call the host variable $ip - that name is reused below for the parsed IP addresses)
    $dbh = new PDO("mysql:host=$dbHost;dbname=$database", $dbUsername, $dbPassword);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch(PDOException $e) {
    exit('ERROR: ' . $e->getMessage()); // no point continuing without a connection
}
set_time_limit(0); // lift the 30-second limit once, before the loop
$mysql_query = "INSERT INTO authdb (username, password, ip, lastlogin) VALUES (:username, :password, :ip, :lastlogin)";
$statement = $dbh->prepare($mysql_query);
foreach ($wx as $i => $line) {
    $tmp = array_filter(explode(':', $line));
    $username[$i] = $tmp[0];
    $password[$i] = $tmp[1];
    $ip[$i] = $tmp[2];
    $lastlogin[$i] = $tmp[3];
    $params = array(":username" => $username[$i],
                    ":password" => $password[$i],
                    ":ip" => $ip[$i],
                    ":lastlogin" => $lastlogin[$i]);
    $statement->execute($params);
}
?>
Instead of sending queries to the server one by one in the form
insert into table (x,y,z) values (1,2,3)
You should use extended insert syntax, as in:
insert into table (x,y,z) values (1,2,3),(4,5,6),(7,8,9),...
This will increase insert performance by miles. However, you need to be careful about how many rows you insert in one statement, since there is a limit on how large a single SQL statement can be. So I'd say start with batches of 100 rows and see how it goes, then adjust the batch size accordingly. Chances are your total insert time will drop to around 5 seconds, putting it well under the max_execution_time limit.
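Combining that with the PDO answer above, here is a rough sketch of inserting in batches of 100 rows; it assumes the $dbh connection and the $username/$password/$ip/$lastlogin arrays from that answer:
$chunkSize = 100;
$total = count($username);
for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    $rows = array();
    $params = array();
    $limit = min($offset + $chunkSize, $total);
    for ($i = $offset; $i < $limit; $i++) {
        $rows[] = "(?, ?, ?, ?)";                 // one placeholder group per row
        array_push($params, $username[$i], $password[$i], $ip[$i], $lastlogin[$i]);
    }
    $sql = "INSERT INTO authdb (username, password, ip, lastlogin) VALUES " . implode(",", $rows);
    $dbh->prepare($sql)->execute($params);        // one round trip per 100 rows instead of one per row
}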