I am coding in PHP, and the code takes data from an array to fetch additional data from a MySQL DB. Because I need data from two different tables, I use nested while loops. But the code below only ever prints one of the echo statements (echo "a: " . $data3[2]; or echo "b: " . $data3[2];) a single time:
foreach($stuff as $key)
{
$query3 = "SELECT * FROM foobar WHERE id='$key'";
$result3 = mysql_query($query3, $link_id);
while ($data3 = mysql_fetch_array($result3))
{
$query4 = "SELECT * FROM foobar_img WHERE id='$data3[0]'";
$result4 = mysql_query($query4, $link_id);
while ($data4 = mysql_fetch_array($result4))
{
$x += 1;
if ($x % 3 == 0)
{
echo "a: " . $data3[2];
}
else
{
echo "b: " . $data3[2];
}
}
}
}
First and foremost, improve your SQL:
SELECT
img.*
FROM
foobar foo
INNER JOIN foobar_img img ON
foo.id = img.id
WHERE
foo.id = $key
You will only have to iterate through one array.
Also, it appears that you're actually only selecting one row, so spitting out one row is expected behavior.
Additionally, please protect yourself from SQL injection by using mysql_real_escape_string():
$query3 = "SELECT * FROM foobar WHERE id='" .
mysql_real_escape_string($key) . "'";
Update: As Dan has intimated, please run this query in your MySQL console to get the result set back, so you know what you're playing with. When you limit the query to one ID, you're probably only pulling back one row. That being said, I have no idea how many $keys are in $stuff, but if it only spins over once, then you'll get one row.
You may be better off iterating through $stuff and building out an IN clause for your SQL:
$key_array = "";
foreach($stuff as $key)
{
$key_array .= ",'" . mysql_real_escape_string($key) . "'";
}
$key_array = substr($key_array, 1);
...
WHERE foo.id IN ($key_array)
This will give you a result set with your complete list back, instead of sending a bunch of SELECT queries to the DB. Be kind to your DB and please use set-based operations when possible. MySQL will appreciate it.
I will also point out that it appears as if you're using text primary keys. Integer, incremental keys work best as PKs, and I highly suggest you use them!
You should use a JOIN between these two tables. It's the correct way to use SQL, and it will run much faster. Doing an extra query inside the loop is bad practice, like putting loop-invariant code inside a loop.
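Putting those pieces together, here's a rough sketch of the whole thing as one set-based query. It keeps the legacy mysql_* API from the question; $link_id and $stuff come from the original code, and the selected column list is only illustrative:

$key_array = "";
foreach ($stuff as $key)
{
    $key_array .= ",'" . mysql_real_escape_string($key) . "'";
}
$key_array = substr($key_array, 1); // drop the leading comma

// one JOIN query for every key at once, instead of two queries per key
$query = "SELECT foo.*, img.*
          FROM foobar foo
          INNER JOIN foobar_img img ON foo.id = img.id
          WHERE foo.id IN ($key_array)";
$result = mysql_query($query, $link_id);
while ($data = mysql_fetch_array($result))
{
    echo $data[2]; // each row already pairs a foobar row with its image
}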
I have a web application and I'm trying to modify one of the queries. The query fetches information (from a table named voyage_list) and returns various fields.
I want to modify the query so that it is based on certain filters the user applies (which will be placed in the URL).
I can't get the query to work in the web application, but if I copy the query and execute it directly within PHPMyAdmin, it works fine.
$vesselFilter = $_GET['vesselFilter'];
$vesselArray = explode(',', $vesselFilter);
$arrayCount = count($vesselArray);
$sqlExtend = ' status = 1 AND';
$i = 0;
foreach ($vesselArray as $value) {
$i = $i + 1;
$sqlExtend .= " vesselID = '$value'";
if ($i < $arrayCount){
$sqlExtend .= " OR";
}
}
$newQuery = "SELECT * FROM voyage_list WHERE" . $sqlExtend;
echo $newQuery;
$query = $db->query($newQuery)->fetchAll();
I appreciate the above is pretty messy, but it's just so I can try and figure out how to get the query to work.
Any help would be greatly appreciated!
Thanks
That query probably doesn't return what you think it does. AND takes precedence over OR, so it will return the first vessel in the list if the status is 1, and also any other vessel in the list, regardless of status.
You'd do better to create a query with an IN clause like this:
SELECT * FROM voyage_list WHERE status = 1 AND vesselID IN(8,9,10)
Here's some code to do just that:
$vesselFilter = $_GET['vesselFilter'];
// Validate data. Since we're expecting a string containing only integers and commas, reject anything else
// This throws out bad data and also protects against SQL injection.
if (preg_match('/[^0-9,]/', $vesselFilter)) {
echo "Bad data in input";
exit;
}
// filter out any empty entries.
$vesselArray = array_filter(explode(',', $vesselFilter));
// Now create the WHERE clause using IN
$sqlExtend = 'status = 1 AND vesselID IN ('.join(',', $vesselArray).')';
$newQuery = "SELECT * FROM voyage_list WHERE " . $sqlExtend;
echo $newQuery;
$query = $db->query($newQuery)->fetchAll();
var_dump($query);
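If you'd rather not rely on the regex whitelist alone, here's a prepared-statement sketch; it assumes $db is a PDO connection, which the ->query(...)->fetchAll() chaining suggests:

$vesselArray = array_filter(explode(',', $_GET['vesselFilter']));
if (count($vesselArray) === 0) {
    exit('No vessel IDs supplied');
}
// one ? placeholder per vessel ID, e.g. "?,?,?"
$placeholders = implode(',', array_fill(0, count($vesselArray), '?'));
$stmt = $db->prepare("SELECT * FROM voyage_list WHERE status = 1 AND vesselID IN ($placeholders)");
$stmt->execute(array_values($vesselArray));
$rows = $stmt->fetchAll();

With placeholders the values never touch the SQL string, so the escaping and validation concerns disappear.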
With the foreach loop, I want to count how many results are displayed. For example, if it's displaying
Jack Ane
Steve Jobs
Sara Bill
I want to echo that there are 3 results.
Likewise, if it's like
Marc Kil
Bill Smith
I want to echo that there are 2 results.
It's a bit tricky for me because this is my code:
<div>
<?php
$container = array();
if (is_array($row))
{
foreach ($row as $data) {
if(!isset($container[$data->first_name . $data->last_name])) {
$container[$data->first_name . $data->last_name] = $data;
echo $data->first_name . " " .$data->last_name . "</div>";
}
}
}
?>
</p>
</div>
How exactly would I be able to do that? Since these values are coming straight from the database, I was thinking of doing a database count, but there are duplicate values in the database because I'm logging the views of users with the first and last name. Say, for example, there are 20 Jack Anes in my database: it shows me all 20 of them instead of just one, and I only want each name once.
Sorry if it's confusing.
Thanks.
I traditionally use count() to do that, if you don't use anything else:
foreach ($row as $data) {
if(!isset($container[$data->first_name . $data->last_name])) {
$container[$data->first_name . $data->last_name] = $data;
echo $data->first_name . " " .$data->last_name . "</div>";
}
}
echo "Results: " . count($row);
Hope that helps you.
I suggest you rewrite your query. If you do this the right way, you will get a faster solution, with no need for a new array and unnecessary "isset" checks.
The reason you get duplicated data from query may be:
1 - Wrong query logic
2 - Query is OK, but you need to use DISTINCT or GROUP BY to remove duplicates
If you use PDO, you can get the number of returned rows just by using the rowCount() method:
$sql="SELECT * from table WHERE blablabla";
$result = $this->db->query($sql);
$result->rowCount(); // here
Then you can call $result->fetchAll() and print the data. (Note that rowCount() is only guaranteed for statements like UPDATE and DELETE; for SELECT its behaviour is driver-dependent, though the MySQL driver does return the row count.)
You can do a SELECT DISTINCT or a GROUP BY across the two columns to have the database do the work and eliminate the duplicate checking in your PHP. To do this you can use something like the following:
SELECT DISTINCT first_name, last_name FROM users;
SELECT first_name, last_name FROM users GROUP BY first_name, last_name;
DISTINCT is more succinct while GROUP BY supports more flexibility.
In your example, since you are building an associative array, you can just do a count() after the loop, but you will have cleaner code if you have the database do it:
$count = count($container);
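If all you need is the number, you can push the counting into MySQL as well; COUNT(DISTINCT ...) accepts multiple columns (the users table name follows the earlier examples):

SELECT COUNT(DISTINCT first_name, last_name) AS unique_names FROM users;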
You could add a simple variable that increments inside your foreach to give you the exact count, then use that variable to trigger actions depending on its value. If you count the container but want to filter out some of the results inside it, you won't get the filtered amount.
<?php
$container = array();
$count = 0;
if (is_array($row))
{
foreach ($row as $data) {
if(!isset($container[$data->first_name . $data->last_name])) {
$container[$data->first_name . $data->last_name] = $data;
echo $data->first_name . " " .$data->last_name . "</div>";
$count++;
}
}
}
if ($count > 0) {
echo "There were $count results.";
}
?>
use:
echo "Results: " . count($container);
I have a strange problem with inserting rows in a loop into a MySQL table.
Let me show you the PHP code first, then I'll describe some statistics.
I tend to think it is some MySQL issue, but I have absolutely no idea what kind. A maximum number of inserts per table per minute? (It can't be a maximum row count; there's plenty of space on disk.)
echo " For character=" . $row[1];
$xml = simplexml_load_file($api_url);
$i=0;
foreach ($xml->result->rowset->row as $value) {
$newQuery = 'INSERT INTO '.$tableName.' (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller) VALUES ("'.$value['transactionDateTime'].'",'.$value['quantity'].',"'.$value['typeName'].'","'.$value['price'].'","'.$value['clientName'].'","'.$value['stationName'].'","'.$value['transactionType'].'","'.$row[1].'")';
$i++;
if (!mysqli_query($conn, $newQuery)) {
die('Error while adding transaction record: ' . mysqli_error($conn));
} // if END
} // foreach END
echo " added records=" . $i;
I have the same data in XML, and it doesn't change. (The XML has something like 1400+ rows that I would insert.)
It always inserts a different number of rows. The most it ever inserted was around 800+.
If I insert a ~10 second delay into the foreach loop at $i == 400, it adds even fewer rows. More delays, fewer rows.
It never reaches the part of the code where mysqli_error($conn) is called.
It never reaches the echo " added records=" . $i; part of the code.
Since it always stops on a different record, I have to assume nothing is wrong with the INSERT query.
Since it never reaches the line after the foreach loop (echo " added records=" . $i;), I also assume the XML data wasn't processed to the end.
If I use another source of data (another character) with fewer records in the XML, this code works just fine.
What could possibly be my problem?
Could be that you're firing multiple queries at your SQL server. Better to build a single SQL query via your foreach, then fire it once.
Something like this, basically:
$db = new mysqli($hostname, $username, $password, $database);
if($db->connect_errno > 0)
{
$error[] = "Couldn't establish connection to the database.";
}
$commaIncrement = 1;
$commaCount = count($result);
$SQL[] = "INSERT INTO $table $columns VALUES";
foreach ($result as $value)
{
$comma = $commaCount == $commaIncrement ? "" : ",";
$SQL[] = "(";
$SQL[] = "'$value[0]'"."'$value[1]'"."'$value[2]'"."'$value[3]'";
$SQL[] = ")$comma";
$commaIncrement++;
}
$SQL[] = ";";
$completedSQL = implode(' ',$SQL);
$query = $db->prepare($completedSQL);
if ($query)
{
$db->query($completedSQL);
}
$db->close();
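Another way to cut the per-query overhead, sketched here under the assumption that $conn, $tableName, $xml and $row all come from the question's code (and PHP 5.5+ for begin_transaction()): prepare the INSERT once, then run every execution inside a single transaction so autocommit doesn't flush after each row.

$stmt = $conn->prepare("INSERT INTO `$tableName`
    (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller)
    VALUES (?, ?, ?, ?, ?, ?, ?, ?)");

$conn->begin_transaction();
foreach ($xml->result->rowset->row as $value) {
    // bind_param() needs real variables, so cast the SimpleXML attributes first
    $dt     = (string) $value['transactionDateTime'];
    $qty    = (int)    $value['quantity'];
    $type   = (string) $value['typeName'];
    $price  = (string) $value['price'];
    $client = (string) $value['clientName'];
    $stn    = (string) $value['stationName'];
    $ttype  = (string) $value['transactionType'];
    $seller = (string) $row[1];
    $stmt->bind_param('sissssss', $dt, $qty, $type, $price, $client, $stn, $ttype, $seller);
    $stmt->execute();
}
$conn->commit();

Binding also escapes the values for you, which the string-concatenated version does not.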
Scrowler is right: your PHP is timing out. As a test, you can add
set_time_limit(0);
to the start of your php script.
WARNING - Don't use this in production or anywhere else. Always set a reasonable time limit for the script.
I'm trying to count how many times a hashtag is mentioned in the database. The first while loop gets all the hashtags, and the second, nested inside it, is meant to count how many times each hashtag is mentioned. But the numbers don't come out right: it just shows 1, 2, 3, 4, 5, and so on, and when a hashtag is mentioned two times it shows both numbers, e.g. 3 and 4.
How can I solve this?
$i = 0;
$popular_hashtags_query = mysql_query("SELECT * FROM " . $dbPrefix . "hashtags WHERE status=1");
while ($popular_hashtags = mysql_fetch_array($popular_hashtags_query)) {
echo "<div class='hashtag_label'><a data-hover='";
$count_hashtags_query = mysql_query("SELECT * FROM " . $dbPrefix . "hashtags WHERE status=1 AND hashtag='" . $popular_hashtags['hashtag'] . "'");
while ($count_hashtags = mysql_fetch_array($count_hashtags_query)) {
$i++;
echo $i;
}
echo "'><span>#".$popular_hashtags['hashtag'] . "</span></a></div>";
}
I suggest using mysqli or PDO. In this case, grouping by hashtag in the query is the better way; there's no need for the extra query and loop:
$popular_hashtags_query = mysql_query("
SELECT
`hashtag`, count(*) AS `count`
FROM `" . $dbPrefix . "hashtags` WHERE `status` = 1 GROUP BY `hashtag`
");
while ($popular_hashtags = mysql_fetch_array($popular_hashtags_query)) {
echo "<div class='hashtag_label'><a data-hover='";
echo $popular_hashtags['count'];
echo "'><span>#" . $popular_hashtags['hashtag'] . "</span></a></div>";
}
There are a bunch of problems with your code. First, the mysql_ functions are deprecated and will eventually be removed from PHP entirely, so you should move over to the mysqli_ functions or PDO.
Second, your actions on $i are in the wrong place. You should be resetting $i = 0 as the first action inside the first loop, or else it's just going to count up the total number of times ANY hashtag is used.
Third, you're echoing $i inside the second while loop, which means every time the loop runs, you're going to be forever echoing increasing numbers. The echo should be outside the inner loop, after you've counted up the instances of the hashtag.
And finally, you can actually accomplish all of this with one loop by executing "SELECT hashtag, count(*) FROM " . $dbPrefix . "hashtags WHERE status=1 group by hashtag"
There's no need to loop over all records in a table to do the same thing again.
What you want is to use an aggregate function in your SQL query.
Something like the following:
SELECT hashtag, COUNT(hashtag)
FROM hashtags
GROUP BY hashtag
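A minimal sketch of consuming that aggregate, assuming a PDO connection in $db rather than the deprecated mysql_* calls from the question:

$stmt = $db->query("SELECT hashtag, COUNT(hashtag) AS mentions
                    FROM {$dbPrefix}hashtags
                    WHERE status = 1
                    GROUP BY hashtag");
foreach ($stmt as $row) {
    echo "#" . $row['hashtag'] . ": " . $row['mentions'];
}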
Long before I knew anything - not that I know much even now - I designed a web app in PHP which inserted data in my MySQL database after running the values through htmlentities(). I eventually came to my senses, removed this step, stuck it in the output rather than the input, and went on my merry way.
However, I've since had to revisit some of this old data, and unfortunately I have an issue: when it's displayed on screen, I'm getting values that have effectively been htmlentitied twice.
So, is there a mysql or phpmyadmin way of changing all the older, affected rows back into their relevant characters or will I have to write a script to read each row, decode and update all 17 million rows in 12 tables?
EDIT:
Thanks for the help everyone, I wrote my own answer down below with some code in, it's not pretty but it worked on the test data earlier so barring someone pointing out a glaring error in my code while I'm in bed I'll be running it on a backup DB tomorrow and then on the live one if that works out alright.
I ended up using this, not pretty, but I'm tired, it's 2am and it did its job! (Edit: on test data)
$tables = array('users', 'users_more', 'users_extra', 'forum_posts', 'posts_edits', 'forum_threads', 'orders', 'product_comments', 'products', 'favourites', 'blocked', 'notes');
foreach($tables as $table)
{
$sql = "SELECT * FROM {$table} WHERE data_date_ts < '{$encode_cutoff}'";
$rows = $database->query($sql);
while($row = mysql_fetch_assoc($rows))
{
$new = array();
foreach($row as $key => $data)
{
$new[$key] = $database->escape_value(html_entity_decode($data, ENT_QUOTES, 'UTF-8'));
}
// drop the first column (the id) so it isn't included in the SET clause below
array_shift($new);
$new_string = "";
$i = 0;
foreach($new as $new_key => $new_data)
{
if($i > 0) { $new_string.= ", "; }
$new_string.= $new_key . "='" . $new_data . "'";
$i++;
}
$sql = "UPDATE {$table} SET " . $new_string . " WHERE id='" . $row['id'] . "'";
$database->query($sql);
// plus some code to check that all out
}
}
Since PHP was the method of encoding, you'll want to use it to decode. You can use html_entity_decode to convert them back to their original characters. Gotta loop!
Just be careful not to decode rows that don't need it. Not sure how you'll determine that.
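One possible heuristic (an assumption, not a guarantee): a value that still contains entity-like sequences after one decode pass was probably encoded twice.

// rough double-encoding check: after one decode, a doubly encoded value
// such as "&amp;lt;" still contains an entity ("&lt;")
function looks_double_encoded($s) {
    $once = html_entity_decode($s, ENT_QUOTES, 'UTF-8');
    return $once !== $s && preg_match('/&(?:[a-zA-Z][a-zA-Z0-9]*|#\d+);/', $once);
}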
I think writing a PHP script is a good thing to do in this situation. You can use, as Dave said, the html_entity_decode() function to convert your text back.
Try your script on a table with few entries first. This will save you a lot of testing time. Of course, remember to back up your table(s) before running the PHP script.
I'm afraid there is no shorter possibility. The computation for millions of rows remains quite expensive, no matter how you convert the datasets back. So go for a PHP script... it's the easiest way.
This is my bulletproof version. It iterates over all tables and string columns in a database, determines the primary key(s), and performs the updates.
It is intended to be run from the command line so you get progress information.
<?php
$DBC = new mysqli("localhost", "user", "dbpass", "dbname");
$DBC->set_charset("utf8");
$tables = $DBC->query("SHOW FULL TABLES WHERE Table_type='BASE TABLE'");
while($table = $tables->fetch_array()) {
$table = $table[0];
$columns = $DBC->query("DESCRIBE `{$table}`");
$textFields = array();
$primaryKeys = array();
while($column = $columns->fetch_assoc()) {
if ($column["Key"] == "PRI") {
$primaryKeys[] = $column['Field'];
// check for char, varchar, text, mediumtext and so on
} else if (strpos($column["Type"], "char") !== false || strpos($column["Type"], "text") !== false) {
$textFields[] = $column['Field'];
}
}
if (!count($primaryKeys)) {
echo "Cannot convert table without primary key: '$table'\n";
continue;
}
foreach ($textFields as $textField) {
$sql = "SELECT `".implode("`,`", $primaryKeys)."`,`$textField` from `$table` WHERE `$textField` like '%&%'";
$candidates = $DBC->query($sql);
$tmp = $DBC->query("SELECT FOUND_ROWS()");
$rowCount = $tmp->fetch_array()[0];
$tmp->free();
echo "Updating $rowCount in $table.$textField\n";
$count=0;
while($candidate = $candidates->fetch_assoc()) {
$oldValue = $candidate[$textField];
$newValue = html_entity_decode($candidate[$textField], ENT_QUOTES | ENT_XML1, 'UTF-8');
if ($oldValue != $newValue) {
$sql = "UPDATE `$table` SET `$textField` = '"
. $DBC->real_escape_string($newValue)
. "' WHERE ";
foreach ($primaryKeys as $pk) {
$sql .= "`$pk` = '" . $DBC->real_escape_string($candidate[$pk]) . "' AND ";
}
$sql .= "1";
$DBC->query($sql);
}
$count++;
echo "$count / $rowCount\r";
}
}
}
?>
cheers
Roland
It's a bit kludgy but I think the mass update is the only way to go...
$Query = "SELECT row_id, html_entitied_column FROM table";
$result = mysql_query($Query, $connection);
while($row = mysql_fetch_array($result)){
$updatedValue = mysql_real_escape_string(html_entity_decode($row['html_entitied_column']));
$Query = "UPDATE table SET html_entitied_column = '" . $updatedValue . "' ";
$Query .= "WHERE row_id = " . $row['row_id'];
mysql_query($Query, $connection);
}
This is simplified, no error handling etc.
Not sure what the processing time would be on millions of rows so you might need to break it up into chunks to avoid script timeouts.
I had the exact same problem. Since I had multiple clients running the application in production, I wanted to avoid running a PHP script to clean the database for every one of them.
I came up with a solution that is far from perfect, but does the job painlessly.
Track all the spots in your code where you use htmlentities() before inserting data, and remove that.
Change your "display data as HTML" method to something like this :
return html_entity_decode(htmlentities($chaine, ENT_NOQUOTES), ENT_NOQUOTES);
The undo-redo process is kind of ridiculous, but it does the job. And your database will slowly clean itself every time users update the incorrect data.