I'm trying to select multiple values from one column in my MySQL database. I have a table 'products' with a column 'category'. Categories: Home, Garden, Cars, Bicycle, etc. I want to fetch the number of products in each of these categories for statistics. It sounds simple, but I can only get it done with a lot of code. I want each of these counts to be a variable so I only have to feed my variables into my statistics engine to do the calculation. Right now this is my code to fetch the number of products with category 'Garden':
$query = "SELECT * FROM products WHERE category='Garden'";
$result= mysql_query($query);
$row = mysql_fetch_array($result);
echo "$row[category]";
Repeating this for every category doesn't seem right to me. Does anyone understand my question and have a solution?
I think this is what you want
$query = "SELECT `category`, COUNT(`category`) FROM `products` GROUP BY `category`;";
Try this:
$query = "SELECT category, COUNT(*) AS counts FROM products GROUP BY category";
$result = mysql_query($query);
while($row = mysql_fetch_array($result))
{
    echo $row['category']." ".$row['counts'];
}
You have to use a while loop to fetch multiple rows.
You need to GROUP BY category to get one row per distinct category.
Just use a while loop to get the data.
What is a while loop?
The do-while construct consists of a block of code and a condition. First, the code within the block is executed, and then the condition is evaluated. If the condition is true, the code within the block is executed again. This repeats until the condition becomes false. Because do-while loops check the condition after the block is executed, the control structure is often also known as a post-test loop. Contrast this with the while loop, which tests the condition before the code within the block is executed.
In other words, the do-while loop is an exit-condition loop: the code is always executed first, and then the test condition is evaluated. If it is true, the body of the loop executes again, and this repeats as long as the expression evaluates to true. If the expression is false, the loop terminates and control transfers to the statement following the do-while loop.
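As a minimal illustration of the difference (a generic example, not tied to the question above):
<?php
$i = 10;

// do-while: the body runs once even though the condition is false from the start.
do {
    echo "do-while ran with i = $i\n";
} while ($i < 5);

// while: the condition is checked first, so this body never runs.
while ($i < 5) {
    echo "while ran with i = $i\n";
}
?>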
Just use the code below to fetch multiple values.
$query = "SELECT * FROM products WHERE category='Garden'";
$result= mysql_query($query);
while($row=mysql_fetch_array($result))
{
echo "$row[category]<br>";
}
Function render makes the website 500% slow! Can anyone fix that, please?
Someone told me:
because it sends a database request on each iteration of the loop (it's not the only problem with this chunk of code but it's the most taxing one)
Yes, I understand what that means. His suggestion is:
you need to get all of the data before you start building the menu,
then you just insert the data instead of requesting more data on each
iteration
But I don't know how I'm supposed to do it!
<?php
$menu_html = '';

function render_menu($parent_id, $actmenuid)
{
    $obj = new Database();
    $con = $obj->dbconnectt();
    global $menu_html;

    $result = mysqli_query($con, "select * from tbl_menu where parent_id='$parent_id'");
    if (mysqli_num_rows($result) == 0) return;

    if ($parent_id == 0) {
        $menu_html .= '<ul class="topnav">';
    } else {
        $menu_html .= '<ul>';
    }

    while ($row = mysqli_fetch_array($result)) {
        $childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='".$row['id']."'");
        if ($childnum == 0) {
            $linkvalue = '/category/'.$row['id'].'.html';
        } else {
            $linkvalue = '#';
        }
        if ($row['id'] == $actmenuid && $actmenuid != NULL) {
            $actv = 'class="active"';
        } else {
            $actv = '';
        }
        $menu_html .= '<li '.$actv.'><a href="'.$linkvalue.'">'.$row['title'].'</a>';
        render_menu($row['id'], $actmenuid);
        $menu_html .= '</li>';
    }
    $menu_html .= '</ul>';
    return $menu_html;
}

if ($isDsh == false) {
    echo render_menu(0, $actmenuid);
}
?>
Depending on how many records you have, try removing this query from inside the loop, since it runs once for every row returned by the first query:
$childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='".$row['id']."'");
Change it to a single query like this, which returns counts for each parent_id, and place it outside of the loop:
$parentcount = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
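As a rough sketch of how that could be wired up (assuming the $con connection from the question): run the grouped query once, load it into an array keyed by parent_id, and then look child counts up inside the loop instead of querying again.
<?php
// Run once, before render_menu() is called.
$childcounts = array();
$res = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
while ($row = mysqli_fetch_assoc($res)) {
    $childcounts[$row['parent_id']] = (int) $row['cnt'];
}

// Inside the while loop, replace the per-row COUNT query with an array lookup:
$childnum = isset($childcounts[$row['id']]) ? $childcounts[$row['id']] : 0;
?>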
There may be other issues, so please post the database structure and number of records that you're working with too.
Don't make recursive queries.
Having "more than 1000" rows is not too big. You can simply call everything from the table into php, then perform the recursive html build in php this will have a memory overhead, but far less processing overhead because you only ever make one trip to the db.
Alternatively (when your db table is prohibitively large), you should avoid gathering rows unnecessarily by adding a new column. The new column stores all "descendants" of the respective row when the row is INSERTed, and is updated when the row is UPDATEd. Then you only need to reference this column when calling specific rows. In other words, do the recursive processing only once (when writing to the db), not every time the data is displayed. This will, again, produce a finite result set in one query which can then be recursively traversed to build the desired output.
Basically you need to do what @spudly has suggested.
But there is a small catch in his solution: depending on the number of rows in your tbl_menu table, you may use a big chunk of memory to fetch all the records.
You can optimise it further by using his solution but changing the query to:
select
parent_tbl_menu.id,
count(child_tbl_menu.id) as cnt
from
tbl_menu as parent_tbl_menu
left join
tbl_menu as child_tbl_menu
on parent_tbl_menu.id = child_tbl_menu.parent_id
where
parent_tbl_menu.parent_id = ?
group by
parent_tbl_menu.id
This way you will only fetch the child records of a specific parent.
And please consider using prepared statements, as your code has an SQL injection vulnerability.
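A minimal sketch of running that query as a prepared statement with mysqli (assuming $con is the mysqli connection from the question, and that mysqli_stmt_get_result is available via mysqlnd):
<?php
$sql = "SELECT parent_tbl_menu.id, COUNT(child_tbl_menu.id) AS cnt
        FROM tbl_menu AS parent_tbl_menu
        LEFT JOIN tbl_menu AS child_tbl_menu
            ON parent_tbl_menu.id = child_tbl_menu.parent_id
        WHERE parent_tbl_menu.parent_id = ?
        GROUP BY parent_tbl_menu.id";

$stmt = mysqli_prepare($con, $sql);
mysqli_stmt_bind_param($stmt, 'i', $parent_id); // bind the value instead of concatenating it
mysqli_stmt_execute($stmt);
$res = mysqli_stmt_get_result($stmt);

while ($row = mysqli_fetch_assoc($res)) {
    // $row['id'] is the parent menu item, $row['cnt'] is its number of children.
}
mysqli_stmt_close($stmt);
?>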
Connect (from PHP to MySQL) only once for the entire web page.
Don't put a SELECT inside a loop if you can do all the work in a single SELECT, such as with a JOIN. (Exception: a "hierarchical" table needs the nested SELECT. Exception to the exception: MySQL 8.0 and MariaDB 10.2 can do it with a "recursive CTE"; see the sketch after this list.)
Don't fetch all the columns (SELECT *) when all you want is a record count. Instead, SELECT COUNT(*) ... and use the number returned.
1000 of anything is probably excessive for a web page. Re-think the UI.
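To illustrate the recursive-CTE exception mentioned above, a sketch of walking the same tbl_menu hierarchy in a single query (MySQL 8.0+ / MariaDB 10.2+ syntax, assuming the $con connection from the question):
$sql = "WITH RECURSIVE menu_tree AS (
            SELECT id, parent_id, title, 0 AS depth
            FROM tbl_menu
            WHERE parent_id = 0
            UNION ALL
            SELECT m.id, m.parent_id, m.title, t.depth + 1
            FROM tbl_menu AS m
            JOIN menu_tree AS t ON m.parent_id = t.id
        )
        SELECT id, parent_id, title, depth FROM menu_tree";
$result = mysqli_query($con, $sql);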
I have a MySQLi query that returns all of the "assets" assigned to an employee based on their EmployeeID. This works great. The problem I'm facing is in the presentation.
I have an HTML table that has two sections: one for hardware and one for software. What I am hoping to avoid is having to perform separate lookups that generate separate result sets for each type of asset. The end result needs to display as follows:
I can build the table just fine. The result set contains an asset_type field, but I've not had any luck figuring out the code to iterate through my single result set. Is this even possible? Can I pull just the hardware assets from the result set with a while? Perhaps a
while($result['asset_type'] == "hardware"){
echo ""; // table row code
}
And then repeat the same thing later in my table code for asset_type software?
UPDATE 1
The code I thought might work so far, but which isn't doing anything, is:
// SQL query
$q = "SELECT * FROM `assets_table` WHERE `emp_id` = '".$emp_id."'";
$r = mysqli_query($connect, $q);
$total_assets = mysqli_num_rows($r);
while($r){
if($r['category'] = "hardware"){
echo $r['asset_name']." - ".$r['hw_make']." ".$r['hw_model'];
}
}
I ended up going ahead and breaking out the query into multiple result sets and dealing with them that way. It'd be awesome if you could have a while() with a WHERE statement when iterating through result sets / arrays.
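For what it's worth, one way to approximate a while() with a WHERE is to fetch the single result set once and bucket the rows by type in PHP, then loop over each bucket where the table section needs it. A sketch, assuming the category column from the update above holds 'hardware' or 'software':
<?php
$assets = array('hardware' => array(), 'software' => array());
while ($row = mysqli_fetch_assoc($r)) {
    if (isset($assets[$row['category']])) {
        $assets[$row['category']][] = $row;
    }
}

// Hardware section of the table.
foreach ($assets['hardware'] as $row) {
    echo $row['asset_name']." - ".$row['hw_make']." ".$row['hw_model'];
}

// Software section, later in the table markup.
foreach ($assets['software'] as $row) {
    echo $row['asset_name'];
}
?>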
I'm still very new to PHP and pgsql... new to coding in general.
I'm trying to figure out if I should do a while or do while loop on this problem.
I need to query a remote source for data and update my db, but I'm limited to the number of returns per call. I have over 1000 rows to update, but my call limit is 100. This means I need to do multiple calls until all rows in a column are no longer null.
I believe this is the right query, but is my while statement correct?
Here's my code:
// $dbconn = connection......
$result = pg_query($dbconn, "WITH data(full_address) AS (VALUES ('location'))
SELECT full_address FROM $table WHERE latitude is NULL limit 5;");
while ($row = pg_num_rows($result > 0)) {
$arr = pg_fetch_all($row);
//curl commands fetch data and ingest
}
Use do-while if the loop should be executed at least once.
while is an entry-controlled loop (it checks the condition before entering the loop body).
do-while is an exit-controlled loop (it checks the condition after executing the loop body once).
If you want the loop to run at least once, use do-while; but if your loop might never need to execute (because of the condition), use while.
In your case, while is preferred, since the database query might give no results. Your while loop needs to fetch a single row and process it until there are no more rows to fetch.
while ($row = pg_fetch_row($result)) {
//your code to use row's data
}
do-while loops are very similar to while loops, except the truth expression is checked at the end of each iteration instead of in the beginning. The main difference from regular while loops is that the first iteration of a do-while loop is guaranteed to run (the truth expression is only checked at the end of the iteration), whereas it may not necessarily run with a regular while loop (the truth expression is checked at the beginning of each iteration; if it evaluates to FALSE right from the beginning, the loop execution would end immediately).
from: http://php.net/manual/en/control-structures.do.while.php
EDIT
// $dbconn = connection......
for($i=0;$i<10;$i++){
$result = pg_query($dbconn, "**your query with** LIMIT 100 OFFSET ".(100*$i).";");
while ($row = pg_fetch_row($result)) {
//your code to use row's data
// do your curl stuff here for the 1 result
}
}
This query generates a list of items per zip code.
$ziparrayimplode = implode(",", $ziparray);
$listingquery = "SELECT * FROM listings WHERE (CONCAT(title, description) LIKE '%".$searchstring."%') AND auc_cat LIKE '%".$category."%' AND zip IN ($ziparrayimplode) AND all_zip=$allzip ORDER BY list_ts DESC $pages->limit";
$listinghistory = mysql_query($listingquery) or die(mysql_error());
If I use "AND" in the WHERE statement for all_zip=$allzip then all the items that are true for all_zip will show, but not the items in $ziparray. If I use "OR" in the WHERE statement then the items true for $ziparray will be included as well as $allzip... but my search function won't work at all.
Am I phrasing this query correctly or should I use "OR" in the WHERE statement and look for the problem in the way the search is coded?
You could manage it by playing with OR/AND operator precedence (your strange result suggests that you're actually a "victim" of incorrect operator precedence), or just add parentheses:
...
AND (zip IN ($ziparrayimplode) OR all_zip=$allzip)
ORDER BY...
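Applied to the query from the question, that would look something like the sketch below (keeping the original string interpolation for brevity, though parameterising these values would be safer):
$listingquery = "SELECT * FROM listings
    WHERE (CONCAT(title, description) LIKE '%".$searchstring."%')
    AND auc_cat LIKE '%".$category."%'
    AND (zip IN ($ziparrayimplode) OR all_zip=$allzip)
    ORDER BY list_ts DESC $pages->limit";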
I'm having problems debugging a failing mysql 5.1 insert under PHP 5.3.4. I can't seem to see anything in the mysql error log or php error logs.
Based on a Yahoo presentation on efficient pagination, I was adding order numbers to posters on my site (order rank, not order sales).
I wrote a quick test app and asked it to create the order numbers on one category. There are 32,233 rows in that category and each and every time I run it I get 23,304 rows updated. Each and every time. I've increased memory usage, I've put ini settings in the script, I've run it from the PHP CLI and PHP-FPM. Each time it doesn't get past 23,304 rows updated.
Here's my script, which I've added massive timeouts to.
include 'common.inc'; //database connection stuff
ini_set("memory_limit","300M");
ini_set("max_execution_time","3600");
ini_set('mysql.connect_timeout','3600');
ini_set('mysql.trace_mode','On');
ini_set('max_input_time','3600');
$sql1="SELECT apcatnum FROM poster_categories_inno LIMIT 1";
$result1 = mysql_query($sql1);
while ($cats = mysql_fetch_array ($result1)) {
$sql2="SELECT poster_data_inno.apnumber,poster_data_inno.aptitle FROM poster_prodcat_inno, poster_data_inno WHERE poster_prodcat_inno.apcatnum ='$cats[apcatnum]' AND poster_data_inno.apnumber = poster_prodcat_inno.apnumber ORDER BY aptitle ASC";
$result2 = mysql_query($sql2);
$ordernum=1;
while ($order = mysql_fetch_array ($result2)) {
$sql3="UPDATE poster_prodcat_inno SET catorder='$ordernum' WHERE apnumber='$order[apnumber]' AND apcatnum='$cats[apcatnum]'";
$result3 = mysql_query($sql3);
$ordernum++;
} // end of 2nd while
}
I'm at a head-scratching loss. Just did a test on a smaller category and only 13,199 out of 17,662 rows were updated. For the two experiments only 72-74% of the rows are getting updated.
I'd say your problem lies with your 2nd query. Have you done an EXPLAIN on it? Because of the ORDER BY clause a filesort will be required, and if you don't have appropriate indices, that can slow things down further. Try this syntax and substitute a valid integer for your apcatnum variable during testing.
SELECT d.apnumber, d.aptitle
FROM poster_prodcat_inno p JOIN poster_data_inno d
  ON d.apnumber = p.apnumber
WHERE p.apcatnum = '{$cats['apcatnum']}'
ORDER BY d.aptitle ASC;
Secondly, since catorder is just an integer version of the combination of apcatnum and aptitle, it's a denormalization for convenience's sake. This isn't necessarily bad, but it does mean that you have to update it every time you add a new title or category. Perhaps it might be better to partition your poster_prodcat_inno table by apcatnum and just do the JOIN with poster_data_inno when you actually need the catorder.
Please escape your query input, even if it does come from your own database (quotes and other characters will get you every time). Your SQL statement is also fragile because you're not interpolating the array variables robustly; use braces around the array accesses, such as:
while ($order = mysql_fetch_array($result2)) {
    $order = array_map('mysql_real_escape_string', $order); // array_map (not array_filter) so each value is escaped
    $sql3 = "UPDATE poster_prodcat_inno SET catorder='$ordernum' WHERE apnumber='{$order['apnumber']}' AND apcatnum='{$cats['apcatnum']}'";
    $result3 = mysql_query($sql3);
    $ordernum++;
}